
Fieldwork, a B2B research platform based in Austin, had a tool stack problem. At 12 employees, they were paying for a project management platform, a separate client communication tool, and a standalone form-and-survey product. Each one did its specific job reasonably well. The problem was they didn't talk to each other.

Every time a new research project kicked off, someone on the operations side — usually their head of client success, Maya — had to manually create a project in the PM tool, copy the client brief into it, set up a client portal in the communication tool, create intake forms in the survey product, and send onboarding materials by hand. It took about two hours per new project. They were launching four to six projects a week.

Eight to twelve hours per week. Just on project setup. For a 12-person company.

The audit that surfaced the problem

Maya had been at Fieldwork for 18 months when she finally sat down and timed herself through a full project kickoff. She knew it was time-consuming. She didn't know it was that bad. Two hours and four minutes, start to finish. Every time. Consistent as a bad meeting.

When she mapped out what she was actually doing, about 90 minutes of those two hours was pure data transfer — copying the same information from one system to another. Client name, project scope, timeline, contacts, deliverables. Same fields, different tools, manual copy-paste, over and over. The remaining 30 minutes was actual work: reviewing the brief, personalizing the onboarding message, flagging anything unusual to the research lead.

The question became: why are we paying three separate vendors when what we actually need is for those three tools to behave like one?

What they built

The solution they landed on wasn't to replace the tools. They kept all three — clients were already using the portal, and switching had its own cost. What they changed was the glue between them.

The new workflow: when a signed contract came in (tracked via a field in their CRM), it triggered an automation that pulled the project details and did the following, in sequence, automatically:

  • Created a new project in the PM tool, pre-populated with the standard task template and the contract details
  • Set up a client portal in the communication tool with the client's name, project timeline, and a personalized welcome message using a template Maya had written once
  • Generated the intake forms in the survey product based on the project type (three different templates, matched by a field in the contract)
  • Sent the client a welcome email with portal access and the intake form link
  • Notified the assigned research lead on Slack with a summary of the project and a link to the PM board
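The sequence above can be sketched in plain Python. This is an illustrative stub, not Fieldwork's actual implementation: the field names, template IDs, and step names are hypothetical, and in a real setup each step would call the vendor's API (or be wired up in a no-code platform) instead of appending to a log.

```python
# Hypothetical sketch of the kickoff automation: five steps, run in order,
# triggered when the CRM marks a contract as signed. All tool calls are stubbed.

FORM_TEMPLATES = {  # three intake-form templates, matched by a contract field
    "b2b-qualitative": "tmpl_qual",
    "b2b-quantitative": "tmpl_quant",
    "mixed-methods": "tmpl_mixed",
}

def run_kickoff(contract):
    """Run the setup steps in sequence; return a log of (step, detail) pairs."""
    log = []
    log.append(("pm_project", f"Created PM project for {contract['client']}"))
    log.append(("portal", f"Set up client portal with welcome template for {contract['client']}"))
    template = FORM_TEMPLATES[contract["project_type"]]
    log.append(("intake_forms", f"Generated intake forms from {template}"))
    log.append(("welcome_email", f"Emailed {contract['contact_email']} the portal and form links"))
    log.append(("slack", f"Notified {contract['research_lead']} with a project summary"))
    return log

steps = run_kickoff({
    "client": "Acme Corp",
    "project_type": "b2b-qualitative",
    "contact_email": "ops@acme.example",
    "research_lead": "@jordan",
})
```

The point of the sketch is the shape, not the code: one trigger, a fixed ordered sequence, no human in the loop until the steps finish.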

Total time from contract signed to everything set up: about three minutes. Unattended.

The economics

At four to six projects per week and two hours per project, Maya was spending 8–12 hours on project setup. After the automation, she spends about 30 minutes per project, the part that actually requires judgment. The rest runs automatically.

That's roughly 6–9 hours per week returned to the team. Maya now uses that time on strategic account management and a new research quality review process she'd been wanting to build for months. The company didn't reduce headcount. They got more output from the same team.

On the cost side, they also found they could downgrade one of the SaaS tools to a lower tier — they'd been paying for features they only needed because the manual process required them. Net cost reduction of around $340/month, plus whatever Maya's time is worth to the business.

What they learned along the way

It took about three weeks from idea to fully running automation, with a few false starts. The main lessons:

Data consistency matters more than you think. The automation relied on contract fields being filled in consistently. Early runs broke when someone entered "B2B qualitative" vs "B2B - Qualitative" for project type. They standardized the field to a dropdown. Problem solved, but it required going back and fixing the input step.
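The fix generalizes: validate inputs against an explicit list of allowed values instead of trusting free text. A minimal sketch (the value list mirrors the dropdown Fieldwork might have used; the exact options are assumed):

```python
# Illustrative: an exact-match check that rejects free-text variants like
# "B2B - Qualitative" before they reach the automation. Values are hypothetical.
ALLOWED_PROJECT_TYPES = {"B2B Qualitative", "B2B Quantitative", "Mixed Methods"}

def validate_project_type(raw):
    """Accept only exact dropdown values; fail loudly on anything else."""
    if raw not in ALLOWED_PROJECT_TYPES:
        raise ValueError(
            f"Unknown project type {raw!r}; expected one of {sorted(ALLOWED_PROJECT_TYPES)}"
        )
    return raw
```

A dropdown enforces this at the input step, which is better than catching it downstream; the check is still worth keeping as a safety net.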

Test with low-stakes projects first. They ran the first three automations on internal test projects before turning it on for real clients. The welcome email had a formatting issue that would have been embarrassing to send to an actual client. Caught it in testing.
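A cheap way to make that kind of testing routine is a dry-run mode: render everything the automation would send, but stop short of sending it. A hypothetical sketch of the email step (function name and signature are illustrative):

```python
def send_welcome_email(to, body, dry_run=True):
    """In dry-run mode, render the email for inspection instead of sending it."""
    rendered = f"To: {to}\n\n{body}"
    if dry_run:
        # Review this output for formatting issues before going live.
        return ("previewed", rendered)
    # A real send (SMTP or an email API call) would go here.
    return ("sent", rendered)
```

Running the first few kickoffs with `dry_run=True` is exactly the kind of check that would have caught the formatting issue before a client saw it.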

Build in visibility. They added a step that logged every project kickoff in a simple spreadsheet: project name, timestamp, which steps completed, any errors. This made debugging easy and gave Maya confidence that the automation was running correctly.
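The logging step amounts to appending one row per kickoff. A minimal sketch using Python's `csv` module, with an in-memory buffer standing in for the shared spreadsheet (column names are assumptions based on the description above):

```python
import csv
import datetime
import io

def log_kickoff(writer, project, steps_completed, errors=""):
    """Append one audit row: what ran, when, and whether anything failed."""
    writer.writerow([
        project,
        datetime.datetime.now().isoformat(timespec="seconds"),
        "|".join(steps_completed),
        errors,
    ])

buf = io.StringIO()  # stands in for the shared spreadsheet
w = csv.writer(buf)
w.writerow(["project", "timestamp", "steps_completed", "errors"])
log_kickoff(w, "Acme Brand Study", ["pm_project", "portal", "intake_forms"], "")
```

One row per run is enough to answer the two questions that matter: did it fire, and did every step finish.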

Fieldwork now has five active workflows running across the same three tools they had before. The tools didn't change. The way they connect changed. That's the move most companies miss — the power isn't in any one tool, it's in how they fit together.

Your tools could work like this too

NocodeBase connects your existing stack without replacing anything. Free trial, set up in minutes.


Ready to automate your first workflow?

Join 3,200+ teams who've stopped doing things manually.

Start Free Trial