Reducing Zapier Task Counts

I’ve managed to get Knack to connect to an API we need to access.

But Zapier uses a task for every record it creates in Knack, and last month I had 12,000 rows. So I was looking for creative ideas to reduce the task count.

One idea I had: could I take the API response I get in Zapier, publish the whole lot to a Google Sheet, and then have that data imported into Knack?

Has anyone tried that, or found an approach that lets you import biggish data files more easily, without having to remortgage your house for more tasks?

Cheers.

Hi,

You could use Make.com instead of Zapier for this, and it should cost less than $15/month vs $150+ on Zapier.

Sending the data to a Google Sheet would cost you the same in terms of operations as sending the data directly to Knack. Plus it adds a manual operation to import the data via the builder. On Make you can add data in batches to a Google Sheet but the set-up is a bit more involved if you are new to the tool.

Thanks. Does Make have the ability to send and configure a raw API request? I need to connect to an unsupported API to get the payload and then map it to Knack.

Yes, absolutely. You need the app called HTTP, and there are several modules to pick from depending on the type of authentication (OAuth, Basic, …).

Thanks, I appreciate that. As I have just cooked my brain for two days getting Zapier to work, I'll give it a rest and then look at Make next. Thanks again for your help.

Hey GSH,

My team has run into similar issues, and we often skip Zapier or Make and instead use custom scripts. They let us dodge the task limits, fine-tune the process for maximum efficiency, and are more flexible and cost-effective.

The downside is that you need someone more technical to write and maintain the scripts, but we often find that the benefits are worth the trade-offs.
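For what it's worth, here is the rough shape of a script like that in Python: map rows from a source API onto Knack field keys, then POST them to Knack's record-create endpoint. The field mapping, object key, and credentials below are hypothetical placeholders, so check the endpoint and header names against Knack's API docs before relying on them.

```python
import json
import urllib.request

# Hypothetical placeholders -- substitute your own app ID, API key, and object key.
KNACK_APP_ID = "YOUR_APP_ID"
KNACK_API_KEY = "YOUR_API_KEY"
KNACK_OBJECT = "object_1"

def map_record(row):
    """Map one row from the source API onto Knack field keys (mapping is hypothetical)."""
    return {"field_1": row["name"], "field_2": row["amount"]}

def create_knack_record(record):
    """POST a single record to Knack's record-create endpoint."""
    req = urllib.request.Request(
        f"https://api.knack.com/v1/objects/{KNACK_OBJECT}/records",
        data=json.dumps(record).encode("utf-8"),
        headers={
            "X-Knack-Application-Id": KNACK_APP_ID,
            "X-Knack-REST-API-Key": KNACK_API_KEY,
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def sync(rows):
    """Push every source row into Knack, one POST per record."""
    for row in rows:
        create_knack_record(map_record(row))
```

Run something like this on a schedule (cron, a serverless function, etc.) and you pay hosting costs instead of per-task fees.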

Just wanted to provide another potential solution.

Let me know if you have any questions

Thanks for this feedback, and it's a valid point!

Fewer pieces in the funnel delivering your data means fewer places for it to break. If you fancy a script share, DM me. I've got an API programmer, and connecting to the data source with a webhook is the easy part; it's spooning the data into Knack where I'd like to accelerate their understanding. Might have something that you like too.

Otherwise, thanks for sharing your experience and advice.

I second this. If we have an action that I know will run thousands of times a month, I deploy a Python script to AWS instead. Otherwise, if the action won't run as often, or if I value the error logging, I put it on Zapier.
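On the error-logging point: moving an action off Zapier means losing its built-in retries and error reporting, so a self-deployed script usually wants a small layer of its own. A minimal sketch — the `with_retries` helper and its defaults are my own invention, not anything Zapier or AWS provides:

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("knack-sync")

def with_retries(fn, attempts=3, delay=1.0):
    """Call fn(), retrying on failure -- a rough stand-in for Zapier's built-in error handling."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception as exc:
            # Log every failure so the run leaves a trail you can inspect later.
            log.warning("attempt %d/%d failed: %s", attempt, attempts, exc)
            if attempt == attempts:
                raise  # out of retries: surface the error to the caller
            time.sleep(delay)
```

Wrap each API call in `with_retries(...)` and ship the log output to CloudWatch (or wherever your scripts run) to get back most of the visibility Zapier gives you for free.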