I have a workflow that adds lead form data to my ActiveCampaign account using the ActiveCampaign API, and also writes that data to a MySQL database.
I use 3 or 4 ActiveCampaign nodes on each run:
Create contact (1 API call)
Add contact to my master list (1 API call)
This step would normally retrieve the complete ActiveCampaign tag list, but I exported it to a MySQL database to speed things up and minimize API requests.
If the tag exists: add tag to contact (1 API call). If the tag does not exist: create tag and add it to contact (2 API calls), then update the tag database with the new tag data.
Add contact data to the database
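For anyone curious, the tag-handling step above can be sketched like this. All names here are hypothetical stand-ins (the API client is faked so the sketch is self-contained; real code would hit the ActiveCampaign REST API and a MySQL table), but it shows why the local tag mirror saves a request per lead:

```python
class FakeApi:
    """Stand-in for an ActiveCampaign client; just counts API calls."""
    def __init__(self):
        self.calls = 0
        self.next_id = 1

    def create_tag(self, name):
        self.calls += 1
        tag_id = self.next_id
        self.next_id += 1
        return tag_id

    def add_tag_to_contact(self, contact_id, tag_id):
        self.calls += 1


def ensure_tag_on_contact(tag_cache, api, contact_id, tag_name):
    # Check the local mirror first -- no API call needed for the lookup.
    if tag_name in tag_cache:
        tag_id = tag_cache[tag_name]          # tag exists: 1 API call total
    else:
        tag_id = api.create_tag(tag_name)     # tag missing: 2 API calls total
        tag_cache[tag_name] = tag_id          # keep the mirror in sync
    api.add_tag_to_contact(contact_id, tag_id)
    return tag_id
```

The first lead with a new tag costs 2 calls; every later lead with the same tag costs only 1, because the mirror answers the existence check.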
Since the ActiveCampaign API is limited to 5 requests per second, if 3 or more leads send data at the same time (which is quite possible) I may hit that limit and some executions could fail.
Is there a way to limit parallel requests across different executions? Something like a maximum of N parallel executions within a timespan of N seconds. I know it's possible to limit executions in batches with the HTTP Request node, but as far as I know that only works with data within the same execution.
Hi @Fernando_Arata, there is a way but it’s a bit cumbersome. Essentially, you’d use the Split In Batches node to split up your total items into small enough batches and then loop through each batch. This allows you to also add Wait nodes into your loop to avoid hitting any external rate limit.
This will also work for single items. Here’s a quick example workflow showing the idea:
You can switch the Operation value of the Customer Datastore example data node to “One person” if you’d like to see this workflow processing one item only:
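Outside of n8n, the Split In Batches + Wait pattern boils down to processing items in fixed-size chunks and sleeping between chunks. A minimal Python sketch (the batch size and delay are illustrative, chosen to match ActiveCampaign's 5 requests/second limit):

```python
import time

def process_in_batches(items, handler, batch_size=5, delay_seconds=1.0):
    """Process items in chunks, pausing between chunks so that no more
    than batch_size requests are issued per delay window."""
    for start in range(0, len(items), batch_size):
        for item in items[start:start + batch_size]:
            handler(item)
        if start + batch_size < len(items):
            time.sleep(delay_seconds)   # no wait needed after the last batch
```

This is exactly what the loop of Split In Batches → (work) → Wait does inside a single execution; it does nothing about two executions running side by side, which is the crux of the question below.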
@MutedJam Works fine for one workflow entry, but what if tens of single entries execute at the same time? Does it know that there are multiple executions using those nodes and queue them, or does the delay only work within each execution itself?
It is a per-workflow setting, so concurrent runs of the workflow wouldn't be queued. If you had 10 items in one execution it would loop through them, but if you were to call the same workflow 10 times instead you would get a different result.
If there is a chance that the workflow will be started multiple times at once and you don't want each start to trigger the requests immediately, you would need to build a queuing system: your webhook would take the data and save it to a database (or similar), then another workflow would connect to that data source on a schedule and process everything in it.
My workflow uses up to 5 API calls per lead and the ActiveCampaign API rate limit is 5 calls/second, so each lead uses up to one second of the rate budget. If I build the webhook/workflow like you said, I can queue all entries in a database and pull N entries per poll, where N matches the polling interval in seconds set on the Interval node.