I have a workflow that adds lead form data to my ActiveCampaign account using the ActiveCampaign API, and adds its data to a MySQL database.
I use 3 or 4 ActiveCampaign nodes on each run:
Create contact (1 API call)
Add contact to my master list (1 API call)
Retrieve the complete ActiveCampaign tag list (I exported this to a MySQL database instead, to speed things up and minimize API requests)
If the tag exists: add it to the contact (1 API call); if it doesn't exist: create the tag and add it to the contact (2 API calls), then update the tag database with the new tag data
Add the contact's data to the database
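The tag step above can be sketched like this. Everything here is illustrative: `tagCache` stands in for the MySQL tag export, and `api` is a stub for the ActiveCampaign calls, not the real client.

```javascript
// Cache-first tag resolution: hit the local MySQL mirror before the API.
// `tagCache` is a Map of tagName -> tagId; `api` is a hypothetical stub.
// Returns the number of API calls the step cost.
function resolveTag(tagName, tagCache, api) {
  // Tag already known locally: one API call to attach it.
  if (tagCache.has(tagName)) {
    api.addTagToContact(tagCache.get(tagName)); // 1 call
    return 1;
  }
  // Unknown tag: create it (1 call), attach it (1 call),
  // and refresh the local cache so the next run skips the create.
  const tagId = api.createTag(tagName); // 1 call
  api.addTagToContact(tagId);           // 1 call
  tagCache.set(tagName, tagId);         // update the MySQL mirror
  return 2;
}
```

This is why the run costs 3 API calls in the common case and 4 only when a new tag shows up.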
As the ActiveCampaign API has a limit of 5 requests per second, if 3 or more leads send data at the same time (which is entirely possible) I may hit the limit and some executions could fail.
Is there a way to limit parallel requests across different executions? For example, a maximum of N parallel executions in a timespan of N seconds. I know it's possible to limit executions in batches with the HTTP Request node, but as far as I know that only works with data within the same execution.
Hi @Fernando_Arata, there is a way but it’s a bit cumbersome. Essentially, you’d use the Split In Batches node to split up your total items into small enough batches and then loop through each batch. This allows you to also add Wait nodes into your loop to avoid hitting any external rate limit.
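The Split In Batches + Wait pattern boils down to something like the sketch below. The names are illustrative, not n8n internals; in n8n each loop iteration is a batch and the Wait node supplies the pause.

```javascript
// Process at most `batchSize` items at a time, pausing `delayMs`
// between batches so an external rate limit isn't exceeded.
async function processInBatches(items, batchSize, delayMs, handler) {
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    await Promise.all(batch.map(handler)); // one "loop iteration"
    if (i + batchSize < items.length) {
      // Equivalent of the Wait node between iterations.
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```

With a 5 req/s limit and up to 5 calls per item, a batch size of 1 with a ~1 second wait keeps a single execution safely under the limit.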
@MutedJam That works fine for a single workflow execution, but what if 10 single entries execute at the same time? Does it know that there are multiple executions using those nodes and queue them, or does that delay only apply within the execution itself?
It is a per-workflow setting, so the running workflow itself wouldn't be queued. If you had 10 items in one workflow it would loop through them, but if you were to call the same workflow 10 times instead, you would get a different result.
If there is a chance that the workflow will be started multiple times at once and you don't want it to trigger the request, you would need to build a queuing system: your webhook would take the data and save it to a database or similar, then another workflow would connect to that data source on a schedule and process everything in it.
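The queuing setup described above can be sketched in a few lines. This is a minimal in-memory illustration; in practice the queue would live in MySQL or Baserow, and the two functions would be separate n8n workflows.

```javascript
// Workflow 1 (webhook): only enqueues, never calls the rate-limited API.
// Workflow 2 (schedule): drains a fixed number of entries per tick,
// so the API call rate stays predictable regardless of webhook bursts.
const queue = [];

function enqueue(lead) {
  queue.push(lead);
}

function drain(maxPerTick, process) {
  const batch = queue.splice(0, maxPerTick); // take oldest entries first
  batch.forEach(process);
  return batch.length; // how many were handled this tick
}
```

Because the scheduled side controls the pace, bursts on the webhook side can never push you over the external rate limit.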
My workflow uses up to 5 API calls and the ActiveCampaign API rate limit is 5 calls/second. If I build the webhook/workflow like you said, I can queue all entries in a database and pull N entries from it per poll, using the Interval node to control the seconds between polls.
This is what I'm doing: Workflow1 receives the HTTP requests and stores them in Baserow, then Workflow2 processes the Baserow data.
Workflow2 could still get parallel executions, so I added some nodes that use the n8n API to only let it run if the current execution is the oldest active/waiting execution.
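The "am I the oldest execution?" guard looks roughly like this. It assumes you have already fetched the list of active/waiting executions for this workflow from the n8n API; the field names (`id`, `startedAt`) are illustrative and may differ from the actual API response.

```javascript
// Given this execution's id and the list of active/waiting executions
// for the same workflow, return true only if this one started first.
// A "no" answer means another execution holds the lock; stop here.
function isOldestExecution(myId, executions) {
  if (executions.length === 0) return false;
  const oldest = executions.reduce((a, b) =>
    new Date(a.startedAt) <= new Date(b.startedAt) ? a : b
  );
  return oldest.id === myId;
}
```

Only the oldest execution proceeds; the rest simply end, relying on the oldest one (or a later run) to pick up their data from Baserow.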
By the time Workflow2 finishes there could be new data in Baserow to process, so I tried using a webhook to re-start the workflow, but it doesn't work because the execution that just made the HTTP request is still considered active.
So we need an intermediary Workflow3 with a wait node for a few seconds.
The issue with this is that it increases active workflows and executions, which is how n8n cloud is currently priced.
@Jon sorry to tag you directly, but is there a better alternative to avoid parallel processing?
I can't understand how people survive without this. We have critical things that can only be done once, like invoicing; if there's a duplicate HTTP call for the same order, it will issue two invoices because the "check if already invoiced" node becomes unreliable.
At the moment there is no better way. The way I handled this before was to check the input before saving it, so I could ignore a duplicate before it hits the database, and also to set a known value as unique; for invoicing, the reference / purchase order could be treated as a unique field so your database would only allow it once.
I would then also check some of the content fields when pulling from the database, to make sure something wasn't submitted twice with different IDs by mistake.
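The unique-field idea can be sketched like this. It's an in-memory stand-in, assuming each order carries a unique `reference`; in a real setup the same behaviour comes from a UNIQUE constraint on the invoicing table, which rejects the second insert even if two executions race.

```javascript
// First insert for a reference wins; later duplicates are rejected,
// mirroring what a UNIQUE constraint on the reference column enforces.
function makeInvoiceStore() {
  const seen = new Set();
  return {
    insert(order) {
      if (seen.has(order.reference)) return false; // duplicate, skip invoicing
      seen.add(order.reference);
      return true; // safe to issue the invoice
    },
  };
}
```

The key point is that the database, not the workflow, is the arbiter: a "check then insert" done in two workflow steps can race, while a unique constraint cannot.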
It isn’t ideal and it would be great to have an option to only allow one instance of a workflow to be running at once.
Ah, I don't think Baserow has this as an option yet, but I would recommend putting in a feature request with them for it. You could try doing it manually by first checking whether the value already exists, but there is a potential overlap window there, which is where the processing workflow can try to detect duplicates by looking at different fields to see if they match.