n8n 1.18.0 on Kubernetes with 10 workers and a dedicated webhook instance (Docker image)
Managed PostgreSQL
EXECUTIONS_PROCESS=main

My problem: a 3rd-party website/server with limited resources (8 GB CloudPanel / WP-CLI & REST API) that is hit by a webhook-initiated workflow.
I have instances where several webhooks arrive at once and overwork the 3rd-party server (which is a node in the flow): its load blows out and there is a knock-on effect.

What is the best way to throttle how many executions of a workflow can run at once?
I need some sort of queue/throttle, with the queue only proceeding when the 3rd-party server load is < XYZ.

My only thought would be to put the webhooks into a temp database, then have a scheduled workflow check the database and, if there are records, check the server load via SSH and loop through the records, re-checking the load on each iteration.
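The steps above (park webhooks in a table, drain them only while the load stays under a threshold) could be sketched roughly like this. Everything here is an assumption for illustration: the table name, the threshold, and `get_server_load()`, which stands in for an SSH call such as reading `/proc/loadavg` on the remote host.

```python
import sqlite3
import random

LOAD_THRESHOLD = 2.0  # assumed stand-in for "< XYZ"

def get_server_load():
    # Hypothetical stand-in for something like:
    #   ssh user@thirdparty "cut -d' ' -f1 /proc/loadavg"
    return random.uniform(0.5, 1.5)

# Temp database holding the parked webhook payloads.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE webhook_queue "
    "(id INTEGER PRIMARY KEY, payload TEXT, done INTEGER DEFAULT 0)"
)
conn.executemany(
    "INSERT INTO webhook_queue (payload) VALUES (?)",
    [("a",), ("b",), ("c",)],
)

# Scheduled-workflow logic: process one record at a time,
# re-checking the server load before each one.
processed = []
while True:
    row = conn.execute(
        "SELECT id, payload FROM webhook_queue "
        "WHERE done = 0 ORDER BY id LIMIT 1"
    ).fetchone()
    if row is None:
        break  # queue drained
    if get_server_load() >= LOAD_THRESHOLD:
        break  # too busy: stop and let the next scheduled run retry
    conn.execute("UPDATE webhook_queue SET done = 1 WHERE id = ?", (row[0],))
    processed.append(row[1])

print(processed)
```

In n8n terms, the webhook workflow would only do the INSERT, and the loop would live in a Schedule-triggered workflow, so a burst of webhooks never touches the 3rd-party server directly.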
This setting does not exist on a per-workflow basis, but you could consider setting up a dedicated n8n instance for this particular workflow if you don't want to risk throttling your other workflows.
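For reference, in queue mode the concurrency cap applies per worker process, not per workflow. A sketch of the dedicated-instance idea (the concurrency value is an assumption; pick whatever the 3rd-party server can tolerate):

```shell
# Hypothetical dedicated worker for the heavy workflow:
# cap how many executions this worker process runs in parallel.
n8n worker --concurrency=2
```

The other workers would keep their default concurrency, so only executions picked up by this worker are throttled.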
> Need some sort of queue / throttle, with the queue only proceeding when the 3rd party server load is < XYZ
>
> My only thought would be to put the webhooks into a temp database, then have a scheduled workflow to check the database and if records, check the server load via ssh, then loop through the database records - checking load in each loop.
This is more complicated, but if a fixed concurrency value (as above) is not sufficient, such an approach can indeed be a viable alternative.
Hi @Dwayne_Taylor, tbh I can't think of any useful improvements here. But if you have a bit of time, it would be great if you could document your use case and the currently missing functionality over in the Feature Requests category. This would hopefully allow our product team to consider simplifying things going forward.