Large CSV Processing with SplitInBatches and HTTP Request

I’m currently working on a workflow to process a large CSV file (~126K rows) in n8n. My goal is to split the file into manageable batches and send each batch to a webhook for further processing. However, I’m running into rate-limit issues when sending the requests.
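For reference, the splitting step itself is straightforward. A minimal sketch of the idea behind the SplitInBatches node (`chunkRows` and the batch size of 500 are illustrative, not part of the actual workflow):

```javascript
// Split an array of rows into fixed-size batches -- the same idea
// the SplitInBatches node applies to incoming items.
function chunkRows(rows, batchSize) {
  const batches = [];
  for (let i = 0; i < rows.length; i += batchSize) {
    batches.push(rows.slice(i, i + batchSize));
  }
  return batches;
}

// ~126K rows in batches of 500 gives 252 requests instead of 126,000.
const batches = chunkRows(Array.from({ length: 126000 }, (_, i) => i), 500);
console.log(batches.length); // 252
```

The rate-limit problem comes from how quickly those 252 requests are fired, not from the splitting itself.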

Please share your workflow

Share the output returned by the last node


Information on your n8n setup

  • n8n version: 0.222.3
  • Database: PostgreSQL
  • n8n EXECUTIONS_MODE: queue
  • Running n8n via: docker
  • Operating system: macOS

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hi @Mostafa_Nabawy

Thanks for posting here and welcome to the community! :cake:

To start with, I would recommend you update your n8n version to our latest (1.71.1) :sweat_smile:

You can use the Batching option of the HTTP Request node directly, which also lets you set a time interval between batches to slow down your requests.
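In effect, the Batching option throttles the requests like this sketch (hypothetical helper names; `sendBatch` stands in for the real HTTP call the node makes):

```javascript
// Sketch of what the HTTP Request node's Batching option does:
// send one batch at a time, pausing a fixed interval between requests
// so the receiving webhook's rate limit is not exceeded.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function sendInBatches(batches, intervalMs, sendBatch) {
  const responses = [];
  for (const batch of batches) {
    responses.push(await sendBatch(batch)); // one request per batch
    await sleep(intervalMs);                // throttle between requests
  }
  return responses;
}
```

With the node itself you just set "Items per Batch" and "Batch Interval" in the Batching section; no Code node is needed.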


This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.