Issue with Loop Over Items Node Processing Multiple Files Instead of One at a Time

TL;DR

Trying to process FTP files sequentially, but the Loop Over Items node seems to accumulate and reprocess previous files with each iteration, causing duplicate downloads and cumulative data growth.

The Problem

I have a workflow that should:

  1. Get a list of file paths from FTP

  2. Process one file at a time through the workflow

  3. Store the data in a database

  4. Move to the next file

But instead, this is happening:

  • Iteration 1: Processes file1.csv ✅

  • Iteration 2: Processes file1.csv + file2.csv ❌

  • Iteration 3: Processes file1.csv + file2.csv + file3.csv ❌

  • And so on…

Current Workflow Structure


Get File Paths → Loop Over Items → SFTP Download → Process File → Database Insert → (loop back)

Observed Behavior

With each iteration, the workflow is:

  1. Taking the entire accumulated list of files

  2. Sending them all to the SFTP node

  3. Downloading all files again

  4. Processing all files again

  5. Inserting all data again

This creates a cumulative growth pattern where:

  • First run: 1 file processed

  • Second run: 2 files processed

  • Third run: 3 files processed

  • By run 10: Processing 10 files in a single iteration
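The growth above can be sketched in plain JavaScript (not n8n-specific, just the arithmetic): each iteration re-emits every earlier file plus one new one, so by run n the downstream nodes have executed 1 + 2 + … + n = n(n+1)/2 times in total.

```javascript
// Simulate the buggy accumulation: each iteration's output contains
// every previously seen file plus the next new one, and everything in
// that output is sent downstream again.
function simulateBuggyLoop(files) {
  const accumulated = [];
  let totalProcessed = 0;
  for (const file of files) {
    accumulated.push(file);               // the loop's output grows each pass
    totalProcessed += accumulated.length; // all accumulated files re-run downstream
  }
  return totalProcessed;
}

// 10 files → 1 + 2 + … + 10 = 55 downstream executions instead of 10
console.log(
  simulateBuggyLoop(Array.from({ length: 10 }, (_, i) => `file${i + 1}.csv`))
);
```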

Questions

  1. How can I ensure the Loop Over Items node only processes one new file per iteration?

  2. Is there a way to prevent the node from accumulating previous files?

  3. What’s the correct workflow structure for sequential file processing?

Environment

  • n8n version: [your version]

  • Node type: Split In Batches (Loop Over Items)

  • Configuration:
      • Batch Size: 1
      • Reset: Enabled
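Independent of the root cause, one defensive workaround is a Code node placed right after the loop output that drops any file path already seen during the execution. The logic is shown below as a standalone function over plain path strings; in an n8n Code node you would read items via `$input.all()` instead, and the `seen`/`dedupe` names are illustrative, not n8n built-ins.

```javascript
// Keep only file paths that have not been emitted before in this run.
const seen = new Set();

function dedupe(paths) {
  return paths.filter((path) => {
    if (seen.has(path)) return false; // already processed this run, drop it
    seen.add(path);
    return true;
  });
}

console.log(dedupe(["file1.csv"]));              // ["file1.csv"]
console.log(dedupe(["file1.csv", "file2.csv"])); // ["file2.csv"]
```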

Any guidance would be greatly appreciated! 🙏

Hi,

Can you pin some actual run data to your workflow please?

What we think should happen in the workflow is not always the actual outcome, so it would be easier to troubleshoot if we could see the real data structure.

Regards
J.

@Michael_McReynolds I was able to bypass this issue by creating a sub workflow, and that did the trick. Weird bug I guess.
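For anyone landing here later: the sub-workflow fix works because each invocation of the sub-workflow receives only the current item and runs with isolated state. A rough sketch of the same isolation in plain JavaScript, where `processOneFile` is a hypothetical stand-in for the sub-workflow body, not an n8n API:

```javascript
// Each call to processOneFile receives exactly one file, mirroring how a
// sub-workflow invoked per loop iteration only sees the current item.
async function processOneFile(path) {
  // stand-in for: SFTP Download → Process File → Database Insert
  return `processed:${path}`;
}

async function runSequentially(paths) {
  const results = [];
  for (const path of paths) {
    results.push(await processOneFile(path)); // one file per invocation
  }
  return results;
}

runSequentially(["file1.csv", "file2.csv"]).then((r) => console.log(r));
```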


Your current workflow structure causes each iteration to reuse the previous outputs: if not carefully isolated, the node after Split In Batches (the SFTP Download, in this case) receives all previously emitted items rather than just the current batch.

This leads to:

  • Iteration 1: Processes 1 item

  • Iteration 2: Adds 1 more, processes 2 items

  • Iteration 3: Adds 1 more, now 3 items… and so on

This causes cumulative re-execution across all downstream nodes, which is not the desired behavior for true one-by-one processing.
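The fix for true one-by-one processing is to make sure each downstream step only ever receives the current batch, never the concatenation of earlier batches. A minimal sketch of the correct shape in plain JavaScript (illustrative only, not the n8n API):

```javascript
// Correct per-iteration handling: downstream work sees only the current batch.
function runLoop(files, batchSize = 1) {
  const processed = [];
  for (let i = 0; i < files.length; i += batchSize) {
    const batch = files.slice(i, i + batchSize); // only the new items
    for (const file of batch) {
      processed.push(file); // download / process / insert would happen here
    }
  }
  return processed;
}

console.log(runLoop(["file1.csv", "file2.csv", "file3.csv"]));
// each file appears exactly once
```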
