TL;DR
Trying to process FTP files sequentially, but the Loop Over Items node seems to re-send every previously processed file along with the new one on each iteration, so the amount of data reprocessed grows with every pass.
The Problem
I have a workflow that should:
- Get a list of file paths from FTP
- Process one file at a time through the workflow
- Store the data in a database
- Move to the next file
But instead, this is happening:
- Iteration 1: Processes `file1.csv`
- Iteration 2: Processes `file1.csv` + `file2.csv`
- Iteration 3: Processes `file1.csv` + `file2.csv` + `file3.csv`
- And so on…
Current Workflow Structure
Get File Paths → Loop Over Items → SFTP Download → Process File → Database Insert → (loop back)
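For clarity, here is the same structure spelled out with the node's two outputs ("loop" and "done"). The SFTP Download node's Path parameter reads from the current item with an expression like `{{ $json.path }}` (the `path` field name here is illustrative, not necessarily my exact field):

```
Get File Paths → Loop Over Items ─(loop)→ SFTP Download → Process File → Database Insert ─┐
                   ↑     └─(done)→ workflow ends                                          │
                   └──────────────────────── loop back ───────────────────────────────────┘
```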
Observed Behavior
With each iteration, the workflow is:
- Taking the entire accumulated list of files
- Sending them all to the SFTP node
- Downloading all files again
- Processing all files again
- Inserting all data again
The workload accumulates with every pass: each iteration processes one more file than the previous one, so the total work grows quadratically rather than linearly:

- First run: 1 file processed
- Second run: 2 files processed
- Third run: 3 files processed
- By run 10: 10 files in a single iteration, and 1 + 2 + … + 10 = 55 file downloads in total
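To make the accumulation visible, a Code node placed between Loop Over Items and SFTP Download can log what each iteration actually receives. This is a minimal diagnostic sketch (default "Run Once for All Items" mode assumed; node names match my workflow):

```javascript
// n8n Code node, inserted between Loop Over Items and
// SFTP Download purely for diagnosis.
const items = $input.all();

// With Batch Size = 1 this should always log 1; in my case it logs
// 1, 2, 3, … on successive iterations.
console.log(`Iteration ${$runIndex}: received ${items.length} item(s)`);

// Pass everything through unchanged so the rest of the loop still runs.
return items;
```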
Questions
- How can I ensure the Loop Over Items node only processes one new file per iteration?
- Is there a way to prevent the node from accumulating previous files?
- What's the correct workflow structure for sequential file processing? (A workaround I'm considering is sketched below.)
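On that last question: as a stop-gap, I've been considering a Code node inside the loop that ignores whatever the loop hands it and picks exactly one path per iteration by index. A rough sketch (the `Get File Paths` node name matches my workflow; the `path` field is illustrative). I'd rather fix the root cause than rely on this, hence the questions above:

```javascript
// n8n Code node inside the loop: select only the file for this iteration.
// $runIndex increments each time the loop comes back around.
const allFiles = $('Get File Paths').all();
const current = allFiles[$runIndex];

// Guard in case the loop runs past the end of the list.
if (!current) {
  return [];
}

// Emit exactly one item; downstream nodes see a single file path.
return [{ json: { path: current.json.path } }];
```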
Environment
- n8n version: [your version]
- Node type: Split In Batches (Loop Over Items)
- Configuration:
  - Batch Size: 1
  - Reset: Enabled
Any guidance would be greatly appreciated!