Can we run a workflow for every file in a folder?

I have a folder with, say, 10 CSV files. I want to run a workflow that reads one CSV file at a time and processes it. How can I achieve this without using the Read Binary Files node? Reading all files together in the same instance of the workflow will cause memory issues. The file names and quantity are dynamic, so I need to be able to execute the same workflow once for each file and process that file.

Also, can a workflow take in parameters via the n8n CLI or otherwise? In that case I could write an external script that starts the workflow via the CLI with the filename as a parameter, once for each file in the directory.

Any guidance is much appreciated.

Thank you,

It sounds like you can either use Split In Batches, n8n CLI, or possibly rethink how you are trying to solve the problem? Might be time for a queuing system?

Here is an example using a CSV file and the Local File Trigger node. You place the file in the folder, and n8n processes each CSV individually and iterates over its rows. See the example below:


@djangelic Yes, the Local File Trigger can pass the path to the Read Binary File node. Thank you so much!

@cgsmith Split In Batches only works once the entire file is read. Due to string size limitations we are required to split large files outside n8n and process the individual files. The n8n CLI is viable, but we need to pass the name of the file to the workflow, as file names are dynamic. Hence the second part of my question: can a workflow take in parameters (the file name) via the n8n CLI?


I don’t see why not, see documentation here: CLI commands - n8n Documentation

@djangelic Workflows don’t accept parameters, do they? The docs don’t mention any file path parameters; the only parameter is the workflow file itself. I am referring to passing the CSV file name as a parameter to the workflow execution via the CLI…

I don’t think it is possible to pass args to the workflow when using the CLI. Since you are already processing the file outside of n8n to split it, though, you could put the chunks into a folder watched by the Local File Trigger and process them one at a time.

@Jon I agree. That’s exactly what I am doing now.

Another workaround would be setting an environment variable; you can then read it in n8n via $env.VARIABLE_NAME


@jan Oooh. That’s good stuff.

Any idea how, after running a workflow via the CLI, I can save the workflow output? I need to perform some verification on the output of each of the workflow’s nodes.

@Wessam.hessien No need to use the CLI. In the UI, create a workflow with a Function node that uses the fs library to get all file names. Then use Split In Batches followed by an Execute Workflow node. The workflow that gets triggered will process each file individually. For debugging, run the nested workflow with a hardcoded file; for production, set the file path as input from the parent workflow.

I haven’t tried a Function node with the fs library to query all files in a local folder. Go ahead and let us know if it worked; otherwise we can come up with an alternative to using a Function node.

Alternatively, you could introduce the Local File Trigger node and drop the files one after the other into the target folder, triggering an instance of the workflow for each new incoming file. Just make sure you activate the workflow so that an instance spawns for every file.

Also note that using S3 simplifies things, as the AWS S3 node can get all files with options to pass folder keys, in which case you won’t need a Function node at all.

Thanks @Gowthaman_Prabhu for your fast reply. The point is that I’m doing integration tests using the Jest framework to test my company’s custom node along with n8n, and I’m not using the UI.

Basically, I have workflow JSON files which include many instances of the custom node. I run these files and then need to capture the output of each node to verify that it contains the expected output.

I created this topic for more details
How to run workflow and capture the output of all nodes - Questions - n8n