Describe the problem/error/question
I’m reading a list of files from disk using the Read Binary Files node with a file mask like *.txt. It seems to pick up all the files, as I can see n items in the editor.
Then I need to process each file and insert every line into a table in PostgreSQL. I use a Set node to break the content into lines (on \n), an Item Lists node to build the list of rows to insert, and then a Postgres node. This works fine.
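For context, the Set node uses an expression along these lines to turn the file content into an array of lines (the field name text is just a stand-in here, and it assumes the binary content has already been moved into a JSON field, e.g. with a Move Binary Data node); the Item Lists node then does “Split Out Items” on that field:

```
{{ $json.text.split('\n') }}
```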
But the issue is that Item Lists seems to accumulate the lines from all the files, and I need to process one file at a time…
You can see we start with 20 files and turn that into 2020 items, which is incorrect. I have around 100 lines per file, so Item Lists should produce only about 100 items per file.
I’m still figuring n8n out, so there may well be a simpler way to do this: read a folder of files, import one file at a time, and turn each line of a file into a record in a PostgreSQL table. (I couldn’t use the Spreadsheet File node since it always tries to parse each line into separate columns, which I don’t want.)
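In case it helps to make the goal concrete, this is roughly the shape I’m trying to end up with before the Postgres node: one item per line, tagged with the file it came from. This is only a sketch, not my current workflow; it assumes the files arrive from Read Binary Files in the default data binary property held in memory as base64, and the file_name/line keys are placeholders:

```
// Function node sketch (with the newer Code node, the input would come from $input.all()).
// Each incoming item is assumed to be one file from Read Binary Files.
const results = [];

for (const item of items) {
  const fileName = item.binary.data.fileName;
  // Decode the base64-encoded binary content back into plain text
  const text = Buffer.from(item.binary.data.data, 'base64').toString('utf8');

  for (const line of text.split('\n')) {
    if (line.trim() === '') continue; // skip empty lines
    // One output item per line; these would feed the Postgres insert
    results.push({ json: { file_name: fileName, line } });
  }
}

return results;
```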
Information on your n8n setup
- n8n version: 0.222.3
- Database (default: SQLite): SQLite
- n8n EXECUTIONS_PROCESS setting (default: own, main): default
- Running n8n via (Docker, npm, n8n cloud, desktop app): npm
- Operating system: Ubuntu 22.04