How to process large CSV files?

Describe the problem/error/question

We have a workflow that processes 20 CSV files and uploads them into Postgres. The workflow handles the smaller files successfully, but it gets stuck on one CSV file containing 250k rows (27 MB).

No error message is displayed; the workflow simply stops at the “Extract from CSV” node after some loading.

What is the error message (if any)?

No error message. The workflow just stops processing.

Please share your workflow

This is what the workflow logic looks like:

Share the output returned by the last node

Information on your n8n setup

  • n8n version: 1.70.3
  • Database: SQLite
  • n8n EXECUTIONS_PROCESS setting: own, main
  • Running n8n via: n8n cloud
  • Operating system: Windows

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Following up on this one; I would appreciate your help.

All the small files are processed, but the 27 MB file (the biggest one) isn't.

Hi @Dmytro_Rudnik, @Nym_Sirion,

This looks to be a memory issue, given the size of the file.
Could you share the time when you last encountered this error? I can then check the instance's logs to confirm whether a spike in memory usage is causing the execution failure.

The email you sent to the Support inbox is not from an email address linked to your instance. Would you be able to send me your instance username in a private message so I can check your logs? :slight_smile:

Meanwhile, I'm thinking the best workaround is to split the biggest data load into two smaller loads and see if that bypasses the issue; a sketch of how that could look is below.
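For example, a Code node placed before the “Extract from CSV” step could do the split. This is only a rough sketch: it assumes the file arrives as a binary property named `data`, that no quoted field contains a newline, and that the `getBinaryDataBuffer`/`prepareBinaryData` helpers are available in your n8n version (do check against your instance):

```javascript
// Code node, mode "Run Once for All Items".
// Splits one large CSV binary into two smaller CSVs on a row boundary,
// copying the header row into both halves. Assumes the binary property
// is named "data" and that no quoted field contains a newline.
const buffer = await this.helpers.getBinaryDataBuffer(0, 'data');
const lines = buffer
  .toString('utf8')
  .split('\n')
  .filter((l) => l.trim() !== '');

const header = lines.shift();
const mid = Math.ceil(lines.length / 2);
const halves = [lines.slice(0, mid), lines.slice(mid)];

const out = [];
for (let i = 0; i < halves.length; i++) {
  out.push({
    json: { part: i + 1, rowCount: halves[i].length },
    binary: {
      // prepareBinaryData turns a Buffer back into an n8n binary property
      data: await this.helpers.prepareBinaryData(
        Buffer.from([header, ...halves[i]].join('\n'), 'utf8'),
        `part-${i + 1}.csv`,
      ),
    },
  });
}
return out;
```

Each output item then carries one smaller CSV, which can be extracted and inserted into Postgres separately.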

I would also advise reading our guide on memory management for cloud instances: Cloud data management | n8n Docs

Let us know how you get on! :raised_hands:

Hey @mariana-na !

I encountered this today when processing 37 CSV files; the largest of them is 2 MB. Can you please check the logs and let us know whether this is really a memory issue?
If so, how do I adjust the workflow so it splits the files into smaller chunks automatically?
Thanks.

@Dmytro_Rudnik Sure thing! Could you please confirm in a PM what the username of your instance is? :slight_smile:

Hey @Dmytro_Rudnik, to process large CSV files, you can split them into smaller chunks using a method like this:
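As a minimal sketch (the chunk size of 5000 here is just a placeholder to tune against your instance's memory limits), a Code node set to “Run Once for All Items” can group the extracted rows into batches:

```javascript
// Code node, mode "Run Once for All Items".
// Groups the rows coming out of "Extract from CSV" into batches so the
// downstream Postgres insert handles one manageable chunk per item
// instead of all rows at once.
const CHUNK_SIZE = 5000; // placeholder: tune to your memory limits

const items = $input.all();
const batches = [];

for (let i = 0; i < items.length; i += CHUNK_SIZE) {
  batches.push({
    json: {
      // each output item carries one batch of row objects
      batch: items.slice(i, i + CHUNK_SIZE).map((item) => item.json),
    },
  });
}

return batches;
```

Each output item then carries one batch, which a downstream loop (e.g. a Loop Over Items node) can insert one at a time, keeping each write small.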

Hey! I have sent you a message via email! Thanks!

Got it!
I'll take a look as soon as possible and will get back to you on it :slight_smile:

Hi @Dmytro_Rudnik ,

I’m sorry for the wait.

Looking at the logs from Jan 8th, when you saw the issue come up again, we can see that your instance restarted a few times after exceeding its memory limits, after which it stabilised.

The instance hit another OOM event on the 17th and was stable up until today.
There was a restart of the instance today, but that looks to have been intentional, judging by the logs.

Have you meanwhile implemented the change of splitting the CSV files into smaller chunks? If so, have you been seeing the same behaviour regardless?

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.