Creating a large CSV file from an SQL query. Help please!

Describe the problem/error/question

Hey guys - I can’t figure this out and wonder if someone has already solved it.

I am querying an SQL DB, and some queries return a large number of rows (15k–20k) with many columns.

I need to turn this into a CSV file for use later in the flow.

Initially I have used:
SQL query > Convert to CSV
It’s been fine for smaller numbers of rows (a few thousand).

For larger numbers of rows, the data is too large for the process to handle, and it errors/crashes.

Does anyone know how I can handle this in a different way?

My initial thought is to loop the process to extract batches of rows, create a CSV for each batch, and then join all the CSVs at the end.

Loose example below (rough workflow sketch):
I understand that, with this setup, each file created would not carry over to the ‘done’ branch of the loop.
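Something like the sketch below is the shape I have in mind, expressed outside n8n. It's a minimal Python example, assuming a generic DB-API driver (sqlite3 here) and a hypothetical `big_table` / `export.csv`; the point is that only one batch of rows is ever held in memory, because each batch is appended to the same open file:

```python
import csv
import sqlite3

BATCH_SIZE = 1000  # rows fetched per round trip; tune to available memory

# Assumed connection and query - substitute your own driver and SQL.
conn = sqlite3.connect("example.db")
cur = conn.cursor()
cur.execute("SELECT * FROM big_table")

with open("export.csv", "w", newline="") as f:
    writer = csv.writer(f)
    # Write the header once, taken from the cursor's column metadata.
    writer.writerow(col[0] for col in cur.description)

    while True:
        rows = cur.fetchmany(BATCH_SIZE)  # at most BATCH_SIZE rows in memory
        if not rows:
            break
        writer.writerows(rows)

conn.close()
```

In n8n terms that would be a Loop Over Items (Split in Batches) loop where each pass appends its rows to the same file, rather than producing separate CSVs to merge at the end - assuming the file-writing node supports appending.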

Anyone done something similar? Is there any other way to handle this?

Thanks all,
Rob

It looks like your topic is missing some important information. Could you provide the following if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hi @robsim,

Batching and looping is usually a good solution - did you manage to get this to work?

The workflow probably crashes because there isn't enough memory to handle all the data. One way to avoid this is to increase available memory: upgrade to a higher plan if you're on Cloud, or, if you're self-hosting, adjust the resources allocated to your server or container.
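If you go the self-hosting route, the two knobs are usually the container's memory limit and the Node.js heap ceiling (n8n runs on Node.js, whose heap cap can sit below what the container allows). A rough Docker Compose sketch, assuming a ~4 GB budget - adjust the numbers to your server:

```yaml
services:
  n8n:
    image: n8nio/n8n
    environment:
      # Raise the Node.js heap ceiling (value in MB).
      - NODE_OPTIONS=--max-old-space-size=3072
    deploy:
      resources:
        limits:
          memory: 4096M  # container cap; leave headroom above the heap
```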

Hope that helps!

Hi there Aya - no luck yet, but thanks for your message.

It’s self-hosted, so we could increase resources that way, but I was wondering if there’s a way to handle it in smaller chunks.

Will keep at it.


This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.