Running out of memory on cloud pro instance

Describe the problem/error/question

Hi there,

I have a Cloud Pro instance running a workflow that periodically pulls a fairly large CSV (13MB) (link) and needs to compare this data to a dataset in MySQL. However, it runs out of memory upon fetching the CSV.

My self-hosted instance of n8n doesn’t have this issue.

Because I do not own the source CSV (but I do have permission to use it for this particular case), I cannot really split it into smaller chunks.

How can we handle this particular issue? At the moment it is blocking our planned migration to the cloud version.

Thanks in advance for your response.

Hi @Jelle_de_Rijke :wave:

Sorry that you’re running into this! I’m not too sure which version of our Pro offering you’re on, but it’s worth noting here that the second tier of Pro has a higher memory capacity than the first tier.

I understand that you can’t split up your file. If this file feeds into a large workflow, can you split the workflow up into sub-workflows?

