I have a Cloud Pro instance running a workflow that periodically pulls a fairly large CSV (13 MB) (link) and needs to compare this data to a dataset in MySQL. However, it runs out of memory when fetching the CSV.
My self-hosted instance of n8n doesn’t have this issue.
Because I do not own the source CSV (but I do have permission to use it for this particular case), I cannot split it into smaller chunks at the source.
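For context, what I’d ideally like to achieve is something like the chunked approach below, so the whole file never sits in memory at once. This is just a rough Python sketch of the idea, not an n8n workflow; the batch size and column names are made up:

```python
import csv
import io

def iter_batches(text_stream, batch_size=500):
    """Yield lists of CSV rows (as dicts) so only one batch
    is held in memory at a time."""
    reader = csv.DictReader(text_stream)
    batch = []
    for row in reader:
        batch.append(row)
        if len(batch) >= batch_size:
            yield batch
            batch = []
    if batch:
        yield batch

# Simulated download; in the real workflow this would be a
# streamed HTTP response body rather than an in-memory string.
sample = "id,value\n" + "\n".join(f"{i},{i * 2}" for i in range(1000))
batches = list(iter_batches(io.StringIO(sample), batch_size=400))
print([len(b) for b in batches])  # → [400, 400, 200]
```

Each batch could then be compared against the MySQL dataset with a query per batch instead of loading everything up front.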
How can we handle this issue? Right now it is blocking our planned migration to the cloud version.
Thanks in advance for your response.