Reading big CSV on self-hosted business plan: Invalid string length

Thanks for your response!

I think there is some misunderstanding here, maybe because I did not share my workflow. I am posting it below now.

Based on what I have analyzed so far, I assume the problem lies in finishing the workflow, not in a memory leak along the way. I am already reading in chunks.

As written in the original question, I use the linked approach Convert Binary file To JSON or CSV - #5 by MutedJam, which uses `sed -n 'start,end p'` in the Execute Command node to read the file in chunks.

I am pretty sure that this technique does not use much memory in the n8n node process, only for the actual chunk of data (see the sketch below).
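To illustrate, this is roughly the Node.js equivalent of what one Execute Command step does (the file path, field names, and chunk size here are made up for the example, not taken from my actual workflow):

```js
// Rough equivalent of one chunked read (illustrative path and sizes):
// sed prints only the lines of the current window, so only the current
// chunk ever enters the process, not the whole 189 MB file.
const { execSync } = require("child_process");

const start = 1;                      // first line of this chunk (1-based)
const linesPerChunk = 1000;
const end = start + linesPerChunk - 1;

// sed -n 'START,ENDp' prints lines START..END and nothing else
const chunk = execSync(`sed -n '${start},${end}p' /data/import.csv`)
  .toString("utf8");

console.log(chunk.split("\n").length, "lines,", chunk.length, "bytes");
```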

I tried reading chunks of 10,000 and 1,000 lines, and a single chunk seems to fit comfortably in n8n's memory. I use such fairly large chunks because I also want to pass the data to the Shopware 6 sync API in chunks rather than line by line. The lines are pretty short: one line is ~64 bytes, with 2,920,269 lines in total (full size 189 MB), so a 1,000-line chunk is only about 65 KB.

As you can see in the workflow, I also added progress logging that prints the current progress to the docker console (when executed via the webhook for testing instead of a manual test run).
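The logging node is essentially just this (a simplified sketch of my "Log current status" Code node; the field name `start` is simplified for the example):

```js
// Simplified sketch of the "Log current status" Code node:
// console.log goes to the container's stdout, so the progress
// shows up in the docker console.
const { start } = $input.first().json; // first line of the current chunk
const totalLines = 2920269;            // known size of the CSV
const percent = ((start / totalLines) * 100).toFixed(1);
console.log(`Read up to line ${start} of ${totalLines} (${percent}%)`);
return $input.all();                   // pass all items through unchanged
```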

There I can see that the batching runs fine until the full file is read (up to line 2921001), as printed in the original log. (I renamed the node from the earlier "Code1" to "Log current status".)

So I think reading the file actually works. But after everything has been read, I get the above error message and I don't know why.

I think it has something to do with the finalization of the workflow, because if it were a memory problem during the reading process, it would be a big coincidence that it only happens after everything has been read.
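For what it's worth, "Invalid string length" is the RangeError that Node.js/V8 throws when a string grows past the engine's maximum string length. That would fit my suspicion: if n8n serializes the accumulated execution data into one big string at the end of the run, the limit would only be hit after everything was read, not during the chunked reads. A minimal plain-Node.js sketch (nothing from my workflow) that reproduces the same message:

```js
// Keep growing a string until it passes V8's maximum string length.
// The error message is exactly the one I see when the workflow finishes.
const chunk = "x".repeat(64 * 1024 * 1024); // 64 MB per step
let all = "";
try {
  while (true) all += chunk; // eventually exceeds the engine limit
} catch (err) {
  console.log(err.name, err.message); // RangeError Invalid string length
}
```

If the full 189 MB of items (plus JSON overhead) gets stringified when the execution is saved, crossing that limit only at the very end seems plausible to me.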