Handling High-Volume Excel Files

Hello,

I’m attempting to bring in ~50k records from multiple Excel files. If these files can only be imported (e.g., no API available), what version and tier would be best suited for this? I have not been able to get this to work in the cloud version or in the community version installed via npm. The “Extract from File” node returns no output with 50k records, but it returns the expected output with 6 records. What would the recommendation be here?

Thanks,
Colin

I’m not sure this answers your initial question exactly, but we have worked with a client that needed to handle thousands of (custom) CSVs with millions of entries. A lot of the nodes would just crash or hang if the workflow wasn’t structured efficiently, but available system resources were the first problem.

For immediate resources:
One option is to upgrade to the Pro plan, which provides more system resources (RAM/CPU). The other option (recommended) is to self-host with Docker/Docker Compose on your own server, where you can allocate as much RAM as you’d like; see the sketch below.
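
As a rough illustration (not from this thread), a minimal `docker-compose.yml` for a self-hosted n8n with an explicit memory cap and a larger Node.js heap might look like this; the specific limits and the filesystem binary-data setting are assumptions you’d tune to your server:

```yaml
services:
  n8n:
    image: n8nio/n8n            # official n8n image
    ports:
      - "5678:5678"
    environment:
      # Raise the Node.js heap so large files don't exhaust memory
      # (8 GB is a placeholder; size it to your host)
      - NODE_OPTIONS=--max-old-space-size=8192
      # Keep uploaded binary data on disk instead of in memory
      - N8N_DEFAULT_BINARY_DATA_MODE=filesystem
    mem_limit: 8g               # container memory cap; raise as needed
    volumes:
      - n8n_data:/home/node/.n8n
volumes:
  n8n_data:
```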

The other help is to design the scenario to work in smaller, more efficient chunks where possible, processing smaller subsets of the data at a time; see the sketch after this paragraph.
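
In n8n the usual way to do this is the “Loop Over Items (Split in Batches)” node, but as a hedged sketch, a Code node (“Run Once for All Items”) can also group incoming rows into fixed-size batches so downstream nodes only ever see a small subset. The batch size of 1000 is an assumption; tune it to your data and available RAM:

```js
// n8n Code node ("Run Once for All Items"):
// split the incoming items into fixed-size batches.
const batchSize = 1000; // assumption: adjust to your data and RAM
const items = $input.all();
const out = [];
for (let i = 0; i < items.length; i += batchSize) {
  out.push({
    json: {
      batchIndex: i / batchSize,
      rows: items.slice(i, i + batchSize).map((item) => item.json),
    },
  });
}
return out;
```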


I was able to move this to Docker. In doing so, I realized there was also an issue with the file name that kept n8n from reading the file. Once the file name was updated, this worked like a charm.
