Processing CSV Files with More than 100k Records in Batches

One thing I wanted to highlight: I said upgrading or self-hosting was your fastest solution, not your only one. I was able to run your workflow with just a minor tweak on a Pro tier cloud plan. That said, if you'd prefer a different approach, there are alternative paths for preprocessing your file so that you would never encounter a memory warning, even on our lower-tier paid plans.
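As a rough illustration of that preprocessing idea, here is a minimal sketch in Python using only the standard library: it streams the CSV and yields it in fixed-size batches, so only one batch is ever held in memory. The file name, batch size, and function name are all illustrative, not part of any platform API:

```python
import csv
from itertools import islice
from pathlib import Path

def read_in_batches(path, batch_size=500):
    """Yield lists of up to batch_size rows from a CSV file.

    The file is streamed row by row, so memory usage stays
    bounded by batch_size regardless of the file's total length.
    """
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        while True:
            batch = list(islice(reader, batch_size))
            if not batch:
                return
            yield batch

# Hypothetical demo: write a small sample file and count the batches.
sample = Path("sample.csv")
sample.write_text("id,name\n" + "\n".join(f"{i},row{i}" for i in range(1200)))
batch_sizes = [len(b) for b in read_in_batches(sample, batch_size=500)]
print(batch_sizes)  # 1200 rows in batches of 500 -> [500, 500, 200]
```

Each batch could then be sent to the workflow (or written out as a smaller file) before the next one is read, which is what keeps the peak memory flat.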

Community members have shared a number of these approaches; you can find them by searching the forum for CSV-related issues.

Regarding your memory-related questions, I encourage you to read through these articles.

I hope this helps!