We have a workflow that processes 20 CSV files and uploads them into Postgres. The workflow processes the other files successfully, but it gets stuck when handling one CSV file with 250k rows (about 27 MB).
No error message is displayed; the workflow simply stops at the “Extract from CSV” node after some loading.
What is the error message (if any)?
No error message. The workflow just stops processing.
This looks to be a memory issue given the size of the file.
Would you be able to share the time you last encountered this error? I can then check the instance’s logs to confirm whether a spike in memory usage is leading to the execution failure.
The email you sent to the Support inbox is not from an email address linked to your instance. Could you send me your instance username in a private message so I can check your logs?
Meanwhile, I’m thinking the best workaround is to split the biggest data load into two smaller loads and see if that bypasses the issue; there is a rough sketch of one way to do that below.
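Purely as an illustration of that workaround, here is a minimal sketch, assuming you can run a small Node.js/TypeScript script over the file before the workflow picks it up; the file names are placeholders and the split point is simply the halfway row:

```ts
// split-csv.ts - rough sketch: split one large CSV into two smaller CSVs,
// repeating the header row in both halves. File names are placeholders.
import { promises as fs } from "fs";

async function splitCsv(inputPath: string, outPathA: string, outPathB: string): Promise<void> {
  const raw = await fs.readFile(inputPath, "utf8");
  const lines = raw.split(/\r?\n/).filter((line) => line.length > 0);

  const header = lines[0];
  const rows = lines.slice(1);
  const mid = Math.ceil(rows.length / 2);

  // Each half keeps the header plus roughly half of the data rows.
  await fs.writeFile(outPathA, [header, ...rows.slice(0, mid)].join("\n") + "\n", "utf8");
  await fs.writeFile(outPathB, [header, ...rows.slice(mid)].join("\n") + "\n", "utf8");
}

splitCsv("big-file.csv", "big-file-part1.csv", "big-file-part2.csv")
  .then(() => console.log("done"))
  .catch((err) => {
    console.error(err);
    process.exit(1);
  });
```

Note that this is a naive line-based split, so it assumes no quoted fields contain embedded newlines; for much larger files, a streaming CSV parser would be the safer route.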
I have encountered this today when processing 37 CSV files; the largest of them is 2 MB. Can you please check the logs and let us know whether this really is a memory issue?
If so, how do I adjust the workflow so that it automatically splits the files into smaller chunks?
Thanks.
Looking at the logs from Jan 8th, when you saw the issue come up again, we can see that your instance restarted a few times after exceeding its memory limit, after which it stabilised.
The instance hit another out-of-memory (OOM) event on the 17th and was stable up until today.
The instance also restarted today, but judging by the logs that restart looks to have been intentional.
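On your question about splitting the files into smaller chunks automatically: the usual approach in n8n is to place a Split In Batches / Loop Over Items node between the CSV extraction and the Postgres insert, so only a limited number of rows is handled per iteration. Purely as an illustration of that batching idea (not your workflow’s actual nodes), here is a minimal TypeScript sketch; the batch size of 5,000 and the insertBatch callback are placeholder assumptions:

```ts
// batch-load.ts - rough sketch of the batching idea: break a large set of rows
// into fixed-size chunks so each database insert stays small.

function chunk<T>(rows: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < rows.length; i += size) {
    batches.push(rows.slice(i, i + size));
  }
  return batches;
}

async function loadInBatches<T>(
  rows: T[],
  size: number,
  insertBatch: (batch: T[]) => Promise<void>,
): Promise<void> {
  for (const batch of chunk(rows, size)) {
    // Insert one small batch at a time instead of all 250k rows at once.
    await insertBatch(batch);
  }
}

// Stand-in usage: replace the callback with a real Postgres insert.
async function main(): Promise<void> {
  const rows = Array.from({ length: 250_000 }, (_, i) => ({ id: i }));
  await loadInBatches(rows, 5_000, async (batch) => {
    console.log(`would insert ${batch.length} rows`);
  });
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

The point of the batching is simply to cap how many rows are held in memory at any one time, which is what the OOM events in your logs point to.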