Do you know how to troubleshoot the problem? I have tried setting up the Error Workflow, but it does not trigger, and the workflow does process a huge amount of JSON data (each batch generates about a 10MB JSON file to insert into MySQL):
I played around with both “Webhook” and “Execute Workflow”; the unexpected error still sometimes appears when I use “Execute Workflow” - maybe due to the same memory limit - so I switched to Webhook. It has been running well (using Webhook), synchronizing between the API and MySQL for reporting purposes.
The EPIPE problem seems to go away after I increased max connections from 150 to 1000 and the max idle connection time to 30s. Looking forward to the commit for the MySQL node in the next version (to close the connections after activity).
Loving this product, as it makes aggregating data between systems, and acting as a middleware/translator for reporting, much easier than my previous approach using Excel and VBA.
Ah, very strange that it still crashes with “Execute Workflow”. Are you making sure that huge amounts of data do not get returned by the last node of the called workflow? If the last node does not return only very little data (for example by overwriting it with a Set node), all the data ends up back in the main workflow, and we are almost back where we started.
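To make the idea concrete, here is a minimal sketch of what the final node of the called sub-workflow could return instead of the full dataset - for example via a Function node instead of a Set node. The field names (`success`, `processedCount`) are illustrative assumptions, not something prescribed by n8n:

```javascript
// Final node of the called sub-workflow: discard the bulky payload and
// return only a tiny summary item, so the parent workflow that used
// "Execute Workflow" does not inherit all the data back into memory.
// (Illustrative Function-node-style code; field names are assumptions.)
function summarize(items) {
  return [
    {
      json: {
        success: true,          // hypothetical status flag
        processedCount: items.length, // how many items were handled
      },
    },
  ];
}

// Example: three heavy items collapse into one small summary item.
const heavyItems = [
  { json: { payload: 'x'.repeat(1000) } },
  { json: { payload: 'y'.repeat(1000) } },
  { json: { payload: 'z'.repeat(1000) } },
];
const result = summarize(heavyItems);
```

The key point is that only `result` crosses back into the main workflow, so its size, not the size of the data processed inside the sub-workflow, is what the parent has to hold in memory.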
So it really does seem to be the connection issue. I will release the new version later today, which should fix that problem properly.
Great to hear that you enjoy using n8n. Everything is still early, so hopefully, we can make many things easier and more stable in the future!
I have a similar issue: a webhook array payload comes in with about 35MB of JSON data, and I am just trying to split it into items and process them. Using Docker, I can see it is not out of memory or CPU, but the flow dies. Any ideas on how I can process this? I am unable to reduce the size of the incoming JSON webhook payload.
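One common workaround when the incoming payload cannot be shrunk is to split the array into fixed-size chunks right after the webhook and then process each chunk separately (for example with a SplitInBatches node, or by calling a sub-workflow per chunk). A minimal sketch of the chunking step, where `chunkSize` is an assumed tuning parameter you would adjust to your memory budget:

```javascript
// Split one huge incoming array into fixed-size chunks so that each
// chunk can be handled on its own instead of all 35MB at once.
// chunkSize is an assumption to be tuned against available memory.
function chunk(records, chunkSize) {
  const chunks = [];
  for (let i = 0; i < records.length; i += chunkSize) {
    chunks.push(records.slice(i, i + chunkSize));
  }
  return chunks;
}

// Example with small stand-in data: 10 records in chunks of 3.
const records = Array.from({ length: 10 }, (_, i) => ({ id: i }));
const batches = chunk(records, 3);
```

Each element of `batches` can then be passed on as its own item, so downstream nodes only ever see one slice of the payload at a time.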