The problem is that every time I launch my workflow that processes large amounts of data, the browser shows an error page while N8N is still running in the background.
I’ve tested this with Firefox, but the result is the same. After refreshing the page I could no longer see the N8N interface in the browser. Sometimes the workflow runs to completion, sometimes not; what’s certain is that the process keeps running on the server.
Short processes that don’t have to manage voluminous data execute without any problem.
In fact, thinking I had hit an N8N node limit, I managed to split the data into batches, but the client-side browser display problem remains.
Here is my dataset workflow that has the display problem.
The process consists of reading a file in the third-party database (Baserow, on the same local server as N8N, in Docker), then inserting the data row by row and handling many rows of data.
It looks like you might need to break the workflow down into smaller chunks and use subworkflows to handle some of the data. There is also no trigger on your workflow, so if you are loading a lot of data the browser is likely to crash once you go over the browser’s memory limit. I suspect you are hitting both limits.
Thanks for your feedback,
What is the second limit other than browser memory, please?
Yes, eventually there will be a trigger that runs every day; I can add that and also reduce my nodes.
So suppose I divide the workflow into several sub-workflows, retest, and watch the outputs in the logs to track them. How can I debug the data node by node? And also, how do I increase memory in the N8N environment (Docker file)?
I found NODE_OPTIONS="--max-old-space-size=4096" n8n start, but how do I set that at the docker-compose file level?
The second limit, and the one you are probably hitting in the workflow, is n8n not having enough memory on the server. This can be tweaked using the option you found, as long as the server has the memory available, but rather than doing that it might be better to tweak the workflow first so you can avoid needing the extra memory.
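For the docker-compose part of the question, here is a minimal sketch of how that option could be set at the compose level. The service name, image tag, and port mapping are assumptions; adapt them to your existing file:

```yaml
# Minimal docker-compose sketch (service name, image tag and port are
# assumptions; adapt to your existing compose file).
services:
  n8n:
    image: n8nio/n8n
    ports:
      - "5678:5678"
    environment:
      # Raise the Node.js heap limit for the n8n process to ~4 GB.
      # The host/container must actually have that much memory free.
      - NODE_OPTIONS=--max-old-space-size=4096
```

After editing the file, running `docker-compose up -d` recreates the container with the new environment variable applied.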
Annoyingly, though, we don’t have a way to see the data size while building the workflow, but if you break it down into batches you will, in theory, have a batch size you can tweak as needed.
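To illustrate where that knob lives: this is roughly what a Split In Batches node looks like in an exported workflow JSON (the values here are placeholders; `batchSize` is the number you would tweak):

```json
{
  "parameters": {
    "batchSize": 200,
    "options": {}
  },
  "name": "Split In Batches",
  "type": "n8n-nodes-base.splitInBatches",
  "typeVersion": 1,
  "position": [450, 300]
}
```

Feeding each batch into a sub-workflow via an Execute Workflow node means only one batch’s data is held in memory at a time, which also keeps the editor UI from trying to render the whole dataset at once.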