Good day, good community.
My workflow has recently been throwing errors about memory capacity:
Problem in node 'Item Lists'
There might not be enough memory to finish the execution. Tips for avoiding this here
I know my data is huge. I wish I could split the queries or reduce the amount of data each one returns, but my API provider does not allow it.
I'm using n8n Cloud, and I wonder whether switching to a self-hosted version on AWS would let me give each workflow more memory and solve this issue.
In theory, if you are self-hosting n8n it will use as much memory as you give it. Our cloud plans have a limited amount of memory, but if you are comfortable managing a server you could set up an n8n instance on a $24-a-month VPS with 4 GB of memory to play with.
It is worth noting, though, that we consider self-hosting an advanced option, and outside of the application itself our support can be fairly limited.
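For what it's worth, on a self-hosted instance the practical memory ceiling is the Node.js heap limit, which you can raise via the standard `NODE_OPTIONS` environment variable. A minimal sketch using the official Docker image, assuming a 4 GB VPS (the 3072 MB value is an illustrative choice that leaves headroom for the OS, not an official recommendation):

```shell
# Run the official n8n image with a raised Node.js heap limit.
# --max-old-space-size is in megabytes; 3072 MB is an example
# value for a 4 GB host, leaving ~1 GB for the OS and Docker.
docker run -d --name n8n \
  -p 5678:5678 \
  -v n8n_data:/home/node/.n8n \
  -e NODE_OPTIONS="--max-old-space-size=3072" \
  docker.n8n.io/n8nio/n8n
```

Without the flag, Node.js picks its own default heap limit, which on small machines can be well below the RAM you are paying for.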
This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.