Memory / Performance issues

How do you all handle workflows with a lot of data?

I have one that keeps crashing, apparently because of memory issues, even after I split it into 5 sub-workflows so each run handles only part of the data.

How can I maximize performance here?
I can share the workflow if it helps.

What I tried:

  • I installed n8n locally on my Mac via Docker.
  • My machine (i9, 16 GB RAM) should be powerful enough, but it's still laggy and the workflow doesn't complete.
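Not from this thread, but one thing worth checking in a Docker-on-Mac setup: Node.js has its own heap cap independent of how much RAM the machine has, and Docker Desktop additionally limits how much memory its VM gets (Settings → Resources). A minimal docker-compose sketch raising the Node heap limit; the service name and the 4096 MB figure are assumptions, tune them to your setup:

```yaml
# Hypothetical docker-compose fragment; names and numbers are assumptions.
services:
  n8n:
    image: n8nio/n8n
    ports:
      - "5678:5678"
    environment:
      # Raise the Node.js (V8) old-space heap cap, in MB.
      - NODE_OPTIONS=--max-old-space-size=4096
```

Even with this flag, the container can only use what Docker Desktop's own memory limit allows, so check that setting too.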

thanks !!!

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hello @tom-aera,

It would be better to share the workflow. It's hard to say what the issue is otherwise.

@tom-aera Could you show me the workflow? It sounds like the code may need optimizing.
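Until the workflow is shared, one general memory-saving pattern in n8n is to strip large fields from items as early as possible, e.g. in a Code node, so downstream nodes don't carry the full payload through every execution step. A minimal sketch; the function and field names (`id`, `status`, `rawHtml`) are hypothetical, not from this thread:

```javascript
// Hypothetical helper: keep only the fields the next nodes actually
// need, dropping heavy ones (raw HTML, file contents, etc.).
// n8n items are objects shaped like { json: { ... } }.
function slimItems(items) {
  return items.map((item) => ({
    json: {
      id: item.json.id,
      status: item.json.status,
      // heavy fields such as item.json.rawHtml are intentionally dropped
    },
  }));
}
```

Inside an n8n Code node this would be used as `return slimItems($input.all());` — the earlier in the workflow the trimming happens, the less each sub-workflow has to hold in memory.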

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.