Can n8n handle large data sets?

I want to build an automation for a supply-chain company. They deal with large amounts of data, so I wanted to ask: can n8n handle large data sets without throwing an error?


Yes, as long as the workflow is designed properly. :slight_smile:
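To make "proper workflow design" concrete: the usual pattern for large data sets is to process records in fixed-size chunks rather than loading everything at once (in n8n this is what the Loop Over Items / Split In Batches node does). A minimal Python sketch of that idea, with simulated supply-chain rows as a stand-in for real data:

```python
def iter_batches(records, batch_size):
    """Yield successive slices of `records`, each at most `batch_size` long."""
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]

# Hypothetical example: 10,000 simulated rows processed 500 at a time,
# so each step only holds one batch instead of the full data set.
rows = [{"sku": i, "qty": i % 7} for i in range(10_000)]

processed = 0
for batch in iter_batches(rows, 500):
    # ...per batch: call an API, write to a database, etc...
    processed += len(batch)

print(processed)  # → 10000
```

The same principle applies inside n8n: keep per-execution item counts bounded, and the memory footprint stays bounded too.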

Hi @abwahab,

Forgive me, but your question is too generic.

Try to gather and provide as much information as relevant to the question.

For instance:

  • How much data would n8n have to process (number of records, sizes)?
  • What would the source be: binary files or an API?
  • Are you already running the workflow(s) and hitting issues processing the data?

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.