N8n execution stopped – may have run out of memory

Describe the problem/error/question

We encountered an error in n8n while processing a binary file in a Code node.
The execution stops with the following message:
“Execution stopped at this node – n8n may have run out of memory while running this execution.”
In our workflow, we are converting a binary file (PDF) to a Base64 string using JavaScript in the Code node.
Example logic used:
  • Get the binary data from the item
  • Convert it to Base64 using binaryData.toString('base64')
  • Store the result in items[i].json.pdfBase64
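For reference, a standalone Node.js sketch of the conversion described above (the 3 MB buffer here simulates the PDF binary; in n8n the buffer would come from the item's binary property):

```javascript
// Stand-in for a 3 MB PDF; in n8n this buffer would come from the item.
const binaryData = Buffer.alloc(3 * 1024 * 1024, 0x41);

// Base64 output is ~4/3 the size of the input, and V8 stores strings
// as UTF-16, so the resulting string can occupy roughly 8/3 of the
// original bytes -- while the original buffer is still held in memory.
const pdfBase64 = binaryData.toString('base64');

console.log(binaryData.length); // 3145728 bytes in
console.log(pdfBase64.length);  // 4194304 characters out (4/3 of the input)
```

This size inflation, multiplied across items and kept alive in the execution data, is a common cause of the out-of-memory stop with larger files.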
However, when running the workflow, the execution stops with a memory error.
We would like to understand:
  • Whether converting binary data to Base64 in this way can cause memory issues in n8n
  • Whether there is a recommended approach for handling large binary files
  • Whether there is a configuration option or limit related to memory usage
Any suggestions or best practices would be appreciated.

What is the error message (if any)?

Please share your workflow


Share the output returned by the last node

Information on your n8n setup

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hi @fapdeveloper

Yes, converting binary data to Base64 inside a Code node can cause memory issues, especially with larger files. Instead, I usually recommend avoiding the manual conversion and letting n8n handle the binary data where possible. Many nodes accept binary data directly, so you can often pass the file through without converting it.

If Base64 is strictly required (for an API, for example), another option is to process the file outside the Code node or ensure the file size stays within reasonable limits.

If you can share the approximate file size and how the Base64 string is used afterwards, that would help us suggest a more specific approach.

Hi @fapdeveloper Welcome!
If that is a self-hosted instance, try setting the environment variable N8N_DEFAULT_BINARY_DATA_MODE=filesystem to store binary data on disk instead of in RAM, which is much more reliable with large binary files. You can also increase the Node.js heap with NODE_OPTIONS=--max-old-space-size=2048. If you are a cloud user, consider upgrading from your current plan to a higher one so that this does not happen.