Best Practice for large files

Describe the problem/error/question

I am working with large images, downloading and editing them, already inside a loop to batch the work. We quickly run into memory issues. I also tried the cloud version, but there it happened even sooner; we had no chance of processing the images.

Therefore I have a few questions:

  1. Are there best practices for dealing with large files (50 MB - 100 MB), both in the cloud and self-hosted?
  2. Are there options or limits that can be set in the cloud version to improve editing of such files?
  3. Are these issues fixed in the enterprise version of n8n?
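Not an official answer, but for the self-hosted version a common first step is raising the Node.js heap limit, since n8n runs on Node and inherits its default memory cap. A minimal sketch for a Docker setup (the 4096 MB value is just an example; tune it to your host):

```shell
# Give the n8n container more heap room (value in MB; adjust to the host).
# NODE_OPTIONS is read by Node.js itself, so the same flag works for npm installs.
docker run -it --rm \
  -p 5678:5678 \
  -e NODE_OPTIONS="--max-old-space-size=4096" \
  -v n8n_data:/home/node/.n8n \
  docker.n8n.io/n8nio/n8n
```

This only raises the ceiling; for 50-100 MB images processed in a loop you usually also want to avoid holding all items in memory at once, e.g. by processing in smaller batches.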

What is the error message (if any)?

Ran out of memory

Information on your n8n setup

  • n8n version:
    • Community and Cloud
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
    • We tried both versions

Hey @Kiven, hope all is good.

The resource limitations for the cloud per plan are mentioned here:

And here you can read about scaling and performance for the self-hosted version with regard to handling binary data:
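For reference, the binary-data handling discussed there can be sketched like this (assuming a Docker setup; `N8N_DEFAULT_BINARY_DATA_MODE` is the environment variable n8n documents for this):

```shell
# Store binary data on disk instead of in memory, so large files
# passed between nodes don't inflate the Node.js heap.
docker run -it --rm \
  -p 5678:5678 \
  -e N8N_DEFAULT_BINARY_DATA_MODE=filesystem \
  -v n8n_data:/home/node/.n8n \
  docker.n8n.io/n8nio/n8n
```

The default mode keeps binary data in memory, which is the behavior that bites hardest with 50-100 MB images in a loop.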

Hi Jabbson,
Thanks for the resources, I will check them out!


This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.