Ran out of space in n8n cloud - how to clear out temp files?

I ran out of space in n8n Cloud and I don’t know how to clear the temporary file directory. My workflow is now failing entirely as my instance must download a file.

The error message is “ENOSPC: no space left on device, write at write”

My workflow is downloading many video files to move them from Gong to Google Drive. Once they are uploaded to Google Drive, they can be removed from n8n but I’m not sure how to do that.

Now I’m out of space and the workflow will not run and I’m stuck.

Has anyone encountered this?

I did write to n8n support but I’m outside of their support hours given it’s the weekend.

Debug info

core

  • n8nVersion: 1.72.1
  • platform: npm
  • nodeJsVersion: 20.18.0
  • database: sqlite
  • executionMode: regular
  • concurrency: 20
  • license: community
  • consumerId: 00000000-0000-0000-0000-000000000000

storage

  • success: all
  • error: all
  • progress: false
  • manual: true
  • binaryMode: filesystem

pruning

  • enabled: true
  • maxAge: 720 hours
  • maxCount: 25000 executions

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

I manually deleted some past executions that downloaded files and that has helped. I’m not sure how to monitor storage usage in n8n to avoid this happening again for a large batch run. It makes the workflows more “fragile”.
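For anyone self-hosting rather than on Cloud, disk usage can be watched with standard tools; a minimal sketch, assuming the default `~/.n8n` data directory with binary data stored on the filesystem (adjust the path if `N8N_USER_FOLDER` is set):

```shell
# Where n8n keeps its data by default; override via N8N_USER_FOLDER
N8N_DIR="${N8N_USER_FOLDER:-$HOME/.n8n}"

# Size of stored binary execution data (downloaded files end up here
# when binaryMode is "filesystem")
du -sh "$N8N_DIR/binaryData" 2>/dev/null || echo "no binaryData directory at $N8N_DIR"

# Free space remaining on the volume n8n writes to
df -h "$N8N_DIR" 2>/dev/null || df -h "$HOME"
```

On n8n Cloud there is no shell access, so the practical equivalent is keeping execution data saving turned off for heavy workflows, as suggested below in this thread.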

Any suggestions are appreciated.

Welcome to the community @pdame !

Tip for sharing information

Pasting your n8n workflow


Make sure to copy your n8n workflow and paste it inside a code block, that is, between a pair of triple backticks. You can also click </> (preformatted text) in the editor and paste your workflow there.

```
<your workflow>
```

The same applies to any JSON output you would like to share with us.

Make sure that you have removed any sensitive information from your workflow and include dummy or pinned data with it!


I can see you got a response from the Support team already. To avoid this issue, you can stop saving execution data; this is controlled in the workflow settings.
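In the exported workflow JSON, those options live under a `settings` key. A sketch of what turning off success-run saving might look like (key names as I understand the current workflow settings; double-check against your own export):

```json
{
  "settings": {
    "saveDataSuccessExecution": "none",
    "saveDataErrorExecution": "all",
    "saveManualExecutions": false,
    "saveExecutionProgress": false
  }
}
```

Keeping error executions while discarding successful ones is usually a good trade-off: failed runs stay available for debugging, but the bulky binary data from successful batch runs is not retained.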

Also note that downloading large files affects memory and might cause the download itself to fail. You might consider upgrading your account to the Pro-2 plan, which should double your workspace memory, as per Cloud data management | n8n Docs.

Also bear in mind that saving binary files to disk (if you do that) was originally not allowed in n8n Cloud, precisely to avoid problems of this nature; it was enabled unintentionally and now has to stay. You might consider self-hosting n8n to get around the n8n Cloud storage limits.
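If you do go the self-hosting route, execution pruning and binary-data handling are controlled with environment variables. A sketch (variable names as documented for recent n8n versions; the values shown are illustrative, not recommendations):

```shell
# Prune old execution data automatically
EXECUTIONS_DATA_PRUNE=true
EXECUTIONS_DATA_MAX_AGE=168            # hours; keep roughly 7 days of executions
EXECUTIONS_DATA_PRUNE_MAX_COUNT=10000  # cap the number of stored executions

# Instance-wide defaults for what gets saved (workflow settings can override)
EXECUTIONS_DATA_SAVE_ON_SUCCESS=none
EXECUTIONS_DATA_SAVE_ON_ERROR=all

# Store binary data on the filesystem instead of in the database
N8N_DEFAULT_BINARY_DATA_MODE=filesystem
```

With SQLite (as in your debug info), note that the database file may not shrink immediately after pruning; the freed space is reused internally.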

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.