Describe the problem/error/question
What is the error message (if any)?
Is there a storage accessible that we can store files generated from the workflow within n8n ?
Please share your workflow
(Select the nodes on your canvas and use the keyboard shortcuts CMD+C/CTRL+C and CMD+V/CTRL+V to copy and paste the workflow.)
Share the output returned by the last node
Information on your n8n setup
- n8n version: [email protected]
- Database (default: SQLite):
- n8n EXECUTIONS_PROCESS setting (default: own, main):
- Running n8n via (Docker, npm, n8n cloud, desktop app): n8n cloud
- Operating system:
Hi @siboniso_makhaye
Yes, n8n does have storage options for files generated inside workflows, but it’s important to understand how binary data is handled:
- By default, any binary data (e.g. images, PDFs, CSVs) that flows through a workflow is written to the local filesystem:
`~/.n8n/tmp/` or `/home/node/.n8n/tmp/` (if you’re using Docker)
These files exist only for the lifetime of the workflow execution. Once the execution finishes, n8n may remove them, depending on your cleanup settings.
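If you’re self-hosting with Docker, you can see this for yourself by listing the temp directory inside the container (the container name `n8n` here is an assumption; substitute your own):

```shell
# List n8n's temporary binary-data directory inside a running container.
# Files here are tied to workflow executions and may be cleaned up afterwards.
docker exec n8n ls -lh /home/node/.n8n/tmp/
```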
If you want to keep generated files accessible later, you have a few approaches:
- Database Storage
- If you’re using Postgres/MySQL as your execution database, you can enable `EXECUTIONS_DATA_SAVE_MANUAL_EXECUTIONS` and configure n8n to persist binary data inside the DB.
- This can make files retrievable later via the Executions list in the n8n UI.
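As a sketch, the relevant environment variables can be passed when starting n8n in Docker (the Postgres host and other connection details are placeholders for your own setup):

```shell
# Sketch: start n8n with Postgres as the execution database and
# manual-execution data persistence enabled.
docker run -d --name n8n \
  -p 5678:5678 \
  -e DB_TYPE=postgresdb \
  -e DB_POSTGRESDB_HOST=your-postgres-host \
  -e EXECUTIONS_DATA_SAVE_MANUAL_EXECUTIONS=true \
  docker.n8n.io/n8nio/n8n
```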
- Local File System (File node)
- Use the Write Binary File node to save the file permanently on disk (outside of tmp).
- Example path in Docker: `/home/node/.n8n/storage/`
- Then mount that folder as a volume: `-v /your/local/folder:/home/node/.n8n/storage`
Files will persist and be accessible as long as the container/volume exists.
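Putting the pieces above together, a minimal Docker invocation might look like this (the host folder path is a placeholder):

```shell
# Sketch: mount a host folder at the path the Write Binary File node
# writes to, so generated files survive container restarts.
docker run -d --name n8n \
  -p 5678:5678 \
  -v /your/local/folder:/home/node/.n8n/storage \
  docker.n8n.io/n8nio/n8n
```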
- External Storage (best practice for large or important files)
- Push files to Google Cloud Storage, S3, MinIO, etc.
- You can use nodes like AWS S3, Google Cloud Storage, Dropbox, etc.
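For context, the upload that the AWS S3 node performs is roughly equivalent to this CLI command (bucket name and file paths are placeholders):

```shell
# Upload a workflow-generated file to S3; the n8n AWS S3 node does the
# same thing using its stored credentials instead of a local AWS profile.
aws s3 cp ./generated-report.pdf s3://your-bucket/reports/generated-report.pdf
```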
Hope this answers your question. 