Hi there,
I want to build workflows that generate custom presentations on a per-lead/contact basis, which accumulates large amounts of binary data quite quickly.
Now, I self-host n8n with the following settings:
sudo docker run -d --restart unless-stopped -it \
--name n8n \
-p 5678:5678 \
-e N8N_HOST="myn8n.your-domain.com" \
-e WEBHOOK_TUNNEL_URL="https://myn8n.your-domain.com/" \
-e WEBHOOK_URL="https://myn8n.your-domain.com/" \
-e N8N_ENABLE_RAW_EXECUTION="true" \
-e NODE_FUNCTION_ALLOW_BUILTIN="crypto" \
-e NODE_FUNCTION_ALLOW_EXTERNAL="" \
-e N8N_PUSH_BACKEND=websocket \
-v /home/mygoogleaccount/.n8n:/home/node/.n8n \
n8nio/n8n
(NODE_FUNCTION_ALLOW_BUILTIN="crypto" is only there to show how built-in JavaScript packages would be added; NODE_FUNCTION_ALLOW_EXTERNAL is needed for external packages. Note that a comment after a trailing backslash breaks the line continuation, so the comments live outside the command.)
My question is: what is the best way to set up my instance (which has limited RAM and storage) so that binary data is not stored in the filesystem (N8N_DEFAULT_BINARY_DATA_MODE), but is also only held in RAM for the time a flow actually needs it, with each binary document immediately replaced by the next, so the instance doesn't crash right away?
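For reference, the closest option I have found so far is N8N_DEFAULT_BINARY_DATA_MODE=filesystem combined with execution-data pruning. A sketch of how I would add these to my command (variable names are from the n8n environment-variable docs; the values are assumptions for my setup):

```shell
# Sketch, assumed values: write binary data to disk instead of keeping it
# in RAM, and prune old execution data so limited storage doesn't fill up.
sudo docker run -d --restart unless-stopped -it \
  --name n8n \
  -p 5678:5678 \
  -e N8N_DEFAULT_BINARY_DATA_MODE="filesystem" \
  -e EXECUTIONS_DATA_PRUNE="true" \
  -e EXECUTIONS_DATA_MAX_AGE="24" \
  -v /home/mygoogleaccount/.n8n:/home/node/.n8n \
  n8nio/n8n
```

But as I understand it, filesystem mode still writes every document to disk first, which is exactly what I'd like to avoid; ideally each document would only live in RAM while its node runs.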
Information on your n8n setup
- n8n version: 1.93
- Database (default: SQLite): SQLite
- n8n EXECUTIONS_PROCESS setting (default: own, main): own
- Running n8n via (Docker, npm, n8n cloud, desktop app): Docker on a Google Cloud project
- Operating system: Windows 10