Describe the problem/error/question
I have a self-hosted n8n instance with a Supabase Postgres database. I got a warning from Supabase that I had exceeded my quota and had 1.2 GB of data in my database.
It turns out that it was largely my binaries. I've got a big workflow that, in the course of making a video, ends up with 20-40 MB of downloads. What I did not realise is that these binaries are actually stored in the execution_data table, hence blowing me WAY over my limit.
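(For anyone wanting to check their own instance: as far as I know, the standard Postgres size functions show this directly, with something like the query below, runnable in the Supabase SQL editor.)

```sql
-- Total on-disk size of execution_data, including its indexes and
-- the TOAST table where Postgres keeps large values out of line.
SELECT pg_size_pretty(pg_total_relation_size('execution_data'));
```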
Even now, after I deleted all my executions and ran VACUUM on execution_data in the database, there's still 62 MB of data hiding out somewhere that I just cannot find. By my reckoning there's maybe 0.5 MB of data in my visible tables. Maybe there are some hidden ones?
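In case it helps with diagnosis, this is the sort of size query I mean when I say I can only account for ~0.5 MB (standard Postgres catalog views, as far as I know):

```sql
-- Rank relations by total on-disk size (heap + indexes + TOAST).
-- Note: plain VACUUM only marks dead rows as reusable; it does not
-- shrink the files on disk. VACUUM FULL rewrites the table and does.
SELECT n.nspname AS schema,
       c.relname AS relation,
       pg_size_pretty(pg_total_relation_size(c.oid)) AS total_size
FROM pg_class c
JOIN pg_namespace n ON n.oid = c.relnamespace
WHERE c.relkind IN ('r', 'm')  -- ordinary tables and materialised views
  AND n.nspname NOT IN ('pg_catalog', 'information_schema')
ORDER BY pg_total_relation_size(c.oid) DESC
LIMIT 20;
```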
Anyway… my question: how can I do any of the following?
- Move the binary storage for execution data OFF the database and onto disk.
- Delete the binary data as I go, removing it once I have it on S3. I asked ChatGPT and it told me there was a Delete Binary Data node, which there isn't, LOL.
- Or, alternatively, pipe the data directly to S3. At the moment I am GETting the data and then uploading it to S3, but if I could send it straight to S3 that would be even better, I think.
Apart from my final upload to YT, I don't think I actually NEED the downloaded files in my calls. I am only pulling them into n8n because I don't know how to get them to S3 without doing that.
And if I could work out how to pipe from S3 into YT, that would be useful too.
Any ideas?