If your workflow generates a file but the next node needs a public URL, you’ve probably wrestled with Google Drive URL quirks or S3 setup overhead.
I built Upload to URL - a community node that gives you a public URL immediately, and lets you set an expiry date so the file self-destructs when you no longer need it.
Clever use case. The “I need a public URL right now” problem comes up constantly in n8n workflows – especially when you’re passing files between HTTP request nodes, webhooks, or AI pipelines that need URLs rather than binary data.
I’ve been handling this a few different ways depending on context:
Cloudinary for images/video (their unsigned upload endpoint works great with an HTTP Request node, no SDK needed)
Tmpfiles.org for quick temporary hosting (free, no auth, expires automatically)
My own n8n webhook as a proxy – upload to an S3-compatible store like Backblaze, then return a presigned URL
The auto-expiry feature you built is the thing I always end up wanting but never have. Manually cleaning up old files in S3 is annoying.
Does your node handle large binary files okay, or is there a size ceiling to be aware of?