I’m excited to share that Upload to URL (n8n-nodes-uploadtourl) is now officially verified by n8n and available directly in the n8n integrations directory!
Integrations link: Upload to URL integrations | Workflow automation with n8n
The Problem It Solves
If you’ve been building AI workflows, multi-agent systems, or just working with external APIs, you’ve likely hit the “binary data wall.”
Many modern endpoints (like OpenAI Vision, Mistral OCR, or external webhooks) require a public URL to access a file. The standard workarounds usually involve:
- Setting up AWS S3 buckets and wrestling with IAM/CORS policies.
- Uploading to Google Drive, changing permissions to "Anyone with the link", and building a separate cleanup node to delete it later.
- Passing massive Base64 strings that bloat payloads and crash webhooks.
It’s a lot of infrastructure overhead for a file that only needs to exist for 10 seconds so an API can read it.
The Solution
I built Upload to URL to act as a frictionless middleman. It does one thing perfectly: it turns your n8n binary data into a temporary public URL, and then automatically cleans up after itself.
How it works:
- Drop your file in: Connect any node outputting binary data (like an email attachment or a downloaded invoice).
- Get a URL: The node instantly returns a fast, public CDN link.
- Auto-Destruct: Set your desired expiry (e.g., 24 hours), and the file is permanently deleted. Zero digital clutter left in your cloud storage.
Common Use Cases
- Vision & OCR Pipelines: Instantly pass email attachments or downloaded images to Claude/OpenAI for text extraction without touching S3.
- Multi-Agent Handoffs: Pass a lightweight string URL between your n8n AI agents instead of heavy Base64 data, keeping your memory usage low.
- Temporary Client Downloads: Generate a quick, expiring download link to send to a customer via email or Slack.
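A quick illustration of why the URL handoff is so much lighter than inlining Base64 (the example file and link are made up; the ~33% overhead is inherent to Base64 encoding):

```python
import base64

# A 5 MB binary payload, e.g. a scanned invoice image.
raw = b"\x00" * (5 * 1024 * 1024)

# Embedding it inline as Base64 grows the payload by roughly a third,
# and that entire blob travels between every agent in the workflow.
encoded = base64.b64encode(raw)
print(f"Base64 size ratio: {len(encoded) / len(raw):.2f}")  # ~1.33

# Handing off a URL instead costs a few dozen bytes, regardless of file size.
url = "https://cdn.example.com/abc123"  # hypothetical link
print(f"URL size: {len(url)} bytes")
```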
I originally built this just to stop my own Google Drive from turning into a graveyard of temporary workflow files, but I hope it helps speed up your builds too.
Would love to hear any feedback, feature requests, or interesting ways you end up using it!
