Handle chat file upload

I’m working on an n8n Cloud workflow that receives a message and an uploaded file. In the workflow, I check whether a file is included in the message, and if it is, I convert the file from binary to a Base64 string so I can pass it to an OpenAI model via the AI Agent node.
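For reference, the conversion step in my Code node looks roughly like this (simplified; it assumes the uploaded file arrives in a binary property named `data`):

```js
// n8n Code node ("Run Once for All Items") - simplified sketch.
// Assumes the uploaded file is in a binary property called "data".
const items = $input.all();
const out = [];

for (let i = 0; i < items.length; i++) {
  // Load the binary payload regardless of how n8n stores it internally
  const buffer = await this.helpers.getBinaryDataBuffer(i, 'data');
  out.push({
    json: {
      ...items[i].json,
      fileBase64: buffer.toString('base64'),
    },
  });
}

return out;
```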

However, I’m running into token limit errors, most likely because the Base64 string is too large for the model’s context window. Are there any best practices or workarounds for handling file uploads in this context, especially for passing file content to OpenAI models without hitting token limits?

I’m fairly new to n8n, so I’d appreciate any guidance or alternative approaches.

You could try limiting the size of the image by resizing it before you send it to OpenAI, for example to fit within 1024x1024. If you really NEED to send full-quality images for your use case, look into using Google’s Gemini model instead, which has a 1 million token context window.
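On n8n Cloud the simplest way to do the resize is probably the built-in Edit Image node (Resize operation) placed before your Base64 conversion. If you’re self-hosted and prefer to do it in a Code node, a rough sketch with the `sharp` library could look like this (sharp is an external npm module, so this won’t run on Cloud as-is, and the `data` property name is just an assumption):

```js
// Sketch only: resize and re-encode the image before Base64-encoding it.
// Requires the "sharp" npm module (self-hosted n8n with external modules enabled).
const sharp = require('sharp');

// Assumes the incoming binary property is named "data"
const buffer = await this.helpers.getBinaryDataBuffer(0, 'data');

const resized = await sharp(buffer)
  .resize(1024, 1024, { fit: 'inside', withoutEnlargement: true }) // keep aspect ratio
  .jpeg({ quality: 80 }) // re-encode to shrink the payload further
  .toBuffer();

return [
  {
    json: { fileBase64: resized.toString('base64') },
  },
];
```

Either way, shrinking the image before encoding it keeps the Base64 string (and therefore the token count) much smaller.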