Hi,
I set up an AI agent using a RunPod serverless GPU endpoint running their vLLM template.
vLLM exposes an OpenAI-compatible API, so I set up credentials in N8N pointing at the RunPod vLLM API, and it works fine: it connects with no issue when used in Basic LLM Chain nodes.
The problem appears when you use the AI Agent node with vLLM configured as an OpenAI API (using the RunPod API endpoint as the Base URL). As I said, this works fine in other AI nodes, but not in the Agent node.
If you don't connect any tools, it works (adding the endpoint as the model and setting up memory, Postgres in my case, works fine), but as soon as I add a tool it errors every single time with:
Cannot read properties of undefined (reading 'content')
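For context on why a tool (and only a tool) might trigger this: in the OpenAI chat-completions format, when the model decides to call a tool, the assistant message comes back with `content` set to null and a `tool_calls` array instead. Below is a sketch of that response shape (the `web_search` tool name and arguments are made up for illustration); if vLLM returns a message where `content` is missing entirely, or omits the `message` object, anything in the agent code that reads `message.content` unguarded would crash with exactly this error.

```python
import json

# Shape of an OpenAI-style chat completion when the model calls a tool.
# Note "content" is null here -- code that assumes choices[0].message.content
# is always a string can fail with
# "Cannot read properties of undefined (reading 'content')".
tool_call_response = {
    "choices": [
        {
            "message": {
                "role": "assistant",
                "content": None,  # no text when the model returns tool_calls
                "tool_calls": [
                    {
                        "id": "call_0",
                        "type": "function",
                        "function": {
                            "name": "web_search",  # hypothetical tool name
                            "arguments": json.dumps({"query": "example"}),
                        },
                    }
                ],
            },
            "finish_reason": "tool_calls",
        }
    ]
}

message = tool_call_response["choices"][0]["message"]
print(message["content"])                            # None
print(message["tool_calls"][0]["function"]["name"])  # web_search
```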
I cannot figure out what the error means, and it doesn't clearly state where it comes from. I only worked out that adding a tool causes it by checking each connection to the Agent node one at a time to see where it crashed.
The workflow works fine if I use the same model on Ollama, so it is not an issue with the model; and the RunPod (vLLM) endpoint works fine without any tools attached to the agent, so this seems to be an issue with the AI Agent node itself.
I am using the latest release of N8N in Docker on macOS.
I didn't paste the workflow code because it would be useless for troubleshooting, since testing it requires setting up the vLLM endpoint API, but the issue can be reproduced with a very simple web-search workflow.
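For anyone trying to reproduce this without my workflow, here is a sketch of hitting the endpoint directly with a tool definition attached, roughly what the Agent node sends. The base URL, API key, model name, and `web_search` tool are all placeholders; substitute your own RunPod serverless endpoint details.

```python
import json
import urllib.request

BASE_URL = "https://api.runpod.ai/v2/<endpoint-id>/openai/v1"  # placeholder
API_KEY = "<runpod-api-key>"  # placeholder

# A minimal chat request with one web-search-style tool attached,
# mirroring what an agent sends when a tool is connected.
payload = {
    "model": "<model-name>",  # placeholder
    "messages": [{"role": "user", "content": "Search for the n8n docs."}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "web_search",  # hypothetical tool
                "description": "Search the web for a query.",
                "parameters": {
                    "type": "object",
                    "properties": {"query": {"type": "string"}},
                    "required": ["query"],
                },
            },
        }
    ],
}

def reproduce() -> None:
    """POST the request and dump the raw assistant message.

    If the returned message lacks a "content" field (rather than having
    content: null), that would explain the Agent node's crash.
    """
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    print(json.dumps(body["choices"][0]["message"], indent=2))

# Uncomment with real credentials to run against your endpoint:
# reproduce()
print(json.dumps(payload, indent=2))
```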