AI Agent Node doesn't work with vLLM OpenAI API

Hi,

I set up an AI agent using a serverless GPU endpoint on RunPod with their vLLM template.

vLLM exposes an OpenAI-compatible API, so I set up credentials in N8N pointing at the RunPod vLLM API, and it works fine: it connects with no issues when used with Basic LLM Chain nodes.

The problem appears when you use the AI Agent node with vLLM set up as an OpenAI API (using the RunPod API endpoint as the Base URL). As I said, this works fine in other AI nodes, but not in the Agent node.

If I don't connect any tools, it works (adding the endpoint as the model and setting up memory (I used Postgres) works fine), but as soon as I add a tool, it errors every single time with:

Cannot read properties of undefined (reading 'content')

I cannot figure out what the error means, and it doesn't even clearly state where it is coming from. I only worked out that it was caused by adding a tool by checking each connection to the Agent node one at a time to find where it crashed.

The workflow works fine if I use the same model on Ollama, so it is not an issue with the model. The RunPod (vLLM) API endpoint also works fine without any tools attached to the agent, so this seems to be an issue with the AI Agent node itself.

I am using the latest release of N8N on macOS in Docker.

I didn't paste the workflow code because it would be useless for troubleshooting without a vLLM endpoint set up to test against, but the issue can be reproduced with a very simple web search workflow.

As an easy example to replicate:

Set up a RunPod Serverless vLLM endpoint and copy its API URL; it should look like this:

https://api.runpod.ai/v2/<API KEY>/openai/v1

Set up a basic N8N workflow as follows:

Chat Trigger Node → AI Agent Node
Add an OpenAI model and set the Base URL to the RunPod API Endpoint (as explained above)
Select the AI model from the list
Add Simple or Postgres Memory
Do not add any tools

Query the workflow and the AI will send you a response.

Now add a Code tool, leave the default code (which converts the query to uppercase), and query the workflow again: it will crash with the error posted above.

It does this with all tools, not just the Code tool (I used it here because it is very simple); with no tools attached, it works fine.


Same issue here


Doesn’t look like anyone has a solution.

This should really be looked into, because the same thing happens with self-hosted vLLM endpoints as well.