AI Agent node fails with 'asyncIterator' error when using Google Vertex AI


Hello,

I am consistently running into a Cannot read properties of undefined (reading 'Symbol(Symbol.asyncIterator)') error when trying to use the AI Agent node with the Google Vertex Chat Model.

This error persists even in a minimal test case, which suggests a deeper issue or bug.

My Setup:

  • n8n Version: 1.102.3
  • Hosting: Self-Hosted on Railway
  • Node: AI Agent
  • Chat Model: Google Vertex Chat Model using gemini-1.5-pro

What I have tried so far (extensive debugging):

  1. Simplified the workflow to a single, standalone AI Agent node.
  2. Used a simple, hardcoded prompt (“Schrijf een kort gedicht over een fiets” — “Write a short poem about a bicycle”) with no variables. The error still occurs.
  3. Checked Google Cloud Permissions:
    • The Vertex AI API is enabled.
    • The Cloud Resource Manager API is enabled.
    • The Service Account has been granted both the Vertex AI User and Service Account Token Creator roles.
  4. Checked Node Configuration:
    • The correct Google Cloud Project ID is selected.
    • Toggling the “Require Specific Output Format” on and off makes no difference.

This leads me to believe it’s not a configuration error on my end, but potentially a bug.

Could you please advise? Thank you!


I came across this same issue today while testing Vertex. I found that in the Google Vertex Chat Model, the default Model Name n8n populates is “gemini-1.5-flash”. When I change it to “gemini-2.5-flash”, it works as expected. I suspect the issue is that your model name doesn’t match what Vertex expects.


Thanks - had this same issue.

Is there any other way of correcting this issue? Your solution didn’t work for me. If so, can you share it?

The solution didn’t work for me either. Is there any other way?

What’s the issue, @Fernando_by?

I’m running a very simple test workflow with basic nodes: a Google Vertex node connected to a chat input. However, even though my Google Vertex API is configured correctly, I keep getting the following error:

Cannot read properties of undefined (reading 'Symbol(Symbol.asyncIterator)')

The thing is, Vertex AI has specific model names that you need to use, and “gemini-2.5-flash” isn’t one of them. Instead, you can use models like “gemini-1.5-flash-001” or “gemini-1.5-pro-001”.

I tried the AI Agent and the Basic LLM Chain, and I switched between different versions of Gemini 1.5 and 2.5, but I get the same “asyncIterator” error. HTTP Requests to my Vertex AI environment do work, but that is definitely a lot more work than the Vertex AI node. Has anybody found a fix yet? Hopefully this gets fixed in the next releases.
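For anyone falling back to the HTTP Request node as described above: the Vertex AI REST endpoint encodes both the region and the model name in the URL, which is why a wrong region or an unavailable model makes the call fail. Here is a minimal sketch of the request shape (the project ID, region, and model below are placeholder values, not anything from this thread; you still need an OAuth access token, e.g. from `gcloud auth print-access-token`):

```javascript
// Sketch of the Vertex AI generateContent REST request shape.
// projectId, region, and model are placeholders -- substitute your own.
function buildVertexRequest(projectId, region, model, prompt) {
  // The region appears twice (hostname and resource path), so a model
  // that is not hosted in that region will make the call error out.
  const url =
    `https://${region}-aiplatform.googleapis.com/v1/projects/${projectId}` +
    `/locations/${region}/publishers/google/models/${model}:generateContent`;
  const body = {
    contents: [{ role: 'user', parts: [{ text: prompt }] }],
  };
  return { url, body };
}

const { url, body } = buildVertexRequest(
  'my-project',
  'europe-west4',
  'gemini-1.5-pro-001',
  'Write a short poem about a bicycle'
);
console.log(url);
```

In the HTTP Request node you would POST `body` as JSON to that URL with an `Authorization: Bearer <access token>` header.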

I had the same issue and tested different model names (all of the names I tested are valid in the official Google Python SDK, google-genai). Some models work, but some don’t.

Please upvote this bug report on GitHub to help get it fixed sooner.

Below is the error message in the log file:

{
    "__type": "$$EventMessageWorkflow",
    "id": "",
    "ts": "",
    "eventName": "n8n.workflow.failed",
    "message": "n8n.workflow.failed",
    "payload": {
        "userId": "",
        "executionId": "",
        "success": false,
        "isManual": true,
        "workflowId": "",
        "workflowName": "Agent playground",
        "lastNodeExecuted": "AI Agent",
        "errorNodeType": "@n8n/n8n-nodes-langchain.agent",
        "errorMessage": "Cannot read properties of undefined (reading 'Symbol(Symbol.asyncIterator)')"
    }
}

I don’t know if this still helps, but for other readers: if the model is not hosted in the specified region (for example, you can’t use 2.5 Pro in europe-west3, Frankfurt), you will get the same error. Sometimes it also helped to reassign the correct project (by ID instead of picking it from the list).

I guess @jensus is right.
There is already an open GitHub issue. In the code, n8n sets the region via the credentials used. As stated in the Google docs, model availability for e.g. europe-west3 (Frankfurt) is very restricted. You can therefore set the region in the Google Vertex credentials used for the model (e.g. to “europe-west4”).


In case this is helpful for anyone else who comes across this thread: I was having the same issue, and it turned out that the Service Account needed a role assigned (I used ‘Viewer’) for whatever reason.

I am having a similar issue, but I am only experiencing the problem when I have a Postgres Chat Memory node connected to the same AI Agent as the Vertex AI model. When I disconnect the Postgres Chat Memory node, it works fine.

I spent a day to figure out that Google Cloud project just needed a Billing Account linked to it. See my reply on this post: Vertex AI LangChain Agent fails ('asyncIterator' error) when Postgres Chat Memory is connected

You may or may not have the same root cause, but most likely “something connected” is returning an ordinary error code that the n8n node can’t handle, so it bugs out and throws its own unrelated error, which tricks you into troubleshooting unrelated issues.
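That explanation matches how this error message arises in Node.js in general: when an upstream call fails and produces nothing, any code that then tries to async-iterate the missing stream hits exactly this TypeError. A minimal sketch of the mechanism (this is an illustration, not the actual n8n source):

```javascript
// Sketch: if an upstream API call fails, the "stream" handed onward may be
// undefined, and reading Symbol.asyncIterator from it throws:
// TypeError: Cannot read properties of undefined
//   (reading 'Symbol(Symbol.asyncIterator)')
function consumeStream(stream) {
  return stream[Symbol.asyncIterator]();
}

try {
  consumeStream(undefined); // simulate a failed upstream call
} catch (err) {
  console.log(err.message);
}
```

So the asyncIterator message is a symptom of a swallowed upstream error (bad model name, wrong region, missing billing, missing role), not the root cause itself.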


Brilliant, thanks Kristopher. I had the same issue.