OpenAI Chat Model Error 400 Invalid value for 'content': expected a string, got null

Describe the problem/error/question

I am building a workflow that processes more than 3,000 items through an AI Agent. After just a few items, sometimes 3, sometimes 7, the OpenAI Chat Model node raises an error. The AI Agent receives a plain-text prompt, so I do not think the error comes from an existing null field. My best guess is that the AI Agent and the Chat Model are not keeping in sync under the load. I have tried limiting the number of items, but the error stays the same.
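To rule out a null prompt completely, I could put a Code node right before the Agent that drops any item whose text is missing. This is only a sketch; it assumes the prompt lives in a field called chatInput (rename it to whatever field your Agent actually reads) and the Code node's "Run Once for All Items" mode:

    // Sketch: keep only items whose prompt is a non-empty string so the
    // Chat Model never receives a null value for 'content'.
    // The field name "chatInput" is an assumption - rename it to match your workflow.
    const items = $input.all();
    return items.filter(
      (item) => typeof item.json.chatInput === 'string' && item.json.chatInput.trim() !== ''
    );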

What is the error message (if any)?

Error in sub-node 'OpenAI Chat Model'
400 Invalid value for 'content': expected a string, got null.

Please share your workflow

Share the output returned by the last node

Information on your n8n setup

  • n8n version: 1.41.0
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app): npm
  • Operating system: Ubuntu

Hi @galop! Thanks for reaching out. Can you please share the output of the error you are seeing?

Hi! Many thanks for your reply, here it is:

NodeOperationError: 400 Invalid value for 'content': expected a string, got null.
    at ChatOpenAI.callMethodAsync (/usr/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/utils/logWrapper.ts:59:17)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at Proxy.connectionType (/usr/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/utils/logWrapper.ts:245:26)
    at async Promise.allSettled (index 0)
    at Proxy._generateUncached (/usr/lib/node_modules/n8n/node_modules/@langchain/core/dist/language_models/chat_models.cjs:118:25)
    at LLMChain._call (/usr/lib/node_modules/n8n/node_modules/langchain/dist/chains/llm_chain.cjs:157:37)
    at LLMChain.invoke (/usr/lib/node_modules/n8n/node_modules/langchain/dist/chains/base.cjs:58:28)
    at LLMChain.predict (/usr/lib/node_modules/n8n/node_modules/langchain/dist/chains/llm_chain.cjs:183:24)
    at ChatConversationalAgent._plan (/usr/lib/node_modules/n8n/node_modules/langchain/dist/agents/agent.cjs:476:24)
    at AgentExecutor._call (/usr/lib/node_modules/n8n/node_modules/langchain/dist/agents/executor.cjs:423:26)
    at AgentExecutor.invoke (/usr/lib/node_modules/n8n/node_modules/langchain/dist/chains/base.cjs:58:28)
    at Object.conversationalAgentExecute (/usr/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/nodes/agents/Agent/agents/ConversationalAgent/execute.ts:104:19)
    at Object.execute (/usr/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/nodes/agents/Agent/Agent.node.ts:323:11)
    at Workflow.runNode (/usr/lib/node_modules/n8n/node_modules/n8n-workflow/src/Workflow.ts:1378:8)
    at /usr/lib/node_modules/n8n/node_modules/n8n-core/src/WorkflowExecute.ts:1050:29
    at /usr/lib/node_modules/n8n/node_modules/n8n-core/src/WorkflowExecute.ts:1726:11

@Ludwig, any solutions?

Bump, same scenario here with some very basic AI Agents (Tools, OpenAI Functions, and Conversational agent types alike).

Did anyone fix this issue? I'm having the same problem; the agent was working yesterday, and today I'm getting this error:

400 Invalid value for 'content': expected a string, got null.

Does anyone know how to fix this? I have the same problem.

Hi, has anyone fixed this? I have the same issue here.

I had this same issue and fixed it. I’m not sure if this fix will work for everyone, but I thought I’d share.

What I noticed was that the JSON coming from my agent had an additional key in it called “blocks”. I had been experimenting with trying to get my agent to output Slack Block Kit formatted JSON, but eventually decided to scrap that idea. However, I think that since that concept of blocks was already in the agent’s window memory, it kept resurfacing.

Something about that extra key caused problems later, once I turned off 'Require Specific Output Format' in my AI Agent.
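My guess is that the AI message saved in memory ended up with its text under that extra key and nothing under content, so when the conversation was replayed to OpenAI the request contained something shaped roughly like this (purely illustrative, not the exact stored format):

    { "role": "assistant", "content": null, "blocks": [{ "type": "section", "text": "(the reply text)" }] }

which is exactly what the 400 error complains about: content must be a string, never null.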

Here’s how I solved it:

  1. I removed the Window Buffer Memory node
  2. I added a new Window Buffer Memory node
  3. I changed the session key by prepending a v2_ string to it:

Steps #1 and #2 may not have been necessary, but they made me feel good.

Just to reiterate, I changed my key from

{{ $node["Add Session ID"].json["sessionId"] }}

to

v2_{{ $node["Add Session ID"].json["sessionId"] }}

This essentially wiped out the agent’s memory, and that solved my problem.

I hope that someone more educated can swoop in and add some context here. I’ve only been working with n8n for… 3 days!! Good luck!

I had the same error and tried everything to troubleshoot it. Finally I just copy-pasted everything into a new workflow and it worked straight away.

I had been messing with the workflow a lot, running many tests and copying nodes from other workflows, so it seems it somehow just glitched out and broke somewhere.

This was on the latest version, 1.68, self-hosted with Postgres.

Thank you! I had a similar issue using Postgres memory; I updated the key and it went away :grin: For a more permanent solution, I added an Edit Fields node on the incoming message that generates an ID with the expression {{(+new Date).toString(36).slice(-5) + Math.random().toString(36).substr(2, 5)}}, which I then reference as the session key. It has worked like a charm.
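If you prefer a Code node over an inline expression, something like this should produce an equivalent throwaway session id (just a sketch, assuming the Code node's "Run Once for Each Item" mode and a sessionId output field; adjust names to taste):

    // Sketch: build a short, effectively unique session id from the timestamp
    // plus a few random characters - the same idea as the inline expression above.
    const timePart = Date.now().toString(36).slice(-5);        // last 5 chars of base-36 timestamp
    const randomPart = Math.random().toString(36).slice(2, 7); // 5 random base-36 chars
    $input.item.json.sessionId = `${timePart}${randomPart}`;
    return $input.item;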

Clearing the memory / updating the key on the attached memory node seemed to do the trick here.
