🤔 Basic LLM chain node not working with Mistral Cloud model?

Describe the problem/error/question

The Basic LLM Chain node throws an error when connected to the Mistral Cloud Chat Model.

What is the error message (if any)?


NodeOperationError: Cannot read properties of null (reading 'length')
    at ChatMistralAI.callMethodAsync (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/utils/logWrapper.js:34:23)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at Proxy.connectionType (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/utils/logWrapper.js:156:47)
    at async Promise.allSettled (index 0)
    at Proxy._generateUncached (/usr/local/lib/node_modules/n8n/node_modules/@langchain/mistralai/node_modules/@langchain/core/dist/language_models/chat_models.cjs:118:25)
    at LLMChain._call (/usr/local/lib/node_modules/n8n/node_modules/langchain/dist/chains/llm_chain.cjs:157:37)
    at LLMChain.invoke (/usr/local/lib/node_modules/n8n/node_modules/langchain/dist/chains/base.cjs:58:28)
    at createSimpleLLMChain (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainLLM/ChainLlm.node.js:92:23)
    at getChain (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainLLM/ChainLlm.node.js:101:16)
    at Object.execute (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainLLM/ChainLlm.node.js:416:31)
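
For what it's worth, this kind of TypeError in Node.js simply means that a value the code expected to be an array or string was null when it tried to read .length on it. A generic TypeScript illustration of the pattern (not the actual n8n or LangChain code; the shape below is made up):

// Illustration only: the type and value here are hypothetical, not the real n8n/LangChain shapes.
type Generations = { text: string }[];
const response: Generations | null = null; // e.g. an empty or failed model response

// Code that assumes the value is always present throws at runtime:
// "TypeError: Cannot read properties of null (reading 'length')"
console.log((response as Generations).length);

So it looks like the Mistral response comes back null somewhere before the wrapper tries to inspect it.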

Please share your workflow

Share the output returned by the last node

[
  {
    "prompt": "Accueille ton Capitaine à bord, puis  annonce-lui les conditions de navigation de la soirée :\nCoucher de soleil : 21h04 ;\nMarée basse à 19h49 à Port-Navalo ;\nMarée haute à 02h45 à Port-Navalo ;\nVent actuel : 15 nœuds, direction nord-ouest ;\nLe vent va baisser jusqu'à atteindre 10 nœuds à 22 heures."
  }
]

Information on your n8n setup

  • **n8n version:** 1.36.4
  • **Database:** SQLite
  • **n8n EXECUTIONS_PROCESS setting (default: own, main):** v1
  • **Running n8n via (Docker, npm, n8n cloud, desktop app):** Docker
  • **Operating system:** Ubuntu 22.04.3 LTS (GNU/Linux 5.15.0-94-generic x86_64)

It looks like your topic is missing some important information. Could you provide the following if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

And as always, I call the magical help of … the amaaaaaaazing @Jon !!! :magic_wand: :wink:

Cannot reproduce. Your example works fine here on 1.37.0.
Are you sure everything is configured correctly with your Mistral API key, e.g. payment information?
Have you tried making requests outside of n8n?
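
For example, a quick standalone check against Mistral's chat completions endpoint should tell you whether the key itself works. Just a sketch: it assumes Node.js 18+ with built-in fetch, run as an ES module, MISTRAL_API_KEY set in your environment, and whatever model your account has access to:

// Minimal check of a Mistral API key outside of n8n.
const apiKey = process.env.MISTRAL_API_KEY;

const res = await fetch("https://api.mistral.ai/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${apiKey}`,
  },
  body: JSON.stringify({
    model: "mistral-small-latest", // any model available to your account
    messages: [{ role: "user", content: "Say hello" }],
  }),
});

console.log(res.status); // 200 means the key and account are working
console.log(await res.json());

If that fails (for instance with a 401 or a billing-related error), the problem is on the Mistral account side rather than in n8n.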

Hi @miguel-mconf, yes, I already use the HTTP Request node to generate text with Mistral Cloud, and I use the same credential on the HTTP Request node and on the LLM Chain node. :face_with_raised_eyebrow:

Fixed in 1.37.3 :white_check_mark:


This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.