A [Problem in node 'Summarization Chain': fetch failed] error occurred and I need help! (local & Ollama)

My n8n version is 1.56.2 (self-hosted), running via docker-compose.
My Ollama environment is Ollama.app (the local app) on macOS.

When I try to use a model in the Summarization Chain, it raises [Problem in node 'Summarization Chain': fetch failed].

This is my n8n error log.

Stack trace

```
TypeError: fetch failed
    at node:internal/deps/undici/undici:12502:13
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at createOllamaStream (/usr/local/lib/node_modules/n8n/node_modules/@langchain/community/dist/utils/ollama.cjs:12:22)
    at createOllamaGenerateStream (/usr/local/lib/node_modules/n8n/node_modules/@langchain/community/dist/utils/ollama.cjs:57:5)
    at Ollama._streamResponseChunks (/usr/local/lib/node_modules/n8n/node_modules/@langchain/community/dist/llms/ollama.cjs:346:26)
    at Ollama._call (/usr/local/lib/node_modules/n8n/node_modules/@langchain/community/dist/llms/ollama.cjs:376:26)
    at async Promise.all (index 0)
    at Ollama._generate (/usr/local/lib/node_modules/n8n/node_modules/@langchain/core/dist/language_models/llms.cjs:359:29)
    at Ollama._generateUncached (/usr/local/lib/node_modules/n8n/node_modules/@langchain/core/dist/language_models/llms.cjs:173:26)
    at LLMChain._call (/usr/local/lib/node_modules/n8n/node_modules/langchain/dist/chains/llm_chain.cjs:162:37)
    at LLMChain.invoke (/usr/local/lib/node_modules/n8n/node_modules/langchain/dist/chains/base.cjs:58:28)
    at async Promise.all (index 1)
    at MapReduceDocumentsChain._call (/usr/local/lib/node_modules/n8n/node_modules/langchain/dist/chains/combine_docs_chain.cjs:188:29)
    at MapReduceDocumentsChain.invoke (/usr/local/lib/node_modules/n8n/node_modules/langchain/dist/chains/base.cjs:58:28)
    at Object.execute (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainSummarization/V2/ChainSummarizationV2.node.js:339:38)
    at Workflow.runNode (/usr/local/lib/node_modules/n8n/node_modules/n8n-workflow/dist/Workflow.js:728:19)
    at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/WorkflowExecute.js:673:51
    at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/WorkflowExecute.js:1104:20
```

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hi @IT-HONGREAT ,

Welcome to the community :tada:

Tip for sharing your workflow in the forum

Pasting your n8n workflow


Be sure to copy your n8n workflow and paste it into a code block, i.e. between a pair of triple backticks. You can also click </> (preformatted text) in the editor and paste your workflow in.

```
<your workflow>
```

Make sure that you’ve removed any sensitive information from your workflow and include dummy data or pinned data as much as you can!


The error suggests there’s a connection issue between n8n and Ollama. Can you share your docker-compose file (or parts of it) and a bit more about how you’re running n8n and your setup?
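For context on why this often happens (a common cause, though not confirmed from the log alone): when n8n runs inside a Docker container, `localhost` in the Ollama credential resolves to the n8n container itself, not to the macOS host where Ollama.app is listening. A frequently suggested workaround is to address the host via `host.docker.internal` instead. A minimal sketch, with service names and ports assumed from the defaults rather than taken from the poster's actual file:

```yaml
# Hypothetical docker-compose excerpt -- names are placeholders,
# not copied from the poster's setup.
services:
  n8n:
    image: docker.n8n.io/n8nio/n8n
    ports:
      - "5678:5678"
    # Docker Desktop on macOS provides host.docker.internal automatically;
    # on Linux this explicit mapping is needed:
    extra_hosts:
      - "host.docker.internal:host-gateway"
```

With this, the Ollama credential's Base URL would be `http://host.docker.internal:11434` rather than `http://localhost:11434` (assuming Ollama.app listens on its default port 11434).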


I solved the problem by using the Ollama container from the starter kit instead of the local Ollama.app.

Thank you.
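For anyone hitting the same error later: the starter kit approach works because Ollama runs as a sibling container on the same Docker network as n8n, so n8n reaches it by service name instead of going through the host. A simplified sketch of that layout (service and volume names are assumptions, not copied from the actual starter kit file):

```yaml
# Hypothetical docker-compose excerpt -- a simplified sketch,
# not the real starter kit compose file.
services:
  n8n:
    image: docker.n8n.io/n8nio/n8n
    ports:
      - "5678:5678"
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_storage:/root/.ollama

volumes:
  ollama_storage:
```

Here the Ollama credential's Base URL would point at `http://ollama:11434`, since both services share the compose network.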


This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.