| Topic | Replies | Views | Activity |
| --- | --- | --- | --- |
| I do use the ollama with gemma3:12b | 1 | 27 | April 15, 2025 |
| MCP Server/Client Error with Groq/Ollama: 'Non-string message content not supported' | 1 | 65 | April 14, 2025 |
| Error in Ollama Embedding | 0 | 12 | April 13, 2025 |
| N8n selhosted not communicating with ollama api | 7 | 56 | March 12, 2025 |
| Making Ollama Chat Model ONLY Form Answers from Browsing Vector Store | 1 | 28 | March 10, 2025 |
| Tunnel smartphone to Google drive through a n8n local (Macbook pro - Ubuntu) | 0 | 19 | March 6, 2025 |
| Local Chat with AI Agent - Ollama and Qdrant | 2 | 243 | March 3, 2025 |
| Questions with Errors in AI Nodes and Telegram | 7 | 163 | February 27, 2025 |
| Ollama node "keep alive" setting has no effect on local n8n install | 2 | 112 | February 27, 2025 |
| How I Solved the "Forbidden" Error When Integrating n8n Cloud with Local Ollama Using an Intermediary Proxy | 1 | 302 | February 3, 2025 |
| Llama3.2 Bug? | 5 | 247 | January 30, 2025 |
| Wrong output from basic LLM chain with llama3.2 | 3 | 161 | January 10, 2025 |
| Using Ollama and n8n for AI Automation | 0 | 932 | October 18, 2024 |