Struggling with embeddings using LM Studio

Hi,
I am new to n8n and I am trying to follow a number of tutorials.

n8n and the AI Starter Kit are running in Docker, and I am connecting to LM Studio, which runs outside Docker.

I know the connection to LM Studio works, because I can point an OpenAI chat model at LM Studio via the Basic LLM Chain node and receive replies in the chat.

However, when I switch to an embedding model in LM Studio and try to connect a basic embedding, I receive the error:

Problem in node 'Qdrant Vector Store': fetch failed
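For what it's worth, when n8n runs in Docker and LM Studio runs on the host, a "fetch failed" can simply mean the container cannot reach the base URL configured in the OpenAI credential, since "localhost" inside the container refers to the container itself. Below is a minimal sketch of the request the embeddings sub-node would send, assuming LM Studio's default port 1234 and Docker Desktop's `host.docker.internal` alias for the Windows host (adjust both if your setup differs):

```python
import json
import urllib.request

# Assumption: LM Studio serves its OpenAI-compatible API on the default
# port 1234, and Docker Desktop exposes the Windows host to containers
# as host.docker.internal. "localhost" inside the n8n container will NOT
# reach LM Studio on the host.
BASE_URL = "http://host.docker.internal:1234/v1"

def embedding_request(base_url: str, model: str, text: str) -> urllib.request.Request:
    """Build an OpenAI-style /embeddings request for an LM Studio server."""
    return urllib.request.Request(
        url=f"{base_url.rstrip('/')}/embeddings",
        data=json.dumps({"model": model, "input": text}).encode(),
        headers={"Content-Type": "application/json"},
    )

req = embedding_request(BASE_URL, "text-embedding-nomic-embed-text-v1.5", "ping")
print(req.full_url)  # http://host.docker.internal:1234/v1/embeddings

# Uncomment inside the n8n container to test reachability end to end:
# with urllib.request.urlopen(req, timeout=10) as resp:
#     print(len(json.load(resp)["data"][0]["embedding"]))  # embedding dimension
```

If the live call succeeds from inside the container, the same base URL in the Embeddings OpenAI credential should work too.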

I have tried all three splitter nodes with various configurations: the Token Splitter and Recursive Character Text Splitter set to different chunk sizes and overlaps (512/20, 1000/20, etc.).

I have the text-embedding-nomic-embed-text-v1.5 and text-embedding-snowflake-arctic-embed-m-v1.5 models in LM Studio.

Any thoughts on how to set up the embedding details would be really appreciated.
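One more thing worth checking, in case it helps anyone reading: the model name in the Embeddings OpenAI node has to match an id that LM Studio's OpenAI-compatible `/v1/models` endpoint actually reports, and the model must be loaded. A small sketch of what that response looks like and how to read the ids from it (the base URL and response sample are assumptions based on the OpenAI API shape, not taken from this workflow):

```python
import json
import urllib.request

BASE_URL = "http://host.docker.internal:1234/v1"  # adjust to your LM Studio host/port

def model_ids(models_response: dict) -> list:
    """Extract model ids from an OpenAI-style /models response body."""
    return [m["id"] for m in models_response.get("data", [])]

# Illustrative response shape from an OpenAI-compatible /models endpoint:
sample = {"data": [{"id": "text-embedding-nomic-embed-text-v1.5", "object": "model"}]}
print(model_ids(sample))  # ['text-embedding-nomic-embed-text-v1.5']

# Live check from inside the n8n container (uncomment to run):
# with urllib.request.urlopen(f"{BASE_URL}/models", timeout=10) as resp:
#     print(model_ids(json.load(resp)))
```

The value entered in the node's Model field should match one of those ids exactly.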

{
  "nodes": [
    {
      "parameters": {
        "options": {
          "allowFileUploads": true
        }
      },
      "type": "@n8n/n8n-nodes-langchain.chatTrigger",
      "typeVersion": 1.1,
      "position": [-840, -160],
      "id": "6c045a8e-043e-46cc-b414-93d1fb99456b",
      "name": "When chat message received",
      "webhookId": "093f4b44-27fe-4efb-a7b2-3b682185b276"
    },
    {
      "parameters": {
        "mode": "insert",
        "qdrantCollection": {
          "__rl": true,
          "value": "={{ $json.sessionId }}",
          "mode": "id"
        },
        "options": {}
      },
      "type": "@n8n/n8n-nodes-langchain.vectorStoreQdrant",
      "typeVersion": 1.1,
      "position": [-560, -160],
      "id": "36b78279-f67b-4d97-914b-50da7304cc62",
      "name": "Qdrant Vector Store",
      "credentials": {
        "qdrantApi": {
          "id": "G6VxDdZVUAFBcdwt",
          "name": "QdrantApi account"
        }
      }
    },
    {
      "parameters": {
        "model": "text-embedding-nomic-embed-text-v1.5",
        "options": {}
      },
      "type": "@n8n/n8n-nodes-langchain.embeddingsOpenAi",
      "typeVersion": 1.2,
      "position": [-720, 200],
      "id": "01784df0-23f4-445c-87d0-a60ae31c2a56",
      "name": "Embeddings OpenAI",
      "credentials": {
        "openAiApi": {
          "id": "EXToKNOmjb8Dz7oc",
          "name": "OpenAi account"
        }
      }
    },
    {
      "parameters": {
        "dataType": "binary",
        "options": {}
      },
      "type": "@n8n/n8n-nodes-langchain.documentDefaultDataLoader",
      "typeVersion": 1,
      "position": [-460, 60],
      "id": "720be7fa-7384-4c56-803d-f0511e2240d0",
      "name": "Default Data Loader"
    },
    {
      "parameters": {
        "chunkSize": 512,
        "chunkOverlap": 20,
        "options": {}
      },
      "type": "@n8n/n8n-nodes-langchain.textSplitterRecursiveCharacterTextSplitter",
      "typeVersion": 1,
      "position": [-1120, 200],
      "id": "33f6977e-f7f0-438e-a0d5-b5d36c960029",
      "name": "Recursive Character Text Splitter"
    },
    {
      "parameters": {
        "chunkSize": 512,
        "chunkOverlap": 20
      },
      "type": "@n8n/n8n-nodes-langchain.textSplitterTokenSplitter",
      "typeVersion": 1,
      "position": [-1120, 0],
      "id": "e18c6fc4-d26d-491a-ba1e-644a510efe91",
      "name": "Token Splitter"
    },
    {
      "parameters": {
        "separator": ","
      },
      "type": "@n8n/n8n-nodes-langchain.textSplitterCharacterTextSplitter",
      "typeVersion": 1,
      "position": [-280, 260],
      "id": "90193d8e-a332-4622-bd37-7eb6f189cb21",
      "name": "Character Text Splitter"
    }
  ],
  "connections": {
    "When chat message received": {
      "main": [
        [
          {
            "node": "Qdrant Vector Store",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Embeddings OpenAI": {
      "ai_embedding": [
        [
          {
            "node": "Qdrant Vector Store",
            "type": "ai_embedding",
            "index": 0
          }
        ]
      ]
    },
    "Default Data Loader": {
      "ai_document": [
        [
          {
            "node": "Qdrant Vector Store",
            "type": "ai_document",
            "index": 0
          }
        ]
      ]
    },
    "Recursive Character Text Splitter": {
      "ai_textSplitter": []
    },
    "Token Splitter": {
      "ai_textSplitter": []
    },
    "Character Text Splitter": {
      "ai_textSplitter": [
        [
          {
            "node": "Default Data Loader",
            "type": "ai_textSplitter",
            "index": 0
          }
        ]
      ]
    }
  },
  "pinData": {},
  "meta": {
    "templateCredsSetupCompleted": true,
    "instanceId": "5370d8c2d142c20110f5ea4e2a096b5360deae4471bce7623025b5da9fdb46f7"
  }
}

Information on your n8n setup

  • n8n version: 1.94.0
  • Database: nil
  • n8n EXECUTIONS_PROCESS setting: default (own, main)
  • Running n8n via: Docker
  • Operating system: Windows 11