Support Neo4j database

You’re right, I’ll change the license ASAP; I don’t even know why I set this one, probably a copy/paste error :blush:

Hi Claire,
first of all, thanks for this node; I’m really excited to work with it. Today I ran into an unexpected problem:
I’m using version 0.2.2 of the n8n-nodes-neo4j community node on an n8n instance hosted by elest.io.

I have two scenarios:

  1. AI Agent Tool: The node (specifically as n8n-nodes-neo4j.neo4jTool) works correctly when called by a Langchain agent to execute parameterized Cypher queries. The agent provides the query and parameters, and the tool executes it successfully.
  2. ETL Workflow (Problem Area): I’m trying to use the same underlying node type (or n8n-nodes-neo4j.neo4j with Resource: Graph Database, Operation: Execute Query) in a separate ETL workflow. Here, I need to pass parameters from a Code node to my Cypher queries (e.g., for $nameValue in MERGE (s:Source {name: $nameValue})).

In the “Execute Query” UI for this ETL setup, my node version (0.2.2) doesn’t show a dedicated “Parameters” input field (unlike the versions documented on GitHub/npm). Instead, when Resource “Graph Database” is selected, it displays an “Index Name” field (which seems more related to vector search).

My attempts in the ETL workflow to pass parameters by structuring the input item as { "query": "...", "nameValue": "some_value" } (or with a nested params object) have consistently failed, leading to the Neo4j error: NodeOperationError: Expected parameter(s): nameValue.

Given that the node can handle parameterized queries when used as an AI agent tool, what is the correct method to supply parameters to the “Execute Query” operation in this version (0.2.2) when used directly in a non-agent ETL workflow? Is there a specific input structure or an undocumented feature for this operation in v0.2.2?
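For reference while this gets answered: whatever input shape the node expects, Neo4j itself ultimately receives the query text and the parameter map as two separate pieces. A sketch of what that looks like against Neo4j’s documented HTTP transaction API (the host, database name, and credentials below are placeholder assumptions for a local instance; the curl itself is left commented out):

```shell
# The JSON body Neo4j's HTTP transaction API expects for a parameterized query:
# the Cypher text and the parameter map travel separately, so $nameValue is
# never string-interpolated into the query itself.
PAYLOAD='{"statements":[{"statement":"MERGE (s:Source {name: $nameValue}) RETURN s.name","parameters":{"nameValue":"some_value"}}]}'
echo "$PAYLOAD"

# To run it against a local instance (placeholder credentials):
# curl -X POST http://localhost:7474/db/neo4j/tx/commit \
#   -H "Content-Type: application/json" -u neo4j:password -d "$PAYLOAD"
```

Whatever the node’s UI offers in v0.2.2, it has to reduce to this shape under the hood, which is why the `Expected parameter(s): nameValue` error means the parameter map arrived empty rather than malformed.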

Thanks for any insights!


Interested in this and the progress

Is this available within n8n Cloud yet? I can’t seem to find it, and no method of installing it manually seems to work, which leads me to believe it hasn’t been verified yet?


Hi @Claire! :slight_smile:

Thanks for this integration, but I can’t make it work, and I don’t understand why.

Do you know why it says the vector index is wrong, even though I entered the correct name?

Thanks a lot for your help !
Sébastien
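A quick way to rule out a naming mismatch is to list the vector indexes that actually exist in the database and compare them with the name given to the node. A sketch via the HTTP transaction API (localhost and the credentials are placeholder assumptions; `SHOW VECTOR INDEXES` requires a recent Neo4j 5 release):

```shell
# Cypher that lists vector indexes with their labels and indexed properties.
QUERY='{"statements":[{"statement":"SHOW VECTOR INDEXES YIELD name, labelsOrTypes, properties RETURN name, labelsOrTypes, properties"}]}'
echo "$QUERY"

# Run against a local instance (placeholder credentials):
# curl -X POST http://localhost:7474/db/neo4j/tx/commit \
#   -H "Content-Type: application/json" -u neo4j:password -d "$QUERY"
```

Index names are case-sensitive, so even a capitalization difference will make the node report the index as wrong.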



You don’t need a ‘node’ to do this, and it’s probably better to have the customizability of simply using HTTP requests.

If you run llm-graph-builder (see docker-compose.yml in the neo4j-labs/llm-graph-builder repository on GitHub), the backend API on port 8000 comes with everything you need.

/upload is for uploading docs, and /chat_bot gives you their already-built logic for chatting with the graph database using a model of your choice.

```
curl -X POST http://localhost:8000/upload \
  -H "Content-Type: multipart/form-data" \
  -F "file=@document.txt" \
  -F "chunkNumber=1" \
  -F "totalChunks=1" \
  -F "originalname=document.txt" \
  -F "model=ollama_llama3" \
  -F "uri=bolt://neo4j:7687" \
  -F "userName=neo4j" \
  -F "password=password" \
  -F "database=neo4j" \
  -F "[email protected]"
```

There’s also an /extract endpoint that runs the extraction into Neo4j.

And as mentioned /chat_bot to chat using the data.

5 minutes with an LLM constructing your HTTP request and then building HTTP Request nodes in n8n would give you everything you need.

It’s not as clean as ready-made nodes, but those clearly aren’t coming any time soon, and the only ones around right now are custom builds that appear buggy. HTTP requests are easy and always work.

To upload a file you will need to hit the /upload endpoint, then the /extract endpoint, then the /post_processing endpoint.

/chat_bot for talking.

Very simple curls. Check localhost:8000/docs for the FastAPI documentation for LLM Graph Builder.
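The calls after /upload can be sketched the same way, sharing the connection form fields. The field names for /extract, /post_processing, and /chat_bot below are assumptions pieced together from the /docs page; verify them against your own instance. The commands are echoed rather than executed so they can be reviewed first (re-add quoting around values with spaces when pasting):

```shell
API=http://localhost:8000
# Connection form fields shared by every endpoint (placeholder credentials).
CONN='-F uri=bolt://neo4j:7687 -F userName=neo4j -F password=password -F database=neo4j'

# 2. /extract: build the graph from the uploaded document (field names assumed).
echo curl -X POST "$API/extract" $CONN \
  -F "model=ollama_llama3" -F "file_name=document.txt" -F "source_type=local file"

# 3. /post_processing: post-extraction housekeeping such as index creation.
echo curl -X POST "$API/post_processing" $CONN

# 4. /chat_bot: ask questions over the extracted graph (field names assumed).
echo curl -X POST "$API/chat_bot" $CONN \
  -F "model=ollama_llama3" -F "question=Summarise document.txt" -F "session_id=demo"
```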


Hi @Claire we have also been working on a neo4j-node from our (neo4j) side. Would be good to sync and compare and see if we can push this into an official integration.

Feel free to ping me at michael at neo4j.com


Hi everyone, sorry I missed all your messages; I’ll try to answer. I see there are some issues with what I’ve done. I’m working on a new version and hope it will solve most of the issues you encountered.
@GoatLocker interesting way to solve it, but I didn’t want to use the graph-builder API, as I didn’t have my own instance and didn’t want to rely on an undocumented API on the Neo4j instance. Plus I imagine that adds latency, no?
@Michael_Hunger more than happy to work together. I’m sending you an email to discuss this with you ASAP.


Guys, can I clone this repo and use it for Neo4j?

Will it work fine with Portkey?

Which LLM do I need to use?