Impossible to filter a Supabase Vector Store via Metadata

Hi N8N community! :wave:

I’m integrating Supabase with N8N to perform vector searches on pgvector-stored documents. My SQL function works fine in Supabase, but the JSONB metadata filter doesn’t work when called from N8N.
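For context, here is a minimal sketch of the kind of match_documents function I mean (the table name, embedding dimension, and defaults are illustrative, not my exact code):

create or replace function match_documents (
  query_embedding vector(1536),
  match_count int default null,
  filter jsonb default '{}'
) returns table (
  id bigint,
  content text,
  metadata jsonb,
  similarity float
)
language plpgsql
as $$
begin
  return query
  select
    documents.id,
    documents.content,
    documents.metadata,
    1 - (documents.embedding <=> query_embedding) as similarity
  from documents
  where documents.metadata @> filter  -- JSONB containment: only rows whose metadata includes the filter keys
  order by documents.embedding <=> query_embedding
  limit match_count;
end;
$$;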

I’ve tried everything; it seems like another filter is passed through in the background, creating a conflict?

The same filter works well in SQL directly in Supabase:
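Roughly this kind of call (the embedding is truncated here, and the user_id value is just an example):

select id, content, metadata, similarity
from match_documents(
  '[0.12, -0.03, ...]'::vector,  -- full query embedding goes here
  5,                             -- match_count
  '{"user_id": 3}'::jsonb        -- the metadata filter that works here but not from n8n
);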

I also noticed that it works as intended when I do the same thing with the “Retrieve Documents As Vector Store for Chain/Tool” operation mode, but with “Retrieve Documents As Tool for AI Agent” the presence of a metadata filter causes an error.

  • n8n version: 1.76.1
  • Database : Supabase
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via: n8n cloud
  • Operating system: Mac

Thank you in advance!

5 Likes

It looks like your topic is missing some important information. Could you provide the following if applicable.

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:
1 Like

Yes, I am getting the same issue. Did you manage to find a resolution?

No, still no luck :frowning:

The same problem here!

1 Like

I’m facing the same issue, any update about this one?

1 Like

I am seeing the same issue. I tried different vector stores (Supabase and Pinecone) and got the same error when using the vector store as a tool for the AI Agent. Any fixes or workarounds for this, please?

1 Like

Same here.
From what I can work out, the node only sees the user’s chat request, NOT the agent’s request: you can’t filter (or I can’t get it to filter) on what your agent is asking the node, only on what the user inputs in their chat.

The attached filter works (sometimes) but it is not what I need. I need the agent to pass the filter parameter.

At the moment I can’t see how to reference the agent output as a field.

1 Like

Thanks for the feedback, guys, I’m feeling less alone. I’m pretty sure some filters are passed in the background that “take the place” of the filters we actually want to pass, hence the conflict. Is there a way to monitor the detailed requests going out from n8n? Because on the Supabase log side I don’t get enough detail.

I’ll try to open a support ticket if possible.

1 Like

same here…

same here

The other operation type accepts filters, but “As Tool For AI Agent” gives this sad message.
The problem exists in 1.80.1; I updated to 1.81.0 and have the same problem.

When you run an execution, hit F12 to open dev tools and go to the Console tab; you should see the requests that n8n is making.

I’m not 100% sure, but you may be able to just use the PostgREST API with match_documents. So just use an HTTP Request node with the predefined Supabase credential type. This would be the curl, if you want to try importing it:
curl -X POST 'https://YOUR_SUPABASE_URL/rest/v1/rpc/match_documents' \
  -H 'Authorization: Bearer YOUR_SUPABASE_ANON_KEY' \
  -H 'Content-Type: application/json' \
  -d '{
    "query_embedding": [/* your embedding array here */],
    "match_threshold": 0.78,
    "match_count": 10,
    "filter": { "user_id": 3 }
  }'

Update for anyone struggling - I hope this gets fixed soon, as it’s a complete pain - but in the meantime, here is the workaround I’m looking at. Rather than have the agent call the Supabase (or Pinecone, which has the same issue) node directly and try to get it to do the filtering, I’ve put that part into a separate workflow. The agent calls this “Search my vector store” workflow, and in that separate workflow you get the query coming in. From there you can use the Supabase node in retrieval mode (not agent tool mode), apply your filter there, use the query from the LLM as your prompt, and then have it return the response to the main workflow. This works OK, but it’s a pain having another workflow to keep track of.

Would this kind of filter help?
@cookiemonster27320 @Altumate @gmiguel @tani @Luiz_Henrique @pace @Bernd_Grimm

When you use Operation Mode: Retrieve Documents As Tool for AI Agent, we only have these options:

Still no solution on my side

Good news! They solved my ticket, and the fix has been deployed in the latest minor release, 1.84.0 (still in beta). I tested it and it finally works.

3 Likes

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.