How does it work under the hood? Is it the OpenAI/Gemini function calling feature?
When I'm retrieving data from it as a knowledge base for an AI Agent, I'm just setting the query option to a function name and giving the table name, but nowhere am I putting the function parameters or return types. How will the AI Agent know them?
Do I need to put the table schema in the system prompt?
And for inserting into the Supabase vector store, how is the data mapping done?
Does OpenAI/Gemini use function calling internally?
Possibly, if you’re using the n8n AI Agent with semantic search capabilities or an LLM integration. Many current RAG (Retrieval-Augmented Generation) workflows use:
- Supabase Vector Store to store embeddings.
- OpenAI function calling (or similar) to invoke functions like “search_documents” or “get_context”.
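For contrast, here is roughly what a declarative OpenAI function-calling setup looks like, where the model is shown the function name, parameters, and types up front (a minimal sketch; the `search_documents` name and its parameters are illustrative, not something n8n defines for you):

```ts
import OpenAI from "openai";

const openai = new OpenAI();

// A declaratively defined tool: the model sees the name, parameters,
// and types, so it knows exactly how to produce a valid call.
const response = await openai.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "What does our refund policy say?" }],
  tools: [
    {
      type: "function",
      function: {
        name: "search_documents", // illustrative name
        description: "Semantic search over the knowledge base",
        parameters: {
          type: "object",
          properties: {
            query: { type: "string", description: "Search text" },
            top_k: { type: "number", description: "Number of results" },
          },
          required: ["query"],
        },
      },
    },
  ],
});
```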
However, this depends on how the AI Agent is designed. If you’re using the n8n AI Agent with the query option set to a function name, it may be relying on the LLM to generate queries based on context.
How does the agent know the function parameters?
This suggests that the AI Agent is working implicitly (using LLM context) rather than declaratively, as in OpenAI function calling with a defined schema. So yes, you need to help the LLM.
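If the node follows the common Supabase/LangChain template, the query name you set is just the name of a Postgres RPC function that n8n itself calls; the agent never needs the parameters because the node fills them in. A rough sketch of that call (`match_documents` and its arguments come from that template and are assumptions here):

```ts
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_KEY!
);

// Placeholder: in reality the node embeds the user's question and
// passes that vector here. The LLM never sees these parameters.
const queryEmbedding: number[] = [];

const { data, error } = await supabase.rpc("match_documents", {
  query_embedding: queryEmbedding, // vector from the embeddings model
  match_count: 4,                  // how many chunks to return
});
```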
Should I put the schema in the system prompt?
Yes, it’s recommended. If you don’t explicitly define the types and columns in the function definition or metadata, you must provide:
- Table name
- Relevant columns (names + data types)
- Semantic context, if applicable
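As a concrete illustration, the schema part of a system prompt could look like this (the table and column names are made up for the example):

```ts
// Illustrative system-prompt fragment; table and column names are examples only.
const systemPrompt = `
You have access to a knowledge base stored in the "documents" table:
  - content  (text)  : the original text chunk
  - metadata (jsonb) : source file, page number, etc.
Use it to retrieve context before answering factual questions.
`;
```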
How is data mapping performed when inserting?
In the context of n8n + Supabase Vector Store:
- Text is converted to embeddings (using OpenAI, etc.).
- The resulting vector embedding, along with the metadata (original text, IDs, etc.), is inserted as a row into a Supabase table.
- n8n lets you map which fields are saved via the node’s field settings.
You must ensure that the table has a column of type vector and that the node is configured with the appropriate embedding model.
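Outside of n8n, the same insert path looks roughly like this (a sketch assuming the common pgvector "documents" layout with content/metadata/embedding columns; adjust to your own table):

```ts
import OpenAI from "openai";
import { createClient } from "@supabase/supabase-js";

const openai = new OpenAI();
const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_KEY!
);

// Assumed table layout (pgvector must be enabled):
//   create table documents (
//     id bigserial primary key,
//     content text,
//     metadata jsonb,
//     embedding vector(1536)  -- must match the embedding model's dimension
//   );

async function insertDocument(text: string, metadata: Record<string, unknown>) {
  // 1. Text is converted to an embedding.
  const res = await openai.embeddings.create({
    model: "text-embedding-3-small", // produces 1536-dimensional vectors
    input: text,
  });

  // 2. The vector, original text, and metadata are inserted as one row.
  const { error } = await supabase.from("documents").insert({
    content: text,
    metadata,
    embedding: res.data[0].embedding,
  });
  if (error) throw error;
}

await insertDocument("Refunds are accepted within 30 days.", { source: "policy.pdf" });
```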