Hello everyone, I am new to N8N and exploring the world of automation!
Apologies if this has been asked before but I have searched hard to find an answer (to no avail) before reaching out and possibly wasting all your time!
I am trying to use the Supabase Vector Store node to import data (text) from a workflow into a Supabase table. However, when I connect the OpenAI Embeddings node and select my OpenAI account, the Model dropdown returns “No data” and I can not seem to get this to work. Other parts of my OpenAI account work fine (i.e. the chat node etc…) and my OpenAI API key permissions are set to “All”, but the node just does not pull back any models, so I can’t select “text-embedding-3-small” or anything else. Does anyone have any tips for a noob?
I have created a bunch of automation flows before, but this is my first vector store and the error has stumped me! I have searched high and low for an answer, asked all the AIs (LOL), but unfortunately to no avail!
I don’t think this is related to any particular workflow (I could be wrong?), but I wonder if it has something to do with an account setting. Again, NOOB here, I am still learning (so be gentle! :P)
I’m running self-hosted n8n in Docker, connecting to Supabase.
Hopefully this is a really stupid question and an easy answer for someone to point out the error of my ways!
Make sure you are connected to OpenAI (check your API key, try reconnecting the credential). This issue has nothing to do with Supabase. Here’s my setup working without a Supabase account, just connected to OpenAI on the free tier.
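For what it’s worth, a quick way to check this outside n8n is to list the models your key can actually see; if no embedding models come back, n8n has nothing to show in the dropdown. A minimal Python sketch (assuming your key is in the `OPENAI_API_KEY` environment variable):

```python
import os
import requests

# List the models visible to this API key -- roughly what n8n's
# model dropdown has to work with when it populates.
api_key = os.environ["OPENAI_API_KEY"]

resp = requests.get(
    "https://api.openai.com/v1/models",
    headers={"Authorization": f"Bearer {api_key}"},
    timeout=30,
)
resp.raise_for_status()

models = [m["id"] for m in resp.json()["data"]]
embedding_models = [m for m in models if "embedding" in m]

print("Total models visible to this key:", len(models))
print("Embedding models:", embedding_models or "none")
```

If the embedding list prints empty while the key itself works, the key is fine and the restriction is somewhere in your OpenAI project settings.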
I figured it out!!! Because this is a pay-per-use account, I was being a cheap A$$ and had originally only enabled gpt-3.5-turbo under “Allowed Models” in the Model Usage section of my OpenAI project. (I told you it would be something stupid and easy!)
@ihortom, thank you so much for your prompt reply, I’m sorry for the slow response, I spent all yesterday trying to get this working!
I worked this out because I was also trying to remember why I only had “gpt-3.5-turbo” in the chat options, and I knew I had done that deliberately (again, being a cheap A$$! LOL).
Then I found the embeddings options too!
Have a great day all, and yeah… if anyone else ever complains they can’t select any options, you can tell them to check their “Allowed Models” in the Model Usage part of their OpenAI project!
For reference: I have now added text-embedding-3-small to my allowed list and will continue to add more as I need them (maybe that’ll cement it in my brain for the future! :P)
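For anyone landing here later, here’s a quick one-off check that the newly allowed model actually returns embeddings before wiring it back into the n8n node (a minimal sketch, again assuming `OPENAI_API_KEY` is set in your environment):

```python
import os
import requests

# Call the embeddings endpoint directly with the model you just allowed,
# to confirm the key can now use it.
api_key = os.environ["OPENAI_API_KEY"]

resp = requests.post(
    "https://api.openai.com/v1/embeddings",
    headers={"Authorization": f"Bearer {api_key}"},
    json={"model": "text-embedding-3-small", "input": "hello from n8n"},
    timeout=30,
)
resp.raise_for_status()

vector = resp.json()["data"][0]["embedding"]
print("Embedding length:", len(vector))  # text-embedding-3-small returns 1536 dimensions
```

If that call succeeds, the model should also appear in the Embeddings OpenAI node’s dropdown in n8n.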