Add vector embeddings to existing table?

Hello!!

I’m building a data import pipeline. I’m extracting from Salesforce, in this case, and putting the records into Postgres with pgvector (at Supabase). It’s straightforward with a Salesforce query followed by a Postgres upsert for each record, but I also want to take a few text fields from each object, generate embeddings, and store them in columns on the same Postgres row. As an example, I would have description and description_embedding, and my queries would use both the text and the embedding. Conceptually I have all the pieces in place except how to get my workflow to generate the embedding and write it into the database row.
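For context, a schema along these lines is what I have in mind (table and column names are just examples; the vector dimension of 1536 assumes an OpenAI embedding model such as text-embedding-3-small):

```sql
-- Hypothetical schema: the embedding column lives alongside the text it indexes.
CREATE EXTENSION IF NOT EXISTS vector;

CREATE TABLE salesforce_records (
    id text PRIMARY KEY,                -- Salesforce record id
    description text,
    description_embedding vector(1536)  -- dimension depends on the embedding model
);
```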

What is the best way to make this happen? I tried the Supabase Vector Store node, but it doesn’t seem to update an existing Postgres row/column; it seems to want a table dedicated to the vectors. Is there a different node I should be looking at? And should the embedding be generated before the upsert and simply included as another field, or should the row be updated first, followed by a separate update just for the embedding?

Thanks for helping me put this together conceptually.

Information on your n8n setup

  • n8n version: Cloud

I have figured it out. After my initial row insert, I execute a sub-workflow that takes an id, the text, and a model name, and makes an HTTP request to OpenAI to embed the text. The embedding is returned from the sub-workflow, and I use the id and the embedding to update the row with the vectors.
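For anyone following along, here is a minimal sketch of that sub-workflow’s logic in plain Python (in n8n Cloud this would be an HTTP Request node plus a Postgres node; the table and column names below are hypothetical):

```python
# Sketch of the sub-workflow: build the OpenAI embeddings request, then
# update the existing row by id. Names here are illustrative assumptions.
import json
import urllib.request

OPENAI_EMBEDDINGS_URL = "https://api.openai.com/v1/embeddings"


def build_embedding_request(text: str, model: str, api_key: str) -> urllib.request.Request:
    """Build the HTTP request the sub-workflow sends to OpenAI's embeddings endpoint."""
    payload = json.dumps({"model": model, "input": text}).encode()
    return urllib.request.Request(
        OPENAI_EMBEDDINGS_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )


def to_pgvector_literal(embedding: list[float]) -> str:
    """pgvector accepts a '[x,y,z]' text literal that casts to the vector type."""
    return "[" + ",".join(str(x) for x in embedding) + "]"


# The follow-up UPDATE, keyed on the row id (parameterized, e.g. for psycopg):
UPDATE_SQL = """
UPDATE salesforce_records
SET description_embedding = %s::vector
WHERE id = %s
"""
```

The embedding from the API response (`data[0].embedding`) gets formatted with `to_pgvector_literal` and passed, together with the id, as the parameters of the UPDATE.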

