Storing Vectorization Costs When Updating AI Chatbot Vector Database

I’m new to n8n and really excited to explore it, but I might not always see the simplest solutions. I’ve spent hours trying to solve this problem, but I’m at the point where I need to ask for help.

We store our internal company documents in an application called Bitrix. Whenever documents change, we use the attached workflow, which monitors Bitrix for new documents and embeds them using the OpenAI API. The workflow also deletes, from the Supabase database, the vectors of files that have been removed from Bitrix.

The problem I’m facing is tracking the costs. Normally, I would store the token costs from the OpenAI API and later use them to generate reports, but I can’t seem to find a way to retrieve this information during the embedding process. As an alternative, I thought of saving the size of the vectorized documents (based on the “File Size - Summarize…” branch), but I’m running into an issue when multiple files are uploaded at once. In this case, I get as many items in this branch as the number of files uploaded, and I’m not sure how to sum up the file sizes.
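
For the multi-file case, one option is a Code node set to "Run Once for All Items" that sums the sizes before storing anything. This is only a minimal sketch; it assumes each incoming item carries a numeric `fileSize` field (the actual field name in the "File Size - Summarize…" branch may differ):

```javascript
// n8n Code node, "Run Once for All Items" mode.
// Sums a (hypothetical) `fileSize` field across every incoming item,
// so a single total is produced even when several files are uploaded at once.
const totalBytes = $input
  .all()
  .reduce((sum, item) => sum + Number(item.json.fileSize ?? 0), 0);

// Return one item holding the total, ready to be logged for reporting.
return [{ json: { totalBytes } }];
```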

My question is: what’s the best way to track the costs in this scenario, and how could I implement it? I’d appreciate any insights from the community!
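
For exact token counts, one possible approach (a sketch, not what the built-in embeddings node does) is to call the embeddings endpoint with an HTTP Request node, since the raw API response includes a `usage.total_tokens` field, and then compute the cost in a Code node. The price constant below is an assumption; check OpenAI's current pricing for your embedding model:

```javascript
// n8n Code node placed after an HTTP Request node that calls
// POST https://api.openai.com/v1/embeddings directly.
// The raw embeddings response contains `usage.total_tokens`, which is not
// readily accessible from the embeddings node used in the workflow above.
const PRICE_PER_1K_TOKENS = 0.00002; // hypothetical rate – verify against OpenAI pricing

return $input.all().map((item) => {
  const tokens = item.json.usage?.total_tokens ?? 0;
  return {
    json: {
      tokens,
      estimatedCostUsd: (tokens / 1000) * PRICE_PER_1K_TOKENS,
    },
  };
});
```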

My workflow:

Information on your n8n setup

  • n8n version: 1.60.1
  • Database (default: SQLite): default
  • n8n EXECUTIONS_PROCESS setting (default: own, main): default
  • Running n8n via (Docker, npm, n8n cloud, desktop app): cloud
  • Operating system: Windows 11

I guess I found the solution based on this workflow.
