Before explaining my problem, here is some information about my n8n version and a description of my workflow.
N8N
n8nVersion: 1.89.2
platform: docker (self-hosted)
nodeJsVersion: 20.19.0
database: sqlite
executionMode: regular
concurrency: -1
license: enterprise (production)
WORKFLOW
Webhook
AI Agent + Anthropic Chat Model
Code
Respond to Webhook
From the Code node, I want to access the tokenUsageEstimate JSON object in the output of the Anthropic Chat Model … but it seems that is not possible.
I have tried referencing the Anthropic Chat Model node directly from the Code node, but that doesn’t work.
I couldn’t find a way to propagate that tokenUsageEstimate object through the AI Agent node.
Am I doing something wrong? Why can we see that information but not access it?
Ideally, I want that information to be added to the response of the request.
Thanks for your help and great job on the work you’ve done — n8n is an amazing tool!
You probably aren’t doing anything wrong. The bits of data (token usage stats) that you want to access probably just aren’t passed along or exposed in any way you can use in a workflow. There’s a discussion about this already here where various things were tried, but I don’t think anything ever worked.
There is a relatively highly voted feature request about this, if you want to add your vote to it.
I did a quick test: using the n8n REST API, you can get access to all executions of a specific workflow. Then, using the execution ID, you can send a request to get all the execution data for that specific execution … including the tokenUsageEstimate. Cool!
Knowing that, it shouldn’t be complicated to add an HTTP Request node to the workflow to fetch that information … if we can get access to the execution ID during the execution of the workflow.
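For reference, here is a minimal sketch of that API call, assuming the public REST API under /api/v1 with an API key sent in the X-N8N-API-KEY header (the base URL and key are placeholders you’d supply yourself):

```javascript
// Sketch only: fetch a single execution's data from the n8n public REST API.
// includeData=true asks the API to return the full execution data, which is
// where the tokenUsageEstimate ends up.

function buildExecutionUrl(baseUrl, executionId) {
  // Trim a trailing slash so we don't produce '//api/v1/...'
  return `${baseUrl.replace(/\/$/, '')}/api/v1/executions/${executionId}?includeData=true`;
}

async function fetchExecutionData(baseUrl, apiKey, executionId) {
  const res = await fetch(buildExecutionUrl(baseUrl, executionId), {
    headers: { 'X-N8N-API-KEY': apiKey, Accept: 'application/json' },
  });
  if (!res.ok) throw new Error(`n8n API returned ${res.status}`);
  return res.json();
}
```

Inside the workflow itself, the current execution’s ID is available as `$execution.id`, so the same URL can be built from an HTTP Request node (note that the execution data is only complete once the execution has finished, so fetching your own execution mid-run may return partial data).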
I was able to add the execution ID to my workflow’s returned value; after that, I can request the workflow execution data and extract the token usage estimate.
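Since the exact nesting of the execution data varies by n8n version and workflow layout, a defensive way to pull the estimate out is to search the parsed response recursively for any tokenUsageEstimate key, rather than hard-coding a path. A sketch (the sample structure in the comments is an assumption, not a guaranteed schema):

```javascript
// Sketch: recursively collect every tokenUsageEstimate object found anywhere
// in the parsed execution data. Depending on your n8n version, the `data`
// field of the API response may itself be a JSON string that you need to
// JSON.parse before searching.
function findTokenUsage(value, found = []) {
  if (Array.isArray(value)) {
    for (const item of value) findTokenUsage(item, found);
  } else if (value && typeof value === 'object') {
    if (value.tokenUsageEstimate) found.push(value.tokenUsageEstimate);
    for (const v of Object.values(value)) findTokenUsage(v, found);
  }
  return found;
}
```

With that, the Code node before Respond to Webhook can merge `findTokenUsage(executionData)` into the response body.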
It is not what I wanted (getting the token usage estimate directly in the output of the Chat Model node), but at least it solves my case.
I won’t mark this topic as resolved for that reason.