Can we use the Batch API for the AI Agent? How do I set it up?
I am using Claude Sonnet 4
Information on your n8n setup
- n8n version: latest
- Running n8n via n8n cloud
Hi @Kent_Lee, in the Add Options section of the AI Agent there's a Batch Processing option which you could use. If you're looking for batching through the API itself, I don't see any native support for it in Anthropic's chat model connector, but you could use an HTTP Request node and write the payload yourself in line with their API docs. Hope this helps!
Does this batch processing option mean calling the Batch API, or is it just the node itself batching the calls?
In this case it's the node itself batching the calls (to combat rate limiting). I'm not sure if any of the "message a model" nodes/tools in n8n natively support batching API endpoints. Your best bet would be to manually hit the batching API endpoint with an HTTP Request node, in my opinion.
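For reference, here's a minimal sketch of the request body you could send from an HTTP Request node to Anthropic's Message Batches endpoint (`POST https://api.anthropic.com/v1/messages/batches`, with your `x-api-key` and an `anthropic-version` header set). The `custom_id` values and prompt text are placeholders, and double-check the exact Claude Sonnet 4 model ID against Anthropic's docs:

```json
{
  "requests": [
    {
      "custom_id": "item-1",
      "params": {
        "model": "claude-sonnet-4-20250514",
        "max_tokens": 1024,
        "messages": [
          { "role": "user", "content": "First prompt here" }
        ]
      }
    },
    {
      "custom_id": "item-2",
      "params": {
        "model": "claude-sonnet-4-20250514",
        "max_tokens": 1024,
        "messages": [
          { "role": "user", "content": "Second prompt here" }
        ]
      }
    }
  ]
}
```

Note that batches are asynchronous: the response gives you a batch ID, so you'd need a second HTTP Request node (possibly on a loop or a Wait node) to poll the batch status and then fetch the results once processing ends.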