Size limitations of data returned by AI Agent tools?

Hello!

I’ve noticed that data can get truncated when using an AI Agent. For example, if a tool returns a lot of JSON, the agent might see a handful of the records, but not all of them. This isn’t surprising and I’ve found ways to work around it.

However, I’d love to know what’s causing the data to get truncated. Does the Window Buffer Memory have a maximum size for each window of information? Or, when a tool returns JSON to the AI Agent, is there a limit on the size of the data that’s returned?

This question is not related to a specific workflow. I’m writing a tutorial on how to use document storage (MongoDB) to circumvent such limitations, and I want to get my facts right.
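
For context, the kind of workaround I have in mind looks roughly like this: the tool writes the full result set to MongoDB and hands the agent only a document reference plus a small preview, so the agent never has to hold the whole payload. This is just a sketch using the official `mongodb` Node.js driver; the connection string, database, collection, and field names are placeholders.

```javascript
// Sketch: stash the full tool output in MongoDB and return only a reference
// plus a small preview to the AI Agent.
// Assumes the official `mongodb` Node.js driver and a local MongoDB instance;
// the database/collection/field names below are made up for illustration.
const { MongoClient } = require('mongodb');

async function stashToolOutput(records) {
  const client = new MongoClient('mongodb://localhost:27017');
  await client.connect();
  try {
    const collection = client.db('agent_cache').collection('tool_results');
    const { insertedId } = await collection.insertOne({
      createdAt: new Date(),
      records,                        // the full result set lives here
    });
    // Return only what the agent actually needs to reason with.
    return {
      resultId: insertedId.toString(),
      recordCount: records.length,
      preview: records.slice(0, 5),   // small sample instead of everything
    };
  } finally {
    await client.close();
  }
}
```

The agent can then call a second tool with `resultId` to look up specific records on demand instead of receiving them all at once.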

Thanks!

  • n8n version: 1.63.4
  • Database (default: SQLite): default
  • n8n EXECUTIONS_PROCESS setting (default: own, main): default
  • Running n8n via (Docker, npm, n8n cloud, desktop app): n8n cloud
  • Operating system: n/a

Hi @Bret_Truchan,

Thanks for posting! You can limit the number of interactions returned via the Context Window Length parameter, but there isn’t a limit on the size of the messages themselves :slight_smile:

Thanks, Aya. I’ll have to do some sleuthing to find out why some of my data was getting truncated. If I figure it out, I’ll post what I find here. :slight_smile:

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.