How to split a JSON file into smaller ones, analyze them with AI, and then combine the responses into one

Describe the problem/error/question

Hi all,

I am wondering if the following workflow is possible/recommended.

I am retrieving a CSV file of about 500 rows and 150 columns. I want to split that file into multiple files of, let's say, 200–300 rows, perform an analysis on each using AI, and then combine all of those analysis responses into one. The main reason is that a file like that comes to about 1,000,000 tokens, which is too much even for Google Gemini.
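
To make the idea concrete, here is a rough sketch of the splitting step as an n8n Code node, assuming the CSV has already been parsed so that each row arrives as its own item. The chunk size of 250 is just a placeholder to tune against the model's context window:

```javascript
// Code node ("Run Once for All Items"), rough sketch.
// Groups incoming CSV rows (one n8n item per row) into chunks of
// up to 250 rows, so each chunk stays well under the token limit.
// CHUNK_SIZE is a guess; adjust it for your model.
const CHUNK_SIZE = 250;

const rows = $input.all().map(item => item.json);
const chunks = [];
for (let i = 0; i < rows.length; i += CHUNK_SIZE) {
  chunks.push({ json: { rows: rows.slice(i, i + CHUNK_SIZE) } });
}
return chunks;
```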

What is the error message (if any)?

Please share your workflow


Share the output returned by the last node

Information on your n8n setup

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

You could start with a basic flow like this. You will, of course, need to fill in some details about what the output from the AI step will look like and how you want to aggregate that data after the batch/loop is done. The batch size in the Loop Over Items node controls how many items go to the AI step at one time. Within the loop, you could add a step that combines each batch of input/prompt messages into one larger formatted list, as shown in the sketch below. It's hard to tell what your end goal is, so this is just a starting point.
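
For example, a Code node inside the loop along these lines could do that combining step. This is only a sketch; the prompt wording, and the assumption that every column of each row is relevant, are placeholders you would adapt:

```javascript
// Code node inside the loop ("Run Once for All Items"), sketch only.
// Collapses the current batch of rows into a single prompt string so
// the AI step receives one item per batch rather than hundreds.
const rows = $input.all().map(item => item.json);

// Number each row and serialize it; swap JSON.stringify for a tighter
// per-column format if you want to save tokens.
const table = rows
  .map((row, i) => `${i + 1}. ${JSON.stringify(row)}`)
  .join('\n');

return [{
  json: {
    prompt: `Analyze the following rows and summarize the key patterns:\n\n${table}`,
  },
}];
```

After the loop finishes, you can feed the per-batch answers into an Aggregate node (or one more Code node) to merge them into a single item, and optionally run a final AI call that summarizes the combined summaries.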

Hi, thanks so much for your response. I figured it could look something like this. What exactly do you mean by formatting the answers into a larger list inside the loop?

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.