Describe the problem/error/question
Hi all,
I am wondering if the following workflow is possible/recommended.
I am retrieving a CSV file of about 500 rows and 150 columns. I want to split that file into multiple files of, let's say, 200-300 rows, run an AI analysis on each, and then combine all of those analysis responses into one. The main reason is that a file like that is about 1,000,000 tokens, which is too much even for Google Gemini.
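One way to sketch the splitting step is with a Code node that groups the parsed CSV rows into fixed-size chunks and emits one item per chunk for the downstream AI node. The snippet below is plain Node.js (not literal n8n node code), and the chunk size of 250 is just an assumed value inside the 200-300 range mentioned above:

```javascript
// Sketch: split parsed CSV rows into chunks of ~250 rows so each
// chunk stays well under the model's token limit. Inside an n8n
// Code node the rows would come from `items` instead of a local array.
function chunkRows(rows, chunkSize = 250) {
  const chunks = [];
  for (let i = 0; i < rows.length; i += chunkSize) {
    chunks.push(rows.slice(i, i + chunkSize));
  }
  return chunks;
}

// Example: 500 rows produce two chunks of 250 rows each
const rows = Array.from({ length: 500 }, (_, i) => ({ id: i }));
const chunks = chunkRows(rows);
console.log(chunks.length); // 2
```

After the per-chunk AI calls, a Merge (or a second Code node) could concatenate the responses back into a single result. Alternatively, n8n's built-in Loop Over Items (Split in Batches) node does this batching without custom code.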
What is the error message (if any)?
Please share your workflow
(Select the nodes on your canvas and use the keyboard shortcuts CMD+C/CTRL+C and CMD+V/CTRL+V to copy and paste the workflow.)
Share the output returned by the last node
Information on your n8n setup
- n8n version:
- Database (default: SQLite):
- n8n EXECUTIONS_PROCESS setting (default: own, main):
- Running n8n via (Docker, npm, n8n cloud, desktop app):
- Operating system: