Best way to integrate OpenAI model with large dataset

Hi fellow n8ners!

I am looking for some guidance from fellow n8n experts here.

How would you handle a workflow that needs to feed 10K rows through the OpenAI node and write the results back into a Google Sheet? I tried connecting a rowAdded trigger to a Loop Over Items (Split in Batches) node with a batch size of 500, with the OpenAI node inside the loop. Every time I test the OpenAI node, its inputs disappear and it produces no output. When I execute the inputs from within the node, they come back, but as soon as I test the overall step again they disappear.

Is my overall approach not the best?

n8n version: 1.75.2
Database: SQLite
n8n EXECUTIONS_PROCESS: own, main
Running n8n: n8n cloud
Operating System: macOS Big Sur Version 11.7.10
