Hello,
I am building a workflow that lets me query large CSV files by splitting them into smaller batches. It consists of three workflows in total: one general (main) workflow, one that gets the CSV file, and a third that queries the batches; the results of the batches are then combined in the main one.
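For context, the logic I'm trying to reproduce with the workflows is roughly this (just a Python sketch with placeholder names and a made-up batch size, not my actual modules):

```python
import csv

BATCH_SIZE = 100  # rows per batch; my real batch size may differ


def read_batches(path, batch_size=BATCH_SIZE):
    """Split a large CSV into smaller lists of rows (the 'get the file' part)."""
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        batch = []
        for row in reader:
            batch.append(row)
            if len(batch) == batch_size:
                yield batch
                batch = []
        if batch:
            yield batch


def query_batch(batch):
    """Placeholder for the third workflow: some data transformation plus one Gemini call."""
    # In the real setup this is a Gemini module, not code.
    return {"rows": len(batch)}


def main(path):
    # Main workflow: loop over the batches, query each one, combine the results.
    results = [query_batch(b) for b in read_batches(path)]
    return {"total_rows": sum(r["rows"] for r in results)}


if __name__ == "__main__":
    print(main("large_file.csv"))
```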
I always get a 503 error, even though I separated it as much as I could. I do get the message in Slack with the correct file, so in my head that means the workflow that gets the information isn't the issue, but I am not sure. *It's not letting me paste all three workflows, so I posted the main one and the one that gets the information. The missing one just consists of some data transformation modules and one Google Gemini call.