Managing a query that's too large and generates a status 413 error

Hi @MutedJam,
Indeed, I was wondering if an implementation similar to this would just move the problem to the end of my workflow, since I would end up using the same amount of memory or generating a file that's too big to handle once again.

I think I should be able to get around that if I write the results of each call & transformation to my end destination (currently a Google Sheet) in batches of 500. My question is whether doing it this way wouldn't end up putting a similar load on n8n to what I'm doing currently after 8 or so loops. I've put a rough sketch of what I mean below.
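Just to illustrate the batching idea in plain code (not n8n-specific, and `appendRowsToSheet` is a hypothetical placeholder for whatever actually writes the rows, e.g. the Google Sheets node), something like this is what I have in mind: only one chunk of 500 is sent per request instead of the whole payload.

```typescript
type Row = Record<string, unknown>;

// Split the full result set into chunks of a given size.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Hypothetical stand-in for the actual write step (Google Sheets in my case).
async function appendRowsToSheet(rows: Row[]): Promise<void> {
  console.log(`appending ${rows.length} rows`);
}

// Append each batch separately so every request stays small
// and never hits the 413 limit.
async function writeInBatches(allRows: Row[]): Promise<void> {
  for (const batch of chunk(allRows, 500)) {
    await appendRowsToSheet(batch);
  }
}
```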

I’ll try it out this afternoon and come back to you if I still get a 413 error. Thanks!