I’m trying to build a workflow that collects all rows of data from an API. The API returns up to 1000 records per page, and I’m able to set the page number as a parameter of the API URL. The API does not return a cursor. How do I create a workflow that queries the API until it returns no further records (either fewer than 1000 records in a response, or a blank response), combines all pages of responses, and then dumps all records into a PostgreSQL database?
Information on your n8n setup
- n8n version: 0.206.1
- Database you’re using (default: SQLite): PostgreSQL on Supabase
- Running n8n with the execution process [own(default), main]: own
- Running n8n via [Docker, npm, n8n.cloud, desktop app]: n8n.cloud
You can use an IF node to check if there are fewer than 1000 records. To do this, first aggregate the items with the Item Lists node, and then use `length` in an expression to check the size of the array.
When there are 1000 records, simply loop back to the HTTP Request node and call for the next page.
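The loop logic is easier to see outside of n8n first. Here is a minimal sketch in plain JavaScript — `fetchPage` is a hypothetical stand-in for your HTTP Request node, not a real API; the stopping conditions are the two the original question names (a short page or a blank response):

```javascript
// Page size the API caps responses at (per the question: 1000).
const PAGE_SIZE = 1000;

// Collect every row by requesting page 1, 2, 3, ... until the API
// returns a short or empty page. fetchPage(page) is a stand-in for
// the HTTP Request node and should return an array of records.
async function collectAllRows(fetchPage) {
  const allRows = [];
  let page = 1;
  while (true) {
    const rows = await fetchPage(page); // e.g. GET /data?page=1
    if (!rows || rows.length === 0) break; // blank response -> done
    allRows.push(...rows);
    if (rows.length < PAGE_SIZE) break; // short page -> this was the last one
    page += 1; // full page -> loop back for the next one
  }
  return allRows;
}
```

In the workflow itself, the `while` loop becomes the IF node routing back to the HTTP Request node, and `rows.length < PAGE_SIZE` is the `length` expression checked after the Item Lists aggregation.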
I’ve also created a community node for this, as it can be quite annoying to deal with sometimes.
Bram, I’m further confused, because my API already returns what is basically a table. There doesn’t seem to be an appropriate table operation that makes sense here — all of them would transform my existing table into something that doesn’t align with what I want. I just want to accumulate the 1000-entry pages repeatedly until I have all of them, ready to submit to Postgres.
Alternatively, I’d be happy to post them to Postgres in batches of 1000 and just increment the page by 1 over and over until we reach the end.
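That batch-per-page alternative is often the simpler shape, since nothing has to be held in memory or merged at the end. A sketch of the idea, under the same assumptions as before — `fetchPage` and `insertBatch` are hypothetical stand-ins for the HTTP Request and Postgres nodes:

```javascript
// Page size the API caps responses at.
const PAGE_SIZE = 1000;

// Fetch a page, insert it into Postgres straight away, and repeat
// until a short or blank page signals the end. insertBatch(rows)
// stands in for the Postgres node (e.g. a multi-row INSERT).
async function streamPagesToPostgres(fetchPage, insertBatch) {
  let page = 1;
  let total = 0;
  while (true) {
    const rows = await fetchPage(page);
    if (!rows || rows.length === 0) break; // blank response -> done
    await insertBatch(rows); // write this page before fetching the next
    total += rows.length;
    if (rows.length < PAGE_SIZE) break; // short page -> last page
    page += 1;
  }
  return total;
}
```

In n8n terms this is the same loop as before, except the Postgres node sits inside the loop (after the HTTP Request node) instead of after it, so each iteration writes its own batch.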