Describe the issue/error/question
I’m trying to build a function that collects all rows of data from an API. The API returns up to 1000 records per page, and I’m able to set the page as a parameter of the API URL. The API does not return a cursor. How do I create a workflow that queries the API until it returns no further responses (either <1000 records in a response or a blank response), combines all pages of the responses, and then dumps all records into a PostgreSQL database?
Information on your n8n setup
n8n version: 0.206.1
Database you’re using (default: SQLite): PostgreSQL on Supabase
Running n8n with the execution process [own(default), main]: own
Running n8n via [Docker, npm, n8n.cloud, desktop app]: n8n.cloud
You can use an IF node to check whether there are fewer than 1000 records. Do this by first aggregating the items with the Item Lists node, and then simply using `length` in an expression to check the size of the resulting array.
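To make the "length in an expression" part concrete: after the Item Lists node aggregates everything into a single array field (assumed here to be called `data` — use whatever output field name you actually configured), the IF node condition could look something like:

```
{{ $json["data"].length < 1000 }}
```

If this evaluates to true, you are on the last page; otherwise, loop back for the next page.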
So when there are 1000 records, simply loop back to the HTTP Request node and call for the next page.
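Outside of n8n, that loop can be sketched in plain JavaScript. Here `fetchPage` is a hypothetical stand-in for the HTTP Request node, and the 1000-record page size comes from the API described above:

```javascript
const PAGE_SIZE = 1000; // max records the API returns per page

// Fetch pages until a page comes back with fewer than PAGE_SIZE records
// (or is empty), collecting all rows into one array.
function collectAllRows(fetchPage) {
  const allRows = [];
  let page = 1;
  while (true) {
    const rows = fetchPage(page);       // one API call per page
    allRows.push(...rows);              // aggregate (Item Lists node's job)
    if (rows.length < PAGE_SIZE) break; // short or empty page: last page reached
    page += 1;                          // otherwise loop back for the next page
  }
  return allRows;
}
```

The `< PAGE_SIZE` check covers both end conditions from the question: a partial page and a completely blank response.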
I’ve also created a community node for this, as it can be quite annoying to deal with sometimes.
Hey Bram, could you please be more specific in your reply?
Or can you validate that this is what the workflow should look like?
- Start Trigger - Every Day at 6:00 am
- Query the API
- Item List Node to Aggregate
- “Use Length in an Expression” < This I don’t understand
- If = 1000 records, return to the API call node, which will loop back through the Item List node and the IF check node
- When we get to <1000 records, submit the table to Postgres?
Bram, I’m further confused, because my API already returns what is basically a table. There doesn’t seem to be an appropriate table operation that makes sense here; all of them would transform my existing table into something that doesn’t align with what I want. I just want to assemble the 1000 entries repeatedly until I have all of them ready to submit to Postgres.
Alternatively, I’d be happy to post them to Postgres in batches of 1000, and just increment the page by 1 over and over until we reach the end.
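That batch-per-page alternative can also be sketched in plain JavaScript. Again, `fetchPage` and `insertBatch` are hypothetical stand-ins for the HTTP Request and Postgres nodes, not real n8n APIs:

```javascript
const PAGE_SIZE = 1000; // max records the API returns per page

// Insert each page into Postgres as soon as it arrives, then fetch the
// next page, until a short or empty page signals the end of the data.
function syncInBatches(fetchPage, insertBatch) {
  let page = 1;
  let total = 0;
  while (true) {
    const rows = fetchPage(page);        // query one page of the API
    if (rows.length > 0) {
      insertBatch(rows);                 // e.g. one multi-row INSERT per page
      total += rows.length;
    }
    if (rows.length < PAGE_SIZE) break;  // <1000 records or blank: done
    page += 1;                           // otherwise iterate the page by 1
  }
  return total;
}
```

This avoids holding every record in memory at once, at the cost of one database write per page.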
This is my logic so far:
Thanks so much in advance.
Have you checked out the example here: Pagination Scraping html - #2 by MutedJam
It has a loop that increments a page count by 1, which sounds like it would do what you are after.