Loop within a loop?

Hello,

I have a question: I retrieve PDF uploads from an online server, and sometimes at the end of the month there are more than a hundred of them (175 in this example). The API behind it and n8n can't keep up.
Is it possible to do it in such a way that, if there are more than 50, we fetch the first 50, then the next 50, and so on up to the full 175?

Thank you and have a good day!

Hi @Larmier_Anthony

I don't understand the exact problem.
You have an API that can only give you 50 records at a time?

It might help to share your workflow and/or which API you are using.

Hello :slight_smile:

Ah no, the API works fine and sends all the PDFs. The problem, I think, is n8n, which crashes without giving a reason… so I'm trying to work around the problem…
I already had this problem when fetching around forty of them; I set a higher batch interval and it went through, but for 170 PDFs that isn't enough any more…

Ah OK, I think it is the size of the PDFs then.
Have you taken a look at the Split In Batches node?
You can use this to loop through the batches. Do make sure you move the PDF handling into a separate workflow though, which you can run with the Execute Workflow node.
One final thing: make sure to remove the data from the last node in this sub-workflow so the data isn't returned to the main workflow. :slight_smile:

Hope this makes sense. There should also be some examples around.
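
Roughly, the pattern looks like this. This is a conceptual sketch in plain TypeScript, not n8n code; all names (`PdfItem`, `handleBatch`, `mainWorkflow`) are made up just to illustrate what Split In Batches plus a sub-workflow does:

```typescript
// Conceptual sketch only: NOT n8n code, just the pattern that
// Split In Batches + Execute Workflow implements. All names are made up.

interface PdfItem {
  name: string;
  data: Uint8Array; // binary PDF content
}

// Stand-in for the sub-workflow: handles one batch of PDFs and
// deliberately returns nothing, so no binary data flows back to the caller.
async function handleBatch(batch: PdfItem[]): Promise<void> {
  for (const pdf of batch) {
    // ... upload / convert / store the PDF here ...
    console.log(`processed ${pdf.name} (${pdf.data.length} bytes)`);
  }
}

// Stand-in for the main workflow: split all items into batches of 50
// and hand each batch to the sub-workflow one at a time, instead of
// pushing all 175 PDFs through in a single run.
async function mainWorkflow(allPdfs: PdfItem[], batchSize = 50): Promise<void> {
  for (let i = 0; i < allPdfs.length; i += batchSize) {
    const batch = allPdfs.slice(i, i + batchSize);
    await handleBatch(batch); // the "Execute Workflow" step
  }
}
```

The point of the sub-workflow returning nothing is that the main workflow never has to hold all the PDF data at once.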

Thanks for your help, I'll look into that then!

If you're self-hosted, you can also look at setting N8N_DEFAULT_BINARY_DATA_MODE to filesystem to reduce the memory burden by not holding the PDFs in memory.

https://docs.n8n.io/hosting/environment-variables/#deployment
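
For example, assuming a self-hosted install that is configured through environment variables (how you set them depends on your setup, e.g. a shell, a .env file, or your Docker configuration):

```
# Assumption: self-hosted n8n started from a shell or similar.
# Store binary data (the PDFs) on disk instead of keeping it in memory.
export N8N_DEFAULT_BINARY_DATA_MODE=filesystem
```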

Hello, thank you for your answer. I did as you said and the problem of downloading the PDFs is solved; now it's the upload that is the problem… Do you have any idea?

Do you see if the RAM usage goes up to 100% before it crashes?

Quite a bit of it is used, but there is still some free! I will see if I can upgrade the VM and check whether that solves the problem.

Otherwise, Split In Batches works; I just need to figure out how to make it take the first ten, then the next ten, and so on until there are none left.

Simply loop back to the Split In Batches node, i.e. connect the last node in the loop back to the input of the Split In Batches node.
If you want the workflow to continue after doing this, you will need to add an IF node and split the flow there: either carry on or keep looping back to Split In Batches. This isn't needed if you simply want it to finish when it is done.
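
If it helps, here is that loop as a conceptual sketch in plain TypeScript (not n8n code; `processBatch` and `continueRestOfWorkflow` are hypothetical helpers standing in for the nodes inside and after the loop):

```typescript
// Conceptual sketch of "loop back to Split In Batches" — not n8n code.
// processBatch and continueRestOfWorkflow are hypothetical helpers.

async function processBatch(batch: string[]): Promise<void> {
  console.log(`processing ${batch.length} items`);
}

async function continueRestOfWorkflow(): Promise<void> {
  console.log("loop finished, main workflow continues");
}

async function batchLoop(items: string[], batchSize = 10): Promise<void> {
  let index = 0;
  // Each pass of this loop is one trip through the Split In Batches node.
  while (index < items.length) {
    const batch = items.slice(index, index + batchSize);
    index += batchSize;
    await processBatch(batch); // the nodes inside the loop

    // The "IF" node: keep looping while items remain, otherwise fall through.
    if (index >= items.length) {
      break;
    }
  }
  await continueRestOfWorkflow(); // whatever comes after the IF node
}
```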