Large Firestore database split into batches

Hello everyone,

I have a large Firestore database with around 75k documents. How can I split it into batches, or use some other method to process it in chunks?

Hi @edvinas, welcome to the community!

You can paginate through large result sets using a structured query, filtering on a suitable field to limit the results you're getting from Firestore: StructuredQuery  |  Firestore  |  Google Cloud

For example, fetch only users with an ID between 0 and 1000, then 1001 and 2000, then 2001 and 3000, and so on. You'd only need a suitable field in your documents to filter on.
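As a rough sketch of that idea, the batch boundaries could be generated like this (the total count and batch size are just example numbers):

```python
def id_ranges(total, batch_size):
    """Yield inclusive (start, end) ID ranges covering 0..total-1 in batches."""
    for start in range(0, total, batch_size):
        yield start, min(start + batch_size - 1, total - 1)

# For ~75k documents in batches of 1000:
batches = list(id_ranges(75_000, 1_000))
# batches[0] is (0, 999), batches[1] is (1000, 1999), and so on.
```

Each (start, end) pair then becomes one query against Firestore, so you never load all 75k documents at once.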

To save memory in n8n you can then build a parent workflow that only keeps track of these range values and does the heavy lifting in a sub-workflow instead. This example post explains the basic idea, but instead of calendar events you'd loop through Firestore documents, of course.
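To make the structured-query part concrete, here is a minimal sketch of the request body you'd send to the Firestore REST API for one batch. The collection name `users` and field name `userId` are assumptions; adapt them to your data:

```python
def build_query_body(field, start, end, page_size):
    """Build a Firestore REST StructuredQuery selecting documents whose
    `field` value lies in the inclusive range [start, end]."""
    return {
        "structuredQuery": {
            "from": [{"collectionId": "users"}],  # hypothetical collection name
            "where": {
                "compositeFilter": {
                    "op": "AND",
                    "filters": [
                        {"fieldFilter": {
                            "field": {"fieldPath": field},
                            "op": "GREATER_THAN_OR_EQUAL",
                            # Firestore encodes integers as strings in JSON
                            "value": {"integerValue": str(start)},
                        }},
                        {"fieldFilter": {
                            "field": {"fieldPath": field},
                            "op": "LESS_THAN_OR_EQUAL",
                            "value": {"integerValue": str(end)},
                        }},
                    ],
                }
            },
            "orderBy": [{"field": {"fieldPath": field},
                         "direction": "ASCENDING"}],
            "limit": page_size,
        }
    }

# One batch of the parent workflow's loop:
body = build_query_body("userId", 0, 999, 1000)
```

The parent workflow would loop over the (start, end) pairs and pass each one to a sub-workflow that POSTs a body like this to the `:runQuery` endpoint.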

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.