SplitInBatches - Clear/Purge Cache at end of each Iteration

I have a workflow that produces a lot of data on each iteration of Split In Batches. It visibly slows down as it runs (each iteration is slower than the last); I assume this is because it keeps all of the accumulated data in memory.

Is there a way to purge data that has already been processed (and is therefore no longer needed) at the end of each iteration, so that speed is maintained?

Please share the workflow

Information on your n8n setup

  • n8n version: Latest
  • Database you’re using (default: SQLite): SQLite
  • Running n8n with the execution process [own(default), main]: own
  • Running n8n via [Docker, npm, n8n.cloud, desktop app]: Docker

You could try splitting the workflow into multiple workflows. To alleviate memory issues, call a sub-workflow via a webhook after your Split In Batches node; the memory for each batch is then reclaimed once that sub-workflow execution finishes.

In the receiving workflow's Webhook node you'll want to set the Respond option to respond immediately, so the calling workflow isn't left waiting for the batch to finish processing.
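As a rough illustration, here is what that receiving Webhook node could look like in an exported workflow. This is a minimal sketch, not a full export: the `path` value is a placeholder, and `responseMode: "onReceived"` is (as far as I know) the setting that makes the webhook respond as soon as it is called rather than when the workflow finishes.

```json
{
  "name": "Webhook",
  "type": "n8n-nodes-base.webhook",
  "parameters": {
    "httpMethod": "POST",
    "path": "process-batch",
    "responseMode": "onReceived"
  }
}
```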

Good idea - you can put an HTTP Request node at the end and a Webhook node at the start to create a loop through multiple workflows.
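A sketch of the other half of that loop: an HTTP Request node at the end of the batching workflow that calls the sub-workflow's webhook. The URL is a placeholder, and the exact parameter names can differ between HTTP Request node versions, so treat this as an assumption-laden example rather than a copy-paste config.

```json
{
  "name": "Call Sub-workflow",
  "type": "n8n-nodes-base.httpRequest",
  "parameters": {
    "url": "http://localhost:5678/webhook/process-batch",
    "requestMethod": "POST"
  }
}
```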

Keen to hear if there are any other solutions from the n8n team, however.

Are you logging execution progress? Also, how are you checking the data? When you run it in the browser, n8n keeps everything to show you, but run in the background it might not be so bad.

Hi Jon, yes, logging execution progress - if that's turned off, does it make things faster?

And yes, viewing it in the browser.

Yeah, both of those things will slow it down :slight_smile: