My workflows become extremely slow when they get large and start processing a lot of text data.
I understand that some slowdown is expected with bigger workflows, but I’d like to know if there are any best practices or tips to improve performance. I’ve noticed that the biggest drop in speed seems to happen when n8n loads the output data from each node — especially when dealing with heavy or lengthy text.
Best is to split the processing into smaller batches and then handle each batch in a sub-workflow. When a sub-workflow finishes, it returns the data from its last executed node and releases everything else from memory. So if you make sure the sub-workflow does not return the heavy data, the main workflow will stay fast and snappy.
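
As a rough illustration, here is a minimal sketch of a Code node used as the final node of the sub-workflow (assuming a recent n8n version where the Code node exposes `$input.all()` in "Run Once for All Items" mode). It returns only a tiny summary item instead of the processed text, so the main workflow receives almost nothing back:

```javascript
// Final node of the sub-workflow: return a small summary instead of the heavy text.
// The main workflow only ever sees this tiny object, so it never has to
// load the large processed payload into memory.
const processed = $input.all();

return [
  {
    json: {
      itemsProcessed: processed.length,       // how many items this batch handled
      finishedAt: new Date().toISOString(),   // when the batch completed
    },
  },
];
```

In the main workflow you would then loop over the batches with a Loop Over Items (Split In Batches) node and call the sub-workflow with an Execute Workflow node, so only these small summary items accumulate in the main execution.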