Execution Data Saving Conditions


The Idea is:

Although I’ve activated data pruning, the size of the execution_entity table has risen to 1 TB. Some of my workflows check data with HTTP Request nodes and decide whether the workflow will continue or not. These workflows trigger every 3-5 minutes on average.

When I analyzed the execution_entity table, 30% of the data was meaningless to me because of the case I mentioned above.
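For anyone who wants to check their own numbers, something like the following can show the table size and the breakdown by execution status. This is a sketch assuming PostgreSQL as the backing database; the `execution_entity` column names (e.g. `status`) vary between n8n versions, so adjust as needed:

```sql
-- Total on-disk size of the executions table (PostgreSQL built-ins)
SELECT pg_size_pretty(pg_total_relation_size('execution_entity')) AS total_size;

-- How executions break down by status; the "success" share is what
-- the proposed option would let you skip saving
SELECT status, COUNT(*) AS executions
FROM execution_entity
GROUP BY status;
```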

A workaround would be splitting each workflow in two: one that collects the data and one that works on the data. The collector workflow wouldn’t save execution data when it succeeds, but that would just be taking the easy way out. I imagined a solution that is more powerful and flexible.

If I were able to define a condition in the workflow settings for saving successful executions, giving me the ability to skip saving execution data when a selected node is the last executed node, it would cover the issue.

Are you willing to work on this?

I already did this for my environment and I’m planning to build a custom Docker image and use it in production. For now I can only select a single node, but it would be better to be able to select multiple nodes. There are also some edge cases that need to be handled, such as what happens if I delete the node I’ve selected here.
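To make the idea concrete, a hypothetical shape for the setting in the workflow JSON could look like this. The `settings` / `saveDataSuccessExecution` keys exist in n8n workflow JSON today; the `skipSaveWhenLastNode` key and the node names are purely illustrative of what such an option might look like, not an actual n8n setting:

```json
{
  "settings": {
    "saveDataSuccessExecution": "all",
    "skipSaveWhenLastNode": ["Check For New Records", "No Items Found"]
  }
}
```

With a list of node names instead of a single node, the multiple-node case is covered, and a missing name (e.g. after deleting a node) could simply be ignored.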

Another solution could be giving this ability to the No-Op node as an option in its node settings, and then placing a No-Op node with that option enabled wherever we don’t want the execution data to be saved.

In brief, with that tiny option I made, I will save 30% of my DB cost and have a more meaningful, cleaner execution history.
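Putting rough numbers on the claim, using only the figures from this post (a 1 TB table, ~30% skippable rows, polling every 3-5 minutes) as assumptions:

```python
# Back-of-the-envelope estimate using the numbers quoted in the post;
# these are assumptions, not measurements.
MINUTES_PER_DAY = 24 * 60
AVG_TRIGGER_INTERVAL_MIN = 4      # workflows trigger every 3-5 minutes
TABLE_SIZE_GB = 1024              # execution_entity is about 1 TB
SKIPPABLE_FRACTION = 0.30         # ~30% of rows were "meaningless"

# Each polling workflow writes this many execution rows per day
runs_per_workflow_per_day = MINUTES_PER_DAY / AVG_TRIGGER_INTERVAL_MIN

# Storage that skipping those successful runs would avoid
saved_gb = TABLE_SIZE_GB * SKIPPABLE_FRACTION

print(f"~{runs_per_workflow_per_day:.0f} runs/day per polling workflow")
print(f"~{saved_gb:.0f} GB of execution data avoided")
```

So every polling workflow alone contributes hundreds of rows a day, most of which just say "nothing to do".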

Yes very much so!

Similar to existing req:

This is a very cool idea that I hope will be implemented.
I use a lot of schedule-triggered workflows, which most of the time just check whether there is something to execute.
