Time Saved Insights - How to get an accurate calculation

We’re very excited about the time saved insights on Enterprise! However, we’ve run into some situations that make it tricky to calculate.

Workflow challenge 1:

  • Workflow contains a process that iterates over records, not just one record.
  • If the workflow runs and saves 5 minutes when processing 1 record in the flow, the time savings is 5 minutes.
  • If it runs and processes 10 records in the flow, the time savings should actually be 50 minutes (n × time saved).
  • The time savings should be multiplied by the number of records in the execution, not counted once per execution. Alternatively, it could be applied at the node-run level, based on how many times a given node was called.
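The multiplier described above can be sketched like this (illustrative TypeScript only; the function name and signature are assumptions, not n8n's actual API):

```typescript
// Hypothetical sketch: time saved should scale with the number of
// records processed in an execution, not with the execution count.
function timeSavedForExecution(
  minutesSavedPerRecord: number,
  recordsProcessed: number
): number {
  // n records * per-record savings, instead of a flat per-execution value
  return minutesSavedPerRecord * recordsProcessed;
}
```

With the numbers above, `timeSavedForExecution(5, 1)` gives 5 minutes and `timeSavedForExecution(5, 10)` gives 50 minutes, which is what the calculation should report.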

Workflow challenge 2:

  • Given the above limitations, we started to explore modularizing the part of the workflow that processes a given record.
  • We split it out into a different workflow, with the intention to call it in a parent workflow to get it to execute and track 1x per record.
  • The challenge with this is that time saved does NOT apply to sub-workflow executions, so the calculation never gets tracked.

Workflow challenge 3:

  • In order to get a parent workflow to trigger a sub-workflow execution that actually tracks time, we decided to call it via a webhook.
  • This is creating additional overhead, complexity and confusion to work around the time tracking issue.
  • Beyond that - if we push changes to GitHub and pull them into our prod instance, the base path of the URL we provided in the webhook call in the parent workflow does not change. It remains the dev env URL instead of updating to the prod env webhook URL.
  • Our prod env is read-only, so the user can’t just update it when it gets to prod.
  • The one workaround we found is to leverage variables to use as the base path for webhooks, but as you can imagine, this gets messy and is confusing for our users.
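The variable workaround can be sketched roughly like this (illustrative TypeScript; the variable name and helper are assumptions, not part of n8n):

```typescript
// Hypothetical sketch of the workaround: build the webhook URL from an
// environment-specific base-path variable so the same workflow definition
// resolves to the dev URL in dev and the prod URL in prod.
function webhookUrl(baseUrl: string, path: string): string {
  // Normalize slashes so the variable can be set with or without a
  // trailing slash in either environment.
  return `${baseUrl.replace(/\/+$/, "")}/${path.replace(/^\/+/, "")}`;
}
```

The parent workflow would then reference the variable instead of a hard-coded host, e.g. `webhookUrl(baseUrl, "/webhook/process-record")`, where `baseUrl` comes from an environment-specific variable. It works, but as noted above, pushing this onto users is messy.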

Solutions I think could work:

  • Allow us to select whether we want to track the sub-workflow time saved instead of defaulting to NOT tracking it
  • Allow us to select which node has a “multiplier” so it multiplies time saved against each trigger of the node

Has anyone faced similar? Any other workarounds in the meantime?


Hey,

Thanks a ton for laying out these real-world scenarios so clearly!

The way time saved is currently calculated is indeed pretty constrained. Right now it only looks at the top-level execution, so workflows that call sub-workflows can underestimate the true impact. Unfortunately, I can’t think of another workaround to make the calculations more accurate.

Your idea of letting you decide how sub-workflows are treated (tracked or not) and adding a per-node multiplier for loops makes a lot of sense. I’m passing this feedback to our product team so we can evaluate what it would take to make the calculation more flexible.


This should be pretty high on the priority list to fix, because it massively changes the perception of the benefits of n8n. There is a slight difference between reporting 20 hours saved per day and 150…

I was also looking for this information. I am building a workflow that I want to re-use for customers and, of course, display the time saved as part of the marketing and value proposition.

A very simple but workable solution that would meet all these requirements would be to add a “time_saved” value as a settings field on every node.

If you fill in a value and that node runs, the time_saved value gets added to the main workflow statistic.

So if my workflow, e.g., just looks up a caller ID in the CRM and plays it back to my voice agent, this might save me 15-30 seconds, while booking an appointment might save another 30; a full AI-based summary of the call, combined with the “call_duration” value transferred to n8n via a post-call webhook, is a whole other story.

So I wish the time saved field were open for JSON values as well.

Options like the ones I described would at least solve my problem instantly.
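The per-node idea boils down to summing node-level contributions (a sketch in TypeScript; the `NodeRun` shape and field names are made up for illustration, not an n8n data structure):

```typescript
// Hypothetical sketch of the proposed per-node "time_saved" setting:
// each node run contributes its own configured value, and the workflow
// total is the sum across all node runs in an execution.
interface NodeRun {
  node: string;
  timeSavedSeconds: number; // value configured in the node's settings
}

function totalTimeSaved(runs: NodeRun[]): number {
  return runs.reduce((sum, r) => sum + r.timeSavedSeconds, 0);
}
```

For the voice-agent example above, a CRM lookup run worth 20 seconds plus a booking run worth 30 seconds would report 50 seconds saved for that execution, and a node that runs once per record would naturally contribute once per record.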