Best Practice Question - Check database updates regularly

Trying to figure out the best way to do this, so would love some best practice ideas.

Oversimplifying, but basically I’m going to trigger with a cron or schedule (say, every hour). That will trigger a Postgres DB query. The returned data will be pushed to HubSpot.

What I want is for my trigger or query to work such that I only get rows returned that have been modified or added since the last time the workflow was triggered. Is there a way to do this? Ideally, I’d know the last run and current run times and then use a last modified value in my tables to grab the rows that fall in that range or something like that.

Thanks!

Hi @jmr123 :wave:

So this is a little bit tricky, but if you have a flow that you know runs every 30 minutes, you can set a date to reflect that, along the lines of {{ $now.minus({ minutes: 30 }) }}.

The downside is that will be when the workflow started, not when it finished. To work out the finish time you might need static data:

An idea would be perhaps the last step of the workflow could be to set a lastRunTime variable, and the first step of the workflow could be to read it. :+1:
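A minimal sketch of that idea in a Code node might look like the following. Note this is an illustration, not the exact n8n API surface: in a real workflow you would call $getWorkflowStaticData('global'), while here a plain object stands in for that store so the snippet is self-contained, and the one-hour fallback window is an assumption for the first run.

```javascript
// Stand-in for n8n's $getWorkflowStaticData('global') so this runs anywhere.
const staticData = {};

// First step of the workflow: read the previous finish time.
// On the very first run there is no stored value, so fall back to
// "now minus 1 hour" (assumed to match the hourly schedule).
function getLastRunTime(now = new Date()) {
  if (staticData.lastRunTime) return new Date(staticData.lastRunTime);
  return new Date(now.getTime() - 60 * 60 * 1000);
}

// Last step of the workflow: record when this run finished, so the
// next run picks up exactly where this one left off.
function setLastRunTime(now = new Date()) {
  staticData.lastRunTime = now.toISOString();
}

// The DB query step would then filter on that window, e.g.
// SELECT * FROM contacts WHERE last_modified > $1 AND last_modified <= $2
```

Because the finish time is written as the last step, rows modified while the workflow was running fall into the next run's window instead of being skipped.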


Thanks…getWorkflowStaticData might work.

For the record, this is VERY hard to work with, since you can’t see the stored values while testing. So if you need the value in a DB query (for example), you have to hard-code a fake static value to get your DB query working, then switch back to the dynamic getWorkflowStaticData value, and it’s hard to tell whether all of that is actually working.


I’ve been through a similar problem.

My solution was adding a new column to my table and setting a boolean flag: 0 for not processed, 1 for processed.

SELECT * FROM my_table WHERE processed = 0

and updating that column after processing it

UPDATE my_table
SET processed = 1
WHERE id = X

Of course, you need an id for each item you’re processing.
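The processed-flag approach above can be sketched end to end like this. An in-memory array stands in for the Postgres table, and the table/column names and the pushToHubspot step are illustrative placeholders, not anything from the original post:

```javascript
// In-memory stand-in for a Postgres table; names are illustrative.
const rows = [
  { id: 1, email: 'a@example.com', processed: 0 },
  { id: 2, email: 'b@example.com', processed: 0 },
];

// Equivalent of: SELECT * FROM my_table WHERE processed = 0
function fetchUnprocessed() {
  return rows.filter((r) => r.processed === 0);
}

// Equivalent of: UPDATE my_table SET processed = 1 WHERE id = $1
function markProcessed(id) {
  const row = rows.find((r) => r.id === id);
  if (row) row.processed = 1;
}

// Workflow body: push each new row downstream, then flag it so the
// next scheduled run skips it.
for (const row of fetchUnprocessed()) {
  // pushToHubspot(row); // hypothetical HubSpot step
  markProcessed(row.id);
}
```

One design note: flag the row only after the push succeeds, otherwise a failed HubSpot call would leave the row marked as processed and it would never be retried.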


This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.