As the title states, I have a workflow that stops before completing but doesn't show any error. It is supposed to process roughly 1,400 records, but it only makes it to about 170 and then stops without throwing an error.
It looks like the 'Update Sheet' node is the last one to execute. One other note: the execution log says each execution errors out after a few milliseconds, when in reality the workflow runs for at least a few minutes before it stops.
Removing the Wait node seemed to let the workflow process a little further. I also increased the batch size to 500, and it then stopped at the Set node, so it does appear to be a memory issue, although I can't figure out how this one workflow is chewing through 600+ MB of memory while processing 1,400 rows from an Excel sheet.
You could iterate over ranges on the sheet, and then rethink the Code node (if you really need it); both of these are recommendations from the "How to reduce memory consumption in your workflow" doc:
Right now it is getting all the rows at once; you could instead try to split them up and fetch them in ranges of 100 or so. I don't have access to Office365, but I assume you would use this parameter to specify the range:
I'll give it a shot, but I'm not quite sure how to go about it. If each row has a total of 13 columns (A to M), how would I configure these fields in order to iterate over all 1,400 rows?
If I had to do it, I'd create an array of starting row numbers: [1, 201, 401, 601, …],
split it into separate items,
then iterate over them, and on each iteration take the current row number and construct the range as A{row_num}:M{row_num+199}.
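The steps above could be sketched in an n8n Code node roughly like this. This is only an illustration, assuming the 1,400 rows and columns A to M from the thread; `buildRangeItems` is a hypothetical helper name, and batch size 200 matches the [1, 201, 401, …] example:

```javascript
// Build one n8n item per 200-row batch, so a downstream node can fetch
// the sheet range A{start}:M{end} from each item instead of all rows at once.
function buildRangeItems(totalRows, batchSize) {
  const items = [];
  for (let start = 1; start <= totalRows; start += batchSize) {
    const end = Math.min(start + batchSize - 1, totalRows);
    // n8n Code nodes expect items shaped like { json: { ... } }
    items.push({ json: { range: `A${start}:M${end}` } });
  }
  return items;
}

// In the Code node itself you would end with something like:
// return buildRangeItems(1400, 200);
```

Each emitted item then carries a range string (e.g. `A1:M200`) that the sheet node can reference via an expression, so only one batch of rows is in memory per iteration.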
I discovered that the HubSpot node was returning a ridiculous amount of data. I simplified its output, and the workflow now runs without issue. Thanks for your help @jabbson