I have a couple of workflows with HTTP nodes that fetch flat files around 300-400 KB in size. These nodes always show a warning when opened, saying the data is too large to be displayed. However, the workflows themselves execute successfully, and all downstream nodes process the data correctly.
Despite this, the workflow always appears with an error status in the execution history. On further investigation, I found that if I remove the node that fetches the large flat file, the error status goes away. I understand this may be intentional, to keep the execution history from blowing up in size by storing every large file that is fetched, but it would be much better if the status accurately reflected whether the workflow succeeded. Marking the workflow with an error status seems wrong, since it did execute correctly, even though one of its nodes fetched a large amount of data.
Is this something that can be fixed? I don’t expect the execution history to store the fetched data (although it would be nice to leave that choice to the user). Just being able to see the accurate outcome of the executed workflow would be enough.