I have spent two days and countless iterations trying to get a reset expression to work properly on a Loop Over Items node. The flow is simple.
I load a list of webpages from a file and loop over them to feed an HTTP Request node for a web scraper. The output of that is then analyzed by an AI Agent, but the context gets too large, so I use a second Loop Over Items node to break it into pieces and let the AI Agent extract in parts.
Initially it works very well. The problem appears when the second webpage gets scraped and its JSON is fed into the Loop Over Items node in the same execution.
Since the loop node is already in its finished state, all the items, old and new together, are output on the done branch of the loop node.
I try to reset the node, but I need it to reset only when it receives an item batch from the HTTP Request node, and I can't manage to make the expression work. I have tried resetting when the name of the previous node is HTTP Request, and also when the input does not carry the noItemsLeft context that the loop adds, but neither works. I have tried other, more complex expressions using IDs and comparisons as well.
It feels like this should be easy, but I just can't get it to work: the node never resets, and all items are output over done after the first batch is finished.
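For reference, the kind of reset expressions I have been trying in the loop node's Reset option look roughly like this (node names are just examples from my setup):

```
// reset when the previous node was the HTTP Request (did not work)
{{ $prevNode.name === 'HTTP Request' }}

// reset when the input does not carry the loop's noItemsLeft context (did not work either)
{{ $node["Loop Over Items"].context["noItemsLeft"] === undefined }}
```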
Hello!
I understand the frustration. Loop behavior in n8n, especially when combined with AI Agents and long-running workflows, can get tricky.
From your explanation, it sounds like the issue is caused by the Loop Over Items node accumulating items across different HTTP response batches instead of “resetting” for each new input.
Possible solutions, I guess:
1. Avoid using the done output when looping over multiple batches
The done output of a Loop Over Items node emits all items processed so far, not just from the current batch. That’s why you’re seeing the accumulation.
→ Use the loop output or an internal sub-loop, and connect the next logic step from inside the loop. This way, you process items one-by-one and avoid mixing batches.
2. Use an intermediate node to tag batches
Right after your HTTP Request node (which returns a new batch of pages), you can:
Add a Set node with a unique batchId (e.g. using {{$now}} or a static UUID for each file).
Add that batchId to each item.
Then, inside the loop, only process items with the current batchId using a conditional (If or Code) node (see the sketch below).
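A minimal sketch of the tagging step, assuming a Code node set to run once for all items, placed right after the HTTP Request node (names are placeholders, adjust to your workflow):

```js
// Tag every item of the incoming HTTP batch with a shared batchId
const batchId = $now.toMillis(); // unique enough per HTTP response

return $input.all().map(item => ({
  json: {
    ...item.json,
    batchId, // downstream If/Code nodes can filter on this field
  },
}));
```

Inside the loop, an If node could then compare {{ $json.batchId }} against the batchId of the response you are currently processing and route anything else away from the loop.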
Thanks for the reply. About option 1: is there a way to stop the Loop Over Items node from going into its finished state? As it is now, the first batch coming from the HTTP request works well and loops over the items, but as soon as it finishes, the node sets itself to finished and I can't find a way to stop that or reset the node; any further input then goes through the done output.
I have tried the second option as well, and the same thing happens: the loop node goes into its finished state and no longer loops over the items.
So far, the only thing that has shown any result is resetting the node on any input, so it starts again when the HTTP request sends an array of items. The problem is that anything that comes back to re-trigger the node is treated as new input and resets the node as well, so there is no looping over the current batch of items either. I have an If node after my loop that checks whether the output is the last item of the current batch, to stop infinite loops; that works well, but I need the loop node to reset only when it receives new data from the HTTP request.
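To give an idea, the stop check on the If node is roughly equivalent to testing the loop's context, something like this (my actual node name differs):

```
{{ $node["Loop Over Items"].context["noItemsLeft"] }}
```

What I am still missing is a reset expression that evaluates to true only when the incoming items come straight from the HTTP request, and false when they come back around the loop.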
I think that would definitely work. I have not done it yet because I wanted to have the entire process in a single workflow; my HTTP request call has limited resources, so if it activates multiple times it will crash. I think I may be able to set a timer or a callback to avoid this, but I was hoping to solve it within the same workflow.
Nevertheless, if I don't manage to solve this today, it will become a sub-workflow for sure.
I have decided to work with sub-workflows. It means many changes, but by storing data in sheets and gathering it one by one, I managed to get rid of the other Loop Over Items nodes. I kept only the one after the HTTP request (the only one with content big enough to be worth it), so by using external triggers as a loop I managed to accomplish the same thing in a simpler way, just spread over several workflows.