I have an n8n workflow that pulls every unread email on a daily basis, stitches the text together, and sends the whole block to an OpenAI summarisation node.
Whenever the daily haul gets large (roughly tens of messages), the summarisation node simply hangs. I believe the overall payload is too large; even with chunking enabled, it still hangs.
What is the best way to design a workflow for a use case like mine?
I would use the Loop node to iterate over the email items, summarise each individual email, and then at the end bring it all together into a final summary. This way you call the summarisation node in smaller chunks.
Something like this rough example:
You might not even need the final summary step; only add it if you want one combined overview.
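To make the loop-then-combine idea concrete, here is a minimal sketch in plain JavaScript of the same map-reduce pattern: summarise each email on its own, then summarise the short summaries. The `summarizeText` function is a hypothetical stand-in for the OpenAI call your n8n node would make (here it just returns the first sentence so the shape of the flow is visible).

```javascript
// summarizeText is a hypothetical stand-in for a real OpenAI call.
// For illustration it just returns the first sentence of the text.
async function summarizeText(text) {
  return text.split(/(?<=[.!?])\s/)[0];
}

async function summarizeEmails(emails) {
  // "Map" step: summarise each email individually,
  // so every call carries one small payload instead of the whole inbox.
  const perEmail = [];
  for (const email of emails) {
    perEmail.push(await summarizeText(email.body));
  }

  // "Reduce" step: combine the short per-email summaries into one
  // final summary. Skip this if the per-email summaries are enough.
  return summarizeText(perEmail.join(' '));
}
```

In n8n terms, the "map" step is the Loop node calling the summarisation node once per item, and the "reduce" step is a final summarisation call over the collected results.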