Daily AI summary of my inbox: payload too large and hangs n8n

I have an n8n workflow that pulls every unread email on a daily basis, stitches the text together, and sends the whole block to an OpenAI summarisation node.

Whenever the daily haul gets large (roughly tens of messages), the AI summarisation node simply hangs. I believe the overall payload is too large, so even with chunking it still hangs.

What is the best way to design a workflow for a use case like mine?

Just curious: how many emails are we talking about?

If the payload is too large, maybe do some cleaning first.

I suspect many of those emails don't need an AI summary at all, so we could filter them out first.

Then consider sending only the subject, or just the first N characters of each email body.
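The filter-and-truncate idea above could look something like this. A minimal sketch: the field names (`from`, `subject`, `body`), the skip list, and the 2000-character cap are all assumptions standing in for whatever your trigger node actually emits.

```python
# Sketch of pre-filtering emails before they reach the AI summarisation node.
# `emails` mimics the items an n8n email trigger might emit; field names,
# the skip list, and the character cap are illustrative assumptions.

SKIP_SENDERS = {"noreply@newsletter.example"}  # hypothetical "no summary needed" list
MAX_BODY_CHARS = 2000  # send only the first N characters of each body

def prefilter(emails):
    kept = []
    for mail in emails:
        if mail["from"] in SKIP_SENDERS:
            continue  # drop emails that don't need an AI summary
        kept.append({
            "subject": mail["subject"],
            "body": mail["body"][:MAX_BODY_CHARS],  # truncate long bodies
        })
    return kept
```

In n8n terms this would be an IF/Filter node plus a Code or Set node before the AI call; the point is just to shrink the payload before any LLM sees it.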


It is about 30 emails per day, each a detailed report on a topic.

In my use case, unfortunately, all of them need to be summarized. Any idea what the best way to do this is?

I am thinking one way would be to vectorize them first and use RAG.

I'm not sure RAG is a good method for this, though.

Maybe run every email through a quick AI summarization first.

(30 emails → 30 short summaries)

Then use a final summarization pass to sort or split those summaries into different levels.

(30 short summaries → 1 long summary)

I would use the Loop node to iterate over each email item, summarize the individual email, and then at the end bring it all together into a final summary. This way you call the summarization node in smaller chunks.
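The loop-then-combine approach above is a map-reduce pattern, which can be sketched like this. The `summarize` function here is a hypothetical placeholder for the OpenAI node call (it just truncates to a word count), so the structure is what matters, not the placeholder logic.

```python
# Map-reduce summarisation sketch: summarize each email individually ("map"),
# then summarize the concatenated short summaries ("reduce").
# `summarize` is a hypothetical stand-in for the real LLM call.

def summarize(text, max_words=30):
    # Placeholder: a real workflow would call the OpenAI node here.
    words = text.split()
    return " ".join(words[:max_words])

def summarize_inbox(emails):
    # Map step: one small LLM call per email keeps each payload small.
    short_summaries = [summarize(mail["body"]) for mail in emails]
    # Reduce step: one final call over the 30 short summaries.
    combined = "\n".join(short_summaries)
    return summarize(combined, max_words=100)
```

Each call stays well under the context limit, because no single call ever sees more than one email body (map) or 30 short summaries (reduce).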

Something like this rough example:

You might not need the final summary; only add that step if you want it.


That works! At least for emails that are not too long, each one can be summarized well without exceeding the context window.

Thanks! Looks like looping is the way to go; it works well for me.

Also look at other models with bigger context windows, like Claude (200k tokens) or Gemini (1M+).

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.