n8n Cloud running out of memory

Hi, I’m using n8n Cloud. I have a workflow that fetches 30 items from an API, splits them out, appends them to Google Sheets, then calls its own webhook to start another round and fetch the next 30 items.

This works dozens of times in a row, but sometimes the Google Sheets node returns an out-of-memory error and the n8n instance crashes.

The API responses involved are not large: with 30 items, each response averages around 70 KB of JSON, including in the runs that crash.

I have two questions:

  1. What might be causing it to run out of memory sometimes, and how would you adjust this workflow? I’ve tried everything I could based on the existing documentation on this issue.
  2. Is there any way to automatically recover from n8n running out of memory, or do I always have to restart the process? This is supposed to run every day, and I can’t be monitoring it constantly.

The only thing I suspect may be at fault is that there are several $jmespath expressions in the Google Sheets node’s inputs that join together some line items.

Hi @Giovanni_Segar :wave: Sorry to hear you’re running into this!

Can I ask a few questions just to get some more information?

  • What version of n8n cloud are you using? Does upgrading to the latest version help? If you need instructions on how to upgrade, let me know!
  • What n8n cloud plan do you use? If you wouldn’t mind, you can DM me the email associated with the account and I could look closer into that (and possibly some logs that might shed some light).
  • Have you tried fewer than 30 items at once? If so, does that make a difference?

@EmeraldHerald I’m using 1.0.5.

I’ll DM the email.

I was originally doing 100 items, then I took it down to 80, 50, and 30. If I have to go below 30, it’s just going to take too long.

I ran a test where I executed the workflow without any $jmespath or $if expressions, and it worked without a hitch. My guess is that the $jmespath function is using a lot of memory. It seems like it shouldn’t be that intensive, though. I’ll run some more tests.

So, I got it working without JMESPath. There was one particular field where I was concatenating all line items into one string, and the line items can be fairly long strings (a few paragraphs each).

Considering the total size of the payload, it doesn’t seem like this should be a problem, but maybe JMESPath is not very performant.

Do you have an Enterprise version that comes with more memory? Or is the only way to get more memory to self-host on a more powerful platform? The client would happily pay more to increase capacity.


Hi @Giovanni_Segar - first and foremost - glad to hear you got this working!

We do have an Enterprise offering; you can fill out the form here and a member of the sales team will get in touch with you. :slight_smile: If Enterprise doesn’t suit you and your client for some reason, another option might be to upgrade your cloud plan to a higher tier.

@EmeraldHerald Does Pro have a higher memory limit for a running workflow? What about Enterprise?

Hi @Giovanni_Segar - both have a higher memory limit, yes. Pro in particular has double the memory of the Starter plan.

And of course, with Enterprise, that’s an entirely different ballgame and something you can talk to sales about :wink:


I am currently on an Enterprise plan and running into memory issues as well; I just wish the platform would work properly. I only have five active workflows.

Hi @antonio_monteiro :wave: I’ve taken a look there, and it looks like another team member has responded to the email you sent over :+1: You should also have some more members of the n8n team getting in touch with you shortly. :bowing_man: