Utterly disappointed with memory limits & lack of error logs on cloud

I wanted to convert a 60-page PDF into image files on cloud-hosted n8n. To my disappointment, the workflow crashed with a ‘connection lost’ error when I tried to decompress a 36 MB zip file. Reading the forums got me thinking that it’s a memory issue, so I upgraded my plan to Pro. It still does not work and crashes.

Maybe it’s not a memory issue, maybe it is. I still need more information to be able to debug. It feels like I got tricked into buying a higher-tier subscription.


Absolutely agree with this.

Hey @HK_Jeong
Sorry to hear that, and I definitely feel your pain - perhaps you can try requesting a refund?

Just leaving this here for others who might come across the same issue.
Here are the memory limits on the different cloud plans. I think I mentioned this during a feedback session, but I’m pretty sure these specs were decided before AI workflows really took off. As you can see, even for the Pro plans it’s really not a lot!

Trial: 320MiB RAM, 10 millicore CPU burstable
Starter: 320MiB RAM, 10 millicore CPU burstable
Pro-1 (10k executions): 640MiB RAM, 20 millicore CPU burstable
Pro-2 (50k executions): 1280MiB RAM, 80 millicore CPU burstable
Enterprise: 4096MiB RAM, 80 millicore CPU burstable

Hey there. I decided to keep my Pro account so I could share my workflow.

I set up a Code node and a separate web service to work around this problem. That kind of negates the whole purpose of a no-code tool, so I hope you could do something about it in the future - both for this node and for observability in general.
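Since I can't paste the whole workflow, here's a minimal sketch of the offloading idea for anyone hitting the same wall: instead of letting the Zip node decompress everything in one go, the external service walks the archive and handles one entry at a time, so only a single decompressed file sits in memory at once. This is an illustrative sketch (function names and the in-memory demo archive are mine, not from my actual workflow), shown in Python's stdlib:

```python
import io
import zipfile

def extract_entries(zip_bytes: bytes):
    """Yield (filename, data) pairs one entry at a time.

    Only one decompressed entry is held in memory at any moment,
    instead of materializing the whole extracted archive at once.
    """
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for info in zf.infolist():
            if info.is_dir():
                continue
            with zf.open(info) as fh:
                yield info.filename, fh.read()

# Demo: build a tiny in-memory archive, then stream its entries out.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("page-1.png", b"fake image bytes")
    zf.writestr("page-2.png", b"more fake bytes")

names = [name for name, _ in extract_entries(buf.getvalue())]
```

In the real setup the service receives the zip over HTTP from the workflow and uploads each extracted page to storage before moving on to the next, which keeps peak memory close to the size of one page rather than the whole archive.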

Just to get the word out there - decompression currently does not work when the file is over roughly 20 MB.

Appreciate the service
