Checking memory usage in cloud instance

Is there any node or console that can report the memory being used by a cloud instance while a workflow is running?

We’ve been hitting what appear to be out-of-memory issues, but we have no way to evaluate how to optimize because the points of failure change from execution to execution.

Can you also explain the meaning of some of the information given below?

  • executionMode: regular
  • concurrency: 20

Thanks

Instance information

Debug info

core

  • n8nVersion: 1.83.2
  • platform: docker (cloud)
  • nodeJsVersion: 20.18.3
  • database: sqlite
  • executionMode: regular
  • concurrency: 20
  • license: community
  • consumerId: 00000000-0000-0000-0000-000000000000

storage

  • success: all
  • error: all
  • progress: false
  • manual: true
  • binaryMode: filesystem

pruning

  • enabled: true
  • maxAge: 720 hours
  • maxCount: 25000 executions

client

  • userAgent: mozilla/5.0 (windows nt 10.0; win64; x64) applewebkit/537.36 (khtml, like gecko) chrome/134.0.0.0 safari/537.36
  • isTouchDevice: true

Generated at: 2025-04-01T15:28:19.107Z

Hi @Dave_Schafer, please check our docs for details on environment variables: see here for execution modes, this for concurrency, and this for information about memory limits.
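For reference, on a self-hosted instance the two values from the debug info above map to environment variables (they are not user-settable on n8n Cloud). A minimal sketch, assuming a Docker or shell deployment where you control the environment:

```shell
# Execution mode ("regular" runs executions in the main process; "queue"
# distributes them to workers) and the production concurrency limit,
# matching the values shown in the debug info.
export EXECUTIONS_MODE=regular
export N8N_CONCURRENCY_PRODUCTION_LIMIT=20

# On self-hosted setups you can also raise the Node.js heap ceiling (in MB)
# if large executions are hitting out-of-memory errors.
export NODE_OPTIONS="--max-old-space-size=4096"
```

On Cloud these limits are managed for you, which is why tuning has to happen at the workflow level instead.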
I hope this helps!

Thanks for the info on the mode and concurrency.

On the memory side, I’m looking for a solution to know how much of our cloud instance memory is being used by a single execution. Ideally the peak memory used since that will allow us to plan the amount of concurrency we can use.

@Dave_Schafer, at the moment we don’t expose that information to end users.

Hi. Since that is the case, how do we troubleshoot complex workflows that are running out of memory? I need some sense of where the memory usage is happening so we can refactor. It’s not an obvious problem to solve on this type of platform.
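One partial workaround, sketched below as plain Node.js: drop a Code node between heavy steps and log `process.memoryUsage()` at each checkpoint to see where the heap grows. This assumes the Code node sandbox exposes Node's `process` global, which is not guaranteed on n8n Cloud, so treat it as an assumption to test.

```javascript
// Hypothetical helper for a memory checkpoint; the same calls would run
// inside an n8n Code node *if* `process` is available in the sandbox.
const toMB = (bytes) => Math.round((bytes / 1024 / 1024) * 10) / 10;

function logMemory(label) {
  // heapUsed: JS heap actually in use; rss: total resident memory of the process
  const { heapUsed, rss } = process.memoryUsage();
  console.log(`[${label}] heapUsed=${toMB(heapUsed)} MB rss=${toMB(rss)} MB`);
  return { heapUsed, rss };
}

// In an n8n Code node you would call logMemory('after-http-request') and then
// `return $input.all();` so items pass through the checkpoint unchanged.
const snapshot = logMemory('checkpoint');
```

Comparing the logged values before and after suspect nodes at least narrows down which step is responsible, even without per-execution peak-memory reporting from the platform.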

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.