Hi, I'm trying to set up a workflow that processes the output of the n8n audit command; I'm specifically interested in the Credentials section at the moment.
I've tried both the n8n node and the CLI interface to generate the audit, but in both cases I get the same error:
```
FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
```
I've increased the memory for Node up to `--max-old-space-size=3072`, but that doesn't seem to be enough.
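For reference, the post-processing I have in mind is just isolating that section from the JSON report, along these lines (the exact report shape and the `risk` key are assumptions on my part, so the `jq` filter may need adjusting for your n8n version):

```shell
# Write the audit report to a file, then pull out only the credentials risk.
# The select() on a "risk" field is an assumed report structure -- adjust it
# to match what your n8n version actually emits.
n8n audit > audit.json
jq '.. | objects | select(.risk? == "credentials")' audit.json
```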
That FATAL ERROR is a classic: it means the audit command is trying to load your entire execution history into RAM to check for credential usage, and 3 GB just isn't cutting it.
Here are the three quickest ways to get past it:

1. **More memory.** You need well over 3 GB if your database is large. Try setting `export NODE_OPTIONS="--max-old-space-size=8192"` (8 GB) just for this command.
2. **Prune executions.** The audit is likely choking on old execution logs. Run `n8n db:prune` (or set `EXECUTIONS_DATA_MAX_AGE=24`) to clear out the history, then try again.
3. **Workaround.** If your server can't handle 8 GB+ of RAM, export your workflows/credentials and import them into a fresh local n8n instance (e.g. on your laptop). Run the audit there; it will be near-instant since there's no history to scan.
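As a one-off shell session, options 1 and 2 look roughly like this (the `EXECUTIONS_DATA_*` variables are n8n's standard pruning env vars; the specific values are examples, not recommendations):

```shell
# Option 1: raise Node's heap limit just for the audit run.
export NODE_OPTIONS="--max-old-space-size=8192"   # 8 GB, example value
n8n audit

# Option 2: cap execution history first (MAX_AGE is in hours),
# prune, then re-run the audit.
export EXECUTIONS_DATA_PRUNE=true
export EXECUTIONS_DATA_MAX_AGE=24
n8n db:prune
n8n audit
```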
I didn't know it loads the entire execution history. We have increased ours to:

- `…MAX_COUNT: 50000`
- `…MAX_AGE: 504`
I suppose that explains it. Previously I was able to complete the audit with 2.5 GB, but I think that was before we increased the execution history. We want to keep the history at this level, so I can't delete it just for the audit; besides, I want to run the audit periodically in a workflow.
I have just found an env var `N8N_SECURITY_AUDIT_DAYS_ABANDONED_WORKFLOW` in the code. I will experiment with this, as it may reduce the number of executions being loaded during the audit.
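In case anyone lands here later, the experiment is simply this (the value `7` is an example; the variable appears to be a number of days, but check the source for your n8n version):

```shell
# Shrink the "abandoned workflow" window so the credentials audit
# loads far fewer executions. 7 is an example value, in days.
export N8N_SECURITY_AUDIT_DAYS_ABANDONED_WORKFLOW=7
n8n audit
```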
Thank you!
Edit: I have reduced the `N8N_SECURITY_AUDIT_DAYS_ABANDONED_WORKFLOW` value and that did the trick. Obviously there is a downside: the "Credentials not used in recently executed workflows" part of the audit is no longer reliable, but that is acceptable for my current use case.