Glad to hear you figured out what was wrong here @Edmunds_Priede, thanks for confirming!
n8n can be quite memory-hungry, as it applies no built-in limits when processing large amounts of data. If it's mostly n8n consuming the memory on your system, it can be worth reducing the amount of data processed in a single workflow execution.
A quick example of how to address this in n8n: I was recently working with a user who had trouble importing all of their Google Calendar events. In such a scenario it can make sense to first define a sensible pagination scheme in the parent workflow (in this case, date ranges of one month each) and then have the parent call a sub-workflow for each of these pages, with the sub-workflow returning only a very small dataset, if any.
That way, memory is only needed for the duration of each sub-workflow execution, each of which processes just a subset of your data, and it becomes available again afterwards.
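For reference, the parent's pagination logic could be built with a Code node along these lines. This is just a rough sketch rather than the exact nodes from the workflows below; the three-year range and the `rangeStart`/`rangeEnd` field names are placeholders you'd adjust to your own data:

```js
// n8n Code node (Run Once for All Items) in the parent workflow:
// split a fixed period into one-month pages, one item per month.
const start = new Date('2021-01-01T00:00:00Z'); // placeholder range start
const end = new Date('2024-01-01T00:00:00Z');   // placeholder range end

const pages = [];
let from = new Date(start);
while (from < end) {
  // First day of the following month (UTC)
  const to = new Date(Date.UTC(from.getUTCFullYear(), from.getUTCMonth() + 1, 1));
  pages.push({
    json: {
      rangeStart: from.toISOString(),
      rangeEnd: (to < end ? to : end).toISOString(),
    },
  });
  from = to;
}

// Each item can then be passed to the sub-workflow
return pages;
```

Each resulting item can then be handed to an Execute Workflow node calling the sub-workflow, which reads the range (for example via expressions like `{{ $json.rangeStart }}`) and fetches only that month's events.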
The workflows looked like this:

Parent: (workflow screenshot)

Sub-workflow: (workflow screenshot)
Now only a month's worth of data would be processed at once rather than three years' worth.
Hope this helps folks facing the same problem!