Problem Description: I am running a workflow that processes student verification forms from a Google Sheet. The workflow involves downloading images from Google Drive, performing OCR with an AI Agent, and updating the results back to Google Sheets.
Although I have about 42 items to process, the loop consistently stops after the 6th or 7th item. When it hits a specific large image (approx. 8.18 MB), the AI node fails with a “Run out of memory” error, and the entire execution halts.
Workflow Logic:
Google Sheets Trigger: Fetches student data (approx. 42 rows).
Filter: Filters specific records to process.
Loop Over Items: Iterates through each student record one by one.
Download File (Google Drive): Downloads the student’s ID/System screenshot using a Regex URL.
AI Agent (OCR): Analyzes the image to extract the student’s name.
If Node: Compares the AI-extracted name with the Google Sheet record.
Google Sheets Update: Marks the record as “Approved” or “Pending Review”.
Memory Cleanup (Edit Fields): I have an Edit Fields node at the end of the loop with Include Other Input Fields set to OFF, to try to clear the binary data before the next iteration.
Wait Node: A 3-second pause to allow for garbage collection.
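To make the failure mode concrete, here is a minimal Python sketch of what the loop above effectively does in memory terms. The helper names (`download_bytes`, `ocr_bytes`) are hypothetical stand-ins for the n8n nodes; the point is that every downloaded image stays referenced until the whole run ends, so peak RAM grows with item count instead of staying flat per item:

```python
def process_all_in_memory(records, download_bytes, ocr_bytes):
    """Naive loop: every image stays referenced in `history`
    until the run finishes, so peak RAM grows with every item."""
    history = []
    for rec in records:
        img = download_bytes(rec["image_url"])          # whole file held in RAM
        history.append({"record": rec, "binary": img})  # reference never released
        rec["extracted_name"] = ocr_bytes(img)
    return history  # all downloaded images are still alive here
```

This mirrors how n8n keeps each item's binary payload in the execution data; an Edit Fields node at the end of the loop changes the item passed forward, but it does not free what earlier nodes already hold.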
The Issues I’m Facing:
Out of Memory: Even with the “Edit Fields” cleanup node, the AI node crashes when encountering an 8.18 MB JPG file.
Loop Premature Termination: The loop often stops and shows “Success” after only 6 iterations, even though there are 42 items in the input.
Questions:
How can I more effectively handle large binary files (8MB+) in a loop to avoid OOM?
Why would the Loop node finish early when there are still items left in the queue?
Is there a way to force n8n to release the memory of the binary object immediately after the AI node finishes its task?
The issue here seems to be that n8n is running out of memory because binary data is kept in RAM until the execution finishes. The best thing you can do is move each image through the file system instead:
After the Google Drive Download node, add a “Move Binary Data” node and set it to move the data to the file system.
Before the AI OCR node, add another “Move Binary Data” node set to read from the file system.
Once the AI node is complete, add a final “Move Binary Data” node set to delete from the file system.
This way each image is processed and then deleted before the next iteration, and you should be able to process all 42 items successfully. Good luck!
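Outside of n8n, the same download → process from disk → delete pattern looks like this. It is only an illustrative sketch: `download`, `ocr`, and `update_sheet` are hypothetical stand-ins for the Google Drive, AI Agent, and Google Sheets nodes, and the key property is that at most one image exists at a time, on disk rather than in RAM:

```python
import os
import tempfile

def process_students(records, download, ocr, update_sheet):
    """Process records one at a time, streaming each image to a temp
    file and deleting it before the next iteration begins."""
    for rec in records:
        fd, path = tempfile.mkstemp(suffix=".jpg")
        try:
            # Stream the download straight to disk instead of buffering in RAM
            with os.fdopen(fd, "wb") as f:
                for chunk in download(rec["image_url"]):
                    f.write(chunk)
            name = ocr(path)  # OCR reads from the file, not from memory
            status = "Approved" if name == rec["name"] else "Pending Review"
            update_sheet(rec["row"], status)
        finally:
            os.remove(path)   # free the disk space before the next item
```

If you control the n8n instance itself, the documented environment variable `N8N_DEFAULT_BINARY_DATA_MODE=filesystem` makes n8n store all binary data on disk by default, which addresses the same problem at the instance level without extra nodes in the workflow.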