Please execute the whole workflow, rather than just the node. (Existing execution data is too large.)

Hello everyone.
I was working on an automation where I had to download files from Google Drive and create a draft on Outlook, but I am getting "Please execute the whole workflow, rather than just the node. (Existing execution data is too large.)"

```json
{
  "meta": {
    "instanceId": "e1e4cd3b5a891d2f68b492c2798a3408fcaffc9ea0e4ef5a8cc5180f786b1d84"
  },
  "nodes": [
    {
      "parameters": {
        "method": "POST",
        "url": "=https://graph.microsoft.com/v1.0/me/messages/{{$('Microsoft Outlook').last().json.id}}/attachments",
        "authentication": "predefinedCredentialType",
        "nodeCredentialType": "microsoftOutlookOAuth2Api",
        "sendHeaders": true,
        "headerParameters": {
          "parameters": [
            { "name": "Content-Type", "value": "application/json" }
          ]
        },
        "sendBody": true,
        "specifyBody": "json",
        "jsonBody": "={{ $json }}",
        "options": {}
      },
      "id": "e843decc-ea89-4586-a63c-760cadeac9ab",
      "name": "HTTP Request",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [4260, 280],
      "alwaysOutputData": true,
      "credentials": {
        "microsoftOutlookOAuth2Api": {
          "id": "UOVKaLQKAxhB6bXg",
          "name": "Microsoft Outlook account"
        }
      }
    },
    {
      "parameters": {
        "jsCode": "const formattedAttachments = $input.all().map(async (item) => {\n if (item.binary && item.binary.data) {\n const binary = item.binary.data; // Access the binary data directly\n\n // Get the binary data buffer\n let binaryDataBufferItem = await this.helpers.getBinaryDataBuffer(0, 'data');\n \n // Convert binary data to base64\n const base64Content = Buffer.from(binaryDataBufferItem).toString('base64');\n\n // Add file details (file name and MIME type)\n return {\n \"@odata.type\": \"#microsoft.graph.fileAttachment\",\n \"name\": binary.fileName || \"unnamed-file\", // Use file name if available, otherwise default\n \"contentBytes\": base64Content, // Base64 encoded content\n \"contentType\": binary.mimeType || \"application/octet-stream\" // Use MIME type if available, otherwise default\n };\n } else {\n throw new Error(\"Invalid input: Missing binary data in one of the items.\");\n }\n});\n\n// Await all promises from the map function\nconst formattedAttachmentPromises = await Promise.all(formattedAttachments);\n\n// Return the formatted attachments as JSON\nreturn [{ json: { attachments: formattedAttachmentPromises } }];\n"
      },
      "id": "47130509-45e7-493e-9df9-33802d110068",
      "name": "formatingdata",
      "type": "n8n-nodes-base.code",
      "typeVersion": 2,
      "position": [3780, 280],
      "alwaysOutputData": true
    },
    {
      "parameters": {
        "fieldToSplitOut": "=attachments",
        "options": {}
      },
      "id": "35905825-9260-4acf-8eab-754a5c44c051",
      "name": "Split Out",
      "type": "n8n-nodes-base.splitOut",
      "typeVersion": 1,
      "position": [4040, 280],
      "alwaysOutputData": true
    }
  ],
  "connections": {
    "formatingdata": {
      "main": [[{ "node": "Split Out", "type": "main", "index": 0 }]]
    },
    "Split Out": {
      "main": [[{ "node": "HTTP Request", "type": "main", "index": 0 }]]
    }
  },
  "pinData": {}
}
```
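One thing worth pointing out in the Code node above: it calls `getBinaryDataBuffer(0, 'data')` inside the `map`, so every iteration reads the binary data of item 0 and all attachments end up containing the first file. A minimal standalone sketch of the per-item logic with that fixed, using plain Node `Buffer` objects in place of the n8n `$input`/`this.helpers` APIs so it can run outside n8n (the item shapes here are illustrative, not real n8n items):

```javascript
// Simulated items: in the real Code node these would come from $input.all(),
// and the buffer for item i from this.helpers.getBinaryDataBuffer(i, 'data').
const items = [
  { binary: { data: { fileName: 'a.txt', mimeType: 'text/plain', buffer: Buffer.from('first file') } } },
  { binary: { data: { fileName: 'b.txt', mimeType: 'text/plain', buffer: Buffer.from('second file') } } },
];

// Build one Graph fileAttachment per item, reading THAT item's buffer
// (index i), not item 0 every time.
const attachments = items.map((item, i) => {
  const binary = item.binary && item.binary.data;
  if (!binary) {
    throw new Error(`Invalid input: Missing binary data in item ${i}.`);
  }
  return {
    '@odata.type': '#microsoft.graph.fileAttachment',
    name: binary.fileName || 'unnamed-file',
    contentBytes: binary.buffer.toString('base64'), // base64-encoded content
    contentType: binary.mimeType || 'application/octet-stream',
  };
});

console.log(attachments.map((a) => a.name)); // each file keeps its own name and content
```

In the real node the same shape applies: pass the loop index into `getBinaryDataBuffer` so each attachment is built from its own item.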

Version: 1.67.1
Database: SQLite
RunningOn: Docker
DockerCommand: docker run -d -e WEBHOOK_URL=Domain -e N8N_AVAILABLE_BINARY_DATA_MODES=filesystem -e N8N_DEFAULT_BINARY_DATA_MODE=filesystem -e N8N_BINARY_DATA_STORAGE_PATH=./ --name n8n -p 5678:5678 -v n8n_data:/home/node/.n8n docker.n8n.io/n8nio/n8n

It looks like your topic is missing some important information. Could you provide the following, if applicable:

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

When testing, your execution data is stored in memory. If you don't hit the bin/trash can icon at the bottom of the screen, you can often exceed the memory limits. Just hit that button after each test and it should be fine to run a test again. Alternatively, add a manual trigger and click 'Test Workflow' at the bottom of the screen; that wipes the current execution data and starts again :slight_smile:
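It may also help to see why this particular workflow blows up the execution data so quickly: the Code node base64-encodes every downloaded file into the item JSON, and base64 inflates the raw bytes by about a third, on top of the binary data n8n already holds. A quick sketch (plain Node, no n8n APIs, file size is made up for illustration) of the inflation:

```javascript
// Base64 encodes every 3 raw bytes as 4 characters, so the encoded string
// is roughly 4/3 the size of the original file.
const rawBytes = 9 * 1024 * 1024; // pretend a 9 MB file from Google Drive
const encodedLength = Buffer.alloc(rawBytes).toString('base64').length;

console.log(encodedLength / rawBytes); // ratio of encoded size to raw size, 4/3
```

So a handful of moderately sized attachments carried as JSON strings through the Code node, Split Out, and HTTP Request nodes can easily exceed what the editor will hold for a single-node test run.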

Still getting the same error. Every time I reach the Split Out node, it gives that error.