Unable to get buffer size if file is larger than 2 GB

Describe the problem/error/question

I’m trying to get the buffer size of a file, but the Code node fails to execute if the file is larger than 2 GB. I’ve already set N8N_DEFAULT_BINARY_DATA_MODE=filesystem, but the error still pops up. Uploading files larger than 2 GB via the HTTP Request node works, but reading the buffer size doesn’t.

What is the error message (if any)?

File size (2179631187) is greater than 2 GiB null

Please share your workflow

Share the output returned by the last node

Information on your n8n setup

  • **n8n version:** 1.83.2
  • **Database (default: SQLite):** default
  • **n8n EXECUTIONS_PROCESS setting (default: own, main):** default
  • **Running n8n via (Docker, npm, n8n cloud, desktop app):** npm
  • **Operating system:** Windows Server 2022

Hi @Ruriko

Did you try increasing the default environment variable N8N_PAYLOAD_SIZE_MAX? Endpoints environment variables | n8n Docs
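For example (if I remember correctly, the value is in MB, default 16):

N8N_PAYLOAD_SIZE_MAX=4096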


@mohamed3nan doesn’t this only apply to webhooks etc.? That’s why it’s an endpoint ENV variable, since 16 MB is the limit on cloud instances.

@jcuypers I’m not entirely sure, to be honest, but just trying things out — trial and error helps find a starting point to debug the issue…

1 Like

There have been changes to different nodes in order to support >2 GB. Maybe they should make the same changes for this one?

Maybe you can use the ‘Read Binary’ node?

He’s already using the Read node. The issue appears when he tries to get the buffer using the Code node for further operations:

// Fails when the binary data is larger than 2 GiB
const buffer = await this.helpers.getBinaryDataBuffer(0, 'data');

I just tried, and no, it still won’t execute.

I have seen some info in other topics regarding:

  • N8N_RUNNERS_MAX_PAYLOAD (number, default 1073741824): maximum payload size in bytes for communication between a task broker and a task runner.
  • N8N_RUNNERS_MAX_OLD_SPACE_SIZE (string): the --max-old-space-size option to use for a task runner (in MB). By default, Node.js will set this based on available memory.

not sure if it would help.

I tried setting N8N_RUNNERS_MAX_PAYLOAD=4294967296, but it still won’t get the buffer size.

Not sure. I tried to simulate it and already got an error reading a 3 GB file:

which translates to around 500+ MB.

For that Read/Write Files node error you have to set N8N_DEFAULT_BINARY_DATA_MODE=filesystem


Okay, I replicated it with a 5 GB file and got an error.

Stack trace:

Error: Unknown error
    at JsTaskRunnerSandbox.throwExecutionError (/usr/local/lib/node_modules/n8n/node_modules/n8n-nodes-base/dist/nodes/Code/JsTaskRunnerSandbox.js:55:19)
    at JsTaskRunnerSandbox.runCodeAllItems (/usr/local/lib/node_modules/n8n/node_modules/n8n-nodes-base/dist/nodes/Code/JsTaskRunnerSandbox.js:25:20)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at ExecuteContext.execute (/usr/local/lib/node_modules/n8n/node_modules/n8n-nodes-base/dist/nodes/Code/Code.node.js:107:20)
    at WorkflowExecute.runNode (/usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/execution-engine/workflow-execute.js:681:27)
    at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/execution-engine/workflow-execute.js:913:51
    at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/execution-engine/workflow-execute.js:1246:20

Also with a 2.6 GB file, same thing.

I’m not sure if this is a bug we should report…


Hi,

the problem stems from here: getAsBuffer for the filesystem mode uses the standard Node readFile, which supports a max of 2 GiB …

Indeed, a bug / feature request should be made, I guess.

reg,
J.

async getAsBuffer(fileId: string) {
	const filePath = this.resolvePath(fileId);

	if (await doesNotExist(filePath)) {
		throw new FileNotFoundError(filePath);
	}

	// fs.readFile rejects with ERR_FS_FILE_TOO_LARGE for files over 2 GiB
	return await fs.readFile(filePath);
}
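That matches the error in the first post. You can reproduce it outside n8n with any file over 2 GiB (the path is just an example):

const fs = require('fs/promises');

// Rejects with: RangeError [ERR_FS_FILE_TOO_LARGE]:
// File size (...) is greater than 2 GiB
fs.readFile('/tmp/bigfile.bin').catch((err) => console.error(err.message));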

How did you know there’s an explicit file size limit? I couldn’t find it in the code.

Is there a reason to read such large files into memory? Could you use file streams instead?
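For example, something along these lines instead of one giant buffer (a rough sketch; path and chunk size are illustrative):

const { createReadStream } = require('fs');

async function processInChunks(filePath) {
	// highWaterMark caps how large each emitted chunk can be
	const stream = createReadStream(filePath, { highWaterMark: 64 * 1024 * 1024 });

	let total = 0;
	for await (const chunk of stream) {
		// handle each chunk here instead of holding the whole file in memory
		total += chunk.length;
	}
	return total; // byte count, computed without one giant buffer
}

processInChunks('/tmp/bigfile.bin').then(console.log).catch(console.error);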


I searched in the code, and after that :slight_smile:

the search was good old Google.

I’m not an AI cheater :slight_smile:

Also, there might already be a solution in the n8n GitHub code: just above getAsBuffer there is getAsStream, which might solve it? (rough sketch below)

regards
J.
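From memory it looks roughly like this (paraphrased, not an exact quote of the n8n source):

async getAsStream(fileId: string, chunkSize?: number) {
	const filePath = this.resolvePath(fileId);

	if (await doesNotExist(filePath)) {
		throw new FileNotFoundError(filePath);
	}

	// streams the file instead of loading it into a single buffer
	return createReadStream(filePath, { highWaterMark: chunkSize });
}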


Well, we could, but it’s the internal helper functions that use the fixed functions… and not the streams.

In addition, this is set:

      - N8N_RUNNERS_ENABLED=true
      - N8N_DEFAULT_BINARY_DATA_MODE=filesystem
      - N8N_RUNNERS_MAX_PAYLOAD=4000000
      - N8N_RUNNERS_MAX_OLD_SPACE_SIZE=4000
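Side note, in case it matters: N8N_RUNNERS_MAX_PAYLOAD=4000000 is only about 4 MB; 4 GiB would be 4294967296, the value tried earlier.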

Hi @netroy,

There’s a use case involving uploading large files to some providers that support it, within limits. Some of them allow chunked uploads, like 500 MB per chunk.

For example, if I have a 3GB file, I want to read it and use the Code node to split it into chunks and continue the operations from there.

Also, what exactly did you mean by file streams? Could you please clarify?

So it’s considered a bug? How do I convert the file size to bytes given from the Read Files node?
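In the meantime, something like this might work in a Code node (‘Run Once for Each Item’ mode) to turn that human-readable size into bytes (untested; it assumes the node reports strings like ‘2.18 GB’):

// Hypothetical helper: '2.18 GB' -> 2180000000 (decimal units assumed)
function toBytes(sizeString) {
	const units = { B: 1, kB: 1e3, KB: 1e3, MB: 1e6, GB: 1e9, TB: 1e12 };
	const [value, unit] = String(sizeString).trim().split(/\s+/);
	return Math.round(parseFloat(value) * (units[unit] ?? 1));
}

// 'fileSize' is an assumption about the field name coming from the Read node
return { json: { bytes: toBytes($json.fileSize) } };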

I feel like many of us are working on similar workflows :sweat_smile:

By the way, the trick I found until this is solved or someone figures it out for us: I used the ‘Execute Command’ node to get the file size, followed by a Code node to parse it…
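For reference, roughly what that looks like (Linux shown; the path and node name are illustrative). The Execute Command node runs:

stat -c %s /data/bigfile.bin

and a Code node then parses its output:

// stdout comes back as a string, so parse it into a number
const bytes = parseInt($('Execute Command').first().json.stdout, 10);
return [{ json: { bytes } }];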