Problem in node 'Read Binary Files': Cannot create a string longer than 0x1fffffe8 characters

Okay, so this might be a bug. I'm trying to load an 850 MB binary file to use with Pinecone, and it seems the node loads the whole file into a single string.

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Install command:

docker run -d  -e "N8N_PAYLOAD_SIZE_MAX=2000" -e "NODE_OPTIONS=--max-old-space-size=4096" --memory 5000m --name n8n -p 5679:5678 -v n8n_data:/home/node/.n8n docker.n8n.io/n8nio/n8n:ai-beta

EXECUTIONS_PROCESS:
No idea

OS:
Artix Linux

Hey @dRAT3,

By default we handle all binary data in memory, and it looks like this file is too large for that. Can you try setting the N8N_DEFAULT_BINARY_DATA_MODE environment variable to filesystem and trying again?
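
For reference, here is a sketch of your same install command with that variable added (assuming the rest of your flags stay unchanged):

docker run -d -e "N8N_PAYLOAD_SIZE_MAX=2000" -e "NODE_OPTIONS=--max-old-space-size=4096" -e "N8N_DEFAULT_BINARY_DATA_MODE=filesystem" --memory 5000m --name n8n -p 5679:5678 -v n8n_data:/home/node/.n8n docker.n8n.io/n8nio/n8n:ai-beta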

Okay, thanks, this works. But now my Convert to JSON step hits the exact same error. Is there a way I could solve this with a code block?

Hey @dRAT3,

You could try reading the data using streams or in chunks to see if that helps, but I don't have an official example of doing this in the Code node.
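
Purely as a rough sketch, not an official example: you can stream the file from disk in fixed-size chunks so no single string ever reaches the V8 limit. This assumes the Code node is allowed to use Node's built-in modules (e.g. NODE_FUNCTION_ALLOW_BUILTIN=fs) and that '/data/input.bin' is a hypothetical path to your file on disk; both need adapting to your setup.

// Sketch only: stream a large file in chunks inside the Code node.
// Assumes NODE_FUNCTION_ALLOW_BUILTIN=fs and a hypothetical file path.
const fs = require('fs');

const filePath = '/data/input.bin'; // hypothetical path, replace with your own
const chunkSize = 16 * 1024 * 1024; // 16 MB per chunk instead of one huge string

const results = [];
const stream = fs.createReadStream(filePath, { highWaterMark: chunkSize });

let index = 0;
for await (const chunk of stream) {
  // Handle each chunk separately so nothing approaches the V8 string
  // length limit (0x1fffffe8 characters).
  results.push({ json: { index, bytes: chunk.length } });
  index++;
}

return results;

How you process each chunk (parse it, push it to Pinecone, etc.) depends on your data format; the point is just that the file is never turned into one string.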

Which node are you using to convert to JSON? It could be that we need to update the node to work with larger data sets.

I believe it was Convert to/from binary data with Set All Data enabled, but I've deleted the flow, so I'm not 100% sure.

Being able to work with larger datasets would be awesome. I'd be the first to make a video on how to do it with the docker n8n beta, since I've already got all the commands in my cheat sheet for setting up a working environment with large files.

Also, if you could make working with large files async in logical places, that would be overpowered. I'd be happy to help out with testing.

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.