Is there a maximum string size when reading a binary file?
I am trying to read a very large binary file and use the Move Binary Data node to add this binary string to a JSON key. I then use a Function node to produce JSON records by splitting the input string on the newline character. My file is 235 MB, and I get the following errors:
During Move Binary Data node execution: PayloadTooLargeError: request entity too large.
During Function node execution: There was a problem running hook “workflowExecuteAfter” RangeError: Invalid string length.
Hey @Gowthaman_Prabhu, the maximum payload size can be adjusted using the
N8N_PAYLOAD_SIZE_MAX environment variable.
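For example, when running n8n directly, the variable can be set before starting the process. This is a minimal sketch; the value is interpreted in MiB (the default is 16), and 256 here is just an illustrative size:

```shell
# Raise the maximum payload size from the 16 MiB default to 256 MiB.
# The value is in MiB; restart n8n for the change to take effect.
export N8N_PAYLOAD_SIZE_MAX=256
n8n start
```

If you run n8n via Docker, pass the variable to the container instead, e.g. `docker run -e N8N_PAYLOAD_SIZE_MAX=256 …`.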
Though in general I’d suggest avoiding such amounts of JSON data. Subsequent nodes in your workflow would keep their own copy of the data, driving up memory consumption big time. So you might want to split the 235 MB file you’re working with into smaller chunks before processing it with n8n.
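Since your Function node splits the data on newlines anyway, the pre-splitting suggested above can be done outside n8n with standard tools. A minimal sketch using coreutils `split` (the filenames and the chunk size of 10000 lines are illustrative, not recommendations):

```shell
# Split a large newline-delimited file into chunks of 10000 lines each.
# Output files are named chunk_aa, chunk_ab, chunk_ac, ...
split -l 10000 large_export.txt chunk_

# Each chunk can then be fed to the workflow separately.
ls chunk_*
```

Splitting on line boundaries (rather than byte sizes with `split -b`) keeps each JSON record intact, which matters here because the workflow treats every line as one record.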
Thank you. Yes, splitting large inputs before the n8n workflow is viable.
@MutedJam Is N8N_PAYLOAD_SIZE_MAX also applicable for local file reads or only when fetching data via REST?
This setting affects HTTP traffic, but such traffic also occurs when executing a workflow manually (data is sent to and from the UI) or when viewing past execution data.