Hi everyone,
We are stuck trying to process a large product feed (XML, ~174MB) on n8n Cloud. We are hoping someone has a clever workaround for processing large files when the source server forces compression on every response.
The Context:
- Environment: n8n Cloud (v1.121.3).
- Source: A client’s product feed (hosted on Channable/Cloudflare).
- File Size: 174MB XML.
- Goal: Split the XML into items and process them in batches with AI.
The Blockers:
1. “Download All” leads to OOM. Since we are on n8n Cloud, we hit memory limits. We run with `binaryDataMode: filesystem`, so downloading the file itself is fine. However, passing this 174MB file to the XML to JSON node causes an immediate Out of Memory (OOM) crash, presumably because it builds the entire JSON object in RAM before splitting.
2. “Streaming/Chunking” fails due to forced GZIP. We attempted to build a “manual chunking” workflow using the HTTP Request node with `Range` headers (e.g., `bytes=0-100000`).
- The problem: The source server (Cloudflare) ignores `Accept-Encoding: identity` and forces a `Content-Encoding: gzip` response even for partial content.
- The result: We receive a partial chunk of a GZIP stream. Since we don’t have the gzip header/dictionary for the middle chunks, n8n cannot decompress them (error: `unknown compression method`, or just garbage characters).
3. “Manual Streaming” in the Code Node is restricted. We tried writing a Code Node that streams the binary data from disk using `await this.helpers.getBinaryStream(0, 'data')`, to manually parse/slice the XML without loading it all into RAM.
- The problem: We get `Error: The function "helpers.getBinaryStream" is not supported in the Code Node` (likely a Cloud restriction).
- Using `getBinaryDataBuffer` on the full file also causes OOM.
The Question: Is there any way in n8n Cloud to:
A) Stream/parse a large XML file from disk (filesystem mode) line-by-line, without loading the whole structure into JSON first?
B) Successfully handle a forced GZIP response on an HTTP Range request?
We are stuck between OOM on one side and GZIP corruption on the other. Any pointers would be greatly appreciated!
Thanks!