Binary file uploads to Dropbox are only 58 bytes

I’m trying to upload a .zip file from n8n to Dropbox using a Code node, but Dropbox always receives a file that is only 58 bytes in size — even though the original files are around 250 MB to 2 GB.

When I inspect the binary data inside n8n, I see:
items[0].binary.data.data → "filesystem-v2, 9 bytes"

So it looks like the binary content isn’t being read properly from the filesystem before upload.
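The relevant part of my Code node is essentially this (simplified; token handling and the actual upload request omitted):

// Read the item's binary data before uploading to Dropbox
const fileBuffer = Buffer.from(items[0].binary.data.data, 'base64');
// fileBuffer comes out only a few bytes long, and that tiny buffer is what reaches Dropbox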
Here is the workflow: [workflow screenshot]

My reference for the Code node: [link]

Setup Details

  • n8n version: latest

  • Deployment: Docker (Windows 11 host)

  • Binary data mode: filesystem (N8N_DEFAULT_BINARY_DATA_MODE=filesystem)

  • Environment variables:
    "N8N_FORMDATA_FILE_SIZE_MAX=8192",
    "N8N_RUNNERS_MAX_OLD_SPACE_SIZE=8192",
    "N8N_ENFORCE_SETTINGS_FILE_PERMISSIONS=true",
    "N8N_DEFAULT_BINARY_DATA_MODE=filesystem",
    "N8N_PAYLOAD_SIZE_MAX=8192",
    "NODE_VERSION=22.19.0",
    "NODE_ENV=production",
    "N8N_RELEASE_TYPE=stable",
    "N8N_RUNNERS_TASK_TIMEOUT=600000",
    "N8N_RUNNERS_ENABLED=true",
    "NODE_OPTIONS=--max-old-space-size=8192",
    "N8N_BINARY_DATA_TTL=43200",
    "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
    "YARN_VERSION=1.22.22",
    "NODE_ICU_DATA=/usr/local/lib/node_modules/full-icu",
    "SHELL=/bin/sh"

What I suspect
It seems items[0].binary.data.data contains only a reference string (like "filesystem-v2, 9 bytes") instead of the actual file data, so when Buffer.from() runs, it converts only that short string, not the actual binary file.

How can I properly load the full binary file from n8n’s filesystem mode so the Code node sends the actual file bytes to Dropbox?

PROBLEM:

When N8N_DEFAULT_BINARY_DATA_MODE=filesystem,
n8n does not keep binary data in memory.
Instead, it stores only a reference like:

items[0].binary.data.data = "filesystem-v2, 9 bytes"

so when you run

Buffer.from(items[0].binary.data.data, 'base64')

you only base64-decode that short reference string, never the actual file bytes, which is why Dropbox receives a tiny file.
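A quick sketch of what that call actually does (the reference string here is illustrative, not the exact value n8n stores):

// 'filesystem-v2' stands in for the short reference string n8n keeps in filesystem mode
const ref = 'filesystem-v2';
const buf = Buffer.from(ref, 'base64'); // decodes the 13-character string itself
console.log(buf.length); // roughly 9 bytes: that tiny buffer is the "file" that gets uploaded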

SOLUTION:

Use the built-in helper:

await this.helpers.getBinaryDataBuffer(itemIndex, 'data')

This loads the actual file content from the filesystem automatically, even for GB-size files.
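If the node can receive several items, call the helper once per item; a minimal sketch ('data' is the default binary property name, adjust if yours differs):

// Load the real binary payload of every incoming item
const buffers = [];
for (let i = 0; i < items.length; i++) {
  // the second argument is the binary property name shown in the item's Binary tab
  buffers.push(await this.helpers.getBinaryDataBuffer(i, 'data'));
}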

Replace the JavaScript in your Code node with:

// ✅ Load real binary file data from n8n filesystem
const fileBuffer = await this.helpers.getBinaryDataBuffer(0, 'data');
const fileSizeBytes = fileBuffer.length;

// ✅ Access token (your Dropbox token)
const accessToken = "sample"; // replace with real token

let fileName = $json.fileName || "uploaded_file.zip";
fileName = fileName.trim().replace(/\s+/g, "_");
if (!fileName.toLowerCase().endsWith(".zip")) fileName += ".zip";

// ✅ Dropbox path
const dropboxPath = `/Migration/${fileName}`;

// ✅ Helper: simple upload (up to 150 MB)
async function uploadSimple() {
  return await this.helpers.httpRequest({
    method: 'POST',
    url: 'https://content.dropboxapi.com/2/files/upload',
    headers: {
      'Authorization': `Bearer ${accessToken}`,
      'Content-Type': 'application/octet-stream',
      'Dropbox-API-Arg': JSON.stringify({
        path: dropboxPath,
        mode: 'overwrite',
        autorename: true,
      }),
    },
    body: fileBuffer,
    encoding: null,
  });
}

// Dropbox's simple upload endpoint accepts files up to 150 MB;
// <= ensures a file of exactly 150 MB takes this path instead of opening a session that is never finished
if (fileSizeBytes <= 150 * 1024 * 1024) {
  const response = await uploadSimple.call(this);
  return [{ json: { success: true, method: 'simple', fileSizeBytes, response } }];
}

// ✅ Chunked upload for big files (>150MB)
const chunkSize = 150 * 1024 * 1024;
let offset = 0;

const start = await this.helpers.httpRequest({
  method: 'POST',
  url: 'https://content.dropboxapi.com/2/files/upload_session/start',
  headers: {
    'Authorization': `Bearer ${accessToken}`,
    'Content-Type': 'application/octet-stream',
  },
  body: fileBuffer.slice(0, chunkSize),
  encoding: null,
});
// httpRequest may return a parsed object or a raw JSON string; handle both
const startBody = typeof start === 'string' ? JSON.parse(start) : start;
const sessionId = startBody.session_id;
offset += chunkSize;

while (offset < fileSizeBytes) {
  const isLast = offset + chunkSize >= fileSizeBytes;
  const chunk = fileBuffer.slice(offset, offset + chunkSize);

  const url = isLast
    ? 'https://content.dropboxapi.com/2/files/upload_session/finish'
    : 'https://content.dropboxapi.com/2/files/upload_session/append_v2';

  const headers = {
    'Authorization': `Bearer ${accessToken}`,
    'Content-Type': 'application/octet-stream',
    'Dropbox-API-Arg': JSON.stringify(
      isLast
        ? {
            cursor: { session_id: sessionId, offset },
            commit: { path: dropboxPath, mode: 'overwrite', autorename: true },
          }
        : { cursor: { session_id: sessionId, offset } }
    ),
  };

  await this.helpers.httpRequest({
    method: 'POST',
    url,
    headers,
    body: chunk,
    encoding: null,
  });

  offset += chunk.length;
}

return [{ json: { success: true, method: 'chunked', fileSizeBytes, message: 'Upload complete' } }];
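Two notes on the design: Dropbox's /2/files/upload endpoint accepts at most 150 MB per request, which is why larger files go through upload_session/start, append_v2, and finish, with the cursor offset telling Dropbox how many bytes it has already received. Also, getBinaryDataBuffer reads the whole file into one Buffer; for files approaching 2 GB that is a lot of memory, so it may be worth checking whether your n8n version exposes a streaming helper (getBinaryStream) and feeding chunks from the stream instead.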

Hope this helps!


Why a code node?

Try this as a sub-workflow: [workflow attachment]

When uploading using the built-in Dropbox node, it responds with "invalid array length" (see the GitHub link above).


I know you want me to look at the code on GitHub, but I instead wrote a sample that uploads large files to Dropbox using their upload sessions, in chunks (binary is the only way). I tested it with a 311 MB executable file. It works. So here it is: