Binary File Data Not Accessible When Passed via Execute Sub-Workflow Node

The problem

n8n Extract Node Not Working in Subworkflow - Binary File Data Lost When Passed Through Execute Sub-Workflow Node

I’m experiencing an issue where the Extract node fails to process binary file data within a subworkflow when the file is passed via the “Execute Sub-Workflow” node. The binary data is sent in the correct n8n format with all required fields (mimeType, fileType, fileExtension, data, fileName, fileSize), but the extraction fails silently.
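For reference, this is roughly the shape of the item being sent (a sketch only; the values below are made up, and I'm assuming the file sits under the default binary property name "data", which is what the Extract node looks for unless configured otherwise):

// JavaScript – illustrative item shape, e.g. returned from a Code node (example values)
return [
  {
    json: { source: "upload" },        // regular metadata travels here
    binary: {
      data: {                          // default binary property name
        mimeType: "application/pdf",
        fileType: "pdf",
        fileExtension: "pdf",
        fileName: "invoice.pdf",
        fileSize: "48 kB",
        data: "JVBERi0xLjQK..."        // base64-encoded file content (truncated)
      }
    }
  }
];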


Key Problem Details:

  • ✅ Works: Extract node functions correctly when testing subworkflow independently using “Testing n8n sub-workflow” trigger
  • ❌ Fails: Extract node produces no output when subworkflow is executed via “Use case (When executed by another workflow)” trigger
  • Binary data format used: Standard n8n binary object with mimeType, fileType, fileExtension, data, fileName, fileSize

Attempted Solution - “Include Other Input Fields” Method: During research, I discovered that when using Extract nodes after intermediate processing nodes (like Set, Code, or other transformation nodes), you must enable “Include Other Input Fields” in those nodes to preserve binary data.

However, this solution doesn’t work for subworkflows. Even when all Set nodes within the subworkflow have “Include Other Input Fields” enabled, the Extract node still fails when the binary data comes from a parent workflow via the Execute Sub-Workflow node.

Root Issue:
How to properly pass and extract binary file data in n8n subworkflows when called by parent workflows? This workflow pattern should be fundamental for modular n8n automation design.

What is the error message (if any)?

No explicit error message. The Extract node executes but produces no output and no error - it simply fails silently to extract the file content. This makes debugging particularly challenging.
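A temporary Code node placed right after the sub-workflow trigger at least shows whether the file reaches the sub-workflow at all (a debugging sketch; it only outputs summary JSON, so remove it once you’re done):

// JavaScript – temporary debug node: list the JSON and binary keys of each incoming item
return $input.all().map((item, index) => ({
  json: {
    index,
    jsonKeys: Object.keys(item.json || {}),
    binaryKeys: item.binary ? Object.keys(item.binary) : []   // empty array = the binary never arrived
  }
}));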

Please share your workflow

Main Workflow (calls subworkflow):

Sub-Workflow (contains Extract node):

Information on your n8n setup

  • n8n version: 1.99.1
  • Database (default: SQLite): default
  • n8n EXECUTIONS_PROCESS setting (default: own, main): default
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker on Railway.com (docker.n8n.io/n8nio/n8n)
  • Operating system: Linux

Tags: #n8n #subworkflow #extract-node #binary-data #file-processing #execute-sub-workflow #workflow-automation #include-other-input-fields

Related Issues: n8n binary data handling, subworkflow file processing, Extract node limitations, Execute Sub-Workflow node binary data transfer, “Include Other Input Fields” not working in subworkflows

Hi there, why not just make the sub-workflow receive all data? I tried it and it works.

You just need to format the data you want to send in the parent workflow, but the sub-workflow can simply be set to accept all data, so the binary will not get blocked.
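For example, something like this in a Code node in the parent workflow, right before the Execute Sub-Workflow node (the extra field name is just an example):

// JavaScript – parent workflow: shape the JSON you want to send, but keep the binary attached
return $input.all().map(item => ({
  json: {
    ...item.json,
    targetFormat: "text"      // example of extra metadata for the sub-workflow
  },
  binary: item.binary         // forwarded untouched so it doesn't get blocked
}));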

If this helps, please mark it as the answer.


Hello @fahmiiireza, thanks for pointing me in the right direction.

I tried your suggestion, but it didn’t resolve the specific issue I mentioned above. However, I discovered that one of the Edit Fields (Set) nodes in the parent workflow was missing the “Include Other Input Fields” toggle, so I enabled that setting.

Hi there!
I’ve checked and tried your workflow and your sub-workflow as well!

Well, you’re not doing anything wrong, but maybe you’re just typing something that you don’t want 😄

From my experience with n8n, I suggest removing the “data” key from the sub-workflow node to make it run.

If your goal is converting from JSON to binary, below is your flow with some tweaks from me.

This is your Main Workflow code:


This is your Sub-Workflow:

If it helps, please mark it as the answer, thanks!

Hello @cutecatcode !

Thanks for testing my workflow and sharing the screenshots! I appreciate the help.

Your suggestion about removing the “data” key shows you understand the issue, and you’re right that it works.

However, I need to send binary files together with other metadata (including the “data” key) to create an “any-file-converter” subworkflow that takes various file types and returns extracted content as JSON.

Do you know how to preserve both the binary data and additional metadata when passing through subworkflows?

I think you can add the additional data in the main workflow, and I suggest changing the input data mode on the Sub-Workflow Trigger node. ^^


Thanks for the quick replies! I figured out the issue - it’s all about binary data propagation through the node chain.

Solution:

  • ‘Split Out’ node: enable ‘Include Binary’ (under Options)
  • All ‘Edit Fields (Set)’ nodes: enable ‘Include Other Input Fields’

Alternative - Direct binary reference without propagation:

  • Put a Code node directly in front of where you need the binary file:
// JavaScript – pull the binary straight from an upstream node instead of relying on propagation
return {
  json: $('Edit Fields1').item.json,      // JSON from the upstream node
  binary: $('Edit Fields1').item.binary   // binary referenced directly from that node
};

Additional node I required for this particular workflow:

Binary file renaming:

// JavaScript – fan out each binary property into its own item under the default "data" key
return $input.all().flatMap(item =>
    item.binary ? Object.keys(item.binary).map(key => ({
        json: { fileName: item.binary[key].fileName },   // expose the original file name in the JSON
        binary: { data: item.binary[key] }               // re-attach the file under the "data" property
    })) : []                                             // items without binary data are dropped
);

The key insight: binary data doesn’t automatically pass through nodes - you need to explicitly preserve it at each step.
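For example, even a trivial Code node transformation has to hand the binary forward itself, or everything downstream (like the Extract node) has nothing to work with (a sketch, assuming “Run Once for All Items” mode):

// JavaScript – transform the JSON but explicitly keep the binary on every item
return $input.all().map(item => ({
  json: { ...item.json, processed: true },   // whatever transformation you need
  binary: item.binary                        // omit this line and the file is silently dropped
}));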


This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.