Doubt about the Workflow Structure

Hi, guys! I’m in doubt about the structure of the following workflow:

It’s showing an error; I don’t know whether it’s caused by the structure or by something in one of the nodes.

What is the error?

Hello, @RicardoE105!

The following error appears after executing the “switch” node:

It might be worth manually running the switch, disabling some of the functions, and then working through them to see if any throw an error.

Hey, @jon! I ran some tests. If I run the functions individually, it works. If I disable all the functions except one, it doesn’t.
My workflow link is below. I had difficulty pasting the code here on the forum.

Just fyi, will not make your workflow work but generally important:

You cannot use the Merge node like this. Only one connection per input is possible. If you have multiple nodes connected to the same input, it will only append the data from one of those nodes and will lose the data of the others. You would have to chain multiple Merge nodes to make it work.
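To illustrate the point, here is a minimal sketch in plain JavaScript (not actual n8n code): a Merge node in append mode has exactly two inputs, so combining three sources means chaining two merges, `merge(merge(a, b), c)`, rather than wiring all three sources into the same input. The `mergeAppend` helper and the sample data are invented for illustration.

```javascript
// Illustrative sketch only: each "merge" takes exactly two inputs,
// mirroring the Merge node's one-connection-per-input rule.
function mergeAppend(input1, input2) {
  return [...input1, ...input2];
}

// Three hypothetical upstream nodes, each producing one item.
const a = [{ source: 'A' }];
const b = [{ source: 'B' }];
const c = [{ source: 'C' }];

// Chaining two merges keeps all items; wiring a, b, and c into a
// single input would keep only one source's data.
const all = mergeAppend(mergeAppend(a, b), c);
console.log(all.length); // 3
```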

All right, @jan! I will fix this here! Thank you very much!

I made a video showing the behavior described (I made some changes to the workflow). At first, if I trigger the whole workflow, an error is shown. However, if I run the nodes individually, everything works. How do I get around this?

Can anyone help me with this?

Hey @Guilherme_Hirsch,

Is it possible to share the workflow? You can DM me the workflow.


Hey @Guilherme_Hirsch,

I tried the workflow you shared with me and it works as expected. I don’t get any errors. I just made a small change. Instead of reading the files from Dropbox, I am reading them from my machine, since I don’t have the files on Dropbox.

How many files are you working with? Do you want to try with just a few files?

I tested it here with fewer lines and it worked. However, with more lines an error appears. I also noticed that the following message appears on the DigitalOcean console:

This explains why there was an error. One quick solution I can think of is using the SplitInBatches node and processing only a few files at a time.
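The idea behind SplitInBatches can be sketched in plain JavaScript (this is not n8n code; the function name, batch size, and sample file names are made up for illustration): instead of holding every file in flight at once, the items are cut into fixed-size slices and handled one slice per loop iteration.

```javascript
// Illustrative sketch: yield fixed-size slices of the input, so only
// one batch's worth of items is being processed at any given time.
function* splitInBatches(items, batchSize) {
  for (let i = 0; i < items.length; i += batchSize) {
    yield items.slice(i, i + batchSize);
  }
}

// Example: ten hypothetical files, processed three at a time.
const files = Array.from({ length: 10 }, (_, i) => `file-${i}.csv`);
const batches = [...splitInBatches(files, 3)];
console.log(batches.length); // 4 batches: 3 + 3 + 3 + 1
```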

As the memory does not get cleared in between batches, the SplitInBatches node alone would not help and it would still crash. So if there are a lot of items to process, in addition to the SplitInBatches node the data would also have to be processed in a sub-workflow. That would reduce the memory footprint.


I have installed n8n locally now. That solves the problem for the time being. Thanks for the optimization tips, @harshil1712 and @jan! I will consider those too.
