Upload all files from FTP to Nextcloud

Hello there,

I am new to n8n and hoping for some guidance. I am probably just not getting it.

I am trying to fetch a list of files from an FTP server and then upload them to a folder on Nextcloud. I have currently set up a simple workflow that just lists all the files in the FTP folder, which works fine. Now I want all of those files to be uploaded to Nextcloud.

In the Nextcloud upload settings I have to set a specific file name, but I do not want to preselect a single file; I just want it to upload all the files found on the FTP server. I also cannot find an expression that represents all the file paths from the previous FTP list. I thought I would find that under the previous node's output, but that is not the case.

{
  "nodes": [
    {
      "parameters": {
        "operation": "list",
        "path": "=/path/uploads/"
      },
      "name": "FTP",
      "type": "n8n-nodes-base.ftp",
      "typeVersion": 1,
      "position": [
        460,
        300
      ],
      "alwaysOutputData": false,
      "credentials": {
        "ftp": {
          "id": "2",
          "name": "XXXX"
        }
      }
    },
    {
      "parameters": {
        "path": "=",
        "binaryDataUpload": true
      },
      "name": "Nextcloud",
      "type": "n8n-nodes-base.nextCloud",
      "typeVersion": 1,
      "position": [
        700,
        300
      ],
      "credentials": {
        "nextCloudApi": {
          "id": "3",
          "name": "XXX"
        }
      }
    }
  ],
  "connections": {
    "FTP": {
      "main": [
        [
          {
            "node": "Nextcloud",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  }
}

I am using the latest Docker version of n8n running on Unraid with the default DB. Thanks!

Hi @bvelte, n8n keeps all data in memory during the execution of a workflow (for now; we actually rolled out some changes to this with 0.156.0, which aren't enabled by default yet). This means that if you're working with a large amount of data, you will eventually max out the available memory, causing your instance to crash.

That said, you technically can upload all files on an FTP path to Nextcloud. You’d just need to download them first from your FTP server rather than just listing them:

Example Workflow
{
  "nodes": [
    {
      "parameters": {},
      "name": "Start",
      "type": "n8n-nodes-base.start",
      "typeVersion": 1,
      "position": [
        240,
        300
      ]
    },
    {
      "parameters": {
        "operation": "list",
        "path": "/pub/example"
      },
      "name": "List",
      "type": "n8n-nodes-base.ftp",
      "typeVersion": 1,
      "position": [
        460,
        300
      ],
      "alwaysOutputData": false,
      "credentials": {
        "ftp": {
          "id": "37",
          "name": "test.rebex.net"
        }
      }
    },
    {
      "parameters": {
        "path": "={{$json[\"path\"]}}"
      },
      "name": "Download",
      "type": "n8n-nodes-base.ftp",
      "typeVersion": 1,
      "position": [
        680,
        300
      ],
      "credentials": {
        "ftp": {
          "id": "37",
          "name": "test.rebex.net"
        }
      }
    },
    {
      "parameters": {
        "path": "=/Community Testing/{{$binary.data.fileName}}",
        "binaryDataUpload": true
      },
      "name": "Nextcloud",
      "type": "n8n-nodes-base.nextCloud",
      "typeVersion": 1,
      "position": [
        900,
        300
      ],
      "credentials": {
        "nextCloudApi": {
          "id": "34",
          "name": "NextCloud account"
        }
      }
    }
  ],
  "connections": {
    "Start": {
      "main": [
        [
          {
            "node": "List",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "List": {
      "main": [
        [
          {
            "node": "Download",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Download": {
      "main": [
        [
          {
            "node": "Nextcloud",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  }
}

Hi there, thanks a lot for your quick reply and explanation. I copied your example and added my FTP credentials to the first node. In the copied Download node the path gets filled with /path/. Executing it gives me the error "Not a regular file".

Ah, that's most likely because your FTP server returns more than just files in its listing, whereas my test server only returned actual files when listing the example directory.

You’d need to filter these additional items out to avoid trying to download a directory, for example using the IF node.
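
For example, a minimal IF node sketch (an illustration, not taken from the workflow above) that keeps only entries whose name is neither . nor .. could look like this, assuming the FTP list items expose the entry name in a name field:

{
  "parameters": {
    "conditions": {
      "string": [
        {
          "value1": "={{$json[\"name\"]}}",
          "operation": "notEqual",
          "value2": "."
        },
        {
          "value1": "={{$json[\"name\"]}}",
          "operation": "notEqual",
          "value2": ".."
        }
      ]
    }
  },
  "name": "IF",
  "type": "n8n-nodes-base.if",
  "typeVersion": 1,
  "position": [
    680,
    300
  ]
}

Wire it between the List and Download nodes and connect only its true output to Download, so that directory entries never reach the FTP download.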

@MutedJam yeah, the file list shows me a file named . and one named .. for whatever reason. I managed to filter those out with an IF node, resulting in 2 valid files in the list. But what exactly do I have to put into the path of the FTP download to now download the 2 remaining files?


An expression like {{$json["path"]}} should do the trick, as in the example workflow I shared. It references the path of each item the FTP node receives.

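To illustrate with hypothetical sample data (the exact fields depend on your FTP server), if the List node outputs two items like the ones below, the Download node runs once per item and the expression resolves to each item's path in turn:

[
  {
    "name": "video1.mp4",
    "path": "/path/uploads/video1.mp4",
    "size": 1048576
  },
  {
    "name": "video2.mp4",
    "path": "/path/uploads/video2.mp4",
    "size": 2097152
  }
]

So {{$json["path"]}} evaluates to /path/uploads/video1.mp4 for the first item and to /path/uploads/video2.mp4 for the second, and both files get downloaded and passed on to Nextcloud.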

@MutedJam Thanks, that worked like a charm. I was confused because the expression preview only showed 1 filename, so I thought it would only handle 1 file. Still much to learn.

Anyways, I really appreciate your help!

@MutedJam Thinking about it some more, I have a more complex requirement as well; maybe you can add something here. I would like to only upload files from FTP to Nextcloud that are not already there. Can I somehow check/compare which files would be new?

Or the other way around: can I set up a trigger that checks for new files on the FTP server and then only passes those on to the Download/Nextcloud nodes?

So a trigger node for new FTP files doesn't exist in n8n yet, I'm afraid. You could, however, use the Cron or Interval node to regularly fetch the file list from both Nextcloud and your FTP server, and then use the Merge node to keep only the items that don't yet exist in Nextcloud, for example like so:

(Again, you would need to filter out any directories like . or .. as before, in case your FTP server returns them.)

Example Workflow
{
  "nodes": [
    {
      "parameters": {},
      "name": "Start",
      "type": "n8n-nodes-base.start",
      "typeVersion": 1,
      "position": [
        240,
        300
      ]
    },
    {
      "parameters": {
        "operation": "list",
        "path": "/pub/example"
      },
      "name": "List",
      "type": "n8n-nodes-base.ftp",
      "typeVersion": 1,
      "position": [
        460,
        200
      ],
      "alwaysOutputData": false,
      "credentials": {
        "ftp": {
          "id": "37",
          "name": "test.rebex.net"
        }
      }
    },
    {
      "parameters": {
        "path": "={{$json[\"path\"]}}"
      },
      "name": "Download",
      "type": "n8n-nodes-base.ftp",
      "typeVersion": 1,
      "position": [
        1120,
        400
      ],
      "credentials": {
        "ftp": {
          "id": "37",
          "name": "test.rebex.net"
        }
      }
    },
    {
      "parameters": {
        "path": "=/Community Testing/{{$binary.data.fileName}}",
        "binaryDataUpload": true
      },
      "name": "Nextcloud",
      "type": "n8n-nodes-base.nextCloud",
      "typeVersion": 1,
      "position": [
        1340,
        400
      ],
      "credentials": {
        "nextCloudApi": {
          "id": "34",
          "name": "NextCloud account"
        }
      }
    },
    {
      "parameters": {
        "resource": "folder",
        "operation": "list",
        "path": "/Community Testing/"
      },
      "name": "List Folder",
      "type": "n8n-nodes-base.nextCloud",
      "typeVersion": 1,
      "position": [
        460,
        400
      ],
      "credentials": {
        "nextCloudApi": {
          "id": "34",
          "name": "NextCloud account"
        }
      }
    },
    {
      "parameters": {
        "keepOnlySet": true,
        "values": {
          "string": [
            {
              "name": "filename",
              "value": "={{$json[\"path\"].match(/[^/]*$/)[0]}}"
            }
          ]
        },
        "options": {}
      },
      "name": "Get Filename From Path",
      "type": "n8n-nodes-base.set",
      "typeVersion": 1,
      "position": [
        680,
        400
      ]
    },
    {
      "parameters": {
        "mode": "removeKeyMatches",
        "propertyName1": "name",
        "propertyName2": "filename"
      },
      "name": "Merge",
      "type": "n8n-nodes-base.merge",
      "typeVersion": 1,
      "position": [
        900,
        400
      ]
    }
  ],
  "connections": {
    "Start": {
      "main": [
        [
          {
            "node": "List",
            "type": "main",
            "index": 0
          },
          {
            "node": "List Folder",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "List": {
      "main": [
        [
          {
            "node": "Merge",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Download": {
      "main": [
        [
          {
            "node": "Nextcloud",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "List Folder": {
      "main": [
        [
          {
            "node": "Get Filename From Path",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Get Filename From Path": {
      "main": [
        [
          {
            "node": "Merge",
            "type": "main",
            "index": 1
          }
        ]
      ]
    },
    "Merge": {
      "main": [
        [
          {
            "node": "Download",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  }
}
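
A quick note on the two helper nodes above, since they carry the comparison logic (the sample item below is hypothetical, just to show the shape of the data): the Get Filename From Path node reduces each Nextcloud item to its bare filename, and the Merge node in removeKeyMatches mode then drops every FTP item whose name matches one of those filenames, so only files that are not yet in Nextcloud reach the Download node. For a Nextcloud item like

{
  "path": "/Community Testing/readme.txt"
}

the expression {{$json["path"].match(/[^/]*$/)[0]}} takes everything after the last / and produces

{
  "filename": "readme.txt"
}

so an FTP item whose name field is readme.txt would be removed before the download step.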

Thanks a lot, I will give it a try!

Where exactly does the FTP Download node save the files? Is there a way to delete them from the server where the n8n Docker container is hosted?

Hey @bvelte,

The files sit in memory until the workflow has ended. They won't be written to the local disk unless you use the "Write Binary File" node.
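
For reference, if you ever do want to persist the downloaded files to disk, a minimal sketch of such a node could look like the snippet below (the /data target directory is just an assumption; use whatever path is writable inside your n8n container):

{
  "parameters": {
    "fileName": "=/data/{{$binary.data.fileName}}"
  },
  "name": "Write Binary File",
  "type": "n8n-nodes-base.writeBinaryFile",
  "typeVersion": 1,
  "position": [
    900,
    300
  ]
}

Files written this way stay on disk after the workflow ends and would need to be cleaned up separately.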

Got it, thanks!


I implemented that workflow and now I am stuck with the following error in the last Nextcloud upload node:

ERR_FR_MAX_BODY_LENGTH_EXCEEDED

I already increased the RAM for Node.js (even though I do not know why it consumes this much). In the current workflow it finds an 18 MB video file and then fails at the Nextcloud upload with the given MAX_BODY_LENGTH error.