AWS S3 Node - "Get Many" not returning anything in subfolders

Describe the problem/error/question

When using the AWS S3 Node, “Get Many” only returns files in the root of the bucket, not in subfolders. Root files come back fine, but nothing else does.

What is the error message (if any)?

Empty output on the node (when there are no files in the root and files exist only in S3 subfolders).

Please share your workflow

Share the output returned by the last node

Information on your n8n setup

  • n8n version:
    [email protected] - cloud hosted

  • Database (default: SQLite):
    default

  • n8n EXECUTIONS_PROCESS setting (default: own, main):
    default

  • Running n8n via (Docker, npm, n8n cloud, desktop app):
    cloud

  • Operating system:
    n/a

For some reason, the S3 node shows as not connected in my post above. It is connected in my workflow when the error above occurs.

Does anyone have any idea how to get this working and pull the files from subfolders?

I’m also having the same issue with the S3 node…

I’m also experiencing the same issue, although I’m using MinIO S3.
(When enabling “Always Output Data”, the response is “No fields - item(s) exist, but they’re empty”.)
However, the Create File → Get Many request works.

Click Options, add “Folder Key”, and enter the subfolder name followed by a forward slash, e.g. meetings/

It only works with one nesting level, though.

Is there any fix available for that?

Hey All

I have been working on some flows with AWS and came across this issue.

I was able to get all files in a subfolder, as well as files in nested folders using the “Folder Key” under options in the S3 node.

I see this has been suggested previously; however, there was a comment that you could only search inside one nested folder.

It is important to leave a trailing “/” after the folder path; that allows you to access multiple nested folders. Without it, you will not get any results.
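One way to see why the trailing slash matters: S3 has no real folders. Object keys are flat strings, and the “Folder Key” is used as a literal key-prefix filter (this is a local sketch of the S3 ListObjectsV2 prefix semantics, not the n8n source; the bucket keys below are hypothetical examples).

```python
def list_objects(keys, prefix=""):
    """Mimic S3 ListObjectsV2 prefix filtering: a literal string-prefix match."""
    return [k for k in keys if k.startswith(prefix)]

# Hypothetical bucket contents (object keys are flat strings).
bucket = [
    "meetings/2024/notes.txt",
    "meetings/agenda.txt",
    "meetings-archive/old.txt",
    "readme.md",
]

# With the trailing slash, only keys under the meetings/ "folder" match,
# including keys in nested folders.
print(list_objects(bucket, "meetings/"))
# → ['meetings/2024/notes.txt', 'meetings/agenda.txt']

# Without it, the prefix is still a plain string match, so unrelated sibling
# keys like "meetings-archive/..." also match.
print(list_objects(bucket, "meetings"))
# → ['meetings/2024/notes.txt', 'meetings/agenda.txt', 'meetings-archive/old.txt']
```

So the slash is what scopes the prefix to exactly one “folder” subtree.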

Example below:

Please let me know if this solves your challenge?

That works just for the files, but not for the folders. In my case, I want to discover the sub-subfolders to perform some loops before I access the files.

Hey @jlv

Ahh okay, I get you.

I tested the same out on folders. Without any folder key, it returned all folders and subfolders in the bucket; I could then also access folders inside subfolders by adding the key like this:

I tested this with folders created directly in the AWS console and via an n8n flow.

It also seems that using the trailing slash will show you the folder that you are in plus all the other files/folders inside it. If you use a trailing slash and there are no files or folders, it will return no results.
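For the “discover sub-subfolders one level at a time” case, the underlying S3 API supports this via the ListObjectsV2 Delimiter parameter, which splits results into direct objects and “CommonPrefixes” (the folders). Here is a local simulation of that grouping, under the assumption that the Folder Key acts as the prefix; the keys are made-up examples:

```python
def list_level(keys, prefix="", delimiter="/"):
    """Mimic ListObjectsV2 with a Delimiter: return (direct objects, subfolder
    prefixes) for one level of the key hierarchy under `prefix`."""
    objects, common_prefixes = [], []
    for k in keys:
        if not k.startswith(prefix):
            continue
        rest = k[len(prefix):]
        if delimiter in rest:
            # Everything up to the first delimiter is a "subfolder" prefix.
            cp = prefix + rest.split(delimiter, 1)[0] + delimiter
            if cp not in common_prefixes:
                common_prefixes.append(cp)
        else:
            objects.append(k)
    return objects, common_prefixes

# Hypothetical bucket contents.
bucket = [
    "meetings/agenda.txt",
    "meetings/2024/notes.txt",
    "meetings/2024/q1/summary.txt",
]

objs, folders = list_level(bucket, "meetings/")
print(objs)     # → ['meetings/agenda.txt']
print(folders)  # → ['meetings/2024/']
```

Recursing into each returned prefix (e.g. feeding "meetings/2024/" back in) walks the whole folder tree, which is the pattern the loop in the workflow would follow.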

It may also be worth double checking your AWS IAM permissions:

Make sure your IAM permissions allow listing objects at all prefixes (i.e., s3:ListBucket, and s3:GetObject with the correct resource ARN including /*). If only the root folder is accessible, subfolders may not appear. It is probably also useful to have s3:GetBucketLocation configured if you have cross-region buckets.
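As a rough illustration of those permissions (the bucket name my-bucket is a placeholder; adapt the ARNs to your own bucket), a minimal IAM policy along these lines should cover listing and reading at every prefix:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": "arn:aws:s3:::my-bucket"
    },
    {
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-bucket/*"
    }
  ]
}
```

Note that s3:ListBucket applies to the bucket ARN itself, while s3:GetObject needs the /* object ARN; if a Condition restricts s3:ListBucket to certain prefixes, keys outside those prefixes will be invisible.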

Let me know if this works?

It’s not working, but I think that is more an issue with how the files are uploaded than how they are downloaded. I won’t be able to self-discover the subfolders, but will force the loops by knowing them in advance.