Download many files from a single URL and upload to S3

I have a Dropbox URL (Dropbox - Steve Silver 2023 MSS Images - Simplify your life) which contains many files. My requirement is to download all the files from that URL and upload them to an S3 bucket.

My challenge is that I’m able to upload a single image but not the entire set of images.

Please share your workflow

Appreciate any quick help in resolving this issue

Share the output returned by the last node

Information on your n8n setup

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

You can use the Dropbox API to get the list of the files and upload them to an S3 bucket.

The challenge is that the Dropbox account is set up with no authentication, and n8n doesn’t let me proceed further without Dropbox credentials. Also, I can’t ask the vendor who set up the Dropbox folder to enable authentication. Please provide a solution using an HTTP Request URL to download the many files and upload them to an S3 folder.

If you really want to download the files and then upload them, my suggestion is as follows (assuming you have your tokens etc. ready):

  1. Make a GET request to the Dropbox API’s list_folder endpoint to retrieve the file metadata for the folder.
  2. Parse the response from the list_folder API call to extract the necessary information, such as the file names and paths.
  3. Call the Dropbox content endpoint (https://content.dropboxapi.com/2/files/download) for each file, passing the file path as JSON in the Dropbox-API-Arg header.
  4. Ensure your S3 bucket is ready.
  5. Upload each downloaded file to the S3 bucket.

So there are two parts of code: one to download the files from Dropbox, and a second to upload them to S3.

const axios = require('axios');
const fs = require('fs');
const { pipeline } = require('stream/promises');

const folderPath = '/path/to/folder'; // Replace with the path of your Dropbox folder
const accessToken = 'YOUR_ACCESS_TOKEN'; // Replace with your Dropbox API access token

const listFolderUrl = 'https://api.dropboxapi.com/2/files/list_folder';
const downloadUrl = 'https://content.dropboxapi.com/2/files/download';

async function downloadFiles() {
  try {
    // Step 1: Retrieve file metadata for the folder.
    // For large folders, check response.data.has_more and keep calling
    // /2/files/list_folder/continue with the returned cursor.
    const response = await axios.post(listFolderUrl, {
      path: folderPath,
    }, {
      headers: {
        Authorization: `Bearer ${accessToken}`,
        'Content-Type': 'application/json',
      },
    });

    // Step 2: Extract file entries from the response (skip sub-folders)
    const files = response.data.entries.filter((entry) => entry['.tag'] === 'file');

    // Step 3: Download each file. The content endpoint takes its arguments
    // as JSON in the Dropbox-API-Arg header rather than in the query string.
    for (const file of files) {
      const fileResponse = await axios.post(downloadUrl, null, {
        headers: {
          Authorization: `Bearer ${accessToken}`,
          'Dropbox-API-Arg': JSON.stringify({ path: file.path_lower }),
        },
        responseType: 'stream',
      });

      // Save the file content to your desired location and wait for the write to finish
      await pipeline(fileResponse.data, fs.createWriteStream(`/path/to/save/${file.name}`));
    }

    console.log('Files downloaded successfully.');
  } catch (error) {
    console.error('Error occurred:', error);
  }
}

downloadFiles();
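This first script just pulls everything down to local disk (the /path/to/save folder in the example); the second script below then reads those files back and pushes them to S3.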

Upload to S3 (JavaScript):

const AWS = require('aws-sdk');
const fs = require('fs');

// Configure the AWS SDK with your credentials
AWS.config.update({
  accessKeyId: 'YOUR_AWS_ACCESS_KEY_ID',
  secretAccessKey: 'YOUR_AWS_SECRET_ACCESS_KEY',
});

const s3 = new AWS.S3();

async function uploadToS3(localFilePath, bucketName, s3Key) {
  try {
    const fileContent = fs.readFileSync(localFilePath);

    const params = {
      Bucket: bucketName,
      Key: s3Key,
      Body: fileContent,
    };

    await s3.putObject(params).promise();

    console.log(`File ${localFilePath} uploaded to S3 bucket with key: ${s3Key}`);
  } catch (error) {
    console.error('Error occurred during S3 upload:', error);
  }
}

// Provide the local file path, S3 bucket name, and desired S3 key as arguments
const localFilePath = '/path/to/local/file';
const bucketName = 'YOUR_S3_BUCKET_NAME';
const s3Key = 'path/in/s3/bucket/filename.ext';

uploadToS3(localFilePath, bucketName, s3Key);
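
To get all of the images across (the original problem) rather than the single file above, a minimal sketch could loop over whatever the download script saved locally and reuse uploadToS3 for each file. Here downloadDir and keyPrefix are assumptions you would replace with your own values:

const path = require('path');

// Assumed values: the directory the download script wrote to, and a key prefix inside the bucket
const downloadDir = '/path/to/save';
const keyPrefix = 'dropbox-import/';

async function uploadAllFiles() {
  // List every file the download step saved locally
  const fileNames = fs.readdirSync(downloadDir);

  for (const name of fileNames) {
    // Reuse uploadToS3 from above, keeping the original file name as part of the S3 key
    await uploadToS3(path.join(downloadDir, name), bucketName, `${keyPrefix}${name}`);
  }

  console.log(`Uploaded ${fileNames.length} files to S3.`);
}

uploadAllFiles();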


Hello, has anyone been able to use this function to upload? I tested it here and n8n just goes straight through it; it doesn’t throw an error or anything, it just ignores it.

Thanks for the update @StefanOliveira. To summarize, you are saying that you are able to download the files even though Dropbox doesn’t have authentication enabled, right?
