CSV Import from a Local Folder

I am trying to read a CSV file from a local folder.
A relative path is not working, and /docker/n8n-data/… is not working either.

The file is definitely NOT empty.

Can anybody explain the problem?


Information on your n8n setup

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hey @Lutz_Sch,
Off the top of my head:

:one: Relative Paths Don’t Work in Docker Containers

Inside Docker, paths are resolved against the container's file system, not the host's.
Example:
/docker/n8n-data/ might exist on your host, but the container will not see it unless it is mounted.


:two: Your Volume May Not Be Mounted Correctly

If n8n doesn’t have access to the folder, it won’t be able to read the file.

Run this command to check your mounted volumes:

docker inspect n8n | grep Mounts -A 10
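If the grep output is truncated or hard to read, a Go-template format string prints each mount on its own line (the container name `n8n` is an assumption; substitute yours):

```shell
# Print "source -> destination" for every mount of the container named "n8n"
docker inspect -f '{{ range .Mounts }}{{ println .Source "->" .Destination }}{{ end }}' n8n
```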

How to Fix It

:one: Mount Your Local Folder as a Docker Volume

If you’re running n8n via Docker, make sure the volume is properly mounted when starting your container:

docker run -it --rm \
  -v /absolute/path/to/your/folder:/data \
  -p 5678:5678 n8nio/n8n

Replace /absolute/path/to/your/folder with your actual CSV directory.
Your file should now be accessible inside the container at:

/data/yourfile.csv

:white_check_mark: Update your File Read node to use this path.
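If you start n8n with Docker Compose instead of `docker run`, the same mount can be declared in the compose file — a minimal sketch, with the host path as a placeholder you replace:

```yaml
# docker-compose.yml — minimal sketch, not a complete n8n configuration
services:
  n8n:
    image: n8nio/n8n
    ports:
      - "5678:5678"
    volumes:
      # host folder (placeholder) : container path used in the file-read node
      - /absolute/path/to/your/folder:/data
```

After `docker compose up -d`, the file should again be reachable at /data/yourfile.csv inside the container.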


:two: Use the /files Path in n8n Cloud

If using n8n Cloud or an external deployment, try the /files directory:

/files/yourfile.csv

This is the default directory where files are stored in n8n cloud instances.


:three: Try Using the “Move Binary Data” Node

  • If your CSV file isn’t reading properly, add a Move Binary Data node before processing.
  • Set the mode to Binary → JSON to ensure the CSV is parsed correctly.

Hey Lutz, it looks like you are using n8n on a Synology DiskStation.
Make sure you add a mount for your data folder to the respective Docker container like this:

Then use your local path in n8n, e.g. /shared_files/kundendaten_xyz

I have never gotten this to work. My workaround is to upload it to Google and then set a trigger to execute when the file is changed. Path statements never seem to work with this node, even when I am running n8n locally.

Where do I find the volume settings?


Could you share it?
I just need to read a CSV. Whether I open it locally or in Google doesn't matter.

If I try to open a CSV via Google Drive, I do not get the data.
I can download the file to a hard drive, but I have no access to the content of the CSV file.

Sorry, I left out a few details. After you upload the .csv file to Google Drive, open it and save it as a Google Sheet. Then you can access it with no problems using the Google Sheets node. There is an Excel node, which I have never used. There is also an Airtable node. I just discovered Airtable. I think it may be new. I love it and am now using it for all my table needs.

The CSV file is updated twice every day, so I do not want to open it twice a day and save it as a Google spreadsheet. …

The CSV is now in Google Drive. I have access, but it's impossible to get any data out of the CSV.

When I click on “Download”, I can download it to my computer, but that's not what I want.
And when I click on “View”, it says that this plugin is not supported.

Any ideas?

Are you running n8n in Docker on a Synology DSM?
If yes:

  1. stop the container
  2. edit the container
  3. go to the Volume Settings tab and add the required mounts
  4. restart your container

Note: make sure you have any other volumes for n8n mapped correctly as well; my screenshot is just an example of where you can find the settings and add your folder.

Just add an “Extract from File” node, select “Extract from CSV”, and you should be good to work with your data:

Nope. That does not work on my end.
Here is the result:

What does your CSV data look like? Maybe you need to adjust the node's options, e.g. by specifying the delimiter and/or encoding. The size of your data may also play into this, so you could limit the number of rows to be read.

To help you further, could you please copy your workflow here and give an example row of your CSV data (anonymized)?

Can you confirm that you run this on a Synology DiskStation, and if yes, what model is it? It could also be a resource-related issue.

I have made the changes; it's not working.

I just tested on a (very old) local RackStation, and it works fine reading a local CSV from a mounted folder:

You may need to adjust the permissions so that the Docker and system users can read from your shared folder /web/test/, like this, using Synology File Station. Restart your Docker container after changing the permissions.
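For reference, the same fix might look like this from an SSH session on the NAS — a sketch only: the share path /volume1/web/test, the container name n8n, and the use of plain Unix permissions instead of Synology ACLs are all assumptions here:

```shell
# Grant "others" read access plus directory traversal on the shared folder
# (o+rX: read for everyone; X adds execute only on directories)
sudo chmod -R o+rX /volume1/web/test
# Restart the container so n8n sees the folder with the new permissions
sudo docker restart n8n
```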


The delimiter of the CSV is “,”.
The structure looks like this:
Anrede,Vorname,Nachname,Firma,Straße und Nr.,Adresszusatz,PLZ,Ort,Telefon,Mobil,E-Mail-Adresse,Geburtsdatum,Webseite,Kunde seit,Letzter Auftrag (Datum),Kundennummer,Interner Schlüssel,Kundengruppe

The file size is about 400 KB, so nothing very big.
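As a quick sanity check outside n8n, that header can be split on commas with awk — a naive split that is fine here because no field contains a quoted comma:

```shell
# Header row copied from the post above; count the comma-separated fields
header='Anrede,Vorname,Nachname,Firma,Straße und Nr.,Adresszusatz,PLZ,Ort,Telefon,Mobil,E-Mail-Adresse,Geburtsdatum,Webseite,Kunde seit,Letzter Auftrag (Datum),Kundennummer,Interner Schlüssel,Kundengruppe'
printf '%s\n' "$header" | awk -F',' '{ print NF }'
# prints: 18
```

Piping a data row through the same awk call should also print 18; a different number usually points at a delimiter, quoting, or encoding problem.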

Hosted: Synology DS923+ / fully updated (latest n8n and latest DS from Synology)

:heart_eyes:

Permissions of the folder … THAT WAS IT. YOU MADE MY DAY.
Thanks so much for your help!


Glad you were able to finally solve the issue. Let me know or DM me if you need more help getting to know n8n :partying_face:

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.