Inside Docker, paths are relative to the container’s file system, not the host system.
Example:
/docker/n8n-data/ might exist on your host, but the container will not see it unless that path is mounted into the container.
Your Volume May Not Be Mounted Correctly
If n8n doesn’t have access to the folder, it won’t be able to read the file.
Run this command to check your mounted volumes:
docker inspect n8n | grep Mounts -A 10
How to Fix It
Mount Your Local Folder as a Docker Volume
If you’re running n8n via Docker, make sure the volume is properly mounted when starting your container:
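For example, a minimal `docker run` sketch (the paths are assumptions: replace `/web/test` and `/files` with your own host and container folders):

```shell
# Sketch, assuming the official n8n image; adjust name, port, and paths.
# The host folder /web/test is mounted read-only at /files inside the
# container, so the file node would read from /files/yourfile.csv.
docker run -it --rm \
  --name n8n \
  -p 5678:5678 \
  -v n8n_data:/home/node/.n8n \
  -v /web/test:/files:ro \
  docker.n8n.io/n8nio/n8n
```

Inside n8n you then always use the container-side path (`/files/...`), never the host path.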
Hey Lutz, it looks like you are using n8n on a Synology Diskstation.
Make sure you add a mount for your data folder to the respective Docker container, like this:
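If you start n8n via a compose file instead of the Container Manager UI, the same mount would look roughly like this (a sketch; `/web/test` and `/files` are assumed paths, adjust to your setup):

```yaml
# docker-compose.yml sketch for n8n on a Diskstation
services:
  n8n:
    image: docker.n8n.io/n8nio/n8n
    ports:
      - "5678:5678"
    volumes:
      - n8n_data:/home/node/.n8n   # n8n's own data
      - /web/test:/files:ro        # your CSV folder, visible as /files in n8n
volumes:
  n8n_data:
```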
I have never gotten this to work. My workaround is to upload the file to Google and then set a trigger to execute when the file is changed. Path statements never seem to work with this node, even when I am running n8n locally.
Could you share it?
I just need to read a CSV. Whether I open it locally or in Google doesn't matter.
If I try to open a CSV via Google Drive, I do not get the data.
I can download the file to a hard drive, but I have no access to the content of the CSV file.
Sorry, I left out a few details. After you upload the .csv file to Google Drive, open it and save it as a Google Sheet. Then you can access it with no problems using the Google Sheets node. There is an Excel node, which I have never used. There is also an Airtable node. I just discovered Airtable; I think it may be new. I love it and am now using it for all my table needs.
The CSV is now in a Google Drive. I have access, but it's impossible to get any data out of the CSV.
When I click on "Download" I can download it to my computer, but that's not what I want.
And when I click on "View" it says that this plugin is not supported.
Note: make sure you have any other volumes for n8n mapped correctly as well; my screenshot is just an example of where you can find the settings and add your folder.
What does your CSV data look like? Maybe you need to adjust the node's options, e.g. specifying the delimiter and/or encoding. The size of your data may also play into this, so you could limit the number of rows to be read.
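As a quick sanity check before pointing n8n at the real file, you can inspect the delimiter and column count from a shell. This sketch builds a hypothetical two-line sample at /tmp/sample.csv; replace that path with your actual file:

```shell
# Build a tiny two-line sample CSV, then inspect it the same way you
# would inspect your real file (replace /tmp/sample.csv with your path).
printf 'Anrede,Vorname,Nachname\nHerr,Max,Mustermann\n' > /tmp/sample.csv

head -n 2 /tmp/sample.csv                                # show header and first row
awk -F',' 'NR==1 {print NF " columns"}' /tmp/sample.csv  # count comma-separated fields
```

If the column count is 1, the delimiter is probably not what the node expects.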
To help you further, could you please copy your workflow here and give an example row of your CSV data (anonymized)?
Can you confirm that you run this on a Synology Diskstation, and if yes, which model is it? It could also be a resource-related issue.
You may need to adjust the permissions for the Docker and system user to read from your shared folder /web/test/, like this, using Synology File Station. Restart your Docker container after changing the permissions.
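Over SSH, the equivalent of that File Station change would look roughly like this (a sketch: `/web/test` is the shared folder from above, and UID/GID 1000 assumes the official n8n image, which runs as the `node` user):

```shell
# Sketch - run on the Diskstation over SSH; File Station achieves the same.
# UID/GID 1000 is the "node" user of the official n8n Docker image.
sudo chown -R 1000:1000 /web/test   # hand ownership to the container user
sudo chmod -R u+rwX /web/test       # grant read (and directory-traverse) rights
sudo docker restart n8n             # restart so n8n picks up the change
```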
The delimiter of the CSV is ",".
The structure looks like this:
Anrede,Vorname,Nachname,Firma,Straße und Nr.,Adresszusatz,PLZ,Ort,Telefon,Mobil,E-Mail-Adresse,Geburtsdatum,Webseite,Kunde seit,Letzter Auftrag (Datum),Kundennummer,Interner Schlüssel,Kundengruppe
The file size is about 400 KB, so nothing very big.
Hosted on a Synology DS923+, fully updated (latest n8n and latest DSM from Synology).