Import CSV/SKV Files to MySQL Table

Hello,

I have several measuring stations, each of which has an FTP server on which the measurement data is written to a CSV or SKV file in the following format:
20.01.22;14:44:05;288;12;283

The newer measuring stations include column labels (e.g. Date;Time;Temperature…) in the first line of the CSV file. The older measuring stations log without column labels into an SKV file.
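For reference, a sample line in that format can be parsed as semicolon-separated values. This is just an illustrative Python sketch, not the n8n node itself; the post only names the Date, Time and Temperature columns, so the remaining column names here are made up:

```python
import csv
import io

sample = "20.01.22;14:44:05;288;12;283"

# Only Date, Time and Temperature are named in the post;
# "Value4" and "Value5" are placeholder names for illustration.
columns = ["Date", "Time", "Temperature", "Value4", "Value5"]

reader = csv.reader(io.StringIO(sample), delimiter=";")
row = dict(zip(columns, next(reader)))
print(row)  # {'Date': '20.01.22', 'Time': '14:44:05', 'Temperature': '288', ...}
```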

The n8n instructions use a “Convert to Spreadsheet” module for the transformation before the MySQL import, but I cannot find it.
With the “Extract from File” module (using the RAW Data and Read as String options) nothing happens.

What would be the right way here?

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hi @duck, welcome to the community!

The n8n instructions use a “Convert to Spreadsheet” module for the transformation before the MySQL import, but I cannot find it.

Unfortunately, you didn’t share the version of n8n you’re currently using, but the old Spreadsheet File node has indeed been superseded by an Extract From File node, so you’re on the right track. The reason nothing happens on this node might be that, with the default settings, the node expects a header row.

So what you want to do here is add and disable the “Header Row” option like so:

Example workflow:

Hope this helps! Let me know if you have any further questions on this.

That was the problem, now it works. Thank you very much!

I just have one last question on a slightly different topic. The flow should download a CSV and import the data into a MySQL database. However, the CSV remains on the FTP server and only new data is appended to it.
The workflow should import the newly logged data into the database every day.

At the moment, however, it downloads the CSV every day and imports all of the data again. Is it possible to check for new data and only import the new data?

I also have this question

This is possible, but somewhat tricky. The easiest approach will depend on your exact data structure. Does your CSV data include a timestamp? If so, you could simply add a Filter node after reading your CSV file and then only keep records from the past 24 hours (or whatever interval your workflow runs in).
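That Filter logic could look roughly like the Python sketch below, assuming the date and time columns follow the sample line from the original post (DD.MM.YY and HH:MM:SS). The column names and the fixed reference time are assumptions for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical records as read from the CSV (column names assumed).
rows = [
    {"Date": "20.01.22", "Time": "14:44:05", "Temperature": "288"},
    {"Date": "21.01.22", "Time": "09:10:00", "Temperature": "290"},
]

def newer_than(row, cutoff):
    # Date format DD.MM.YY, time format HH:MM:SS, as in the sample line.
    ts = datetime.strptime(row["Date"] + " " + row["Time"], "%d.%m.%y %H:%M:%S")
    return ts > cutoff

# Keep only records from the past 24 hours. A fixed "now" is used here
# so the example is deterministic; in a real run you would use datetime.now().
now = datetime(2022, 1, 21, 20, 0, 0)
recent = [r for r in rows if newer_than(r, now - timedelta(hours=24))]
print([r["Date"] for r in recent])  # ['21.01.22']
```

Note that this approach only deduplicates reliably if the workflow interval and the filter window match; records logged exactly at the boundary could be missed or imported twice, so a unique key on the timestamp column in MySQL is a useful safety net.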