CSV Import to MariaDB

Hi,

Long time viewer, first time poster.
Hoping the n8n community gurus could help me with an issue updating rows in a MariaDB database.

Information on the CSV File

  1. I have a CSV File that gets updated daily with numerous lines.
  2. There are headers in the CSV file, namely stock_code,manufacturer_code,barcode,description,sell_unit,sell_price,moq,stockonhand,brand,category1,category2,category3
  3. Each row under the headers is a product, with its values in the corresponding columns.
  4. The database in question has a corresponding table whose column names match the CSV headers (a rough sketch of that table is below).
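To give a clearer picture, the target table looks roughly like this. The table name and column types are only placeholders; the point is that the column names mirror the CSV headers.

```sql
-- Rough sketch of the target table. The table name and the column types
-- are placeholders; only the column names mirror the CSV headers.
CREATE TABLE products (
    stock_code        VARCHAR(50),
    manufacturer_code VARCHAR(50),
    barcode           VARCHAR(50),
    description       VARCHAR(255),
    sell_unit         VARCHAR(20),
    sell_price        DECIMAL(10,2),
    moq               INT,
    stockonhand       INT,
    brand             VARCHAR(100),
    category1         VARCHAR(100),
    category2         VARCHAR(100),
    category3         VARCHAR(100)
);
```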

Background information on what I am wanting to achieve.

  1. Set up a file trigger to monitor a specific folder for file changes - Done & Working.
  2. Take that CSV file and update the corresponding table in the database with its contents, leaving the headers in place (an example of the kind of statement each row should produce is shown after this list). This is the stage where I am failing; I have tried everything I can think of to get it working, sadly with no success.
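To put it concretely, for each row of the CSV the end result should be equivalent to a statement like the one below. The table name and the values are made up purely to show the shape of it.

```sql
-- Illustrative only: the table name and values are invented, but each
-- CSV row should end up inserted into the table roughly like this.
INSERT INTO products
    (stock_code, manufacturer_code, barcode, description, sell_unit,
     sell_price, moq, stockonhand, brand, category1, category2, category3)
VALUES
    ('ABC123', 'MFG-001', '9300000000001', 'Example product', 'EA',
     19.95, 1, 42, 'ExampleBrand', 'Cat A', 'Cat B', 'Cat C');
```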

Link to the CSV File: CSV File

Information on your n8n setup

  • n8n Version: 0.195.5
  • Database: MariaDB
  • Running n8n via Docker

Thank you for your help.

What exactly is the issue? What does your current workflow look like, and do you have an example CSV?

Hi Pemontto,

Thanks for reaching out. I’ve managed to get the workflow to run; the issue now is that the database is not being updated with the information from the CSV file. What I mean is that the insert command in the SQL connector runs, but it inserts NULL values. After all my reading I have tried selecting the column to be updated using MySQL backticks, i.e. `Product Code`, but this has not worked.
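For clarity, what I mean by using backticks is quoting the identifier like this (the table name and value here are just examples):

```sql
-- Backtick-quoting an identifier that contains a space.
-- The table name and value are only examples.
INSERT INTO `products` (`Product Code`) VALUES ('ABC123');
```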

Here is a link to a demo csv file. https://www.dropbox.com/s/7y1dyg6ymcni1zb/CSV%20test%202.csv?dl=0

I’ve included an image of my workflow and I’ve had to blank out the names of the flows for privacy reasons.

Side note: the Truncate Table connector is there to empty the relevant table before the import, so that the uploaded information is correct and accurate (equivalent to the statement below).
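In other words, before the insert it runs something equivalent to this (the table name is a placeholder):

```sql
-- Empty the table before the fresh import so old rows do not linger.
-- The table name is a placeholder.
TRUNCATE TABLE products;
```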

Workflow

SQL Data:

Hey @wrwatk,

Welcome to the community :tada:

Can you provide the actual workflow you are using with the node names changed? It is hard to see what you are trying in the nodes from a screenshot. I suspect what is happening is the data coming out of the spreadsheet file node isn’t matching up with the fields in some way.

Hi @Jon

Here is the actual workflow as requested. For obvious reasons I’ve changed the file names, database names, etc., but the flow remains the same as the original. Hope that helps.

The CSV I am using can also be found here: Dropbox - CSV test 2.csv

Hey @wrwatk,

The first thing I have noticed is that the Insert Data node is not listing all the columns you want to insert. When they are added, though, it does look like there is an issue when inserting into a field with a space in the name.

I will do some testing on that shortly and get a bug ticket created. For now, using the Execute Query option should get you around this one (rough sketch below).
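As a sketch of what I mean, you could put something like the query below in the Execute Query field. The table and column names are just examples based on your CSV headers, the expressions pull values from the incoming item, and in practice you would still want to escape string values properly.

```sql
-- Rough sketch for the Execute Query field. Table/column names are
-- examples, and string values should be escaped properly in practice.
INSERT INTO `products` (`stock_code`, `description`, `sell_price`)
VALUES (
    '{{ $json["stock_code"] }}',
    '{{ $json["description"] }}',
    {{ $json["sell_price"] }}
);
```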

Hi @Jon

Thanks for the feedback. For now I’ve managed to get around it by using the SET connector, but I do agree that it appears to be some sort of bug. Now the next thing to figure out is updating rows in the DB; one idea is sketched below.
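One possible approach is MariaDB’s INSERT ... ON DUPLICATE KEY UPDATE via the Execute Query node. This is only a sketch and assumes stock_code has a UNIQUE or PRIMARY KEY constraint on the table:

```sql
-- Sketch of an upsert: assumes stock_code is a UNIQUE or PRIMARY KEY,
-- so matching rows get updated instead of duplicated.
INSERT INTO `products` (`stock_code`, `sell_price`, `stockonhand`)
VALUES ('ABC123', 19.95, 42)
ON DUPLICATE KEY UPDATE
    `sell_price`  = VALUES(`sell_price`),
    `stockonhand` = VALUES(`stockonhand`);
```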
