How to Upload a Large Dataset to a Database in n8n

In n8n, I have a webhook that receives an Excel file with around 100,000 rows, and I want to insert the data into a PostgreSQL table. Inserting the rows one at a time or in small batches is very slow.

I know that using Python with the psycopg2 library can insert data much faster. However, the n8n Function node does not support Python or the psycopg2 library.

I installed Python inside the n8n Docker container and I am trying to run a Python script using the Execute Command node. My questions are:

  1. How can my Python script access the Excel file that was uploaded through the webhook?

  2. How can I run this Python script in the fastest way possible from n8n?

Hey @SBGCO welcome to the community :tada:

Use the “Execute Command” node to call your Python script. In that script, use the copy_from method from the psycopg2 library.

Quick guide:

  1. Have the webhook workflow write the uploaded Excel file to a temporary location on the n8n host (inside the container), e.g. /tmp/.
  2. Point your Python script at that path.
  3. Use copy_from so PostgreSQL loads all 100,000 rows in a single COPY operation instead of one INSERT per row.
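The steps above could look roughly like this as a script. It's only a sketch: the file path `/tmp/upload.xlsx`, the table name `staging`, and the connection DSN are all placeholders you'd replace with your own values, and it assumes the Excel data maps one-to-one onto the table's columns.

```python
import csv
import io


def rows_to_csv_buffer(rows):
    """Serialize an iterable of row tuples into an in-memory,
    tab-separated buffer that copy_from can stream to PostgreSQL."""
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="\t")
    for row in rows:
        writer.writerow(row)
    buf.seek(0)
    return buf


def load_excel(path):
    """Yield rows from the first sheet of an .xlsx file.
    read_only mode keeps memory usage low for ~100k rows."""
    from openpyxl import load_workbook  # needs: pip install openpyxl
    wb = load_workbook(path, read_only=True)
    ws = wb.active
    return ws.iter_rows(values_only=True)


def bulk_insert(conn, table, rows):
    """Load all rows into `table` with a single COPY statement."""
    buf = rows_to_csv_buffer(rows)
    with conn.cursor() as cur:
        cur.copy_from(buf, table, sep="\t")
    conn.commit()


# Usage from the Execute Command node (DSN and names are hypothetical):
#   import psycopg2
#   conn = psycopg2.connect("dbname=mydb user=me password=secret")
#   bulk_insert(conn, "staging", load_excel("/tmp/upload.xlsx"))
```

The key point is that COPY moves the whole buffer to the server in one round trip, which is why it is so much faster than issuing 100,000 INSERT statements.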

If this helps, kindly mark it as the solution to help others.

Thanks,
It really works.