Add CSV Export Option for Database/Data Table Results

Current Situation

Currently, when working with database nodes (PostgreSQL, Supabase, etc.) or data tables in n8n, there’s no native, straightforward way to export the results directly to CSV format. Users need to build additional nodes and workflow steps just to transform and save data as CSV files.

Problem

This creates friction in common workflows where users need to:

  • Export database query results for reporting

  • Generate CSV files for data analysis

  • Create data backups in CSV format

  • Share data with non-technical stakeholders

Proposed Solution

Add a built-in “Export to CSV” option in data table views and database nodes with the following features:

Option 1: In the Data Table UI

  • Add an “Export to CSV” button in the data table view (similar to the current Copy/JSON options)

  • Allow users to export visible results directly from the UI

Option 2: As a Node Output Option

  • Add a toggle/option in database nodes: “Output as CSV”

  • When enabled, the node would output the data in CSV format ready to be saved

Option 3: New “Export Data” Node

  • Create a dedicated node for data export operations

  • Support multiple formats: CSV, Excel, JSON, XML

  • Include configuration options for:

    • Delimiter selection (comma, semicolon, tab)

    • Header row inclusion

    • Encoding (UTF-8, ISO-8859-1, etc.)

    • Date format customization
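As a rough sketch of how those options could fit together, the serializer below honors delimiter, header, and date-format settings. The option names are illustrative assumptions, not parameters from any actual n8n node:

```javascript
// Hypothetical sketch of a CSV serializer honoring the options above.
// Option names (delimiter, includeHeader, dateFormat) are illustrative,
// not taken from n8n's real node API.
function toCsv(rows, { delimiter = ',', includeHeader = true, dateFormat = (d) => d.toISOString() } = {}) {
  if (rows.length === 0) return '';
  const escapeField = (value) => {
    if (value instanceof Date) value = dateFormat(value);
    const s = value === null || value === undefined ? '' : String(value);
    // RFC 4180: quote fields containing the delimiter, quotes, or newlines.
    return s.includes(delimiter) || /["\r\n]/.test(s)
      ? '"' + s.replace(/"/g, '""') + '"'
      : s;
  };
  const headers = Object.keys(rows[0]);
  const lines = rows.map((row) => headers.map((h) => escapeField(row[h])).join(delimiter));
  if (includeHeader) lines.unshift(headers.join(delimiter));
  return lines.join('\n');
}
```

The encoding option would apply when the string is written out, e.g. Buffer.from(csv, 'latin1') for ISO-8859-1.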

Use Cases

  1. Automated Reports: Generate daily/weekly CSV reports from database queries

  2. Data Integration: Export data to be consumed by legacy systems that only accept CSV

  3. Backup & Archive: Create periodic CSV backups of critical data

  4. Compliance: Generate audit trail files in standardized formats

  5. Quick Analysis: Export data for quick analysis in spreadsheet applications

Benefits

  • Reduced Workflow Complexity: Eliminate the need for multiple transformation nodes

  • Better User Experience: Intuitive export functionality

  • Time Savings: Speed up common data export tasks

  • Standardization: Consistent CSV export across different data sources

Technical Considerations

  • Support for large datasets (streaming for big exports)

  • Configurable CSV options (delimiter, quote character, escape character)

  • Proper handling of special characters and encoding

  • Memory-efficient implementation for large exports
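The streaming point can be illustrated with a generator that yields one CSV line at a time, so the consumer (a file write stream, an HTTP response) never holds the full export in memory. This is a sketch of the approach, not n8n code; field quoting is omitted for brevity:

```javascript
// Hypothetical sketch: emit CSV line by line instead of building one big
// string, keeping memory usage flat regardless of row count.
// Field quoting/escaping is omitted here for brevity.
function* csvLines(rows, delimiter = ',') {
  let headers = null;
  for (const row of rows) {
    if (headers === null) {
      headers = Object.keys(row);
      yield headers.join(delimiter) + '\n';
    }
    yield headers.map((h) => String(row[h] ?? '')).join(delimiter) + '\n';
  }
}

// A consumer could pipe this into a file without materializing the export:
//   const { Readable } = require('node:stream');
//   Readable.from(csvLines(rowCursor)).pipe(fs.createWriteStream('export.csv'));
```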

Examples

Current Workaround:

DB Query → Code Node (transform) → Write Binary File → Save to Storage
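The "Code Node (transform)" step in that workaround is typically a handful of lines that turn the query rows into a CSV string and attach it as base64-encoded binary data. A simplified, self-contained approximation (in a real n8n Code node the rows come from $input.all() and the binary property shape follows n8n's conventions; this sketch shows only the transformation itself):

```javascript
// Simplified approximation of the transform step users write today.
// The returned shape loosely mirrors an n8n item with a binary property;
// details of the real Code-node wiring differ.
function rowsToCsvBinary(rows, fileName = 'export.csv') {
  const headers = Object.keys(rows[0] ?? {});
  const csv = [
    headers.join(','),
    ...rows.map((row) => headers.map((h) => String(row[h] ?? '')).join(',')),
  ].join('\n');
  return {
    json: { rowCount: rows.length },
    binary: {
      data: {
        data: Buffer.from(csv, 'utf8').toString('base64'),
        mimeType: 'text/csv',
        fileName,
      },
    },
  };
}
```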

Proposed Flow:

DB Query → Export CSV Node → Save to Storage

Or simply:

DB Query (with "Export as CSV" option enabled) → Save to Storage

Related Features

  • Could be extended to support other export formats (Excel, Parquet, JSON Lines)

  • Could integrate with existing file storage nodes (Google Drive, Dropbox, S3)

Screenshots

[Include the screenshot showing the Supabase data table that needs export functionality]


Additional Context

This feature would be particularly valuable for:

  • Legal tech professionals working with case management data

  • Data analysts creating regular reports

  • Teams migrating between systems

  • Compliance and audit requirements


Hi @Marcos_Antonio, thanks for the detailed feature request.

Option 1 is the one on our radar right now, and we have a first working prototype internally.

I expect it to be released within a few weeks at most :+1:

@Konsti But please don’t forget to also provide an import feature, so that an export can serve as a backup that can be imported again.


Yep, that’s coming.

Update: with n8n version 1.122.2 you can export and import data tables as CSV.



Thanks a lot for this feature. However, when I try to import, it tells me that id, createdAt, and updatedAt are reserved columns that cannot be created, and the import fails.

@Wall-E This is a known limitation currently. We have a feature on our roadmap to make this easier. Until then, you can use this workaround:

  • Rename these columns when you import the CSV (e.g. id_temp, createdAt_temp, and updatedAt_temp)
  • Delete these columns after the data table has been created.
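The rename step above can also be done with a quick one-off script over the CSV header row before importing. This is a hypothetical helper, not an n8n feature, and it assumes the header contains no quoted, comma-containing column names:

```javascript
// Hypothetical helper: suffix reserved column names in a CSV header so the
// data-table import does not reject them. Assumes plain, unquoted headers.
const RESERVED = ['id', 'createdAt', 'updatedAt'];

function renameReservedColumns(csvText) {
  const newlineIndex = csvText.indexOf('\n');
  const header = newlineIndex === -1 ? csvText : csvText.slice(0, newlineIndex);
  const rest = newlineIndex === -1 ? '' : csvText.slice(newlineIndex);
  const renamed = header
    .split(',')
    .map((col) => (RESERVED.includes(col) ? `${col}_temp` : col))
    .join(',');
  return renamed + rest;
}
```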

We know this is not an ideal experience, and we will make it better soon.
Hope this helps.

Thanks for looking into this. I also tried the workaround you mentioned, but still wanted to at least report this finding.
