I’m looking to build (or find, if something like it already exists) an n8n workflow powered by an AI/LLM agent that performs the following:
- Step 1: Accepts file uploads in various formats (CSV, Excel, JSON, XML, etc.).
- Step 2: The agent dynamically analyzes the file structure, extracts the schema (column names, types, etc.), and organizes the data into rows and columns (a schema-inference sketch follows this list).
- Step 3: Checks whether a corresponding table already exists in the target database (see the SQL sketch after the list):
  - If yes, it appends the new data to the existing table.
  - If no, it creates a new table with the inferred schema and loads the data.
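
To make Step 2 concrete, here is a minimal schema-inference sketch in TypeScript, roughly the logic an n8n Code node (or a tool exposed to the agent) could run. It assumes an upstream node such as Extract From File has already parsed the upload into an array of plain objects; the SQL type names target Postgres, and the type-widening rules are assumptions you would tune for your database.

```typescript
type SqlType = 'BIGINT' | 'DOUBLE PRECISION' | 'BOOLEAN' | 'TIMESTAMP' | 'TEXT';

// Guess a SQL type for a single cell value.
function inferType(value: unknown): SqlType {
  if (typeof value === 'boolean') return 'BOOLEAN';
  if (typeof value === 'number') {
    return Number.isInteger(value) ? 'BIGINT' : 'DOUBLE PRECISION';
  }
  if (typeof value === 'string') {
    if (/^-?\d+$/.test(value)) return 'BIGINT';
    if (/^-?\d*\.\d+$/.test(value)) return 'DOUBLE PRECISION';
    // Require an ISO-like date shape before trusting Date.parse.
    if (/^\d{4}-\d{2}-\d{2}/.test(value) && !Number.isNaN(Date.parse(value))) {
      return 'TIMESTAMP';
    }
  }
  return 'TEXT';
}

const NUMERIC: SqlType[] = ['BIGINT', 'DOUBLE PRECISION'];

// Widen each column to the most general type seen across all rows.
function inferSchema(rows: Record<string, unknown>[]): Map<string, SqlType> {
  const schema = new Map<string, SqlType>();
  for (const row of rows) {
    for (const [col, value] of Object.entries(row)) {
      if (value === null || value === undefined || value === '') continue;
      const t = inferType(value);
      const prev = schema.get(col);
      if (prev === undefined || prev === t) {
        schema.set(col, t);
      } else if (NUMERIC.includes(prev) && NUMERIC.includes(t)) {
        schema.set(col, 'DOUBLE PRECISION'); // mixed ints and floats widen to float
      } else {
        schema.set(col, 'TEXT'); // conflicting types fall back to the safest common type
      }
    }
  }
  return schema;
}
```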
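
And for Step 3, a sketch of the create-or-append branch that generates SQL for a downstream Postgres node to execute. The `information_schema` lookup is the standard existence check in Postgres; the quoting helpers and the single-statement batch insert are simplifications (a production pipeline would use parameterized queries). `SqlType` and `inferSchema` refer to the previous sketch.

```typescript
// Quote an identifier that originates from an untrusted uploaded file.
const q = (ident: string) => `"${ident.replace(/"/g, '""')}"`;

// Query whose result drives the IF node: does the table already exist?
function existsQuery(table: string): string {
  return `SELECT EXISTS (
    SELECT 1 FROM information_schema.tables
    WHERE table_schema = 'public' AND table_name = '${table.replace(/'/g, "''")}'
  ) AS table_exists;`;
}

// "If no" branch: create the table from the inferred schema.
function createTableSql(table: string, schema: Map<string, SqlType>): string {
  const cols = Array.from(schema)
    .map(([name, type]) => `${q(name)} ${type}`)
    .join(', ');
  return `CREATE TABLE ${q(table)} (${cols});`;
}

// "If yes" branch (and the load after a create): append the parsed rows.
function insertSql(
  table: string,
  schema: Map<string, SqlType>,
  rows: Record<string, unknown>[],
): string {
  const cols = Array.from(schema.keys());
  const lit = (v: unknown) =>
    v === null || v === undefined ? 'NULL' : `'${String(v).replace(/'/g, "''")}'`;
  const values = rows
    .map((r) => `(${cols.map((c) => lit(r[c])).join(', ')})`)
    .join(',\n');
  return `INSERT INTO ${q(table)} (${cols.map(q).join(', ')}) VALUES\n${values};`;
}
```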
The goal is a flexible data-ingestion pipeline that reduces manual mapping and configuration, ideal for dynamic datasets coming from different sources and formats.