Problem with processing large amounts of data

Task: upload a product catalogue from a txt file into a Supabase database (tab-separated text, 4 fields per line: code, name, manufacturer, country). I split each line into its parts, put the product code into the metadata, and write the name and manufacturer into the content.
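Roughly, the parsing step looks like this (a minimal TypeScript sketch; the names `ParsedRow` and `parseLine` are just for illustration, and dropping the country field is an assumption):

```typescript
interface ParsedRow {
  content: string;                // name + manufacturer
  metadata: { code: string };     // product code kept in metadata
}

function parseLine(line: string): ParsedRow | null {
  // Four tab-separated fields per line: code, name, manufacturer, country
  const [code, name, manufacturer] = line.split('\t');
  if (!code || !name) return null;          // skip malformed or empty lines
  return {
    content: `${name} ${manufacturer ?? ''}`.trim(),
    metadata: { code },
  };
}

// Example:
// parseLine("A100\tWidget\tAcme\tDE")
// -> { content: "Widget Acme", metadata: { code: "A100" } }
```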

I ran tests with a small file and everything is OK.

I tried to upload the full file in one go: the workflow reports that it has completed, but no data is loaded into the database.

I split the source file into 752 parts of 1,000 lines each. When uploading, everything stops at file 14; roughly 13,500 lines end up in the database. No errors are shown in the workflow.

The source file contains ~752,000 lines in total (57 MB).

Where should I look for the cause of the upload problem?
How would you recommend uploading data of this size?

I’ve had a similar problem, but in my case n8n reported that it had run out of memory.
Do you see any error messages in the Supabase logs?

A workaround I thought of but have yet to try:

  • split the large file into smaller, manageable files (the splitting step is sketched below),
  • write them to a temp folder in GDrive,
  • load those files into Supabase,
  • remove them afterwards
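A rough sketch of the splitting step only, as a standalone Node.js/TypeScript script (the chunk file names and the 1,000-line chunk size are assumptions; writing to GDrive and loading into Supabase would be separate steps):

```typescript
import { createReadStream, createWriteStream } from 'node:fs';
import { createInterface } from 'node:readline';

async function splitFile(src: string, linesPerChunk = 1000): Promise<void> {
  // Stream the source file line by line so the whole 57 MB never sits in memory
  const rl = createInterface({ input: createReadStream(src), crlfDelay: Infinity });

  let chunkIndex = 0;
  let lineCount = 0;
  let out = createWriteStream(`chunk-${chunkIndex}.txt`);

  for await (const line of rl) {
    if (lineCount > 0 && lineCount % linesPerChunk === 0) {
      // Close the current chunk and start the next one
      out.end();
      chunkIndex += 1;
      out = createWriteStream(`chunk-${chunkIndex}.txt`);
    }
    out.write(line + '\n');
    lineCount += 1;
  }
  out.end();
}

splitFile('products.txt').catch(console.error);
```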

I have already split the source file into 752 parts of 1,000 lines each… I also tried uploading one file at a time (one per minute).

I built a second workflow that reads the GDrive directory and uploads only one file at a time (limited in the node).

After uploading to Supabase, the workflow deletes that file from GDrive.

With the small files I used for testing (10 lines), everything is OK.

With files of 1,000 lines there is a problem… All stages run: the data is uploaded to the database and the file is deleted from GDrive afterwards, but the workflow does not finish; it keeps hanging in the active state.

In the first workflow (with the loop) it feels like memory is overflowing. In the second variant (with deletion) it seems that GDrive never reports that the file deletion has finished: after the delete step, as a check, I send myself a completion message via a bot, and for the 1,000-line files (~100 KB) those confirmations never arrive.

By the way, in the GCloud console I saw errors appear while using the second workflow (the one that reads one file at a time and deletes the processed file).

I haven’t yet figured out what to do about this or how to solve the problem.

Hello, I need your help. I have built all my workflows, but I can only run one at a time. I searched and it says there is a limit of one active workflow at a time in a personal space, and that I need to change to a team space in order to run them all together. How do I do that? I have searched all the settings options but couldn’t find it.

Don’t forget that the data stays in memory until the workflow has completed, so splitting it into batches does not help unless you put the batch processing inside a sub-workflow.
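If the batching cannot easily go into a sub-workflow, one alternative that avoids accumulating everything in n8n’s memory is to do the load outside n8n with a small streaming script. A minimal sketch with supabase-js, assuming a target table named `products` with `content` and `metadata` columns (table and column names are assumptions):

```typescript
import { createClient } from '@supabase/supabase-js';
import { createReadStream } from 'node:fs';
import { createInterface } from 'node:readline';

const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_SERVICE_KEY!);

async function loadFile(path: string, batchSize = 500): Promise<void> {
  // Stream the file line by line and insert in small batches
  const rl = createInterface({ input: createReadStream(path), crlfDelay: Infinity });
  let batch: Array<{ content: string; metadata: { code: string } }> = [];

  for await (const line of rl) {
    const [code, name, manufacturer] = line.split('\t');
    if (!code) continue;                              // skip empty/malformed lines
    batch.push({ content: `${name} ${manufacturer ?? ''}`.trim(), metadata: { code } });

    if (batch.length >= batchSize) {
      const { error } = await supabase.from('products').insert(batch);
      if (error) throw error;                         // fail loudly instead of stopping silently
      batch = [];                                     // only one batch is ever held in memory
    }
  }
  if (batch.length > 0) {
    const { error } = await supabase.from('products').insert(batch);
    if (error) throw error;
  }
}

loadFile('products.txt').catch(console.error);
```

Only one batch is ever kept in memory, and a failed insert raises an error instead of the load just quietly stopping partway through.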