My workflow currently inserts about 8,000 rows using the “Loop Over Items” node. This takes approximately 50 minutes, which is unfeasible for my situation. I guess I could reduce this time by inserting the entire file using the bulk-insert function SQL provides for tables. Can I do this in the workflow?
It looks like your topic is missing some important information. Could you provide the following, if applicable?
- n8n version:
- Database (default: SQLite):
- n8n EXECUTIONS_PROCESS setting (default: own, main):
- Running n8n via (Docker, npm, n8n cloud, desktop app):
- Operating system:
It looks like MSSQL also supports base64 encoding, so sending over a file should be possible. While it is focused on Postgres, @MutedJam has made a quick overview of this approach here:
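If you go the file route, MSSQL's `BULK INSERT` statement can load the whole file in a single query. Here is a minimal sketch, assuming a CSV that is already accessible to the SQL Server host; the table name `dbo.MyTable` and the file path are placeholders:

```sql
-- Minimal sketch: load an entire CSV in one statement.
-- dbo.MyTable and the file path are placeholder names.
BULK INSERT dbo.MyTable
FROM 'C:\data\rows.csv'
WITH (
    FIELDTERMINATOR = ',',  -- column separator in the file
    ROWTERMINATOR = '\n',   -- row separator
    FIRSTROW = 2            -- skip a header row, if present
);
```

You could run a statement like this from the Microsoft SQL node's "Execute Query" operation, with the caveat that the file has to be visible to the database server itself, not just to the machine running n8n.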
That being said, it sounds like you’d likely prefer to insert the data directly as rows instead of uploading a file? If so, you may need to either create smaller batches (a batched multi-row insert is sketched below), or consider creating a sub-workflow that processes the data in smaller chunks. If the sub-workflow does all the heavy lifting and only returns a small amount of data back to the main workflow, you should see this speed up. You can read more about this here.
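Even without a file, batching many rows into a single `INSERT` cuts the number of round trips dramatically: one query per few hundred rows instead of one query per row. A rough sketch of what one batched statement looks like, with made-up table and column names:

```sql
-- Minimal sketch: one statement inserts many rows at once.
-- dbo.MyTable and its columns are placeholder names.
INSERT INTO dbo.MyTable (id, name, amount)
VALUES
    (1, 'alpha', 10.50),
    (2, 'beta',  20.00),
    (3, 'gamma', 30.25);
```

Note that MSSQL caps a single `VALUES` list at 1,000 rows, so 8,000 rows would fit in roughly eight such statements rather than 8,000 single-row inserts.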