How to insert 6000+ rows from PostgreSQL into Excel without hitting a 'Gateway timed out' error in n8n?

I am trying to insert over 6000 rows of data from PostgreSQL into Microsoft Excel 365 using n8n. However, I am encountering a “Gateway timed out - perhaps try again later?” (504 Gateway Timeout) error when processing the large dataset.

What I’ve tried:

  • Using the PostgreSQL node to retrieve the data and sending it directly to the Microsoft Excel 365 node for appending.
  • The workflow works fine with smaller datasets but fails once it processes 6000+ rows.
  • I’ve confirmed that a smaller portion of the same data imports without errors, so it’s specifically the dataset size causing the issue.

What are the best practices for processing large datasets in one go without running into timeout errors, and is there a way to increase the timeout or optimize the process for better performance?

Thanks in advance!

If you send all 6000 items straight into the Microsoft Excel 365 node, n8n fires one API request per item in quick succession, which can quickly hit Microsoft’s rate limit.

To handle this, you can use the Loop Over Items node.

In the Loop Over Items node you can set the Batch Size, which controls how many items are passed through the loop on each iteration.

A Wait node is optional. Test whether you need one to stay under the rate limit, and tune the wait duration accordingly.
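For illustration, here’s a minimal sketch of the batch-and-wait pattern those two nodes implement (plain JavaScript rather than actual n8n configuration; the batch size and delay are assumed values you’d tune against Microsoft’s limits):

```js
// Sketch of the batch-and-wait pattern (Loop Over Items + Wait).
const BATCH_SIZE = 200; // items per loop iteration (assumed value)
const DELAY_MS = 2000;  // pause between batches (assumed value)

// Stand-in for the Microsoft Excel 365 append call made once per batch.
async function appendToExcel(batch) {
  // ... send one append request for the whole batch ...
}

async function processAllRows(rows) {
  for (let i = 0; i < rows.length; i += BATCH_SIZE) {
    const batch = rows.slice(i, i + BATCH_SIZE);       // Loop Over Items: one batch per iteration
    await appendToExcel(batch);                        // Excel node inside the loop
    await new Promise((r) => setTimeout(r, DELAY_MS)); // Wait node between batches
  }
}
```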

Is this correct? Or should the output of the Loop Over Items node be what gets entered into Excel?

In the Excel node I set the input to begin at row 5. If I use looping, will each batch overwrite the previous one?

Just replace the “Replace Me” node with the Microsoft Excel 365 node.

The Loop Over Items node splits the items into batches, so each iteration sends only one batch to Excel; for your row 5 question, see the sketch below.
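On overwriting: if the Excel node is appending, each batch should land after the previous one; but if you hard-code a start row, every iteration would write to the same place. Here is a sketch of the per-batch offset arithmetic (the row and batch values are assumptions):

```js
// Sketch: a non-overlapping start row for each batch.
const FIRST_ROW = 5;    // where the first batch begins
const BATCH_SIZE = 200; // must match the Loop Over Items batch size (assumed value)

function startRowForBatch(runIndex) {
  return FIRST_ROW + runIndex * BATCH_SIZE;
}
// Batch 0 -> row 5, batch 1 -> row 205, batch 2 -> row 405, ...
// In n8n the same arithmetic can live in an expression, e.g. {{ 5 + $runIndex * 200 }},
// using the built-in $runIndex loop counter.
```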

Thank you, the data has been entered into Excel.

How do I keep the rows in sequential order when they are written to Excel?
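One way to do this (a sketch, assuming your source table has a sortable key) is to fix the order before the loop: add an ORDER BY to the PostgreSQL query, or tag each item with its position in a Code node so the sequence is visible in Excel. For example (the field name rowIndex is a hypothetical choice):

```js
// n8n Code node, mode "Run Once for All Items", placed before the loop.
// Tags each item with its position; $input.all() is the Code node's
// built-in accessor for all incoming items.
return $input.all().map((item, i) => ({
  json: { ...item.json, rowIndex: i + 1 },
}));
```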
