Data from Google Sheets isn't retrieved correctly

Hello all,

We’re experiencing issues with the data we retrieve from Google Sheets.
Unfortunately, I won’t be able to share copies of the spreadsheets, as they’re heavy in data (which may be the cause?) and sharing them would require anonymising a large share of that data.

Describe the problem/error/question

Here is the workflow when it works well:
A trigger fires every 2 hours and calls two Google Sheets read nodes. The two nodes read two different files with a similar structure. Lately, for example, we had 4752 items on one side and 4782 items on the other (the expected behaviour is that new items/rows are added manually to the second Google Sheet). We then merge the data through a Merge node, which outputs the 30 new rows in order to append them to the first Google Sheet through a Google Sheets write node.
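For context, the Merge node step is effectively a set difference on a key column. Here is a minimal Python sketch of the same logic (the `id` key column and the row values are invented for illustration; our real sheets use different columns):

```python
def find_new_rows(sheet1_rows, sheet2_rows, key="id"):
    """Return the rows that exist in sheet 2 but not yet in sheet 1."""
    seen = {row[key] for row in sheet1_rows}
    return [row for row in sheet2_rows if row[key] not in seen]

# Fake data mirroring the counts above: 4752 rows vs 4782 rows
sheet1 = [{"id": i} for i in range(4752)]
sheet2 = [{"id": i} for i in range(4782)]

new_rows = find_new_rows(sheet1, sheet2)
print(len(new_rows))  # 30 rows to append back to spreadsheet 1
```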

However, lately, we have seen two unexpected behaviours when examining the executions made every 2 hours:
1/ Sometimes the data retrieved from spreadsheet 2 has fewer items (rows) than the previous execution. For example, with the workflow mentioned above, previous executions returned:

  • 4752 items at 7AM,
  • 4591 items at 9AM,
  • 4752 items at 11AM / 1PM / 3PM / 5PM

2/ In the example above, the 5PM execution retrieved 4752 items/rows from spreadsheet 2, which is the same number of items as in spreadsheet 1. However, this was no longer correct: during the afternoon, 30 rows/items were added by our spreadsheet users and the node didn’t retrieve them. We executed the workflow manually at 6:23PM and this time it retrieved the 30 new items/rows correctly.

=> These two behaviours have been happening every day on 3 workflows that are set up like this one, with different spreadsheets (but the same settings).

Here are the settings of spreadsheet 1:

Here are the settings of spreadsheet 2:

(We filter on a column of the spreadsheet, which was correctly set up at the times the data wasn’t retrieved correctly.)

The merging operation then works correctly; the issue really lies with the retrieval of the data.

What is the error message (if any)?

There is no error message; the workflow seems to be triggered and executed correctly.

Please share your workflow:

Can you please help with this? It’s getting troublesome for us to trigger the workflows manually every day, and we don’t understand what could cause this.

Information on your n8n setup

  • n8n version: 0.221.2 (1 version behind the latest)
  • Database (default: SQLite): n/a
  • n8n EXECUTIONS_PROCESS setting (default: own, main): triggered every 2 hours
  • Running n8n via (Docker, npm, n8n cloud, desktop app): n8n cloud
  • Operating system: n/a

Hi @oly-dev, I am very sorry you are having trouble with this. Unfortunately I could not reproduce this with any of my own sheets so far.

Can you confirm whether the problem already occurs on the initial "PW x Vestiare Collective Connected CSV" node? Or is it the PS_Oly node on which the number of items changes unexpectedly? Or is this happening on a completely different node (and if so, can you isolate the problem further)?

And can you make the respective spreadsheet (or a copy of it with the personal data redacted) public so I can try to reproduce the problem?

Hi @MutedJam,

Thanks for your answer.

No, the issue actually happens in the PS_Oly node, which sometimes doesn’t retrieve the latest data (a bit randomly - it can take a few 2-hour triggers or a whole day to finally see the right number of items).

It doesn’t happen all the time, so trying once to add new items doesn’t help the investigation much (I already tried in a duplicated workflow with duplicates of the spreadsheets, and the data was retrieved correctly).

Thanks for your help.

Oly team

Oh, seeing that this problem only occurs on your original spreadsheet but not on any copy of it, I think the underlying problem might not be with n8n but with the spreadsheet itself, and it wouldn’t make much sense for me to work with the copy.

On a general note, n8n would only return the data it receives from the Google Sheets API and wouldn’t remove any items on its own. Is there a chance other users, applications, or scripts are accessing your original sheet while your n8n workflow runs?

Yes, I know you return what you get from Google, but sometimes there is less data at one trigger while the executions before and after have the same data (it’s as if the Google API received too many requests and didn’t load some data at the end).
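In the meantime, we’re considering a sanity check after the read node: since rows are only ever appended in our setup, a row count lower than the previous execution would suggest a truncated read, and the merge should be skipped for that run. A rough sketch of the idea in Python (hypothetical logic, not n8n code; how the previous count is persisted is left out):

```python
def is_read_plausible(current_count, previous_count):
    """Rows are only appended in our sheets, so a count lower than the
    previous execution suggests a truncated/stale read to be retried."""
    if previous_count is not None and current_count < previous_count:
        return False  # suspect read: skip the merge this run
    return True

# The 2-hourly counts described above
assert is_read_plausible(4752, None)      # first run, nothing to compare
assert not is_read_plausible(4591, 4752)  # the 9AM dip would be flagged
assert is_read_plausible(4782, 4752)      # 30 new rows: normal growth
```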

I’m wondering if it comes from the filtering option, because it really acts as if the data is cached and not updated when it should be.
I also tried doing a plain read of the Google Sheet followed by an ‘If’ node to filter the data (that way I do get the right number of rows), but I then encounter memory-related errors.
Is the spreadsheet data too heavy for this manipulation?
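For what it’s worth, when we test the same filter outside n8n, streaming the rows with a generator instead of building the full filtered list keeps memory flat. A small illustrative sketch (the `status` column and its values are invented):

```python
def filter_rows(rows, column, value):
    """Yield matching rows one at a time instead of materialising a list."""
    for row in rows:
        if row.get(column) == value:
            yield row

# 10,000 fake rows; only odd ids are "active"
rows = ({"id": i, "status": "active" if i % 2 else "archived"} for i in range(10000))
matches = sum(1 for _ in filter_rows(rows, "status", "active"))
print(matches)  # 5000
```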

Here is the workflow that failed:

By the way, the copies of the spreadsheets were exactly the same as the spreadsheets we use, but I don’t want to risk a bad manipulation by sharing them.

Thanks again,
Oly team

Thanks for clarifying @oly-dev!

I can’t change Google’s caching behaviour, but I’ll check whether we can tweak the memory allocation for your cloud instance a little bit. I’m not sure it will be enough to allow using the If node against all your data, but it seems worth a shot. I’ll get back to you once I get feedback on this.

Hi @oly-dev, the memory update has now taken place; however, I can see that your cloud instance is still struggling. While our engineering team is looking into this, can you also consider:

  1. Upgrading to the latest available version of n8n cloud via the Manage page of your n8n cloud dashboard. This lets you benefit from the latest performance improvements.
  2. Avoiding storing all successful and manual executions. This can be configured on the Manage page as well. The reason behind this is that n8n cloud uses SQLite databases, which might struggle with large amounts of execution data (like many spreadsheet rows) being written frequently. The behaviour can still be overridden for individual workflows in the workflow settings as needed.

I’ll get back to you once I have any feedback on this one.

Hi @MutedJam,
Thanks for the investigation and the actions put in place.
It seems we’re no longer having problems. We will also follow the two pieces of advice you gave here.



This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.