🚨 Help Needed: Stuck on Sending Data to BigQuery

Hey guys! :smiley:

I’m new to n8n and exploring how to integrate Google Cloud services such as BigQuery and Google Sheets into my workflows. So far, I’ve built a workflow that processes data between Google Sheets and BigQuery, but I’ve hit a snag.

Describe the problem/error/question

I’m encountering an error when trying to send the data to BigQuery via an HTTP Request node.

I suspect the issue might be related to how I’m setting up my HTTP Request node, particularly with the URL configuration or perhaps network settings. If anyone has encountered a similar issue or has insights on how to resolve this, your feedback would be greatly appreciated!

What is the error message (if any)?

ERROR: The service refused the connection - perhaps it is offline connect ECONNREFUSED ::1:80

Please share your workflow

Here’s a quick overview of my workflow:

  1. Query BigQuery using the native BigQuery node to fetch user table data.
  2. Use a Set node to concatenate the first and last names and split the email.
  3. Use another Set node to concatenate the full name with the company name.
  4. Route items based on account language (French or English).
  5. Input data into a Google Sheet.
  6. Wait for 10 minutes (an external script completes the sheet with additional data).
  7. Fetch the updated data from Google Sheet.
  8. Remove duplicates.
  9. Set up to retrieve specific data (LinkedIn URLs, job titles, headlines).
  10. Merge this data with the second data set, ensuring the fetched rows match the correct IDs and other key fields from the query.
  11. A custom script node to create SQL Update queries.
  12. Attempt to send these updates to BigQuery via an HTTP Request node.
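For reference, the custom script node in step 11 could be sketched roughly like this (the `buildUpdateQueries` helper, the field names, and the table name are illustrative assumptions, not the actual workflow code):

```javascript
// Hypothetical sketch of step 11: build one SQL UPDATE statement per item.
// Field names (linkedinUrl, jobTitle, headline) and the table are assumptions.
function buildUpdateQueries(items, table) {
  // Escape single quotes so the generated SQL string literals stay valid.
  const esc = (v) => String(v).replace(/'/g, "\\'");
  return items.map(({ id, linkedinUrl, jobTitle, headline }) => (
    `UPDATE \`${table}\` ` +
    `SET linkedin_url = '${esc(linkedinUrl)}', ` +
    `job_title = '${esc(jobTitle)}', ` +
    `headline = '${esc(headline)}' ` +
    `WHERE id = ${Number(id)}`
  ));
}
```

In an n8n Code node the items would come from the previous node's output rather than a function argument, but the string-building logic would be the same.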

Information on your n8n setup

  • n8n version: 1.27.2
  • Running n8n via (Docker, npm, n8n cloud, desktop app): n8n cloud

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

@David_Lopes , it doesn’t look like you are using the API endpoint correctly. Judging by the URL in the node, you are using the jobs.insert API, which is meant for managing job metadata only. This endpoint accepts a POST body corresponding to a Job object. However, the object structure you submit does not seem to match the expected format.

Is it the correct endpoint in the first place? Did you mean to update some table records instead? Unfortunately, I’m not familiar enough with the BigQuery API to provide a definitive solution. The documentation, though, is here.
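For context, the jobs.insert endpoint expects a Job resource with the SQL nested under `configuration.query`, which is likely why a flat payload gets rejected. A minimal sketch of that shape (the table name and statement are placeholders):

```javascript
// Shape of the Job resource that BigQuery's jobs.insert endpoint expects.
// The SQL must be nested under configuration.query, not sent at the top level.
// The statement below is a placeholder.
const jobBody = {
  configuration: {
    query: {
      query: "UPDATE `my_dataset.t_e_user` SET headline = 'x' WHERE id = 1",
      useLegacySql: false, // DML statements require standard SQL
    },
  },
};
```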

Yes, I would like to update some elements of this table:
ProjectId = HL
DatasetId = postgres_public.t_e_user

Ok, I’ve tried with this URL https://bigquery.googleapis.com/bigquery/v2/projects/datasets/{datasetId}

But it didn’t work. If someone has already implemented a POST to the BigQuery API, that would be great.

@David_Lopes , why don’t you use the native Google BigQuery node for this task? It appears to support running custom queries on the datasets.

I discovered the solution to my issue: the URL endpoint I’m using is correct. The key was to build the body content in raw format with a JSON content type, and to add a function containing my specific query before making the HTTP request.
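For anyone landing here later, here is a hedged sketch of what such a request could look like, assembled as a plain object. The `buildBigQueryRequest` helper, project ID, and token are illustrative assumptions; it targets BigQuery's jobs.query endpoint, and n8n's HTTP Request node would carry the same URL, headers, and raw JSON body:

```javascript
// Sketch (not the poster's exact setup): assemble an HTTP request that runs
// a SQL statement via BigQuery's jobs.query endpoint. Project ID and token
// are placeholders supplied by the caller.
function buildBigQueryRequest(sql, projectId, accessToken) {
  return {
    url: `https://bigquery.googleapis.com/bigquery/v2/projects/${projectId}/queries`,
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${accessToken}`,
    },
    // Raw JSON body, as described in the solution above.
    body: JSON.stringify({ query: sql, useLegacySql: false }),
  };
}
```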

This approach is effective, though occasionally I encounter empty responses, which account for less than 5% of the total data retrieved.

I plan to investigate this minor issue further later, but for now it’s significant progress :slight_smile:


This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.