Extract JSON array to insert into a database

Hi all,

I’m quite a newbie at using n8n, which is really great.

I’m planning to use it a lot at home to manage all kinds of stuff and act as the link between my home automation (domotics) system and HomeKit. That’s the context!

For now, I’m struggling with handling data that I would like to insert into a SQL database, probably TimescaleDB.

I query an API with the HTTP Request node, and I get this:

[
  {
    "estimated_actuals": [
      {
        "pv_estimate": 0,
        "period_end": "2021-08-19T20:00:00.0000000Z",
        "period": "PT30M"
      },
      {
        "pv_estimate": 0.0046,
        "period_end": "2021-08-19T19:30:00.0000000Z",
        "period": "PT30M"
      },
      {
        "pv_estimate": 0.0651,
        "period_end": "2021-08-19T19:00:00.0000000Z",
        "period": "PT30M"
      },
      {
        "pv_estimate": 0.6825,
        "period_end": "2021-08-19T18:30:00.0000000Z",
        "period": "PT30M"
      }
    ]
  }
]

I’m struggling with the Set node to extract the data so it can be inserted row by row into a database.

Am I missing something simple, or does it require writing a JavaScript function?

Thanks a lot for your help!

Welcome to the community @mcc37

Sadly, you cannot extract the data with a Set node. For that, a Function node after the HTTP Request node is needed. The Function node should look as shown below.

// Grab the array from the single item returned by the HTTP Request node
const estimated_actuals = items[0].json.estimated_actuals

// Return one n8n item per estimate
return estimated_actuals.map(estimate => ({ json: estimate }))
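
This turns each element of estimated_actuals into a separate n8n item, so a database node placed after it will insert one row per estimate.
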
Example workflow
{
  "nodes": [
    {
      "parameters": {},
      "name": "Start",
      "type": "n8n-nodes-base.start",
      "typeVersion": 1,
      "position": [
        250,
        300
      ]
    },
    {
      "parameters": {
        "functionCode": "return [\n    {\n      json: {\n        \"estimated_actuals\": [\n                                {\n                                  \"pv_estimate\": 0.0046,\n                                  \"period_end\": \"2021-08-19T19:30:00.0000000Z\",\n                                  \"period\": \"PT30M\"\n                                },\n                                                                {\n                                  \"pv_estimate\": 0.0046,\n                                  \"period_end\": \"2021-08-19T19:30:00.0000000Z\",\n                                  \"period\": \"PT30M\"\n                                },\n                            ]\n  \n        }\n     }  \n  ]\n"
      },
      "name": "Function",
      "type": "n8n-nodes-base.function",
      "typeVersion": 1,
      "position": [
        560,
        300
      ],
      "notesInFlow": true,
      "notes": "HTTP Request Mockup"
    },
    {
      "parameters": {
        "functionCode": "const estimated_actuals = items[0].json.estimated_actuals\n\nreturn estimated_actuals.map(estimate => ({ json: estimate }))\n"
      },
      "name": "Function1",
      "type": "n8n-nodes-base.function",
      "typeVersion": 1,
      "position": [
        800,
        300
      ],
      "notesInFlow": true,
      "notes": "Mapping"
    }
  ],
  "connections": {
    "Start": {
      "main": [
        [
          {
            "node": "Function",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Function": {
      "main": [
        [
          {
            "node": "Function1",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  }
}

Thanks @RicardoE105

I’m gonna try this.

It would be a nice evolution to be able to manipulate this kind of JSON without code.

Or maybe I should build a connector for this API that would handle this and return the data in the right format.


FYI, a node to do this without code is in the works.


Great news @sirdavidoff! I will be happy to test it as soon as it’s available.

Just tested your function @RicardoE105, and it works perfectly. Thanks a lot!

Just to share: I managed to set up my n8n workflow to pull estimated PV production for my home from the public Solcast API and insert it into a TimescaleDB instance.
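
In case it’s useful, here is roughly what the step between the mapping Function node and the Postgres/TimescaleDB node can look like. This is just a sketch: the column names "time" and "pv_estimate" are only examples, so adjust them to your own table.

// Function node placed between the mapping step and the Postgres/TimescaleDB node.
// It reshapes each estimate so the fields match the target columns.
// The column names "time" and "pv_estimate" below are only examples.
return items.map(item => ({
  json: {
    time: new Date(item.json.period_end).toISOString(), // normalized timestamp for the hypertable time column
    pv_estimate: item.json.pv_estimate, // estimated PV power for the 30-minute period
  },
}));

The Postgres node after it should then only need the table name and those two column names.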

I also have another one for predicted PV production.

I’m gonna be able to compare it with real production without any code. Just amazing :slight_smile:


Really cool. Thanks for sharing.