Trying to read a 100 MB JSON file

Describe the problem/error/question

I'm trying to read a JSON file from local disk and insert the items into a MySQL database, but the workflow has already been running for an hour and is still going…

It's a JSON file of about 100 MB with 370k records.
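For scale: if each record becomes its own INSERT, then even at a modest ~10 ms per statement round trip (an assumed figure, not measured), 370,000 × 0.01 s ≈ 62 minutes of pure insert time, before any JSON parsing overhead.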

What is the error message (if any)?

No error message.

Please share your workflow

```json
{
  "nodes": [
    {
      "parameters": {
        "conditions": {
          "options": {
            "caseSensitive": true,
            "leftValue": "",
            "typeValidation": "strict",
            "version": 2
          },
          "conditions": [
            {
              "id": "556abbb0-c8d9-42c0-8eac-4432cd015bb1",
              "leftValue": "={{ $json[\"2\"] }}",
              "rightValue": "",
              "operator": {
                "type": "string",
                "operation": "notEmpty",
                "singleValue": true
              }
            }
          ],
          "combinator": "and"
        },
        "options": {}
      },
      "id": "2fd422cd-c96f-4ba8-930a-5532f1c8db87",
      "name": "Filtra Sem Inscrição",
      "type": "n8n-nodes-base.filter",
      "typeVersion": 2.2,
      "position": [560, 180]
    },
    {
      "parameters": {
        "fieldToSplitOut": "consultanacional.dados",
        "options": {}
      },
      "id": "de24bee4-62eb-4f65-b652-57f0ff5b08a0",
      "name": "Cria lista",
      "type": "n8n-nodes-base.splitOut",
      "typeVersion": 1,
      "position": [380, 180]
    },
    {
      "parameters": {
        "table": {
          "__rl": true,
          "value": "nutricionista",
          "mode": "list",
          "cachedResultName": "nutricionista"
        },
        "dataMode": "defineBelow",
        "valuesToSend": {
          "values": [
            { "column": "crn_id", "value": "={{ $json[\"0\"] }}" },
            { "column": "nome", "value": "={{ $json[\"3\"] }}" },
            { "column": "registro", "value": "={{ $json[\"2\"] }}" },
            { "column": "data_cadastro", "value": "={{ $now.format('yyyy-LL-dd HH:mm:ss') }}" },
            { "column": "tipo_registro", "value": "={{ $json[\"15\"] }}" },
            { "column": "situacao", "value": "={{ $json[\"16\"] }}" },
            { "column": "cpf", "value": "={{ $json[\"7\"] }}" }
          ]
        },
        "options": {
          "queryBatching": "independently",
          "detailedOutput": true
        }
      },
      "id": "96ba6f04-cc01-45a5-9564-de720c078f0d",
      "name": "Insere no banco",
      "type": "n8n-nodes-base.mySql",
      "typeVersion": 2.4,
      "position": [760, 180],
      "credentials": {
        "mySql": {
          "id": "Q2RO3wnOSF8sPTB2",
          "name": "Docker-Desktop-Fellipe"
        }
      }
    },
    {
      "parameters": {
        "toRecipients": "[email protected]",
        "subject": "[N8N] - Base CNN Atualizada",
        "bodyContent": "=Foi realizado uma atualização na base da Consulta Nacional de Nutricionistas\n\nData: {{ $now.format('yyyy-LL-dd HH:mm:ss') }}\nStatus: Êxito\nArquivo importado: Consulta Nacional( Nutricionistas).json\nQuantidade de registros: {{ $json.data.affectedRows }}\n",
        "additionalFields": {}
      },
      "id": "010a799f-e716-48f2-ac03-a9aba22b96ca",
      "name": "Microsoft Outlook",
      "type": "n8n-nodes-base.microsoftOutlook",
      "typeVersion": 2,
      "position": [960, 180],
      "executeOnce": true,
      "credentials": {
        "microsoftOutlookOAuth2Api": {
          "id": "xDMhq9ZqTYaRXYBl",
          "name": "Microsoft Outlook account"
        }
      }
    },
    {
      "parameters": {
        "path": "testeste",
        "options": {}
      },
      "type": "n8n-nodes-base.webhook",
      "typeVersion": 2,
      "position": [-40, 180],
      "id": "253f9c3b-c138-4714-beeb-ff8e6bc8b085",
      "name": "Webhook",
      "webhookId": "771ec106-1d53-4c9a-86f4-814ed009552b"
    },
    {
      "parameters": {
        "fileSelector": "/n8n/cnn/Consulta Nacional( Nutricionistas).json",
        "options": {}
      },
      "type": "n8n-nodes-base.readWriteFile",
      "typeVersion": 1,
      "position": [180, 180],
      "id": "f9048cad-4ecd-4236-801e-11e5d32daf67",
      "name": "Read/Write Files from Disk"
    }
  ],
  "connections": {
    "Filtra Sem Inscrição": {
      "main": [[{ "node": "Insere no banco", "type": "main", "index": 0 }]]
    },
    "Cria lista": {
      "main": [[{ "node": "Filtra Sem Inscrição", "type": "main", "index": 0 }]]
    },
    "Insere no banco": {
      "main": [[{ "node": "Microsoft Outlook", "type": "main", "index": 0 }]]
    },
    "Webhook": {
      "main": [[{ "node": "Read/Write Files from Disk", "type": "main", "index": 0 }]]
    },
    "Read/Write Files from Disk": {
      "main": [[{ "node": "Cria lista", "type": "main", "index": 0 }]]
    }
  },
  "pinData": {}
}
```
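
Worth noting in the workflow above: the MySQL node is set to "queryBatching": "independently", so every one of the ~370k filtered items is sent as its own INSERT statement. A hedged alternative is to collapse the items into multi-row statements first. The sketch below assumes an n8n Code node placed after the filter, feeding a MySQL node running an Execute SQL operation; the chunk size and the ad-hoc escaping are illustrative, not from the original workflow:

```javascript
// Illustrative n8n Code node: group incoming items into chunks and emit one
// multi-row INSERT per chunk. Column keys ("0", "2", "3", "7", "15", "16")
// mirror the mappings in the workflow above; CHUNK_SIZE is an assumed value
// to tune.
const CHUNK_SIZE = 1000;
const items = $input.all();
const out = [];

// Minimal escaping for the sketch; a production version should use
// parameterized queries rather than string interpolation.
const esc = (v) => `'${String(v ?? '').replace(/'/g, "''")}'`;

for (let i = 0; i < items.length; i += CHUNK_SIZE) {
  const values = items
    .slice(i, i + CHUNK_SIZE)
    .map(({ json: j }) =>
      `(${esc(j['0'])}, ${esc(j['3'])}, ${esc(j['2'])}, NOW(), ${esc(j['15'])}, ${esc(j['16'])}, ${esc(j['7'])})`
    )
    .join(',\n');
  out.push({
    json: {
      query:
        'INSERT INTO nutricionista ' +
        '(crn_id, nome, registro, data_cadastro, tipo_registro, situacao, cpf)\nVALUES\n' +
        values + ';',
    },
  });
}
return out;
```

With a chunk size of 1000, the 370k rows would become roughly 370 queries instead of 370k round trips.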

Share the output returned by the last node

No output.

Information on your n8n setup

  • n8n version: 1.72.1
  • Database (default: SQLite): SQLite
  • n8n EXECUTIONS_PROCESS setting (default: own, main): regular
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker
  • Operating system: Ubuntu 22.04

Reading a file this large might be challenging. Can you check memory/CPU usage while the workflow runs? Your specs may simply not be enough to handle a file that size: Memory-related errors | n8n Docs
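
Since n8n is running in Docker here, `docker stats` will show the container's live memory use during the execution. If memory does turn out to be the bottleneck, one hedged workaround is to split the file outside n8n first so each execution only loads a slice. A minimal Node.js sketch, assuming the file path from the workflow and the `consultanacional.dados` array shape implied by its Split Out node; the chunk size is an assumption to tune:

```javascript
// Hypothetical pre-processing script (run with Node outside n8n): split the
// big JSON into smaller chunk files so each n8n execution loads only a slice.
const fs = require('fs');

const CHUNK_SIZE = 10000; // assumed; tune against available memory
const raw = fs.readFileSync('/n8n/cnn/Consulta Nacional( Nutricionistas).json', 'utf8');
const rows = JSON.parse(raw).consultanacional.dados; // ~370k records

for (let i = 0; i < rows.length; i += CHUNK_SIZE) {
  const chunk = rows.slice(i, i + CHUNK_SIZE);
  // Preserve the original top-level shape so the existing Split Out node works.
  fs.writeFileSync(
    `/n8n/cnn/chunk-${String(i / CHUNK_SIZE).padStart(3, '0')}.json`,
    JSON.stringify({ consultanacional: { dados: chunk } })
  );
}
console.log(`wrote ${Math.ceil(rows.length / CHUNK_SIZE)} chunk files`);
```

Note this script still parses the full 100 MB once itself; it just keeps each subsequent n8n run small. The docs page linked above also covers raising the Node.js heap limit (e.g. `NODE_OPTIONS=--max-old-space-size=4096`) if the host has RAM to spare.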
