@airmoi
Yeah, I can see how that would be handy. I ran into this “issue” once myself. In my opinion, this should be handled in the core rather than on each node individually.
I think you can work around the “if any query fails, the whole workflow fails” problem by using the SplitInBatches node.
If I understood correctly, you have something like this. In this example the second request should be made, but because the first one fails, the second one (which would succeed) is never executed.
{
  "nodes": [
    {
      "parameters": {},
      "name": "Start",
      "type": "n8n-nodes-base.start",
      "typeVersion": 1,
      "position": [250, 300]
    },
    {
      "parameters": {
        "functionCode": "return [\n {\n json: {\n url: 'https://dog.ceo/api/breeds/image/random2',\n }\n },\n {\n json: {\n url: 'https://dog.ceo/api/breeds/image/random',\n }\n },\n]"
      },
      "name": "Function",
      "type": "n8n-nodes-base.function",
      "typeVersion": 1,
      "position": [480, 300]
    },
    {
      "parameters": {
        "url": "={{$node[\"Function\"].data[\"url\"]}}",
        "options": {}
      },
      "name": "HTTP Request",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 1,
      "position": [710, 300]
    }
  ],
  "connections": {
    "Start": {
      "main": [[{ "node": "Function", "type": "main", "index": 0 }]]
    },
    "Function": {
      "main": [[{ "node": "HTTP Request", "type": "main", "index": 0 }]]
    }
  }
}
In this second example, which uses SplitInBatches, a failed request at least does not stop the workflow; it is simply “ignored”. Make sure “Continue On Fail” is set to true on the HTTP Request node.
{
  "nodes": [
    {
      "parameters": {},
      "name": "Start",
      "type": "n8n-nodes-base.start",
      "typeVersion": 1,
      "position": [250, 300]
    },
    {
      "parameters": {
        "functionCode": "return [\n {\n json: {\n url: 'https://dog.ceo/api/breeds/image/random2',\n }\n },\n {\n json: {\n url: 'https://dog.ceo/api/breeds/image/random',\n }\n },\n]"
      },
      "name": "Function",
      "type": "n8n-nodes-base.function",
      "typeVersion": 1,
      "position": [480, 300]
    },
    {
      "parameters": {
        "url": "={{$node[\"SplitInBatches\"].data[\"url\"]}}",
        "options": {}
      },
      "name": "HTTP Request",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 1,
      "position": [910, 300],
      "continueOnFail": true
    },
    {
      "parameters": {
        "batchSize": 1
      },
      "name": "SplitInBatches",
      "type": "n8n-nodes-base.splitInBatches",
      "typeVersion": 1,
      "position": [690, 300]
    },
    {
      "parameters": {
        "conditions": {
          "boolean": [
            {
              "value1": "={{$node[\"SplitInBatches\"].context[\"noItemsLeft\"]}}",
              "value2": true
            }
          ]
        }
      },
      "name": "IF",
      "type": "n8n-nodes-base.if",
      "typeVersion": 1,
      "position": [1110, 100]
    },
    {
      "parameters": {},
      "name": "NoOp",
      "type": "n8n-nodes-base.noOp",
      "typeVersion": 1,
      "position": [1340, 80]
    }
  ],
  "connections": {
    "Start": {
      "main": [[{ "node": "Function", "type": "main", "index": 0 }]]
    },
    "Function": {
      "main": [[{ "node": "SplitInBatches", "type": "main", "index": 0 }]]
    },
    "HTTP Request": {
      "main": [[{ "node": "IF", "type": "main", "index": 0 }]]
    },
    "SplitInBatches": {
      "main": [[{ "node": "HTTP Request", "type": "main", "index": 0 }]]
    },
    "IF": {
      "main": [
        [{ "node": "NoOp", "type": "main", "index": 0 }],
        [{ "node": "SplitInBatches", "type": "main", "index": 0 }]
      ]
    }
  }
}
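To make the control flow of that second workflow concrete, here is a minimal sketch in plain Python (not n8n) of the same semantics: process one item per loop iteration, and when an item fails, record the error and continue instead of aborting. The `fake_fetch` function is a stand-in I made up so the sketch runs without network access; it is not part of the workflow above.

```python
def process_batches(urls, fetch):
    """Run fetch(url) for every url; a failure is recorded, not fatal."""
    results = []
    for url in urls:  # like SplitInBatches with batchSize 1: one item per loop
        try:
            results.append({"url": url, "data": fetch(url)})
        except Exception as exc:  # like "Continue On Fail": keep going
            results.append({"url": url, "error": str(exc)})
    return results

# Stubbed fetch so the sketch runs offline (hypothetical helper):
def fake_fetch(url):
    if url.endswith("random2"):  # mirrors the broken first URL above
        raise ValueError("404 Not Found")
    return "ok"

urls = [
    "https://dog.ceo/api/breeds/image/random2",  # fails
    "https://dog.ceo/api/breeds/image/random",   # succeeds
]
print(process_batches(urls, fake_fetch))
```

The key point is that the second URL is still processed even though the first one fails, which is exactly what the SplitInBatches loop plus “Continue On Fail” buys you over the first workflow.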