Hi … I just did my update as usual but this time starting hangs on “Waiting for tunnel …”.
Maybe the server is down, or maybe I missed a recent change?
I use:
n8n start --tunnel
Best,
daniello
Hi @daniello, I am sorry to hear you’re having trouble. I’ve brought this up with the team and we’ll look into this as soon as we can.
Thanks a lot for flagging this @daniello! Looks like there was an issue with the tunnel server. It got restarted and should now work again fine.
Glad to help … works again
Hey @abhilash,
Thanks for the report, I will let the team know about this one now. If possible, I would suggest not using the tunnel if you are using n8n in a production environment.
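If you do later want to expose a production instance without the tunnel, the rough idea (assuming you run n8n behind your own reverse proxy, and the domain below is just a placeholder for yours) looks like this:
# Sketch only: skip --tunnel and tell n8n its public URL instead
export WEBHOOK_URL=https://n8n.example.com/
n8n start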
Thanks @Jon for the response.
I am using the tunnel for testing some of the n8n functionality, not in production.
Hey @abhilash,
Can you try starting it again and see if it is working now? We have just restarted the tunnel service.
Hello there,
Looks like this is happening again?
Waiting for tunnel ...
Tunnel URL: https://0bc987yli0r3e3ccchlojag2.hooks.n8n.cloud/
IMPORTANT! Do not share with anybody as it would give people access to your n8n instance!
n8n ready on 0.0.0.0, port 5678
Version: 0.197.1
================================
Start Active Workflows:
================================
- My workflow (ID: 1)
node:events:491
throw er; // Unhandled 'error' event
^
Error: connection refused: hooks.n8n.cloud:36358 (check your firewall settings)
at Socket.<anonymous> (/usr/local/lib/node_modules/n8n/node_modules/localtunnel/lib/TunnelCluster.js:52:11)
at Socket.emit (node:events:513:28)
at Socket.emit (node:domain:489:12)
at emitErrorNT (node:internal/streams/destroy:157:8)
at emitErrorCloseNT (node:internal/streams/destroy:122:3)
at processTicksAndRejections (node:internal/process/task_queues:83:21)
Emitted 'error' event on Tunnel instance at:
at TunnelCluster.<anonymous> (/usr/local/lib/node_modules/n8n/node_modules/localtunnel/lib/Tunnel.js:96:12)
at TunnelCluster.emit (node:events:513:28)
at TunnelCluster.emit (node:domain:489:12)
at Socket.<anonymous> (/usr/local/lib/node_modules/n8n/node_modules/localtunnel/lib/TunnelCluster.js:50:14)
at Socket.emit (node:events:513:28)
[... lines matching original stack trace ...]
at processTicksAndRejections (node:internal/process/task_queues:83:21)
Is this something you guys have a hand in, or is it unrelated and just a temporary issue with localtunnel that gets resolved automatically?
Thanks a lot
Hey @aparakian,
Welcome to the community
Just spotted this one. It looks like our monitoring has picked it up, so it should be resolved shortly if it isn’t already.
Thanks for letting us know
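In the meantime, if you want to rule out something on your own network, you can check that the tunnel host is reachable from the machine running n8n (the high port in your error output changes between runs, so substitute whatever your log shows):
# Rough connectivity check from the n8n machine; port taken from your own error output
curl -I https://hooks.n8n.cloud/
nc -vz hooks.n8n.cloud 36358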
I’ve started having the same problem.
Do you think it needs resolving again?
It was working fine before, and I haven’t made any firewall setting changes.
My platform is rpi 4.
I have around a 20% success rate when I run “n8n start --tunnel”; 80% of the time I get the following error.
In the 20% of the time it works, it crashes after a while.
/usr/local/lib/node_modules/n8n/dist/ErrorReporting.js:59
throw error;
^
Error: connection refused: hooks.n8n.cloud:38159 (check your firewall settings)
at Socket.<anonymous> (/usr/local/lib/node_modules/n8n/node_modules/localtunnel/lib/TunnelCluster.js:52:11)
at Socket.emit (node:events:513:28)
at Socket.emit (node:domain:489:12)
at emitErrorNT (node:internal/streams/destroy:157:8)
at emitErrorCloseNT (node:internal/streams/destroy:122:3)
at processTicksAndRejections (node:internal/process/task_queues:83:21)
Emitted 'error' event on Tunnel instance at:
at TunnelCluster.<anonymous> (/usr/local/lib/node_modules/n8n/node_modules/localtunnel/lib/Tunnel.js:96:12)
at TunnelCluster.emit (node:events:513:28)
at TunnelCluster.emit (node:domain:489:12)
at Socket.<anonymous> (/usr/local/lib/node_modules/n8n/node_modules/localtunnel/lib/TunnelCluster.js:50:14)
at Socket.emit (node:events:513:28)
[... lines matching original stack trace ...]
at processTicksAndRejections (node:internal/process/task_queues:83:21)
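If it is just a temporary issue on the tunnel side, would a restart loop like this be a reasonable stopgap for my test box (I guess a systemd unit with Restart=on-failure would be the cleaner version)?
# Crude workaround idea for a test setup only: restart n8n whenever it exits
while true; do
  n8n start --tunnel
  echo "n8n exited, restarting in 10 seconds..." >&2
  sleep 10
done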