How to use wait node webhook call?

Hello community.
I’m extremely surprised that no one has been able to find a solution to a simple problem on n8n.

Describe the problem/error/question

I use an external document processing service. I send it a document, and it sends a webhook with JSON to a static URL I specified on the document processing service’s website. It is important to use this not as a trigger, but in the middle of the process.
I tried using the Wait node with "On Webhook Call", but it generates a new URL on each execution. How can I receive the webhook at a static address?
I’ve read all the forum threads, but no one has explained or shown an example of how to do this. Is it even possible to implement such a trivial task on n8n?

What is the error message (if any)?

Please share your workflow

Share the output returned by the last node

Information on your n8n setup

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hi @vadim_ivanov, welcome!
This is expected: the Wait node does not expose a fixed URL you can configure. Instead, at runtime it generates a per-execution resume URL, available in expressions as:

{{$execution.resumeUrl}}

The fix I can suggest is not to do everything in a single workflow: create a separate workflow with a Webhook node. That workflow uses the static path you can configure on the external service's website, which makes sure the service always gets the same URL. Let me know if this works.
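To illustrate why a separate Webhook workflow gives a stable address, here is a minimal sketch in plain Python. The hostname and the `parseur-result` path are assumptions for the example; the point is only that a production Webhook node's URL is a fixed function of its configured path, unlike the per-execution resume URL.

```python
# Sketch only: example hostname; "parseur-result" is an assumed webhook path.
N8N_BASE = "https://n8n.example.com"

def static_webhook_url(path: str) -> str:
    # A production Webhook node with this path is always reachable at the
    # same address, regardless of which execution is currently running.
    return f"{N8N_BASE}/webhook/{path}"

# Every call yields the same URL, so the external service can store it once.
url_a = static_webhook_url("parseur-result")
url_b = static_webhook_url("parseur-result")
```

You would paste that one URL into the document processing service's settings once, and the Webhook workflow then handles every callback.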


Thanks for the answer. I know that I can use a Webhook trigger in a separate workflow. That is why I'm so disappointed that n8n doesn't have a simple solution for such a simple task.

Welcome @vadim_ivanov :wave:,

IMO, using the Wait node with the On Webhook Call option isn't the right approach for this case, because n8n currently generates a new dynamic URL for wait webhooks (.../webhook-waiting/{execution_id}) by design.

It needs to know exactly which running execution to resume, and since the execution ID is unique, exposing a single static URL to your service would require a second workflow just to handle the mapping (which is a headache to manage).
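To make the mapping headache concrete, here is an illustrative sketch in plain Python of what that second workflow would have to do. None of these names are n8n APIs: `register` stands for storing the resume URL at upload time, and `dispatch` stands for the static-webhook workflow looking up which waiting execution to resume.

```python
# Illustrative sketch only; not n8n code.
resume_urls = {}  # document_id -> per-execution resume URL

def register(document_id: str, resume_url: str) -> None:
    # At upload time, the main workflow would store its
    # {{$execution.resumeUrl}} keyed by the document ID.
    resume_urls[document_id] = resume_url

def dispatch(payload: dict) -> str:
    # The static webhook receives the service's JSON and must work out
    # which waiting execution to resume; in n8n, an HTTP Request node
    # would then POST the payload to this URL.
    return resume_urls.pop(payload["document_id"])

register("doc-42", "https://n8n.example.com/webhook-waiting/12345")
target = dispatch({"document_id": "doc-42", "result": "ok"})
```

You would also need persistent storage for the mapping and cleanup for executions that never resume, which is exactly the bookkeeping that makes this approach painful.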

Instead, I recommend a polling architecture, which keeps everything in a single workflow and is very common for situations like this.

Since you are using Parseur, you can use their API to check a document's status using the document ID you get from the upload step.

You can build a simple loop inside your workflow:

  1. Upload the file
  2. Wait some time (seconds, minutes, etc.)
  3. Check the status via the API
  4. Logic: parse the status with conditional branches; loop back if still processing, continue once processed
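The four steps above can be sketched in plain Python. This is a generic polling loop, not n8n code: `check_status` stands in for the HTTP call to the service's status endpoint (the exact Parseur endpoint is not shown here and is an assumption).

```python
import time

def poll_until_processed(check_status, interval_s=1.0, max_attempts=10):
    # Steps 2-4 of the loop: wait, check status, branch on the result.
    for _ in range(max_attempts):
        if check_status() == "processed":  # step 4: continue if processed
            return True
        time.sleep(interval_s)             # step 2: wait some time
    return False                           # gave up: still processing

# Simulate a service that finishes on the third status check.
responses = iter(["processing", "processing", "processed"])
done = poll_until_processed(lambda: next(responses), interval_s=0)
```

In n8n, the same shape is a Wait node plus an HTTP Request node feeding an If node that loops back while the status is still "processing"; a `max_attempts`-style cap avoids an infinite loop if the document never finishes.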

Here is an example of how I would set this up: