OpenWebUI timeout issue after 60s when using the n8n pipe

Hi everyone,

I’m hosting OpenWebUI on DigitalOcean using the official marketplace droplet. I’m using OpenWebUI as a frontend for my AI agent in n8n, connected via this community pipe:
N8N Pipe Function • Open WebUI Community

Everything works great except when the request takes longer than ~60 seconds — OpenWebUI shows an error, even though the n8n workflow is still running and finishes successfully.

Has anyone faced this issue or knows how to increase the timeout or keep the connection alive? I’d appreciate any help or ideas!

Thanks :pray:


Hi @hussein_Abusetta

It’s not an n8n issue, it’s on the OpenWebUI side!

As a starting point, check the client timeout setting in OpenWebUI: 🌍 Environment Variable Configuration | Open WebUI
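
If you run the official Docker image, that would look something like the following (AIOHTTP_CLIENT_TIMEOUT is the variable I have in mind; double-check the exact name and semantics on the page linked above for your version):

```bash
# Minimal example: raise the client timeout to 300 seconds.
# Variable name assumed from the environment-variable docs; verify it.
docker run -d -p 3000:8080 \
  -e AIOHTTP_CLIENT_TIMEOUT=300 \
  ghcr.io/open-webui/open-webui:main
```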

Thanks for replying! I initially thought it was on the OpenWebUI side as well, but after investigating further, I discovered the error was actually coming from n8n.

I tested the workflow directly from Postman and got a 504 error too. I raised the Nginx timeout to 5 minutes, which stopped the error from appearing, but then the request hangs without returning a response.
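
For reference, the Nginx change was along these lines (upstream address and values are illustrative, so adjust them to your own config; note that Nginx's default proxy_read_timeout is 60s, which matches where the error appeared):

```nginx
# In the server block that proxies to n8n:
location / {
    proxy_pass http://127.0.0.1:5678;  # n8n's default port, assumed here
    proxy_connect_timeout 300s;
    proxy_send_timeout    300s;
    proxy_read_timeout    300s;  # the default 60s is what triggers the 504
}
```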

I think I should open a new post specifically about n8n timeout issues without mentioning OpenWebUI, as it seems to be an n8n-specific problem.

Thanks again


Hi @hussein_Abusetta

I have a method that works around this issue: manage the long-running workflow by polling n8n and checking the execution status.

Here’s a quick summary:

  1. Call the webhook and have it immediately respond with the execution ID.
  2. Store the execution ID in your Pipe function.
  3. Poll the n8n API every 5 seconds to check the execution status, using the endpoint: https://<your-n8n-host>/api/v1/executions/{execution_id}?includeData=false.
  4. Once the execution reports "finished": true, fetch the full execution data using: https://<your-n8n-host>/api/v1/executions/{execution_id}?includeData=true.
  5. Extract the data from the last node, which you can easily identify if you name it “lastNode”.

This way, you can handle long-running tasks without backend timeout issues; a rough sketch of the logic follows.
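
Here is a minimal sketch of that polling logic in Python. It is not my exact function: the base URL, API key, the webhook response shape, and the path into the execution JSON are assumptions you should verify against your own n8n instance and version.

```python
import time
import requests

# Placeholders: point these at your own n8n instance.
N8N_BASE_URL = "https://your-n8n-host"
N8N_API_KEY = "your-api-key"  # created under Settings -> n8n API
HEADERS = {"X-N8N-API-KEY": N8N_API_KEY}


def run_long_workflow(webhook_path: str, payload: dict,
                      poll_seconds: int = 5, timeout_seconds: int = 600) -> dict:
    # Steps 1-2: call the webhook; this assumes your workflow is set up to
    # respond immediately with something like {"executionId": "123"}.
    resp = requests.post(f"{N8N_BASE_URL}/webhook/{webhook_path}",
                         json=payload, timeout=30)
    resp.raise_for_status()
    execution_id = resp.json()["executionId"]

    # Step 3: poll the status every few seconds, without the (potentially
    # large) execution data.
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        status = requests.get(
            f"{N8N_BASE_URL}/api/v1/executions/{execution_id}",
            params={"includeData": "false"}, headers=HEADERS, timeout=30,
        ).json()
        if status.get("finished"):
            break
        time.sleep(poll_seconds)
    else:
        raise TimeoutError(f"Execution {execution_id} still running "
                           f"after {timeout_seconds}s")

    # Step 4: fetch the full execution data once finished.
    data = requests.get(
        f"{N8N_BASE_URL}/api/v1/executions/{execution_id}",
        params={"includeData": "true"}, headers=HEADERS, timeout=30,
    ).json()

    # Step 5: pull the output of the node named "lastNode". The exact path
    # into the JSON can differ between n8n versions, so inspect `data` first.
    run_data = data["data"]["resultData"]["runData"]
    return run_data["lastNode"][-1]["data"]["main"][0][0]["json"]
```

Inside an Open WebUI pipe you would call run_long_workflow() from the pipe method and return the result; the 5-second interval is just what works for me.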

Hope this helps!


No, this is not on the OpenWebUI side :wink:


Yes, you’re right…

There is a similar open GitHub issue for this, by the way:


Thank you so much! I had no idea you could fetch execution data using the ID - that’s a really elegant solution.

I ended up resolving my issue with two modifications:

  • Adjusting the Nginx timeout configuration (as in my earlier reply)
  • Changing the EXECUTIONS_MODE in docker-compose from queue to regular (sketched below)
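
The docker-compose part is just the environment variable. A minimal excerpt, with the rest of the n8n service definition omitted and the image name assumed:

```yaml
services:
  n8n:
    image: docker.n8n.io/n8nio/n8n  # use whatever image/tag you already run
    environment:
      - EXECUTIONS_MODE=regular  # was "queue"
```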

The biggest advantage of your approach is that it wouldn’t require changing the execution mode to regular, which offers more flexibility.

Thanks again for sharing this alternative method

Hi @EtienneDouillard
I’m having the same issue: no responses from long-running workflows in n8n back to Open WebUI. Can you provide any more details about the solution you came up with, so I can implement it on both sides, i.e. the n8n workflow and the Open WebUI function?

regards
DC

@hussein_Abusetta can you explain in a bit more detail the changes you made in your OpenWebUI function? I am having the same error when my workflow takes more than 60 seconds to execute.

I was trying what @EtienneDouillard mentioned about saving the executionId and checking the execution status, but I never figured out how to check the status of my execution when running n8n in my local environment.
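
In case it helps to show what I was attempting, this is the kind of call I tried against my local instance (port, endpoint, and API-key header are from the n8n API docs; the key comes from Settings -> n8n API, and the execution ID is a placeholder):

```python
import requests

execution_id = "12345"  # placeholder: the ID returned by my webhook
resp = requests.get(
    f"http://localhost:5678/api/v1/executions/{execution_id}",
    params={"includeData": "false"},
    headers={"X-N8N-API-KEY": "<your-api-key>"},
    timeout=10,
)
print(resp.status_code, resp.json().get("finished"))
```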

Appreciate your help

Anyone up for modifying the n8n pipe from Cole to support this?

Hi,

For the n8n configuration, please check my reply above; those changes are necessary.

For OpenWebUI, I am now using this community function: N8N Pipeline Function • Open WebUI Community. It works well.

Can you post the Webhook/Respond to Webhook setup from a working workflow? The example at Open-WebUI-Functions/pipelines/n8n/Open_WebUI_Test_Agent.json at master · owndev/Open-WebUI-Functions · GitHub doesn’t really show anything.

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.