520 Cloudflare error keeps interrupting my workflow

**The workflow on n8n has become unusable.

The service failed to process your request [item 72]**

api.openai.com | 520: Web server is returning an unknown error

Error code 520. Visit cloudflare.com for more information.
2025-09-24 13:57:06 UTC
You (Browser): Working · Frankfurt (Cloudflare): Working · api.openai.com (Host): Error

What happened?

There is an unknown connection issue between Cloudflare and the origin web server. As a result, the web page can not be displayed.

What can I do? If you are a visitor of this website:

Please try again in a few minutes.

If you are the owner of this website:

There is an issue between Cloudflare’s cache and your origin web server. Cloudflare monitors for these errors and automatically investigates the cause. To help support the investigation, you can pull the corresponding error log from your web server and submit it to our support team. Please include the Ray ID (which is at the bottom of this error page).

Cloudflare Ray ID: 9842c7b43b70bb92 • Your IP: 20.218.174.13 • Performance & security by Cloudflare


This is really annoying; it has already cost us $60+ in OpenAI API costs that went straight down the drain.

Can you share your workflow? Also please answer:

  • specify the node that throws the error or where you spot it
  • is it the only workflow affected?
  • did it work before?
  • any other network issues?
  • can you share more details on your n8n setup - hosting, proxies, docker?, etc

It’s possible Cloudflare is rate-limiting you. IP-based rate limits can last a long time, a day for example. Not sure if that’s the cause, but it could be.
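If rate limiting is the cause, a common mitigation is to retry with exponential backoff whenever the API returns a 429 or a 5xx-class status like the 520 above. This is a minimal Python sketch, not an n8n feature; the set of retryable status codes and the delay values are illustrative assumptions you would tune yourself:

```python
import random
import time

# Statuses that are usually transient and worth retrying (assumption).
RETRYABLE = {429, 500, 502, 503, 520}

def call_with_backoff(request_fn, max_retries=5, base_delay=1.0):
    """Call request_fn() and retry on retryable HTTP status codes.

    request_fn is any zero-argument callable returning (status_code, body),
    e.g. a wrapper around your OpenAI request. Delays grow exponentially,
    with a little random jitter so parallel workers don't retry in lockstep.
    """
    for attempt in range(max_retries):
        status, body = request_fn()
        if status not in RETRYABLE:
            return status, body
        delay = base_delay * (2 ** attempt) + random.uniform(0, 0.5)
        time.sleep(delay)
    return status, body  # give up and surface the last response
```

Wrapping each API call this way means a burst of 520s costs you a few retries instead of a failed 380-item run.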


  • specify the node that throws the error or where you spot it
    Both OpenAI “Message a Model” nodes are getting this error now.

  • is it the only workflow affected?
    Yeah, it is the only workflow affected, but a similar workflow has run previously.

  • did it work before?
    It worked in small batch tests, but the full-scale batch of ~380 records failed on both nodes.

  • any other network issues?
    We’ve only seen the Cloudflare 502 gateway issue so far.

  • can you share more details on your n8n setup - hosting, proxies, docker?, etc
    The n8n runs are test runs; since we are not dealing with any customers, it is not in production mode. Also, I’m very new here (first week using this product). Our needs can theoretically be met running test mode, but if getting past the 502 Cloudflare error requires production mode, we’ll switch to it.

    The workflow is to help us generate game development NPC dialogue based on some requirements.

    Thank you for taking your time to help us.

Thank you for the details. I’m still missing:

  • share your workflow (don’t worry, credentials don’t make it through)
  • are you running n8n via pnpm or docker locally?

Thank you

I am running n8n on n8n cloud, not self hosted.

Thank you. My theory is that you are sending too many items and end up rate-limited. You’ve pretty much confirmed it: the smaller tests succeed and the large ones fail.

Please go through OpenAI’s rate limit docs, and validate if you are hitting them in some way. Depending on the specifics, you can then stagger your processing to be slower, or increase limits if possible, etc.
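Staggering the processing can be as simple as splitting the ~380 items into smaller batches with a pause between them. This is a hypothetical sketch in Python (the batch size and pause are placeholder values you would tune against OpenAI’s published limits, and `handle` stands in for whatever per-item API call your workflow makes):

```python
import time

def process_in_batches(items, handle, batch_size=20, pause_s=5.0):
    """Process items in fixed-size batches with a pause between batches.

    handle is called once per item; results are returned in input order.
    The pause gives the API's rate limiter time to recover between bursts.
    """
    results = []
    for start in range(0, len(items), batch_size):
        batch = items[start:start + batch_size]
        results.extend(handle(item) for item in batch)
        if start + batch_size < len(items):  # no pause after the last batch
            time.sleep(pause_s)
    return results
```

In n8n itself, the equivalent approach is the Loop Over Items (Split in Batches) node combined with a Wait node between iterations.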

I’d appreciate it if you mark this as the Solution if it leads you down the correct path.


Thank you for taking your time to help!

I’ve decided to implement batched requests in code for this.

n8n is still a really nice and fast way to prototype ideas, but I really wish staggering requests and handling rate limits for these models were easier.

Much appreciated!

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.