HTTP errors often don’t provide much detail. All we tend to do is return the error we get back from the service, which sometimes doesn’t help, and that makes it tricky for us as well since we don’t know the cause either. Often there is very little we can do, as we don’t control the other side.
I am surprised the loop didn’t help, but with a 502 there could be a few causes, some of which may be unrelated to what you are doing. It could be that someone else has overloaded the other side and it is slowing down requests.
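For what it’s worth, a retry loop for transient 502s usually looks something like the sketch below. This is illustrative only (`doRequest`, the status codes checked, and the delays are assumptions, not the node’s actual implementation), and as noted above it won’t help when the 502 is deterministic, as it turned out to be here:

```javascript
// Minimal retry-with-backoff sketch for transient HTTP errors such as 502.
// `doRequest` is a placeholder for whatever call actually hits the API.
async function withRetry(doRequest, maxAttempts = 3, baseDelayMs = 500) {
  let lastError;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await doRequest();
    } catch (err) {
      lastError = err;
      // Only retry gateway-style errors; anything else is re-thrown immediately.
      if (err.statusCode !== 502 && err.statusCode !== 503) throw err;
      if (attempt < maxAttempts) {
        // Exponential backoff: 500 ms, 1000 ms, 2000 ms, ...
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** (attempt - 1)));
      }
    }
  }
  throw lastError;
}
```

A request that fails for a structural reason (like a too-long URL) will fail identically on every attempt, which matches what was observed.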
Our developers came back to us about your case and pinpointed that the source text was being inserted into the URL. As it had over 4,000 characters, the request was too long for the API.
Please take a few minutes to go through our API documentation, which suggests putting the source text into the HTTP body of a POST request. Since you sent the same request repeatedly, the same error was returned each time.
On our side, the error handling has just been improved and now suggests putting the text into the POST body. For more information, see the section "Translating large volumes of text" in our API documentation.
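To make the suggested fix concrete, a request with the text in the POST body might be built like this. This is a sketch, not the node’s code: the endpoint and the `auth_key`/`text`/`target_lang` parameter names follow my reading of DeepL’s v2 API docs, so verify them against the official reference before relying on this:

```javascript
// Sketch: keep the URL short by sending the text in a form-encoded POST body
// instead of the query string. Endpoint and parameter names are taken from
// DeepL's v2 API documentation and should be double-checked before use.
function buildTranslateRequest(authKey, text, targetLang) {
  const body = new URLSearchParams({
    auth_key: authKey,   // DeepL also accepts the key via a header
    text,                // the long source text lives here, not in the URL
    target_lang: targetLang,
  }).toString();
  return {
    url: 'https://api-free.deepl.com/v2/translate', // stays short regardless of text size
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body,
  };
}
```

Because the URL no longer grows with the source text, the too-long-request failure mode described above disappears.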
Please let us know if you have any further issues once you’ve edited your request.
This is really odd, because my DeepL node is capped at 1200 characters.
So now I’m wondering: is the DeepL node following best practice for the DeepL API? Because if, as suggested, the content is in the URL when it should be in the body, maybe I should design the workflow differently? Or is it the DeepL node itself that needs an update?
That is interesting, and good to know that they were able to work it out. The main thing I can see there is: if you are capping it at 1200 characters, how are they getting 4000 through?
Looking at the API docs, though, I would have expected a 414 HTTP response, not a 502:
414 The request URL is too long. You can avoid this error by using a POST request instead of a GET request, and sending the parameters in the HTTP body.
Our node is sending the text in the URL, so a better long-term solution would be to change it to use the body instead. I will do some testing in the morning and see how it goes. In theory it should be a simple enough change and shouldn’t break anything.
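To illustrate why the current GET variant fails, here is a rough sketch (the endpoint is from DeepL’s docs; the helper name is made up). The encoded text ends up inside the URL itself, so the URL length grows linearly with the source text and blows past the limit the server enforces:

```javascript
// Illustration only: when the text goes into the query string, the URL
// length grows with the text, which is what triggers the too-long-request error.
function buildGetUrl(text, targetLang) {
  const qs = new URLSearchParams({ text, target_lang: targetLang }).toString();
  return `https://api-free.deepl.com/v2/translate?${qs}`;
}

const url = buildGetUrl('a'.repeat(4200), 'DE');
// url.length is now well past 4000 characters, tripping the server's limit
```

Moving the same parameters into the request body, as the docs suggest, keeps the URL at a constant, short length.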
I understand better now what’s happening, thanks for the information. I’ll share your comment with DeepL, if you’re okay with that?
Right now the node isn’t usable for my use case. It seems to work for small messages, but 1200 characters isn’t that much; in fact I would love to be able to use 4000. My Telegram translations are often cut off because of the 1200-character slice, so if a long-term fix is made it would be really appreciated, as my project depends a lot on DeepL!
Right now automated translations are being done with Google, but they’re lower quality in terms of context.
I don’t think a GitHub issue is needed, as I don’t see this as a bug: we are using the URL option, which is valid. It looks like we need to change this to the body to support more characters, which I would say is a feature request and something that would be handled here.
Any chance you can DM me some text to play with that failed? As a quick test I managed to translate all 6726 characters of “Rapper’s Delight” from English to German, but it would be nice to run some of your data through it if you have anything you can share.
The source of my translation is this RSS feed: Militants via Observer on Inoreader. You can simply use the RSS Feed node as your source and then select the “title” element as the text to translate (see my workflow in the OP).
Hmm, strange, because I was getting this error on every one of my DeepL workflows, independent of the RSS feed feeding the translation. I know this one has longer text; maybe you can reproduce it with this one: Russia-State via CyberBenB on Inoreader.
I had to limit the slice to 200 characters to be able to run a translation without error.
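As a stopgap while the node is capped, one workaround is to split long titles into chunks under the limit before translating, preferring whitespace boundaries so words aren’t cut mid-way. This is a hypothetical helper of my own, not part of the DeepL node; the 200-character limit mirrors the slice mentioned above:

```javascript
// Workaround sketch: split long text into chunks no longer than `max`,
// breaking at the last whitespace before the limit where possible so that
// words are not cut in half. Each chunk can then be translated separately.
function chunkText(text, max = 200) {
  const chunks = [];
  let rest = text.trim();
  while (rest.length > max) {
    // Prefer the last space before the limit; fall back to a hard cut.
    let cut = rest.lastIndexOf(' ', max);
    if (cut <= 0) cut = max;
    chunks.push(rest.slice(0, cut).trim());
    rest = rest.slice(cut).trim();
  }
  if (rest) chunks.push(rest);
  return chunks;
}
```

The obvious caveat is that sentences split across chunks lose cross-chunk context, which is exactly why a proper body-based fix in the node is preferable.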
In my current production stack (cloudron.io) I will have to wait for the release on your side and then for the Cloudron team to upgrade their n8n package. Usually that goes fast, so let’s hope for the merge!