I already tried to split the items into single objects but failed.
What I have:
A .json file with more than four hundred strings (I left only three as examples in the Function node: two translated and one that hasn’t been translated yet).
What I’d like to do:
Translate each string and keep the original format so I can re-upload it.
How I’d do it (please feel free to suggest an easier way; a rough sketch of the first step is below):
Split items into single objects.
Use the Google Translate node.
Return the .json file to its original format.
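A minimal sketch of how step 1 could look in a Function node, assuming the whole .json file arrives as a single item and the property names aren’t known in advance:

```javascript
// Function node: turn each property of the incoming object into its own item,
// keeping the original property name so the file can be rebuilt later.
const source = items[0].json;
const newItems = [];

for (const [name, value] of Object.entries(source)) {
  newItems.push({ json: { name, value } });
}

return newItems;
```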
What is the error message (if any)?
Please share the workflow
Share the output returned by the last node
Information on your n8n setup
n8n version:
0.182.0
Database you’re using (default: SQLite):
–
Running n8n with the execution process [own(default), main]:
–
Running n8n via [Docker, npm, n8n.cloud, desktop app]:
Desktop app.
Hi @dcbn, so you’d like to translate each of your object properties? This is a bit tricky if the properties themselves aren’t predictable. I’d probably approach it by first converting each property into its own n8n item with one property holding the original name and one property holding the respective value.
You can then send each item to Google Translate and afterwards convert your items into a single object again. I tried to build the full workflow, but it seems Google won’t let me use the Translate API with my free account.
It should work as shown below though. Make sure to enable the Translate node and add a suitable expression in the Set node and you should be good to go.
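For the last step, the Function node that merges everything back into a single object could look roughly like this. The `name` and `translated` property names are assumptions; they need to match whatever your Set node actually outputs:

```javascript
// Function node: rebuild a single JSON object from the translated items.
// Assumes each item carries the original property name in "name" and the
// translated text in "translated"; adjust these to your own field names.
const result = {};

for (const item of items) {
  result[item.json.name] = item.json.translated;
}

return [{ json: result }];
```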
You always impress me with the speed and quality of your answers!
This is a bit tricky if the properties themselves aren’t predictable.
Do you mean when we have an array with repeating objects (“name”, “date” etc.)?
I’d probably approach it by first converting each property into its own n8n item with one property holding the original name and one property holding the respective value.
Thank you for explaining. It makes sense, even though I’ll have to improve my JavaScript knowledge to understand it better.
You can then send each item to Google Translate and afterwards convert your items into a single object again. I tried to build the full workflow, but it seems Google won’t let me use the Translate API with my free account.
Indeed!
The only Google API I could find was “Cloud Translation API”, and the pricing seems too expensive for my usage.
I tried to switch to DeepL, but they don’t yet accept API registrations from my country, even for the free plan (they require a credit card from a supported country).
Do you think there would be any way to do the same work in n8n without using a translation node (Google or DeepL)?
If there is no way, I’ll probably try RPA (robotic process automation) instead.
Do you mean when we have an array with repeating objects (“name”, “date” etc.)?
Yeah, so from looking at the example dataset you have provided ({ "Share": "Compartilhar", "Choose_option": "Escolha uma opção", "Get_content_fail": "" }) I can see three fields, but I am not sure if these are the only fields your actual JSON data will have. If so, you can ignore the conversion parts of my example workflow and simply use expressions to reference these fields.
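For example, the “Text” field of the Google Translate node could then use an expression like this (the property name is taken from your example data):

```
{{ $json["Share"] }}
```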
Thank you for explaining. It makes sense, even though I’ll have to improve my JavaScript knowledge to understand it better.
Again, if you know the possible field names there’s no need to even think about this. I was just reading “json file with more than four hundred strings” and thinking these could be in >400 different fields with potentially unknown names.
The only Google API I could find was “Cloud Translation API”, and the pricing seems too expensive for my usage.
As for translation services, I have tried out the Bing/Azure translation for another request here on the forum. I don’t know the pricing details, but I could at least try it out for free and it worked with n8n. So perhaps you might want to check out the thread over here for details.
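In case it helps, here is a rough sketch of what a single request to the Microsoft Translator REST API looks like. The endpoint, query parameters, and header names should be double-checked against the current Azure documentation, and the key/region values are placeholders; in n8n you would put the same URL, headers, and JSON body into an HTTP Request node:

```javascript
// Rough sketch of one translation request to the Microsoft Translator API (v3).
// Verify endpoint, parameters, and headers against the Azure docs; the key and
// region below are placeholders for your own Azure resource.
async function translateOne(text) {
  const response = await fetch(
    'https://api.cognitive.microsofttranslator.com/translate?api-version=3.0&from=en&to=pt',
    {
      method: 'POST',
      headers: {
        'Ocp-Apim-Subscription-Key': '<YOUR_AZURE_KEY>',
        'Ocp-Apim-Subscription-Region': '<YOUR_RESOURCE_REGION>',
        'Content-Type': 'application/json',
      },
      body: JSON.stringify([{ Text: text }]),
    }
  );

  const data = await response.json();
  // The response is roughly: [ { translations: [ { text: '...', to: 'pt' } ] } ]
  return data[0].translations[0].text;
}

translateOne('Share').then(console.log); // e.g. "Compartilhar"
```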
Yeah, so from looking at the example dataset you have provided ({ "Share": "Compartilhar", "Choose_option": "Escolha uma opção", "Get_content_fail": "" } ) I can see three fields, but I am not sure if these are the only fields your actual JSON data will have. If so, you can ignore the conversion parts of my example workflow and simply use expressions to reference these fields.
Sadly, the .json will always have random strings.
Again, if you know the possible field names there’s no need to even think about this. I was just reading “json file with more than four hundred strings” and thinking these could be in >400 different fields with potentially unknown names.
Each .json file will have approximately 400 strings.
The dataset I provided was meant to show how the JSON is formatted and what it should look like once translated: the first two strings were downloaded from Transifex after translation, and the last one is an example of what I get when I download it before translation.
"Share": "Compartilhar" (Translated from en-us to pt-br.) "Get_content_fail": "" (Not translated yet.)
As for translation services, I have tried out the Bing/Azure translation for another request here on the forum. I don’t know the pricing details, but I could at least try it out for free and it worked with n8n. So perhaps you might want to check out the thread over here for details.
Thank you for linking the thread. I’ll take a look, and if I manage to make it work I’ll come back here to let you know (and to close the thread).