For translation tasks, Generative AI on Vertex AI offers two specialized translation models from the Cloud Translation API:
Translation LLM - Google's newest, highest-quality LLM-based translation offering. It achieves the highest translation quality of the models compared here while serving at reasonable latencies (~3x faster than Gemini 2.0 Flash).
Cloud Translation Neural Machine Translation (NMT) model - Google's premier real-time translation offering, achieving translations at roughly 100 ms latency. It delivers strong quality at the lowest latencies of the models benchmarked and continues to see ongoing quality advancements. NMT can achieve latencies up to 20x faster than Gemini 2.0 Flash.
Key advantages & differentiators of the Translation LLM
- Unmatched Translation Quality - Translation LLM achieves significantly higher benchmark performance than the other models evaluated. It is far more likely to rewrite a sentence so that it sounds natural in the target language, rather than producing the less natural, word-for-word translations often seen in other translation models.
- Superior Quality/Latency Trade-off - Translation LLM provides LLM-powered translations at latencies significantly lower than Gemini 2.0 Flash. Although Translation LLM is slower than the NMT model, it typically provides higher quality responses for a broad range of applications.
Model feature comparison
Feature | Translation LLM (Powered by Gemini) | NMT model |
---|---|---|
Description | A translation-specialized large language model powered by Gemini and fine-tuned for translation. Available with Generative AI on Vertex AI and the Cloud Translation - Advanced API. | Google's Neural Machine Translation model, available through the Cloud Translation - Advanced and Cloud Translation - Basic APIs. Optimized for simplicity and scale. |
Quality | Highest quality translation. Outperforms NMT, Gemini 2.0 Flash, and Gemini 2.5 Pro in quality. More likely to rewrite sentences for natural flow. Shows significant error reduction. | Medium to high quality depending on the language pair. Among the best-performing real-time NMT models for many language-domain combinations. |
Latency | Latency is significantly better than Gemini 2.0 Flash, but still slower than NMT. | Fastest real-time translation. Low latency, suitable for chat and real-time applications. Achieves latencies up to 20x faster than Gemini 2.0 Flash. |
Language support | Supported languages include Arabic, Chinese, Czech, Dutch, English, French, German, Hindi, Indonesian, Italian, Japanese, Korean, Polish, Portuguese, Russian, Spanish, Thai, Turkish, Ukrainian, and Vietnamese. Refer to supported languages for a full list. | Supported languages include Cantonese, Fijian, and Balinese. Translations between any two languages in the supported list are possible. Refer to supported languages for a full list. |
Customization | Support for advanced glossaries, supervised fine-tuning on Vertex AI for domain- or customer-specific adaptations, and Adaptive Translation for real-time style customization with a few examples. | Support for glossaries to control terminology and for training custom models with AutoML Translation in the Cloud Translation - Advanced API. |
Translation features | HTML translation | HTML, Batch, and Formatted document translation |
API Integration | Cloud Translation - Advanced API, Vertex AI API | Cloud Translation - Basic API, Cloud Translation - Advanced API, Vertex AI API |
Usage
This section shows you how to use Vertex AI Studio to rapidly translate text from one language to another. You can use the Translation LLM or the NMT model to translate text by using the Google Cloud console or API. Note that the languages that each model supports can vary. Before you request translations, check that the model you're using supports your source and target languages.
Console
In the Vertex AI section of the Google Cloud console, go to the Translate text page in Vertex AI Studio.
In the Run settings pane, select a translation model in the Model field.
To change the model settings (such as temperature), expand Advanced.
Set the source and target languages.
In the input field, enter the text to translate.
Click Submit.
To get the code or curl command that demonstrates how to request translations, click Get code.
Note that in Vertex AI Studio, the Translation LLM lets you provide example translations to tailor model responses to more closely match your style, tone, and industry domain. The model uses your examples as few-shot context before translating your text.
API
Select the model to use for your translations.
Translation LLM
Use the Vertex AI API and Translation LLM to translate text.
REST
Before using any of the request data, make the following replacements:
- PROJECT_NUMBER_OR_ID: The numeric or alphanumeric ID of your Google Cloud project
- LOCATION: The location where you want to run this operation. For example, us-central1.
- SOURCE_LANGUAGE_CODE: The language code of the input text. Set to one of the language codes listed in adaptive translation.
- TARGET_LANGUAGE_CODE: The target language to translate the input text to. Set to one of the language codes listed in adaptive translation.
- SOURCE_TEXT: Text in the source language to translate.
- MIME_TYPE (Optional): The format of the source text, such as text/html or text/plain. By default, the MIME type is set to text/plain.
HTTP method and URL:
POST https://LOCATION-aiplatform.googleapis.com/v1/projects/PROJECT_NUMBER_OR_ID/locations/LOCATION/publishers/google/models/cloud-translate-text:predict
Request JSON body:
{
  "instances": [
    {
      "source_language_code": "SOURCE_LANGUAGE_CODE",
      "target_language_code": "TARGET_LANGUAGE_CODE",
      "contents": ["SOURCE_TEXT"],
      "mimeType": "MIME_TYPE",
      "model": "projects/PROJECT_NUMBER_OR_ID/locations/LOCATION/models/general/translation-llm"
    }
  ]
}
Send the request using curl or a similar HTTP client.
You should receive a JSON response similar to the following:
{
  "predictions": [
    {
      "translations": [
        {
          "translatedText": "TRANSLATED_TEXT",
          "model": "projects/PROJECT_NUMBER_OR_ID/locations/LOCATION/models/general/translation-llm"
        }
      ]
    }
  ]
}
Node.js
Before trying this sample, follow the Node.js setup instructions in the Vertex AI quickstart using client libraries. For more information, see the Vertex AI Node.js API reference documentation.
To authenticate to Vertex AI, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
async function translate() {
  const request = {
    instances: [{
      source_language_code: SOURCE_LANGUAGE_CODE,
      target_language_code: TARGET_LANGUAGE_CODE,
      contents: [SOURCE_TEXT],
      model: "projects/PROJECT_ID/locations/LOCATION/models/general/translation-llm"
    }]
  };
  const {google} = require('googleapis');
  const aiplatform = google.cloud('aiplatform');
  const endpoint = aiplatform.predictionEndpoint('projects/PROJECT_ID/locations/LOCATION/publishers/google/models/cloud-translate-text');
  const [response] = await endpoint.predict(request);
  console.log('Translating');
  console.log(response);
}
Python
Before trying this sample, follow the Python setup instructions in the Vertex AI quickstart using client libraries. For more information, see the Vertex AI Python API reference documentation.
To authenticate to Vertex AI, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
from google.cloud import aiplatform

def translate():
    # Create a client
    client_options = {"api_endpoint": "LOCATION-aiplatform.googleapis.com"}
    client = aiplatform.gapic.PredictionServiceClient(client_options=client_options)

    # Initialize the request
    endpoint_id = "projects/PROJECT_ID/locations/LOCATION/publishers/google/models/cloud-translate-text"
    instances = [{
        "model": "projects/PROJECT_ID/locations/LOCATION/models/general/translation-llm",
        "source_language_code": "SOURCE_LANGUAGE_CODE",
        "target_language_code": "TARGET_LANGUAGE_CODE",
        "contents": ["SOURCE_TEXT"],
    }]

    # Make the request
    response = client.predict(instances=instances, endpoint=endpoint_id)

    # Handle the response
    print(response)
NMT
Use the Cloud Translation API and the NMT model to translate text.
REST
Before using any of the request data, make the following replacements:
- PROJECT_NUMBER_OR_ID: the numeric or alphanumeric ID of your Google Cloud project.
- SOURCE_LANGUAGE: (Optional) The language code of the input text. For supported language codes, see Language support.
- TARGET_LANGUAGE: The target language to translate the input text to. Set to one of the supported language codes.
- SOURCE_TEXT: The text to translate.
HTTP method and URL:
POST https://translation.googleapis.com/v3/projects/PROJECT_NUMBER_OR_ID:translateText
Request JSON body:
{
  "sourceLanguageCode": "SOURCE_LANGUAGE",
  "targetLanguageCode": "TARGET_LANGUAGE",
  "contents": ["SOURCE_TEXT1", "SOURCE_TEXT2"]
}
Send the request using curl or a similar HTTP client.
You should receive a JSON response similar to the following:
{
  "translations": [
    {
      "translatedText": "TRANSLATED_TEXT1"
    },
    {
      "translatedText": "TRANSLATED_TEXT2"
    }
  ]
}
Node.js
Before trying this sample, follow the Node.js setup instructions in the Cloud Translation quickstart using client libraries. For more information, see the Cloud Translation Node.js API reference documentation.
To authenticate to Cloud Translation, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
Python
Before trying this sample, follow the Python setup instructions in the Cloud Translation quickstart using client libraries. For more information, see the Cloud Translation Python API reference documentation.
To authenticate to Cloud Translation, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
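This page doesn't include a full NMT client sample, so here is a minimal sketch of how the translateText REST request above can be assembled using only the Python standard library. The function name `build_nmt_request` and the values "my-project", "en", and "fr" are illustrative placeholders; actually sending the request additionally requires an OAuth 2.0 access token in the Authorization header.

```python
import json

def build_nmt_request(project_id: str, source_lang: str, target_lang: str, texts: list) -> tuple:
    """Assemble the URL and JSON body for a Cloud Translation v3 translateText call."""
    url = f"https://translation.googleapis.com/v3/projects/{project_id}:translateText"
    body = json.dumps({
        "sourceLanguageCode": source_lang,
        "targetLanguageCode": target_lang,
        "contents": list(texts),
    })
    return url, body

# Example: build (but do not send) a request translating English to French.
url, body = build_nmt_request("my-project", "en", "fr", ["Hello, world."])
print(url)
print(body)
```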
Custom translations
Customize responses from the Translation LLM by providing your own example translations. Custom translations only work with the Translation LLM.
You can request customized translations through the Vertex AI Studio console or the API, with one difference: the console supports custom translations only when you provide examples in a TMX or TSV file, whereas the API supports custom translations only when you provide examples (up to 5 sentence pairs) inline as part of the translation request.
Data requirements
If you provide example translations in a file for the Google Cloud console, the examples must be written as segment pairs in a TMX or TSV file. Each pair includes a source language segment and its translated counterpart. For more information, see Prepare example translations in the Cloud Translation documentation.
To get the most accurate results, include specific examples from a wide variety of scenarios. You must include at least five sentence pairs but no more than 10,000 pairs. Also, each segment pair can total at most 512 characters.
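As an illustration of these limits, the following sketch pre-checks a TSV file of segment pairs before upload. The helper `validate_tsv_examples` is hypothetical (not part of any Google API) and assumes one tab-separated source/target pair per line:

```python
import csv
import io

# Documented limits: 5-10,000 pairs, at most 512 characters per pair.
MIN_PAIRS, MAX_PAIRS, MAX_PAIR_CHARS = 5, 10_000, 512

def validate_tsv_examples(tsv_text: str) -> list:
    """Return a list of problems found in TSV example translations (empty if valid)."""
    pairs = [row for row in csv.reader(io.StringIO(tsv_text), delimiter="\t") if row]
    errors = []
    if not MIN_PAIRS <= len(pairs) <= MAX_PAIRS:
        errors.append(f"need between {MIN_PAIRS} and {MAX_PAIRS} pairs, got {len(pairs)}")
    for i, row in enumerate(pairs):
        if len(row) != 2:
            errors.append(f"row {i}: expected 2 columns, got {len(row)}")
        elif len(row[0]) + len(row[1]) > MAX_PAIR_CHARS:
            errors.append(f"row {i}: pair exceeds {MAX_PAIR_CHARS} characters")
    return errors
```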
Console
In the Vertex AI section of the Google Cloud console, go to the Translate text page in Vertex AI Studio.
In the Run settings pane, configure your translation settings.
- In the Model field, select Translation LLM.
- To change the temperature, expand Advanced.
Click Add examples.
- Select a local file or a file from Cloud Storage. Vertex AI Studio determines the source and target languages from your file.
- Select the number of examples for the model to use before generating a response.
The number of examples you select counts toward the 3,000-character input limit per request.
In the input field, enter the text to translate.
Click Submit.
Vertex AI automatically selects your specified number of reference sentences that are most similar to your input. The translation model identifies patterns from your examples and then applies those patterns when generating a response.
The output limit per request is 3,000 characters. Any text beyond this limit is dropped.
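Because text beyond the 3,000-character limit is dropped, longer input must be split client-side and translated in multiple requests. The following is a minimal whitespace-based sketch (the function `chunk_text` is illustrative; real sentence segmentation would preserve meaning better):

```python
def chunk_text(text: str, limit: int = 3000) -> list:
    """Split text into whitespace-delimited chunks, each at most `limit` characters."""
    chunks, current = [], ""
    for word in text.split():
        candidate = f"{current} {word}".strip()
        if len(candidate) <= limit:
            current = candidate
        else:
            if current:
                chunks.append(current)
            current = word[:limit]  # a single over-long token is truncated
    if current:
        chunks.append(current)
    return chunks
```

Each chunk can then be sent as a separate translation request and the results concatenated.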
To get the code or curl command that demonstrates how to request translations, click Get code.
API
To request custom translations, include up to five reference sentence pairs in your translation request. The translation model uses all of them to identify patterns from your examples and then applies those patterns when generating a response.
REST
Before using any of the request data, make the following replacements:
- PROJECT_NUMBER_OR_ID: The numeric or alphanumeric ID of your Google Cloud project
- LOCATION: The location where you want to run this operation. For example, us-central1.
- REFERENCE_SOURCE: A sentence in the source language that is part of a reference sentence pair.
- REFERENCE_TARGET: A sentence in the target language that is part of a reference sentence pair.
- SOURCE_LANGUAGE_CODE: The language code of the input text.
- TARGET_LANGUAGE_CODE: The language code of the target language to translate the input text into.
- SOURCE_TEXT: Text in the source language to translate.
- MIME_TYPE (Optional): The format of the source text, such as text/html or text/plain. By default, the MIME type is set to text/plain.
HTTP method and URL:
POST https://LOCATION-aiplatform.googleapis.com/v1/projects/PROJECT_NUMBER_OR_ID/locations/LOCATION/publishers/google/models/translate-llm:predict
Request JSON body:
{
  "instances": [
    {
      "reference_sentence_config": {
        "reference_sentence_pair_lists": [
          {
            "reference_sentence_pairs": [
              {
                "source_sentence": "REFERENCE_SOURCE_1_1",
                "target_sentence": "REFERENCE_TARGET_1_1"
              },
              {
                "source_sentence": "REFERENCE_SOURCE_1_2",
                "target_sentence": "REFERENCE_TARGET_1_2"
              }
            ]
          }
        ],
        "source_language_code": "SOURCE_LANGUAGE_CODE",
        "target_language_code": "TARGET_LANGUAGE_CODE"
      },
      "content": ["SOURCE_TEXT"],
      "mimeType": "MIME_TYPE"
    }
  ]
}
Send the request using curl or a similar HTTP client.
You should receive a JSON response similar to the following:
{
  "predictions": [
    {
      "languageCode": "TARGET_LANGUAGE",
      "translations": [
        {
          "translatedText": "TRANSLATED_TEXT"
        }
      ]
    }
  ]
}
Node.js
Before trying this sample, follow the Node.js setup instructions in the Vertex AI quickstart using client libraries. For more information, see the Vertex AI Node.js API reference documentation.
To authenticate to Vertex AI, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
async function translate() {
  const request = {
    instances: [{
      "reference_sentence_config": {
        "reference_sentence_pair_lists": [{
          "reference_sentence_pairs": [{
            "source_sentence": 'SAMPLE_REFERENCE_SOURCE_1',
            "target_sentence": 'SAMPLE_REFERENCE_TARGET_1'
          },
          {
            "source_sentence": 'SAMPLE_REFERENCE_SOURCE_2',
            "target_sentence": 'SAMPLE_REFERENCE_TARGET_2'
          }]
        }],
        "source_language_code": 'SOURCE_LANGUAGE_CODE',
        "target_language_code": 'TARGET_LANGUAGE_CODE'
      },
      "contents": ["SOURCE_TEXT"]
    }]
  };
  const {google} = require('googleapis');
  const aiplatform = google.cloud('aiplatform');
  const endpoint = aiplatform.predictionEndpoint('projects/PROJECT_ID/locations/LOCATION/publishers/google/models/translate-llm');
  const [response] = await endpoint.predict(request);
  console.log('Translating');
  console.log(response);
}
Python
Before trying this sample, follow the Python setup instructions in the Vertex AI quickstart using client libraries. For more information, see the Vertex AI Python API reference documentation.
To authenticate to Vertex AI, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
from google.cloud import aiplatform
from google.protobuf.json_format import MessageToDict

def translate():
    # Create a client
    client_options = {"api_endpoint": "LOCATION-aiplatform.googleapis.com"}
    client = aiplatform.gapic.PredictionServiceClient(client_options=client_options)

    # Initialize the request
    endpoint_id = "projects/PROJECT_ID/locations/LOCATION/publishers/google/models/translate-llm"
    instances = [{
        "reference_sentence_config": {
            "reference_sentence_pair_lists": [{
                "reference_sentence_pairs": [{
                    "source_sentence": "SAMPLE_REFERENCE_SOURCE_1",
                    "target_sentence": "SAMPLE_REFERENCE_TARGET_1"
                },
                {
                    "source_sentence": "SAMPLE_REFERENCE_SOURCE_2",
                    "target_sentence": "SAMPLE_REFERENCE_TARGET_2"
                }]
            }],
            "source_language_code": "SOURCE_LANGUAGE_CODE",
            "target_language_code": "TARGET_LANGUAGE_CODE"
        },
        "content": ["SOURCE_TEXT"]
    }]

    # Make the request
    response = client.predict(
        endpoint=endpoint_id, instances=instances,
    )

    # Handle the response
    print(response)
    # The predictions are a google.protobuf.Value representation of the model's predictions.
    predictions = MessageToDict(response._pb)["predictions"]
    for prediction in predictions:
        print(prediction["translations"])
You can also use the Cloud Translation API to create a dataset and import your example sentence pairs. When you use the Cloud Translation API to request translations, you can include your dataset to customize responses. The dataset persists and can be reused with multiple translation requests. For more information, see Request adaptive translations in the Cloud Translation documentation.
Supported languages
Translation LLM
With the Translation LLM, you can translate to and from any of the following languages.
Language name | Language code |
---|---|
Arabic | ar |
Bengali | bn |
Bulgarian | bg |
Catalan | ca |
Chinese (Simplified) | zh-CN |
Croatian | hr |
Czech | cs |
Danish | da |
Dutch | nl |
English | en |
Estonian | et |
Finnish | fi |
French | fr |
German | de |
Greek | el |
Gujarati | gu |
Hebrew | he |
Hindi | hi |
Hungarian | hu |
Icelandic | is |
Indonesian | id |
Italian | it |
Japanese | ja |
Kannada | kn |
Korean | ko |
Latvian | lv |
Lithuanian | lt |
Malayalam | ml |
Marathi | mr |
Norwegian | no |
Persian | fa |
Polish | pl |
Portuguese | pt |
Punjabi | pa |
Romanian | ro |
Russian | ru |
Slovak | sk |
Slovenian | sl |
Spanish | es |
Swahili | sw |
Swedish | sv |
Tamil | ta |
Telugu | te |
Thai | th |
Turkish | tr |
Ukrainian | uk |
Urdu | ur |
Vietnamese | vi |
Zulu | zu |
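As noted in the Usage section, language support varies by model, so it can help to verify both language codes before sending a request. The following sketch mirrors the codes in the table above; the set name and helper function are illustrative, not part of any Google API:

```python
# Language codes supported by the Translation LLM, from the table above.
TRANSLATION_LLM_CODES = {
    "ar", "bn", "bg", "ca", "zh-CN", "hr", "cs", "da", "nl", "en",
    "et", "fi", "fr", "de", "el", "gu", "he", "hi", "hu", "is",
    "id", "it", "ja", "kn", "ko", "lv", "lt", "ml", "mr", "no",
    "fa", "pl", "pt", "pa", "ro", "ru", "sk", "sl", "es", "sw",
    "sv", "ta", "te", "th", "tr", "uk", "ur", "vi", "zu",
}

def check_language_pair(source_code: str, target_code: str) -> None:
    """Raise ValueError if either code is not in the Translation LLM list."""
    for code in (source_code, target_code):
        if code not in TRANSLATION_LLM_CODES:
            raise ValueError(f"Language code {code!r} is not supported by the Translation LLM")
```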
NMT
For information about which languages the Cloud Translation NMT model supports, refer to the following documentation: