
tessl/pypi-google-cloud-translate

Google Cloud Translate API client library for translating text between thousands of language pairs with support for adaptive MT, AutoML, and glossaries


docs/v3-data-structures.md

V3 Data Structures and Enums

Core data structures, enums, and configuration objects used throughout the Google Cloud Translation API V3. These types define the fundamental data models for glossaries, adaptive MT, AutoML, and common configurations.

Core Import

from google.cloud import translate_v3

Common Data Structures

File and Storage Types

class FileInputSource:
    """
    Input source for files.
    
    Attributes:
        mime_type (str): MIME type of input file (required)
        gcs_source (GcsInputSource): Google Cloud Storage source
        content (bytes): File content as bytes
        display_name (str): Display name for the file
    """

class GcsInputSource:
    """
    Google Cloud Storage input source.
    
    Attributes:
        input_uri (str): GCS URI (gs://bucket/path) (required)
    """

class GcsOutputDestination:
    """
    Google Cloud Storage output destination.
    
    Attributes:
        output_uri_prefix (str): GCS URI prefix for output files (required)
    """
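
Both `GcsInputSource.input_uri` and `GcsOutputDestination.output_uri_prefix` expect `gs://bucket/path` URIs. A small stdlib-only sketch (the helper name is hypothetical, not part of the library) shows the expected shape and catches malformed URIs early:

```python
from urllib.parse import urlparse

def split_gcs_uri(uri: str) -> tuple[str, str]:
    """Split a gs://bucket/path URI into (bucket, object_path).

    Raises ValueError for anything that is not a gs:// URI.
    """
    parsed = urlparse(uri)
    if parsed.scheme != "gs" or not parsed.netloc:
        raise ValueError(f"not a GCS URI: {uri!r}")
    return parsed.netloc, parsed.path.lstrip("/")

bucket, path = split_gcs_uri("gs://my-bucket/glossaries/en-es.csv")
```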

Glossary Data Structures

class Glossary:
    """
    Translation glossary resource.
    
    Attributes:
        name (str): Resource name (projects/{project}/locations/{location}/glossaries/{glossary})
        language_pair (LanguagePair): Language pair for bilateral glossary
        language_codes_set (LanguageCodesSet): Language codes for multilingual glossary
        input_config (GlossaryInputConfig): Input configuration (required for creation)
        entry_count (int): Number of entries in glossary (output only)
        submit_time (Timestamp): Glossary creation time (output only)
        end_time (Timestamp): Glossary completion time (output only)
        display_name (str): Human-readable display name
    """
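
The `name` pattern above can be assembled with plain string formatting; the generated clients typically also expose path helpers for this, so treat the function below as a stdlib sketch with hypothetical naming:

```python
def glossary_name(project: str, location: str, glossary: str) -> str:
    """Build a glossary resource name in the documented pattern:
    projects/{project}/locations/{location}/glossaries/{glossary}."""
    return f"projects/{project}/locations/{location}/glossaries/{glossary}"

name = glossary_name("my-project", "us-central1", "my-glossary")
```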

class LanguagePair:
    """
    Language pair for bilateral glossary.
    
    Attributes:
        source_language_code (str): Source language code (BCP-47) (required)
        target_language_code (str): Target language code (BCP-47) (required)
    """

class LanguageCodesSet:
    """
    Language codes set for multilingual glossary.
    
    Attributes:
        language_codes (list): List of language codes (BCP-47) (required)
    """

class GlossaryEntry:
    """
    Glossary entry with term mappings.
    
    Attributes:
        name (str): Resource name (output only)
        terms_set (GlossaryTermsSet): Set of terms for multilingual entry
        terms_pair (GlossaryTermsPair): Pair of terms for bilateral entry
        description (str): Description of the entry
    """

class GlossaryTermsSet:
    """
    Set of terms for multilingual glossary entry.
    
    Attributes:
        terms (dict): Mapping of language code to term (required)
    """

class GlossaryTermsPair:
    """
    Pair of terms for bilateral glossary entry.
    
    Attributes:
        source_term (GlossaryTerm): Source language term (required)
        target_term (GlossaryTerm): Target language term (required)
    """

class GlossaryTerm:
    """
    Individual glossary term.
    
    Attributes:
        language_code (str): Language code (BCP-47) (required)
        text (str): Term text (required)
    """
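
A bilateral (unidirectional) glossary is typically backed by a CSV input file of source term, target term rows, uploaded to GCS and referenced via `GlossaryInputConfig`. The helper below is a stdlib-only sketch of producing that file content (the function name is hypothetical):

```python
import csv
import io

def bilateral_glossary_csv(pairs: list[tuple[str, str]]) -> str:
    """Render (source_term, target_term) rows as CSV content suitable
    for a unidirectional glossary input file (sketch; upload the result
    to GCS and point GlossaryInputConfig at the gs:// URI)."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerows(pairs)
    return buf.getvalue()

content = bilateral_glossary_csv([
    ("machine learning", "aprendizaje automático"),
    ("neural network", "red neuronal"),
])
```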

Adaptive MT Data Structures

class AdaptiveMtDataset:
    """
    Adaptive MT dataset resource.
    
    Attributes:
        name (str): Resource name (projects/{project}/locations/{location}/adaptiveMtDatasets/{dataset})
        display_name (str): Human-readable display name
        source_language_code (str): Source language code (BCP-47) (required)
        target_language_code (str): Target language code (BCP-47) (required)
        example_count (int): Number of examples in dataset (output only)
        create_time (Timestamp): Dataset creation time (output only)
        update_time (Timestamp): Dataset last update time (output only)
    """

class AdaptiveMtFile:
    """
    Adaptive MT file resource.
    
    Attributes:
        name (str): Resource name (output only)
        display_name (str): Human-readable display name
        entry_count (int): Number of entries in file (output only)
        create_time (Timestamp): File creation time (output only)
        update_time (Timestamp): File last update time (output only)
    """

class AdaptiveMtSentence:
    """
    Adaptive MT sentence pair.
    
    Attributes:
        name (str): Resource name (output only)
        source_sentence (str): Source language sentence (required)
        target_sentence (str): Target language sentence (required)
        create_time (Timestamp): Sentence creation time (output only)
        update_time (Timestamp): Sentence last update time (output only)
    """

class AdaptiveMtTranslation:
    """
    Adaptive MT translation result.
    
    Attributes:
        translated_text (str): Translated text
    """
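
Adaptive MT files are imported as sentence pairs; a common interchange shape is one tab-separated source/target pair per line. The helper below is a hypothetical stdlib sketch of producing such content from `AdaptiveMtSentence`-style pairs (check the import docs for the exact formats your version accepts):

```python
def sentence_pairs_tsv(pairs: list[tuple[str, str]]) -> str:
    """Render source/target sentence pairs as tab-separated lines,
    one pair per line (hypothetical helper for building import files)."""
    return "\n".join(f"{src}\t{tgt}" for src, tgt in pairs)

tsv = sentence_pairs_tsv([
    ("Hello, world.", "Hola, mundo."),
    ("Good morning.", "Buenos días."),
])
```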

AutoML Data Structures

class Dataset:
    """
    AutoML translation dataset resource.
    
    Attributes:
        name (str): Resource name (projects/{project}/locations/{location}/datasets/{dataset})
        display_name (str): Human-readable display name
        source_language_code (str): Source language code (BCP-47) (required)
        target_language_code (str): Target language code (BCP-47) (required)
        example_count (int): Total number of examples (output only)
        train_example_count (int): Number of training examples (output only)
        validate_example_count (int): Number of validation examples (output only)
        test_example_count (int): Number of test examples (output only)
        create_time (Timestamp): Dataset creation time (output only)
        update_time (Timestamp): Dataset last update time (output only)
        state (DatasetState): Current dataset state (output only)
    """

class Model:
    """
    AutoML translation model resource.
    
    Attributes:
        name (str): Resource name (projects/{project}/locations/{location}/models/{model})
        display_name (str): Human-readable display name
        dataset (str): Training dataset resource name (required)
        source_language_code (str): Source language code (BCP-47) (output only)
        target_language_code (str): Target language code (BCP-47) (output only)
        train_example_count (int): Training examples used (output only)
        validate_example_count (int): Validation examples used (output only)
        test_example_count (int): Test examples used (output only)
        create_time (Timestamp): Model creation time (output only)
        update_time (Timestamp): Model last update time (output only)
        state (ModelState): Current model state (output only)
    """

class Example:
    """
    Training example for AutoML dataset.
    
    Attributes:
        name (str): Resource name (output only)
        source_text (str): Source language text (required)
        target_text (str): Target language text (required)
        usage (ExampleUsage): Usage type (TRAIN, VALIDATION, TEST) (required)
    """

class DatasetInputConfig:
    """
    Dataset input configuration.
    
    Attributes:
        input_files (list): List of input file configurations (required)
    """

class DatasetOutputConfig:
    """
    Dataset output configuration.
    
    Attributes:
        gcs_destination (GcsOutputDestination): Google Cloud Storage destination (required)
    """

Enums and Constants

Operation and State Enums

class OperationState:
    """
    Long-running operation states.
    
    Values:
        OPERATION_STATE_UNSPECIFIED = 0  # State unspecified
        OPERATION_STATE_RUNNING = 1      # Operation is running
        OPERATION_STATE_SUCCEEDED = 2    # Operation succeeded
        OPERATION_STATE_FAILED = 3       # Operation failed
        OPERATION_STATE_CANCELLING = 4   # Operation is being cancelled
        OPERATION_STATE_CANCELLED = 5    # Operation was cancelled
    """

class DatasetState:
    """
    AutoML dataset states.
    
    Values:
        DATASET_STATE_UNSPECIFIED = 0  # Dataset state unspecified
        CREATING = 1                   # Dataset is being created
        ACTIVE = 2                     # Dataset is active and ready
        DELETING = 3                   # Dataset is being deleted
    """

class ModelState:
    """
    AutoML model states.
    
    Values:
        MODEL_STATE_UNSPECIFIED = 0  # Model state unspecified
        CREATING = 1                 # Model is being created/trained
        ACTIVE = 2                   # Model is active and ready
        DELETING = 3                 # Model is being deleted
        FAILED = 4                   # Model creation/training failed
    """

class ExampleUsage:
    """
    Training example usage types.
    
    Values:
        EXAMPLE_USAGE_UNSPECIFIED = 0  # Usage unspecified
        TRAIN = 1                      # Training example
        VALIDATION = 2                 # Validation example
        TEST = 3                       # Test example
    """
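
The integer values above can be mirrored with a plain `IntEnum` to make state checks explicit, e.g. distinguishing terminal from in-flight operation states. This is a stdlib sketch; the real enums live on the generated proto types:

```python
from enum import IntEnum

class OperationState(IntEnum):
    # Values mirror the documented proto enum.
    OPERATION_STATE_UNSPECIFIED = 0
    OPERATION_STATE_RUNNING = 1
    OPERATION_STATE_SUCCEEDED = 2
    OPERATION_STATE_FAILED = 3
    OPERATION_STATE_CANCELLING = 4
    OPERATION_STATE_CANCELLED = 5

TERMINAL_STATES = {
    OperationState.OPERATION_STATE_SUCCEEDED,
    OperationState.OPERATION_STATE_FAILED,
    OperationState.OPERATION_STATE_CANCELLED,
}

def is_done(state: int) -> bool:
    """True once an operation can no longer change state."""
    return OperationState(state) in TERMINAL_STATES
```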

Metadata Structures

class CreateGlossaryMetadata:
    """
    Metadata for glossary creation operation.
    
    Attributes:
        name (str): Glossary resource name
        state (OperationState): Current operation state
        submit_time (Timestamp): Operation submit time
    """

class UpdateGlossaryMetadata:
    """
    Metadata for glossary update operation.
    
    Attributes:
        name (str): Glossary resource name
        state (OperationState): Current operation state
        submit_time (Timestamp): Operation submit time
    """

class DeleteGlossaryMetadata:
    """
    Metadata for glossary deletion operation.
    
    Attributes:
        name (str): Glossary resource name
        state (OperationState): Current operation state
        submit_time (Timestamp): Operation submit time
    """

class CreateDatasetMetadata:
    """
    Metadata for dataset creation operation.
    
    Attributes:
        name (str): Dataset resource name
        state (OperationState): Current operation state
        submit_time (Timestamp): Operation submit time
    """

class DeleteDatasetMetadata:
    """
    Metadata for dataset deletion operation.
    
    Attributes:
        name (str): Dataset resource name
        state (OperationState): Current operation state
        submit_time (Timestamp): Operation submit time
    """

class CreateModelMetadata:
    """
    Metadata for model creation operation.
    
    Attributes:
        name (str): Model resource name
        state (OperationState): Current operation state
        submit_time (Timestamp): Operation submit time
    """

class DeleteModelMetadata:
    """
    Metadata for model deletion operation.
    
    Attributes:
        name (str): Model resource name
        state (OperationState): Current operation state
        submit_time (Timestamp): Operation submit time
    """

class ImportDataMetadata:
    """
    Metadata for data import operation.
    
    Attributes:
        state (OperationState): Current operation state
        submit_time (Timestamp): Operation submit time
    """

class ExportDataMetadata:
    """
    Metadata for data export operation.
    
    Attributes:
        state (OperationState): Current operation state
        submit_time (Timestamp): Operation submit time
    """

Response Structures for Batch Operations

class BatchTransferResourcesResponse:
    """
    Response for batch resource transfer operation.
    
    Attributes:
        transferred_resources (list): List of transferred resource names
    """

class ImportAdaptiveMtFileResponse:
    """
    Response for adaptive MT file import operation.
    
    Attributes:
        adaptive_mt_file_count (int): Number of files imported
        adaptive_mt_sentence_count (int): Number of sentences imported
    """

Pagination Support Types

Many list operations support pagination through dedicated pager classes that handle automatic page iteration:

Synchronous Pagers

class ListGlossariesPager:
    """Pager for ListGlossaries operation."""

class ListGlossaryEntriesPager:
    """Pager for ListGlossaryEntries operation."""

class ListDatasetsPager:
    """Pager for ListDatasets operation."""

class ListModelsPager:
    """Pager for ListModels operation."""

class ListExamplesPager:
    """Pager for ListExamples operation."""

class ListAdaptiveMtDatasetsPager:
    """Pager for ListAdaptiveMtDatasets operation."""

class ListAdaptiveMtFilesPager:
    """Pager for ListAdaptiveMtFiles operation."""

class ListAdaptiveMtSentencesPager:
    """Pager for ListAdaptiveMtSentences operation."""

Asynchronous Pagers

class ListGlossariesAsyncPager:
    """Async pager for ListGlossaries operation."""

class ListGlossaryEntriesAsyncPager:
    """Async pager for ListGlossaryEntries operation."""

class ListDatasetsAsyncPager:
    """Async pager for ListDatasets operation."""

class ListModelsAsyncPager:
    """Async pager for ListModels operation."""

class ListExamplesAsyncPager:
    """Async pager for ListExamples operation."""

class ListAdaptiveMtDatasetsAsyncPager:
    """Async pager for ListAdaptiveMtDatasets operation."""

class ListAdaptiveMtFilesAsyncPager:
    """Async pager for ListAdaptiveMtFiles operation."""

class ListAdaptiveMtSentencesAsyncPager:
    """Async pager for ListAdaptiveMtSentences operation."""
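
All of these pagers wrap the same page-token protocol: issue a request, read the next page token from the response, and repeat until the token is empty. A stdlib-only sketch against a stand-in fetch function (all names here are hypothetical, not the library's API):

```python
from typing import Any, Callable, Iterator

def iterate_pages(fetch: Callable[[str], dict], items_key: str) -> Iterator[Any]:
    """Yield items across pages. `fetch(page_token)` must return a dict
    with `items_key` (a list of items) and `next_page_token` (empty
    string when exhausted) -- the shape the List* responses share."""
    token = ""
    while True:
        page = fetch(token)
        yield from page.get(items_key, [])
        token = page.get("next_page_token", "")
        if not token:
            return

# Stand-in for a List* RPC that returns two pages.
def fake_list_glossaries(page_token: str) -> dict:
    if page_token == "":
        return {"glossaries": ["g1", "g2"], "next_page_token": "p2"}
    return {"glossaries": ["g3"], "next_page_token": ""}

names = list(iterate_pages(fake_list_glossaries, "glossaries"))
```

In practice you rarely write this loop yourself: iterating a pager object yields items across pages automatically.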

Usage Examples

Working with Glossary Data Structures

from google.cloud import translate_v3

# Create a bilateral glossary entry
glossary_entry = translate_v3.GlossaryEntry()
glossary_entry.terms_pair = translate_v3.GlossaryTermsPair()
glossary_entry.terms_pair.source_term = translate_v3.GlossaryTerm(
    language_code="en",
    text="machine learning"
)
glossary_entry.terms_pair.target_term = translate_v3.GlossaryTerm(
    language_code="es", 
    text="aprendizaje automático"
)

# Create a multilingual glossary entry
multilingual_entry = translate_v3.GlossaryEntry()
multilingual_entry.terms_set = translate_v3.GlossaryTermsSet()
multilingual_entry.terms_set.terms = {
    "en": "artificial intelligence",
    "es": "inteligencia artificial",
    "fr": "intelligence artificielle"
}

Working with Dataset and Model Structures

from google.cloud import translate_v3

# Create dataset configuration
dataset = translate_v3.Dataset()
dataset.display_name = "My Translation Dataset"
dataset.source_language_code = "en"
dataset.target_language_code = "es"

# Create model configuration
model = translate_v3.Model()
model.display_name = "My Custom Translation Model"
model.dataset = "projects/my-project/locations/us-central1/datasets/my-dataset"

Install with Tessl CLI

npx tessl i tessl/pypi-google-cloud-translate
