Comprehensive Python client library for Google Cloud Vertex AI, offering machine learning tools, generative AI models, and MLOps capabilities
npx @tessl/cli install tessl/pypi-google-cloud-aiplatform@1.111.00
# Google Cloud AI Platform

A comprehensive Python client library for Google Cloud Vertex AI, offering access to Google's integrated suite of machine learning tools and services. The library enables developers to build, deploy, and manage ML models using both AutoML and custom code approaches, supporting the entire machine learning development lifecycle from data preparation to model deployment and monitoring.

## Package Information

- **Package Name**: google-cloud-aiplatform
- **Language**: Python
- **Installation**: `pip install google-cloud-aiplatform`

## Core Imports

For traditional Vertex AI resource management:

```python
import google.cloud.aiplatform as aiplatform

# Initialize the SDK
aiplatform.init(project='your-project-id', location='us-central1')
```

For modern generative AI functionality:

```python
import vertexai
from vertexai.generative_models import GenerativeModel
from vertexai.language_models import TextGenerationModel
from vertexai.vision_models import ImageGenerationModel

# Initialize Vertex AI
vertexai.init(project='your-project-id', location='us-central1')
```

## Basic Usage

### Traditional Vertex AI SDK

```python
import google.cloud.aiplatform as aiplatform

# Initialize
aiplatform.init(project='my-project', location='us-central1')

# Create a tabular dataset
dataset = aiplatform.TabularDataset.create(
    display_name="my-dataset",
    gcs_source="gs://my-bucket/dataset.csv"
)

# Train an AutoML model
job = aiplatform.AutoMLTabularTrainingJob(
    display_name="my-training-job",
    optimization_prediction_type="classification"
)

model = job.run(
    dataset=dataset,
    target_column="label"
)

# Deploy for predictions
endpoint = model.deploy(
    deployed_model_display_name="my-model",
    machine_type="n1-standard-4"
)

# Make predictions; AutoML tabular endpoints expect one dict per instance,
# keyed by the dataset's feature columns (placeholder names shown here)
predictions = endpoint.predict(
    instances=[{"feature_1": "1.0", "feature_2": "2.0", "feature_3": "3.0"}]
)
```

### Generative AI SDK

```python
import vertexai
from vertexai.generative_models import GenerativeModel

# Initialize
vertexai.init(project='my-project', location='us-central1')

# Use Gemini model
model = GenerativeModel('gemini-1.5-pro')
response = model.generate_content('Explain quantum computing')
print(response.text)

# Chat conversation
chat = model.start_chat()
response = chat.send_message('Hello!')
print(response.text)
```

## Architecture

The library is organized into two main APIs that serve different use cases:

### Traditional Vertex AI SDK (`google.cloud.aiplatform`)

Resource-based API following Google Cloud patterns:
- **Resources**: Models, Endpoints, Datasets, Jobs represent cloud resources
- **Lifecycle Management**: Create, update, delete, and monitor resources
- **Enterprise Features**: IAM, VPC, monitoring, batch processing
- **MLOps Integration**: Pipelines, experiments, feature stores, model registry

### Generative AI SDK (`vertexai`)

Modern, streamlined API for AI generation:
- **Model-Centric**: Direct interaction with AI models
- **Simplified Usage**: Minimal configuration for common tasks
- **Multimodal Support**: Text, images, video, and audio processing
- **Advanced Features**: Function calling, grounding, safety filtering

This dual-API design allows developers to choose the right abstraction level for their needs while maintaining compatibility between both approaches.
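
Both entry points can be used side by side in the same process; the sketch below assumes placeholder project/region values and a Gemini model that is available in your project.

```python
import google.cloud.aiplatform as aiplatform
import vertexai
from vertexai.generative_models import GenerativeModel

# Initialize both SDKs against the same project and region (placeholders)
aiplatform.init(project="your-project-id", location="us-central1")
vertexai.init(project="your-project-id", location="us-central1")

# Resource-oriented call: list models registered in the Model Registry
for model in aiplatform.Model.list():
    print(model.display_name, model.resource_name)

# Model-centric call: generate text with a Gemini model
response = GenerativeModel("gemini-1.5-pro").generate_content("Summarize MLOps in one sentence.")
print(response.text)
```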

## Capabilities

### Dataset Management

Comprehensive dataset creation, management, and preparation for various ML tasks including tabular, image, text, video, and time series data.

```python { .api }
class TabularDataset:
    def create(cls, display_name: str, gcs_source: Union[str, Sequence[str]], **kwargs) -> 'TabularDataset': ...
    def import_data(self, gcs_source: Union[str, Sequence[str]], **kwargs) -> None: ...

class ImageDataset:
    def create(cls, display_name: str, gcs_source: str, import_schema_uri: str, **kwargs) -> 'ImageDataset': ...

class TextDataset:
    def create(cls, display_name: str, gcs_source: Union[str, Sequence[str]], **kwargs) -> 'TextDataset': ...
```
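
A rough usage sketch (bucket paths, display names, and the import schema constant are illustrative assumptions):

```python
import google.cloud.aiplatform as aiplatform

aiplatform.init(project="your-project-id", location="us-central1")

# Tabular data referenced from one or more CSV files in Cloud Storage
tabular_ds = aiplatform.TabularDataset.create(
    display_name="sales-data",
    gcs_source=["gs://my-bucket/train.csv", "gs://my-bucket/eval.csv"],
)

# Image data needs an import schema describing the annotation format
image_ds = aiplatform.ImageDataset.create(
    display_name="product-images",
    gcs_source="gs://my-bucket/image_import.jsonl",
    import_schema_uri=aiplatform.schema.dataset.ioformat.image.single_label_classification,
)
```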

[Dataset Management](./datasets.md)

### Model Training

AutoML and custom training capabilities for various ML tasks with comprehensive job management and monitoring.

```python { .api }
class AutoMLTabularTrainingJob:
    def __init__(self, display_name: str, optimization_prediction_type: str, **kwargs): ...
    def run(self, dataset: TabularDataset, target_column: str, **kwargs) -> Model: ...

class CustomTrainingJob:
    def __init__(self, display_name: str, script_path: str, container_uri: str, **kwargs): ...
    def run(self, dataset: Optional[Dataset] = None, **kwargs) -> Model: ...
```
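
A minimal custom-training sketch; the staging bucket, script path, and container images are placeholders to adapt to your framework:

```python
import google.cloud.aiplatform as aiplatform

aiplatform.init(
    project="your-project-id",
    location="us-central1",
    staging_bucket="gs://my-staging-bucket",
)

# Package a local training script into a managed training job
job = aiplatform.CustomTrainingJob(
    display_name="custom-train",
    script_path="trainer/task.py",
    container_uri="us-docker.pkg.dev/vertex-ai/training/sklearn-cpu.1-0:latest",
    model_serving_container_image_uri="us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-0:latest",
)

model = job.run(
    replica_count=1,
    machine_type="n1-standard-4",
    model_display_name="custom-model",
)
```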

[Model Training](./training.md)

### Model Management and Deployment

Model versioning, deployment, and serving with comprehensive endpoint management and resource optimization.

```python { .api }
class Model:
    def deploy(self, endpoint: Optional[Endpoint] = None, deployed_model_display_name: Optional[str] = None, **kwargs) -> Endpoint: ...
    def upload(cls, display_name: str, artifact_uri: str, serving_container_image_uri: str, **kwargs) -> 'Model': ...

class Endpoint:
    def create(cls, display_name: str, **kwargs) -> 'Endpoint': ...
    def predict(self, instances: List[Dict], **kwargs) -> Prediction: ...
```
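
A sketch of registering externally trained artifacts and serving them; the artifact path and serving image are placeholders:

```python
import google.cloud.aiplatform as aiplatform

aiplatform.init(project="your-project-id", location="us-central1")

# Register model artifacts produced outside Vertex AI
model = aiplatform.Model.upload(
    display_name="sklearn-model",
    artifact_uri="gs://my-bucket/model/",
    serving_container_image_uri="us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-0:latest",
)

# Deploy to a new endpoint with autoscaling bounds
endpoint = model.deploy(
    deployed_model_display_name="sklearn-model-v1",
    machine_type="n1-standard-4",
    min_replica_count=1,
    max_replica_count=2,
)

prediction = endpoint.predict(instances=[[1.0, 2.0, 3.0]])
print(prediction.predictions)
```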

[Model Management](./models.md)

### Generative AI Models

Modern generative AI capabilities including text generation, chat, multimodal interactions, and function calling.

```python { .api }
class GenerativeModel:
    def __init__(self, model_name: str, generation_config: Optional[GenerationConfig] = None, **kwargs): ...
    def generate_content(self, contents: ContentsType, stream: bool = False, **kwargs) -> GenerationResponse: ...
    def start_chat(self, history: Optional[List[Content]] = None, **kwargs) -> ChatSession: ...

class TextGenerationModel:
    def from_pretrained(cls, model_name: str) -> 'TextGenerationModel': ...
    def predict(self, prompt: str, **kwargs) -> TextGenerationResponse: ...
```
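
A short sketch of configured and streaming generation; the model name is a placeholder for whichever Gemini version is available in your project:

```python
import vertexai
from vertexai.generative_models import GenerationConfig, GenerativeModel

vertexai.init(project="your-project-id", location="us-central1")

model = GenerativeModel(
    "gemini-1.5-flash",  # placeholder model name
    generation_config=GenerationConfig(temperature=0.2, max_output_tokens=256),
)

# Non-streaming call returns a single response object
print(model.generate_content("Write a haiku about data pipelines").text)

# Streaming call yields partial responses as they are produced
for chunk in model.generate_content("Explain vector embeddings", stream=True):
    print(chunk.text, end="")
```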

[Generative AI](./generative-ai.md)

### Computer Vision

Comprehensive vision AI capabilities including image generation, analysis, and multimodal understanding.

```python { .api }
class ImageGenerationModel:
    def from_pretrained(cls, model_name: str) -> 'ImageGenerationModel': ...
    def generate_images(self, prompt: str, number_of_images: int = 1, **kwargs) -> ImageGenerationResponse: ...

class ImageCaptioningModel:
    def get_captions(self, image: Image, number_of_results: int = 1, **kwargs) -> List[str]: ...
```
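
A hedged sketch of image generation and captioning; the model version strings are assumptions that should be checked against the models available in your region:

```python
import vertexai
from vertexai.vision_models import Image, ImageCaptioningModel, ImageGenerationModel

vertexai.init(project="your-project-id", location="us-central1")

# Text-to-image generation (model name is a placeholder for an Imagen version)
gen_model = ImageGenerationModel.from_pretrained("imagegeneration@006")
response = gen_model.generate_images(prompt="A watercolor fox in a pine forest", number_of_images=1)
response.images[0].save(location="fox.png")

# Captioning an existing image loaded from a local file
caption_model = ImageCaptioningModel.from_pretrained("imagetext@001")
captions = caption_model.get_captions(image=Image.load_from_file("fox.png"), number_of_results=2)
print(captions)
```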

[Computer Vision](./vision.md)

### ML Pipelines

Workflow orchestration, scheduling, and complex ML pipeline management with Kubeflow Pipelines integration.

```python { .api }
class PipelineJob:
    def create(cls, display_name: str, template_path: str, **kwargs) -> 'PipelineJob': ...
    def run(self, service_account: Optional[str] = None, **kwargs) -> None: ...

class PipelineJobSchedule:
    def create(cls, pipeline_job: PipelineJob, display_name: str, cron: str, **kwargs) -> 'PipelineJobSchedule': ...
```
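
A minimal sketch that runs a compiled pipeline template; the GCS paths and parameter names are placeholders, and the job is constructed directly and run synchronously:

```python
import google.cloud.aiplatform as aiplatform

aiplatform.init(project="your-project-id", location="us-central1")

# A PipelineJob wraps a compiled Kubeflow Pipelines / PipelineSpec template
job = aiplatform.PipelineJob(
    display_name="daily-training-pipeline",
    template_path="gs://my-bucket/pipelines/train_pipeline.json",
    pipeline_root="gs://my-bucket/pipeline-root",
    parameter_values={"learning_rate": 0.01, "epochs": 10},
)

# Blocks until the pipeline finishes; use submit() to return immediately
job.run(sync=True)
```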

[ML Pipelines](./pipelines.md)

### Feature Store

Enterprise feature management with online and offline serving, feature versioning, and monitoring.

```python { .api }
class Featurestore:
    def create(cls, featurestore_id: str, **kwargs) -> 'Featurestore': ...
    def create_entity_type(self, entity_type_id: str, **kwargs) -> EntityType: ...

class Feature:
    def create(cls, feature_id: str, value_type: str, entity_type: EntityType, **kwargs) -> 'Feature': ...
```
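
A sketch of provisioning a featurestore and defining features; resource IDs and node counts are placeholders:

```python
import google.cloud.aiplatform as aiplatform

aiplatform.init(project="your-project-id", location="us-central1")

# Provision a featurestore with a small online-serving cluster
fs = aiplatform.Featurestore.create(
    featurestore_id="my_featurestore",
    online_store_fixed_node_count=1,
)

# Entity types group related features (for example, one per business entity)
users = fs.create_entity_type(entity_type_id="users")

# Features are typed values attached to an entity type
users.create_feature(feature_id="age", value_type="INT64")
users.create_feature(feature_id="lifetime_value", value_type="DOUBLE")
```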

[Feature Store](./feature-store.md)

### Experiment Tracking

Comprehensive experiment management, metrics logging, and artifact tracking with integration to popular ML frameworks.

```python { .api }
def init(project: str, location: str, **kwargs) -> None: ...
def start_run(run: str, resume: bool = False, **kwargs) -> None: ...
def log_params(params: Dict[str, Union[str, int, float]]) -> None: ...
def log_metrics(metrics: Dict[str, Union[int, float]]) -> None: ...
def log_model(model, artifact_id: Optional[str] = None, **kwargs) -> None: ...

class Experiment:
    def create(cls, experiment_id: str, **kwargs) -> 'Experiment': ...
```
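
A typical logging flow, assuming placeholder experiment and run names:

```python
import google.cloud.aiplatform as aiplatform

# Passing `experiment` to init() attaches subsequent runs to that experiment
aiplatform.init(
    project="your-project-id",
    location="us-central1",
    experiment="churn-model-experiments",
)

aiplatform.start_run("run-001")
aiplatform.log_params({"learning_rate": 0.01, "batch_size": 64})

# ... train the model, then record the results ...
aiplatform.log_metrics({"accuracy": 0.93, "auc": 0.97})

aiplatform.end_run()
```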

[Experiment Tracking](./experiments.md)

### Vector Search

High-performance vector similarity search with approximate nearest neighbor capabilities for embedding-based applications.

```python { .api }
class MatchingEngineIndex:
    def create(cls, display_name: str, contents_delta_uri: str, **kwargs) -> 'MatchingEngineIndex': ...
    def update_embeddings(self, contents_delta_uri: str, **kwargs) -> None: ...

class MatchingEngineIndexEndpoint:
    def create(cls, display_name: str, public_endpoint_enabled: bool = False, **kwargs) -> 'MatchingEngineIndexEndpoint': ...
    def match(self, deployed_index_id: str, queries: List[List[float]], **kwargs) -> List[List[MatchNeighbor]]: ...
```
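
A sketch of building, deploying, and querying an index; bucket paths, dimensions, and IDs are placeholders, and the concrete factory method used here is one of those exposed by the released SDK:

```python
import google.cloud.aiplatform as aiplatform

aiplatform.init(project="your-project-id", location="us-central1")

# Build an approximate-nearest-neighbor index from embeddings stored in GCS
index = aiplatform.MatchingEngineIndex.create_tree_ah_index(
    display_name="product-embeddings",
    contents_delta_uri="gs://my-bucket/embeddings/",
    dimensions=128,
    approximate_neighbors_count=10,
)

# Deploy the index behind an endpoint before querying it
endpoint = aiplatform.MatchingEngineIndexEndpoint.create(
    display_name="product-embeddings-endpoint",
    public_endpoint_enabled=True,
)
endpoint.deploy_index(index=index, deployed_index_id="products_v1")

# Query with raw embedding vectors
neighbors = endpoint.find_neighbors(
    deployed_index_id="products_v1",
    queries=[[0.1] * 128],
    num_neighbors=5,
)
print(neighbors)
```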

[Vector Search](./vector-search.md)

### Batch Processing

Large-scale batch inference and data processing with distributed computing integration and resource optimization.

```python { .api }
class BatchPredictionJob:
    def create(cls, job_display_name: str, model_name: str, instances_format: str, **kwargs) -> 'BatchPredictionJob': ...
    def create_from_job_spec(cls, job_spec: Dict, **kwargs) -> 'BatchPredictionJob': ...
```
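
A sketch of scoring files in Cloud Storage asynchronously; the model resource name and bucket paths are placeholders:

```python
import google.cloud.aiplatform as aiplatform

aiplatform.init(project="your-project-id", location="us-central1")

# Run asynchronous inference over JSONL files in Cloud Storage
batch_job = aiplatform.BatchPredictionJob.create(
    job_display_name="nightly-scoring",
    model_name="projects/your-project-id/locations/us-central1/models/1234567890",
    instances_format="jsonl",
    predictions_format="jsonl",
    gcs_source="gs://my-bucket/batch/input.jsonl",
    gcs_destination_prefix="gs://my-bucket/batch/output/",
    machine_type="n1-standard-4",
    sync=False,  # return immediately instead of blocking on completion
)

batch_job.wait()
print(batch_job.output_info)
```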

[Batch Processing](./batch.md)

## Types

```python { .api }
# Common types used across the API

# Resource management
class Prediction:
    predictions: List[Dict]
    deployed_model_id: str
    model_version_id: str
    model_resource_name: str
    explanations: Optional[List[Explanation]]

# Generative AI types
ContentsType = Union[
    str,
    Image,
    Part,
    List[Union[str, Image, Part]],
    List[Content]
]

GenerationConfigType = Union[GenerationConfig, Dict[str, Any]]
SafetySettingsType = Union[SafetySetting, List[SafetySetting], Dict[str, Any]]

# Training and evaluation
class TrainingJob:
    resource_name: str
    display_name: str
    state: JobState
    create_time: datetime
    start_time: Optional[datetime]
    end_time: Optional[datetime]
    error: Optional[Status]

# Enums
class JobState(Enum):
    JOB_STATE_UNSPECIFIED = 0
    JOB_STATE_QUEUED = 1
    JOB_STATE_PENDING = 2
    JOB_STATE_RUNNING = 3
    JOB_STATE_SUCCEEDED = 4
    JOB_STATE_FAILED = 5
    JOB_STATE_CANCELLING = 6
    JOB_STATE_CANCELLED = 7
    JOB_STATE_PAUSED = 8
    JOB_STATE_EXPIRED = 9
```