
tessl/pypi-zenml

ZenML is a unified MLOps framework that extends battle-tested machine learning operations principles to support the entire AI stack, from classical machine learning models to advanced AI agents.


Integrations

ZenML includes 67 integrations with various ML/MLOps tools, cloud providers, and frameworks. Each integration provides stack component flavors, materializers, steps, and utilities specific to that tool.

Integration Categories

Cloud Providers

AWS Integration

  • S3 artifact store
  • SageMaker orchestrator and step operator
  • ECR container registry
  • Secrets Manager integration
  • Installation: pip install zenml[aws]

Azure Integration

  • Azure Blob Storage artifact store
  • AzureML orchestrator
  • ACR container registry
  • Azure Key Vault secrets
  • Installation: pip install zenml[azure]

GCP Integration

  • GCS artifact store
  • Vertex AI orchestrator
  • GCR container registry
  • Google Secret Manager
  • Cloud Build image builder
  • Installation: pip install zenml[gcp]

Orchestrators

Kubernetes

  • Kubernetes orchestrator for running pipelines on K8s clusters
  • Installation: pip install zenml[kubernetes]

Kubeflow

  • Kubeflow Pipelines orchestrator
  • Installation: pip install zenml[kubeflow]

Airflow

  • Apache Airflow orchestrator
  • Installation: pip install zenml[airflow]

Tekton

  • Tekton Pipelines orchestrator
  • Installation: pip install zenml[tekton]

Databricks

  • Databricks orchestrator for running on Databricks clusters
  • Installation: pip install zenml[databricks]

Lightning

  • Lightning.ai orchestrator
  • Installation: pip install zenml[lightning]

Modal

  • Modal orchestrator for serverless execution
  • Installation: pip install zenml[modal]

SkyPilot

  • SkyPilot orchestrator with variants for AWS, GCP, Azure, Kubernetes, Lambda
  • Installation: pip install zenml[skypilot]

Experiment Tracking

MLflow

  • MLflow tracking server integration
  • MLflow model registry
  • MLflow model deployer
  • Installation: pip install zenml[mlflow]

Weights & Biases

  • W&B experiment tracking
  • Installation: pip install zenml[wandb]

Neptune

  • Neptune.ai experiment tracking
  • Installation: pip install zenml[neptune]

Comet

  • Comet.ml experiment tracking
  • Installation: pip install zenml[comet]

TensorBoard

  • TensorBoard visualization
  • Installation: pip install zenml[tensorboard]

ML Frameworks

PyTorch

  • PyTorch tensor and model materializers
  • Installation: pip install zenml[pytorch]

TensorFlow

  • TensorFlow tensor and model materializers
  • Installation: pip install zenml[tensorflow]

PyTorch Lightning

  • Lightning module materializers
  • Installation: pip install zenml[pytorch-lightning]

JAX

  • JAX array materializers
  • Installation: pip install zenml[jax]

Scikit-learn

  • Sklearn model materializers
  • Installation: pip install zenml[sklearn]

XGBoost

  • XGBoost booster and DMatrix materializers
  • Installation: pip install zenml[xgboost]

LightGBM

  • LightGBM booster and dataset materializers
  • Installation: pip install zenml[lightgbm]

HuggingFace

  • Transformers model and tokenizer materializers
  • Datasets integration
  • Installation: pip install zenml[huggingface]

LangChain

  • LangChain document and chain materializers
  • Installation: pip install zenml[langchain]

LlamaIndex

  • LlamaIndex materializers
  • Installation: pip install zenml[llama-index]
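Each framework integration's materializers handle serializing that framework's native objects to the artifact store and loading them back. The idea can be sketched in plain Python with pickle (a toy stand-in, not the actual ZenML BaseMaterializer API):

```python
import pickle
import tempfile
from pathlib import Path

class PickleMaterializer:
    """Toy stand-in for a materializer: pairs a save and a load
    routine for a given artifact type at a fixed artifact path."""

    def __init__(self, artifact_dir: str):
        self.artifact_path = Path(artifact_dir) / "artifact.pkl"

    def save(self, obj) -> None:
        # A real materializer would use the framework's own format
        # (e.g. torch.save or joblib) instead of pickle.
        with open(self.artifact_path, "wb") as f:
            pickle.dump(obj, f)

    def load(self):
        with open(self.artifact_path, "rb") as f:
            return pickle.load(f)

with tempfile.TemporaryDirectory() as tmp:
    mat = PickleMaterializer(tmp)
    mat.save({"weights": [0.1, 0.2]})
    print(mat.load())  # {'weights': [0.1, 0.2]}
```

In ZenML, the step's return-type annotation selects the registered materializer automatically, which is why the PyTorch example later in this page needs no explicit serialization code.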

Data Validation

Great Expectations

  • Data validation and profiling
  • Installation: pip install zenml[great-expectations]

Deepchecks

  • Model validation and monitoring
  • Installation: pip install zenml[deepchecks]

Evidently

  • Data and model monitoring
  • Installation: pip install zenml[evidently]

Whylogs

  • Data logging and profiling
  • Installation: pip install zenml[whylogs]
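All four tools automate variations of the same idea: assert properties of a dataset and fail loudly when they do not hold. A framework-free sketch of the kinds of checks involved (plain Python with a hypothetical helper, not any of these libraries' APIs):

```python
def validate_records(records, column, lo, hi):
    """Minimal stand-in for expectations like 'column exists',
    'no nulls', and 'values in range'. Returns (index, reason) pairs."""
    failures = []
    for i, row in enumerate(records):
        if column not in row:
            failures.append((i, "missing column"))
        elif row[column] is None:
            failures.append((i, "null value"))
        elif not (lo <= row[column] <= hi):
            failures.append((i, "out of range"))
    return failures

data = [{"feature1": 42}, {"feature1": None}, {"feature1": 250}]
print(validate_records(data, "feature1", 0, 100))
# [(1, 'null value'), (2, 'out of range')]
```

The Great Expectations example later in this page expresses the same three checks through that library's expectation API, plus profiling and reporting on top.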

Model Deployment

BentoML

  • Model serving with BentoML
  • Installation: pip install zenml[bentoml]

Seldon Core

  • Seldon Core model serving
  • Installation: pip install zenml[seldon]

vLLM

  • vLLM for LLM serving
  • Installation: pip install zenml[vllm]

Annotation Tools

Label Studio

  • Data annotation with Label Studio
  • Installation: pip install zenml[label-studio]

Pigeon

  • Simple annotation in notebooks
  • Installation: pip install zenml[pigeon]

Argilla

  • Data annotation and curation
  • Installation: pip install zenml[argilla]

Alerting

Slack

  • Slack notifications via alerter component
  • Installation: pip install zenml[slack]

Discord

  • Discord notifications via alerter component
  • Installation: pip install zenml[discord]
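An alerter is typically wired to a pipeline through success/failure hooks, as in the Slack example later in this page. The control flow those hooks implement can be sketched in plain Python (hypothetical names, not the ZenML hook API):

```python
sent = []

def alert(message: str) -> None:
    # Stand-in for posting to a Slack or Discord channel.
    sent.append(message)

def run_with_hooks(pipeline_fn):
    """Run a pipeline function, sending an alert on success or failure."""
    try:
        pipeline_fn()
    except Exception as e:
        alert(f"pipeline failed: {e}")
        raise
    else:
        alert("pipeline succeeded")

run_with_hooks(lambda: None)
print(sent)  # ['pipeline succeeded']
```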

Feature Stores

Feast

  • Feast feature store integration
  • Installation: pip install zenml[feast]

Visualization

Facets

  • Facets data visualization
  • Installation: pip install zenml[facets]

Code Repositories

GitHub

  • GitHub integration for code tracking
  • Installation: pip install zenml[github]

GitLab

  • GitLab integration
  • Installation: pip install zenml[gitlab]

Bitbucket

  • Bitbucket integration
  • Installation: pip install zenml[bitbucket]

Other Tools

Spark

  • Apache Spark integration
  • Installation: pip install zenml[spark]

Kaniko

  • Kaniko image builder
  • Installation: pip install zenml[kaniko]

NumPy

  • NumPy array materializers (core dependency)

Pandas

  • DataFrame materializers (core dependency)

Pillow

  • Image materializers
  • Installation: pip install zenml[pillow]

PyArrow

  • PyArrow table materializers
  • Installation: pip install zenml[pyarrow]

SciPy

  • SciPy sparse matrix materializers
  • Installation: pip install zenml[scipy]

PyCaret

  • PyCaret model materializers
  • Installation: pip install zenml[pycaret]

Neural Prophet

  • Neural Prophet model materializers
  • Installation: pip install zenml[neural-prophet]

HyperAI

  • HyperAI orchestrator
  • Installation: pip install zenml[hyperai]

Usage Examples

Installing Integrations

# Single integration
pip install zenml[aws]

# Multiple integrations
pip install zenml[aws,mlflow,pytorch]

# All integrations
pip install zenml[all]
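As an alternative to the pip extras above, the zenml CLI (available once zenml itself is installed) can list and install integrations; exact flags may vary by version:

```shell
# Show known integrations and their installation status
zenml integration list

# Install an integration's requirements (-y skips the confirmation prompt)
zenml integration install aws -y
```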

Activating Integration in Code

from zenml.integrations.aws import AWSIntegration

# Check if integration is installed
if AWSIntegration.check_installation():
    print("AWS integration available")

# Integration is auto-activated when components are used

Using AWS S3 Artifact Store

from zenml.client import Client

client = Client()

# Create S3 artifact store
client.create_stack_component(
    name="s3_store",
    flavor="s3",
    component_type="artifact_store",
    configuration={
        "path": "s3://my-bucket/artifacts",
        "region": "us-east-1"
    }
)

Using MLflow Experiment Tracker

from zenml import step, pipeline
from zenml.client import Client

client = Client()

# Create MLflow experiment tracker
client.create_stack_component(
    name="mlflow_tracker",
    flavor="mlflow",
    component_type="experiment_tracker",
    configuration={
        "tracking_uri": "http://localhost:5000"
    }
)

# Use in step
@step(experiment_tracker="mlflow_tracker")
def train_model(data: list) -> dict:
    import mlflow
    mlflow.log_param("param1", "value1")
    mlflow.log_metric("accuracy", 0.95)
    return {"model": "trained"}

Using PyTorch with Materializers

from zenml import step
import torch
import torch.nn as nn

@step
def train_pytorch_model(data: list) -> nn.Module:
    """PyTorch models automatically use PyTorch materializer."""
    model = nn.Sequential(
        nn.Linear(10, 5),
        nn.ReLU(),
        nn.Linear(5, 1)
    )
    # Training logic
    return model

@step
def evaluate_model(model: nn.Module, test_data: list) -> float:
    """Model automatically deserialized."""
    # Evaluation logic
    return 0.95

Using SageMaker Orchestrator

from zenml.client import Client

client = Client()

# Create SageMaker orchestrator
client.create_stack_component(
    name="sagemaker",
    flavor="sagemaker",
    component_type="orchestrator",
    configuration={
        "execution_role": "arn:aws:iam::123456789:role/SageMaker",
        "region": "us-east-1",
        "instance_type": "ml.m5.large"
    }
)

# Create stack with SageMaker
client.create_stack(
    name="sagemaker_stack",
    components={
        "orchestrator": "sagemaker",
        "artifact_store": "s3_store"
    }
)

# Activate and use
client.activate_stack("sagemaker_stack")

Using Great Expectations for Data Validation

from zenml import step
import pandas as pd

@step
def validate_data(data: pd.DataFrame) -> pd.DataFrame:
    """Validate data with Great Expectations."""
    # PandasDataset is the legacy (pre-0.16) Great Expectations API
    from great_expectations.dataset import PandasDataset

    ge_df = PandasDataset(data)

    # Define expectations
    ge_df.expect_column_to_exist("feature1")
    ge_df.expect_column_values_to_not_be_null("feature1")
    ge_df.expect_column_values_to_be_between("feature1", 0, 100)

    # Validate
    results = ge_df.validate()

    if not results.success:
        raise ValueError("Data validation failed")

    return data

Using Slack Alerter

from zenml import pipeline
from zenml.hooks import alerter_success_hook, alerter_failure_hook
from zenml.client import Client

client = Client()

# Create Slack alerter
client.create_stack_component(
    name="slack_alerter",
    flavor="slack",
    component_type="alerter",
    configuration={
        "slack_token": "xoxb-your-token",
        "default_slack_channel_id": "C01234567"
    }
)

# Use in pipeline: hooks are passed uncalled and notify
# via the alerter registered in the active stack
@pipeline(
    on_success=alerter_success_hook,
    on_failure=alerter_failure_hook
)
def monitored_pipeline():
    # Pipeline definition
    pass

Listing Available Integrations

from zenml.integrations.registry import integration_registry

# List all integrations
for name, integration in integration_registry.integrations.items():
    print(f"{name}: {integration.NAME}")

# Check specific integration
from zenml.integrations.pytorch import PyTorchIntegration

print(f"PyTorch Integration: {PyTorchIntegration.NAME}")
print(f"Requirements: {PyTorchIntegration.REQUIREMENTS}")

Integration Combinations

Full ML Stack Example

from zenml.client import Client

client = Client()

# AWS + MLflow + PyTorch stack
stack_components = {
    "orchestrator": "sagemaker",
    "artifact_store": "s3_store",
    "container_registry": "ecr",
    "experiment_tracker": "mlflow",
    "model_registry": "mlflow_registry"
}

client.create_stack(
    name="ml_production",
    components=stack_components
)

