
tessl/pypi-mistralai

Python Client SDK for the Mistral AI API with chat completions, embeddings, fine-tuning, and agent capabilities.


Models

List and manage available models, including both base and fine-tuned models. The models API exposes model metadata, capabilities, and management operations.

Capabilities

List Models

Retrieve information about all available models, including base and fine-tuned models.

def list(**kwargs) -> ModelList:
    """
    List all available models.

    Returns:
    ModelList containing model information and metadata
    """

Retrieve Model

Get detailed information about a specific model including its capabilities and configuration.

def retrieve(model_id: str, **kwargs) -> Union[BaseModelCard, FTModelCard]:
    """
    Retrieve detailed information about a specific model.

    Parameters:
    - model_id: Unique identifier of the model

    Returns:
    Model card with detailed information (BaseModelCard for base models, 
    FTModelCard for fine-tuned models)
    """

Delete Model

Delete a fine-tuned model that is no longer needed.

def delete(model_id: str, **kwargs) -> DeleteModelOut:
    """
    Delete a fine-tuned model.

    Parameters:
    - model_id: Unique identifier of the fine-tuned model to delete

    Returns:
    Deletion confirmation with model information

    Note: Only fine-tuned models can be deleted, not base models
    """
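Because only fine-tuned models are deletable, an obvious mistake can be caught locally before the API call. A minimal sketch, assuming fine-tuned model IDs carry an `ft:` prefix (the `safe_delete` helper is illustrative, not part of the SDK):

```python
def safe_delete(client, model_id: str):
    """Delete a model only if its ID looks like a fine-tuned model.

    Assumes fine-tuned model IDs start with "ft:"; base models are
    rejected locally instead of round-tripping an API error.
    """
    if not model_id.startswith("ft:"):
        raise ValueError(f"{model_id!r} does not look like a fine-tuned model ID")
    return client.models.delete(model_id)
```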

Usage Examples

List Available Models

from mistralai import Mistral

client = Mistral(api_key="your-api-key")

# Get all available models
models = client.models.list()

print(f"Total models: {len(models.data)}")
for model in models.data:
    print(f"- {model.id}: {model.description or 'No description'}")
    if hasattr(model, 'capabilities'):
        print(f"  Capabilities: {', '.join(model.capabilities)}")
    print()

Get Model Details

# Get details for a specific model
model_id = "mistral-small-latest"
model_info = client.models.retrieve(model_id)

print(f"Model ID: {model_info.id}")
print(f"Type: {model_info.type}")
print(f"Created: {model_info.created}")

if hasattr(model_info, 'max_context_length'):
    print(f"Max context length: {model_info.max_context_length}")

if hasattr(model_info, 'capabilities'):
    print(f"Capabilities: {model_info.capabilities}")

if hasattr(model_info, 'description'):
    print(f"Description: {model_info.description}")

Model Filtering and Selection

# Filter models by capabilities
models = client.models.list()
chat_models = []
embedding_models = []

for model in models.data:
    if hasattr(model, 'capabilities'):
        if 'completion' in model.capabilities or 'chat' in model.capabilities:
            chat_models.append(model)
        if 'embedding' in model.capabilities:
            embedding_models.append(model)

print("Chat/Completion Models:")
for model in chat_models:
    print(f"  - {model.id}")

print("\nEmbedding Models:")
for model in embedding_models:
    print(f"  - {model.id}")

Delete Fine-tuned Model

# Delete a fine-tuned model (only works for custom fine-tuned models)
try:
    result = client.models.delete("ft-model-id-example")
    print(f"Deleted model: {result.id}")
    print(f"Deleted: {result.deleted}")
except Exception as e:
    print(f"Error deleting model: {e}")

Types

Model List Types

class ModelList:
    object: str
    data: List[Union[BaseModelCard, FTModelCard]]

class BaseModelCard:
    id: str
    object: str
    created: int
    owned_by: str
    type: str
    description: Optional[str]
    max_context_length: Optional[int]
    aliases: Optional[List[str]]
    capabilities: Optional[List[str]]
    default_model_temperature: Optional[float]
    max_temperature: Optional[float]

class FTModelCard:
    id: str
    object: str
    created: int
    owned_by: str
    type: str
    description: Optional[str]
    capabilities: Optional[List[str]]
    job: Optional[str]
    archived: Optional[bool]

Model Operations

class DeleteModelOut:
    id: str
    object: str
    deleted: bool

Model Capabilities

class ModelCapabilities:
    completion: bool
    chat: bool
    embedding: bool
    fine_tuning: bool
    function_calling: bool
    vision: bool
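The model cards above declare capabilities as a list of names, while `ModelCapabilities` models them as boolean flags. A small adapter can check a feature uniformly against either shape; this `supports` helper is a hedged sketch of ours, not an SDK function:

```python
def supports(caps, feature: str) -> bool:
    """Check a capability against either representation.

    Handles both a list of capability names (e.g. ["chat", "embedding"])
    and an object with boolean attributes (e.g. caps.chat == True).
    """
    if caps is None:
        return False
    if isinstance(caps, (list, tuple, set)):
        return feature in caps
    return bool(getattr(caps, feature, False))
```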

Model Categories

Base Models

Chat/Completion Models:

  • mistral-large-latest: Most capable model for complex tasks
  • mistral-small-latest: Efficient model for most tasks
  • mistral-medium-latest: Balanced performance and capability
  • codestral-latest: Specialized for code generation and completion

Embedding Models:

  • mistral-embed: General-purpose text embedding model

Fine-tuned Models

Fine-tuned models have custom identifiers and are created through the fine-tuning API. They inherit capabilities from their base models but are customized for specific use cases.
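Per the type definitions above, only `FTModelCard` carries a `job` reference back to its fine-tuning run, which gives a simple way to separate the two card kinds in a `list()` result. A minimal sketch (the helper name is ours):

```python
def is_fine_tuned(card) -> bool:
    """Return True for FTModelCard-like objects.

    FTModelCard includes a `job` field referencing the fine-tuning job;
    BaseModelCard has no such field.
    """
    return getattr(card, "job", None) is not None
```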

Model Selection Guidelines

  • mistral-large-latest: Complex reasoning, analysis, creative tasks
  • mistral-small-latest: General conversations, simple tasks, cost-effective
  • codestral-latest: Code generation, programming assistance, technical documentation
  • mistral-embed: Semantic search, similarity comparison, text classification
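The guidelines above can be encoded as a simple lookup for picking a default model per task category. The mapping and `suggest_model` helper are illustrative conveniences, not part of the SDK:

```python
# Task category -> suggested model ID, following the guidelines above.
TASK_TO_MODEL = {
    "reasoning": "mistral-large-latest",
    "chat": "mistral-small-latest",
    "code": "codestral-latest",
    "embedding": "mistral-embed",
}

def suggest_model(task: str) -> str:
    """Suggest a model ID for a task category, defaulting to the small model."""
    return TASK_TO_MODEL.get(task, "mistral-small-latest")
```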

Model Properties

  • Context Length: Maximum number of tokens the model can process in a single request
  • Temperature Range: Supported temperature values for controlling randomness
  • Capabilities: List of supported features (completion, chat, embedding, etc.)
  • Aliases: Alternative names for the same model
  • Ownership: Organization or entity that owns/created the model
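Since most of these properties are optional and vary by card type, code that summarizes models should read them defensively. A hedged sketch using `getattr` fallbacks (the `summarize_card` helper is ours):

```python
def summarize_card(card) -> str:
    """One-line summary of a model card, tolerating missing optional fields."""
    ctx = getattr(card, "max_context_length", None)
    aliases = getattr(card, "aliases", None) or []
    ctx_part = f"context={ctx}" if ctx is not None else "context=unknown"
    alias_part = ", ".join(aliases) if aliases else "none"
    return f"{card.id} ({ctx_part}, aliases: {alias_part})"
```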

Install with Tessl CLI

npx tessl i tessl/pypi-mistralai
