tessl/pypi-azure-ai-contentsafety

Microsoft Azure AI Content Safety client library for Python providing text and image content analysis APIs with harm category detection and blocklist management


Content Analysis

Comprehensive content analysis capabilities for detecting harmful material in text and images. The Azure AI Content Safety service analyzes content across four harm categories (hate, self-harm, sexual, violence) and returns a severity score per category (0, 2, 4, or 6 by default, or 0–7 with eight-level output) to help applications make content moderation decisions.
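Severity scores can drive a simple allow/block policy. The sketch below is illustrative and not part of the SDK: the `should_block` helper and `DEFAULT_THRESHOLDS` mapping are assumptions, and each item only needs `category` and `severity` attributes, matching the shape of the response models documented below.

```python
from types import SimpleNamespace

# Illustrative moderation policy (not part of the SDK): block when any
# category's severity meets or exceeds its per-category threshold.
DEFAULT_THRESHOLDS = {"Hate": 2, "SelfHarm": 2, "Sexual": 2, "Violence": 2}

def should_block(categories_analysis, thresholds=DEFAULT_THRESHOLDS):
    """Return True if any analyzed category reaches its threshold."""
    for analysis in categories_analysis:
        threshold = thresholds.get(str(analysis.category))
        if threshold is not None and (analysis.severity or 0) >= threshold:
            return True
    return False

# Stand-ins for TextCategoriesAnalysis items from a real response:
sample = [SimpleNamespace(category="Hate", severity=0),
          SimpleNamespace(category="Violence", severity=4)]
verdict = should_block(sample)  # True: Violence severity 4 >= threshold 2
```

Tune the thresholds to your application's tolerance; stricter products can lower them per category.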

Capabilities

Text Analysis

Analyzes text content for potentially harmful material across four categories with configurable severity levels and blocklist integration.

def analyze_text(
    self, 
    options: Union[AnalyzeTextOptions, dict, IO[bytes]], 
    **kwargs
) -> AnalyzeTextResult:
    """
    Analyze text for potentially harmful content.
    
    Parameters:
    - options: Text analysis request containing text and analysis options
    - content_type: Body parameter content-type (default: "application/json")
    - stream: Whether to stream the response (default: False)
    
    Returns:
    AnalyzeTextResult: Analysis results with category scores and blocklist matches
    
    Raises:
    HttpResponseError: On analysis failure or service errors
    """

Usage Example:

from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions, TextCategory, AnalyzeTextOutputType
from azure.core.credentials import AzureKeyCredential

client = ContentSafetyClient(
    endpoint="https://your-resource.cognitiveservices.azure.com",
    credential=AzureKeyCredential("your-api-key")
)

# Basic text analysis
request = AnalyzeTextOptions(
    text="Text content to analyze for harmful material"
)
result = client.analyze_text(request)

# Print category results
for analysis in result.categories_analysis:
    print(f"Category: {analysis.category}, Severity: {analysis.severity}")

# Advanced text analysis with custom options
advanced_request = AnalyzeTextOptions(
    text="Text to analyze",
    categories=[TextCategory.HATE, TextCategory.VIOLENCE],
    output_type=AnalyzeTextOutputType.EIGHT_SEVERITY_LEVELS,
    blocklist_names=["my-custom-blocklist"],
    halt_on_blocklist_hit=True
)
result = client.analyze_text(advanced_request)

client.close()

Image Analysis

Analyzes image content for potentially harmful visual material across the same four categories as text analysis.

def analyze_image(
    self, 
    options: Union[AnalyzeImageOptions, dict, IO[bytes]], 
    **kwargs
) -> AnalyzeImageResult:
    """
    Analyze image for potentially harmful content.
    
    Parameters:
    - options: Image analysis request containing image data and analysis options
    - content_type: Body parameter content-type (default: "application/json")
    - stream: Whether to stream the response (default: False)
    
    Returns:
    AnalyzeImageResult: Analysis results with category scores
    
    Raises:
    HttpResponseError: On analysis failure or service errors
    """

Usage Example:

from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeImageOptions, ImageData, ImageCategory
from azure.core.credentials import AzureKeyCredential
client = ContentSafetyClient(
    endpoint="https://your-resource.cognitiveservices.azure.com",
    credential=AzureKeyCredential("your-api-key")
)

# Analyze image from file (pass the raw bytes; the client
# base64-encodes them during serialization)
with open("image.jpg", "rb") as image_file:
    image_data = ImageData(content=image_file.read())

request = AnalyzeImageOptions(
    image=image_data,
    categories=[ImageCategory.SEXUAL, ImageCategory.VIOLENCE]
)
result = client.analyze_image(request)

# Print results
for analysis in result.categories_analysis:
    print(f"Category: {analysis.category}, Severity: {analysis.severity}")

# Analyze image from URL
url_request = AnalyzeImageOptions(
    image=ImageData(blob_url="https://example.com/image.jpg")
)
result = client.analyze_image(url_request)

client.close()

Request Models

class AnalyzeTextOptions:
    """Text analysis request parameters."""
    def __init__(
        self,
        *,
        text: str,
        categories: Optional[List[Union[str, TextCategory]]] = None,
        blocklist_names: Optional[List[str]] = None,
        halt_on_blocklist_hit: Optional[bool] = None,
        output_type: Optional[Union[str, AnalyzeTextOutputType]] = None
    ): ...

class AnalyzeImageOptions:
    """Image analysis request parameters."""
    def __init__(
        self,
        *,
        image: ImageData,
        categories: Optional[List[Union[str, ImageCategory]]] = None,
        output_type: Optional[Union[str, AnalyzeImageOutputType]] = None
    ): ...

class ImageData:
    """Image data container for analysis."""
    def __init__(
        self,
        *,
        content: Optional[bytes] = None,  # Raw image bytes (base64-encoded during serialization)
        blob_url: Optional[str] = None
    ): ...

Response Models

class AnalyzeTextResult:
    """Text analysis results."""
    blocklists_match: Optional[List[TextBlocklistMatch]]
    categories_analysis: List[TextCategoriesAnalysis]

class AnalyzeImageResult:
    """Image analysis results."""
    categories_analysis: List[ImageCategoriesAnalysis]

class TextCategoriesAnalysis:
    """Text category analysis result."""
    category: TextCategory
    severity: Optional[int]

class ImageCategoriesAnalysis:
    """Image category analysis result."""
    category: ImageCategory
    severity: Optional[int]

class TextBlocklistMatch:
    """Blocklist match information."""
    blocklist_name: str
    blocklist_item_id: str
    blocklist_item_text: str
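When blocklists are supplied to analyze_text, the matches arrive alongside the category scores. The helper below is an illustrative sketch, not part of the SDK; it assumes a `result` shaped like AnalyzeTextResult above, where `blocklists_match` may be None when no blocklists were used.

```python
from types import SimpleNamespace

def matched_terms(result):
    """Return (blocklist_name, matched_text) pairs from a text analysis result.

    `result` is anything shaped like AnalyzeTextResult;
    `blocklists_match` is None or absent when no blocklists were supplied.
    """
    matches = getattr(result, "blocklists_match", None) or []
    return [(m.blocklist_name, m.blocklist_item_text) for m in matches]

# Stand-in for an AnalyzeTextResult with one blocklist hit:
fake_result = SimpleNamespace(
    blocklists_match=[SimpleNamespace(
        blocklist_name="my-custom-blocklist",
        blocklist_item_id="item-1",
        blocklist_item_text="banned phrase",
    )],
    categories_analysis=[],
)
hits = matched_terms(fake_result)  # [("my-custom-blocklist", "banned phrase")]
```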

Error Handling

All content analysis methods can raise Azure Core exceptions:

from azure.core.exceptions import HttpResponseError, ClientAuthenticationError

try:
    result = client.analyze_text(request)
except ClientAuthenticationError:
    print("Authentication failed - check your credentials")
except HttpResponseError as e:
    print(f"Analysis failed: {e.status_code} - {e.message}")

Common error scenarios:

  • 401 Unauthorized: Invalid API key or expired token
  • 403 Forbidden: Insufficient permissions or quota exceeded
  • 429 Too Many Requests: Rate limit exceeded
  • 400 Bad Request: Invalid request parameters or content format
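For 429 responses, retrying with exponential backoff is a common pattern. The sketch below is an assumption-laden illustration, not SDK functionality: `call_with_retry` and `TransientRateLimit` are hypothetical names, and the only contract assumed from the real `HttpResponseError` is its `status_code` attribute.

```python
import time

class TransientRateLimit(Exception):
    """Stand-in for an HttpResponseError whose status_code is 429."""
    status_code = 429

def call_with_retry(func, retries=3, base_delay=1.0, sleep=time.sleep):
    """Retry a zero-argument callable on 429 responses with exponential backoff.

    Example: call_with_retry(lambda: client.analyze_text(request)).
    Exceptions other than 429 (and the final 429) propagate to the caller.
    """
    for attempt in range(retries + 1):
        try:
            return func()
        except Exception as exc:
            if getattr(exc, "status_code", None) != 429 or attempt == retries:
                raise
            sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
```

The `sleep` parameter is injectable so the backoff can be skipped in tests.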

Install with Tessl CLI

npx tessl i tessl/pypi-azure-ai-contentsafety
