tessl install tessl/pypi-posthog@6.7.0
Integrate PostHog into any Python application.
Agent Success: 89% (agent success rate when using this tile)
Improvement: 1.03x (agent success rate improvement when using this tile compared to baseline)
Baseline: 86% (agent success rate without this tile)
Build a simple AI chat application that makes OpenAI API calls and automatically tracks usage analytics, including for streaming responses.
Create a chat application that:
- Makes streaming chat completion requests - Send prompts to OpenAI and receive streaming responses, where tokens arrive incrementally
- Tracks all interactions - Automatically capture an analytics event for each chat completion
- Returns usable responses - Handle the streaming response so the complete text can be returned to the user
The analytics tracking should work seamlessly with streaming responses, accumulating chunks and recording metrics without requiring manual implementation.
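For context, manually accumulating a streamed completion with the official openai package typically looks like the sketch below (the model name and prompt are illustrative assumptions); the spec that follows asks for this handling plus automatic analytics capture.

# Sketch: manual accumulation of a streamed chat completion using the
# official openai package. Model and prompt are illustrative assumptions.
from openai import OpenAI

client = OpenAI(api_key="sk-...")

stream = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; any chat model works
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,
)

# Each chunk carries an incremental delta; join the deltas for the full reply.
reply = "".join(chunk.choices[0].delta.content or "" for chunk in stream)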
Implement the following test scenarios:
@generates
"""
AI Chat Assistant with Analytics
"""

def setup_ai_client(openai_api_key: str, analytics_api_key: str):
    """
    Initialize the AI client with analytics tracking enabled.

    Args:
        openai_api_key: API key for OpenAI
        analytics_api_key: API key for analytics platform

    Returns:
        Configured AI client ready for chat completions
    """
    pass

def send_chat_message(client, user_id: str, message: str, stream: bool = True) -> str:
    """
    Send a chat message and get a response.

    Args:
        client: Configured AI client
        user_id: User identifier for analytics
        message: The chat message to send
        stream: Whether to use streaming responses

    Returns:
        The complete response text from the AI
    """
    pass

Provides analytics and event tracking capabilities, including AI observability features for tracking LLM usage.
@satisfied-by
Provides OpenAI API client for chat completions.
@satisfied-by
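One way the two stub functions above might be implemented with this tile is sketched below, assuming the PostHog Python SDK's OpenAI wrapper (posthog.ai.openai.OpenAI), which captures an analytics event per completion and handles streamed responses; the exact wrapper surface, model name, and PostHog host are assumptions to verify against the installed posthog version.

# Sketch only: assumes PostHog's AI observability wrapper for OpenAI.
from posthog import Posthog
from posthog.ai.openai import OpenAI  # PostHog-instrumented OpenAI client


def setup_ai_client(openai_api_key: str, analytics_api_key: str):
    """Initialize the AI client with analytics tracking enabled."""
    # Host is an assumption; use your PostHog instance's ingestion host.
    posthog = Posthog(analytics_api_key, host="https://us.i.posthog.com")
    return OpenAI(api_key=openai_api_key, posthog_client=posthog)


def send_chat_message(client, user_id: str, message: str, stream: bool = True) -> str:
    """Send a chat message and return the complete response text."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model
        messages=[{"role": "user", "content": message}],
        stream=stream,
        posthog_distinct_id=user_id,  # ties the captured event to the user
    )
    if stream:
        # Accumulate streamed deltas; the wrapper records the event and usage
        # once the stream is exhausted.
        return "".join(chunk.choices[0].delta.content or "" for chunk in response)
    return response.choices[0].message.content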