tessl/pypi-modal

tessl install tessl/pypi-modal@1.1.0

Workspace: tessl
Visibility: Public
Describes: pypipkg:pypi/modal@1.1.x (tile.json)

Python client library for Modal, a serverless cloud computing platform that enables developers to run Python code in the cloud with on-demand access to compute resources.

Agent Success: 85% (agent success rate when using this tile)
Improvement: 1.6x (success rate improvement over the baseline)
Baseline: 53% (agent success rate without this tile)

evals/scenario-10/task.md

Real-time Log Aggregator

Build a real-time log aggregation system that runs multiple sandbox processes in parallel and streams their outputs to a centralized log file, processing each log line as it becomes available.

Requirements

Process Management

Create a function that spawns multiple sandbox instances, each running a command that generates continuous output (such as a monitoring script or a long-running process). The system should handle at least 3 concurrent sandboxes.
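
A minimal sketch of this spawning step, assuming Modal's modal.App.lookup and modal.Sandbox.create APIs; the app name and the sample commands are illustrative placeholders:

import modal

# Hypothetical app name; any app the workspace can look up or create works.
app = modal.App.lookup("log-aggregator", create_if_missing=True)

commands = [
    'for i in 1 2 3; do echo "tick $i"; sleep 1; done',
    'echo start; sleep 2; echo done',
    'echo -e "line1\\nline2\\nline3"',
]

# One sandbox per command, each command run under bash -c.
sandboxes = [modal.Sandbox.create("bash", "-c", cmd, app=app) for cmd in commands]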

Stream Processing

Implement real-time streaming that (see the sketch after this list):

  • Reads output from each sandbox process as it becomes available
  • Prefixes each log line with the sandbox identifier and timestamp
  • Writes the processed log lines to an output file progressively
  • Continues until all processes complete
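
A sketch of that streaming loop, assuming each sandbox's stdout can be iterated asynchronously line by line (Modal's stream readers support iteration; swap in the reading API the tile documents if it differs). The helper names pump and pump_all are illustrative:

import asyncio
from datetime import datetime

async def pump(sandbox_id: str, sandbox, out_file) -> None:
    # Read lines as the sandbox produces them and write each one immediately.
    async for raw in sandbox.stdout:
        stamp = datetime.now().isoformat(timespec="seconds")
        out_file.write(f"[sandbox-{sandbox_id}] [{stamp}] {raw.rstrip()}\n")
        out_file.flush()  # progressive writes, nothing held back until completion

async def pump_all(sandboxes, output_path: str) -> None:
    # One reader task per sandbox; gather keeps them running concurrently
    # until every process has finished emitting output.
    with open(output_path, "w") as out:
        await asyncio.gather(*(pump(str(i), sb, out) for i, sb in enumerate(sandboxes)))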

Output Format

Each log line in the aggregated output should follow this format (a small formatting helper is sketched below):

[sandbox-<id>] [<timestamp>] <original-log-line>

Where:

  • <id> is a unique identifier for each sandbox
  • <timestamp> is in ISO 8601 format (e.g., 2024-01-15T10:30:45)
  • <original-log-line> is the actual output from the process

Error Handling

The system should (see the sketch after this list):

  • Handle process failures gracefully
  • Continue aggregating logs from remaining processes if one fails
  • Include error messages in the aggregated log with an [ERROR] prefix
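
One way to meet these requirements is to give every sandbox its own reader task, so a failure in one cannot interrupt the others, and to translate exceptions and non-zero exit codes into [ERROR] lines. The wait/returncode usage is an assumption about Modal's Sandbox API; treat this as a sketch:

from datetime import datetime

async def pump_safely(sandbox_id: str, sandbox, out_file) -> None:
    def stamp() -> str:
        return datetime.now().isoformat(timespec="seconds")

    try:
        async for raw in sandbox.stdout:
            out_file.write(f"[sandbox-{sandbox_id}] [{stamp()}] {raw.rstrip()}\n")
            out_file.flush()
        # Assumption: Modal's .aio async call variant; fall back to wait() in sync code.
        await sandbox.wait.aio()
        if sandbox.returncode not in (0, None):
            out_file.write(f"[sandbox-{sandbox_id}] [{stamp()}] [ERROR] exited with code {sandbox.returncode}\n")
    except Exception as exc:
        # Log the failure and return; the other reader tasks keep running.
        out_file.write(f"[sandbox-{sandbox_id}] [{stamp()}] [ERROR] {exc}\n")
        out_file.flush()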

Test Cases

  • When three sandboxes each run echo -e "line1\nline2\nline3", the output file contains exactly 9 lines with proper prefixes @test
  • When a sandbox process fails mid-execution, the aggregator continues processing other sandboxes @test
  • When processes complete at different times, all output is captured before the aggregator exits @test
  • Log lines are written to the output file as they arrive, not buffered until completion @test
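
A pytest sketch of the first test case above, assuming pytest-asyncio is installed and that the generated implementation is importable from a module named aggregator (the module name is a placeholder):

import pytest

@pytest.mark.asyncio
async def test_three_sandboxes_produce_nine_prefixed_lines(tmp_path):
    from aggregator import aggregate_logs  # placeholder import path

    out = tmp_path / "aggregated.log"
    cmd = 'echo -e "line1\\nline2\\nline3"'
    await aggregate_logs([cmd, cmd, cmd], str(out))

    lines = out.read_text().splitlines()
    assert len(lines) == 9
    assert all(line.startswith("[sandbox-") for line in lines)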

Implementation

@generates

API

async def aggregate_logs(commands: list[str], output_path: str) -> None:
    """
    Spawns multiple sandbox processes and aggregates their output streams.

    Args:
        commands: List of shell commands to execute in separate sandboxes
        output_path: Path to the output file for aggregated logs

    The function processes output streams in real-time, writing formatted
    log lines to the output file as they become available from each sandbox.
    """
    pass
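
A possible invocation of the API above, once implemented; the command strings and output path are illustrative:

import asyncio

commands = [
    'echo -e "line1\\nline2\\nline3"',
    'for i in 1 2 3; do date; sleep 1; done',
    'echo "failing job"; exit 1',  # one failure should not stop the other streams
]
asyncio.run(aggregate_logs(commands, "aggregated.log"))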

Dependencies { .dependencies }

modal { .dependency }

Provides serverless sandbox execution and stream processing capabilities.