tessl install tessl/pypi-modal@1.1.0

Python client library for Modal, a serverless cloud computing platform that enables developers to run Python code in the cloud with on-demand access to compute resources.
Agent Success: 85% (agent success rate when using this tile)
Improvement: 1.6x (agent success rate improvement over baseline when using this tile)
Baseline: 53% (agent success rate without this tile)
Build a distributed log aggregation system that processes log files from multiple sources in parallel and outputs consolidated results in real-time.
Your team needs a system to process large volumes of log files stored across multiple directories. The system should analyze logs for error patterns, generate statistics, and display progress information during processing. Since processing is distributed across multiple remote workers, visibility into what each worker is doing is crucial for debugging and monitoring.
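As a rough local sketch of the per-file analysis step, the worker function might tally lines by severity. The substring-based level matching and the exact return keys shown here are assumptions for illustration, not the required implementation:

```python
import os

def process_log_file(file_path: str) -> dict:
    """Scan one log file and tally lines by severity.

    Uses a simple substring check per line (an assumed heuristic);
    a real implementation might parse a structured log format.
    """
    total_lines = errors = warnings = 0
    with open(file_path, "r", encoding="utf-8") as f:
        for line in f:
            total_lines += 1
            if " ERROR " in line:
                errors += 1
            elif " WARNING " in line:
                warnings += 1
    return {
        "file_name": os.path.basename(file_path),
        "total_lines": total_lines,
        "errors": errors,
        "warnings": warnings,
    }
```

In the distributed system, this function body is what each remote worker would run against its assigned file.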
Create a function process_log_file(file_path: str) -> dict that:
- returns a dict with the keys file_name, total_lines, errors, warnings

Create a function process_log_batch(file_paths: list[str]) -> list[dict] that:
- processes each file with process_log_file

Create a CLI entry point that:
- discovers all .log files in the directory (recursively)

@test "processes single log file correctly"
# test_log_processor.py
from log_aggregator import process_log_file
import tempfile
import os

def test_process_single_file():
    # Create a test log file
    with tempfile.NamedTemporaryFile(mode='w', suffix='.log', delete=False) as f:
        f.write("2024-01-01 10:00:00 INFO Application started\n")
        f.write("2024-01-01 10:01:00 ERROR Database connection failed\n")
        f.write("2024-01-01 10:02:00 WARNING Memory usage high\n")
        f.write("2024-01-01 10:03:00 INFO Processing request\n")
        f.write("2024-01-01 10:04:00 ERROR Invalid user input\n")
        temp_path = f.name
    try:
        result = process_log_file(temp_path)
        assert result['total_lines'] == 5
        assert result['errors'] == 2
        assert result['warnings'] == 1
        assert temp_path.endswith(result['file_name'])
    finally:
        os.unlink(temp_path)

@test "processes multiple files in batch"
# test_log_processor.py
from log_aggregator import process_log_batch
import tempfile
import os
def test_process_batch():
    # Create multiple test log files
    temp_files = []
    for i in range(3):
        with tempfile.NamedTemporaryFile(mode='w', suffix='.log', delete=False) as f:
            f.write(f"2024-01-01 10:00:00 INFO File {i}\n")
            f.write("2024-01-01 10:01:00 ERROR Test error\n")
        temp_files.append(f.name)
    try:
        results = process_log_batch(temp_files)
        assert len(results) == 3
        assert all(r['total_lines'] == 2 for r in results)
        assert all(r['errors'] == 1 for r in results)
    finally:
        for path in temp_files:
            os.unlink(path)

Provides serverless compute infrastructure for distributed processing.
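The batch step is where the fan-out to parallel workers happens. The self-contained sketch below uses a thread pool as a local stand-in for Modal's remote workers (the simplified `process_log_file` body and the substring heuristic are assumptions for illustration); with the Modal client, a function decorated with `@app.function()` would instead be fanned out with its `.map()` method:

```python
import os
from concurrent.futures import ThreadPoolExecutor

def process_log_file(file_path: str) -> dict:
    # Simplified per-file analysis (placeholder for the real worker function).
    total = errors = warnings = 0
    with open(file_path, "r", encoding="utf-8") as f:
        for line in f:
            total += 1
            errors += " ERROR " in line
            warnings += " WARNING " in line
    return {"file_name": os.path.basename(file_path),
            "total_lines": total, "errors": errors, "warnings": warnings}

def process_log_batch(file_paths: list[str]) -> list[dict]:
    # A thread pool mimics parallel fan-out locally; pool.map preserves
    # input order, so results line up with file_paths.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(process_log_file, file_paths))
```

On Modal, the thread pool disappears: each call to the decorated worker function runs in its own remote container, and the batch function just collects the mapped results.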