```
tessl install tessl/pypi-vllm@0.10.0
```

A high-throughput and memory-efficient inference and serving engine for LLMs.
| Metric | Value | Description |
| --- | --- | --- |
| Agent Success | 69% | Agent success rate when using this tile |
| Improvement | 1.33x | Agent success rate improvement compared to the baseline |
| Baseline | 52% | Agent success rate without this tile |
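For orientation, vLLM's core offline-inference API is small: construct an `LLM`, describe sampling with `SamplingParams`, and call `generate()`. The snippet below is a minimal sketch; the model name, prompt, and sampling values are illustrative placeholders, not part of this tile.

```python
from vllm import LLM, SamplingParams

# Load a model once; vLLM manages GPU memory and batching internally.
llm = LLM(model="facebook/opt-125m")

# Sampling settings: temperature controls randomness, max_tokens caps output length.
params = SamplingParams(temperature=0.7, max_tokens=100)

# generate() takes a list of prompts and returns one RequestOutput per prompt.
outputs = llm.generate(["Once upon a time"], params)
print(outputs[0].outputs[0].text)
```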
A Python application that generates creative story continuations with configurable parameters using a language model.
```python
@generates
class StoryGenerator:
    """
    A service for generating story continuations using an LLM.
    """

    def __init__(self, model_name: str = "facebook/opt-125m"):
        """
        Initialize the story generator with a specified model.

        Args:
            model_name: The name or path of the language model to use
        """
        pass

    def generate(
        self,
        prompt: str,
        max_tokens: int = 100,
        temperature: float = 0.7,
        num_variations: int = 1
    ) -> list[str]:
        """
        Generate story continuation(s) based on the given prompt.

        Args:
            prompt: The starting text for the story
            max_tokens: Maximum number of tokens to generate
            temperature: Controls randomness (0.0 = deterministic, 1.0 = very random)
            num_variations: Number of different continuations to generate

        Returns:
            A list of generated story continuations

        Raises:
            ValueError: If prompt is empty
        """
        pass
```

Provides high-performance inference and text generation capabilities.
vllm>=0.6.0
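One way to satisfy the spec above is to wrap vLLM directly. The sketch below is an assumed implementation, not the generated code; it relies on `SamplingParams(n=...)` to request several sampled continuations of the same prompt in a single call.

```python
from vllm import LLM, SamplingParams


class StoryGenerator:
    """Sketch of a vLLM-backed story generator (assumed implementation)."""

    def __init__(self, model_name: str = "facebook/opt-125m"):
        # The model is loaded once and reused across generate() calls.
        self.llm = LLM(model=model_name)

    def generate(
        self,
        prompt: str,
        max_tokens: int = 100,
        temperature: float = 0.7,
        num_variations: int = 1,
    ) -> list[str]:
        if not prompt:
            raise ValueError("prompt must not be empty")
        # n asks vLLM for multiple sampled continuations of the same prompt.
        params = SamplingParams(
            n=num_variations, temperature=temperature, max_tokens=max_tokens
        )
        request_output = self.llm.generate([prompt], params)[0]
        return [completion.text for completion in request_output.outputs]
```

For example, `StoryGenerator().generate("Once upon a time", num_variations=3)` returns three different continuations of the prompt.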