or run

npx @tessl/cli init

Files

docs/
  async-inference.md
  chat-completions.md
  configuration.md
  index.md
  parameters-types.md
  text-classification.md
  text-embeddings.md
  text-generation.md
  text-scoring.md
tile.json

tile.json

{
  "name": "tessl/pypi-vllm",
  "version": "0.10.0",
  "docs": "docs/index.md",
  "describes": "pkg:pypi/vllm@0.10.2",
  "summary": "A high-throughput and memory-efficient inference and serving engine for LLMs"
}
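
The tile's `describes` field points at the vllm package on PyPI, and the docs it ships cover text generation, embeddings, chat completions, and related topics. As a rough illustration of the kind of usage those docs describe, here is a minimal offline text-generation sketch using vLLM's Python API; the model name and sampling values are assumptions for the example, not something this tile prescribes.

```python
# Minimal sketch of offline text generation with vLLM's Python API.
# The model id and sampling settings below are illustrative assumptions.
from vllm import LLM, SamplingParams

prompts = ["The capital of France is"]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=32)

llm = LLM(model="facebook/opt-125m")  # any Hugging Face-compatible model id
outputs = llm.generate(prompts, sampling_params)

for output in outputs:
    print(output.prompt, output.outputs[0].text)
```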