or run:

npx @tessl/cli init
Files

docs/
  caching.md
  chat-completion.md
  grammar.md
  index.md
  llama-model.md
  low-level.md
  server.md
  tokenization.md
  vision.md
tile.json

tile.json

{
  "name": "tessl/pypi-llama-cpp-python",
  "version": "0.3.0",
  "docs": "docs/index.md",
  "describes": "pkg:pypi/llama-cpp-python@0.3.16",
  "summary": "Python bindings for the llama.cpp library providing high-performance LLM inference with OpenAI-compatible APIs."
}
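The manifest is plain JSON, so it can be read with any JSON library when scripting against a tile. A minimal Python sketch follows; the field names ("name", "version", "docs", "describes", "summary") are taken from the example above, not from an official schema, and the `describes` value uses the package-url (purl) format.

```python
import json

# Parse a tile.json manifest like the one shown above.
# Field names are taken from the example; treating them as
# required keys is an assumption, not a documented schema.
manifest_text = """
{
  "name": "tessl/pypi-llama-cpp-python",
  "version": "0.3.0",
  "docs": "docs/index.md",
  "describes": "pkg:pypi/llama-cpp-python@0.3.16",
  "summary": "Python bindings for the llama.cpp library providing high-performance LLM inference with OpenAI-compatible APIs."
}
"""

tile = json.loads(manifest_text)

# "describes" is a purl (package URL): pkg:<type>/<name>@<version>
print(tile["name"])       # tessl/pypi-llama-cpp-python
print(tile["describes"])  # pkg:pypi/llama-cpp-python@0.3.16
```

The entry point for the documentation set is the file named in `docs` (here `docs/index.md`), and `version` refers to the tile itself, while the version suffix in `describes` pins the upstream package release.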