TypeScript client SDK for the LangSmith LLM tracing, evaluation, and monitoring platform.
Comprehensive guides for using LangSmith effectively, from setup to production deployment.
This section provides step-by-step guides for all major LangSmith features, best practices, and common workflows.
Next Steps:
- Explore Evaluation or Workflows
- Deploy with Production Workflows
- Explore Advanced Features
Learn how to set up tracing, run evaluations, integrate with test frameworks, and deploy to production.
Quick Start:
npm install langsmith
export LANGCHAIN_API_KEY=your_api_key
export LANGCHAIN_PROJECT=my-project
Related: Client Configuration • Utilities
Code snippets for:
Use When: You need a fast code example without explanation
Related: Tracing • API Reference
Complete tracing documentation:
traceable() wrapper - Automatic function tracing
getCurrentRunTree() - Access current trace context
Key Sections:
Related: Run Trees • Integrations • API Runs
Dataset-based evaluation:
Key Sections:
Related: Comparative Evaluation • Datasets API • Testing
Compare multiple experiments:
Use When: Comparing different models, prompts, or implementations
Related: Evaluation • Workflows
Integration with test frameworks:
Specific Integrations:
Related: Evaluation
Production-ready patterns:
Key Workflows:
Related: Evaluation • API Reference
→ Start with Tracing Quick Start → See examples in Quick Reference
→ Follow Evaluation Guide → Create dataset with Datasets API
→ Use Comparative Evaluation → See A/B Testing Workflow
→ Check SDK Wrappers → See OpenAI Wrapper or Anthropic Wrapper
→ Follow Production Workflow → Configure Client for Production
→ Use Privacy Features → See Data Anonymization
→ Check Testing Overview → Follow Jest or Vitest guide