Runtime Tutorials

Progressive learning path for building LLM applications with Runtime.

Follow these tutorials in sequence to build expertise:

Tutorial 1: First Pipeline

  • Create and configure pipelines
  • Connect to LLM providers
  • Execute basic requests
  • Handle responses and costs
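
As a preview of what this first tutorial covers, the sketch below shows the raw request/response cycle against an OpenAI-compatible provider, independent of Runtime's own pipeline API. The model name and the per-token prices used for the cost estimate are placeholder values, not authoritative ones.

```go
// Illustrative request/response cycle against an OpenAI-compatible endpoint.
// Model name and per-token prices are placeholders.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

type message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type chatRequest struct {
	Model    string    `json:"model"`
	Messages []message `json:"messages"`
}

type chatResponse struct {
	Choices []struct {
		Message message `json:"message"`
	} `json:"choices"`
	Usage struct {
		PromptTokens     int `json:"prompt_tokens"`
		CompletionTokens int `json:"completion_tokens"`
	} `json:"usage"`
}

func main() {
	body, _ := json.Marshal(chatRequest{
		Model:    "gpt-4o-mini", // placeholder model name
		Messages: []message{{Role: "user", Content: "Hello!"}},
	})

	req, _ := http.NewRequest("POST", "https://api.openai.com/v1/chat/completions", bytes.NewReader(body))
	req.Header.Set("Authorization", "Bearer "+os.Getenv("OPENAI_API_KEY"))
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var out chatResponse
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	if len(out.Choices) == 0 {
		panic("empty response")
	}
	fmt.Println(out.Choices[0].Message.Content)

	// Rough cost estimate from reported token usage (prices per 1M tokens, illustrative).
	cost := float64(out.Usage.PromptTokens)*0.15/1e6 + float64(out.Usage.CompletionTokens)*0.60/1e6
	fmt.Printf("~$%.6f (%d prompt + %d completion tokens)\n",
		cost, out.Usage.PromptTokens, out.Usage.CompletionTokens)
}
```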

Tutorial 2: Multi-Turn Conversations

  • Manage conversation state
  • Implement Redis persistence
  • Handle context windows
  • Build chatbot interfaces
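
A rough sketch of the persistence pattern this tutorial builds, assuming the github.com/redis/go-redis/v9 client. The key scheme and the 20-message window cap are arbitrary illustrative choices, not Runtime defaults.

```go
// Illustrative conversation store backed by Redis, with a naive
// fixed-size window as a stand-in for real context-window handling.
package main

import (
	"context"
	"encoding/json"

	"github.com/redis/go-redis/v9"
)

type Message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type Store struct{ rdb *redis.Client }

// Append adds a turn and trims the history so it keeps fitting
// in the model's context window.
func (s *Store) Append(ctx context.Context, convID string, m Message) error {
	hist, err := s.Load(ctx, convID)
	if err != nil {
		return err
	}
	hist = append(hist, m)
	if len(hist) > 20 { // arbitrary cap for the sketch
		hist = hist[len(hist)-20:]
	}
	data, err := json.Marshal(hist)
	if err != nil {
		return err
	}
	return s.rdb.Set(ctx, "conv:"+convID, data, 0).Err()
}

// Load returns the stored history, or nil if the conversation is new.
func (s *Store) Load(ctx context.Context, convID string) ([]Message, error) {
	data, err := s.rdb.Get(ctx, "conv:"+convID).Bytes()
	if err == redis.Nil {
		return nil, nil
	}
	if err != nil {
		return nil, err
	}
	var hist []Message
	if err := json.Unmarshal(data, &hist); err != nil {
		return nil, err
	}
	return hist, nil
}

func main() {
	s := &Store{rdb: redis.NewClient(&redis.Options{Addr: "localhost:6379"})}
	_ = s.Append(context.Background(), "demo", Message{Role: "user", Content: "Hi"})
}
```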

Tutorial 3: MCP Integration

  • Set up MCP servers
  • Register external tools
  • Handle tool calls
  • Build tool-enabled agents
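
The MCP server and client wiring itself is what this tutorial walks through; underneath it, the recurring pattern is registering named tools and dispatching the model's tool calls to them. The sketch below shows that pattern only, and every type and function name in it is hypothetical rather than a Runtime or MCP SDK API.

```go
// Illustrative tool registry and dispatch for tool-enabled agents.
// All names are hypothetical; real MCP transport is out of scope here.
package main

import (
	"encoding/json"
	"fmt"
)

// ToolFunc receives the model's JSON arguments and returns a result string.
type ToolFunc func(args json.RawMessage) (string, error)

type Registry struct{ tools map[string]ToolFunc }

func NewRegistry() *Registry { return &Registry{tools: map[string]ToolFunc{}} }

func (r *Registry) Register(name string, fn ToolFunc) { r.tools[name] = fn }

// Dispatch routes a tool call requested by the model to the registered handler.
func (r *Registry) Dispatch(name string, args json.RawMessage) (string, error) {
	fn, ok := r.tools[name]
	if !ok {
		return "", fmt.Errorf("unknown tool %q", name)
	}
	return fn(args)
}

func main() {
	reg := NewRegistry()
	reg.Register("get_weather", func(args json.RawMessage) (string, error) {
		var in struct {
			City string `json:"city"`
		}
		if err := json.Unmarshal(args, &in); err != nil {
			return "", err
		}
		return "Sunny in " + in.City, nil // stubbed result
	})

	// Simulate the model requesting a tool call.
	out, err := reg.Dispatch("get_weather", json.RawMessage(`{"city":"Oslo"}`))
	if err != nil {
		panic(err)
	}
	fmt.Println(out)
}
```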

Tutorial 4: Validation & Guardrails

  • Implement content filters
  • Add custom validators
  • Handle validation errors
  • Build safe applications
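
A minimal sketch of the guardrail pattern this tutorial develops: a chain of validators that inspect model output and surface a typed validation error the caller can act on. The interface and error names here are illustrative, not Runtime's actual API.

```go
// Illustrative validator chain: a content filter plus a custom validator.
package main

import (
	"fmt"
	"strings"
)

type ValidationError struct{ Rule, Reason string }

func (e *ValidationError) Error() string { return e.Rule + ": " + e.Reason }

// Validator inspects model output and rejects it with a ValidationError.
type Validator interface {
	Validate(output string) error
}

// blocklistFilter is a trivial content filter over banned substrings.
type blocklistFilter struct{ banned []string }

func (f blocklistFilter) Validate(output string) error {
	for _, w := range f.banned {
		if strings.Contains(strings.ToLower(output), w) {
			return &ValidationError{Rule: "blocklist", Reason: "contains " + w}
		}
	}
	return nil
}

// maxLength is a custom validator bounding response size.
type maxLength struct{ n int }

func (m maxLength) Validate(output string) error {
	if len(output) > m.n {
		return &ValidationError{Rule: "max_length", Reason: fmt.Sprintf("%d > %d chars", len(output), m.n)}
	}
	return nil
}

// runGuardrails applies validators in order and stops at the first failure,
// so the caller can retry, redact, or refuse.
func runGuardrails(output string, vs ...Validator) error {
	for _, v := range vs {
		if err := v.Validate(output); err != nil {
			return err
		}
	}
	return nil
}

func main() {
	err := runGuardrails("my password is hunter2",
		blocklistFilter{banned: []string{"password"}},
		maxLength{n: 2000},
	)
	fmt.Println(err) // blocklist: contains password
}
```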

Tutorial 5: Production Deployment

  • Error handling strategies
  • Monitoring and logging
  • Cost optimization
  • Scalability patterns
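
One common error-handling strategy in this space is retry with exponential backoff, paired with structured logging of latency and outcomes; the sketch below uses Go 1.21's log/slog for that. The callLLM stub stands in for whatever pipeline execution the earlier tutorials set up and is not a Runtime function.

```go
// Illustrative production wrappers: retry with exponential backoff plus
// structured logging via log/slog (standard library since Go 1.21).
package main

import (
	"context"
	"errors"
	"log/slog"
	"time"
)

// callLLM is a stand-in for executing the pipeline built in earlier tutorials.
func callLLM(ctx context.Context, prompt string) (string, error) {
	return "", errors.New("rate limited") // simulate a transient failure
}

func callWithRetry(ctx context.Context, prompt string, attempts int) (string, error) {
	var lastErr error
	for i := 0; i < attempts; i++ {
		start := time.Now()
		out, err := callLLM(ctx, prompt)
		slog.Info("llm call", "attempt", i+1, "latency", time.Since(start), "err", err)
		if err == nil {
			return out, nil
		}
		lastErr = err
		// Exponential backoff: 200ms, 400ms, 800ms, ...
		select {
		case <-time.After(time.Duration(200<<i) * time.Millisecond):
		case <-ctx.Done():
			return "", ctx.Err()
		}
	}
	return "", lastErr
}

func main() {
	_, err := callWithRetry(context.Background(), "Hello", 3)
	slog.Error("gave up", "err", err)
}
```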

Tutorial 6: Advanced Patterns

  • Multi-provider fallback
  • Streaming optimization
  • Custom middleware
  • Performance tuning
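
Multi-provider fallback boils down to trying providers in priority order and moving on when one fails. Here is a sketch of that loop using a hypothetical Provider interface rather than Runtime's actual types.

```go
// Illustrative multi-provider fallback: return the first successful
// completion, or the joined errors if every provider fails.
package main

import (
	"context"
	"errors"
	"fmt"
)

type Provider interface {
	Name() string
	Complete(ctx context.Context, prompt string) (string, error)
}

func fallbackComplete(ctx context.Context, prompt string, providers ...Provider) (string, error) {
	var errs []error
	for _, p := range providers {
		out, err := p.Complete(ctx, prompt)
		if err == nil {
			return out, nil
		}
		errs = append(errs, fmt.Errorf("%s: %w", p.Name(), err))
	}
	return "", errors.Join(errs...)
}

// stub simulates a provider that either fails or answers.
type stub struct {
	name string
	err  error
}

func (s stub) Name() string { return s.name }
func (s stub) Complete(ctx context.Context, prompt string) (string, error) {
	if s.err != nil {
		return "", s.err
	}
	return s.name + " says hi", nil
}

func main() {
	out, err := fallbackComplete(context.Background(), "Hello",
		stub{name: "openai", err: errors.New("timeout")},
		stub{name: "claude"}, // second provider succeeds
	)
	fmt.Println(out, err)
}
```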

Prerequisites

  • Go 1.21 or higher
  • Basic Go programming knowledge
  • LLM API keys (OpenAI, Claude, or Gemini)
  • Terminal/command line familiarity

Learning Paths

  • Quick Start: Tutorial 1 only (15 min)
  • Core Skills: Tutorials 1-3 (65 min)
  • Production Ready: Tutorials 1-5 (125 min)
  • Complete Path: All tutorials (155 min)

Start with First Pipeline to build your first LLM application.