Customer Support Example

This example demonstrates how to use PromptKit Arena to test a customer support chatbot across multiple LLM providers.

Scenario

A customer support bot helps users with tasks such as checking the status of their orders (via the get_order_status tool used in the scenarios below).
Files

config.arena.yaml – the arena configuration referenced by the commands below.
Running the Example

cd examples/customer-support
promptarena run -c config.arena.yaml

Running with Streaming Output

To see streaming tokens in real time (useful for the streaming demos):

promptarena run -c config.arena.yaml --streaming

Running a Specific Scenario

To run just the streaming-with-tools demo:

promptarena run -c config.arena.yaml --scenario streaming-tools-demo

Scenarios

Streaming with Tools Demo

The streaming-tools-demo scenario demonstrates the PredictStreamWithTools feature where the LLM can stream tokens back while also invoking tools:

  1. Turn 1: User asks about order status → LLM calls get_order_status tool while streaming
  2. Turn 2: Follow-up question → LLM uses previous tool result in streamed response
  3. Turn 3: Another order check → LLM calls the tool again
  4. Turn 4: General question → Streamed response without tool call
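Under the assumption that scenarios and their per-turn user messages live in config.arena.yaml (the real schema is not shown here, so every field name below is illustrative, not the actual PromptKit Arena format), the four turns might be sketched as:

```yaml
# Illustrative sketch only - field names are assumptions, not the real schema.
scenarios:
  - id: streaming-tools-demo
    turns:
      - user: "Where is my order A-1001?"     # expect a get_order_status call
      - user: "When will it arrive?"          # should reuse the tool result
      - user: "And what about order B-2002?"  # second get_order_status call
      - user: "What is your return policy?"   # plain streamed answer, no tool
```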

This demonstrates that streaming output and tool invocation can be interleaved within a single conversation.

Expected Outcomes

The test evaluates: