# Customer Support Example
This example demonstrates how to use PromptKit Arena to test a customer support chatbot across multiple LLM providers.
## Scenario
A customer support bot helps users with:
- Product questions
- Order tracking
- Returns and refunds
- Technical troubleshooting
## Files

- `config.arena.yaml` - Main configuration
- `prompts/support-bot.yaml` - System prompt for the support bot
- `scenarios/support-conversations.scenario.yaml` - Test conversation scenarios
- `scenarios/streaming-demo.scenario.yaml` - Demonstrates streaming configuration
- `scenarios/streaming-tools-demo.scenario.yaml` - Demonstrates streaming with tool calls
- `tools/example-tool.tool.yaml` - Mock tool for order status lookup
- `providers/` - LLM provider configurations
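As a rough sketch of how these files fit together, a minimal `config.arena.yaml` might look like the following. This is a hedged illustration only: the key names and the provider file names are assumptions, not the tool's documented schema.

```yaml
# Hypothetical sketch - key names and provider file names are assumptions,
# not the documented config.arena.yaml schema.
prompts:
  - prompts/support-bot.yaml
scenarios:
  - scenarios/support-conversations.scenario.yaml
  - scenarios/streaming-demo.scenario.yaml
  - scenarios/streaming-tools-demo.scenario.yaml
tools:
  - tools/example-tool.tool.yaml
providers:
  - providers/openai.yaml
  - providers/anthropic.yaml
```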
## Running the Example

```bash
cd examples/customer-support
promptarena run -c config.arena.yaml
```
## Running with Streaming Output

To see streaming tokens in real time (useful for the streaming demos):

```bash
promptarena run -c config.arena.yaml --streaming
```
## Running a Specific Scenario

To run just the streaming-with-tools demo:

```bash
promptarena run -c config.arena.yaml --scenario streaming-tools-demo
```
## Scenarios

### Streaming with Tools Demo
The `streaming-tools-demo` scenario demonstrates the `PredictStreamWithTools` feature, where the LLM can stream tokens back while also invoking tools:

- Turn 1: User asks about order status → LLM calls the `get_order_status` tool while streaming
- Turn 2: Follow-up question → LLM uses the previous tool result in its streamed response
- Turn 3: Another order check → LLM calls the tool again
- Turn 4: General question → Streamed response without a tool call
This showcases how streaming and tool calls work together seamlessly.
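The four turns could be expressed in the scenario file roughly along these lines. This is a sketch under assumed field names, not the documented scenario schema; the user messages are invented for illustration.

```yaml
# Hypothetical sketch of scenarios/streaming-tools-demo.scenario.yaml -
# field names and messages are assumptions, not the real schema.
name: streaming-tools-demo
streaming: true
turns:
  - user: "Where is my order?"                  # expect a get_order_status tool call
  - user: "When will it arrive?"                # expect the prior tool result to be used
  - user: "Can you check my other order too?"   # expect another tool call
  - user: "What is your returns policy?"        # expect a plain streamed answer
```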
## Expected Outcomes
The test evaluates:
- Tone consistency (professional, helpful, empathetic)
- Accurate information retrieval
- Tool invocation when needed (order status lookups)
- Appropriate escalation handling
- Response quality across providers
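For the order-status checks above, the mock tool could be defined along these lines. This is a hedged sketch with assumed field names; consult `tools/example-tool.tool.yaml` for the actual definition.

```yaml
# Hypothetical sketch of a mock tool definition - field names and the
# canned response are assumptions, not the real tool schema.
name: get_order_status
description: Look up the current status of a customer's order
parameters:
  type: object
  properties:
    order_id:
      type: string
      description: The customer's order number
  required: [order_id]
mock_response:
  status: in_transit
```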