# Logging
Structured logging with context enrichment and per-module log level control.
## Overview

The PromptKit logging system provides:

- Structured logging: Built on Go's `log/slog` for JSON and text output
- Context enrichment: Automatic field extraction from context (turn ID, provider, scenario)
- Per-module levels: Configure different log levels for different modules
- PII redaction: Automatic redaction of API keys and sensitive data
- Common fields: Add fields that appear in every log entry
## Import Path

```go
import "github.com/AltairaLabs/PromptKit/runtime/logger"
```

## Quick Start

### Basic Logging

```go
import "github.com/AltairaLabs/PromptKit/runtime/logger"

// Log at different levels
logger.Info("Processing request", "user_id", "12345")
logger.Debug("Request details", "method", "POST", "path", "/api/chat")
logger.Warn("Rate limit approaching", "remaining", 10)
logger.Error("Request failed", "error", err)
```

### Context-Aware Logging

```go
// Add context fields that appear in all logs within this context
ctx := logger.WithLoggingContext(ctx, &logger.LoggingFields{
    Scenario:  "customer-support",
    Provider:  "openai",
    SessionID: "sess-123",
})

// Fields automatically included in log output
logger.InfoContext(ctx, "Processing turn")
// Output includes: scenario=customer-support provider=openai session_id=sess-123
```

## Configuration
### LoggingConfig Schema

Configuration follows the K8s-style resource format:

```yaml
apiVersion: promptkit.altairalabs.ai/v1alpha1
kind: LoggingConfig
metadata:
  name: production-logging
spec:
  defaultLevel: info
  format: json
  commonFields:
    service: my-app
    environment: production
  modules:
    - name: runtime.pipeline
      level: debug
    - name: providers
      level: warn
```

### Configuration Fields

| Field | Type | Default | Description |
|---|---|---|---|
| `spec.defaultLevel` | string | `info` | Default log level for all modules |
| `spec.format` | string | `text` | Output format: `json` or `text` |
| `spec.commonFields` | map | `{}` | Fields added to every log entry |
| `spec.modules` | array | `[]` | Per-module log level overrides |
### Log Levels

| Level | Description | Use Case |
|---|---|---|
| `trace` | Most verbose | Detailed debugging, request/response bodies |
| `debug` | Debug information | Development, troubleshooting |
| `info` | Normal operations | Production default |
| `warn` | Warning conditions | Recoverable errors, deprecations |
| `error` | Error conditions | Failures requiring attention |
### Programmatic Configuration

```go
import "github.com/AltairaLabs/PromptKit/runtime/logger"

cfg := &logger.LoggingConfigSpec{
    DefaultLevel: "info",
    Format:       logger.FormatJSON,
    CommonFields: map[string]string{
        "service": "my-app",
    },
    Modules: []logger.ModuleLoggingSpec{
        {Name: "runtime", Level: "debug"},
        {Name: "providers.openai", Level: "warn"},
    },
}

if err := logger.Configure(cfg); err != nil {
    log.Fatal(err)
}
```

### Environment Variable

Set the default log level via environment variable:

```bash
export LOG_LEVEL=debug
```
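If your own entry point needs to honor the same variable before `Configure` runs, the documented `ParseLevel` and `SetLevel` helpers cover it. A minimal sketch, assuming `ParseLevel` returns a `slog.Level` and an error (check the package for the exact signature):

```go
import (
    "log/slog"
    "os"

    "github.com/AltairaLabs/PromptKit/runtime/logger"
)

func applyEnvLogLevel() {
    // Fall back to info when LOG_LEVEL is unset or unparsable.
    level := slog.LevelInfo
    if raw := os.Getenv("LOG_LEVEL"); raw != "" {
        if parsed, err := logger.ParseLevel(raw); err == nil { // assumed (slog.Level, error) return
            level = parsed
        }
    }
    logger.SetLevel(level)
}
```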
## Context Enrichment

### Available Context Keys

| Key | Function | Description |
|---|---|---|
| `turn_id` | `WithTurnID()` | Conversation turn identifier |
| `scenario` | `WithScenario()` | Scenario name |
| `provider` | `WithProvider()` | LLM provider name |
| `session_id` | `WithSessionID()` | Session identifier |
| `model` | `WithModel()` | Model name |
| `stage` | `WithStage()` | Execution stage |
| `component` | `WithComponent()` | Component name |
### Adding Context Fields

```go
// Individual fields
ctx = logger.WithTurnID(ctx, "turn-1")
ctx = logger.WithProvider(ctx, "openai")
ctx = logger.WithModel(ctx, "gpt-4o")

// Multiple fields at once
ctx = logger.WithLoggingContext(ctx, &logger.LoggingFields{
    TurnID:    "turn-1",
    Scenario:  "support-chat",
    Provider:  "openai",
    Model:     "gpt-4o",
    SessionID: "sess-abc123",
    Stage:     "execution",
    Component: "pipeline",
})
```

### Extracting Context Fields

```go
fields := logger.ExtractLoggingFields(ctx)
fmt.Printf("Current turn: %s\n", fields.TurnID)
fmt.Printf("Provider: %s\n", fields.Provider)
```

## Per-Module Log Levels
### Module Names

Module names are derived from package paths within PromptKit:
| Package | Module Name |
|---|---|
| `runtime/pipeline` | `runtime.pipeline` |
| `runtime/logger` | `runtime.logger` |
| `providers/openai` | `providers.openai` |
| `tools/arena/engine` | `tools.arena.engine` |
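These derived names are what module-level overrides refer to. As a sketch, reusing only the `LoggingConfigSpec` fields shown under Programmatic Configuration:

```go
// Sketch: trace the arena engine while keeping the OpenAI provider quiet,
// using module names from the table above.
cfg := &logger.LoggingConfigSpec{
    DefaultLevel: "info",
    Modules: []logger.ModuleLoggingSpec{
        {Name: "tools.arena.engine", Level: "debug"},
        {Name: "providers.openai", Level: "warn"},
    },
}
if err := logger.Configure(cfg); err != nil {
    log.Fatal(err)
}
```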
### Hierarchical Matching

Module levels are matched hierarchically. A module inherits the log level of its parent if not explicitly configured:

```go
mc := logger.NewModuleConfig(slog.LevelInfo)
mc.SetModuleLevel("runtime", slog.LevelWarn)
mc.SetModuleLevel("runtime.pipeline", slog.LevelDebug)

// Results:
// - "runtime"                -> Warn
// - "runtime.pipeline"       -> Debug
// - "runtime.pipeline.stage" -> Debug (inherits from runtime.pipeline)
// - "runtime.streaming"      -> Warn  (inherits from runtime)
// - "providers.openai"       -> Info  (default)
```

### Using Module Config

```go
mc := logger.GetModuleConfig()

// Check current level for a module
level := mc.LevelFor("runtime.pipeline")

// Update levels dynamically
mc.SetModuleLevel("providers", slog.LevelDebug)
mc.SetDefaultLevel(slog.LevelWarn)
```

## PII Redaction
### Automatic Redaction

The logger automatically redacts sensitive data in debug logs:

```go
// API keys are redacted
logger.APIRequest("openai", "POST", url, headers, body)
// Output: sk-1234...[REDACTED] instead of full key
```

### Supported Patterns

| Pattern | Example | Redacted Form |
|---|---|---|
| OpenAI keys | `sk-abc123...` | `sk-a...[REDACTED]` |
| Google keys | `AIzaXYZ...` | `AIza...[REDACTED]` |
| Bearer tokens | `Bearer xyz...` | `Bearer [REDACTED]` |
### Manual Redaction

```go
redacted := logger.RedactSensitiveData(sensitiveString)
```
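A common use is scrubbing a value before attaching it to a log entry. A brief sketch (the `req` variable and the attribute names are illustrative, not part of the logger API):

```go
// Sketch: redact an Authorization header before logging request metadata.
auth := req.Header.Get("Authorization") // req is a hypothetical *http.Request
logger.Debug("Outbound request",
    "url", req.URL.String(),
    "authorization", logger.RedactSensitiveData(auth),
)
```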
## LLM-Specific Logging

### Log LLM Calls

```go
// Log API call
logger.LLMCall("openai", "assistant", 5, 0.7,
    "model", "gpt-4o",
    "stream", true,
)

// Log response
logger.LLMResponse("openai", "assistant", 150, 200, 0.0001,
    "model", "gpt-4o",
    "finish_reason", "stop",
)

// Log error
logger.LLMError("openai", "assistant", err,
    "model", "gpt-4o",
    "status_code", 429,
)
```

### Log Tool Calls

```go
// Log tool request
logger.ToolCall("openai", 5, 3, "auto",
    "model", "gpt-4o",
)

// Log tool response
logger.ToolResponse("openai", 200, 150, 2, 0.00005,
    "tools_executed", []string{"get_weather", "search"},
)
```

### Log API Details

```go
// Debug-level API request logging
logger.APIRequest("openai", "POST", url, headers, body)

// Debug-level API response logging
logger.APIResponse("openai", 200, responseBody, nil)
```

## Output Formats
### Text Format (Default)

Human-readable format for development:

```text
time=2024-01-15T10:30:00Z level=INFO msg="Processing request" turn_id=turn-1 provider=openai
```

### JSON Format

Structured format for log aggregation:

```json
{"time":"2024-01-15T10:30:00Z","level":"INFO","msg":"Processing request","turn_id":"turn-1","provider":"openai"}
```

## Testing
### Capture Log Output

```go
func TestLogging(t *testing.T) {
    var buf bytes.Buffer
    logger.SetOutput(&buf)
    defer logger.SetOutput(nil) // Reset to stderr

    logger.Info("test message", "key", "value")

    output := buf.String()
    if !strings.Contains(output, "test message") {
        t.Error("Expected message in output")
    }
}
```

### Set Test Log Level

```go
func TestVerbose(t *testing.T) {
    logger.SetLevel(slog.LevelDebug)
    defer logger.SetLevel(slog.LevelInfo)

    // Debug messages now visible
    logger.Debug("detailed info")
}
```

## Best Practices
### 1. Use Context for Correlation

```go
// Pass context through your call chain
func HandleRequest(ctx context.Context, req Request) {
    ctx = logger.WithLoggingContext(ctx, &logger.LoggingFields{
        SessionID: req.SessionID,
        Provider:  req.Provider,
    })

    processRequest(ctx, req)
}

func processRequest(ctx context.Context, req Request) {
    // Logs automatically include session_id and provider
    logger.InfoContext(ctx, "Processing")
}
```

### 2. Configure Module Levels for Debugging

```yaml
# Quiet down noisy modules, verbose for the one you're debugging
spec:
  defaultLevel: warn
  modules:
    - name: runtime.pipeline
      level: debug
```

### 3. Use Appropriate Log Levels

```go
// Trace: Very detailed, typically only for development
logger.Debug("Request body", "body", requestJSON) // Use trace-level sparingly

// Debug: Useful for troubleshooting
logger.Debug("Cache miss", "key", cacheKey)

// Info: Normal operations
logger.Info("Request completed", "duration_ms", elapsed)

// Warn: Something unexpected but recoverable
logger.Warn("Retry succeeded", "attempt", 3)

// Error: Something went wrong
logger.Error("Request failed", "error", err)
```

### 4. Include Structured Data

```go
// Good: Structured fields for querying
logger.Info("User action",
    "user_id", userID,
    "action", "login",
    "ip", remoteAddr,
)

// Avoid: Unstructured string formatting
logger.Info(fmt.Sprintf("User %s logged in from %s", userID, remoteAddr))
```

## API Reference
### Package Functions

| Function | Description |
|---|---|
| `Info(msg, args...)` | Log at info level |
| `InfoContext(ctx, msg, args...)` | Log at info level with context |
| `Debug(msg, args...)` | Log at debug level |
| `DebugContext(ctx, msg, args...)` | Log at debug level with context |
| `Warn(msg, args...)` | Log at warn level |
| `WarnContext(ctx, msg, args...)` | Log at warn level with context |
| `Error(msg, args...)` | Log at error level |
| `ErrorContext(ctx, msg, args...)` | Log at error level with context |
| `SetLevel(level)` | Set the global log level |
| `SetVerbose(bool)` | Set debug (true) or info (false) level |
| `SetOutput(writer)` | Set log output destination |
| `Configure(cfg)` | Apply logging configuration |
| `ParseLevel(string)` | Parse string to slog.Level |
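As an example of how `SetVerbose` and `SetOutput` compose, a short sketch wiring a hypothetical `--verbose` CLI flag (the flag itself is illustrative; only the two logger calls come from the table above):

```go
// Sketch: map a --verbose flag onto the logger's global level.
verbose := flag.Bool("verbose", false, "enable debug logging")
flag.Parse()

logger.SetOutput(os.Stderr) // explicit destination; SetOutput(nil) resets to stderr
logger.SetVerbose(*verbose) // true -> debug, false -> info
```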
### Context Functions

| Function | Description |
|---|---|
| `WithTurnID(ctx, id)` | Add turn ID to context |
| `WithScenario(ctx, name)` | Add scenario name to context |
| `WithProvider(ctx, name)` | Add provider name to context |
| `WithSessionID(ctx, id)` | Add session ID to context |
| `WithModel(ctx, name)` | Add model name to context |
| `WithStage(ctx, name)` | Add execution stage to context |
| `WithComponent(ctx, name)` | Add component name to context |
| `WithLoggingContext(ctx, fields)` | Add multiple fields to context |
| `ExtractLoggingFields(ctx)` | Get all logging fields from context |
### LLM Logging Functions

| Function | Description |
|---|---|
| `LLMCall(provider, role, messages, temp, attrs...)` | Log LLM API call |
| `LLMResponse(provider, role, tokensIn, tokensOut, cost, attrs...)` | Log LLM response |
| `LLMError(provider, role, err, attrs...)` | Log LLM error |
| `ToolCall(provider, messages, tools, choice, attrs...)` | Log tool call |
| `ToolResponse(provider, tokensIn, tokensOut, toolCalls, cost, attrs...)` | Log tool response |
| `APIRequest(provider, method, url, headers, body)` | Log API request (debug) |
| `APIResponse(provider, statusCode, body, err)` | Log API response (debug) |
| `RedactSensitiveData(input)` | Redact API keys from string |
## See Also

- Pipeline Reference - Pipeline execution and middleware
- Providers Reference - LLM provider implementations
- Arena Config Reference - Arena configuration including logging