
Logging

Structured logging with context enrichment and per-module log level control.

The PromptKit logging system provides:

  • Structured logging: Built on Go’s log/slog for JSON and text output
  • Context enrichment: Automatic field extraction from context (turn ID, provider, scenario)
  • Per-module levels: Configure different log levels for different modules
  • PII redaction: Automatic redaction of API keys and sensitive data
  • Common fields: Add fields that appear in every log entry
import "github.com/AltairaLabs/PromptKit/runtime/logger"
import "github.com/AltairaLabs/PromptKit/runtime/logger"
// Log at different levels
logger.Info("Processing request", "user_id", "12345")
logger.Debug("Request details", "method", "POST", "path", "/api/chat")
logger.Warn("Rate limit approaching", "remaining", 10)
logger.Error("Request failed", "error", err)

// Add context fields that appear in all logs within this context
ctx := logger.WithLoggingContext(ctx, &logger.LoggingFields{
    Scenario:  "customer-support",
    Provider:  "openai",
    SessionID: "sess-123",
})

// Fields automatically included in log output
logger.InfoContext(ctx, "Processing turn")
// Output includes: scenario=customer-support provider=openai session_id=sess-123

Configuration follows the K8s-style resource format:

apiVersion: promptkit.altairalabs.ai/v1alpha1
kind: LoggingConfig
metadata:
  name: production-logging
spec:
  defaultLevel: info
  format: json
  commonFields:
    service: my-app
    environment: production
  modules:
    - name: runtime.pipeline
      level: debug
    - name: providers
      level: warn

Configuration fields:

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| spec.defaultLevel | string | info | Default log level for all modules |
| spec.format | string | text | Output format: json or text |
| spec.commonFields | map | {} | Fields added to every log entry |
| spec.modules | array | [] | Per-module log level overrides |

Available log levels:

| Level | Description | Use Case |
| --- | --- | --- |
| trace | Most verbose | Detailed debugging, request/response bodies |
| debug | Debug information | Development, troubleshooting |
| info | Normal operations | Production default |
| warn | Warning conditions | Recoverable errors, deprecations |
| error | Error conditions | Failures requiring attention |
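
For a quick toggle between the two most common levels during development, the SetVerbose helper (listed in the API reference below) flips between debug and info; a minimal sketch:

logger.SetVerbose(true)  // switch to debug-level output while diagnosing an issue
// ... do the work you are investigating ...
logger.SetVerbose(false) // restore the default info level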
import "github.com/AltairaLabs/PromptKit/runtime/logger"
cfg := &logger.LoggingConfigSpec{
DefaultLevel: "info",
Format: logger.FormatJSON,
CommonFields: map[string]string{
"service": "my-app",
},
Modules: []logger.ModuleLoggingSpec{
{Name: "runtime", Level: "debug"},
{Name: "providers.openai", Level: "warn"},
},
}
if err := logger.Configure(cfg); err != nil {
log.Fatal(err)
}

Set the default log level with the LOG_LEVEL environment variable:

export LOG_LEVEL=debug
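
If you need to apply the variable yourself, for example in a binary that builds its own logging configuration, a minimal sketch using the documented ParseLevel and SetLevel helpers (the helper name and the error return on ParseLevel are assumptions; check the package for the exact signature):

import (
    "log"
    "os"

    "github.com/AltairaLabs/PromptKit/runtime/logger"
)

// applyLogLevelFromEnv is a hypothetical helper for this example.
func applyLogLevelFromEnv() {
    raw := os.Getenv("LOG_LEVEL")
    if raw == "" {
        return // keep the configured default
    }
    // Assumed signature: ParseLevel(string) (slog.Level, error)
    level, err := logger.ParseLevel(raw)
    if err != nil {
        log.Fatalf("invalid LOG_LEVEL %q: %v", raw, err)
    }
    logger.SetLevel(level)
}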

The following fields can be attached to a context and are included automatically in log output:

| Key | Function | Description |
| --- | --- | --- |
| turn_id | WithTurnID() | Conversation turn identifier |
| scenario | WithScenario() | Scenario name |
| provider | WithProvider() | LLM provider name |
| session_id | WithSessionID() | Session identifier |
| model | WithModel() | Model name |
| stage | WithStage() | Execution stage |
| component | WithComponent() | Component name |

// Individual fields
ctx = logger.WithTurnID(ctx, "turn-1")
ctx = logger.WithProvider(ctx, "openai")
ctx = logger.WithModel(ctx, "gpt-4o")

// Multiple fields at once
ctx = logger.WithLoggingContext(ctx, &logger.LoggingFields{
    TurnID:    "turn-1",
    Scenario:  "support-chat",
    Provider:  "openai",
    Model:     "gpt-4o",
    SessionID: "sess-abc123",
    Stage:     "execution",
    Component: "pipeline",
})

Retrieve the fields from a context with ExtractLoggingFields:

fields := logger.ExtractLoggingFields(ctx)
fmt.Printf("Current turn: %s\n", fields.TurnID)
fmt.Printf("Provider: %s\n", fields.Provider)

Module names are derived from package paths within PromptKit:

| Package | Module Name |
| --- | --- |
| runtime/pipeline | runtime.pipeline |
| runtime/logger | runtime.logger |
| providers/openai | providers.openai |
| tools/arena/engine | tools.arena.engine |

Module levels are matched hierarchically. A module inherits the log level of its parent if not explicitly configured:

mc := logger.NewModuleConfig(slog.LevelInfo)
mc.SetModuleLevel("runtime", slog.LevelWarn)
mc.SetModuleLevel("runtime.pipeline", slog.LevelDebug)
// Results:
// - "runtime" -> Warn
// - "runtime.pipeline" -> Debug
// - "runtime.pipeline.stage" -> Debug (inherits from runtime.pipeline)
// - "runtime.streaming" -> Warn (inherits from runtime)
// - "providers.openai" -> Info (default)

You can also inspect and adjust the global module configuration at runtime:

mc := logger.GetModuleConfig()

// Check the current level for a module
level := mc.LevelFor("runtime.pipeline")

// Update levels dynamically
mc.SetModuleLevel("providers", slog.LevelDebug)
mc.SetDefaultLevel(slog.LevelWarn)

The logger automatically redacts sensitive data in debug logs:

// API keys are redacted
logger.APIRequest("openai", "POST", url, headers, body)
// Output: sk-1234...[REDACTED] instead of full key

Recognized patterns:

| Pattern | Example | Redacted Form |
| --- | --- | --- |
| OpenAI keys | sk-abc123... | sk-a...[REDACTED] |
| Google keys | AIzaXYZ... | AIza...[REDACTED] |
| Bearer tokens | Bearer xyz... | Bearer [REDACTED] |

Strings can also be redacted directly:

redacted := logger.RedactSensitiveData(sensitiveString)
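
A small illustrative sketch: redact credentials before they reach a debug log. The apiKey and url variables here are placeholders, not part of the package:

authHeader := "Bearer " + apiKey // apiKey is a placeholder for your real credential
logger.Debug("outgoing request",
    "url", url,
    "authorization", logger.RedactSensitiveData(authHeader),
)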

Dedicated helpers log LLM and tool interactions with consistent fields:

// Log an LLM API call
logger.LLMCall("openai", "assistant", 5, 0.7,
    "model", "gpt-4o",
    "stream", true,
)

// Log an LLM response
logger.LLMResponse("openai", "assistant", 150, 200, 0.0001,
    "model", "gpt-4o",
    "finish_reason", "stop",
)

// Log an LLM error
logger.LLMError("openai", "assistant", err,
    "model", "gpt-4o",
    "status_code", 429,
)

// Log a tool request
logger.ToolCall("openai", 5, 3, "auto",
    "model", "gpt-4o",
)

// Log a tool response
logger.ToolResponse("openai", 200, 150, 2, 0.00005,
    "tools_executed", []string{"get_weather", "search"},
)

// Debug-level API request logging
logger.APIRequest("openai", "POST", url, headers, body)

// Debug-level API response logging
logger.APIResponse("openai", 200, responseBody, nil)

Human-readable format for development:

time=2024-01-15T10:30:00Z level=INFO msg="Processing request" turn_id=turn-1 provider=openai

Structured format for log aggregation:

{"time":"2024-01-15T10:30:00Z","level":"INFO","msg":"Processing request","turn_id":"turn-1","provider":"openai"}

Capture log output in tests with SetOutput:

func TestLogging(t *testing.T) {
    var buf bytes.Buffer
    logger.SetOutput(&buf)
    defer logger.SetOutput(nil) // Reset to stderr

    logger.Info("test message", "key", "value")

    output := buf.String()
    if !strings.Contains(output, "test message") {
        t.Error("Expected message in output")
    }
}

func TestVerbose(t *testing.T) {
    logger.SetLevel(slog.LevelDebug)
    defer logger.SetLevel(slog.LevelInfo)

    // Debug messages now visible
    logger.Debug("detailed info")
}

// Pass context through your call chain
func HandleRequest(ctx context.Context, req Request) {
    ctx = logger.WithLoggingContext(ctx, &logger.LoggingFields{
        SessionID: req.SessionID,
        Provider:  req.Provider,
    })
    processRequest(ctx, req)
}

func processRequest(ctx context.Context, req Request) {
    // Logs automatically include session_id and provider
    logger.InfoContext(ctx, "Processing")
}

# Quiet down noisy modules, verbose for the one you're debugging
spec:
  defaultLevel: warn
  modules:
    - name: runtime.pipeline
      level: debug
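
The same effect can be applied at runtime through the module configuration API shown earlier; a brief sketch:

mc := logger.GetModuleConfig()
mc.SetDefaultLevel(slog.LevelWarn)                     // quiet by default
mc.SetModuleLevel("runtime.pipeline", slog.LevelDebug) // verbose where it matters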

Match the log level to the event being recorded:

// Trace: very detailed output, typically only for development
logger.Debug("Request body", "body", requestJSON) // use trace-level detail sparingly

// Debug: useful for troubleshooting
logger.Debug("Cache miss", "key", cacheKey)

// Info: normal operations
logger.Info("Request completed", "duration_ms", elapsed)

// Warn: something unexpected but recoverable
logger.Warn("Retry succeeded", "attempt", 3)

// Error: something went wrong
logger.Error("Request failed", "error", err)

// Good: structured fields for querying
logger.Info("User action",
    "user_id", userID,
    "action", "login",
    "ip", remoteAddr,
)

// Avoid: unstructured string formatting
logger.Info(fmt.Sprintf("User %s logged in from %s", userID, remoteAddr))

Core logging functions:

| Function | Description |
| --- | --- |
| Info(msg, args...) | Log at info level |
| InfoContext(ctx, msg, args...) | Log at info level with context |
| Debug(msg, args...) | Log at debug level |
| DebugContext(ctx, msg, args...) | Log at debug level with context |
| Warn(msg, args...) | Log at warn level |
| WarnContext(ctx, msg, args...) | Log at warn level with context |
| Error(msg, args...) | Log at error level |
| ErrorContext(ctx, msg, args...) | Log at error level with context |
| SetLevel(level) | Set the global log level |
| SetVerbose(bool) | Set debug (true) or info (false) level |
| SetOutput(writer) | Set log output destination |
| Configure(cfg) | Apply logging configuration |
| ParseLevel(string) | Parse string to slog.Level |

Context functions:

| Function | Description |
| --- | --- |
| WithTurnID(ctx, id) | Add turn ID to context |
| WithScenario(ctx, name) | Add scenario name to context |
| WithProvider(ctx, name) | Add provider name to context |
| WithSessionID(ctx, id) | Add session ID to context |
| WithModel(ctx, name) | Add model name to context |
| WithStage(ctx, name) | Add execution stage to context |
| WithComponent(ctx, name) | Add component name to context |
| WithLoggingContext(ctx, fields) | Add multiple fields to context |
| ExtractLoggingFields(ctx) | Get all logging fields from context |

LLM and API logging helpers:

| Function | Description |
| --- | --- |
| LLMCall(provider, role, messages, temp, attrs...) | Log LLM API call |
| LLMResponse(provider, role, tokensIn, tokensOut, cost, attrs...) | Log LLM response |
| LLMError(provider, role, err, attrs...) | Log LLM error |
| ToolCall(provider, messages, tools, choice, attrs...) | Log tool call |
| ToolResponse(provider, tokensIn, tokensOut, toolCalls, cost, attrs...) | Log tool response |
| APIRequest(provider, method, url, headers, body) | Log API request (debug) |
| APIResponse(provider, statusCode, body, err) | Log API response (debug) |
| RedactSensitiveData(input) | Redact API keys from string |
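
Putting the pieces together, a minimal end-to-end sketch based on the functions listed above (the field values are illustrative):

import (
    "context"
    "log"

    "github.com/AltairaLabs/PromptKit/runtime/logger"
)

func main() {
    // Configure structured JSON output with a common service field
    cfg := &logger.LoggingConfigSpec{
        DefaultLevel: "info",
        Format:       logger.FormatJSON,
        CommonFields: map[string]string{"service": "my-app"},
    }
    if err := logger.Configure(cfg); err != nil {
        log.Fatal(err)
    }

    // Enrich the context once; every *Context call below inherits these fields
    ctx := logger.WithLoggingContext(context.Background(), &logger.LoggingFields{
        Scenario:  "support-chat",
        Provider:  "openai",
        Model:     "gpt-4o",
        SessionID: "sess-abc123",
    })

    logger.InfoContext(ctx, "Starting turn")
    logger.LLMCall("openai", "assistant", 5, 0.7, "model", "gpt-4o")
    logger.LLMResponse("openai", "assistant", 150, 200, 0.0001, "finish_reason", "stop")
}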