mod callbacks
Callback type aliases used by the runtime middleware pipeline.
The public middleware registration APIs accept boxed or shared closures with the signatures defined in this module. These aliases centralize those signatures so the runtime can compose tool and LLM middleware consistently across bindings.
Types
- type EventSubscriberFn
Consume runtime lifecycle events after they are emitted.
Event subscribers are invoked for scope, tool, LLM, and mark events after the runtime has built the final event payload.
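As a sketch, a subscriber can simply record events for later inspection. The signature below is an assumption for illustration: the real alias receives the runtime's event payload type, while a plain `&str` stands in here.

```rust
use std::sync::{Arc, Mutex};

// Hypothetical simplification: the real subscriber receives the runtime's
// event payload type, not a plain &str.
type EventSubscriberFn = Box<dyn Fn(&str) + Send + Sync>;

/// Build a subscriber that appends every event it sees to a shared log.
fn recording_subscriber() -> (EventSubscriberFn, Arc<Mutex<Vec<String>>>) {
    let log = Arc::new(Mutex::new(Vec::new()));
    let sink = Arc::clone(&log);
    let subscriber: EventSubscriberFn =
        Box::new(move |event: &str| sink.lock().unwrap().push(event.to_string()));
    (subscriber, log)
}
```

Because subscribers run after the final payload is built, they observe events but cannot alter them.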
- type LlmCollectorFn
Per-chunk collector used by the streaming LLM runtime.
- type LlmConditionalFn
Decide whether an LLM call is allowed to continue.
The callback receives the current `LlmRequest` and can allow execution, reject it with a guardrail reason, or return an error.
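A conditional might look like the following sketch. The signature is an assumption: the real alias operates on `LlmRequest` and the runtime's error type, while plain strings stand in here.

```rust
// Hypothetical stand-ins: the real alias operates on LlmRequest and the
// runtime's error type; plain strings are used here for illustration.
type LlmRequest = String;
type LlmConditionalFn =
    Box<dyn Fn(&LlmRequest) -> Result<Option<String>, String> + Send + Sync>;

/// Build a conditional that rejects requests mentioning a banned term.
fn banned_term_guardrail(term: &'static str) -> LlmConditionalFn {
    Box::new(move |request: &LlmRequest| {
        if request.contains(term) {
            // Ok(Some(reason)) rejects the call with a guardrail reason.
            Ok(Some(format!("request mentions banned term `{term}`")))
        } else {
            // Ok(None) allows execution to continue.
            Ok(None)
        }
    })
}
```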
- type LlmExecutionFn
Wrap or replace non-streaming LLM execution.
A non-streaming execution intercept receives the logical provider name, the current request, and the continuation representing the rest of the chain.
- type LlmExecutionNextFn
Continuation type invoked by non-streaming LLM execution intercepts.
Execution intercepts use this callable to continue the non-streaming LLM pipeline after applying their own logic.
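The shape of a non-streaming execution intercept and its continuation can be sketched as follows. The signatures are assumptions, with the request and response reduced to strings for illustration.

```rust
// Hypothetical simplification: request and response are strings instead of
// the runtime's LlmRequest and JSON response types.
type LlmResponse = Result<String, String>;
type LlmExecutionNextFn = Box<dyn FnOnce(String) -> LlmResponse>;
type LlmExecutionFn = Box<dyn Fn(&str, String, LlmExecutionNextFn) -> LlmResponse>;

/// Intercept that tags the request with the provider name before continuing.
fn provider_tagging_intercept() -> LlmExecutionFn {
    Box::new(|provider, request, next| {
        // Apply our own logic, then invoke the continuation to run the
        // rest of the non-streaming LLM pipeline.
        next(format!("[{provider}] {request}"))
    })
}
```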
- type LlmFinalizerFn
Finalizer used to synthesize the aggregate streaming response payload.
- type LlmJsonStream
Stream of JSON chunks produced by the managed streaming LLM pipeline.
- type LlmRequestInterceptFn
Rewrite or annotate an LLM request before execution.
Request intercepts can transform the wire request, attach or replace a normalized `AnnotatedLlmRequest`, or both.
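A minimal sketch of a request intercept, under assumed signatures: the wire request is reduced to a string here, and the `AnnotatedLlmRequest` half of the real alias is omitted.

```rust
// Hypothetical simplification: the wire request is a string; the real
// intercept can also attach a normalized AnnotatedLlmRequest.
type LlmRequestInterceptFn = Box<dyn Fn(String) -> Result<String, String> + Send + Sync>;

/// Intercept that prepends a fixed system preamble to every request.
fn with_preamble(preamble: &'static str) -> LlmRequestInterceptFn {
    Box::new(move |request| Ok(format!("{preamble}\n{request}")))
}
```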
- type LlmSanitizeRequestFn
Sanitize an LLM request before the runtime records it.
LLM request sanitizers affect the serialized request payload emitted on start events. They do not mutate the caller-owned `LlmRequest` unless a separate request intercept does so.
- type LlmSanitizeResponseFn
Sanitize an LLM response before the runtime records it.
These callbacks rewrite the JSON response payload captured on LLM-end events, which is useful for redaction or payload normalization.
- type LlmStreamExecutionFn
Wrap or replace streaming LLM execution.
A streaming execution intercept can observe or modify the request before invoking the continuation, and it can also replace the returned stream.
- type LlmStreamExecutionNextFn
Continuation type invoked by streaming LLM execution intercepts.
This callable represents the remainder of the streaming LLM execution chain and resolves to a stream of JSON response chunks.
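The streaming pair can be sketched like this. Note the simplifying assumptions: the real continuation resolves to an async stream of JSON chunks, while a plain iterator of strings stands in here, and the request type is reduced to a string.

```rust
// Hypothetical simplification: a plain iterator of strings stands in for
// the async stream of JSON chunks, and the request is a plain string.
type JsonChunk = String;
type LlmStreamExecutionNextFn =
    Box<dyn FnOnce(String) -> Box<dyn Iterator<Item = JsonChunk>>>;
type LlmStreamExecutionFn =
    Box<dyn Fn(String, LlmStreamExecutionNextFn) -> Box<dyn Iterator<Item = JsonChunk>>>;

/// Intercept that tags the request and replaces the returned stream with an
/// uppercased version of each chunk.
fn chunk_upcasing_intercept() -> LlmStreamExecutionFn {
    Box::new(|request, next| {
        // Modify the request, then continue the chain.
        let stream = next(format!("[traced] {request}"));
        // Replace the returned stream by transforming each chunk.
        Box::new(stream.map(|chunk| chunk.to_uppercase()))
    })
}
```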
- type LlmStreamExecutionRegistryRef<'a>
Scope-local registry references passed into streaming execution-chain builders.
- type LlmStreamExecutionRegistryRefs<'a>
Slice of scope-local streaming execution registries.
- type ToolConditionalFn
Decide whether a tool call is allowed to continue.
The callback receives the tool name and the current argument payload. It can return `Ok(None)` to allow execution, `Ok(Some(reason))` to reject the call with a guardrail message, or an error to abort evaluation entirely.
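The three outcomes can be sketched as follows; the signature is an assumption, with the argument payload reduced to a raw JSON string for illustration.

```rust
// Hypothetical simplification: arguments are a raw JSON string rather than
// the runtime's payload type.
type ToolConditionalFn =
    Box<dyn Fn(&str, &str) -> Result<Option<String>, String> + Send + Sync>;

/// Reject calls whose serialized arguments exceed a size budget.
fn arg_size_guardrail(max_bytes: usize) -> ToolConditionalFn {
    Box::new(move |tool, args| {
        if args.len() > max_bytes {
            // Ok(Some(reason)): reject with a guardrail message.
            Ok(Some(format!("arguments for `{tool}` exceed {max_bytes} bytes")))
        } else {
            // Ok(None): allow execution to continue.
            Ok(None)
        }
    })
}
```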
- type ToolExecutionFn
Wrap or replace tool execution.
A tool execution intercept receives the tool name, the current argument payload, and the continuation representing the rest of the chain.
- type ToolExecutionNextFn
Continuation type invoked by tool execution intercepts.
Execution intercepts receive this callable as their `next` continuation and can call it with modified arguments, wrap it, or skip it entirely.
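All three options (modify, wrap, skip) fit into one sketch. The signatures are assumptions, with arguments and results reduced to strings.

```rust
// Hypothetical simplification: arguments and results are strings rather
// than the runtime's JSON types.
type ToolResult = Result<String, String>;
type ToolExecutionNextFn = Box<dyn FnOnce(String) -> ToolResult>;
type ToolExecutionFn = Box<dyn Fn(&str, String, ToolExecutionNextFn) -> ToolResult>;

/// Intercept that short-circuits a denied tool and otherwise calls `next`
/// with trimmed arguments.
fn deny_or_trim(denied_tool: &'static str) -> ToolExecutionFn {
    Box::new(move |tool, args, next| {
        if tool == denied_tool {
            // Skip the continuation entirely.
            return Err(format!("tool `{tool}` is disabled"));
        }
        // Continue the chain with modified arguments.
        next(args.trim().to_string())
    })
}
```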
- type ToolInterceptFn
Rewrite tool arguments before execution.
Tool request intercepts run in priority order and can transform the JSON payload that is eventually passed into the tool execution callback.
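Running intercepts in priority order amounts to folding the payload through them, which can be sketched as follows; the payload is reduced to a string here, and `apply_in_order` is a hypothetical helper, not part of the module.

```rust
// Hypothetical simplification: the payload is a string; the real intercepts
// transform a JSON value.
type ToolInterceptFn = Box<dyn Fn(String) -> Result<String, String>>;

/// Hypothetical helper: apply intercepts in order, feeding each one the
/// previous output, stopping at the first error.
fn apply_in_order(intercepts: &[ToolInterceptFn], args: String) -> Result<String, String> {
    intercepts.iter().try_fold(args, |acc, intercept| intercept(acc))
}
```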
- type ToolSanitizeFn
Sanitize a tool request payload before the runtime records it.
Tool sanitize callbacks are used only for observability payloads. They can rewrite the JSON arguments recorded on tool-start events without changing the caller-owned request that is passed to the tool implementation.
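Because only the recorded payload is affected, a sanitizer can redact aggressively without breaking the tool itself. A sketch under assumed signatures, with the payload reduced to a string:

```rust
// Hypothetical simplification: the recorded payload is a string; the real
// sanitizer rewrites a JSON value.
type ToolSanitizeFn = Box<dyn Fn(&str, String) -> String + Send + Sync>;

/// Replace the recorded payload entirely whenever it mentions a sensitive key.
fn redacting_sanitizer(sensitive_key: &'static str) -> ToolSanitizeFn {
    Box::new(move |_tool, args| {
        if args.contains(sensitive_key) {
            // Only the observability record changes; the tool still sees
            // the original arguments.
            "\"<redacted>\"".to_string()
        } else {
            args
        }
    })
}
```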