TransformFrame API

TransformFrame manages coordinate frame trees for real-time robotics — the spatial relationships between your robot's base, sensors, and the world. All lookups are lock-free and wait-free. Register frames with FrameBuilder, query transforms with TransformQuery, and update dynamic frames at sensor rates.

New to coordinate frames? See TransformFrame Concepts for an introduction.

Quick Reference

| Method | Returns | Description |
| --- | --- | --- |
| TransformFrame::new() | Self | Create a frame tree with the default configuration |
| .register_frame(name, parent) | Result<()> | Add a frame to the tree |
| .update_transform(frame, transform, timestamp) | Result<()> | Update a dynamic frame's transform |
| .tf(from, to) | Result<Transform> | Look up transform between two frames |
| .tf_at(from, to, time) | Result<Transform> | Look up transform at a specific time |
| .is_stale(frame, max_age) | bool | Check if a frame's data is too old |
| .frame_exists(name) | bool | Check if a frame is registered |
| .diagnostics() | FrameDiagnostics | Get tree statistics and health |

Creating a TransformFrame

use horus::prelude::*;

// Default: 256 frames, 128 static, 32-entry history
let tf = TransformFrame::new();

// Presets
let tf = TransformFrame::small();    // 256 frames (~550KB)
let tf = TransformFrame::medium();   // 1024 frames (~2.2MB)
let tf = TransformFrame::large();    // 4096 frames (~9MB)
let tf = TransformFrame::massive();  // 16384 frames (~35MB)

// Custom
let tf = TransformFrame::with_config(
    TransformFrameConfig::custom()
        .max_frames(512)
        .history_len(64)
        .enable_overflow(false)  // Hard RT: no heap fallback
        .build()?
);

| Constructor | Description |
| --- | --- |
| TransformFrame::new() | Default (256 frames, small preset) |
| TransformFrame::small() | 256 frames, embedded/single robot |
| TransformFrame::medium() | 1024 frames, complex/multi-robot |
| TransformFrame::large() | 4096 frames, simulations |
| TransformFrame::massive() | 16384 frames, 100+ robots |
| TransformFrame::with_config(config) | Custom configuration |

TransformFrameConfig

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| max_frames | usize | 256 | Maximum frames (16–65536) |
| max_static_frames | usize | max_frames/2 | Static frame slots |
| history_len | usize | 32 | History buffer per dynamic frame (4–256) |
| enable_overflow | bool | true | Fallback to HashMap if capacity exceeded |
| chain_cache_size | usize | 64 | Cache for transform chain lookups |

Presets

| Preset | Capacity | Static | Memory | Use Case |
| --- | --- | --- | --- | --- |
| small() | 256 | 128 | ~550KB | Single robots, embedded |
| medium() | 1024 | 512 | ~2.2MB | Complex robots, multi-robot |
| large() | 4096 | 2048 | ~9MB | Multi-robot simulations |
| massive() | 16384 | 8192 | ~35MB | Large-scale simulations (100+ robots) |
| unlimited() | 4096 | 2048 | 35MB+ | Dynamic/unpredictable counts (no RT guarantee) |
| rt_fixed(max_frames) | custom | max_frames/2 | variable | Hard real-time (no overflow) |

TransformFrameConfigBuilder

let config = TransformFrameConfig::custom()
    .max_frames(512)
    .max_static_frames(256)
    .history_len(64)
    .enable_overflow(false)
    .chain_cache_size(128)
    .build()?;

println!("Memory: {}", config.memory_estimate());  // "1.1 MB"

| Method | Returns | Description |
| --- | --- | --- |
| TransformFrameConfig::custom() | TransformFrameConfigBuilder | Start custom config |
| .max_frames(n) | Self | Set max frames |
| .max_static_frames(n) | Self | Set static frame slots |
| .history_len(n) | Self | Set history buffer size |
| .enable_overflow(bool) | Self | Enable/disable heap fallback |
| .chain_cache_size(n) | Self | Set chain cache size |
| .build() | Result<TransformFrameConfig, String> | Validate and build |
| config.validate() | Result<(), String> | Check constraints |
| config.estimated_memory_bytes() | usize | Memory estimate in bytes |
| config.memory_estimate() | String | Human-readable memory (e.g., "2.2 MB") |

Registering Frames

FrameBuilder (Fluent API)

The recommended way to register frames:

// Root frame (no parent)
tf.add_frame("world").build()?;

// Dynamic child frame
tf.add_frame("base_link").parent("world").build()?;

// Static frame with fixed offset (e.g., sensor mount)
tf.add_frame("camera")
    .parent("base_link")
    .static_transform(&Transform::xyz(0.1, 0.0, 0.5))
    .build()?;

| Method | Returns | Description |
| --- | --- | --- |
| tf.add_frame(name) | FrameBuilder | Start registering a frame |
| .parent(name) | Self | Set parent frame |
| .static_transform(transform) | Self | Mark as static with fixed transform (requires parent) |
| .build() | Result<FrameId> | Register and return frame ID |

Direct Registration

// Dynamic frame
let id = tf.register_frame("base_link", Some("world"))?;

// Static frame with fixed transform
let id = tf.register_static_frame("camera", Some("base_link"), &Transform::xyz(0.1, 0.0, 0.5))?;

// Unregister (dynamic only)
tf.unregister_frame("temp_frame")?;

Frame Lookup

| Method | Returns | Description |
| --- | --- | --- |
| frame_id(name) | Option<FrameId> | Get frame ID by name (cache this for hot paths) |
| frame_name(id) | Option<String> | Get name by ID |
| has_frame(name) | bool | Check if frame exists |
| all_frames() | Vec<String> | All frame names |
| frame_count() | usize | Total registered frames |
| parent(name) | Option<String> | Parent frame name |
| children(name) | Vec<String> | Direct child frames |

Updating Transforms

Update dynamic frames with new transform data at sensor rates:

let now = timestamp_now();
tf.update_transform("base_link", &Transform::xyz(1.0, 2.0, 0.0), now)?;

// By ID (fastest, lock-free, no name lookup)
let id = tf.frame_id("base_link").unwrap();
tf.update_transform_by_id(id, &Transform::xyz(1.0, 2.0, 0.0), now)?;

// Update static frame offset
tf.set_static_transform("camera", &Transform::xyz(0.1, 0.0, 0.6))?;

| Method | Returns | Description |
| --- | --- | --- |
| update_transform(name, transform, timestamp_ns) | Result<()> | Update by name (validates transform) |
| update_transform_by_id(id, transform, timestamp_ns) | Result<()> | Update by ID (fastest, lock-free) |
| set_static_transform(name, transform) | Result<()> | Update a static frame's offset |

update_transform(name, transform, timestamp_ns)

Updates a dynamic frame's transform at the current timestamp.

Parameters

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| name | &str | yes | Frame name (must be registered and dynamic, not static) |
| transform | &Transform | yes | New transform relative to parent. NaN/Inf rejected. Quaternions auto-normalized. |
| timestamp_ns | u64 | yes | Timestamp in nanoseconds. Use timestamp_now() for current time. |

Returns

Result<()> — Ok on success, Err if the frame doesn't exist or the transform is invalid.

Behavior

  • Validates transform: rejects NaN/Inf, auto-normalizes quaternions
  • Stores in history buffer for temporal interpolation (LERP + SLERP)
  • Lock-free write — safe to call from RT nodes at sensor rates

When to use: Call every tick from sensor nodes that produce position data (odometry, SLAM, IMU integration).
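A standalone sketch of what "LERP + SLERP" means here: linear interpolation for the translation and spherical linear interpolation for the rotation quaternion. This illustrates the technique only; it is not the library's internal code.

```rust
// Illustrative LERP + SLERP between two stored samples (not library internals).
fn lerp(a: [f64; 3], b: [f64; 3], t: f64) -> [f64; 3] {
    [a[0] + (b[0] - a[0]) * t,
     a[1] + (b[1] - a[1]) * t,
     a[2] + (b[2] - a[2]) * t]
}

// SLERP between unit quaternions [x, y, z, w], taking the shorter arc.
fn slerp(a: [f64; 4], mut b: [f64; 4], t: f64) -> [f64; 4] {
    let mut dot: f64 = (0..4).map(|i| a[i] * b[i]).sum();
    if dot < 0.0 {
        // Negate one endpoint so we interpolate along the shorter arc.
        for v in b.iter_mut() { *v = -*v; }
        dot = -dot;
    }
    let mut out = [0.0; 4];
    if dot > 0.9995 {
        // Nearly identical rotations: normalized LERP is numerically safer.
        for i in 0..4 { out[i] = a[i] + (b[i] - a[i]) * t; }
    } else {
        let theta = dot.acos();
        let (s0, s1) = (((1.0 - t) * theta).sin(), (t * theta).sin());
        for i in 0..4 { out[i] = a[i] * s0 + b[i] * s1; }
    }
    // Final normalization keeps the result a unit quaternion.
    let n = out.iter().map(|v| v * v).sum::<f64>().sqrt();
    for v in out.iter_mut() { *v /= n; }
    out
}

fn main() {
    // Query time halfway between two odometry samples:
    assert_eq!(lerp([0.0, 0.0, 0.0], [1.0, 2.0, 0.0], 0.5), [0.5, 1.0, 0.0]);
    // Halfway between identity and a 90-degree yaw is a 45-degree yaw.
    let h = std::f64::consts::FRAC_1_SQRT_2;
    let q = slerp([0.0, 0.0, 0.0, 1.0], [0.0, 0.0, h, h], 0.5);
    assert!((q[2] - (std::f64::consts::PI / 8.0).sin()).abs() < 1e-9);
    assert!((q[3] - (std::f64::consts::PI / 8.0).cos()).abs() < 1e-9);
    println!("interpolation checks passed");
}
```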

update_transform_by_id(id, transform, timestamp_ns)

Same as update_transform but uses FrameId instead of name string. Skips name lookup — use this in hot loops.

Parameters

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| id | FrameId | yes | Frame ID from frame_id(name). Cache this at init time. |
| transform | &Transform | yes | New transform. Same validation as update_transform. |
| timestamp_ns | u64 | yes | Timestamp in nanoseconds. |

Transforms are automatically validated — NaN/Inf values are rejected and quaternions are auto-normalized.


Querying Transforms

TransformQuery (Fluent API)

The recommended way to look up transforms:

// Latest transform from camera to world
let t = tf.query("camera").to("world").lookup()?;

// At a specific timestamp (with LERP + SLERP interpolation)
let t = tf.query("camera").to("world").at(timestamp_ns)?;

// Strict: error if extrapolation needed
let t = tf.query("camera").to("world").at_strict(timestamp_ns)?;

// With tolerance window
let t = tf.query("camera").to("world").at_with_tolerance(timestamp_ns, 100_000_000)?;  // 100ms

// Transform a point
let world_pt = tf.query("lidar").to("world").point([1.0, 0.0, 0.0])?;

// Transform a vector (rotation only, no translation)
let world_vec = tf.query("imu").to("world").vector([0.0, 0.0, 9.81])?;

// Check availability
if tf.query("sensor").to("world").can_at(timestamp_ns) {
    let t = tf.query("sensor").to("world").at(timestamp_ns)?;
}

// Get the frame chain
let chain = tf.query("camera").to("world").chain()?;  // ["camera", "base_link", "world"]

TransformQueryFrom

| Method | Returns | Description |
| --- | --- | --- |
| tf.query(src) | TransformQueryFrom | Start query from source frame |
| .to(dst) | TransformQuery | Set destination frame |

TransformQuery Methods

| Method | Returns | Description |
| --- | --- | --- |
| lookup() | Result<Transform> | Latest transform |
| at(timestamp_ns) | Result<Transform> | At timestamp with interpolation |
| at_strict(timestamp_ns) | Result<Transform> | Error if extrapolation needed |
| at_with_tolerance(timestamp_ns, tolerance_ns) | Result<Transform> | Error if gap exceeds tolerance |
| point(xyz) | Result<[f64; 3]> | Transform a 3D point |
| vector(xyz) | Result<[f64; 3]> | Transform a vector (rotation only) |
| can_at(timestamp_ns) | bool | Check if transform available at timestamp |
| can_at_with_tolerance(timestamp_ns, tolerance_ns) | bool | Check within tolerance window |
| chain() | Result<Vec<String>> | Frame chain from source to destination |
| wait(timeout) | Result<Transform> | Block until available (feature wait) |
| wait_at(timestamp_ns, timeout) | Result<Transform> | Block until timestamp data available (feature wait) |
| wait_async(timeout) | Result<Transform> | Async wait (feature async-wait) |
| wait_at_async(timestamp_ns, timeout) | Result<Transform> | Async wait at timestamp (feature async-wait) |

Key Method Details

lookup

pub fn lookup(&self) -> Result<Transform>

Get the latest known transform between source and destination frames. Returns the most recent transform regardless of timestamp.

Returns: Ok(Transform) with translation (x, y, z) and rotation (quaternion).

Errors:

  • TransformError::FrameNotFound — Source or destination frame doesn't exist
  • TransformError::NoPath — No chain of frames connects source to destination
  • TransformError::EmptyHistory — Frame exists but no transforms have been published yet

at

pub fn at(&self, timestamp_ns: u64) -> Result<Transform>

Get the transform at a specific timestamp, with automatic interpolation between stored samples.

Parameters:

  • timestamp_ns: u64 — Timestamp in nanoseconds since epoch. Use horus::now().as_nanos().

Behavior: If the requested time falls between two stored transforms, SLERP interpolation is used for rotation and linear interpolation for translation. If the time is outside stored history, extrapolation is used (may be inaccurate).

Errors: Same as lookup() plus TransformError::Stale if the transform data is too old.

wait

pub fn wait(&self, timeout: Duration) -> Result<Transform>

Block until the transform becomes available or timeout elapses. Use this when your node starts before the transform publisher.

Parameters:

  • timeout: Duration — Maximum time to wait. Use 5_u64.secs().

Returns: Ok(Transform) once available, Err(TransformError::Timeout) if timeout elapses.
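The behavior of a blocking wait can be sketched as a generic poll-until-available loop in plain Rust. This is only an illustration of the pattern; the library may use notification rather than polling internally.

```rust
use std::time::{Duration, Instant};

// Generic "retry a lookup until it succeeds or a deadline passes" pattern,
// sketching what a blocking wait(timeout) provides to the caller.
fn wait_for<T>(
    mut lookup: impl FnMut() -> Option<T>,
    timeout: Duration,
) -> Result<T, &'static str> {
    let deadline = Instant::now() + timeout;
    loop {
        if let Some(v) = lookup() {
            return Ok(v); // transform became available
        }
        if Instant::now() >= deadline {
            return Err("timeout"); // publisher never showed up in time
        }
        std::thread::sleep(Duration::from_millis(1)); // back off briefly
    }
}

fn main() {
    // Simulate a transform that becomes available on the third poll.
    let mut calls = 0;
    let r = wait_for(
        || { calls += 1; if calls >= 3 { Some(42) } else { None } },
        Duration::from_millis(100),
    );
    assert_eq!(r, Ok(42));
}
```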

Direct Query Methods

For cases where you don't need the fluent builder:

| Method | Returns | Description |
| --- | --- | --- |
| tf(src, dst) | Result<Transform> | Latest transform src→dst |
| tf_by_id(src_id, dst_id) | Option<Transform> | By ID (fastest, lock-free) |
| tf_at(src, dst, timestamp_ns) | Result<Transform> | At timestamp with interpolation |
| tf_at_strict(src, dst, timestamp_ns) | Result<Transform> | Error if extrapolation needed |
| tf_at_with_tolerance(src, dst, ts, tol) | Result<Transform> | Error if gap exceeds tolerance |
| can_transform(src, dst) | bool | Check if path exists |
| can_transform_at(src, dst, timestamp_ns) | bool | Check at timestamp |
| can_transform_at_with_tolerance(src, dst, ts, tol) | bool | Check within tolerance |
| transform_point(src, dst, point) | Result<[f64; 3]> | One-shot point transform |
| transform_vector(src, dst, vector) | Result<[f64; 3]> | Rotation-only vector transform |

Argument order: tf.tf("camera", "world") means "transform FROM camera TO world". This is the opposite of ROS2 TF2's lookupTransform(target, source).
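A translation-only worked example makes the direction concrete: the transform "FROM camera TO world" is the one you add to camera-frame coordinates to get world-frame coordinates. Plain-Rust sketch, not the library's types.

```rust
// Direction check with a translation-only transform: if the camera sits at
// (2, 0, 0) in the world, the camera→world transform adds that offset to
// points expressed in the camera frame.
fn transform_point(t: [f64; 3], p: [f64; 3]) -> [f64; 3] {
    [p[0] + t[0], p[1] + t[1], p[2] + t[2]]
}

fn main() {
    // What tf("camera", "world") would return in this translation-only case.
    let camera_to_world = [2.0, 0.0, 0.0];
    // A point 1 m in front of the camera lands at x = 3 in the world frame.
    assert_eq!(transform_point(camera_to_world, [1.0, 0.0, 0.0]), [3.0, 0.0, 0.0]);
}
```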


Transform

A 3D rigid body transform (translation + quaternion rotation) in f64 precision.

Fields

pub struct Transform {
    pub translation: [f64; 3],  // [x, y, z] in meters
    pub rotation: [f64; 4],     // [x, y, z, w] Hamilton quaternion
}

Constructors

let t = Transform::identity();
let t = Transform::new([1.0, 2.0, 0.0], [0.0, 0.0, 0.0, 1.0]);
let t = Transform::xyz(1.0, 2.0, 0.0);
let t = Transform::x(1.0);         // X-axis only
let t = Transform::yaw(1.57);      // Z-axis rotation (radians)
let t = Transform::rpy(0.0, 0.0, 1.57);  // Roll, pitch, yaw
let t = Transform::from_euler([1.0, 2.0, 0.0], [0.0, 0.0, 1.57]);
let t = Transform::from_translation([1.0, 2.0, 0.0]);
let t = Transform::from_rotation([0.0, 0.0, 0.707, 0.707]);
let t = Transform::from_matrix(matrix_4x4);

// Chainable
let t = Transform::xyz(1.0, 0.0, 0.0).with_yaw(1.57);
let t = Transform::xyz(1.0, 0.0, 0.5).with_rpy(0.0, 0.1, 0.0);

| Constructor | Description |
| --- | --- |
| identity() | No translation or rotation |
| new(translation, rotation) | Custom translation + quaternion (auto-normalized) |
| from_translation(xyz) | Translation only |
| from_rotation(xyzw) | Rotation only (auto-normalized) |
| from_euler(translation, rpy) | Translation + Euler angles (radians) |
| xyz(x, y, z) | Translation shorthand |
| x(v) / y(v) / z(v) | Single-axis translation |
| yaw(angle) / pitch(angle) / roll(angle) | Single-axis rotation (radians) |
| rpy(roll, pitch, yaw) | Combined rotation (radians) |
| from_matrix(m) | From 4x4 homogeneous matrix (row-major) |
| .with_yaw(angle) | Compose yaw rotation (chainable) |
| .with_rpy(r, p, y) | Compose RPY rotation (chainable) |

Operations

| Method | Returns | Description |
| --- | --- | --- |
| compose(other) | Transform | Compose transforms (self * other) |
| inverse() | Transform | Reverse direction |
| transform_point(xyz) | [f64; 3] | Apply rotation + translation to point |
| transform_vector(xyz) | [f64; 3] | Apply rotation only (no translation) |
| interpolate(other, t) | Transform | LERP + SLERP interpolation (t in [0, 1]) |
| to_euler() | [f64; 3] | Convert rotation to [roll, pitch, yaw] radians |
| to_matrix() | [[f64; 4]; 4] | Convert to 4x4 homogeneous matrix |
| validate() | Result<(), String> | Check for NaN/Inf and valid quaternion |
| validated() | Result<Self, String> | Validate and auto-normalize |
| is_identity(epsilon) | bool | Approximate identity test |
| translation_magnitude() | f64 | Translation vector length |
| rotation_angle() | f64 | Rotation angle in radians |
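A minimal quaternion-math sketch of compose and inverse: composing a transform with its own inverse yields the identity. This assumes the "self * other" convention applies other first; it illustrates the math, not the library's implementation.

```rust
// Hamilton quaternion product for [x, y, z, w] quaternions.
fn qmul(a: [f64; 4], b: [f64; 4]) -> [f64; 4] {
    [a[3] * b[0] + a[0] * b[3] + a[1] * b[2] - a[2] * b[1],
     a[3] * b[1] - a[0] * b[2] + a[1] * b[3] + a[2] * b[0],
     a[3] * b[2] + a[0] * b[1] - a[1] * b[0] + a[2] * b[3],
     a[3] * b[3] - a[0] * b[0] - a[1] * b[1] - a[2] * b[2]]
}

// Rotate a vector by a unit quaternion: q * v * q_conjugate.
fn qrot(q: [f64; 4], v: [f64; 3]) -> [f64; 3] {
    let r = qmul(qmul(q, [v[0], v[1], v[2], 0.0]), [-q[0], -q[1], -q[2], q[3]]);
    [r[0], r[1], r[2]]
}

// Compose: apply `b` first, then `a` (assumed "self * other" convention).
fn compose(a: ([f64; 3], [f64; 4]), b: ([f64; 3], [f64; 4])) -> ([f64; 3], [f64; 4]) {
    let rt = qrot(a.1, b.0);
    ([a.0[0] + rt[0], a.0[1] + rt[1], a.0[2] + rt[2]], qmul(a.1, b.1))
}

// Inverse: conjugate quaternion, negated back-rotated translation.
fn inverse(t: ([f64; 3], [f64; 4])) -> ([f64; 3], [f64; 4]) {
    let qi = [-t.1[0], -t.1[1], -t.1[2], t.1[3]];
    let ti = qrot(qi, t.0);
    ([-ti[0], -ti[1], -ti[2]], qi)
}

fn main() {
    let h = std::f64::consts::FRAC_1_SQRT_2;
    let t = ([1.0, 2.0, 0.0], [0.0, 0.0, h, h]); // translation + 90-degree yaw
    let (p, q) = compose(t, inverse(t));
    for v in p { assert!(v.abs() < 1e-9); }   // identity translation
    assert!((q[3].abs() - 1.0).abs() < 1e-9); // identity rotation (|w| = 1)
}
```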

tf(src, dst) — Shorthand Transform Lookup

Looks up the latest transform between two named frames.

Signature

pub fn tf(&self, src: &str, dst: &str) -> HorusResult<Transform>

Parameters

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| src | &str | yes | Source frame name (e.g., "camera") |
| dst | &str | yes | Destination frame name (e.g., "world") |

Returns

HorusResult<Transform> — the transform that converts coordinates from src frame to dst frame.

Errors

| Error | Condition |
| --- | --- |
| FrameNotFound | Source or destination frame not registered |
| NoPathBetweenFrames | Frames exist but have no common ancestor in the tree |

Behavior

  • Returns the latest transform regardless of timestamp — use tf_at() for time-specific lookups
  • Equivalent to query(src).to(dst).lookup() but shorter
  • Lock-free read — safe to call from RT nodes at high frequency

Example

use horus::prelude::*;

// Where is the camera relative to the world?
let camera_to_world = tf.tf("camera", "world")?;
println!("Camera at: ({:.2}, {:.2}, {:.2})",
    camera_to_world.translation[0],
    camera_to_world.translation[1],
    camera_to_world.translation[2]);

When to use: Simple one-off lookups. For complex queries (time-specific, point transformation, chain inspection), use the fluent query() API instead.


Staleness Detection

Check if frames have gone stale (e.g., sensor disconnected):

// Using wall-clock time
if tf.is_stale_now("imu", 500_000_000) {  // 500ms
    hlog!(warn, "IMU data is stale!");
}

// Using custom time (for simulation)
if tf.is_stale("imu", 500_000_000, sim_time_ns) {
    hlog!(warn, "IMU stale in sim time");
}

// Time since last update
if let Some(age_ns) = tf.time_since_last_update_now("imu") {
    println!("IMU age: {}ms", age_ns / 1_000_000);
}

| Method | Returns | Description |
| --- | --- | --- |
| is_stale(name, max_age_ns, now_ns) | bool | Stale relative to custom time |
| is_stale_now(name, max_age_ns) | bool | Stale relative to wall-clock |
| time_since_last_update(name, now_ns) | Option<u64> | Age in nanoseconds (custom time) |
| time_since_last_update_now(name) | Option<u64> | Age in nanoseconds (wall-clock) |
| time_range(name) | Option<(u64, u64)> | Buffer window (oldest_ns, newest_ns) |
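Staleness is plain timestamp arithmetic: a frame is stale when the time since its last update exceeds the allowed age. A standalone sketch of the check; whether the library saturates on clock skew as this sketch does is an assumption.

```rust
// A frame is stale when now - last_update exceeds max_age. saturating_sub
// guards against a "now" that is behind the last update (clock skew).
fn is_stale(last_update_ns: u64, max_age_ns: u64, now_ns: u64) -> bool {
    now_ns.saturating_sub(last_update_ns) > max_age_ns
}

fn main() {
    let last = 1_000_000_000_u64; // last IMU update, in nanoseconds
    // 400 ms old with a 500 ms budget: still fresh.
    assert!(!is_stale(last, 500_000_000, last + 400_000_000));
    // 600 ms old: stale.
    assert!(is_stale(last, 500_000_000, last + 600_000_000));
}
```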

Diagnostics

TransformFrameStats

let stats = tf.stats();
println!("{}", stats);  // Pretty-printed summary

println!("Total: {}, Static: {}, Dynamic: {}",
    stats.total_frames, stats.static_frames, stats.dynamic_frames);
println!("Tree depth: {}, Roots: {}", stats.tree_depth, stats.root_count);

| Field | Type | Description |
| --- | --- | --- |
| total_frames | usize | Total registered frames |
| static_frames | usize | Static frame count |
| dynamic_frames | usize | Dynamic frame count |
| max_frames | usize | Configured max |
| history_len | usize | History buffer size |
| tree_depth | usize | Maximum tree depth |
| root_count | usize | Frames with no parent |

FrameInfo

if let Some(info) = tf.frame_info("camera") {
    println!("Frame: {}, Parent: {:?}, Static: {}", info.name, info.parent, info.is_static);
    println!("Depth: {}, Children: {}", info.depth, info.children_count);
}

// All frames
let all = tf.frame_info_all();

| Field | Type | Description |
| --- | --- | --- |
| name | String | Frame name |
| id | FrameId | Numeric ID (for hot-path lookups) |
| parent | Option<String> | Parent name (None = root) |
| is_static | bool | Static vs dynamic |
| time_range | Option<(u64, u64)> | (oldest_ns, newest_ns) |
| children_count | usize | Direct children |
| depth | usize | Depth in tree (root = 0) |

Tree Export

tf.print_tree();                    // Print to stderr
let text = tf.format_tree();        // Human-readable tree string
let dot = tf.frames_as_dot();       // Graphviz DOT format
let yaml = tf.frames_as_yaml();     // YAML (TF2-style)
tf.validate()?;                     // Validate tree structure

timestamp_now()

use horus::prelude::*;

let now: u64 = timestamp_now();  // Current time in nanoseconds (UNIX epoch)
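A nanosecond UNIX-epoch timestamp amounts to the following in plain Rust; whether timestamp_now() is implemented exactly this way is an assumption.

```rust
use std::time::{SystemTime, UNIX_EPOCH};

// Nanoseconds since the UNIX epoch as u64 (sufficient until the year 2554).
fn unix_nanos() -> u64 {
    SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .expect("system clock is before the UNIX epoch")
        .as_nanos() as u64
}

fn main() {
    let now = unix_nanos();
    // Any wall-clock time after 2020-01-01 exceeds ~1.58e18 ns.
    assert!(now > 1_577_836_800_000_000_000);
}
```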

FrameId

Type alias for u32. Use for hot-path lookups to avoid string-based name resolution:

let id: FrameId = tf.frame_id("base_link").unwrap();

// Fast by-ID operations (no string lookup)
tf.update_transform_by_id(id, &transform, timestamp_now())?;
let t = tf.tf_by_id(src_id, dst_id);

Complete Example

use horus::prelude::*;

fn main() -> Result<()> {
    let tf = TransformFrame::new();

    // Build frame tree
    tf.add_frame("world").build()?;
    tf.add_frame("odom").parent("world").build()?;
    tf.add_frame("base_link").parent("odom").build()?;
    tf.add_frame("lidar")
        .parent("base_link")
        .static_transform(&Transform::xyz(0.2, 0.0, 0.3))
        .build()?;
    tf.add_frame("camera")
        .parent("base_link")
        .static_transform(&Transform::xyz(0.1, 0.0, 0.5).with_rpy(0.0, 0.1, 0.0))
        .build()?;

    // Update dynamic frames
    let now = timestamp_now();
    tf.update_transform("odom", &Transform::xyz(1.0, 0.5, 0.0).with_yaw(0.3), now)?;
    tf.update_transform("base_link", &Transform::xyz(0.05, 0.02, 0.0), now)?;

    // Query transforms
    let lidar_to_world = tf.query("lidar").to("world").lookup()?;
    println!("Lidar position in world: {:?}", lidar_to_world.translation);

    // Transform a LiDAR point into world frame
    let world_pt = tf.query("lidar").to("world").point([5.0, 0.0, 0.0])?;
    println!("Obstacle at world: ({:.2}, {:.2}, {:.2})", world_pt[0], world_pt[1], world_pt[2]);

    // Check staleness
    if tf.is_stale_now("base_link", 100_000_000) {  // 100ms
        println!("Odometry is stale!");
    }

    // Print tree
    tf.print_tree();

    Ok(())
}

Common Frame Configurations

Mobile Robot with Camera and Lidar

world → odom → base_link → laser_link
                         → camera_link → camera_optical
let tf = TransformFrame::new();

// Tree skeleton ("world" is the root; "odom" and "base_link" are dynamic
// frames updated at runtime from odometry)
tf.register_frame("world", None)?;
tf.register_frame("odom", Some("world"))?;
tf.register_frame("base_link", Some("odom"))?;

// Sensor mounts: fixed offsets from CAD/measurement, so register them static
tf.register_static_frame("laser_link", Some("base_link"),
    &Transform::new([0.15, 0.0, 0.3], [0.0, 0.0, 0.0, 1.0]))?;

tf.register_static_frame("camera_link", Some("base_link"),
    &Transform::new([0.1, -0.05, 0.25], [0.0, 0.0, 0.0, 1.0]))?;

// Query: transform lidar point to camera frame
let laser_to_cam = tf.tf("laser_link", "camera_link")?;
let point_in_cam = laser_to_cam.transform_point([2.0, 0.0, 0.0]);
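When both mounts are pure translations, the chained lookup above reduces to simple arithmetic: go up the tree from the laser to base_link, then down to the camera. A plain-Rust sketch using the offsets from the snippet above.

```rust
// Resolving laser_link → camera_link through their common parent base_link,
// for the translation-only mounts in the example above.
fn main() {
    let base_from_laser = [0.15_f64, 0.0, 0.3];    // laser mount on base_link
    let base_from_camera = [0.1_f64, -0.05, 0.25]; // camera mount on base_link
    let p_laser = [2.0_f64, 0.0, 0.0];             // point 2 m ahead of the lidar

    // Up the tree: laser frame → base_link frame (add the laser offset).
    let p_base: Vec<f64> = (0..3).map(|i| p_laser[i] + base_from_laser[i]).collect();
    // Down the tree: base_link frame → camera frame (subtract the camera offset).
    let p_cam: Vec<f64> = (0..3).map(|i| p_base[i] - base_from_camera[i]).collect();

    let expected = [2.05, 0.05, 0.05];
    for i in 0..3 {
        assert!((p_cam[i] - expected[i]).abs() < 1e-9);
    }
    println!("point in camera frame: {:?}", p_cam);
}
```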

Robot Arm (6-DOF)

base → link1 → link2 → link3 → link4 → link5 → link6 → tool → gripper
let tf = TransformFrame::new();

tf.register_frame("base", None)?;
let mut parent = String::from("base");
for i in 1..=6 {
    let link = format!("link{}", i);
    tf.register_frame(&link, Some(&parent))?;
    parent = link;
}
tf.register_frame("tool", Some("link6"))?;
tf.register_frame("gripper", Some("tool"))?;

// Update joint angles each tick (from forward kinematics)
// tf.update_transform("link1", &fk_transform, timestamp_now())?;

// Query: where is the gripper in base frame?
let gripper_pose = tf.tf("gripper", "base")?;

See Also