Transform Frames (C++)
Build a coordinate frame tree for a robot with multiple sensors, set static transforms for sensor mounts, update dynamic transforms from localization, and query spatial relationships between frames.
Problem
Your robot has a LiDAR, a camera, and an IMU, each mounted at different positions and orientations on the chassis. A LiDAR point at (1.5, 0.3) in the LiDAR frame is useless until you know where that is in the robot's body frame or the world frame. You need a transform tree that tracks the spatial relationship between every sensor and the world, updated in real time as the robot moves.
How It Works
A transform tree is a directed acyclic graph (usually a tree) where:
- Each node is a coordinate frame (e.g., `world`, `base_link`, `lidar`, `camera`)
- Each edge stores a transform: translation (x, y, z) and rotation (quaternion)
- Static transforms (sensor mounts) are set once at startup and never change
- Dynamic transforms (localization) are updated every tick
To find the transform between any two frames, HORUS composes transforms along the path in the tree. For example, to get lidar in world coordinates:
T_lidar_in_world = T_base_in_world * T_lidar_in_base
The lookup(source, target) function does this composition automatically, walking the tree from source to target and multiplying transforms along the way.
Quaternion convention: HORUS uses (x, y, z, w) quaternion ordering. An identity rotation is {0, 0, 0, 1}. A 5-degree pitch (rotation about the Y axis) is {0, sin(2.5 deg), 0, cos(2.5 deg)} = {0, 0.0436, 0, 0.999}.
When To Use
- Any robot with multiple sensors mounted at different positions
- When you need sensor fusion that requires aligning data into a common frame
- When localization updates the robot's pose in the world frame
Prerequisites
- HORUS installed (Installation Guide)
- Basic understanding of nodes and topics (C++ Quick Start)
Complete Code
#include <horus/horus.hpp>
#include <cmath>
#include <cstdio>
using namespace horus::literals;
class LocalizationNode : public horus::Node {
public:
LocalizationNode(horus::TransformFrame& tf)
: Node("localizer"), tf_(tf)
{
odom_sub_ = subscribe<horus::msg::Odometry>("odom");
}
void tick() override {
if (auto odom = odom_sub_->recv()) {
// Update base_link position in world frame from odometry
double x = odom->get()->pose.x;
double y = odom->get()->pose.y;
double theta = odom->get()->pose.theta;
// Convert heading to quaternion (rotation around Z axis)
double half_angle = theta / 2.0;
tf_.update("base_link",
{x, y, 0.0}, // translation
{0, 0, std::sin(half_angle), std::cos(half_angle)}, // quaternion (x,y,z,w)
horus::now_ns()); // timestamp
}
}
void enter_safe_state() override {
// Stop updating transforms — downstream nodes see stale data
// and should interpret it as "localization degraded"
horus::blackbox::record("localizer",
"Entered safe state, transform updates halted");
}
private:
horus::TransformFrame& tf_;
horus::Subscriber<horus::msg::Odometry>* odom_sub_;
};
class SensorProcessorNode : public horus::Node {
public:
SensorProcessorNode(horus::TransformFrame& tf)
: Node("sensor_processor"), tf_(tf)
{
scan_sub_ = subscribe<horus::msg::LaserScan>("lidar.scan");
}
void tick() override {
auto scan = scan_sub_->recv();
if (!scan) return;
// ── Transform a LiDAR point into world coordinates ──────────
// Example: closest obstacle point in LiDAR frame
float min_range = 999.0f;
int min_idx = 0;
for (int i = 0; i < 360; i++) {
if (scan->get()->ranges[i] > 0.05f &&
scan->get()->ranges[i] < min_range) {
min_range = scan->get()->ranges[i];
min_idx = i;
}
}
if (min_range >= 999.0f) return; // no valid reading in this scan
// Point in LiDAR frame: (range * cos(angle), range * sin(angle), 0)
double angle_rad = min_idx * M_PI / 180.0;
double lx = min_range * std::cos(angle_rad);
double ly = min_range * std::sin(angle_rad);
// Look up the transform from lidar to world
if (auto t = tf_.lookup("lidar", "world")) {
// t->translation gives lidar origin in world frame
// Full point transform: world_point = R * lidar_point + t
// For 2D (z=0): approximate with translation only
double wx = t->translation[0] + lx;
double wy = t->translation[1] + ly;
if (++tick_count_ % 50 == 0) {
char buf[128];
std::snprintf(buf, sizeof(buf),
"Closest obstacle: lidar=(%.2f, %.2f) world=(%.2f, %.2f)",
lx, ly, wx, wy);
horus::log::info("tf_demo", buf);
}
}
// ── Check sensor alignment ──────────────────────────────────
if (tf_.can_transform("camera", "lidar")) {
// Camera-to-lidar transform exists — can align detections
// Use for matching camera bounding boxes with LiDAR point clouds
}
}
private:
horus::TransformFrame& tf_;
horus::Subscriber<horus::msg::LaserScan>* scan_sub_;
int tick_count_ = 0;
};
int main() {
horus::TransformFrame tf;
// ── Build frame tree ────────────────────────────────────────────
// world
// +-- base_link (dynamic — updated by localization)
// +-- lidar (static — front of robot, 20cm forward, 30cm up)
// +-- camera (static — front, 10cm forward, 25cm up, 5 deg pitch down)
// +-- imu (static — center, 15cm up)
tf.register_frame("world");
tf.register_frame("base_link", "world");
tf.register_frame("lidar", "base_link");
tf.register_frame("camera", "base_link");
tf.register_frame("imu", "base_link");
// Static transforms: sensor mounts (set once, timestamp 0 = static)
// These must match the physical mounting on your robot
tf.update("lidar", {0.20, 0.0, 0.30}, {0, 0, 0, 1}, 0);
tf.update("camera", {0.10, 0.0, 0.25}, {0, 0.0436, 0, 0.999}, 0); // ~5 deg pitch down (about Y)
tf.update("imu", {0.0, 0.0, 0.15}, {0, 0, 0, 1}, 0);
horus::log::info("tf_demo", "Frame tree built: world -> base_link -> {lidar, camera, imu}");
// ── Build scheduler ─────────────────────────────────────────────
horus::Scheduler sched;
sched.tick_rate(50_hz).name("tf_demo");
// Localization updates base_link in world frame
LocalizationNode localizer(tf);
sched.add(localizer)
.order(0) // run first — other nodes need fresh transforms
.budget(2_ms)
.on_miss(horus::Miss::Warn) // warn on overrun — stale transforms degrade accuracy
.build();
// Sensor processor queries transforms to map LiDAR points into world
SensorProcessorNode processor(tf);
sched.add(processor)
.order(10) // after localization — uses updated transforms
.budget(5_ms)
.on_miss(horus::Miss::Skip) // skip if overrun — stale LiDAR data is not useful
.build();
sched.spin();
}
Common Errors
| Symptom | Cause | Fix |
|---|---|---|
| `lookup()` returns null | Frame tree is disconnected or frames not registered | Check that parent frames are registered before children |
| LiDAR points in wrong world position | Static transform does not match physical sensor mount | Measure sensor offsets on the actual robot hardware |
| Transform stale / outdated | Localization node runs after consumer node | Set localization order(0) so it runs first each tick |
| Rotation wrong | Quaternion components in wrong order | HORUS uses (x, y, z, w) — identity is {0, 0, 0, 1} |
| Frame tree breaks on second run | Stale SHM from previous run | Run horus clean --shm between runs |
| Camera-lidar alignment off | Camera pitch angle wrong in static transform | Recalculate the quaternion: pitch puts sin(angle/2) in the y-component |
Design Decisions
| Choice | Rationale |
|---|---|
| Static transforms with timestamp 0 | Sensor mounts do not change — setting once avoids redundant updates |
| Dynamic base_link updated per tick | Robot pose changes continuously from localization |
| `horus::Node` subclass for consumers | State (transform references) persists cleanly across ticks |
| Localization at `order(0)` | Downstream nodes need fresh transforms — localization must run first |
| `can_transform()` before complex queries | Graceful handling when frame tree is incomplete during startup |
| Quaternion rotation (not Euler angles) | No gimbal lock, composable, compact (4 floats vs 3x3 matrix) |
| Tree structure (not graph) | Single parent per frame guarantees unique transform paths |
Variations
Robot arm with end effector — chain of joint transforms:
// Each joint is a child of the previous joint
tf.register_frame("base_link", "world");
tf.register_frame("shoulder", "base_link");
tf.register_frame("elbow", "shoulder");
tf.register_frame("wrist", "elbow");
tf.register_frame("end_effector", "wrist");
// Update joint angles each tick (rotation about the joint's Z axis).
// Sketch of a node member function; link_length_ is the parent link's length.
void update_joint(const std::string& name, double angle_rad) {
double half = angle_rad / 2.0;
tf_.update(name, {0.0, 0.0, link_length_},
{0, 0, std::sin(half), std::cos(half)}, horus::now_ns());
}
Multi-robot transforms — separate subtrees per robot:
tf.register_frame("world");
tf.register_frame("robot_a", "world");
tf.register_frame("robot_a/lidar", "robot_a");
tf.register_frame("robot_b", "world");
tf.register_frame("robot_b/lidar", "robot_b");
// Now you can lookup robot_a/lidar -> robot_b/lidar for fleet coordination
Publish transforms on a topic for cross-process sharing:
// Publisher node
horus::msg::TFMessage tf_msg{};
tf_msg.frame = "base_link";
tf_msg.parent = "world";
tf_msg.translation = {x, y, 0};
tf_msg.rotation = {0, 0, sin_half, cos_half};
tf_pub_->send(tf_msg);
// Subscriber node in another process
if (auto msg = tf_sub_->recv()) {
tf_.update(msg->get()->frame, msg->get()->translation,
msg->get()->rotation, horus::now_ns());
}
Key Takeaways
- Register the frame tree once at startup; update static transforms with timestamp 0
- Localization must run before any node that queries transforms (`order(0)`)
- `lookup()` composes transforms along the tree path automatically — no manual chain multiplication
- Measure sensor offsets on the actual robot — CAD models drift from physical reality
- `can_transform()` is cheap — use it to guard lookups during startup when the tree may be incomplete