Simulation
HORUS separates your robot logic from the hardware it runs on. The same node code that reads from a physical LiDAR or IMU works identically with simulated sensors, because both publish to the same shared-memory topics using the same message types.
Overview
horus-sim3d is a 3D physics simulator distributed as a HORUS CLI plugin. It is not bundled with HORUS -- you install it separately:
horus install horus-sim3d
Once installed, one flag swaps between simulation and real hardware:
horus run # real hardware
horus run --sim # simulated sensors via horus-sim3d
Your robot code does not change. Nodes subscribe to topics like turtlebot.front_lidar.scan regardless of whether the data comes from a physical RPLidar or a simulated laser scanner.
Quick Start
1. Create a Project
horus new my_robot
cd my_robot
2. Add a Robot Description
Edit horus.toml to point at your URDF file:
[robot]
name = "turtlebot"
description = "robot.urdf"
Place your URDF file in the project root (or a subdirectory -- the path is relative to the project root).
3. Define Hardware
Add a [hardware] section listing every device. Mark devices that should run in simulation with sim = true:
[hardware]
lidar = { use = "rplidar", port = "/dev/ttyUSB0", sim = true, noise = 0.01 }
imu = { use = "bno055", bus = "/dev/i2c-1", sim = true }
4. Run in Simulation
horus run --sim
This launches horus-sim3d, loads your URDF, and starts publishing simulated sensor data on the same topics your nodes already subscribe to. Only devices with sim = true are replaced by the simulator -- the rest connect to real hardware as usual.
5. Run on Real Hardware
When you are ready for the real robot:
horus run
No code changes. The scheduler loads the [hardware] section and connects to physical hardware. The sim field is ignored when running without --sim.
Configuration
The [robot] Section
The [robot] table in horus.toml tells HORUS about your robot model:
[robot]
name = "turtlebot" # used in topic naming
description = "robot.urdf" # URDF file path (passed to the simulator)
simulator = "sim3d" # which simulator plugin (optional, default: sim3d)
| Key | Required | Description |
|---|---|---|
| name | Yes | Robot name. Used as the first segment in all topic names (e.g., turtlebot.front_lidar.scan) |
| description | No | Path to a URDF, Xacro, SDF, or MJCF file, relative to the project root. Passed to the simulator for model loading |
| simulator | No | Simulator plugin name. Defaults to "sim3d". HORUS launches horus-{simulator} when you run with --sim |
The [hardware] Section
[hardware] defines all your devices in one place. Mark a device with sim = true to have the simulator replace it when running in --sim mode.
When you run horus run --sim, devices with sim = true are handed off to the simulator; devices without sim = true (or with sim = false) stay connected to real hardware. This lets you mix real and simulated hardware in a single configuration block.
[hardware]
lidar = { use = "rplidar", port = "/dev/ttyUSB0", sim = true, noise = 0.01 }
imu = { use = "bno055", bus = "/dev/i2c-1", sim = true }
camera = { use = "realsense" }
# camera has no sim = true -- stays real even in --sim mode
Mixed Mode
You can selectively simulate individual devices while keeping others connected to real hardware:
horus run --sim lidar # only lidar is simulated, IMU + camera stay real
horus run --sim lidar camera # lidar + camera simulated, IMU stays real
horus run --sim # all devices with sim = true are simulated
This is useful for testing a new perception algorithm against simulated LiDAR while the robot's IMU and motors are connected to real hardware.
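The selection rules above can be sketched in a few lines. This is an illustrative helper (partition_devices is hypothetical, not part of the HORUS API); it assumes that devices named explicitly after --sim are simulated regardless of their sim flag, matching the examples above:

```python
def partition_devices(hardware, sim_mode, sim_only=None):
    """Split a [hardware] table into simulated vs. real devices.

    hardware: dict of device name -> config dict (parsed from horus.toml)
    sim_mode: True when --sim was passed
    sim_only: optional list of device names given after --sim
    """
    simulated, real = {}, {}
    for name, cfg in hardware.items():
        if sim_mode and sim_only:
            # Selective mode: only the explicitly named devices are simulated
            target = simulated if name in sim_only else real
        elif sim_mode:
            # Full --sim: honor each device's sim flag
            target = simulated if cfg.get("sim", False) else real
        else:
            # Real run: the sim field is ignored entirely
            target = real
        target[name] = cfg
    return simulated, real


hardware = {
    "lidar": {"use": "rplidar", "sim": True},
    "imu": {"use": "bno055", "sim": True},
    "camera": {"use": "realsense"},
}

# Full --sim: lidar and imu are simulated, camera stays real
sim, real = partition_devices(hardware, sim_mode=True)
```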
Hardware Types
Each entry in [hardware] specifies a device source:
| Type | TOML Syntax | What It Does |
|---|---|---|
| Terra | use = "rplidar" | Uses a pre-built Terra hardware driver |
| Package | package = "horus-driver-ati-netft" | Installs and runs a registry driver package |
| Node | node = "ConveyorDriver" | Runs a local node registered via register_driver! |
| Crate | crate = "rplidar-driver" | Adds a Rust crate from crates.io to .horus/Cargo.toml |
| PyPI | pip = "adafruit-bno055" | Adds a Python package from PyPI to .horus/pyproject.toml |
| Exec | exec = "./sensor_bridge.py" | Launches as a subprocess with health monitoring |
Any device can include sim = true alongside its source to mark it as simulatable. When --sim is active for that device, the simulator takes over instead of loading the real driver.
Devices can include additional parameters as key-value pairs:
[hardware.arm]
use = "dynamixel"
port = "/dev/ttyUSB0"
baudrate = 1000000
servo_ids = [1, 2, 3, 4, 5, 6]
sim = true
Topic Naming Convention
HORUS uses a shared topic naming convention that both real hardware and the simulator follow. This is why your code works in both modes without changes.
Format: {robot_name}.{sensor_name}.{data_type}
Topics use dots as separators (not slashes). This is required because HORUS topics are backed by shared memory files, and shm_open() on macOS does not allow slashes in names.
Standard Data Type Suffixes
| Suffix | Sensor / Use | Message Type |
|---|---|---|
| scan | 2D/3D LiDAR | LaserScan |
| imu | IMU (accel + gyro) | Imu |
| gps | GPS receiver | Gps |
| image | RGB camera | Image |
| depth | Depth camera | Image |
| camera_info | Camera intrinsics | CameraInfo |
| odom | Wheel odometry | Odometry |
| cmd_vel | Velocity commands | CmdVel |
| joint_state | Joint positions/velocities | JointState |
| joint_cmd | Joint commands | JointCommand |
| wrench | Force/torque sensor | Wrench |
| sonar | Ultrasonic sonar | Sonar |
| encoder | Rotary encoder | Encoder |
| radar | Radar with Doppler | Radar |
| segmentation | Semantic segmentation | Image |
| thermal | Thermal/IR camera | Image |
| event_camera | Dynamic vision sensor | EventCamera |
| pointcloud | 3D point cloud | PointCloud |
Examples
turtlebot.front_lidar.scan # 2D LiDAR scan data
turtlebot.imu_sensor.imu # IMU readings
turtlebot.rgb_camera.image # RGB camera image
turtlebot.rgb_camera.depth # Depth from an RGB-D camera
turtlebot.rgb_camera.camera_info # Camera intrinsics
turtlebot.odom # Robot-level odometry (no sensor_name)
turtlebot.cmd_vel # Velocity commands (no sensor_name)
turtlebot.joint_state # Joint positions (no sensor_name)
Robot-level topics (odometry, velocity commands, joint state) omit the sensor name segment -- they apply to the whole robot, not a specific sensor.
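The convention is mechanical enough to express as a tiny helper. This is an illustrative sketch (topic_name is hypothetical, not a HORUS API); it also enforces the no-slashes rule from above:

```python
def topic_name(robot, data_type, sensor=None):
    """Build a HORUS-style topic name: {robot}.{sensor}.{data_type}.

    Robot-level topics (odom, cmd_vel, joint_state) omit the sensor segment.
    Dots, not slashes: the backing shared-memory names must be slash-free.
    """
    parts = [robot, sensor, data_type] if sensor else [robot, data_type]
    name = ".".join(parts)
    if "/" in name:
        raise ValueError(f"slashes are not allowed in topic names: {name!r}")
    return name


assert topic_name("turtlebot", "scan", sensor="front_lidar") == "turtlebot.front_lidar.scan"
assert topic_name("turtlebot", "odom") == "turtlebot.odom"
```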
Sim Control Services
When horus-sim3d is running, your nodes can control the simulation at runtime through service topics. Each service has a .request topic (you write to) and a .response topic (you read from).
| Service | Topic Name | Request Type | What It Does |
|---|---|---|---|
| Spawn | sim.spawn | SpawnRequest | Spawn a model (URDF, SDF) or primitive (box, sphere, cylinder) |
| Despawn | sim.despawn | DespawnRequest | Remove an entity by ID |
| Teleport | sim.teleport | TeleportRequest | Instantly move an entity to a new pose |
| Pause | sim.pause | PauseRequest | Pause physics and sensor updates |
| Resume | sim.resume | ResumeRequest | Resume a paused simulation |
| Raycast | sim.raycast | RaycastRequest | Cast a ray and get the hit point, normal, and entity ID |
| Get State | sim.state.get | GetStateRequest | Query sim time, entity count, paused status, physics dt |
| Set Param | sim.param.set | SetParamRequest | Change gravity or physics timestep at runtime |
These service names are defined in horus_library::topic_convention::service and are simulator-agnostic -- any simulator plugin that implements them will work with horus run --sim.
Example: Spawning an Obstacle
Rust:
use horus::prelude::*;
use horus_library::messages::simulation::*;
// Create request/response topics
let req: Topic<SpawnRequest> = Topic::new("sim.spawn.request")?;
let resp: Topic<SpawnResponse> = Topic::new("sim.spawn.response")?;
// Spawn a box at position (2, 0.5, 0) with size 1x1x1
req.send(SpawnRequest {
model: "box".into(),
name: "obstacle_1".into(),
position: [2.0, 0.5, 0.0],
scale: [1.0, 1.0, 1.0],
..Default::default()
});
// Wait for response
std::thread::sleep(std::time::Duration::from_millis(50));
if let Some(response) = resp.recv() {
if response.success {
println!("Spawned entity {} with ID {}", response.name, response.entity_id);
}
}
Python:
# NOTE: SpawnRequest/SpawnResponse are not yet available in the Python
# bindings, so simulation control from Python uses dict-based topics:
import horus
spawn_req = horus.Topic("sim.spawn.request")
spawn_req.send({
"model": "box",
"name": "obstacle_1",
"position": [2.0, 0.5, 0.0],
"scale": [1.0, 1.0, 1.0],
})
sim.toml (Optional)
sim.toml is an optional configuration file for horus-sim3d. It provides fine-grained control over physics, sensors, actuators, world setup, and rendering. Place it in your project root and pass it with --config:
horus sim3d --config sim.toml --robot robot.urdf
sim.toml is entirely optional. horus-sim3d has sensible defaults for all settings. You only need it when you want to tune physics parameters, add sensor noise, override sensor rates, define world objects, or configure actuator models.
Physics Configuration
[physics]
dt = 0.001 # timestep in seconds (default: 1/240)
solver = "newton" # "newton", "lcp", or "smooth"
integrator = "semi_implicit_euler" # "semi_implicit_euler", "rk4", "velocity_verlet", "implicit_euler"
max_iterations = 100
substeps = 1
[physics.contact]
damping = 0.5
stiffness = 100000.0
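To make the integrator and contact parameters concrete, here is a minimal sketch of one physics step for a point mass bouncing on the ground plane. It assumes a unit mass and a penalty (spring-damper) contact model, which is one common way such stiffness/damping parameters are realized; sim3d's actual solver may differ:

```python
def step(x, v, dt, g=-9.81, stiffness=100000.0, damping=0.5):
    """One semi-implicit Euler step for a unit point mass above the ground.

    Contact force while penetrating (x < 0), penalty model:
        F = stiffness * penetration - damping * velocity
    """
    a = g
    if x < 0.0:  # penetrating the ground plane
        a += stiffness * (-x) - damping * v
    v = v + a * dt  # semi-implicit: update velocity first...
    x = x + v * dt  # ...then position, using the new velocity
    return x, v


# Drop from 1 m and simulate 1 s at the configured dt = 0.001
x, v = 1.0, 0.0
for _ in range(1000):
    x, v = step(x, v, 0.001)
```

Semi-implicit Euler is a common default for games and robotics simulators because it stays stable on stiff spring contacts where explicit Euler blows up.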
World Configuration
[world]
ground = true # enable ground plane
gravity = [0.0, -9.81, 0.0] # gravity vector [x, y, z] in m/s^2
scene = "warehouse.yaml" # optional scene file
[[world.objects]]
name = "wall_1"
model = "box"
position = [5.0, 1.0, 0.0]
scale = [0.2, 2.0, 10.0]
is_static = true
[world.terrain]
type = "heightfield"
source = "terrain.png"
size = [100.0, 100.0]
height_scale = 5.0
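A heightfield source is typically interpreted by mapping grayscale pixel values onto heights scaled by height_scale. The sketch below shows that common mapping on a tiny hand-written grid (the exact interpolation sim3d applies may differ):

```python
def heightfield(pixels, height_scale):
    """Convert grayscale pixel values (0-255) to terrain heights in meters:
    height = (pixel / 255) * height_scale. Black is ground level, white is
    the full height_scale."""
    return [[(p / 255.0) * height_scale for p in row] for row in pixels]


# A 1x3 strip: black, mid-gray, white
grid = heightfield([[0, 128, 255]], height_scale=5.0)
```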
Sensor Overrides
[sensors.front_lidar]
type = "lidar"
link = "lidar_link" # URDF link to attach to
rate_hz = 10 # publish rate (overrides URDF value)
rays = 360
range = [0.1, 30.0]
fov = 360.0
noise = { type = "gaussian", std = 0.01 }
[sensors.imu_sensor]
type = "imu"
link = "imu_link"
rate_hz = 100
noise = { type = "gaussian", std = 0.005 }
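The gaussian noise table above means each reading gets independent zero-mean Gaussian noise with the configured standard deviation (std = 0.01 is roughly 1 cm of jitter on a range reading). A minimal sketch of that model, not sim3d's actual implementation:

```python
import random


def apply_gaussian_noise(ranges, std, seed=None):
    """Add zero-mean Gaussian noise with the given std to each range reading,
    as configured by a sensor's noise = { type = "gaussian", std = ... } table."""
    rng = random.Random(seed)
    return [r + rng.gauss(0.0, std) for r in ranges]


clean = [1.0, 2.0, 3.0]
noisy = apply_gaussian_noise(clean, std=0.01, seed=42)
```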
Actuator Configuration
[actuators.drive]
type = "differential_drive"
topic = "cmd_vel"
wheel_radius = 0.033
wheel_separation = 0.16
max_speed = 1.0
max_torque = 2.0
[actuators.drive.motor_model]
type = "dc" # DC motor model for sim-to-real fidelity
[actuators.drive.latency]
command_ms = 5 # simulated command latency
sensor_ms = 2 # simulated sensor feedback latency
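A differential_drive actuator converts incoming cmd_vel messages into wheel velocities using the wheel_radius and wheel_separation above. The standard inverse kinematics it applies can be sketched as follows (an illustration of the math, not sim3d's code):

```python
def wheel_speeds(linear_x, angular_z, wheel_radius=0.033, wheel_separation=0.16):
    """Differential-drive inverse kinematics: convert a cmd_vel
    (linear_x in m/s, angular_z in rad/s) into left/right wheel
    angular velocities in rad/s."""
    v_left = linear_x - angular_z * wheel_separation / 2.0   # m/s at the wheel
    v_right = linear_x + angular_z * wheel_separation / 2.0
    return v_left / wheel_radius, v_right / wheel_radius     # linear -> angular


left, right = wheel_speeds(0.3, 0.0)  # driving straight: both wheels equal
```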
Full sim.toml Structure
| Section | Purpose |
|---|---|
| [robot] | Robot name and URDF path |
| [physics] | Timestep, solver, integrator, contact parameters |
| [world] | Ground plane, gravity, terrain, static/dynamic objects |
| [sensors.*] | Per-sensor configuration: type, rate, noise, link attachment |
| [actuators.*] | Per-actuator configuration: type, motor model, latency |
| [controllers.*] | Controller configuration (PID, etc.) |
| [materials.*] | Physics material definitions (friction, restitution) |
| [topics] | Topic naming and remapping |
| [visual] | Rendering settings and camera modes |
| [recording] | Trajectory and sensor data recording |
| [rl] | Reinforcement learning training configuration |
URDF Sensors
horus-sim3d reads sensor definitions from your URDF file's <gazebo> elements. This is the primary source for sensor placement and configuration:
<robot name="turtlebot">
<link name="lidar_link">
<!-- link geometry -->
</link>
<gazebo reference="lidar_link">
<sensor type="ray" name="front_lidar">
<ray>
<scan>
<horizontal>
<samples>360</samples>
<min_angle>-3.14159</min_angle>
<max_angle>3.14159</max_angle>
</horizontal>
</scan>
<range>
<min>0.1</min>
<max>30.0</max>
</range>
</ray>
<update_rate>10</update_rate>
</sensor>
</gazebo>
</robot>
The [sensors] section in sim.toml overlays on top of what is in the URDF. You do not need to duplicate sensor definitions -- only specify what you want to override:
# Override just the rate and add noise -- geometry comes from the URDF
[sensors.front_lidar]
type = "lidar"
rate_hz = 20 # override URDF's 10 Hz to 20 Hz
noise = { type = "gaussian", std = 0.02 }
If a sensor name in sim.toml matches a URDF sensor name, the sim.toml values are merged onto the URDF values. If the name does not match any URDF sensor, sim3d spawns it as an additional sensor.
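The merge is a shallow overlay: keys present in the sim.toml table win, everything else keeps its URDF-derived value. A minimal sketch of that semantics (illustrative only; merge_sensor_config is hypothetical):

```python
def merge_sensor_config(urdf_sensor, override):
    """Overlay sim.toml [sensors.*] values onto the URDF-derived config.

    Keys present in the override win; everything else (geometry, ray
    counts, ...) is kept from the URDF definition.
    """
    merged = dict(urdf_sensor)
    merged.update(override)
    return merged


urdf = {"type": "lidar", "rays": 360, "rate_hz": 10, "range": [0.1, 30.0]}
toml = {"rate_hz": 20, "noise": {"type": "gaussian", "std": 0.02}}
cfg = merge_sensor_config(urdf, toml)
# rate_hz comes from sim.toml; rays and range still come from the URDF
```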
Supported Sensor Types
horus-sim3d supports 16 sensor types:
| Type | sim.toml type | Data Suffix | Description |
|---|---|---|---|
| 2D/3D LiDAR | lidar | scan | Ray-based laser scanner |
| IMU | imu | imu | Accelerometer + gyroscope |
| RGB Camera | rgb_camera | image | Color camera with render pipeline |
| Depth Camera | depth_camera | depth | Depth-only camera |
| Stereo Camera | stereo_camera | image | Stereo camera pair |
| GPS | gps | gps | GNSS receiver |
| Force/Torque | force_torque | wrench | 6-axis force/torque on a joint |
| Contact | contact | contact | Binary contact detection |
| Encoder | encoder | encoder | Rotary or absolute encoder on a joint |
| Sonar | sonar | sonar | Ultrasonic distance sensor |
| Radar | radar | radar | Doppler radar |
| Thermal Camera | thermal_camera | thermal | Infrared camera |
| Event Camera | event_camera | event_camera | Dynamic vision sensor (neuromorphic) |
| Barometer | barometer | barometer | Atmospheric pressure sensor |
| Magnetometer | magnetometer | magnetometer | 3-axis compass |
| Altimeter | altimeter | altimeter | Altitude sensor |
Complete Example
Here is a complete project setup for a differential-drive robot with LiDAR, IMU, and a camera.
horus.toml:
[package]
name = "my-robot"
version = "0.1.0"
[robot]
name = "turtlebot"
description = "robot.urdf"
[hardware]
lidar = { use = "rplidar", port = "/dev/ttyUSB0", sim = true, noise = 0.01 }
imu = { use = "bno055", bus = "/dev/i2c-1", sim = true }
camera = { use = "realsense", sim = true }
src/main.rs (works in both sim and real):
use horus::prelude::*;
fn main() -> anyhow::Result<()> {
let scan: Topic<LaserScan> = Topic::new("turtlebot.front_lidar.scan")?;
let _imu_data: Topic<Imu> = Topic::new("turtlebot.imu_sensor.imu")?; // unused in this example
let cmd: Topic<CmdVel> = Topic::new("turtlebot.cmd_vel")?;
Scheduler::new()
.add("navigator", |_ctx| {
// Read latest sensor data
if let Some(scan) = scan.recv() {
let min_range = scan.ranges.iter().copied().fold(f32::MAX, f32::min);
if min_range < 0.5 {
// Obstacle close -- turn
cmd.send(CmdVel { linear_x: 0.0, angular_z: 0.5, ..Default::default() });
} else {
// Clear path -- drive forward
cmd.send(CmdVel { linear_x: 0.3, angular_z: 0.0, ..Default::default() });
}
}
})
.rate(10.hz())
.build()?;
Ok(())
}
Run it:
# Test in simulation first
horus run --sim
# Deploy to real hardware
horus run
sim3d Deep Dive
The horus-sim3d plugin has extensive documentation covering every aspect of the simulator. These guides go deeper than this overview page.
References
| Guide | What it covers |
|---|---|
| CLI Reference | All sim3d flags: --mode, --world, --speed, --no-gui, --namespace, --driver-mode |
| sim.toml Configuration | Complete reference for physics, world, sensors, actuators, materials, visual, recording, RL (1300+ lines) |
| Scene Format | YAML world file schema — static objects, terrain, lighting, spawn points |
| Robot Loading | URDF, MJCF, SDF parsing — how robot models are loaded and configured |
| Sensor Reference | All 16 sensor types — configuration, noise models, rate tuning, URDF overlay |
| Actuators Reference | Motor models, latency simulation, traction, battery |
| Topic Wiring | How sim3d topics map to horus hardware topics — the sim/real swap mechanism |
| Physics Engine | Featherstone ABA/RNEA/CRBA, LCP contacts, solver selection, integrators |
| Performance Tuning | Substeps, dt, headless mode, GPU acceleration, profiling |
| Multi-Robot | Namespace isolation, multi-sim coordination |
| Recording | Trajectory capture, sensor data recording, video export |
| RL Training | GymVecEnv, domain randomization, curriculum, Python bindings |
| Cloud Deployment | Running sim3d headless in cloud/CI environments |
| Editor | Entity inspector, visual debugging tools |
Tutorials
| Tutorial | What you build |
|---|---|
| 1: Basic Simulation | Load a world, spawn objects, control physics |
| 2: Robot Simulation | Load URDF, wire sensors, drive a robot |
| 3: Sensors | Configure all sensor types, add noise, tune rates |
| 4: Reinforcement Learning | Set up RL training with vectorized environments |
How horus run --sim Works
When you run horus run --sim, the CLI:
- Reads [robot].name and [robot].description from horus.toml
- Sets HORUS_SIM_MODE=1 so the hardware system identifies devices with sim = true
- Launches sim3d as a background process: horus-sim3d --driver-mode --robot robot.urdf --robot-name turtlebot
- The --driver-mode flag tells sim3d to use the shared topic naming convention ({robot}.{sensor}.{type}) without a namespace prefix -- so topics match what real hardware produces
- sim3d parses the URDF, creates the physics world, attaches sensors, and starts publishing to HORUS SHM topics
- Your code runs normally -- nodes subscribe to the same topics regardless of sim or real
- On exit, horus kills the sim3d process
If sim3d is not installed, horus run --sim exits with a helpful message: Install with: horus install horus-sim3d
Next Steps
- Deterministic Mode -- reproducible execution with virtual time for simulation and testing
- Package Management -- install plugins and packages from the HORUS registry
- Standard Messages -- message types used by topics