Grading (100 points)

  • World + object placement — 20
  • Detection node + metadata correctness — 25
  • Zenoh publishing + key design — 20
  • Ingest worker + idempotent DB writes — 25
  • Reproducibility report — 10
You will extend the TurtleBot maze simulation by placing semantic objects, detecting them from the robot camera, and publishing structured detection events over Zenoh. Those events will be ingested into PostgreSQL as durable records. Your final system will stream perception events from the robot to the database through Zenoh key expressions.

Background — Zenoh + ROS 2 + TurtleBot

Watch these before you begin, then study the reference implementation.

Zenoh/ROS 2 Bridge Presentation and TurtleBot4 Demo

The TurtleBot demo walkthrough starts at 25:25.

Live RMW Zenoh Workshop

Reference: zturtle-python

The zturtle-python demo shows a working TurtleBot3 controlled entirely over Zenoh — it subscribes to Twist messages for wheel velocity, publishes a camera feed, and handles WiFi network transitions by reconnecting to a new peer/router. Study this implementation before designing your detection event pipeline.

Environment Requirements

You will run:
  • TurtleBot3 maze simulation (provided repo)
  • ROS 2 Humble
  • Zenoh router (zenohd)
  • PostgreSQL
  • Python 3.10+
Expected ROS topics include:
  • /odom
  • /tf, /tf_static
  • /camera/image_raw (or your configured camera topic)
  • /cmd_vel

1. Maze Object Placement

Bench layout

Modify the maze world:
  • Create 3–4 benches touching maze walls.
  • Bench height must allow camera visibility.
  • Place 3–5 objects per bench.
  • Use COCO-class objects only (cup, bottle, book, laptop, mouse, etc.).
  • Each bench must have a different object set.

Deliverable

Create world_assets.md containing:
  • Bench IDs
  • Bench approximate poses
  • Object list per bench
  • Two screenshots per bench from robot camera view

2. Detection Node

Implement a detector process that:
  • Subscribes to the camera topic
  • Runs object detection (Ultralytics YOLO recommended)
  • Extracts bounding boxes, class, confidence
  • Samples robot state at the same timestamp

Required metadata per frame

From image message:
  • timestamp
  • frame_id
  • width, height, encoding
From /odom:
  • x, y, yaw
  • vx, vy, wz
From TF:
  • base frame
  • camera frame
  • base→camera transform (4×4) if available
Time alignment must use the image timestamp.
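The metadata assembly can be sketched independently of the ROS plumbing. Assuming your detector callback has already pulled the fields out of the Image and nearest-in-time Odometry messages into plain dicts (the helper names here are hypothetical, not part of any required API), a builder like this collects the required per-frame record, keyed to the image timestamp, and derives yaw from the odometry quaternion:

```python
import hashlib
import math
import uuid

def yaw_from_quaternion(qx, qy, qz, qw):
    """Planar yaw (rotation about z) extracted from a quaternion."""
    return math.atan2(2.0 * (qw * qz + qx * qy),
                      1.0 - 2.0 * (qy * qy + qz * qz))

def build_frame_record(image, odom, run_id, robot_id, sequence):
    """Assemble the per-frame metadata record.

    `image` and `odom` are plain dicts mirroring the ROS message fields;
    time alignment uses the image stamp, so `odom` is assumed to be the
    odometry sample closest to that stamp.
    """
    return {
        "schema": "maze.detection.v1",
        "event_id": str(uuid.uuid4()),
        "run_id": run_id,
        "robot_id": robot_id,
        "sequence": sequence,
        "image": {
            "topic": image["topic"],
            "stamp": image["stamp"],   # {"sec": ..., "nanosec": ...}
            "frame_id": image["frame_id"],
            "width": image["width"],
            "height": image["height"],
            "encoding": image["encoding"],
            "sha256": hashlib.sha256(image["data"]).hexdigest(),
        },
        "odometry": {
            "topic": "/odom",
            "frame_id": odom["frame_id"],
            "x": odom["x"], "y": odom["y"],
            "yaw": yaw_from_quaternion(*odom["quat"]),
            "vx": odom["vx"], "vy": odom["vy"], "wz": odom["wz"],
        },
    }
```

Keeping this assembly pure (no ROS imports) makes it easy to unit-test before wiring it into the subscriber callbacks.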

3. Zenoh Event Publishing

Start Zenoh

Run a Zenoh router:
zenohd

Key expression design

All detection events must use:
maze/<robot_id>/<run_id>/detections/v1/<event_id>
Run metadata (exactly once):
maze/<robot_id>/<run_id>/runmeta/v1
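Building these keys from the run identifiers can be as simple as the following sketch (helper names are illustrative, not a required API):

```python
import uuid

def detection_key(robot_id: str, run_id: str, event_id: str) -> str:
    """Key expression for a single detection event."""
    return f"maze/{robot_id}/{run_id}/detections/v1/{event_id}"

def runmeta_key(robot_id: str, run_id: str) -> str:
    """Key for the once-per-run metadata record."""
    return f"maze/{robot_id}/{run_id}/runmeta/v1"

# One run_id per simulation run; one event_id per published frame.
run_id = str(uuid.uuid4())
key = detection_key("tb3_sim", run_id, str(uuid.uuid4()))
```

Putting the schema version (`v1`) in the key, rather than only in the payload, lets subscribers pin a version with their key expression.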

Detection event JSON schema

Publish JSON with this structure:
{
  "schema": "maze.detection.v1",
  "event_id": "uuid",
  "run_id": "uuid",
  "robot_id": "tb3_sim",
  "sequence": 1023,
  "image": {
    "topic": "/camera/image_raw",
    "stamp": {"sec": 0, "nanosec": 0},
    "frame_id": "camera_link",
    "width": 640,
    "height": 480,
    "encoding": "rgb8",
    "sha256": "hex"
  },
  "odometry": {
    "topic": "/odom",
    "frame_id": "odom",
    "x": 1.2,
    "y": 3.4,
    "yaw": 0.5,
    "vx": 0.1,
    "vy": 0.0,
    "wz": 0.02
  },
  "tf": {
    "base_frame": "base_footprint",
    "camera_frame": "camera_link",
    "t_base_camera": [16],
    "tf_ok": true
  },
  "detections": [
    {
      "det_id": "uuid",
      "class_id": 41,
      "class_name": "cup",
      "confidence": 0.91,
      "bbox_xyxy": [x1,y1,x2,y2]
    }
  ]
}
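The publishing side can keep the sequencing and serialization logic independent of the Zenoh session. In this sketch, `put_fn` stands in for `session.put` (with the eclipse-zenoh Python bindings, `session = zenoh.open(zenoh.Config())` and then `session.put(key, payload)`); the wrapper class is illustrative, not a required design:

```python
import itertools
import json
import uuid

class DetectionPublisher:
    """Stamps events with a UUID event_id and a monotonic per-run
    sequence, then publishes them under the required key expression."""

    def __init__(self, put_fn, robot_id: str, run_id: str):
        self._put = put_fn              # e.g. zenoh session.put
        self.robot_id = robot_id
        self.run_id = run_id
        self._seq = itertools.count()   # monotonic sequence per run

    def publish(self, event: dict) -> str:
        event = dict(event,
                     event_id=str(uuid.uuid4()),
                     run_id=self.run_id,
                     robot_id=self.robot_id,
                     sequence=next(self._seq))
        key = (f"maze/{self.robot_id}/{self.run_id}"
               f"/detections/v1/{event['event_id']}")
        self._put(key, json.dumps(event).encode())
        return key
```

Injecting `put_fn` also makes the publisher trivial to test with a capture function instead of a live session.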

Reliability rule

Guarantee correct, replay-safe ingestion with:
  • UUID event_id
  • Monotonic sequence per run
  • Idempotent database inserts

4. PostgreSQL Schema (Required)

CREATE TABLE detection_events (
  event_id uuid PRIMARY KEY,
  run_id uuid NOT NULL,
  robot_id text NOT NULL,
  sequence bigint NOT NULL,
  stamp timestamptz NOT NULL,

  image_frame_id text,
  image_sha256 text,
  width int,
  height int,
  encoding text,

  x double precision,
  y double precision,
  yaw double precision,
  vx double precision,
  vy double precision,
  wz double precision,

  tf_ok boolean,
  t_base_camera double precision[],

  raw_event jsonb NOT NULL,

  UNIQUE(run_id, robot_id, sequence)
);

CREATE TABLE detections (
  det_pk bigserial PRIMARY KEY,
  event_id uuid REFERENCES detection_events(event_id) ON DELETE CASCADE,
  det_id uuid,
  class_id int,
  class_name text,
  confidence double precision,
  x1 double precision,
  y1 double precision,
  x2 double precision,
  y2 double precision,
  UNIQUE(event_id, det_id)
);

5. Zenoh Ingest Worker

Implement a Zenoh subscriber that:
  • Subscribes to maze/**/detections/v1/*
  • Parses JSON
  • Inserts rows in one transaction
  • Ignores duplicates via constraints
  • Logs failures
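The parsing and insert plumbing can be sketched as pure functions; with the eclipse-zenoh bindings the subscription itself would be `session.declare_subscriber("maze/**/detections/v1/*", handler)`, and a real worker would execute both inserts inside one psycopg transaction. The column subset and sample SQL below are a sketch against the required schema, relying on `ON CONFLICT DO NOTHING` so replayed events are absorbed by the primary-key and unique constraints:

```python
import json

# Idempotent inserts: duplicates hit the event_id PK (or the
# UNIQUE constraints) and are silently skipped.
EVENT_SQL = ("INSERT INTO detection_events "
             "(event_id, run_id, robot_id, sequence, stamp, raw_event) "
             "VALUES (%s, %s, %s, %s, to_timestamp(%s), %s) "
             "ON CONFLICT DO NOTHING")
DET_SQL = ("INSERT INTO detections "
           "(event_id, det_id, class_id, class_name, confidence, "
           " x1, y1, x2, y2) "
           "VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s) "
           "ON CONFLICT DO NOTHING")

def event_rows(payload: bytes):
    """Parse one detection event payload into rows for both tables."""
    ev = json.loads(payload)
    stamp = ev["image"]["stamp"]
    event_row = (ev["event_id"], ev["run_id"], ev["robot_id"],
                 ev["sequence"],
                 stamp["sec"] + stamp["nanosec"] * 1e-9,
                 json.dumps(ev))
    det_rows = [(ev["event_id"], d["det_id"], d["class_id"],
                 d["class_name"], d["confidence"], *d["bbox_xyxy"])
                for d in ev["detections"]]
    return event_row, det_rows
```

Wrapping `execute(EVENT_SQL, ...)` plus an `executemany(DET_SQL, ...)` in a single transaction keeps a parent event and its detections all-or-nothing.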

Deliverables

  • Detector code
  • Zenoh publisher logic
  • Zenoh ingest worker
  • SQL schema
  • world_assets.md
  • Run report with counts and class histogram