HORUS: A Mixed Reality Interface for Managing Teams of Mobile Robots

Omotoye Shamsudeen Adekoya*, Antonio Sgorbissa, Carmine Tommaso Recchiuto
DIBRIS Department, RICE Laboratory, University of Genoa, Italy
* Corresponding author: omotoye.adekoya@edu.unige.it
Accepted to IEEE UR 2026
The 23rd International Conference on Ubiquitous Robots (UR 2026) · July 15-18, 2026 · Ritsumeikan University, Ibaraki, Osaka, Japan

HORUS delivers a mixed-reality command layer for multi-robot teams, blending situational awareness, task allocation, and teleoperation inside Meta Quest 3.

Publication Update

HORUS has been accepted to the 23rd International Conference on Ubiquitous Robots (IEEE UR 2026).

July 15-18, 2026 · Ritsumeikan University · Ibaraki, Osaka, Japan

This page will continue to host the paper, video, code, and release resources. The official proceedings citation can replace the current arXiv entry once the conference record is published.

Abstract

Mixed Reality (MR) interfaces have been extensively explored for controlling mobile robots, but there is limited research on their application to managing teams of robots. This paper presents HORUS: Holistic Operational Reality for Unified Systems, a Mixed Reality interface offering a comprehensive set of tools for managing multiple mobile robots simultaneously. HORUS enables operators to monitor individual robot statuses, visualize sensor data projected in real time, and assign tasks to single robots, subsets of the team, or the entire group, all from a Mini-Map (Ground Station). The interface also provides different teleoperation modes: a mini-map mode that allows teleoperation while observing the robot model and its transform on the mini-map, and a semi-immersive mode that offers a flat, screen-like view in either single or stereo view (3D). We conducted a user study in which participants used HORUS to manage a team of mobile robots tasked with finding clues in an environment, simulating search and rescue tasks. This study compared HORUS's full-team management capabilities with individual robot teleoperation. The experiments validated the versatility and effectiveness of HORUS in multi-robot coordination, demonstrating its potential to advance human-robot collaboration in dynamic, team-based environments.

Video

System Highlights

  • Mixed reality interface for robot control and team management.
  • Multi-robot task allocation with real-time status awareness.
  • Live sensor data visualization and flexible camera views.
  • Gesture-based controls optimized for Meta Quest 3.

Teleoperation Modes

  • Minimap (ground-station) mode for 2D navigation and task assignment.
  • Semi-immersive mode with large virtual displays for multi-camera feeds.
  • Full immersion mode with direct front-camera video feed.

Getting Started

Install the latest APK on Meta Quest 3, then run the HORUS Bridge on your laptop to connect robots and stream data into the interface.
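A minimal command sketch of these steps is shown below. The APK filename, clone path, and bridge launcher name are assumptions for illustration; the repository README documents the exact release assets and entry point.

```shell
# Enable developer mode on the Quest 3, then sideload the release APK over USB.
# "horus-quest3.apk" is an illustrative filename -- use the actual release asset.
adb install -r horus-quest3.apk

# On the laptop, fetch the project and start the HORUS Bridge so connected
# robots can stream sensor data and status into the headset interface.
git clone https://github.com/RICE-unige/horus.git
cd horus
./run_bridge.sh   # hypothetical launcher; check the repo for the real command
```

The headset and laptop must be reachable on the same network for the bridge to stream data into the interface.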

BibTeX

The citation below uses the arXiv metadata; it will be updated once the official UR 2026 proceedings record is released.

@misc{adekoya2025horus,
  title         = {HORUS: A Mixed Reality Interface for Managing Teams of Mobile Robots},
  author        = {Adekoya, Omotoye Shamsudeen and Sgorbissa, Antonio and Recchiuto, Carmine Tommaso},
  year          = {2025},
  eprint        = {2506.02622},
  archivePrefix = {arXiv},
  primaryClass  = {cs.RO},
  url           = {https://arxiv.org/abs/2506.02622},
  note          = {Code: https://github.com/RICE-unige/horus}
}