IROS'26 Workshop: Beyond Exteroception

Interoceptive Perception for Resilient Robotics — Sep 27, 2026

Modern robots increasingly rely on external sensors—cameras, LiDARs, and radars—as their primary perceptual modality. Yet biological organisms evolved a fundamentally different strategy: they first understand their own bodies through vestibular and proprioceptive feedback before interpreting the external world. This workshop explores internal perception: the use of inertial measurement units (IMUs), joint encoders, force/torque sensors, and other body-mounted proprioceptive sensors as a primary rather than auxiliary source of perceptual intelligence for resilient robotics.

We argue that robust autonomy demands perception systems that are not only world-facing but also aware of the robot's own motion, dynamics, and physical state. Topics span learning-based inertial odometry, cross-embodiment proprioceptive motion models, adaptive sensor fusion under degradation, and the emerging role of humanoid robots as testbeds for internal-sensing research. The workshop brings together researchers in state estimation, legged locomotion, inertial navigation, and neuroscience-inspired robotics to lay the foundations of this underexplored paradigm. With invited talks, a contributed poster session, a panel discussion, and the inaugural Learning IMU Odometry Challenge, the workshop aims to catalyze a community around instinct-like perception for resilient robots.


Topics

  • Learning-based inertial odometry and navigation
  • IMU foundation models and cross-platform generalization
  • Proprioceptive state estimation for legged and humanoid robots
  • Multi-IMU fusion and automatic spatiotemporal calibration
  • Adaptive sensor fusion under environmental degradation
  • Self-supervised and bilevel optimization for online IMU adaptation
  • Vestibular and proprioceptive inspiration from neuroscience
  • Sim-to-real transfer for inertial and proprioceptive models
  • Benchmarks and metrics for internal perception robustness
  • Contact-rich perception and force-aware state estimation
  • Differentiable factor graphs for learning to optimize
  • Integration of internal sensing with visual/geometric foundation models


Invited Speakers


Luca Carlone
Associate Professor, Massachusetts Institute of Technology
Talk: Certifiable Perception and Robust State Estimation

Davide Scaramuzza
Professor of Robotics and Perception, University of Zurich
Talk: Proprioceptive State Estimation for Legged Robots

Ayoung Kim
Professor, Seoul National University
Talk: Robust Localization in Degraded Environments via Internal Sensing

Jakob Engel
Director of Research, Meta Reality Labs
Talk: Egocentric Machine Perception and Spatial AI

Daniel Gehrig
Researcher, University of Pennsylvania
Talk: TBD

TBD Speaker
Talk: TBD

TBD Speaker
Talk: TBD

TBD Speaker
Talk: TBD


Schedule

Time | Speaker | Topic
8:30 - 8:40 AM | Shibo Zhao | Opening Address & Challenge Introduction
8:40 - 9:10 AM | Luca Carlone (Massachusetts Institute of Technology) | Certifiable Perception and Robust State Estimation
9:10 - 9:40 AM | Davide Scaramuzza (University of Zurich) | Proprioceptive State Estimation for Legged Robots
9:40 - 10:10 AM | Ayoung Kim (Seoul National University) | Robust Localization in Degraded Environments via Internal Sensing
10:10 - 10:30 AM | Challenge Spotlight | Top 3 team spotlight talks (5 min presentation + 2 min Q&A each)
10:30 - 11:00 AM | Coffee Break | Poster session from challenge teams and contributed papers
11:00 - 11:30 AM | Jakob Engel (Meta Reality Labs) | Egocentric Machine Perception and Spatial AI for Grounded 3D Understanding
11:30 AM - 12:00 PM | TBD Speaker 1 | TBD
12:00 - 1:30 PM | Lunch Break | Lunch and networking
1:30 - 2:00 PM | TBD Speaker 2 | TBD
2:00 - 2:30 PM | TBD Speaker 3 | TBD
2:30 - 3:00 PM | Poster Session | Contributed abstracts and demos
3:00 - 3:30 PM | Coffee Break | Networking and posters
3:30 - 4:00 PM | TBD Speaker 4 | TBD
4:00 - 4:30 PM | Daniel Gehrig (University of Pennsylvania) | TBD
4:30 - 5:00 PM | Panel Discussion | Future of Internal Perception
5:00 - 5:15 PM | Shibo Zhao | Closing Remarks
5:15 - 5:45 PM | Open Networking | Networking among attendees


Challenge

Learning IMU Odometry Challenge

We will provide large-scale IMU datasets across multiple robot platforms, including quadrupeds, humanoids, UAVs, and UGVs. The challenge will identify robust solutions for learning IMU odometry across diverse motion patterns and investigate network designs that generalize across platforms and behaviors.

Challenge participants are evaluated across four scenarios:

  • Quadruped motion estimation
  • Humanoid motion estimation
  • UAV/UGV motion estimation
  • Cross-robot generalization

All datasets, evaluation tools, and starter code are open-sourced, and the challenge is open to anyone. Top teams will present spotlight talks during the workshop. A public leaderboard will be available, and participants are invited to submit short papers describing their solutions.
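The official evaluation tools will define the scoring for each scenario. As a rough illustration only, odometry challenges of this kind commonly report the RMSE of Absolute Trajectory Error (ATE) after rigidly aligning the estimated trajectory to ground truth. The function below is a hypothetical sketch of that metric (Kabsch/Umeyama-style alignment without scale), not the challenge's scoring code:

```python
import numpy as np

def ate_rmse(gt: np.ndarray, est: np.ndarray) -> float:
    """RMSE of position error after aligning `est` to `gt` with a
    rigid (rotation + translation) transform.

    gt, est: (N, 3) arrays of positions at matched timestamps.
    """
    # Center both trajectories.
    mu_gt, mu_est = gt.mean(axis=0), est.mean(axis=0)
    g, e = gt - mu_gt, est - mu_est
    # Optimal rotation from the SVD of the cross-covariance (Kabsch).
    H = e.T @ g
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1.
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = Vt.T @ S @ U.T  # maps centered est into the gt frame
    aligned = (est - mu_est) @ R.T + mu_gt
    return float(np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1))))
```

The actual toolkit may align differently (e.g., yaw-only alignment for IMU-gravity-observable estimators, or similarity alignment with scale), so treat this as an illustration of the metric family rather than the challenge's definition.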


Workshop Organizers


Shibo Zhao, Ph.D. Candidate, Robotics Institute, Carnegie Mellon University

Yuheng Qiu, Ph.D. Student, Mechanical Engineering, Carnegie Mellon University

Sifan Zhou, Visiting Student, Robotics Institute, Carnegie Mellon University

Muqin Cao, Postdoc, Robotics Institute, Carnegie Mellon University

Junbin Yuan, Ph.D. Student, Mechanical Engineering, Carnegie Mellon University

Junyi Geng, Assistant Professor, Aerospace Engineering, Pennsylvania State University

Wenshan Wang, Systems Scientist, Robotics Institute, Carnegie Mellon University

Chen Wang, Assistant Professor, Computer Science and Engineering, University at Buffalo

Guanya Shi, Assistant Professor, Robotics Institute, Carnegie Mellon University

Sebastian Scherer, Research Professor, Robotics Institute, Carnegie Mellon University


Acknowledgment

This workshop is part of IROS 2026. For questions, please contact shiboz@andrew.cmu.edu.