Machine learning-driven self-discovery of the robot body morphology

Fernando Díaz Ledezma, Sami Haddadin

Research output: Contribution to journal › Article › peer-review



The morphology of a robot is typically assumed to be known, and data from external measuring devices are used mainly for its kinematic calibration. In contrast, we take an agent-centric perspective and address the largely unexplored question of whether a robot could learn elements of its morphology by itself, relying on minimal prior knowledge and depending only on unorganized proprioceptive signals. To answer this question, we propose a mutual information-based representation of the relationships between the proprioceptive signals of a robot, which we call proprioceptive information graphs (π-graphs). Leveraging the fact that the information structure of the sensorimotor apparatus depends on the embodiment of the robot, we use the π-graph to look for pairwise signal relationships that reflect the underlying first-order kinematic principles applicable to the robot's structure. In our discussion, we show that analysis of the π-graph leads to the inference of two fundamental elements of the robot morphology: its mechanical topology and the corresponding kinematic description, that is, the location and orientation of the robot's joints. Results from a robot manipulator, a hexapod, and a humanoid robot show that the correct topology and kinematic description can be effectively inferred from their π-graph either offline or online, regardless of the number of links and body configuration.
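The core construction described in the abstract — pairwise mutual information among proprioceptive signals, followed by topology extraction from the resulting π-graph — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the histogram-based MI estimator, the maximum-spanning-tree step used to read off the mechanical topology, and the toy three-link chain data are all assumptions made for the sketch.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based estimate of mutual information I(X;Y) in nats."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    nz = pxy > 0                          # avoid log(0) on empty bins
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def pi_graph(signals, bins=16):
    """Pairwise-MI weight matrix over a list of proprioceptive signals."""
    n = len(signals)
    g = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            g[i, j] = g[j, i] = mutual_information(signals[i], signals[j], bins)
    return g

def max_spanning_tree(g):
    """Prim's algorithm on MI weights: one plausible way to extract a
    tree-structured mechanical topology from the π-graph."""
    n = g.shape[0]
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        i, j, _ = max(((a, b, g[a, b]) for a in in_tree
                       for b in range(n) if b not in in_tree),
                      key=lambda e: e[2])
        edges.append((i, j))
        in_tree.add(j)
    return edges

# Hypothetical data, not the paper's: a 3-link serial chain in which each
# joint's signal is its parent's signal plus independent noise, so MI
# decays with distance along the chain.
rng = np.random.default_rng(0)
s0 = rng.standard_normal(5000)
s1 = s0 + 0.3 * rng.standard_normal(5000)
s2 = s1 + 0.3 * rng.standard_normal(5000)

g = pi_graph([s0, s1, s2])
edges = max_spanning_tree(g)   # recovers the chain 0-1-2
```

With this toy data the strongest MI edges are between physically adjacent links, so the spanning tree recovers the serial-chain topology; the paper's contribution is showing that analogous structure can be read out of real, unorganized proprioceptive streams.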

Original language: English
Article number: adh0972
Journal: Science Robotics
Issue number: 85
State: Published - 1 Dec 2023


