NVIDIA's research team has recently made a significant breakthrough in robot control. Their neural network system, HOVER, controls humanoid robots with a remarkably small parameter count, outperforming even control systems purpose-built for specific tasks.

HOVER needs only 1.5 million parameters to handle complex whole-body motion control; by comparison, common large language models often require billions. This striking parameter efficiency highlights the ingenuity of the system's design.
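To put that figure in perspective, here is a minimal sketch of a control policy in that size range. The layer widths and input/output dimensions below are hypothetical, chosen only to land near 1.5 million parameters; they are not HOVER's published architecture.

```python
# Minimal sketch: how small a ~1.5M-parameter control policy is.
# The dimensions below are hypothetical, NOT HOVER's actual architecture;
# they only illustrate the scale of a network in this parameter range.
import torch.nn as nn

obs_dim, act_dim = 256, 19          # hypothetical observation size / joint-target count
policy = nn.Sequential(
    nn.Linear(obs_dim, 768), nn.ELU(),
    nn.Linear(768, 768), nn.ELU(),
    nn.Linear(768, 768), nn.ELU(),
    nn.Linear(768, act_dim),
)
n_params = sum(p.numel() for p in policy.parameters())
print(f"{n_params:,} parameters")   # ~1.4 million, roughly the size quoted for HOVER
```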


HOVER is trained in NVIDIA's Isaac simulation environment, which runs the physics at 10,000 times real-time speed. NVIDIA researcher Jim Fan noted that this means a year's worth of training in the virtual world can be completed with roughly 50 minutes of GPU computation.
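The arithmetic behind that claim is straightforward: at a 10,000x speed-up, a simulated year shrinks to about 50 wall-clock minutes.

```python
# Back-of-the-envelope check of the speed-up claim:
# at 10,000x real time, one year of simulated experience fits in ~50 wall-clock minutes.
SPEEDUP = 10_000
minutes_per_year = 365.25 * 24 * 60          # ~525,960 simulated minutes in a year
wall_clock_minutes = minutes_per_year / SPEEDUP
print(f"{wall_clock_minutes:.1f} minutes")   # ~52.6 minutes of GPU time
```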

A major highlight of the system is its exceptional adaptability. It can be directly transferred from the simulation environment to real robots without additional tuning and supports various input methods: it can track head and hand movements through XR devices like Apple Vision Pro, obtain full-body position data through motion capture or RGB cameras, collect joint angles through exoskeletons, or even be controlled using standard game controllers.
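One way to picture such a multi-input interface is a single command structure in which each device fills in the fields it can provide and masks out the rest. The field names, shapes, and masking scheme below are illustrative assumptions, not HOVER's actual API.

```python
# Hedged sketch of a unified command interface for several teleoperation sources.
# All field names and array shapes are hypothetical, for illustration only.
import numpy as np

NUM_JOINTS = 19  # hypothetical humanoid joint count

def make_command(head_pose=None, hand_poses=None,
                 body_keypoints=None, joint_angles=None):
    """Pack whichever signals a device provides; mask out everything else."""
    cmd = {
        "head_pose":      np.zeros(7),          # xyz position + quaternion
        "hand_poses":     np.zeros((2, 7)),     # left / right hand targets
        "body_keypoints": np.zeros((8, 3)),     # coarse full-body position targets
        "joint_angles":   np.zeros(NUM_JOINTS), # e.g. from an exoskeleton
    }
    mask = {key: 0.0 for key in cmd}
    provided = [("head_pose", head_pose), ("hand_poses", hand_poses),
                ("body_keypoints", body_keypoints), ("joint_angles", joint_angles)]
    for key, value in provided:
        if value is not None:
            cmd[key] = np.asarray(value, dtype=float)
            mask[key] = 1.0
    return cmd, mask

# An XR headset supplies only head and hand targets; the other fields stay masked.
cmd, mask = make_command(head_pose=np.zeros(7), hand_poses=np.zeros((2, 7)))
```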

More surprisingly, in every control scenario HOVER outperforms systems developed specifically for that single input method. Lead author Tairan He speculates that this may stem from the system's deeper grasp of physical concepts such as balance and precise limb control, which allows knowledge to transfer between different control methods.

The system builds on the open-source H2O and OmniH2O projects and can control any humanoid robot that runs in the Isaac simulator. NVIDIA has already released examples and code on GitHub, opening new possibilities for robotics research and development.