The field of video generation is seeing a major breakthrough. AI company HeyGen has launched a digital human motion control system that, for the first time, gives creators substantial control over the body movements of virtual avatars. This advance enables digital humans not only to perform subtle head movements and facial micro-expressions but also to fluidly execute complex full-body actions such as playing musical instruments and dancing, and even to control individual finger joints precisely enough to perform specific gestures.
In the demonstration video, a virtual character's natural grasping motion while holding flowers has drawn industry attention. Although the demo focuses on manipulating a single object, the underlying technology already provides a framework for broader object interaction. Analysts note that this capability has potential applications in product display, and that future iterations may move beyond the limitations of current presentation formats.
This upgrade continues HeyGen's record of innovation in digital humans. Its previously released virtual avatar generation technology already integrates seamlessly with scenes generated by Sora. The new version introduces kinematic control algorithms that reduce action response latency to under 12 milliseconds. Through a parameterized adjustment interface, producers can now precisely control the joint angles and movement trajectories of digital humans, replacing the time-consuming and labor-intensive motion capture workflows of traditional film production.
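As a rough illustration of what such a parameterized interface could look like, the sketch below models joint angles as keyframes interpolated along a timeline. This is a hypothetical Python mock-up: the class names, joint labels, and interpolation scheme are assumptions made for illustration, not HeyGen's published API.

```python
# Hypothetical sketch of a parameterized joint-control interface.
# All class, method, and joint names are illustrative assumptions;
# HeyGen has not published a public API for this feature.
from dataclasses import dataclass, field


@dataclass
class JointKeyframe:
    time_s: float                  # timestamp within the clip, in seconds
    angles_deg: dict[str, float]   # joint name -> target angle in degrees


@dataclass
class MotionTrack:
    keyframes: list[JointKeyframe] = field(default_factory=list)

    def add_keyframe(self, time_s: float, angles_deg: dict[str, float]) -> None:
        self.keyframes.append(JointKeyframe(time_s, angles_deg))
        self.keyframes.sort(key=lambda k: k.time_s)

    def angle_at(self, joint: str, t: float) -> float:
        """Linearly interpolate one joint's angle at time t."""
        frames = [k for k in self.keyframes if joint in k.angles_deg]
        if not frames:
            raise KeyError(f"no keyframes for joint {joint!r}")
        if t <= frames[0].time_s:
            return frames[0].angles_deg[joint]
        for a, b in zip(frames, frames[1:]):
            if a.time_s <= t <= b.time_s:
                w = (t - a.time_s) / (b.time_s - a.time_s)
                return (1 - w) * a.angles_deg[joint] + w * b.angles_deg[joint]
        return frames[-1].angles_deg[joint]


# Example: animate a wrist rotation for a grasping gesture.
track = MotionTrack()
track.add_keyframe(0.0, {"right_wrist": 0.0})
track.add_keyframe(0.5, {"right_wrist": 35.0})
print(track.angle_at("right_wrist", 0.25))  # -> 17.5
```

In a keyframe model like this, the producer specifies only the target angles, and the system fills in the trajectory between them; that is what would let a parameterized interface replace frame-by-frame motion capture.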
Notably, HeyGen's generative virtual human approach differs fundamentally from traditional digital cloning. Rather than relying on modeling data captured from real people, the system generates physically plausible virtual avatars autonomously through deep neural networks. According to the technical white paper, the architecture supports real-time generation of data for over 200 joint positions and, combined with reinforcement learning algorithms, gives digital human movements biomechanically plausible characteristics.
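To make the scale of that claim concrete, the following sketch shows one way per-frame pose data for roughly 200 joints might be represented, with a simple joint-limit clamp standing in for biomechanical constraints. The array layout, joint indices, and limit values are illustrative assumptions; the white paper's actual data format is not public.

```python
# Hypothetical illustration of per-frame pose data for a generated avatar.
# The 200-joint figure comes from the article; the layout, indices, and
# limit values here are assumptions, not HeyGen's actual format.
import numpy as np

NUM_JOINTS = 200  # joint count reported in the technical white paper

# One frame of pose data: a 3D rotation (Euler angles, degrees) per joint,
# filled with random values here in place of a neural network's output.
pose = np.random.default_rng(0).uniform(-30.0, 30.0, size=(NUM_JOINTS, 3))

# Illustrative biomechanical limits (degrees) for a couple of joints;
# a real system would enforce anatomically derived ranges per axis.
JOINT_LIMITS = {
    12: (-90.0, 90.0),   # e.g. a shoulder-like ball joint
    47: (0.0, 140.0),    # e.g. an elbow-like hinge joint
}

def clamp_to_limits(pose: np.ndarray, limits: dict[int, tuple[float, float]]) -> np.ndarray:
    """Clamp each constrained joint's angles into its allowed range."""
    out = pose.copy()
    for joint_idx, (lo, hi) in limits.items():
        out[joint_idx] = np.clip(out[joint_idx], lo, hi)
    return out

constrained = clamp_to_limits(pose, JOINT_LIMITS)
print(constrained.shape)  # (200, 3)
```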
Industry data shows that video production with this system is roughly 47% more efficient, and that the cost of producing dynamic scenes has fallen to one-eighth that of traditional methods. The engineering team has revealed that the third-generation control system now in development will integrate haptic feedback simulation, with physical interaction between digital humans and virtual objects planned for the end of 2024.
Official website: https://app.heygen.com/