Runway's iOS app has received a major update, bringing the capabilities of the Gen-3 Alpha model to Apple users on their mobile devices. The update not only improves the user experience but also marks a significant step forward for Runway in AI video generation.


In June of this year, Runway introduced the new Gen-3 Alpha model, which delivers significant improvements in fidelity, consistency, and motion, taking a solid step toward building general world models.

Gen-3 Alpha is trained on new large-scale multimodal training infrastructure capable of processing and integrating different types of data, such as text, images, and video, to produce high-quality multimodal outputs. This joint training approach strengthens the model's ability to generate both dynamic and static content.

Gen-3 Alpha powers a variety of generative tools, including text-to-video, image-to-video, and text-to-image, giving creators a rich set of creative options.

Trained on highly descriptive, temporally dense captions, Gen-3 Alpha has a stronger understanding of scene detail and temporal change, enabling imaginative transitions and precisely key-framed video content.

Gen-3 Alpha can generate expressive, realistic human characters with a wide range of movements, gestures, and emotions, opening new opportunities for storytelling. With support for Motion Brush, Advanced Camera Controls, and Director Mode, Gen-3 Alpha gives creators greater creative freedom and control.

Download link: https://apps.apple.com/us/app/runwayml/id1665024375