Recently, Stability AI's official platform for SVD video generation, Stable Video, opened its public beta, allowing all users to try it. The platform is built on the SVD (Stable Video Diffusion) model and adds camera motion controls, letting users generate videos more flexibly. During the public beta phase, users get a free quota of 150 credits per day, which can be used to generate 15 videos.
This article shares the public beta address of Stable Video along with a detailed tutorial on how to use it.
The Stable Video platform supports two modes: text-to-video and image-to-video. Below, we use image-to-video as an example to walk through how to use Stable Video in detail.
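Although the walkthrough below uses the web interface, the underlying SVD model is openly released, so the same image-to-video generation can also be reproduced locally. Below is a minimal sketch using Hugging Face's diffusers library and the stabilityai/stable-video-diffusion-img2vid-xt checkpoint; it assumes you have diffusers, torch, and a CUDA GPU with enough VRAM, and the image path and output filename are placeholders.

```python
# Minimal local image-to-video sketch with the open SVD model
# (this is not the Stable Video web platform itself).
import torch
from diffusers import StableVideoDiffusionPipeline
from diffusers.utils import load_image, export_to_video

# Load the public SVD image-to-video checkpoint in half precision.
pipe = StableVideoDiffusionPipeline.from_pretrained(
    "stabilityai/stable-video-diffusion-img2vid-xt",
    torch_dtype=torch.float16,
    variant="fp16",
)
pipe.to("cuda")

# Placeholder input image; SVD conditions on roughly 1024x576 frames.
image = load_image("robots_fighting.png").resize((1024, 576))

# Generate a short clip; decode_chunk_size trades VRAM for speed.
frames = pipe(image, decode_chunk_size=8).frames[0]
export_to_video(frames, "generated.mp4", fps=7)
```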
Step 1: Choose the Video Generation Mode
On the Stable Video official website, select "Start with Image" to enter the image-to-video mode directly. For text-to-video, choose "Start with Text."
Step 2: Upload an Image
After selecting the image mode, upload an image and click "Generate" to turn it into a video.
For example, if you upload a photo of AI robots fighting and generate without adjusting the camera motion parameters, the platform produces a video using its default motion.
If you want a more ideal result, adjust the camera motion parameters, such as zoom, tilt, and orbit (a note on reproducing motion control with the open model follows the list below).
Camera motion parameters control how the virtual camera moves in the generated video:
- Camera: choose between a locked (static) and a panning camera.
- Tilt: tilt the camera up or down.
- Orbit: orbit the camera around the subject.
- Pan: pan the camera left or right.
- Zoom: zoom in or out.
- Dolly: slide the camera toward or away from the subject.
- Move: move (translate) the camera.
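Note that these camera controls are a feature of the Stable Video web platform; the open SVD checkpoint used in the earlier sketch does not expose pan, tilt, or orbit controls directly. The closest knobs in a local diffusers run are motion_bucket_id and noise_aug_strength, which scale how much overall motion the clip contains rather than choosing a specific camera path. A hedged sketch, continuing from the pipeline above:

```python
# Continuation of the earlier pipeline: tune overall motion amount.
# motion_bucket_id (roughly 1-255) and noise_aug_strength are SVD pipeline
# parameters, but they control motion intensity, not specific camera moves.
frames_low = pipe(image, decode_chunk_size=8, motion_bucket_id=40,
                  noise_aug_strength=0.02).frames[0]
frames_high = pipe(image, decode_chunk_size=8, motion_bucket_id=180,
                   noise_aug_strength=0.1).frames[0]
export_to_video(frames_low, "low_motion.mp4", fps=7)
export_to_video(frames_high, "high_motion.mp4", fps=7)
```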
Step 3: Generate the Video
After uploading the image and adjusting the parameters, click "Generate" and wait for the platform to create the video.
Step 4: Find the Video in the History List
The generated video appears in the history list of your personal account.
Step 5: Download or Share the Video
In the history list, click on the target video to share or download the AI-generated video.
Step 6: Obtain Prompt Parameter Templates
After generating the video, you can view the prompt parameters used on the video's detail page. If you like the effect, you can reuse this parameter template directly to generate other videos.
Similarly, you can also use the prompt parameter templates from the official showcase examples to generate videos.
That concludes this detailed introduction to using Stable Video.
Public Beta Entry: https://top.aibase.com/tool/stable-video