In recent research, a novel approach called Flash Diffusion has delivered a notable breakthrough in image generation. The method accelerates sampling from pre-trained diffusion models by training a student model to produce, in a single step, the multi-step denoised prediction that the original model would need many steps to reach.
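Conceptually, this is a distillation setup: a frozen teacher produces a multi-step denoised target and the student learns to hit that target in one forward pass. The following is a minimal sketch of that idea in PyTorch, under assumed interfaces; `student`, `teacher.step_to`, and the timestep schedule are hypothetical placeholders rather than the authors' code.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student, teacher, x_noisy, t, cond, teacher_steps=4):
    """Regress the student's single-step prediction onto the teacher's
    multi-step denoised output. `t` is a float timestep in (0, 1]; all
    model interfaces here are hypothetical."""
    # Teacher target: run several solver steps from (x_noisy, t) down to 0.
    with torch.no_grad():
        x = x_noisy
        schedule = [t * (1.0 - k / teacher_steps) for k in range(1, teacher_steps + 1)]
        for s in schedule:
            x = teacher.step_to(x, s, cond)   # hypothetical: one solver step toward time s
        target = x                            # multi-step denoised estimate

    # The student tries to reach the same result in a single forward pass.
    pred = student(x_noisy, t, cond)
    return F.mse_loss(pred, target)
```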

Product Entry: https://top.aibase.com/tool/flash-diffusion

Researchers report that Flash Diffusion not only achieves state-of-the-art FID and CLIP-Score results in few-step image generation but also requires less GPU training time and fewer trainable parameters than existing methods. The method also proves efficient and versatile across tasks such as text-to-image generation, inpainting, face swapping, and super-resolution.

The researchers highlight that the innovation of Flash Diffusion lies in its use of an adjustable distribution for selecting timesteps, which concentrates training on the specific timesteps the student model will actually use at inference. The method also employs an adversarial objective, training a discriminator to distinguish generated samples from real ones, and applies it directly in the latent space to keep the computational cost low. Finally, the team uses a distribution matching distillation loss to keep the student's samples close to the data distribution learned by the pre-trained teacher model.
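A compact sketch of how these three ingredients could be combined in a single training step is given below. It is an illustration under assumed interfaces, not the authors' implementation: `add_noise`, `multi_step_denoise`, `score`, the discriminator, and `fake_score_model` (an auxiliary network tracking the score of the student's own outputs) are all hypothetical placeholders.

```python
import torch
import torch.nn.functional as F

def sample_timesteps(batch_size, probs):
    """Adjustable (non-uniform) distribution over a discrete grid of target
    timesteps, concentrating training on the steps used at inference."""
    return torch.multinomial(probs, batch_size, replacement=True)

def training_losses(student, teacher, fake_score_model, discriminator,
                    latents, cond, probs):
    t = sample_timesteps(latents.shape[0], probs)
    noise = torch.randn_like(latents)
    noisy = teacher.add_noise(latents, noise, t)              # forward diffusion (hypothetical)

    # 1) Distillation: match the teacher's multi-step denoised prediction.
    with torch.no_grad():
        target = teacher.multi_step_denoise(noisy, t, cond)   # hypothetical
    pred = student(noisy, t, cond)
    loss_distill = F.mse_loss(pred, target)

    # 2) Adversarial objective applied directly in latent space (cheaper than
    #    pixels); the discriminator is trained separately to tell real latents
    #    from the student's outputs.
    loss_adv = F.softplus(-discriminator(pred, cond)).mean()

    # 3) Distribution-matching (DMD-style) term: nudge student samples toward
    #    the data distribution captured by the teacher's score.
    with torch.no_grad():
        grad = fake_score_model.score(pred, t, cond) - teacher.score(pred, t, cond)
    loss_dmd = 0.5 * F.mse_loss(pred, (pred - grad).detach())

    return loss_distill + loss_adv + loss_dmd
```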

Moreover, the researchers demonstrated that Flash Diffusion adapts to different backbone networks, including UNet-based denoisers (SD1.5, SDXL) and transformer-based DiT denoisers (Pixart-α), and that the student can be trained as lightweight LoRA adapters on top of a frozen backbone. Across multiple examples, the method drastically reduces the number of sampling steps while maintaining high-quality image generation.
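For intuition, inference with such a distilled student could look like the following diffusers snippet, assuming the student is released as LoRA weights for an SD1.5 backbone; the LoRA path and the scheduler choice are illustrative assumptions, not confirmed release details.

```python
import torch
from diffusers import DiffusionPipeline, LCMScheduler

# Load the frozen SD1.5 backbone.
pipe = DiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Use a scheduler suited to very few sampling steps (assumed choice).
pipe.scheduler = LCMScheduler.from_config(pipe.scheduler.config)

# Attach the distilled student as a LoRA adapter (placeholder path).
pipe.load_lora_weights("path/to/flash-diffusion-lora")

# A handful of steps instead of the usual 25-50.
image = pipe(
    "a photograph of an astronaut riding a horse",
    num_inference_steps=4,
    guidance_scale=1.0,
).images[0]
image.save("astronaut.png")
```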

The emergence of the Flash Diffusion method has infused new life into image generation technology, greatly enhancing the efficiency and versatility of the generation process. This groundbreaking method is expected to have a profound impact across various fields and bring new opportunities and challenges to related research areas.