Episode notes
Listen to Full Audio at https://podcasts.apple.com/us/podcast/scientist-vs-storyteller-benchmarking-gpt-5-2-claude/id1684415169?i=1000752001078
For years, Latent Diffusion Models—the tech behind Stable Diffusion and DALL-E—have relied on a bit of an 'art form' called KL-regularization. Basically, researchers had to hand-tune how strongly to compress an image into the latent space before the model started losing detail. Compress too much and the images came out blurry; too little and the model became too expensive to train.
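To make the tradeoff concrete, here's a minimal sketch of the KL-regularized autoencoder loss being described. This is illustrative only—the function name and the `beta` default are our own, not from the paper—but `beta` is exactly the hand-tuned knob: raise it and the latents compress harder (risking blur); lower it and the latents stay detailed but the downstream diffusion model gets costlier to train.

```python
import numpy as np

def kl_regularized_loss(x, x_recon, mu, logvar, beta=1e-4):
    # Reconstruction term: how faithfully the decoder rebuilds the image.
    recon = np.mean((x - x_recon) ** 2)
    # KL term: pulls the diagonal Gaussian N(mu, exp(logvar)) toward N(0, I),
    # i.e. compresses the latent space.
    kl = -0.5 * np.mean(1.0 + logvar - mu**2 - np.exp(logvar))
    # beta is the manually chosen weight the episode calls an 'art form'.
    return recon + beta * kl
```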
Enter Unified Latents, or UL.
In a new paper out of DeepMind Amsterdam, resear ...
Keywords
Latent Diffusion, Unified Latents