Decoder with Nilay Patel
Description
Today, we’re going to talk about reality, and whether we can label photos and videos to protect our shared understanding of the world around us. To do this, I sat down with Verge reporter Jess Weatherbed, who covers creative tools for us — a space that’s been totally upended by generative AI.
For years on The Verge, we’ve been talking about how the photos and videos our phones capture are getting more and more processed. Here in 2026, we’re in the middle of a full-on reality crisis, as ultra-believable fake and manipulated images and videos flood social platforms at scale. So Jess and I discussed the limitations of AI labeling standards like C2PA, and why social media execs like Instagram boss Adam Mosseri are now sounding the alarm.
Read the full transcript on The Verge.
Links:
This system can sort real pictures from AI fakes — why aren’t we using it? | The Verge
You can’t trust your eyes to tell you what’s real, says Instagram | The Verge
Instagram’s boss is missing the point about AI on the platform | The Verge
Sora is showing us how broken deepfake detection is | The Verge
Reality still matters | The Verge
No one’s ready for this | The Verge
What is a photo, @WhiteHouse edition | The Verge
Google Gemini is getting better at identifying AI fakes | The Verge
Let’s compare Apple, Google & Samsung’s definitions of ‘photo’ | The Verge
The Pixel 8 and the what-is-a-photo apocalypse | The Verge
Subscribe to The Verge to access the ad-free version of Decoder!
Credits:
Decoder is a production of The Verge and part of the Vox Media Podcast Network.
Decoder is produced by Kate Cox and Nick Statt and edited by Ursa Wright. Our editorial director is Kevin McShane.
The Decoder music is by Breakmaster Cylinder.
Learn more about your ad choices. Visit podcastchoices.com/adchoices