
"When Instagram CEO Adam Mosseri looks into the rapidly approaching future of AI-generated content, he sees a massive problem: how to tell authentic media apart from the kind made with AI technology. In a recent Threads post on the topic, Mosseri said that social media platforms like Instagram will be under mounting pressure to help users tell the difference. Mosseri argued that major platforms will initially succeed at spotting and labeling AI content, but that they'll begin to falter as AI imitates reality with more precision."
"'There is already a growing number of people who believe, as I do, that it will be more practical to fingerprint real media than fake media,' Mosseri wrote. That 'fingerprint' could be created from within cameras themselves, if their manufacturers 'cryptographically sign images at capture, creating a chain of custody.' 'We need to label AI-generated content clearly, and work with manufacturers to verify authenticity at capture - fingerprinting real media, not just chasing fake,' Mosseri added."
AI-generated media increasingly imitates real content, making it difficult to distinguish authentic creator work from synthetic material. Social platforms can detect and label AI content today, but detection will become less reliable as AI improves. Fingerprinting real media at capture, by having cameras cryptographically sign images the moment they are taken, can create a verifiable chain of custody. Clear labeling of AI-generated content, paired with manufacturer cooperation to verify authenticity at capture, offers a more practical approach than chasing fakes. How creator content is authenticated will shape how users relate to and value media on social platforms.
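The capture-time signing idea above can be sketched in a few lines. This is a hypothetical illustration, not how any camera vendor actually implements it: real provenance systems (for example the C2PA content-credentials standard) use asymmetric signatures held in secure hardware, so verifiers never see the secret key. The HMAC, key, and record format below are stand-ins chosen so the example runs on Python's standard library alone.

```python
import hmac
import hashlib
import json

# Hypothetical device secret; a real camera would hold an asymmetric
# private key in secure hardware instead of a shared symmetric key.
DEVICE_KEY = b"secret-key-burned-into-camera"

def sign_at_capture(image_bytes: bytes, device_id: str) -> dict:
    """Produce a capture record: a hash of the pixels plus a signature."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    record = {"device_id": device_id, "sha256": digest}
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify(image_bytes: bytes, record: dict) -> bool:
    """Check both the content hash and the signature (chain of custody)."""
    if hashlib.sha256(image_bytes).hexdigest() != record["sha256"]:
        return False  # pixels were altered after capture
    unsigned = {k: v for k, v in record.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["signature"])

photo = b"\x89PNG...raw sensor data"
rec = sign_at_capture(photo, "cam-001")
assert verify(photo, rec)             # untouched image verifies
assert not verify(photo + b"x", rec)  # any edit breaks the chain
```

The point of the sketch is the asymmetry Mosseri describes: a platform never has to decide whether pixels "look" AI-generated; it only checks whether a valid signature from known hardware accompanies them, and anything unsigned is treated as unverified rather than proven fake.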
#ai-generated-content #media-authentication #cryptographic-fingerprinting #platform-content-labeling
Read at Mashable