The image of Taylor Swift supporting Donald Trump? Fake. Kamala Harris in communist garb? AI-generated. Misinformation is now easy to weaponize. That's why the Content Authenticity Initiative (a group of more than 3,700 tech and media companies, including Adobe, TikTok, and the Associated Press) and the Coalition for Content Provenance and Authenticity (whose members include Google, Microsoft, OpenAI, and the BBC) have created Content Credentials, a system of watermarks and metadata intended to establish authenticity and flag AI-generated content. Under the system, a participating company's digital camera could embed provenance metadata in an image at the moment of capture, while Adobe Photoshop could record any AI edits made afterward. Andy Parsons, the senior director of the Content Authenticity Initiative at Adobe, calls Content Credentials "a way to provide a ‘nutritional label’ for digital content."
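For readers curious how such a provenance scheme fits together mechanically, here is a deliberately simplified, hypothetical sketch, not the actual C2PA specification or Adobe's implementation: a capture device records a manifest containing a hash of the image bytes plus an edit log, signs it, and a verifier later checks that both the signature and the hash still match. Real Content Credentials rely on X.509 certificate-based signatures and a much richer manifest format; the HMAC "signature" and the `SIGNING_KEY`, `create_manifest`, and `verify_manifest` names below are stand-ins invented for illustration.

```python
import hashlib
import hmac
import json

# Hypothetical signing key held by the capture device (illustration only;
# real Content Credentials use certificate-based signatures, not HMAC).
SIGNING_KEY = b"demo-key-held-by-the-capture-device"


def create_manifest(image_bytes: bytes, edits: list[str]) -> dict:
    """Bind a hash of the image and its edit history, then sign the result."""
    payload = {
        "asset_sha256": hashlib.sha256(image_bytes).hexdigest(),
        "edit_history": edits,  # e.g. ["captured", "ai.generative_fill"]
    }
    serialized = json.dumps(payload, sort_keys=True).encode()
    payload["signature"] = hmac.new(SIGNING_KEY, serialized, hashlib.sha256).hexdigest()
    return payload


def verify_manifest(image_bytes: bytes, manifest: dict) -> bool:
    """Check that the manifest is authentic and the image is unmodified."""
    claimed = {k: v for k, v in manifest.items() if k != "signature"}
    serialized = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, serialized, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, manifest["signature"]):
        return False  # manifest tampered with, or not signed by the device
    return claimed["asset_sha256"] == hashlib.sha256(image_bytes).hexdigest()


if __name__ == "__main__":
    photo = b"raw image bytes go here"
    manifest = create_manifest(photo, ["captured", "ai.generative_fill"])
    print(verify_manifest(photo, manifest))         # True: image matches its credentials
    print(verify_manifest(photo + b"x", manifest))  # False: pixels changed after signing
```

The point of the sketch is the binding: because the hash of the pixels and the record of AI edits are signed together, a viewer can tell both where an image came from and whether it was altered after the credentials were attached.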
Read more at TIME Magazine.