Cloudflare has added a new feature that makes it easier to detect AI-manipulated images.
Internet infrastructure provider Cloudflare is integrating Adobe’s Content Credentials system, which lets users verify the authenticity of images. Adobe’s tool attaches a digital metadata tag to each image, making it possible to quickly identify the owner, see where the image was created and published, and determine whether it has been manipulated.
Digital provenance
Cloudflare is introducing a new feature within its network that lets users preserve and verify the authenticity of images. Content creators and publishers can preserve an image’s digital history with a single click, including information about the original creator, edits, and format changes. The feature is based on the standards of the Coalition for Content Provenance and Authenticity (C2PA).
The goal is to recognize creators and give consumers insight into the provenance and modifications of digital images. In doing so, Cloudflare joins the Content Authenticity Initiative (CAI), an Adobe-led community advocating for broad support of Content Credentials as a standard.
Strengthening trust
The ability to quickly share images globally brings challenges, especially with the rise of generative AI. It is becoming increasingly difficult to distinguish between real and manipulated images. Cloudflare says this new feature helps media and news organizations verify images and maintain ownership.
Users of Cloudflare Images, the company’s storage and optimization service, can now enable the Preserve Content Credentials option. This protects embedded metadata and keeps a cryptographically secured record of all edits. An image’s digital history can then be verified via Adobe Content Authenticity Inspect.
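Under the C2PA standard the preserved history takes the form of a signed manifest embedded in the file itself; in JPEGs, this manifest travels as JUMBF box data inside APP11 segments. As a minimal illustrative sketch (not Cloudflare’s or Adobe’s implementation), the following Python snippet scans a JPEG byte stream for such a segment, detecting that Content Credentials are present without validating their signatures:

```python
import struct

def has_jumbf_segment(jpeg_bytes: bytes) -> bool:
    """Scan a JPEG byte stream for an APP11 (0xFFEB) segment carrying
    JUMBF box data, the container C2PA manifests are embedded in.

    This only detects the *presence* of a manifest; verifying the
    cryptographic signatures inside it requires a full C2PA library."""
    if jpeg_bytes[:2] != b"\xff\xd8":       # SOI marker: not a JPEG at all
        return False
    i = 2
    while i + 4 <= len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:           # lost sync with the marker stream
            return False
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:                  # SOS: entropy-coded image data follows
            return False
        # Segment length is big-endian and includes its own two bytes.
        (seg_len,) = struct.unpack(">H", jpeg_bytes[i + 2:i + 4])
        payload = jpeg_bytes[i + 4:i + 2 + seg_len]
        # APP11 JUMBF payloads begin with the common identifier "JP".
        if marker == 0xEB and payload[:2] == b"JP":
            return True
        i += 2 + seg_len
    return False
```

In practice one would use a dedicated verifier such as the open-source `c2patool` from the Content Authenticity Initiative, or Adobe’s hosted Inspect tool, which also check the signature chain rather than just the container.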