AI & DEEPFAKE

Sony adds 3D information to confirm the authenticity of images

Sony is helping develop the C2PA certification, which confirms the authenticity of images both at the moment of capture and via 3D depth information from the image itself.


During the press conference for the Sony World Photography Awards in London, Yann Salmon-Legagneur, Marketing Director for Sony Europe, presented Sony's contribution to the authentication of image authenticity.

Yann Salmon-Legagneur

Yann Salmon-Legagneur explained that tests by the collaboration known as C2PA, the Coalition for Content Provenance and Authenticity, have successfully confirmed the authenticity of images using the algorithms involved. The coalition consists of Sony, Adobe, BBC, Google, Intel, Microsoft, Truepic, and Publicis, who have developed their own specification for verification.

"The spread of false information and images has a real social impact that harms not only our photojournalists and news agency partners, but society as a whole," Salmon-Legagneur said earlier, in connection with the official announcement of the collaboration.

By using a camera with software that supports the C2PA verification process, anyone can verify the authenticity of an image by checking the image file through the 'capture, sign, inspect' workflow.
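The 'capture, sign, inspect' workflow can be sketched in simplified form. Note that this is a conceptual illustration only: real C2PA manifests use public-key certificates and a detailed binary format, whereas here a hypothetical HMAC key stands in for the camera's signing key.

```python
import hashlib
import hmac

# Hypothetical stand-in for the private signing key provisioned in the camera.
CAMERA_KEY = b"secret-key-provisioned-in-camera"

def capture_and_sign(image_bytes: bytes) -> dict:
    """Capture + sign: hash the image data and sign the digest (simplified 'manifest')."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    signature = hmac.new(CAMERA_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return {"sha256": digest, "signature": signature}

def inspect(image_bytes: bytes, manifest: dict) -> bool:
    """Inspect: recompute the hash and confirm the signature still matches."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    expected = hmac.new(CAMERA_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return digest == manifest["sha256"] and hmac.compare_digest(expected, manifest["signature"])

original = b"...raw image data from the sensor..."
manifest = capture_and_sign(original)
print(inspect(original, manifest))                # True: the untouched image verifies
print(inspect(original + b"edited", manifest))    # False: any alteration breaks verification
```

The point of the sketch is the last two lines: verification succeeds only while the file is byte-for-byte identical to what the camera signed, so any later manipulation is detectable.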

A sort of 'birth certificate' is created for the image file, together with depth information describing where objects sit in three-dimensional space, and this information is then encrypted. What Sony adds is validation that the image was taken with a camera and that the subject actually existed, rather than being inserted by, for example, generative AI.

The solution is now in its final testing phase to ensure that everything works with the protocol. To be able to verify the images, they must be taken with a camera that supports C2PA, which at the time of writing includes the Sony A1, A7 IV, A7s III, and A9 III.

Material created with Adobe's generative AI engine Firefly carries "Content Credentials", Adobe's own system for identifying material created with generative AI (showing how, when, and with what the material was created), built on the open and free authentication framework CAI (Content Authenticity Initiative). The AI-generated material remains tagged regardless of how it has been stored, published, or used.

Camera & Image has also written about Amnesty International, which after criticism removed AI-generated images it had published to accompany its condemnation of the Colombian police's violent actions against protesters in the country. Even though the images were stated to be AI-generated, the choice caused a significant uproar, and ultimately it is the work of journalists and photographers that suffers.