
Microsoft releases "deep fake" detection tool

02 September 2020


Just in time for the US election

Software King of the World Microsoft is releasing new technology to fight "deepfakes" that can be used to spread false information ahead of the US election.

"Microsoft Video Authenticator" analyses videos and photos and provides a score indicating the chance that they're manipulated, the company said. Deepfakes use artificial intelligence to alter videos or audio to make someone appear to do or say something they didn't. Microsoft's tool aims to identify videos that have been altered using AI, according to a Volish blog post.

The digital tool works by detecting features that are unique to deepfakes but that are not necessarily evident to people looking at them.

These features -- "which might not be detectable to the human eye" -- include subtle fading and the way boundaries between the fake and real materials blend together in altered footage. The tool will initially be available to political and media organisations "involved in the democratic process", according to the company.
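For illustration only, and assuming nothing about the Vole's actual implementation, the scoring idea boils down to running every frame of a clip through a detector and reporting a per-frame confidence value. In the Python sketch below, the model callable is a hypothetical stand-in for whatever classifier spots the blending and fading artefacts.

import cv2
import numpy as np

def frame_scores(path, model):
    """Yield a 0-1 'chance of manipulation' score for each frame of a video."""
    cap = cv2.VideoCapture(path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # A real detector would look for blending boundaries and subtle fading
        # that the human eye misses; 'model' is a placeholder for that classifier.
        score = model(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        yield float(np.clip(score, 0.0, 1.0))
    cap.release()

A per-frame score like this is what lets the tool flag only the manipulated stretches of an otherwise genuine clip rather than labelling the whole video fake.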

A second Microsoft tool, also announced on Tuesday, lets video creators certify that their content is authentic and signal to online viewers that deepfake technology hasn't been used, relying on a Microsoft certification the post said has "a high degree of accuracy". Viewers can access the check through a browser extension.
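To picture how such a certify-then-verify scheme could hang together, here is a generic sketch, not Microsoft's implementation; the key and function names are made up. The publisher signs a hash of the media file, and the viewer's side (say, a browser extension) recomputes the hash and checks the signature before declaring the content untouched.

import hashlib
import hmac

SIGNING_KEY = b"publisher-secret"  # placeholder; a real system would use certificates

def certify(content: bytes) -> str:
    # Publisher side: hash the media and sign the digest.
    digest = hashlib.sha256(content).hexdigest()
    return hmac.new(SIGNING_KEY, digest.encode(), hashlib.sha256).hexdigest()

def verify(content: bytes, signature: str) -> bool:
    # Viewer side: recompute the hash and compare signatures.
    digest = hashlib.sha256(content).hexdigest()
    expected = hmac.new(SIGNING_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

The sketch uses a shared secret for brevity; a production scheme would use public-key certificates so that viewers can verify content without ever holding the publisher's signing key.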
