DW is a partner in the vera.ai (Verification Assisted by Artificial Intelligence) project. One of the project's aims is to build reliable AI solutions to identify disinformation.
The project has kicked off with 14 partners and benefits from a foundation of verification solutions developed with DW involvement.
Within the consortium, DW provides the other 13 members with user-focused feedback on the solutions and services to be developed. DW itself has four spheres of involvement in the project, beginning with the operation of the project website and overall communication and dissemination of the project's activities and outcomes. DW also formulates user requirements and feedback to aid development and the integration of developed solutions into existing services. Finally, DW is involved in three technological tasks, assisting the respective task leaders: multilingual credibility assessment and evidence retrieval; audiovisual content analysis; and multimodal deepfake and manipulation analysis.
"As an international broadcaster and important player in the verification sphere, DW is a well-suited and dedicated partner to test and further co-develop the next level of verification solutions," says Jochen Spangenberg, Deputy Head of DW Research & Cooperation Projects and DW's vera.ai Project Lead. "The other partners are research institutes, universities, and commercial companies, for instance. France-based news agency AFP and the European Broadcasting Union are also acting as so-called user partners."
The vera.ai project and its partners are working to develop and build reliable AI solutions to aid in countering disinformation. These solutions are intended to cover all types of content, including audio, images, text, and video. The target user group consists of journalists, fact-checkers, researchers, and investigators of human rights violations, for instance, who need to be able to detect fake content quickly and easily.
In parts, vera.ai can build on the experience gained through various verification projects the DW Research & Cooperation team has been involved in since its first project in 2012.
Some examples of more prominent projects are listed below, including the WeVerify project, which provides some of the building blocks forming the basis for vera.ai.
The WeVerify project began in 2019, building on the forerunner project InVID, and focused on researching and developing various components and features to support data analysis, eventually leading to the verification plug-in. The browser extension is free of charge and optimized for use with the Chrome browser. It is currently used every week by more than 70,000 journalists, human rights investigators, and others to fact-check and analyze digital media ranging from images and video to text.
Go Verify is a verification game played against the backdrop of a fast-moving social network. It teaches players the most important fundamentals of verification: How can users easily find out whether content has been manipulated, and whom can they trust? It also introduces a few simple tools, such as reverse image search and detailed web searches, to aid in identifying fakes. As always, the most important rule in the game is: when in doubt, do not share! More information is available in the DW Innovation blog.
Truly Media is a collaborative tool that enables users to work together to find, organize, and verify digital, user-generated content. The platform, co-developed with the Greek software company ATC (Athens Technology Center), also integrates or links to third-party tools that can be used for verification. Truly Media is currently in use by journalists at DW, Reuters, and ZDF. It is also part of the central infrastructure of the EDMO (European Digital Media Observatory) and the respective EDMO Hubs.
The Digger project aims to use visual verification and audio forensic technologies to detect fakes online. DW worked together with Fraunhofer IDMT and ATC to develop tools for video verification and training for the visual verification of synthetic media. Digger assists with the detection of both shallowfakes (audiovisual content manipulated with low-tech methods such as cut-and-paste or speed adjustments, which, often taken out of context, can be extremely convincing) and deepfakes/synthetic media (highly realistic but artificial audiovisual content generated with technologies such as machine learning).