In a bid to control fabricated or manipulated content, known as deepfakes, on its platform, Twitter on Monday asked users to complete a survey before it formulates a new policy for tackling such deepfake or shallowfake content.
The survey asks a range of questions about misleading altered media, that is, photos and videos that have been changed to deceive or confuse others, and about how strictly Twitter should act against it.
Twitter may place a notice next to Tweets that share synthetic or manipulated media or warn people before they share or like Tweets with synthetic or manipulated media.
The micro-blogging platform can also add a link — for example, to a news article or Twitter Moment — so that people can read more about why various sources believe the media is synthetic or manipulated.
“Your individual responses are entirely confidential and will not be shared outside of Twitter, but we may share common themes and overall results,” Twitter said in a statement.
Examples of misleading altered media include videos that make someone appear sick, photos that add people who were not present or remove people who were, and videos that depict events that never actually happened.
“Deepfakes” are video forgeries that make people appear to be saying things they never did, like the popular forged videos of Facebook CEO Mark Zuckerberg and US House Speaker Nancy Pelosi that went viral recently.
“In addition, if a Tweet including synthetic or manipulated media is misleading and could threaten someone’s physical safety or lead to other serious harm, we may remove it,” said Twitter.
According to the company, misleading altered media does not include photos and videos that are edited to remove blemishes or physical imperfections.
Twitter is not alone in fighting the growing menace: Facebook, the Partnership on AI, Microsoft, and academics from Cornell Tech, MIT, the University of Oxford, the University of California, Berkeley, the University of Maryland, College Park, and the University at Albany-SUNY have joined forces to build the Deepfake Detection Challenge (DFDC).