Zuckerberg defends Facebook's decision to keep up Pelosi ‘deepfake’ video
Facebook (FB) CEO Mark Zuckerberg on Wednesday defended the decision to keep a doctored video of House Speaker Nancy Pelosi live on its site — but he admitted the social networking giant should have labeled it a fraud more quickly.
During a conversation with Harvard Law School professor Cass Sunstein at the Aspen Ideas Festival, Zuckerberg said that while he doesn't want Facebook to control whether users can share inaccurate information with each other, the company is looking at new policies to deal with so-called deepfake videos.
"I think that what we want to be doing is improving execution. But I do not think we want to go so far toward saying that a private company prevents you from saying something it thinks is actually incorrect to another person," he said.
New policies for deepfakes
Deepfakes are videos that use artificial intelligence to make individuals appear to do and say things they never did. Examples include clips showing former presidents Barack Obama and George W. Bush making statements they never made, and another in which a researcher manipulates a video of Russian President Vladimir Putin by moving his own mouth and eyebrows.
While deepfakes have caught on in recent years, they didn't truly explode onto the national stage until a video altered to make Pelosi appear drunk was uploaded to Facebook, YouTube, and Twitter in May.
That video wasn't a true deepfake since it was simply slowed down to make Pelosi look inebriated and didn’t use sophisticated AI technologies. However, it still managed to raise the specter of deepfakes being used to manipulate elections and spread misinformation.
While Google took down the YouTube version of the video, both Facebook and Twitter left it up. Zuckerberg said Facebook kept the video online because he believes it's better to let people see false information like the Pelosi clip labeled as fraudulent than to hide it from users.
"I feel like that is important, because if you are just hiding things that are rumors then how would you refute them? I do think it would be an overreach to say, 'Hey you shouldn't be able to say something that is not correct to your friends.' "
But deepfakes, Zuckerberg admitted, represent a new frontier for misinformation, and likely need to be treated as such.
"I do think saying deepfakes are different from misinformation is a reasonable perspective," the CEO said. "I think that we need to make sure in doing this that we need to define what a deepfake is very clearly."
Zuckerberg said he fears that if Facebook's policies are too broad, people who dislike the way interviews or videos featuring them are edited will petition to have them removed from Facebook.
A failure of execution
Though Zuckerberg doesn’t believe the Pelosi video should have been taken down, he said the company didn't act quickly enough to identify it and label it as a fake.
"One of the issues in the example of the Pelosi video that you mentioned, which was an execution mistake on our side, was it took a while for our systems to flag it and for the checkers to rate it as false."
The fear behind deepfakes is that, unlike a written article, they could appear so realistic that viewers have no way to tell they're false. There's also the worry that a deepfake of a world leader could create or exacerbate global conflicts.
Then there's the concern that a politician or other leader who doesn't like how a comment is received could point to the existence of deepfakes to deny ever having said it at all.
For now, though, Facebook's policy is that it won't tell users when they can and can't lie. And videos like the one depicting Pelosi will stay up.
Email Daniel Howley at [email protected]; follow him on Twitter at @DanielHowley.