Murder Video Again Raises Questions About How Facebook Handles Content

Video of a murder uploaded to Facebook this week upset many users, especially since it took Facebook two hours to take it down. But the incident illustrates a dilemma for the company as it becomes an open platform for both recorded and livestreamed video.

Facebook CEO Mark Zuckerberg was contrite about the incident when he appeared on stage at the company's F8 developer conference. "Our hearts go out to the family and friends of Robert Godwin Sr.," said Zuckerberg, referring to the man whose murder was posted on Facebook. "And we have a lot of work, and we will keep doing all we can to prevent tragedies like this from happening."

But doing more may not be so easy for Facebook. On the one hand, its users want to be free to express themselves; on the other hand, they do want some protection.

"Half the time it's, 'Oh no, Facebook didn't take something down, and we think that's terrible; they should have taken it down,' " says Daphne Keller, a law professor at Stanford University. "And the other half of the time is, 'Oh no! Facebook took something down and we wish they hadn't.' "

Keller points to an incident last year when Facebook took down a post of an iconic Vietnam War photo of a naked girl running from a napalm attack. The removal upset users.

Keller says Facebook isn't actually under any legal obligation to keep anything up or to take down a video of a crime. The company wants to respond in order to keep users happy. "They want to take things like this down, and they're working really hard to have a good way to do that," she says.

Keller thinks part of Facebook's dilemma is that society isn't sure yet whether the company should be like the phone company, which isn't responsible for what people say, or like a traditional broadcaster, subject to strict regulations on what can be put on air.
"And I think Facebook isn't really exactly like either of those two things," says Keller, "and that makes it hard as a society to figure out what it is we do want them to do."

Nearly 2 billion people use Facebook each month, and millions of them upload videos every day. Facebook also pays media outlets, including NPR, to upload videos. That volume of content makes Facebook's job a lot harder.

The company has three ways of monitoring content: There are the users, like the ones who flagged the murder videos from Cleveland. There are human editors, who evaluate flagged content. And there's artificial intelligence, which can monitor enormous amounts of content.

But even AI has its limits, says Nick Feamster, a professor of computer science at Princeton University. Take that iconic photo of the naked girl from Vietnam, he says. "Can we detect a nude image? That's something that an algorithm is pretty good at," he says. "Does the algorithm know context and history? T…