The planned controls, which were announced in a late-night Facebook post, follow accusations that a flood of fake news stories influenced the U.S. presidential election.
"The bottom line is: we take misinformation seriously," wrote Zuckerberg. "We take this responsibility seriously. We've made significant progress, but there is more work to be done."
The CEO said that Facebook (FB, Tech30) is working to develop stronger fake news detection, a warning system, easier reporting and technical ways to classify misinformation. Facebook has also been in contact with fact-checking organizations.
For Zuckerberg, it's a sharp reversal in tone from comments made in the immediate aftermath of the election.
"I think the idea that fake news on Facebook -- of which it's a small amount of content -- influenced the election in any way is a pretty crazy idea," he said last week.
Zuckerberg has come under pressure to do more to fight the fake news scourge. Some former employees said the CEO's public comments even contradict Facebook's pitch to advertisers.
The site's core business is built on the premise that advertisers can use Facebook's targeting tools to show the right users the right message at the right time, leading to the right outcome. If it works for advertisers, shouldn't it also work for political campaigns?
Zuckerberg's latest post makes clear that Facebook does not want to play the role of an editor.
"The problems here are complex, both technically and philosophically," he said. "We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties."
Zuckerberg did not say how quickly the measures would be in place. But they should make it much easier for users to flag misleading content -- similar to the way cyberbullying can be reported with a single click on some social media platforms.
The social media giant is also working to undermine the business model used by fake news publishers.
Facebook said earlier this week that it would not place ads from fake news publishers on third party apps or websites, because the content falls under the broader category of "illegal, misleading or deceptive" content. Google (GOOG) has taken similar steps.
"Some of these ideas will work well, and some will not," Zuckerberg admitted. "We understand how important the issue is for our community and we are committed to getting this right."
"The bottom line is: we take misinformation seriously," wrote Zuckerberg. "We take this responsibility seriously. We've made significant progress, but there is more work to be done."
The CEO said that Facebook (FB, Tech30) is working to develop stronger fake news detection, a warning system, easier reporting and technical ways to classify misinformation. Facebook has also been in contact with fact checking organizations.
For Zuckerberg, it's a sharp reversal in tone from comments made in the immediate aftermath of the election.
"I think the idea that fake news on Facebook -- of which it's a small amount of content -- influenced the election in any way is a pretty crazy idea," he said last week.
Zuckerberg has come under pressure to do more to fight the fake news scourge. Some former employees said the CEO's public comments even contradicts Facebook's pitch to advertisers.
The site's core business is built on the premise that advertisers can use Facebook's targeting tools to show the right users the right message at the right time leading to the right outcome. If it works for advertisers, shouldn't it also work for political campaigns?
Zuckerberg's latest post makes clear that Facebook does not want to play the role of an editor.
"The problems here are complex, both technically and philosophically," he said. "We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties."
Zuckerberg did not say how quickly the measures would be in place. But they should make it much easier for users to flag misleading content -- similar to the way cyberbullying can be reported with a single click on some social media.
The social media giant is also working to undermine the business model used by fake news publishers.
Facebook said earlier this week that it would not place ads from fake news publishers on third party apps or websites, because the content falls under the broader category of "illegal, misleading or deceptive" content. Google (GOOG) has taken similar steps.
"Some of these ideas will work well, and some will not," Zuckerberg admitted. "We understand how important the issue is for our community and we are committed to getting this right."