Zuckerberg mentor, Facebook investor support allegations company incentivized hatred, polarization

Stephanie Sierra
Tuesday, October 5, 2021

SAN FRANCISCO (KGO) -- Internal documents leaked by a Facebook whistleblower revealed the company had identified that changes to its algorithm were incentivizing hatred and polarization among users. The tech giant is now facing accusations that it didn't take action to suppress the negative content over concerns it would hurt ad revenue.



RELATED: Facebook whistleblower 2021: Ex-manager alleges social network fed Capitol riot



"The first signals began to appear towards the end of the three years I advised him," said Roger McNamee, a Silicon Valley investor and mentor to Mark Zuckerberg.



McNamee authored the New York Times bestseller "Zucked," which warned the company could have disastrous effects on our democracy. He says he saw early signs of the company's dysfunction.



"The first signals began to appear towards the end of the three years that I advised him," said McNamee. "I wish I had seen it sooner but, when I did see it in the beginning of 2016, it led me to reach out to Mark in October of 2016 before the Presidential election to warn them the culture and business models of Facebook were allowing bad people to hurt innocent people."



Josh Constine is a principal investor at the venture capital fund SignalFire and formerly worked as editor-at-large for TechCrunch, where he covered Facebook for ten years. He says the leaked documents reaffirmed what experts already feared was happening.



VIDEO: Lawmakers compare Instagram's effects on teens to cigarette addiction during Senate hearing


Senators fired criticism at a Facebook executive over the company's handling of internal research on how Instagram can harm teens.


"I don't think this is a surprise," Constine said. "The research finally proves what so many of us have already thought...Facebook is cutting corners internationally, that it was giving VIPS and celebrities special treatment instead of actually moderating their posts and it would actually weave up content that blatantly violated its rules."



Constine explained that Facebook's 2018 algorithm change rewarded divisive content, incentivizing political parties and news outlets to make 'angrier' posts. A leaked slide first reported by the Wall Street Journal showed Facebook had identified parties that shifted to making 80 percent of their posts negative. But whistleblower allegations revealed executives refused to change the algorithm over fear of revenue loss.



"Facebook has invested a ton of money into hiring moderators to make sure the content on its platform is safe, yet it still says only one to five percent of hate and violence on its platform necessarily get taken down," said Constine.



RELATED: US attorneys general slam Facebook's plan for kid-targeted Instagram platform



Levent Ertaul chairs the Computer Science Department at California State University, East Bay, and has studied Facebook's platform for more than a decade. He says the company has made many poor decisions that contributed to these allegations, including leaving up video footage of the New Zealand mosque shootings for six months after the tragedy.



"They couldn't even identify the livestreams of the killings on their platforms for like a half an hour," he said. "We knew they weren't doing enough."



Experts say the company should be required to dedicate a set portion of its budget to moderating its platform to ensure safety.



"It may need to commit to a specific level of its profits that it's willing to spend to protect its own service," said Constine. "World governments may need to say, 'if you don't spend this amount of money you can't operate here.'"



ABC7 reached out to Facebook for comment but has yet to hear back from the company.


