
I began video-calling Quiñonero regularly. I also spoke to Facebook executives, current and former employees, industry peers, and external experts. Many spoke on condition of anonymity because they'd signed nondisclosure agreements or feared retaliation. I wanted to know: What was Quiñonero's team doing to rein in the hate and lies on its platform?
But Entin and Quiñonero had a different agenda. Each time I tried to bring up these topics, my requests to speak about them were dropped or redirected. They only wanted to discuss the Responsible AI team's plan to tackle one specific kind of problem: AI bias, in which algorithms discriminate against particular user groups. An example would be an ad-targeting algorithm that shows certain job or housing opportunities to white people but not to minorities.
By the time thousands of rioters stormed the US Capitol in January, organized in part on Facebook and fueled by the lies about a stolen election that had fanned out across the platform, it was clear from my conversations that the Responsible AI team had failed to make headway against misinformation and hate speech because it had never made those problems its main focus. More important, I realized, if it tried to, it would be set up for failure.
The reason is simple. Everything the company does and chooses not to do flows from a single motivation: Zuckerberg's relentless desire for growth. Quiñonero's AI expertise supercharged that growth. His team got pigeonholed into targeting AI bias, as I learned in my reporting, because preventing such bias helps the company avoid proposed regulation that might, if passed, hamper that growth. Facebook leadership has also repeatedly weakened or halted many initiatives meant to clean up misinformation on the platform because doing so would undermine that growth.
In other words, the Responsible AI team's work, whatever its merits on the specific problem of tackling AI bias, is essentially irrelevant to fixing the bigger problems of misinformation, extremism, and political polarization. And it's all of us who pay the price.
"When you're in the business of maximizing engagement, you're not interested in truth. You're not interested in harm, divisiveness, conspiracy. In fact, those are your friends," says Hany Farid, a professor at the University of California, Berkeley, who collaborates with Facebook to understand image- and video-based misinformation on the platform.
"They always do just enough to be able to put the press release out. But with a few exceptions, I don't think it's actually translated into better policies. They're never really dealing with the fundamental problems."




