Facebook Is a Black Box. It Needs a Window. - The Atlantic

The overarching takeaway from the Facebook Papers is that Facebook knows. The company monitors just about everything, as the whistleblower Frances Haugen revealed by providing 17 news organizations with documents about the social-media company’s internal research and discussions. Facebook and its tech-industry peers employ armies of exceptional research scientists who evaluate how the platform shapes social behavior. Those researchers agree to a Faustian bargain—in exchange for limitless data, they sign nondisclosure agreements. And as the Facebook Papers document, these employees have discovered a range of disturbing problems that, if not for Haugen, might never have become publicly known. Even when employees of Facebook (which officially renamed itself Meta on Thursday) have privately objected to the company’s decisions to put profit over public safety, they’ve in many cases been overruled by Mark Zuckerberg and other executives.

If Facebook employees continue to be the only ones who can monitor Facebook, neither they nor anyone else can make sufficient progress in understanding how toxic content thrives on social-media platforms, how it leads to human suffering, and how best to diminish that suffering. Facebook and other social-media giants have amassed enormous power over the global flow of information: Facebook alone censors millions of posts every day, suppressing more human communication than any government ever has. The decisions that their employees and their algorithms make about what to amplify and what to suppress end up affecting people's well-being. Yet the companies are essentially black boxes, entities whose inner workings are virtually unknowable to people on the outside.

Particularly in the absence of outside oversight, private companies cannot be expected to work in the public interest. It is neither their purpose nor their role. That’s why independent researchers at news organizations, universities, and civil-society groups need to be permitted to pursue and gather knowledge on behalf of the public. Compelling that access and protecting it by law is essential to holding internet platforms accountable.

I am working with other researchers—including J. Nathan Matias at Cornell and Rebekah Tromble and David Karpf at George Washington University—on initiatives that would guarantee the sharing of key information. (Matias, Tromble, and Karpf all contributed to this article.) We have observed that when independent researchers have tried to develop public-interest evidence on digital harms, tech companies have regularly obstructed their efforts. For years, Facebook has blocked transparency research from ProPublica, the Markup, New York University, AlgorithmWatch, and others. In 2018, Social Science One, a research consortium based at Harvard, partnered with Facebook in an attempt to obtain data from Zuckerberg’s company about social media’s impact on democracy and share those data with academics. (Journalists and civil-society researchers were not allowed to take part.) Facebook took much longer than expected to roll out the first promised data set, which was of limited utility and was recently revealed to have a fundamental flaw that undermined the findings of the researchers who relied on it.

And Facebook is far from alone in wanting the PR boost that comes from ostensibly working with external researchers while slow-walking independent research or preventing it from happening at all. For example, years after Twitter publicly promised to work with outside researchers studying online harassment, the company’s own initiative assessing how to promote healthy conversations stalled.

Independent scrutiny of corporate behavior is standard in other industries—especially those that, like internet companies, make technology available for private uses that can make life easier but can also lead to serious injury. Crash tests on new cars, for instance, are conducted by the National Highway Traffic Safety Administration, which publishes its procedures and findings. Auto-safety regulation supported by independent testing has saved thousands of lives each year. Regulation wouldn’t work as effectively if car companies were the only ones that could test their own cars, and if they kept the test results secret. The best solutions for safety come from a robust, competitive knowledge search, whereby disparate researchers critique one another’s work and try to improve on it.

You don’t have to think that tech companies or their CEOs are evil to see that they can’t provide effective oversight of their own businesses. Even when a company like Facebook discovers ways to protect its users and the common good—such as by diminishing the reach of misinformation that can kill people—it may not use them. In August 2020, for example, a Facebook staff member noted that the platform’s system of recommending posts and pages can “very quickly lead users down the path to conspiracy theories and groups.” The staffer went on to protest, in vain, that “out of fears over potential public and policy stakeholder responses, we are knowingly exposing users to risks of integrity harms.”

Matias, Tromble, Karpf, and I, among others, are proposing a new system of governance and external oversight of tech companies. Under this system, companies would be required to grant access to independent researchers, who could then collect and analyze data about harms arising on an internet platform, interview people about them, and test methods to reduce or fully prevent those harms.

Facebook has objected to independent research by claiming that it might violate users’ privacy. In fact, most researchers are far more concerned about privacy than most internet companies are.

In any case, in the system that my colleagues and I propose, a new independent entity would vet academic researchers in advance for their capacity and commitment to work ethically and protect privacy. Their study designs and privacy practices would be reviewed in advance by independent ethics boards, just as the U.S. government and universities already require for other research on humans and animals, but with additional ethics and privacy standards to reflect the complexities of social-media data. These standards should protect both the data and the people affected by the research. The system we propose would also extend access to journalists and civil-society researchers, who would be subject to the same review for ethical and privacy safeguards.

In this system of industry oversight, independent researchers would be obliged to make their results freely available to other researchers and the public.

Companies, for their part, would be required to rigorously protect the privacy of all individuals affected in any way by the proposed research. They would be barred from attempting to influence the outcomes of research, or from suppressing findings for any reason other than the legitimate protection of privacy. We would require companies to grant access to data, people, and relevant software code in the form researchers need.

Since the tech companies have demonstrated all too clearly that they won’t permit oversight on their own, such oversight must be required by regulation. My three colleagues and I have come to this position the hard way, slowly, after trying all the alternatives. Independent researchers like us have attempted for years to collaborate with tech companies on methods to make their platforms safer. It has not worked. After making grand promises, they have refused to share essential information with us. We have tried to monitor the companies’ practices and platforms on our own. They have shut down our accounts and blocked our research tools.

And though the trove of information in the Facebook Papers is rich, leaks are by nature ad hoc and intermittent. Relying on them won’t provide regular oversight of platforms whose inner workings are complex. Whistleblowers are essential, but they cannot replace independent, systematic research. The papers record only the research that Facebook chose to do itself, and the critiques and debate that occurred among the company’s own staff. It is clear that many researchers inside Facebook and other tech companies strive to do the right thing. But as long as their findings are hidden and their recommendations are ignored, companies cannot be held accountable for how they use their researchers’ work.

The Facebook Papers prove that Facebook and other companies cannot police themselves. Effective oversight requires rigorous, regular, independent scrutiny, and the United States needs to cut a permanent window into the tech world’s black boxes.
