As COVID-19 spread rapidly across the world in 2020, people everywhere were hungry for reliable information. A global network of volunteers rose to the challenge, consolidating information from scientists, journalists and medical professionals, and making it accessible to everyday people.
Two of them live nearly 3,200 kilometers apart: Dr. Alaa Najjar is a Wikipedia volunteer and medical doctor who spends breaks during his emergency room shifts addressing COVID-19 misinformation on the Arabic version of the site. Sweden-based Dr. Netha Hussain, a clinical neuroscientist and doctor, spent her downtime editing COVID-19 articles in English and Malayalam (a language of southwestern India), later focusing her efforts on improving Wikipedia articles about COVID-19 vaccines.
Thanks to Najjar, Hussain and more than 280,000 volunteers, Wikipedia emerged as one of the most trusted sources for up-to-date, comprehensive knowledge about COVID-19, with coverage spanning nearly 7,000 articles in 188 languages. Wikipedia’s reach and its ability to support knowledge-sharing on a global scale, from informing the public about a major disease to helping students study for tests, are made possible only by laws that enable its collaborative, volunteer-led model to thrive.
As the European Parliament considers packages like the Digital Services Act (DSA), new regulations aimed at holding Big Tech platforms accountable for illegal content amplified on their websites and apps, it must protect citizens’ ability to collaborate in service of the public interest.
Lawmakers are right to try to stem the spread of content that causes physical or psychological harm, including content that is illegal in many jurisdictions. As they consider a range of provisions for the comprehensive DSA, we welcome some of the proposed elements, including requirements for greater transparency about how platforms’ content moderation works.
But the current draft also includes prescriptive requirements for how terms of service should be enforced. At first glance, these measures may seem necessary to curb the rising power of social media, prevent the spread of illegal content and ensure the safety of online spaces. But what happens to projects like Wikipedia? Some of the proposed requirements could shift power further away from people to platform providers, stifling digital platforms that operate differently from the large commercial platforms.
Big Tech platforms work in fundamentally different ways from nonprofit, collaborative websites like Wikipedia. All of the articles created by Wikipedia volunteers are available for free, without ads and without tracking our readers’ browsing habits. The commercial platforms’ incentive structures maximize profits and time on site, using algorithms that leverage detailed user profiles to target people with the content most likely to influence them. They deploy still more algorithms to moderate content automatically, which results in errors of both over- and under-enforcement: computer programs often confuse artwork and satire with illegal content, while failing to grasp the human nuance and context necessary to enforce platforms’ actual rules.
The Wikimedia Foundation and affiliates based in specific countries, like Wikimedia Deutschland, support Wikipedia volunteers and their autonomy in making decisions about what information should exist on Wikipedia and what shouldn’t. The online encyclopedia’s open editing model is grounded in the belief that people should decide what information stays on Wikipedia, leveraging established volunteer-developed rules for neutrality and reliable sources.
This model ensures that for any given Wikipedia article on any subject, people who know and care about a topic enforce the rules about what content is allowed on its page. What’s more, our content moderation is transparent and accountable: All conversations between editors on the platform are publicly accessible. It is not a perfect system, but it has largely worked to make Wikipedia a global source of neutral and verified information.
Forcing Wikipedia to operate more like a commercial platform with a top-down power structure, lacking accountability to our readers and editors, would arguably subvert the DSA’s actual public interest intentions by leaving our communities out of important decisions about content.
The internet is at an inflection point. Democracy and civic space are under attack in Europe and around the world. Now, more than ever, all of us need to think carefully about how new rules will foster, not hinder, an online environment that allows for new forms of culture, science, participation and knowledge.
Lawmakers can engage with public interest communities such as ours to develop standards and principles that are more inclusive, more enforceable and more effective. But they should not impose on everyone rules designed solely with the most powerful commercial internet platforms in mind.
We all deserve a better, safer internet. We call on lawmakers to work with collaborators across sectors, including Wikimedia, to design regulations that empower citizens to improve it, together.
Source: https://techcrunch.com/2021/11/28/digital-regulation-must-empower-people-to-make-the-internet-better/