Showing results for tags 'wikipedia'.

Found 10 results

  1. How Wikipedia Prevents the Spread of Coronavirus Misinformation A group of hawk-eyed experts operates on a special track to monitor medical information on the site. “This edit was VERY poor,” wrote James Heilman, an emergency-room doctor in British Columbia, to a Wikipedia contributor who had made a couple of changes toward the end of the article on the new coronavirus outbreak. Those edits recommended a special type of mask for blocking the transmission of the virus from those who have it, and Heilman, a prominent figure in reviewing medical Wikipedia articles, wanted to inform the editor that this advice was too sweeping and based on insufficient evidence. More than that, he aimed to send a warning. “Please do not make edits like this again,” he wrote. Wikipedia’s reputation is generally on the ascent. Just last month, no less a publication than Wired deemed it “the last best place on the Internet.” What was once considered the site’s greatest vulnerability—that anyone can edit it—has been revealed to be its greatest strength. In the place of experts there are enthusiasts who are thrilled to share their knowledge of a little part of the world with all of humanity. As Richard Cooke, who wrote the Wired essay, observed: “It’s assembled grain by grain, like a termite mound. The smallness of the grains, and of the workers carrying them, makes the project’s scale seem impossible. But it is exactly this incrementalism that puts immensity within reach.” His point, and it’s really indisputable, is that this mammoth online project has developed a personality, a purpose, a soul. Now, as the new coronavirus outbreak plays out across its many pages, we can see that Wikipedia has also developed a conscience. The coronavirus articles on English Wikipedia are part of WikiProject Medicine, a collection of some 35,000 articles that are watched over by nearly 150 editors with interest and expertise in medicine and public health.
(A survey for a paper co-written by Heilman in 2015 concluded that roughly half of the core editors had an advanced degree.) Readers of Wikipedia wouldn’t know that an article is part of the project—the designation appears on a separate talk page and really serves as a heads-up to interested editors to look carefully at the entries. Once an article has been flagged as relating to medicine, the editors scrutinize the article with exceptional ferocity. While typically an article in The New York Times or The Wall Street Journal would be a reliable source for Wikipedia, the medical editors insist on peer-reviewed papers, textbooks, or reports from prominent centers and institutes. On these subjects, Wikipedia doesn’t seem like the encyclopedia anyone can edit, striving to be welcoming to newcomers; it certainly doesn’t profess a laid-back philosophy that articles improve over time and can start off a bit unevenly. The editor chastised by Heilman hasn’t returned to the article and instead is improving articles about sound-recording equipment. By having these different standards within its pages, Wikipedia can be a guide to the big commercial platforms that have become way stations for fake cures, bogus comparisons to past outbreaks, and political spin. Twitter, Amazon, YouTube, and Facebook have all promised to cleanse their sites of this dangerous disinformation, but they are doing so in fits and starts and by relying in part on familiar, passive tools like acting when others flag dangerous content. Here is how Facebook's Mark Zuckerberg put it in a post on March 3: “It’s important that everyone has a place to share their experiences and talk about the outbreak, but as our community standards make clear, it’s not okay to share something that puts people in danger. So we’re removing false claims and conspiracy theories that have been flagged by leading global health organizations.
We’re also blocking people from running ads that try to exploit the situation—for example, claiming that their product can cure the disease.” Wikipedia shows, however, that extreme circumstances, especially when related to public health, require different, more stringent rules, not better application of existing rules. The stakes are simply too high. I spoke this week with the Wikipedia editor who guided the article about the new coronavirus from a one-sentence item in early January to a substantial article with charts of infections around the world. She goes by the handle Whispyhistory, and is a doctor in South London; she spoke via Skype from her office, which she proudly noted had a new thermometer that looks like a laser gun. Whispyhistory has only been contributing for three years; she was recruited through an edit-a-thon at a medical library. While at first she was open with her colleagues about her side project, now she prefers to remain anonymous. “You start getting hounded by people about what you are writing,” she said. “It’s just so much easier to not use your real name.” WikiProject Medicine welcomed her, she said, but she’s had to build a reputation for accuracy and responsibility. “You have to know what you are saying,” she said, and even so it can be intimidating. “You’ve got so many people watching you.” The picture she paints of the project’s contributors is akin to the staff of a demanding teaching hospital. The editors confer on a talk page she calls “the doctors’ mess” where they perform “triage” to assess which articles require attention immediately. Science and data reign; and above all else, the pledge is to do no harm. On January 6, she said, a colleague asked her if she had heard of an outbreak of atypical pneumonia in China. She hadn’t, but “being someone who writes for Wikipedia, the first thing you do is see if it’s on Wikipedia. 
Someone had written the article the day before.” The article was thin, but Whispyhistory had the sense that “this might be something big,” so she added the WikiProject Medicine tag to the article and wrote a note informing her colleagues to pay attention to the outbreak, which they did. Like a young resident, she pulled all-nighters before showing up at the office at 6 a.m., keeping a watch over the article as the virus spread. In those early days, for instance, she saw a note on the doctors’ mess that linked to a news report claiming that the new coronavirus could survive on surfaces for nine hours. The author wanted to add that information to the Wikipedia page immediately. “That already sends an alert since there is nothing that’s really so important that you’ve got to add something straight away,” she recalled. She went from the news article to the paper that it cited, and discovered that it was looking at the SARS virus, not the (very similar) one that causes Covid-19. She decided not to include the research. As Heilman put it in an email, “Keeping Wikipedia reliable and up-to-date involves deleting material just as much as adding it.” I asked both him and Whispyhistory how the article on the new coronavirus managed to exclude the arguments that were being made (at least until recently) by President Trump and his supporters—that the disease is being hyped by Democrats and that it’s comparable to the flu. Don’t they have angry wannabe contributors accusing Wikipedia of bias? “That’s really easy to answer. ... You have to cite everything you write,” Whispyhistory said. Heilman agreed that a requirement for legitimate sourcing filters out unfounded notions. 
Bogus claims about the pandemic do show up on Wikipedia, but in a separate article: “Misinformation related to the 2019–20 coronavirus pandemic,” under the heading “Misinformation by governments/United States.” Heilman noted that Wikipedia has a structural advantage over the big social networks: “It takes more time and effort to disrupt Wikipedia than it does to restore Wikipedia to a reliable level. It’s the exact opposite on Twitter and Facebook, where it takes a second to spread false news,” while getting those lies removed will take a lot of time and effort. Unless Twitter, Facebook and the others can learn to address misinformation more effectively, Wikipedia will remain the last best place on the Internet. Source: How Wikipedia Prevents the Spread of Coronavirus Misinformation (Wired)
  2. Turkey lifts ban on Wikipedia for the first time since 2017 Turkey’s Constitutional Court has ruled that the country’s ban on Wikipedia is a violation of freedom of speech. Following the ruling, the Wikimedia Foundation said that internet service providers have begun restoring access to the site. The block, first imposed in April 2017, was allegedly put in place because Wikimedia wouldn’t delete pages which said that the Turkish government had co-operated with IS and Al-Qaeda in Syria. Commenting on the unblocking of Wikipedia, Katherine Maher, Executive Director of the Wikimedia Foundation, said: “We are thrilled to be reunited with the people of Turkey. At Wikimedia we are committed to protecting everyone’s fundamental right to access information. We are excited to share this important moment with our Turkish contributor community on behalf of knowledge-seekers everywhere.” According to the Wikimedia Foundation, a case they filed with the European Court of Human Rights (ECHR) is still being considered. The petition was launched in the spring of last year and the Court granted the case priority status in July. Going forward, it’s unclear whether the government will try to find another way to block access to the website. The Wikimedia Foundation has said it’s reviewing the full text of the ruling by the Constitutional Court of Turkey and will continue to advocate for free expression in Turkey and around the world. It also appears to be keeping the petition open that it filed with the ECHR. Source: Wikimedia Foundation via BBC News Source: Turkey lifts ban on Wikipedia for the first time since 2017 (Neowin)
  3. Wikipedia racked up more than 260 billion page views in 2019, and the overwhelming majority of the people behind them apparently shared a similar question: What the hell did I just watch? Wikipedia revealed its most popular English articles of the year, as compiled by researcher Andrew West, in a blog post Friday, and nearly all of them relate to the movies, TV shows, and video games that dominated our social media feeds at one point or another. Nowhere is that more apparent than in the most viewed Wikipedia page of 2019: Avengers: Endgame, the epic conclusion to Marvel’s decade-long story arc and Google’s most-searched movie of the year as well, according to its own year-end statistics. Endgame’s Wiki page clocked in at 44 million page views for the year, according to the nonprofit online encyclopedia’s post, likely from folks trying to wrap their heads around the time-traveling plot threads it juggles (because six infinity stones weren’t already hard enough to keep track of without timey wimey wibbly wobbly voodoo). Other media phenomena, some of them infamously polarizing like Game of Thrones and Todd Phillips’ Joker, appeared throughout the list as well. “Sixteen of the top 25 articles are related to the media people consumed in theaters and on their devices. At least five of those are directly fueled by the dominance of superhero films, including all three films released by Marvel in 2019,” the post stated. A few top contenders also seemed to be tangentially inspired by popular media: Chernobyl disaster came in at number five, likely due to HBO’s mini-series about the incident, and Ted Bundy—the real-life serial killer at the center of Netflix’s biopic Extremely Wicked, Shockingly Evil and Vile—made the top three with nearly 30 million page views.
Bucking this trend, Wikipedia’s list of 2019's famous deaths came in just behind Endgame with nearly 37 million page views (for reference, last year’s list made the top spot, even beating out the World Cup and Marvel’s companion blockbuster Avengers: Infinity War). In addition to that, goth Lorde—aka Billie Eilish—was the only entertainer whose Wikipedia page breached the top 10. Her first album, When We All Fall Asleep, Where Do We Go?, debuted at number one on the Billboard 200 list in March, and in writing this blog I learned she’s the first 2000s baby to do so. That fact has turned me to dust and all that you’re reading was published posthumously. The top 10 most popular Wikipedia pages of 2019 and their corresponding page view counts are listed below:
  • Avengers: Endgame, 43,847,319 pageviews
  • Deaths in 2019, 36,916,847 pageviews
  • Ted Bundy, 29,062,988 pageviews
  • Freddie Mercury, 26,858,123 pageviews
  • Chernobyl disaster, 25,195,814 pageviews
  • List of highest-grossing films, 24,547,640 pageviews
  • Joker (2019 film), 22,062,357 pageviews
  • List of Marvel Cinematic Universe films, 21,467,603 pageviews
  • Billie Eilish, 19,638,478 pageviews
  • Keanu Reeves, 16,622,576 pageviews
Wikipedia’s blog post lays out the rest if you’re curious. Though if you’re interested in its most popular pages behind the top 25, West went on to compile the top 5,000, which you can check out here. Go wild. Source
  4. In December 2018, the Indian government proposed changes to its Intermediaries Guidelines that govern how websites in India with more than 5 million users will operate and host content in India. According to the proposed amendments, such intermediaries are required to:
  • Set up a permanent registered office in India with a physical address
  • Appoint a nodal person of contact for coordination with law enforcement agencies
Amanda Keton, general counsel of the Wikimedia Foundation, the nonprofit group that operates Wikipedia, has written an open letter to the minister of Electronics and Information Technology, Shri Ravi Shankar Prasad, expressing her concerns. She writes that Wikipedia operates on an open editing model, and the required provisions could lead to a “significant financial burden” on nonprofit technology organizations. It could also limit free expression rights for internet users in the country. The proposed changes in the Intermediaries Guidelines intend to make the internet safer for Indian citizens by formulating rules for it. The rules also require intermediaries to automatically filter out unlawful information and content by deploying automated tools. Other nonprofit organizations, including Mozilla and GitHub, also wrote a joint letter saying that the upcoming rules “would significantly expand surveillance requirements on internet services.” Last month, Wikipedia received 771 million page views from India, which is its fifth-largest market in the world. In her open letter, Keton has asked the Ministry to release a new draft of rules which take into consideration all the expressed concerns. Source
  5. Wikipedia co-founder Jimmy Wales thinks he can create a better social network. Called WT:Social, the network has no financial association with Wikipedia, but operates on a similar business model: donations, not advertising. WT:Social went live last month and is currently nearing 50,000 users. The company is rolling out access slowly; when I signed up, I was approximately number 28,000 on the waitlist. Alternatively, you can pay $13 a month or $100 a year to get access right away. In comments to the Financial Times, Wales said “The business model of social media companies, of pure advertising, is problematic. It turns out the huge winner is low-quality content.” You don’t say. WT:Social’s interface is rather sparse at the moment, featuring a simple feed composed of news stories and comments below them. News is a big part of the network; it’s a spinoff of Wales’ previous project, WikiTribune, which sought to be a global news site composed of professional journalists and citizen contributors. Both WikiTribune and WT:Social emphasize combatting fake news, highlighting evidence-based coverage over the focus on ‘engagement’ seen on other networks. Each story posted to the network prominently displays where the article comes from, as well as its sources and references. You can also join various “SubWikis” that are essentially like Facebook groups or subreddits, which filter content to stories of a given topic. You can also add hashtags to a post or follow hashtags for more specific interests that might span more than one SubWiki. Posts are currently sorted chronologically, but the site plans to add an upvote system for users to promote quality stories. Taking on Facebook and Twitter is no easy task: Just ask Google. But Wales appears to have a more focused approach for WT:Social, aiming for meaningful content and hoping to build smaller, niche communities.
To this end, Wales doesn’t seem too concerned about running on donations; the Financial Times says Wales highlighted the success of Netflix and The New York Times as examples that people are willing to pay for meaningful content. I suppose we’ll find out if that’s true in due time. Source
  6. Controversial Wikipedia Edits Wipe Out Denuvo Crack History People interested in whether a particular Denuvo-protected game has been cracked or not can no longer quickly visit the relevant Wikipedia page and view the information easily. Controversial edits to the official Denuvo page have removed an easy-to-read column, in part due to the claim that the sources used to report pirate releases are unreliable. There can be little doubt that Wikipedia is one of the greatest resources of information available online today. The platform has plenty of critics but generally there’s a credible effort to ensure that the data presented to readers is properly researched and sourced. That’s also true for the Wikipedia page dedicated to the anti-piracy technology known as Denuvo. The anti-tamper system is the most well-known product of its type and is regularly deployed on various gaming titles, much to the disappointment of many legitimate purchasers and the vast majority of pirates. As a result, Denuvo has become a target for cracking groups, who aim to defeat the technology in the quickest possible time. Up until recently, people wanting to see a convenient list of Denuvo titles and their ‘cracked or not’ status had two obvious choices. They could visit Reddit’s appropriately-named /r/crackwatch subreddit or head over to Denuvo’s Wikipedia page, where an entire column was dedicated to the news.
[Image: a sample of how the page used to look]
This week, however, a dispute broke out behind the scenes at Wikipedia, as first publicly highlighted by a poster on Reddit’s /r/pcgaming sub. This resulted in the removal of most of the link sources in the ‘cracked’ column, later followed by the deletion of the entire column, as shown in the image below.
[Image: a sample of how the page looks now]
Without going into the minutiae (which is best handled by those more au fait with the rules, intricacies, and etiquette of Wikipedia editing), one of the key reasons the column was removed (the other is detailed here) was that the source of the material relied upon to prove that a crack actually exists isn’t acceptable. As clearly illustrated in this earlier version of the page, many of the links led to sites (such as Xrel.to) which are dedicated to archiving so-called NFO text files that cracking groups distribute with their releases. These files are usually very informative, providing key information about each release, who made it, when it was distributed, etc. However, according to the people who made the decisions behind the scenes on Denuvo’s page, sites like Xrel are not reliable sources as defined by Wikipedia. They do not carry absolute proof that a game has been cracked, they only carry text files that claim that to be the case, they argue. “I do not see how this can be an accurate proof whether a game is cracked or not since this site does not offer any cracks, they just have (easy to fake) nfo files. Notice about not reliable source exist since August 2016 but has been ignored by authors,” one of the editors commented. Those who understand how sites like Xrel and many pre-databases work will probably be disappointed that they’re not considered legitimate sources. Fake NFO files are simply not tolerated and any sites publishing them would be quickly called out by their users and/or abandoned for a more accurate source. In this case the Wikipedia rules are being strictly enforced, which creates problems. Clearly, posting a link to a torrent of a cracked game wouldn’t be acceptable, so an NFO database is usually the next best thing. Sadly, however, we know from experience that NFO files don’t meet Wikipedia’s standards. It was many years ago now, and I no longer have the original emails to quote from.
However, I can confirm having a short conversation with Wikipedia co-founder Jimmy Wales who was very clear that sites like Xrel (I believe we were actually talking about the now-defunct Nforce NFO database at the time) are not acceptable sources for Wikipedia. This presents a challenge moving forward. Given that there are so many pirate releases every single day, there is no source for them that meets Wikipedia standards, unless a credible news source reports on each and every one. Clearly, reporting on everything isn’t necessary but it’s a shame that properly curated and maintained resources for release data can’t be used on the Denuvo page. The fact that games have been cracked can still be reported in the body of the page, but the easy reference column appears to have gone for good. Given Denuvo’s controversial nature, there’s some speculation that the edits were designed to protect the company’s position. However, as numerous people have pointed out, potential customers in the video game industry won’t be using Wikipedia as their primary research platform before deciding whether to spend money with Denuvo. Source
  7. On Tuesday, Wikipedia celebrated its 18th birthday. If the massive crowdsourced encyclopedia project were human, then in most countries, it would just now be considered a legal adult. But in truth, the free online encyclopedia has long played the role of the Internet's good grown-up. Wikipedia has grown enormously since its inception: it now boasts 5.7 million articles in English and pulled in 92 billion page views last year. The site has also undergone a major reputation change. If you ask Siri, Alexa or Google Home a general-knowledge question, it will most likely pull the response from Wikipedia. The online encyclopedia has been cited in more than 400 judicial opinions, according to a 2010 paper in the Yale Journal of Law & Technology. Many professors are ditching the traditional writing assignment and instead asking students to expand or create a Wikipedia article on the topic. And YouTube Chief Executive Susan Wojcicki announced a plan last March to pair misleading conspiracy videos with links to corresponding articles from Wikipedia. Facebook has also released a feature using Wikipedia's content to provide users more information about the publication source for articles in their feed. Wikipedia's rise is driven by a crucial difference in values that separates it from its peers in the top 10 websites: On Wikipedia, truth trumps self-expression. Last year, Wikipedia co-founder Jimmy Wales told NPR that Wikipedia has largely avoided the "fake news" problem, raising the question of what the encyclopedia does differently than other popular websites. As Brian Feldman suggested in New York magazine, perhaps it's simply the willingness within the Wikipedia community to delete. If a user posts bad information on Wikipedia, other users are authorised and empowered to remove that unencyclopedic content. It's a striking contrast to Twitter, which allows lies and inflammatory statements to remain on its platform for years. 
Every generation's encyclopedists face adversaries - in the 18th century, Denis Diderot and other authors of the Encyclopédie were denounced as heretics - and today's Wikipedians confront serious challenges: an often hostile editing environment with regular editors who "bite" the newbies, a long-term decline in the contributor community, bad actors who hack administrator accounts to vandalise pages, and an overall systemic bias in its coverage, caused in part by a contributor base that's mostly Western and male. Gender bias on Wikipedia received media attention last year when Donna Strickland won a Nobel Prize, and - at the time of her award - did not have a Wikipedia page. (An earlier entry on Strickland had been rejected due to lack of "notability".) Katherine Maher, executive director of the nonprofit Wikimedia Foundation, responded with a commentary observing that the encyclopedia mirrors, but does not cause, the world's biases. It's worth emphasizing, however, how the volunteer group WikiProject Women in Red, co-founded by Rosie Stephenson-Goodknight and Roger Bamkin in 2015, has committed to reduce the site's gender gap and has already increased the percentage of female biographies from about 15 percent to 17.8 percent. Millennials coined the word "adulting" to describe mundane acts of grown-up self-sufficiency. But perhaps the term could be expanded to include moral maturity and repeated contributions toward the common good. As Wikipedia crosses this milestone, it's worth acknowledging that - at its best - its community has long been adulting, the contributors modeling a selflessness that's increasingly rare online. Source
  8. For years, Wikipedia has been working on making its eponymous encyclopedia support more languages. But the nonprofit’s effort has been slow, in part because of the translation tool it uses. Editors on the website have long expressed a desire to use Google Translate, as it could make translations faster. The Wikimedia Foundation, the parent organization of Wikipedia, announced today that it has partnered with Google to make that happen. Wikimedia says that it will integrate Google Translate, arguably the best translation service on the planet, into its four-year-old content translation tool. The integration also means that Wikipedia’s translation tool would now support an additional 15 languages (pushing the overall count to 121), Wikimedia said in a blog post. The content translation tool used by Wikipedia produces an initial machine translation of an article, which is then reviewed and improved by human editors. To date, this tool has been used to translate nearly 400,000 articles, up from 30,000 articles in late 2015. Wikipedia, which was visited over 190 billion times last year, offers its extensive database of articles in about 300 languages. (People speak more than 7,000 languages around the world, if you were wondering.) Google has made its own efforts in recent years to help content producers expand their work in local languages, as much of the content on the web is available only in English. (A similar disparity also exists on Wikipedia.) This often emerges as a roadblock for the unconnected population of the world, much of which is not comfortable with the English language. Today’s announcement appears to be an extension of a partnership that Google inked with the Wikimedia Foundation last month. The company, which, like several other Silicon Valley companies, is a benefactor to Wikimedia, announced in December that it would help Wikipedia make its English-only content more accessible for Indonesians.
As part of that partnership, Google said it would translate relevant English articles into Bahasa Indonesia and surface them in Google search. “In the Wikimedia vision lies a core promise to everyone who uses our sites — all the world’s knowledge, for free, and in your own language,” Wikimedia said in the announcement today. As part of it, Wikimedia said that the translations done using Google Translate will be published under a free license. It assured readers that it won’t be storing any personal data or sharing it with Google. Source
  9. Last month, members of European Parliament voted to move forward with a sweeping overhaul of the European Union’s copyright laws that critics say will impede the spread of news, kill memes, bolster tech giants, and stifle innovation. Ahead of the final vote this week, Wikipedia Italy has joined protests across the continent by blocking users from viewing its pages. The EU Copyright Directive is the first update to European copyright laws since 2001—making it the first time lawmakers have grappled with copyright since the internet became entwined with every facet of our lives. Some of the most prominent technology experts from around the globe have taken issue with several key items in the legislation. In June, Jimmy Wales, a co-founder of Wikipedia, joined more than 100 tech pioneers to urge European Parliament to reconsider voting in favor of the new regulations. Other signatories on the letter included Vint Cerf, one of the “fathers of the internet,” and Tim Berners-Lee, the inventor of the World Wide Web. Among the issues raised by the bill is a vague requirement in Article 13 that requires popular websites—estimated to encompass the top 20 percent of sites—to utilize a content filtering system that prevents copyrighted works from ever being posted to the platform. The other key issue is Article 11, also known as the “link tax.” In an effort to push readers back to the homepages of news organizations, lawmakers want to charge websites fees for linking to news and using snippets of text from articles. Both articles have broad implications for upending the way the internet functions as we know it today, but activists have warned from the beginning that online encyclopedias that rely on fair use practices would have their very existence threatened. On Tuesday, Wikipedia Italy set all of its pages to redirect to a statement raising awareness for the upcoming vote that (barring some legislative wrangling) would make the copyright directive law.
The statement reads, in part (emphasis theirs): As we’ve previously explained in detail, one of the biggest problems with the legislation is that it’s sloppily written and doesn’t offer enough specifics on how its requirements would be implemented. For instance, its stipulation that websites take appropriate measures to prevent copyrighted material from appearing on their service suggests “effective content recognition technologies” be put in place to monitor uploads. It doesn’t explain what that means, forcing experts to assume that every platform will require a YouTube-style algorithmic filter that flags user-uploaded content before it appears on a website. In response to fears that the bill would make it impossible for online encyclopedias to operate, legislators attempted to carve out an exception specifically for those platforms. Poor legal wording caused activists like the Electronic Frontier Foundation to outline the reasons why encyclopedias would still be at risk. Other protests are expected over the next two days ahead of the July 5 vote, and citizens of the EU are urged to contact their representatives to voice their disapproval. Source
  10. On Tuesday, YouTube CEO Susan Wojcicki announced that the company had a new strategy to deal with conspiracy theories on the platform: dropping a handy Wikipedia link beneath videos on highly contested topics. And it looks like Wikipedia learned about this curious strategy at the same time as everyone else. In a Twitter thread asking the public to support Wikipedia as much as it relies on it, Wikimedia executive director Katherine Maher first suggested that the organization was unaware of YouTube’s plans. When asked whether this new module would only apply to English Wikipedia pages, Maher responded, “I couldn’t say; this was something they did independent of us.” In a statement to Gizmodo, the Wikimedia Foundation confirmed that the organization first learned of the new YouTube feature on Tuesday. “We are always happy to see people, companies, and organizations recognize Wikipedia’s value as a repository of free knowledge,” a Wikimedia Foundation spokesperson said in a statement. “In this case, neither Wikipedia nor the Wikimedia Foundation are part of a formal partnership with YouTube. We were not given advance notice of this announcement.” It’s unclear why YouTube didn’t feel the need to ask or inform Wikimedia about its plans ahead of this week’s announcement. That’s a pretty crucial piece of information not to share. And given that YouTube has failed at efficiently moderating conspiracy theories on its platform, it might have been smart to consult with Wikimedia about how to best use its resources to fight misinformation. Of course, maybe YouTube would’ve learned that showing users a Wikipedia link isn’t the best way to fight hoaxes. We have reached out to YouTube for comment and will update this story if and when they respond. Source