The CPOV event in Leipzig, held on 24–26 September, received quite some attention.
Below you will find an overview of all the news items. For the latest documentation, also visit the CPOV.de website.
ARD tagesschau (24.09.2010)
MDR Sachsen Spiegel (24.09.2010)
ARD nachtmagazin (25.09.2010)
Radio Blau (23.09.2010)
Deutschlandradio Kultur (24.09.2010)
MDR Figaro (24.09.2010)
WDR 5 Scala: Das Wikipedia-Phänomen, kritisch gesehen (24.09.2010)
MDR Info: Wer kontrolliert Wikipedia? (24.09.2010)
DRadio Kultur Breitband: Strukturwandel in der kollektiven Wissensproduktion? (25.09.2010)
detektor.fm: Kritische Wikipedia-Konferenz (25.09.2010)
DRadio Wissen: Wikipedia trifft Wissenschaft (27.09.2010)
WDR 5 Leonardo: Wikipedia auf dem Prüfstand (27.09.2010)
DRadio Wissen Redaktionskonferenz: Wikipedia - Die Konferenz zur Enzyklopädie (27.09.2010)
kreuzer: Ein kritischer point of view
Leipziger Volkszeitung: Wikipedia wenig weiblich (25./26.09.2010)
Telepolis: Wikipedia: Ein kritischer Standpunkt (23.09.2010)
ZEIT ONLINE: Wikipedia-Community trifft sich in Leipzig (24.09.2010)
taz.de: Die Macht der Admins (24.09.2010)
tagesschau.de: "Wikipedia hängt alle ab" (24.09.2010)
CAMPUS ONLINE: „Gedruckten Brockhaus wird es in Zukunft wohl nicht mehr geben“
heise online: Kritischer Standpunkt: Wie offen ist Wikipedia? (25.09.2010)
heise online: Kritischer Standpunkt: Wikipedia und das vorläufige Wissen (26.09.2010)
heise online: Kritischer Standpunkt: Wohin mit dem Wissen? (27.09.2010)
FAZ.NET: Einst basisdemokratisch, jetzt ein exklusiver Club (28.09.2010)
It’s official: Wikipedia has jumped on the social media bandwagon. The online encyclopedia recently announced the introduction of an article feedback tool, currently being tested on 400 articles pertaining to the WikiProject on United States Public Policy (Melanson, 23 September 2010). What are the potential consequences of the tool’s mass implementation? I will try to answer this question by briefly addressing emerging issues such as the demise of expertise and user abuse. Before doing so, I will place this Wikipedia-related development in the broader context of the socialization of knowledge on the Web.
This process of socialization culminated on the 21st of April 2010, when Facebook announced – through CEO Mark Zuckerberg – its intention to become the ‘fabric of the web’ (Siegler, 21 April 2010) by launching the Open Graph protocol together with a set of social plugins. The newly launched application gave users the possibility to integrate the protocol into their pages and turn them into “the equivalent of a Facebook page” (Facebook, 2010). More importantly, users could also ‘like’ or ‘recommend’ news stories on CNN. A rating system was thus added to a type of content that is often considered to exhibit a relatively high degree of objectivity and a strong orientation towards neutral, factual information. It is hard to believe that such standards can be maintained when some stories become more popular than others on social media platforms.
What does such a system mean for Wikipedia? Are there actually any standards to uphold? While the collaborative platform has often been condemned – especially in academic circles – for its lack of accuracy and substance, renowned intellectual Slavoj Zizek did not shy away from citing Wiki entries in his latest work, Living in the End Times. The confidence granted by the philosopher indicates a “significant cultural shift” (Lovink, 15 July 2010) towards seeing Wikipedia as a trustworthy source. While the number of active editors has been steadily dwindling, the dedication of a core group – in 2006, 73.4% of all edits had been made by 2% of the users (Wales, in Swartz, 4 September 2006) – shows that knowledge for the sake of knowledge is possible and that expertise is present on Wikipedia. A rating system might make it all crumble and send the platform down the disappearing path of the Directory, once a staple of the Google home page and now a ghost under the “more” tab.
Consider an ever-spreading practice: performing a search with the query ‘[topic or person of choice] + wiki’. Just finding the required entry might lead many users to give high ratings to the page; their lack of expertise in the field would entail gratitude for the mere existence of the needed information. Are the ‘well-sourced’, ‘complete’, ‘neutral’ and ‘readable’ buttons (please see image below) then the new ‘like’ buttons of collective knowledge? The lack of criticality exhibited by Web 2.0 adepts never ceases to amaze: a quick glance at a Facebook news feed shows that content (such as links to articles) is liked within seconds of being posted, suggesting that the appreciative individual is judging it based on topic rather than on the ideas it comprises. This leads to the question of user abuse: will individuals arm themselves with the feedback tool to sabotage content?
In 2007, California Institute of Technology student Virgil Griffith created the WikiScanner site, capable of detecting IP addresses from which articles had been edited (Blakely, 16 August 2010). Wal-Mart and Exxon Mobil were among the business giants publicly shamed for tinkering with their own entries. With the advent of the feedback application, gaining prestige within the confines of Wikipedia – and, subsequently, in search results, which are largely based on the number of hits a page receives – becomes easier than ever. Now aware that IP addresses can be easily tracked, pointing the finger at the company headquarters, who can stop corporations from advising their employees to highly rate Wiki entries from the coziness of their own homes or from their smartphones? Abuse thus becomes a matter of concern that should be taken into account by the feedback tool’s developers. It is not only through the artificial boosting of stars received that abuse may occur; as Wikipedia user Sage Ross astutely remarks, “non-experts may submit low-quality ratings” (Ross in Melanson, 23 September 2010). Abuse can therefore happen unwittingly, out of ignorance or superficiality; a highly specialized article might get low marks from a student with a poor understanding of the subject at hand, while mediocre content might be lifted in a similar manner.
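WikiScanner's core mechanism can be sketched roughly as follows: anonymous Wikipedia edits are attributed to an IP address, and that address can be matched against publicly registered IP ranges to name the organization behind an edit. The sketch below illustrates the idea only; the ranges and organization names are made up, whereas WikiScanner cross-referenced real WHOIS/IP-geolocation data.

```python
import ipaddress

# Hypothetical registry mapping organizations to the IP ranges (CIDR
# blocks) registered to them. WikiScanner built this from real records.
ORG_RANGES = {
    "ExampleCorp": ["203.0.113.0/24"],
    "AcmeWidgets": ["198.51.100.0/25", "192.0.2.0/28"],
}

def organization_for_ip(ip):
    """Return the organization whose registered range contains `ip`, or None."""
    address = ipaddress.ip_address(ip)
    for org, ranges in ORG_RANGES.items():
        if any(address in ipaddress.ip_network(cidr) for cidr in ranges):
            return org
    return None

# Each anonymous edit in a page history can be checked this way.
print(organization_for_ip("198.51.100.42"))  # AcmeWidgets
print(organization_for_ip("8.8.8.8"))        # None
```

This also shows why the tactic described above escapes detection: an employee rating or editing from a home connection or a phone resolves to a consumer ISP's range, not to the company's registered block.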
Instead of concluding, I would like to further the debate by focusing on the very semantics of the article feedback tool and what each rating scheme refers to. Does ‘well-sourced’ mean citing scholarly pieces from top-tier academic journals, articles from major broadsheets, well-known blogs of critical citizens, or any information relevant to the particular topic? Does it simply have to do with the plurality of sources and perspectives listed in the entry? Can Wiki articles – especially those on “boundless controversies” such as global warming (Venturini, 2010) – ever be ‘complete’? Is ‘neutrality’ even an option in the competitive environment of companies looking for search engine visibility? As for an entry being ‘readable’, the same problem concerning deeply specialized content emerges, with bad players likely to blame the game.
Blakely, R. (16 August, 2010). Wal-Mart, CIA, ExxonMobil Changed Wikipedia Entries. Fox News. Retrieved 19 September, 2010 from http://www.foxnews.com/story/0,2933,293389,00.html
Facebook (2010). Open Graph protocol. Retrieved 20 September, 2010 from http://developers.facebook.com/docs/opengraph
Lovink, G. (15 July, 2010). Dare to Quote! On Zizek and Wikipedia. Institute of Network Cultures. Retrieved September 20, 2010 from http://networkcultures.org/wpmu/cpov/lang/de/2010/07/15/dare-to-quote-on-zizek-and-wikipedia/
Melanson, M. (23 September, 2010). Wikipedia Introduces Article Feedback Tool. ReadWriteWeb. Retrieved 23 September, 2010 from http://www.readwriteweb.com/archives/wikipedia_introduces_article_feedback_tool.php
Siegler, MG (21 April, 2010). I Think Facebook Just Seized Control Of The Internet. TechCrunch. Retrieved 19 September, 2010 from http://techcrunch.com/2010/04/21/facebook/
Swartz, A. (4 September 2006). Who Writes Wikipedia. Retrieved 20 September, 2010 from http://www.aaronsw.com/weblog/whowriteswikipedia
Venturini, T. (2010). Diving in Magma: How to Explore Controversies with Actor-Network Theory. Public Understanding of Science, 19(3), 258-273.
Posted: September 23, 2010 at 8:38 pm | By: alexandre | Tags: wikipedia
When we talk about the construction of a Wikipedia page, we are talking about an act of collective creation. But it does not work as a simple mathematical sum of the contributors' knowledge. More than that, it is about the development of the whole structure of this free encyclopedia, including issues such as rules of conduct, arguments and discussion.
Understanding how this knowledge came to be, and how it is understood, therefore requires looking at the editing process and the values behind it.
To illustrate this point in a physical way, the writer and editor James Bridle published a set of books this month containing every edit made to a single Wikipedia article. The chosen entry was “The Iraq War”.
The result is twelve volumes, like an old-style encyclopedia, with almost 7,000 pages showing roughly 12,000 changes made to the article from its inception in December 2004 until November 2009.
The collection exposes the differences of political standpoint and other relevant issues behind the article. It also shows frequent moments of vandalism, when someone, for example, erases everything and replaces it with a single sentence.
With this work, Bridle argues for the importance of Wikipedia's page histories as a relevant historiographical archive: content, he says, that we can go through to understand history, and that shows “there is no one true answer, there is no one true story”.
He believes that focusing on this is a way to challenge absolutist narratives and to allow everyone to see history being gradually constructed. He also stresses that this kind of process has always existed, but that it is now possible to store it and to comprehend how the facts under discussion become solidified.
“This is what culture actually looks like: a process of argument, of dissenting and accreting opinion, of gradual and not always correct codification. And for the first time in history, we're building a system that, perhaps only for a brief time but certainly for the moment, is capable of recording every single one of those infinitely valuable pieces of information”.
Bridle’s initiative shows that the collective process of creation presupposes space for the clash of ideas, for the suggestion of new links and for the joint review of texts.
The collective improvement of a controversial article, like the “Iraq War” entry, is the result of a process of discussion and negotiation that generates a collective vision of the subject matter.
Working on the same content, the group needs to deal with differences, check sources and refine concepts. Each discussion and argument involved in the editing of an article leads to the evolution of the texts and of Wikipedia's behavior, keeping it “alive” and linked to the day's news.
Through collective creation, the main concern shifts from the protection of the author's name to the care of the information, to the result of a text made by the community.
It represents the experience of the community through the wiki. The progressive improvement of the content is a result of the cooperation between users and the movement of data.
The visualization of the process behind the “Iraq War” entry suggests that Wikipedia may be understood as today's biggest discussion table on the production and dissemination of knowledge and information.
Have you seen that episode of 30 Rock where Jenna bases her big screen portrayal of Janis Joplin on what she reads on Wikipedia? She’s power walking around the office, intent to lunch on felines, when Frank confesses. He edited the Joplin page as a prank. Is it funny because Jenna is so gullible? Or is it more accurate to say that Frank’s subtle blend of fact and fiction is a given for modern Internet users?
Trusting in the fundamental nature of information is a personal trait that signifies a belief in the validity of a factual, knowable universe. However, the form that the Internet takes today requires a mindset more along the lines of doublethink. In Orwell’s dystopian universe, 2+2=5 but sometimes it also equals 4. Doublethink is in the back of our minds, not yet conscious but functioning on a daily basis in a way that distorts our perception not only of information but also of reality itself. Any institution that requires doublethink should set off red flags.
I like imagining what the world will be like in a few hundred years. Will it become Idiocracy or 1984? If the powers that be have a sense of humor, I hope it will be the former. And if Wikipedia is going to play a role in the development of a whole generation of thinkers, it’s my inclination to predict that popular culture is getting dumber. Culture critics already claim that our modern world bears resemblance to Orwell’s Oceania, and his book is a great resource for thinking points on fabricating knowledge. Winston’s job is the job of all Wikipedians: taking a piece of memory, a newspaper article, for example, and embedding your rewrite into the database.
You want to believe the Wikipedia. You want to wrap yourself up in the folds of its all-encompassing knowledge. You want to trust that it is all true. At the same time you have this nagging urge to edit it. You also want the knowledge to conform to your own understanding. How can Wikipedia be everything to everyone?
Wikipedia, a repository of all information in the universe, sounds strangely similar to the fictitious Hitchhiker's Guide to the Galaxy. The online, collectively edited encyclopedia plays a part in constructing users' perception of meaning and truth, but it also constructs a sense of what is real: wikiality. It’s based on a consensus model; if enough people agree that something is true, then it becomes real. I love Jaron Lanier’s anecdote about his own Wikipedia article; he just can’t convince people that he is not a filmmaker (Digital Maoism). I can’t help but bring up the embarrassing case of John Seigenthaler, a journalist who, despite what it said on his Wikipedia page, had not been a suspect in the Kennedy assassination. The motives behind a defamation of this sort can easily be chalked up to the Michael Bay mentality: “it’s just fun.” The Seigenthaler incident has been referred to as “sneaky vandalism,” as opposed to the otherwise outright harassment that takes place on the site. How many people have been killed by Wikipedia? Sneaky vandalism is the new form of doublethink. Verisimilitude wins. Just as in the 30 Rock anecdote, if it fits, you will believe it. No matter if the information is true and verifiable. It looks like it could be true, and if it’s findable it becomes integrated into the worldwide knowledge base.
Denial forces fiction to replace fact. I’d like to be one of those morons who can cite the research studies that show that vandalism and errors are fixed within minutes--most notably the 2005 articles in Nature and The Guardian. The problem isn't so much about accuracy as about veracity. In this respect, the major qualms about Wikipedia revolve around the nature of its collective information and how users internalize that knowledge. The question for me isn't so much how accurate the information is but how Wikipedians construct collective wisdom by revisioning truth.
Wikipedia is a digital palimpsest; it's sneaky compared to its historical cousin. The medieval palimpsest shows the marks of history on its face--written, erased, and rewritten. The digital one conceals those marks. While any Wikipedian can revert an article to a previous state, the traces of that act are not immediately visible. You have to be an active analyst to see what's going on behind the scenes, and I don't believe that casual browsers look that far into the information they're soaking up. Wikiality finds its best representation in these casual searches, and Frank's Alf-Joplin mashup is both the best example and the worst consequence. What comes to light on the face of the digital palimpsest is a truth by consensus, a truth whose objective meaning is masked--perhaps even irrelevant.
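The "active analyst" work described above can in fact be done programmatically: MediaWiki keeps every revision, so an identity revert (a revision whose text exactly matches an earlier state) can be detected by hashing revision texts, a common heuristic in Wikipedia research. A minimal sketch, using a made-up toy history rather than real API data:

```python
import hashlib

def find_reverts(revisions):
    """Given revision texts in chronological order, return (i, j) pairs
    where revision i exactly restores the earlier revision j."""
    seen = {}      # digest of a revision text -> index of first appearance
    reverts = []
    for i, text in enumerate(revisions):
        digest = hashlib.sha1(text.encode("utf-8")).hexdigest()
        # Require a gap > 1 so two identical consecutive states don't count.
        if digest in seen and i - seen[digest] > 1:
            reverts.append((i, seen[digest]))
        seen.setdefault(digest, i)
    return reverts

# Toy history: vandalism at index 2 is reverted at index 3,
# restoring the state of index 1.
history = [
    "The Iraq War began in 2003.",
    "The Iraq War began in 2003. Combat operations ended in 2010.",
    "lol",
    "The Iraq War began in 2003. Combat operations ended in 2010.",
]
print(find_reverts(history))  # [(3, 1)]
```

The marks of the palimpsest are thus recoverable, but only to someone who goes looking; the reader of the current page sees none of this, which is exactly the concealment the paragraph above describes.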
The very nature of truth takes a spin. The gullible and the iconoclasts stand on either side of the war front; it's a battle between popular consensus and expert opinion. The effects of this war reach beyond the limits of the Web. The state of Texas, proud to be the second largest textbook market in the US, is working on rewrites to history, and this change will impact the way social studies is taught in American high schools. Comparing the school textbook to Wikipedia in this case ranks the experts alongside the Wikipedians—that is, as far as objective truth is concerned. Doesn’t this remind you of Big Brother?
Opponents of Wikipedia, for example Robert McHenry, the former editor of the Encyclopedia Britannica, would like information to be controlled by a panel of experts who peer review information to verify its credibility and accuracy. The other camp wants information to be free, readily accessible, and collaborative. This argument goes back to the foundation of language. People create consensus in language; it’s a basic cultural truth. The dictionary is the central tome in which language is compiled. Consensus is what makes words real. What happens to culture, in this case, when the deletion of a word like “dime store” makes room for “chillax”? Language evolves along with culture. No doubt, mashups like wikiality and absotively will one day find their way into the dictionary. Does this phenomenon imply that there is also a natural lifecycle to knowledge?
Misinformation kills. Knowledge needs to be protected like an endangered species. Contrary to expert opinion, it does not find its best habitat in universities or in Wikipedia but in the threads of history. What we need are conscientious citizens to advocate for truth. If there is such a thing as objective truth, you won’t be able to find it in textbooks or on the Internet.
All roads lead to Wikipedia. So, how do you conduct Internet research about Wikipedia when search engines repeatedly return you to that same encyclopedia’s pages? Funnily enough, it’s remarkably difficult to find information about Wikipedia on the Web. A search engine performs poorly in this case, because Google ranks Wikipedia's own pages highly. What could be more meta than Wikipedia's page on itself? A search for “Wikipedia surveys” pulls up the Wikipedia page describing a survey. I find it ironic that when searching for information about the reliability of Wikipedia, I end up using this page as the springboard.
Nicholas Carr's assessment of Web 2.0 and his "Is Google Making Us Stupid" provide the foundation of a rich, insightful argument against passive browsing. Now, with Google's autocomplete algorithm, the search engine finishes your query for you. One of the last responsibilities of the user has evaporated into the Cloud. When 92% of New Yorkers can’t name a browser, what percentage will cite Wikipedia with conviction? I’m not saying we all have to be experts. We just have to be intelligent and informed. Does there have to be a division between conscientious and educated? I know the statistics. 85% of Americans survive with only a high school diploma. Judging by the benchmarks for public education in this country, high school graduates are ill equipped to deal with the information explosion, much less to engage in critical analysis of information available on the Web.
I like to think that an objective reality exists, but it’s hard to believe that these days. Popular cultural consensus is much stronger than objective truth, and you have to convince people of the latter. Some information is just true, whether it’s fact or fiction. Popular culture is a belief system, and to an extent a secular religion, where wars are fought and truth concocted on the battlefields of Wikipedia.
http://tinyurl.com/yfvgyh6
The collective knowledge creation on various wiki sites, including the massively popular Wikipedia, is having a profound effect on the social and epistemological conditions of public information. Distributed collaboration, possible anonymity, radical equality and the global reach of wikified information lead to a situation that at the same time democratizes knowledge production by levelling hierarchies of expertise and increases the postmodern condition of reflective uncertainty. Everybody knows that Wikipedia cannot be trusted in the same way as, say, the Encyclopedia Britannica, yet over 100 million people utilize Wikipedia daily. The ‘edit’ and ‘history’ buttons ever present on wiki pages are already starting to exert pressure on information presented elsewhere. For instance, the negotiations on what information to include and how the information should be presented in various Wikipedia entries constitute a huge experiment in the use of public reason à la Kant. Consequently, the dynamics of collective collaboration also bring out questions on the nature of rationality and the plurality of knowledge. Wikis provide ready-made windows into the dialectical interplay between knowledge creation and issues of identity, social inclusion, authority, and the interface between information and politics.
The session invites contributions discussing these themes through theoretical reflection and/or empirical case studies. Abstracts should be between 150 and 200 words in length.
Abstract submission: http://tinyurl.com/yk98dgk
Organizers & more information:
Organizers: Tere Vadén, tere.vaden ä uta.fi , Teemu Mikkonen, teemu.mikkonen ä uta.fi, Juha Suoranta juha.suoranta ä uta.fi
News Sites Rethink Anonymous Online Comments
Also on anonymity:
Wikipedia and the Matter of Accountability

Interesting for someone analyzing conflict/controversy:
Wikipedia: Israelis, Notorious for Sexual Exploitation of Children, Traffic Haitian Organs

Cpedia: A very different encyclopedic project.
Cpedia NSFW: Jimmy Wales Wants Me Dead (The Neutrality Of This Article Is Disputed) [got to love the title!]

Multimedia Wikipedia: Can Video Be Collaborative?
Jimmy Wales on Wikipedia, Wikia, Traditional Media
Jimmy Wales: Newspapers wasting money on expensive columnists.
- Refutation: It is not as reliable as Britannica.
- Intentionality: What's the motor behind all this?
- Pragmatic steps: How can it be improved?
The book does three things:
- It visually documents the WLA project.
- It reflects on related topics.
- It produces new works, hoping to create a never-ending loop.