All Roads Lead to Wikipedia

Posted: September 20, 2010 at 10:57 pm  |  By: ivyroberts

Have you seen that episode of 30 Rock where Jenna bases her big-screen portrayal of Janis Joplin on what she reads on Wikipedia? She’s power walking around the office, intent on lunching on felines, when Frank confesses: he edited the Joplin page as a prank. Is it funny because Jenna is so gullible? Or is it more accurate to say that Frank’s subtle blend of fact and fiction is a given for modern Internet users?

Trusting in the fundamental nature of information is a personal trait that signifies a belief in the validity of a factual, knowable universe. However, the form that the Internet takes today requires a mindset more along the lines of doublethink. In Orwell’s dystopian universe, 2+2=5 but sometimes it also equals 4. Doublethink is in the back of our minds, not yet conscious but functioning on a daily basis in a way that distorts our perception not only of information but also of reality itself. Any institution that requires doublethink should set off red flags.

I like imagining what the world will be like in a few hundred years. Will it become Idiocracy or 1984? If the powers that be have a sense of humor, I hope it will be the former. And if Wikipedia is going to play a role in the development of a whole generation of thinkers, it’s my inclination to predict that popular culture is getting dumber. Cultural critics already claim that our modern world bears a resemblance to Orwell’s Oceania, and his book is a great resource for thinking points on fabricating knowledge. Winston’s job is the job of all Wikipedians: taking a piece of memory, a newspaper article for example, and embedding your rewrite into the database.

You want to believe the Wikipedia. You want to wrap yourself up in the folds of its all-encompassing knowledge. You want to trust that it is all true. At the same time you have this nagging urge to edit it. You also want the knowledge to conform to your own understanding. How can Wikipedia be everything to everyone?

Wikipedia, a repository of all information in the universe, sounds strangely similar to the fictitious Hitchhiker's Guide to the Galaxy. The online, collectively edited encyclopedia plays a part in constructing users' perception of meaning and truth, but it also constructs a sense of what is real: wikiality. It’s based on a consensus model; if enough people agree that something is true, then it becomes real. I love Jaron Lanier’s anecdote about his own Wikipedia article; he just can’t convince people that he is not a filmmaker (Digital Maoism). I can’t help but bring up the embarrassing case of John Seigenthaler, a journalist who, despite what it said on his Wikipedia page, had not been a suspect in the Kennedy assassination. The motives behind a defamation of this sort can easily be chalked up to the Michael Bay mentality of “it’s just fun.” The Seigenthaler incident has been referred to as “sneaky vandalism,” as opposed to the otherwise outright harassment that takes place on the site. How many people have been killed by Wikipedia? Sneaky vandalism is the new form of doublethink. Verisimilitude wins. Just as in the 30 Rock anecdote, if it fits, you will believe it, no matter whether the information is true and verifiable. It looks like it could be true, and if it’s findable it becomes integrated into the worldwide knowledge base.

Denial forces fiction to replace fact. I’d like to be one of those morons who can cite the research studies showing that vandalism and errors are fixed within minutes--most notably the 2005 articles in Nature and The Guardian. The problem isn't so much about accuracy as about veracity. In this respect, the major qualms about Wikipedia revolve around the nature of its collective information and how users internalize that knowledge. The question for me isn't so much how accurate the information is but how Wikipedians construct collective wisdom by revisioning truth.

Wikipedia is a digital palimpsest; it's sneaky compared to its historical cousin. The medieval palimpsest shows the marks of history on its face--written, erased, and rewritten. The digital one conceals those marks. While any Wikipedian can revert an article to a previous state, the traces of that act are not immediately visible. You have to be an active analyst to see what's going on behind the scenes, and I don't believe that casual browsers look that far into the information they're soaking up. Wikiality finds its best representation in these casual searches, and Frank's Alf-Joplin mashup is both the best example and the worst consequence. What comes to light on the face of the digital palimpsest is a truth by consensus, a truth whose objective meaning is masked--perhaps even irrelevant.

The very nature of truth takes a spin. The gullible and the iconoclasts stand on either side of the war front; it's a battle between popular consensus and expert opinion. The effects of this war reach beyond the limits of the Web. The state of Texas, proud to be the second-largest textbook market in the US, is working on rewrites to history, and this change will impact the way social studies is taught in American high schools. Comparing the school textbook to Wikipedia in this case ranks the experts alongside the Wikipedians, at least as far as objective truth is concerned. Doesn’t this remind you of Big Brother?

Opponents of Wikipedia, such as Robert McHenry, the former editor of the Encyclopedia Britannica, would like information to be controlled by a panel of experts who peer review it to verify its credibility and accuracy. The other camp wants information to be free, readily accessible, and collaborative. This argument goes back to the foundation of language. People create consensus in language; it’s a basic cultural truth. The dictionary is the central tome in which language is compiled, and consensus is what makes words real. What happens to culture, in this case, when the deletion of a word like “dime store” makes room for “chillax”? Language evolves along with culture. No doubt, mashups like wikiality and absotively will one day find their way into the dictionary. Does this phenomenon imply that there is also a natural life cycle to knowledge?

Misinformation kills. Knowledge needs to be protected like an endangered species. Contrary to expert opinion, it does not find its best habitat in universities or in Wikipedia but in the threads of history. What we need are conscientious citizens to advocate for truth. If there is such a thing as objective truth, you won’t be able to find it in textbooks or on the Internet.

All roads lead to Wikipedia. So how do you conduct Internet research about Wikipedia when search engines repeatedly return you to that same encyclopedia’s pages? Funnily enough, it’s remarkably difficult to find information about Wikipedia on the Web. A search engine performs poorly in this case because Google ranks Wikipedia's own pages highly. What could be more meta than Wikipedia's page on itself? A search for “Wikipedia surveys” pulls up the Wikipedia page describing a survey. I find it ironic that when searching for information about the reliability of Wikipedia, I end up using this page as the springboard.

Nicholas Carr's assessment of Web 2.0 and his "Is Google Making Us Stupid?" provide the foundation of a rich, insightful argument against passive browsing. Now, with Google's autocomplete algorithm, the search engine finishes your query for you. One of the last responsibilities of the user has evaporated into the Cloud. When 92% of New Yorkers can’t name a browser, what percentage will cite Wikipedia with conviction? I’m not saying we all have to be experts. We just have to be intelligent and informed. Does there have to be a division between conscientious and educated? I know the statistics: 85% of Americans survive with only a high school diploma. Judging by the benchmarks for public education in this country, high school graduates are ill-equipped to deal with the information explosion, much less to engage in critical analysis of the information available on the Web.

I like to think that an objective reality exists, but it’s hard to believe that these days. Popular cultural consensus is much stronger than objective truth, and you have to convince people of the latter. Some information is just true, whether it’s fact or fiction. Popular culture is a belief system, and to an extent a secular religion, where wars are fought and truth concocted on the battlefields of Wikipedia.

Ivy Roberts

CFP: Dynamics of Knowledge Creation in Wikis

Posted: April 15, 2010 at 12:55 pm  |  By: julianabrunello

Call for papers (open until May 15, 2010): a session at The 2nd International Power & Knowledge Conference, Tampere, September 6-8, 2010
http://tinyurl.com/yfvgyh6

The collective knowledge creation on various wiki sites, including the massively popular Wikipedia, is having a profound effect on the social and epistemological conditions of public information. Distributed collaboration, possible anonymity, radical equality, and the global reach of wikified information lead to a situation that at the same time democratizes knowledge production by levelling hierarchies of expertise and increases the postmodern condition of reflective uncertainty. Everybody knows that the Wikipedia cannot be trusted in the same way as, say, the Encyclopedia Britannica, yet over 100 million people utilize the Wikipedia daily. The ‘edit’ and ‘history’ buttons ever present on wiki pages are already starting to exert pressure on information presented elsewhere. For instance, the negotiations on what information to include and how the information should be presented in various Wikipedia entries constitute a huge experiment in the use of public reason à la Kant. Consequently, the dynamics of collective collaboration also raise questions about the nature of rationality and the plurality of knowledge. Wikis provide ready-made windows into the dialectical interplay between knowledge creation and issues of identity, social inclusion, authority, and the interface between information and politics.
The session invites contributions discussing these themes through theoretical reflection and/or empirical case studies. Abstracts should be between 150 and 200 words in length.
Abstract submission: http://tinyurl.com/yk98dgk
Organizers & more information: Tere Vadén, tere.vaden ä uta.fi; Teemu Mikkonen, teemu.mikkonen ä uta.fi; Juha Suoranta, juha.suoranta ä uta.fi

Liang: De-Classify and Un-Authorize

Posted: March 29, 2010 at 4:53 pm  |  By: Bas Wijers

Lawrence Liang is an Indian legal researcher and lawyer residing in Bangalore. With his speech he tries to place the current debates about Wikipedia in a historical context, putting a variety of controversies in perspective. In recent years there have been different responses to the rise of Wikipedia:
  • Refutation: It is not as reliable as Britannica.
  • Intentionality: What's the motor behind all this?
  • Pragmatic steps: How can it be improved?
Overall a rather somber tone dominated, Liang argues. What remained stable and unchallenged was the authority of knowledge. Liang therefore tries to point us toward a manner of thinking that is a little more realistic. He continues with an early history of the book itself, the book as an object of knowledge. His goal is to bring the notion of the authority of knowledge back, in a sense, to the contemporary realm. It's an important debate, not confined to Wikipedia, he says.

The book has not always been seen as reliable; there were various problems inherent in copying. During the print revolution, the total volume of books increased tremendously, and the reliability of books was constantly challenged. He shows some quotes from people contesting the supposed inherent truth of books. This happened not only in the realm of religion; science, too, constantly struggled with how to classify things, of which Borges has given a famous example.

The physical copy simultaneously introduced the right to grant permission to copy it; it was not just a matter of access. The popular account of pre-print cultures is that of slavish copying, but, as many might not realize, it is also that of annotators, compilers, and correctors. Medieval book owners and scribes actively shaped the texts they read. Authority was never a given, and readers didn't have the tools to check it either. It was a question of trust; tracking the original source was very difficult. The emergence of authority came later. We now take for granted things like the publisher's name, the cover design, and so on; they are all meant to make the total package appear reliable.

His end argument, then, is to try to de-classify and un-authorize encyclopedias, and Wikipedia in particular. Encyclopedias are an attempt to authoritatively classify the world, an act to create a certainty that doesn't exist. We should move toward a certain idea of the uncertainty of knowledge, not as reform or to make Wikipedia better, but as a precondition for thinking about the production of knowledge at all. For more information about him:

Van der Velden: When Knowledges Meet: Database Design and the Performance of Knowledge

Posted: March 28, 2010 at 2:29 pm  |  By: Erinc Salor

I would like to talk about my research which, I believe, could help us approach some of the questions related to Wikipedia. For this research, I was interested to know how knowledge is translated and how it travels from people to things. I focused on community healers in rural India and observed the process by which their knowledge is recorded, collected, and disseminated across larger geographical areas. During its many transformations, the wisdom of the local healer goes through numerous phases and inevitably loses some of its essential characteristics. This observation has led me to the question:

Can we define databases that make these different ways of knowing things visible?

I believe this question relates to the equal treatment of different knowledges in technology design, ultimately aiming for a technology that contributes to knowledge diversity. So, with this talk, I hope to bring the issue of database design to the forefront of the discussion about Wikipedia.
To illustrate her point about different database design approaches to varying knowledges, Maja van der Velden then presents the following examples:

  • Mukurtu archive (features collective tagging but also implements exclusionary login procedures)

  • Tami (an ontologically flat database: it minimizes Western assumptions about data collection by categorizing only by media type. There is no tagging, only a list of items grouped as picture/sound/video etc., and each object can be connected to any other)

  • Storyweaver (the focus, again, is on creating connectivities. Storytelling is a form of performative mapping. Three underlying protocols: autonomous local knowledge mapping, local ontology mapping, and merging mappings by making connections; connections are made from context to context, not from an object to an object in another context)
These databases share the underlying idea that knowledge can be understood as performance, implying that the design of a database is not preconfigured but emerges through usage over time.
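To make that contrast with a tagged, pre-categorized database a little more concrete, here is a minimal sketch, in Python, of what an ontologically flat, connection-based store along these lines could look like. It is an illustration only, not the actual Mukurtu, Tami, or Storyweaver implementation; the class names, fields, and media types are hypothetical. Items are classified only by media type, carry no tags or ontology, and all structure emerges from the connections made between them in use.

```python
# Hypothetical sketch of an "ontologically flat" archive: no tags, no hierarchy,
# only media-typed items and the connections users make between them over time.
from dataclasses import dataclass, field
from typing import Dict, List, Set, Tuple

# illustrative media types only ("picture/sound/video etc." in the talk)
MEDIA_TYPES = {"picture", "sound", "video", "text"}

@dataclass
class Item:
    item_id: str
    media_type: str   # the only built-in classification
    payload: str      # e.g. a file path or transcript; no further ontology

@dataclass
class FlatArchive:
    items: Dict[str, Item] = field(default_factory=dict)
    # each connection records the context in which it was made, so meaning
    # is performed through use rather than fixed in advance by a schema
    connections: List[Tuple[str, str, str]] = field(default_factory=list)

    def add(self, item: Item) -> None:
        if item.media_type not in MEDIA_TYPES:
            raise ValueError(f"unknown media type: {item.media_type}")
        self.items[item.item_id] = item

    def connect(self, a: str, b: str, context: str) -> None:
        # any item can be connected to any other item
        self.connections.append((a, b, context))

    def neighbours(self, item_id: str) -> Set[str]:
        # structure is read off the accumulated connections, not a fixed hierarchy
        linked = set()
        for a, b, _ in self.connections:
            if a == item_id:
                linked.add(b)
            elif b == item_id:
                linked.add(a)
        return linked

# usage: a healer's recorded story linked to a photograph of the plant it describes
archive = FlatArchive()
archive.add(Item("story-1", "sound", "healer_interview.ogg"))
archive.add(Item("plant-1", "picture", "medicinal_plant.jpg"))
archive.connect("story-1", "plant-1", context="told during a field visit")
print(archive.neighbours("story-1"))  # {'plant-1'}
```

The point of the sketch is that the database imposes almost nothing in advance: whatever categories emerge do so through the connections and their contexts, which is one way of reading "knowledge as performance."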

Seen in the light of these examples, Jimmy Wales’s quote about the ultimate goal of Wikipedia ("Imagine a world in which every single person on the planet is given free access to the sum of all human knowledge.") emerges as a warning. On this understanding, Wikipedia’s conception of knowledge becomes hegemonic. Such a definition of knowledge is very singular; other ‘knowledges’ are ignored. Aboriginal knowledge links back to traditional knowledge in the English-language Wikipedia, where it is not recognized as proper knowledge. In this sense, Wikipedia becomes a master narrative about knowledge.

I would propose the creation of a third space, a contact zone where different knowledges do not clash but interact and co-exist, what Donna Haraway calls “world-making entanglements,” which is the meaning I want to carry over to Wikipedia. If Wikipedia aims to provide the sum of all knowledge to everyone, how can it accommodate dissenting knowledge claims?

I argue that Wikipedia’s decentralized production system can be taken further, as a model for its ontological stance as well, rendering a database that is more respectful of differing knowledge claims.

More information about her: