Do You ‘Like’ Wikipedia? The Socialization of an Encyclopedia

Posted: September 24, 2010 at 7:08 pm  |  By: Catalina Iorga

It’s official: Wikipedia has jumped on the social media bandwagon. The online encyclopedia recently announced the introduction of an article feedback tool currently being tested on 400 articles pertaining to the WikiProject on United States Public Policy (Melanson, 23 September 2010). What are the potential consequences of the tool’s mass implementation? I will try to answer this question by briefly addressing emerging issues such as the demise of expertise and user abuse. Before doing so, I will place this Wikipedia-related development in the broader context of the socialization of knowledge on the Web.

This process of socialization culminated on the 21st of April 2010, when Facebook announced – through CEO Mark Zuckerberg – its intention to become the 'fabric of the web' (Siegler, 21 April 2010) by launching the Open Graph protocol and its accompanying social plugins. The new protocol gave users the possibility to integrate it into their own pages and turn them into "the equivalent of a Facebook page" (Facebook, 2010). More importantly, users could now 'like' or 'recommend' news stories on CNN. A rating system was thus added to a type of content that is often expected to exhibit a relatively high degree of objectivity and a strong orientation towards neutral, factual information. It is hard to believe that such standards can be maintained when some stories prove more popular than others on social media platforms.

What does such a system mean for Wikipedia? Are there actually any standards to uphold? While the collaborative platform has often been condemned – especially in academic circles – for its lack of accuracy and substance, renowned intellectual Slavoj Zizek did not shy away from citing Wiki entries in his latest work, Living in the End Times. The confidence granted by the philosopher indicates a "significant cultural shift" (Lovink, 15 July 2010) towards seeing Wikipedia as a trustworthy source. While the number of active editors has been steadily dwindling, the dedication of a core group – in 2006, 73.4% of all edits had been made by 2% of the users (Wales, in Swartz, 4 September 2006) – shows that knowledge for the sake of knowledge is possible and that expertise is present on Wikipedia. A rating system might make it all crumble and make the platform follow in the disappearing footsteps of the Directory, once a staple of the Google home page and now a ghost under the "more" tab.

Consider an ever-spreading practice: performing a search with the query '[topic or person of choice] + wiki'. Just finding the required entry might lead many users to give high ratings to the page; their lack of expertise in the field would translate into gratitude for the mere existence of the needed information. Are 'well-sourced', 'complete', 'neutral' and 'readable' (see Figure 1 below) then the new 'like' buttons of collective knowledge? The lack of criticality exhibited by Web 2.0 adepts never ceases to amaze: a quick glance at a Facebook news feed shows that content (such as links to articles) is liked within seconds of being posted, suggesting that the appreciative individual judges it by its topic rather than the ideas it comprises. This leads to the question of user abuse: will individuals arm themselves with the feedback tool to sabotage content?

Figure 1: Wikipedia Article Feedback Tool

In 2007, California Institute of Technology student Virgil Griffith created the WikiScanner site, capable of detecting the IP addresses from which articles had been edited (Blakely, 16 August 2010). Wal-Mart and Exxon Mobil were among the business giants publicly shamed for tinkering with their own entries. With the advent of the feedback application, gaining prestige within the confines of Wikipedia – and, subsequently, in search results, which are largely based on the number of hits a page receives – becomes easier than ever. Now aware that IP addresses can easily be traced, pointing the finger at company headquarters, who can stop corporations from advising their employees to rate Wiki entries highly from the coziness of their own homes or from their smartphones? Abuse thus becomes a matter of concern that should be taken into account by the feedback tool's developers. It is not only through the artificial boosting of stars received that abuse may occur; as Wikipedia user Sage Ross astutely remarks, "non-experts may submit low-quality ratings" (Ross in Melanson, 23 September 2010). Abuse can therefore also happen unwittingly, out of ignorance or superficiality: a highly specialized article might get low marks from a student with a poor understanding of the subject at hand, while mediocre content might be lifted in a similar manner.

Instead of concluding, I would like to further the debate by focusing on the very semantics of the article feedback tool and on what each rating criterion refers to. Does 'well-sourced' mean citing scholarly pieces from top-tier academic journals, articles from major broadsheets, well-known blogs by critical citizens, or any information relevant to the particular topic? Does it simply have to do with the plurality of sources and perspectives listed in the entry? Can Wiki articles – especially those on 'boundless controversies' such as global warming (Venturini, 2010) – ever be 'complete'? Is 'neutrality' even an option in the competitive environment of companies looking for search engine visibility? As for an entry being 'readable', the same problem of deeply specialized content emerges, with bad players likely to blame the game.

References

Blakely, R. (16 August, 2010). Wal-Mart, CIA, ExxonMobil Changed Wikipedia Entries. Fox News. Retrieved 19 September, 2010 from http://www.foxnews.com/story/0,2933,293389,00.html

Facebook (2010). Open Graph protocol. Retrieved 20 September, 2010 from http://developers.facebook.com/docs/opengraph

Lovink, G. (15 July, 2010). Dare to Quote! On Zizek and Wikipedia. Institute of Network Cultures. Retrieved 20 September, 2010 from http://networkcultures.org/wpmu/cpov/lang/de/2010/07/15/dare-to-quote-on-zizek-and-wikipedia/

Melanson, M. (23 September, 2010). Wikipedia Introduces Article Feedback Tool. ReadWriteWeb. Retrieved 23 September, 2010 from http://www.readwriteweb.com/archives/wikipedia_introduces_article_feedback_tool.php

Siegler, M.G. (21 April, 2010). I Think Facebook Just Seized Control Of The Internet. TechCrunch. Retrieved 19 September, 2010 from http://techcrunch.com/2010/04/21/facebook/

Swartz, A. (4 September, 2006). Who Writes Wikipedia? Retrieved 20 September, 2010 from http://www.aaronsw.com/weblog/whowriteswikipedia

Venturini, T. (2010). Diving in Magma: How to Explore Controversies with Actor-Network Theory. Public Understanding of Science, 19(3), 258-273.

The historiography of a Wikipedia entry

Posted: September 23, 2010 at 8:38 pm  |  By: alexandre


When we talk about the construction of a Wikipedia page, we are talking about an act of collective creation. But it does not work as a simple mathematical sum of the contributors' knowledge. More than that, it is about the development of the whole structure of this free encyclopedia, including issues such as rules of conduct, arguments and discussion.

Understanding how this knowledge came to be, and how it came to be understood, therefore involves looking at the editing process and the values behind it.

To illustrate this point in a physical way, the writer and editor James Bridle published this month a set of books containing every edit made to a single Wikipedia article. The entry chosen was "The Iraq War".

The result is twelve volumes, like an old-style encyclopedia, with almost 7,000 pages showing about 12,000 changes made to the article from its creation in December 2004 until November 2009.

The collection exposes the differences in political standpoints and other relevant issues behind the article. It also shows frequent moments of vandalism, when someone, for example, erases everything and replaces it with a single sentence.

With this work, Bridle defends the importance of Wikipedia's page histories as a relevant historiographic archive: content that, he says, we can go through to understand history, and that shows "there is no one true answer, there is no one true story".

He believes that focusing on these histories is a way to challenge absolutist narratives and to allow everyone to see the process of history being gradually constructed. He also stresses that this kind of process has always existed, but that it is now possible to store it and to comprehend how the facts being treated become solidified.

“This is what culture actually looks like: a process of argument, of dissenting and accreting opinion, of gradual and not always correct codification. And for the first time in history, we're building a system that, perhaps only for a brief time but certainly for the moment, is capable of recording every single one of those infinitely valuable pieces of information”.


Bridle's initiative shows that the collective process of creation presupposes space for the clash of ideas, for suggesting new links and for the joint revision of texts.

The collective improvement of a controversial article, like the “Iraq War” entry, is the result of a process of discussion and negotiation that generates a collective vision of the subject matter.

Working on the same content, the group needs to deal with differences, check sources and refine concepts. Each discussion and argument involved in editing an article drives the evolution of both the texts and the behavior of Wikipedia, keeping it "alive" and linked to the news of the day.

Through collective creation, the main concern shifts from protecting the author's name to caring for the information, for the text produced by the community.

It represents the experience of the community through the wiki: the progressive improvement of the content results from cooperation between users and the movement of data.

The visualization of the process behind the "Iraq War" entry suggests that Wikipedia can perhaps be understood as today's biggest discussion table on the production and dissemination of knowledge and information.

Alexandre Guiote

All Roads Lead to Wikipedia

Posted: September 20, 2010 at 10:57 pm  |  By: ivyrobertsis

Have you seen that episode of 30 Rock where Jenna bases her big-screen portrayal of Janis Joplin on what she reads on Wikipedia? She’s power walking around the office, intent on lunching on felines, when Frank confesses: he edited the Joplin page as a prank. Is it funny because Jenna is so gullible? Or is it more accurate to say that Frank’s subtle blend of fact and fiction is a given for modern Internet users?

Trusting in the fundamental nature of information is a personal trait that signifies a belief in the validity of a factual, knowable universe. However, the form that the Internet takes today requires a mindset more along the lines of doublethink. In Orwell’s dystopian universe, 2+2=5 but sometimes it also equals 4. Doublethink is in the back of our minds, not yet conscious but functioning on a daily basis in a way that distorts our perception not only of information but also of reality itself. Any institution that requires doublethink should set off red flags.

I like imagining what the world will be like in a few hundred years. Will it become Idiocracy or 1984? If the powers that be have a sense of humor, I hope it will be the former. And if Wikipedia is going to play a role in the development of a whole generation of thinkers, it’s my inclination to predict that popular culture is getting dumber. Cultural critics already claim that our modern world bears a resemblance to Orwell’s Oceania, and his book is a great resource for thinking points on fabricating knowledge. Winston’s job is the job of all Wikipedians: taking a piece of memory, a newspaper article for example, and embedding your rewrite into the database.

You want to believe the Wikipedia. You want to wrap yourself up in the folds of its all-encompassing knowledge. You want to trust that it is all true. At the same time you have this nagging urge to edit it. You also want the knowledge to conform to your own understanding. How can Wikipedia be everything to everyone?

Wikipedia, a repository of all information in the universe, sounds strangely similar to the fictitious Hitchhiker's Guide to the Galaxy. The online, collectively edited encyclopedia plays a part in constructing users' perception of meaning and truth, but it also constructs a sense of what is real: wikiality. It’s based on a consensus model; if enough people agree that something is true, then it becomes real. I love Jaron Lanier’s anecdote about his own Wikipedia article; he just can’t convince people that he is not a filmmaker (Digital Maoism). I can’t help but bring up the embarrassing case of John Seigenthaler, a journalist who, despite what it said on his Wikipedia page, had not been a suspect in the Kennedy assassination. The motives behind a defamation of this sort can easily be blamed on the Michael Bay mentality of “it’s just fun.” The Seigenthaler incident has been referred to as “sneaky vandalism,” as opposed to the otherwise outright harassment that takes place on the site. How many people have been killed by Wikipedia? Sneaky vandalism is the new form of doublethink. Verisimilitude wins. Just as in the 30 Rock anecdote, if it fits, you will believe it, no matter whether the information is true and verifiable. It looks like it could be true, and if it’s findable it becomes integrated into the worldwide knowledge base.

Denial forces fiction to replace fact. I’d like to be one of those morons who can cite the research studies that show that vandalism and errors are fixed within minutes--most notably the 2005 articles in Nature and The Guardian. The problem isn't so much about accuracy as about veracity. In this respect, the major qualms about Wikipedia revolve around the nature of its collective information and how users internalize that knowledge. The question for me isn't so much how accurate the information is but how Wikipedians construct collective wisdom by revisioning truth.

Wikipedia is a digital palimpsest; it's sneaky compared to its historical cousin. The medieval palimpsest shows the marks of history on its face--written, erased, and rewritten. The digital one conceals those marks. While any Wikipedian can revert an article to a previous state, the traces of that act are not immediately visible. You have to be an active analyst to see what's going on behind the scenes, and I don't believe that casual browsers look that far into the information they're soaking up. Wikiality finds its best representation in these casual searches, and Frank's Alf-Joplin mashup is both the best example and the worst consequence. What comes to light on the face of the digital palimpsest is a truth by consensus, a truth whose objective meaning is masked--perhaps even irrelevant.

The very nature of truth takes a spin. The gullible and the iconoclasts stand on either side of the war front; it's a battle between popular consensus and expert opinion. The effects of this war reach beyond the limits of the Web. The state of Texas, proud to be the second largest textbook market in the US, is working on rewrites to history, and this change will impact the way social studies is taught in American high schools. Comparing the school textbook to Wikipedia in this case ranks the experts alongside the Wikipedians—that is, as far as objective truth is concerned. Doesn’t this remind you of Big Brother?

Opponents of Wikipedia, such as Robert McHenry, the former editor of the Encyclopedia Britannica, would like information to be controlled by a panel of experts who peer review it to verify its credibility and accuracy. The other camp wants information to be free, readily accessible, and collaborative. This argument goes back to the foundation of language. People create consensus in language; it’s a basic cultural truth. The dictionary is the central tome in which language is compiled. Consensus is what makes words real. What happens to culture, in this case, when the deletion of a word like “dime store” makes room for “chillax”? Language evolves along with culture. No doubt, mashups like wikiality and absotively will one day find their way into the dictionary. Does this phenomenon imply that there is also a natural lifecycle to knowledge?

Misinformation kills. Knowledge needs to be protected like an endangered species. Contrary to expert opinion, it does not find its best habitat in universities or in Wikipedia but in the threads of history. What we need are conscientious citizens to advocate for truth. If there is such a thing as objective truth, you won’t be able to find it in textbooks or on the Internet.

All roads lead to Wikipedia. So how do you conduct Internet research about Wikipedia when search engines repeatedly return you to that same encyclopedia’s pages? Funnily enough, it’s remarkably difficult to find information about Wikipedia on the Web. A search engine performs poorly in this case, because Google ranks Wikipedia's own pages highly. What could be more meta than Wikipedia's page on itself? A search for “Wikipedia surveys” pulls up the Wikipedia page describing a survey. I find it ironic that when searching for information about the reliability of Wikipedia, I end up using this page as the springboard.

Nicholas Carr's assessment of Web 2.0 and his "Is Google Making Us Stupid?" provide the foundation of a rich, insightful argument against passive browsing. Now, with Google's autocomplete algorithm, the search engine finishes your query for you. One of the last responsibilities of the user has evaporated into the Cloud. When 92% of New Yorkers can’t name a browser, what percentage will cite Wikipedia with conviction? I’m not saying we all have to be experts. We just have to be intelligent and informed. Does there have to be a division between conscientious and educated? I know the statistics: 85% of Americans survive with only a high school diploma. Judging by the benchmarks for public education in this country, high school graduates are ill-equipped to deal with the information explosion, much less to engage in critical analysis of information available on the Web.

I like to think that an objective reality exists, but it’s hard to believe that these days. Popular cultural consensus is much stronger than objective truth, and you have to convince people of the latter. Some information is just true, whether it’s fact or fiction. Popular culture is a belief system, and to an extent a secular religion, where wars are fought and truth concocted on the battlefields of Wikipedia.

Ivy Roberts

Good Faith Collaboration: The Culture of Wikipedia released

Posted: September 20, 2010 at 3:23 pm  |  By: margreet

The book *Good Faith Collaboration: The Culture of Wikipedia* (2010, The MIT Press) by Joseph Reagle is now available.

A brief description is included below, as well as a link to the blog announcement, where comments can be left. Follow the links to the book's webpage, which includes the foreword, preface, introduction, notes and bibliography: http://reagle.org/joseph/blog/social/wikipedia/annc-good-faith-collaboration

Good Faith Collaboration: The Culture of Wikipedia
Joseph Michael Reagle Jr. Foreword by Lawrence Lessig. The MIT Press, Cambridge, MA.
Wikipedia's style of collaborative production has been lauded, lambasted, and satirized. Despite unease over its implications for the character (and quality) of knowledge, Wikipedia has brought us closer than ever to a realization of the century-old pursuit of a universal encyclopedia. Good Faith Collaboration: The Culture of Wikipedia is a rich ethnographic portrayal of Wikipedia's historical roots, collaborative culture, and much debated legacy.

internship at INC – researcher on Wikipedia

Posted: September 6, 2010 at 8:54 am  |  By: margreet

RESEARCHER (internship)

The Institute of Network Cultures (INC) is a media research centre that actively contributes to the field of network cultures through research, events, publications and online dialogue. The INC was founded in 2004 by media theorist Geert Lovink, following his appointment as professor within the Institute of Interactive Media at the Amsterdam University of Applied Sciences (Hogeschool van Amsterdam). The institute acts as a framework within which several research projects, meetings and publications take place.

The project Critical Point of View (CPOV) is a collaboration between the Centre for Internet and Society (Bangalore, India), the Institute of Network Cultures (Amsterdam, the Netherlands) and various partners in Germany. The first conference took place in Bangalore on the 12th and 13th of January 2010. The second conference took place in Amsterdam on the 26th and 27th of March. A third conference will take place in Leipzig, Germany on September 24-26 and will focus on the German-speaking part of Wikipedia (the second largest after English). After these events the project will focus on the production of an INC reader on the matter.

The CPOV collaborative research network is an ongoing project for which we are now looking for an intern. Wikipedia has emerged as the de facto global reference of dynamic knowledge. Different groups – Wikipedians, users, academics, researchers, gurus of Web 2.0, publishing houses, artists and governments – have entered into fierce debates and discussions about what the rise of Wikipedia and wiki cultures means and how they influence the information societies we live in. Wikipedia itself has been at the centre of much controversy, pivoting around questions of accuracy, anonymity, vandalism, expertise, and authority.

RESEARCHER (internship)

We are looking for an enthusiastic, energetic, inquisitive and precise master's student with knowledge in the field of new media. As the conference has an international atmosphere, a good command of English is required. The Institute of Network Cultures offers the possibility of an internship of 3-6 months (starting September/October 2010 or soon after).

Tasks of the researcher:
• write (book) reviews on the subject
• collect interesting articles on the subject
• write contributions and responses for the webpages/blog
• provide editorial assistance in the production of the INC Wikipedia reader

The traineeship will run from approximately 8 up to a maximum of 40 hours a week.

If you are interested please mail a motivated letter with CV to margreet[at]networkcultures[dot]org. For further information you can contact Margreet Riphagen; margreet[at]networkcultures[dot]org.

Interview with Hendrik-Jan Grievink

Posted: August 20, 2010 at 9:55 am  |  By: julianabrunello

What was the compelling reason for you to get involved in a project concerning Wikipedia?
As a designer, I dedicate myself to inventing new ways of understanding the world through images. I use existing images in almost every project: the Fake for Real memory game I showed during the conference is a good example of this. This is a game that pairs images to make a statement about simulation in our world. Another example would be the Next Nature book (to be published early 2011 by Actar, Barcelona). This book talks about what we call 'culturally emerged nature', or 'the nature caused by people'. Through hundreds of images and observations we analyse the influence of technology and design on our daily lives. These projects can be found at http://www.fakeforreal.com and http://www.nextnature.net respectively.

A lot of the images we use are created by ourselves (co-editor Koert van Mensvoort and me), but even more come from all kinds of sources: some traceable, others not. We strive to credit all authors and would love to pay them a good fee for using their material – if this were possible, which it isn't. Paying for all visual content would quadruple the costs of such a publication, which would make it impossible to get published. As for the credit part: we will always credit artists for creative images, but for small or generic images – even commercial ones – we're not going to do this; it's just way too time-consuming. Also, a lot of the time it's really hard to trace back the origins of an image in today's copy/paste culture.

When I heard of the Wiki Loves Art contest I was immediately sympathetic to the initiative, because I think these kinds of best-practice projects are crucial to changing the way people (in this case: museums and cultural institutions) think about intellectual property. They have to realise that limiting the availability of resources limits cultural production in a very direct way. Next to that, I am interested in everything that signals new forms of cultural production, and the crowdsourced archiving of images certainly does that.

Are you a Wikipedian yourself or a user? Have you contributed to any articles? What about Wikicommons? Any contributions from you that we can find there?
Although I have contributed a few articles, I would not consider myself a Wikipedian, nor do I have any ambitions of becoming one. As for Wikimedia Commons, I must confess I am a bit of a leech: I use it often, yet have only contributed a little – I am sorry to say. But I will improve my life in the future! Actually, I am thinking of uploading a batch of my work and visual elements from my work when I have the time. That could take a while, though, as I am extremely busy, at least until the end of this year.

In 2006, however, I did the graphic design for the My Creativity convention, organised by the Institute of Network Cultures. One of the main images I designed for this event was a copyright symbol that I manipulated into a snake that bites its own tail. A very strong image, I must say in all modesty, even today. I uploaded this image to Wikimedia Commons, but so far it has hardly been used, only in some incidental PowerPoint presentations here and there. It did hook me up with Paul Keller from Creative Commons Netherlands, who proposed turning it into a font so that it could be used on people's PCs. Now Paul and I collaborate on the WLA book. But we still have to do the font. Maybe there are readers of the INC website who would be interested in doing this?

At the conference you gave us some insight into the 'Wiki Loves Art' book. How is the production going? What can we expect to see in the book? Have you already planned a possible launch date?
At the moment we're editing material from Flickr (both texts and images) and correspondence between the organisers and the contributors. These are mainly small observations which will be presented in an A-Z index that runs through the whole book. Think of comments by other Flickr users, statements from the participating museums and short analyses of the visual material, like a comparison of different images of the same object, or a special page dedicated to the person who obsessively photographed all the information labels in the Boijmans museum in Rotterdam. This is the most light-hearted part of the book, more a documentation or celebration of the project. Next to that, we have longer written essays by contributors who reflect on topics relating to the project, like copyright, amateur culture, curatorial issues, crowdsourcing etc.

My little baby in the project, however, is the visual contribution part. We invited artists and designers – young and upcoming as well as more established ones – to make a derivative work, a remix you could say, of the images from the WLA project. For example, one artist makes a series of merchandising products by combining images from the WLA database with online print-on-demand tools. This results in products like an Isaac Israels thong, Mondrian sneakers etc. and conceptualizes a kind of virtual Wiki Loves Art Museum – a museum that exists only through its DIY merchandising.

We do this because we are convinced that good practices of remixing otherwise copyrighted material can help change the way cultural institutions think about intellectual property in a positive way, in the hope that in the near future they will loosen up their IP regimes. For me, this part of the project is very exciting because it relates most directly to my practice as a designer and to my personal interest in this project: the (re)use of cultural resources for new forms of cultural production. In the end, it is all about the question: how can we make as much high-quality visual material as possible accessible to everyone?

The launch date has yet to be confirmed, but the plan is to release the book during the Economy of the Commons event, organised by the INC in November this year.

Reading the blog entry on your presentation, I came across the following sentence: "A real challenge would be to think of Wikimedia Commons as a goal in itself." Would you care to comment on that?
Well, as I stated in my answer to your previous question, my personal agenda is as banal as it is idealistic: I just want as many visual resources online as possible so that I can use them for the projects I do. The idealistic part is that I want this for cultural actors all over the globe, because I believe this contributes to a better world, or at least one where cultural production is more free and less restricted by intellectual property laws.

Since its start, Wikimedia Commons has mainly been set up as a picture archive for Wikipedia. There is nothing wrong with that, because Wikipedia is still very text-based and could use some imagery here and there. My problem with it is that Wikipedia is very much linked to literate culture – a perception of the world through the written word. But the cycles of meaning production in the world are more and more dominated by images (whether you think that is a good or a bad thing). If you want this process to be more democratic, instead of dominated by corporations, then the tools behind visual production should be more democratic and collaborative than they are now.

For example, the visuals from the keynote presentation that Al Gore used to address his global warming statement have made a huge impact on (at least a part of) the world's ideas on this topic. Regardless of your position in this debate, one has to acknowledge the highly manipulative character of these images (as is the very nature of images, but that's another story). So when there is manipulation through images, whether it is for 'the good cause' or not, we are dealing with power relations, and power always corrupts and thus needs counter-power. In this case, we are talking about Visual Power: the power of images to change the way people think. In short, I think Gore's presentation should be available online in an open format, including all the media files that come with it, so that it can be re-used, mis-used and re-interpreted by anyone. In a flash of self-chosen naivety (call it idealism), I would say the same should go for voting forms, press photography, corporate imagery and so on.

I see a huge potential for Wikimedia Commons there. It is shocking how few cultural actors (like my designer friends) are aware of the existence of Wikimedia Commons, let alone use it or contribute to it. This is not just a lack of interest; they simply don't know about it because they have other tools. They google everything, because Google image search seems to have a monopoly as an image archive. But we all know what comes with finding images on Google: they're often of poor quality, badly tagged, from unclear sources and often not copyright-free. Flickr is a bit more reliable in that respect, although it is hard to cut through all those Eiffel Tower pictures there. But for some reason the architecture and design of these two websites is just so much more convenient for cultural actors to get their images from. There is a lot to be learned from these commercial giants. I see a huge potential if Wikimedia Commons were able to abandon its librarian's mentality and rethink itself as the world's largest collaborative media database. But before that happens, we need to realise that understanding the world through text is on the wane and rethink the world's cycles of meaning production from a visual culture perspective.

Do you have any recommendations for Wikimedia Commons?
  1. Acknowledge that we live in a visually oriented culture and act accordingly
  2. Learn from successful tools on the web such as Google's image search and Flickr.com
  3. Try to engage image makers and other people who professionally use images on a daily basis

Interview with Felipe Ortega

Posted: July 21, 2010 at 9:46 am  |  By: julianabrunello

As a computer scientist and engineer you could have chosen among many different objects of study. Why Wikipedia?
For 3 reasons:
  • Firstly, at that time (2005) it was very clear to me that Wikipedia was a new Internet phenomenon, a flagship initiative, one that would play a key role in the Open Movements arena, just like Linux did back in the 90s.

  • Secondly, Wikipedia was creating a vast compilation of activity records from millions of editors. It had the potential to become one of the largest online communities in the world, which makes it unique and, at the same time, a great challenge to analyze.

  • Finally, because of its goal. We all know that compiling all human knowledge is a very ambitious goal. But even in its current state (still a long way to go to complete its mission) Wikipedia has proven to be valuable for hundreds of millions of people. Its content adheres to the definition of free cultural works [ http://freedomdefined.org/Definition ]. It's a perfect example of the advantages of this totally open model for knowledge production. So I thought it deserved my attention to explain why this approach works in practice, defying our classical preconceived models of collaborative production.

What gave you the idea to develop the WikiXRay tool?
From Eric S. Raymond's "The Cathedral and the Bazaar": "Every good work of software starts by scratching a developer's personal itch".

I don't think that my tool (still in "beta" state after 3 years) can really be considered a "good work of software" yet. But it did start from a personal itch: the lack of a reliable tool to parse the available data dumps published by the WMF. I think I tried 3 or 4 tools, and mwdumper was the best one, but I constantly got errors parsing the huge English Wikipedia dump.

Thus, I thought carefully about my options, and it was pretty clear that I should at least attempt to build my own tool if I ever wanted to do serious work with those data.

In addition to this, reproducibility was (and still is) an obsession for me. Too frequently we find yet another quantitative analysis of a certain set of FLOSS projects, online communities, etc., and you cannot reproduce it, validate it or learn from its approach, for the simple reason that the source code is not available anywhere! So the best option for me was to code libre software, under the GPL, and let others freely inspect, adapt, use and distribute it.

What was the most difficult thing in developing this tool?
There were many difficult problems to solve, but I think the most complicated one was building a parser with decent performance. Parsing huge XML files was a little bit tricky, since you cannot store the whole thing in memory (+2 TB is out of the question).
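To make the memory constraint concrete, here is a minimal sketch of the streaming approach such a parser has to take, assuming a standard MediaWiki XML export; it is my own illustration, not WikiXRay's actual code, and the file name and revision count are purely illustrative.

```python
# Minimal sketch (not WikiXRay's code): stream a MediaWiki XML dump so that
# memory use stays bounded, instead of loading the whole file at once.
import xml.etree.ElementTree as ET

def localname(tag):
    """Strip the XML namespace that MediaWiki export files put on every tag."""
    return tag.rsplit('}', 1)[-1]

def count_revisions(dump_path):
    """Count revisions per page title without holding the dump in memory."""
    revisions_per_page = {}
    current_title = None
    # iterparse yields each element as soon as its closing tag is read,
    # so we never keep more than one <page> subtree alive at a time.
    for _event, elem in ET.iterparse(dump_path, events=('end',)):
        tag = localname(elem.tag)
        if tag == 'title':
            current_title = elem.text
        elif tag == 'revision':
            revisions_per_page[current_title] = revisions_per_page.get(current_title, 0) + 1
            elem.clear()   # discard the processed revision subtree
        elif tag == 'page':
            elem.clear()   # discard the rest of the page subtree
    return revisions_per_page

if __name__ == '__main__':
    counts = count_revisions('enwiki-pages-meta-history.xml')  # illustrative path
    print(sum(counts.values()), 'revisions across', len(counts), 'pages')
```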

Interestingly, last month I found a new parser (also libre software) that apparently outperforms mine. But that's also the good side of libre software: now I can try to adapt it into my own code :) . In any case, I'm very happy that WikiXRay is still one of the best options out there for analyzing Wikipedia dumps.

How do you see WikiXRay being used in future research? Can it be used on other platforms as well?
Well, right now it's a cross-platform tool, and I've heard of people using it on Windows, Linux and Mac OS without problems. All dependencies (MySQL, GNU-R and Python) are available for these platforms. This is great when you're trying to build something useful for a broad audience. It's a good starting point.

However, right now it only works with MediaWiki dumps. In the future, I'd love to have alternative parsers for other wikis (Tiki Wiki, DokuWiki, MoinMoin... well, I cannot mention them all!). The advantage is that the analysis modules are independent of the type of platform analyzed, as long as you store the info using the same data model.
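As an illustration of that separation, here is a hypothetical sketch (not WikiXRay's actual schema): any parser that emits revision records in one agreed shape lets the same analysis modules run unchanged, whatever wiki engine produced the data.

```python
# Hypothetical platform-neutral revision record; field names are illustrative.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Revision:
    page_id: int
    page_title: str
    rev_id: int
    author: str
    timestamp: datetime
    text_length: int
    is_minor: bool

def edits_per_author(revisions):
    """Example analysis module: counts edits per author for any iterable of
    Revision records, regardless of which parser produced them."""
    counts = {}
    for rev in revisions:
        counts[rev.author] = counts.get(rev.author, 0) + 1
    return counts
```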

Other ideas could be:
  • Feeding a web interface to visualize the current state of your wiki (community, activity, trends). This could be great as a service for large wiki communities, like Wikia, or even for enterprise wikis.

  • Adding support for seasonality analysis, trends and forecasting (something I'd love to work on as soon as we find time and funding :-) ).

  • Integrating additional perspectives like: Social Network Analysis, effort analysis and patterns, co-authorship, forecasting/identifying prospective top-quality articles...

You have stated in your WikiSym 2010 summary that "the need to find solutions for social scientists and engineers to work together in interdisciplinary groups, is probably one of the top-priority issues in [your] research agenda." How do you envision it? Any concrete plans already?
This is absolutely right, and it's a must for research on virtual communities today. If we just focus on numbers, trends, activity patterns etc. and leave out the social side of the story, we are in practice missing half of the picture. We will never understand virtual communities completely.

I'd like to explore why there seem to be so many difficulties in creating interdisciplinary working teams (tech sciences + social sciences). Admittedly, we may "speak" and "interpret" things in somewhat different ways. But we must overcome these differences, since they are not the problem but the *asset* when we build these kinds of teams.

We don't have concrete plans for WikiSym 2011 yet. But I'd love to have a panel where researchers from both "worlds" can sit around a table with the audience and debate best practices for making interdisciplinary teams a reality.

What are your predictions concerning the future of Wikipedia and its influence?
[Smiling] You know, last time I answered this question, there was a strong polemic, so I tend to be cautious (even though subsequent research and reports have surfaced some of the key problems we had already identified).

My impression is that Wikipedia's influence will keep on growing, especially in developing countries, as Wikipedias with fewer articles attract more contributions and expand their coverage. At the same time, we need to find new ways of weaving edits from academics and scholars into the contributions of the large existing community, to address the problem of creating content in very specific niches of knowledge. This also involves spreading the word on how to use Wikipedia effectively among students and scholars alike, and eliminating the widespread FUD among many faculties who still think that Wikipedia is just "a perfect source for students to avoid doing the hard work".

Finally, I think we still have to find "new ways of using Wikipedia". Many people use it as an encyclopedia, right. But we can also see it as a source for information contextualization and categorization, for creating thesauri, for translation... The longer the list, the better we will exploit the many possibilities of this "everyday partner".

Anything else you would like to add? Comments, ideas, thoughts?
If I can make a call, I would really like to draw the attention of funding entities (private and public foundations, the EU, governments, etc.) to the urgent need to invest in research on virtual communities. In our own research group, we have spent several years on this research line with very little support, but with great results and outreach so far. The NSF is funding 7 or 8 projects on virtual communities and open collaboration in the USA, while the EU is somewhat lagging behind, in my opinion.

Not only Wikipedia, but virtual communities in general are a core piece of the Information Society and the so-called "Future Internet". If we just focus on technology but forget about "people using technology", we may lose an important perspective: that this Information Society should be user-centered, and not technology-centered. There must be a serious effort to fund research lines to understand this reality, creating interdisciplinary teams to face up to the challenge.

Dare to Quote! On Zizek and Wikipedia

Posted: July 15, 2010 at 2:57 pm  |  By: geert

Reading Slavoj Zizek's 2010 book Living in the End Times, I noticed the author quoting Wikipedia a number of times. No big deal, you would say, but it is significant in the light of the ongoing controversy around Wikipedia as a reliable (academic) source. Zizek is considered a leading intellectual, and arguably Europe's most famous baby-boom philosopher (b. 1949). This postwar generation entered their professional lives in the age of the (electronic) typewriter, well before the introduction of the personal computer. As authors they are the ones who profit from the copyright regimes and are known to have a firm grip on the print media. Even though computer literate (read: they can type), their cultural attitude towards the WWW is ambivalent--if not absent. If a critic like Zizek includes Wikipedia in his verbal stream of consciousness, it is a sign of the times that Wikipedia has become an integral part of our media environment.

So far, in the case of Zizek, referenced media have been books, followed by feature films. Forget newspapers, television and radio, or hearsay conversations and correspondences. If Zizek starts telling stories it is based on contemporary myths and current affairs that are supposed to be known to all of us, written down without detailed references. If Zizek starts to theorize he talks aloud, like in a bar, and it is this oral, narrative element that constitutes his philosophy. To include Wikipedia in these rants is part of a significant cultural shift and it is odd that Zizek himself is unaware of this Event.

As far as I know, Zizek has not yet written at length about the internet, mobile phones, e-readers or computer games. What resurfaces in Living in the End Times is his fascination with post-humanism and techno-gnosis. The example analyzed in this book is MIT's Sixth Sense research program (a "wearable gestural interface that augments the physical world around us with digital information and lets us use natural hand gestures to interact with that information"). Much like in Zizek's analysis of early-90s Virtual Reality, it is in particular the embodiment of information that interests the psychoanalyst. Zizek cannot distinguish between networked communication and the 'virtual architecture' (if possible in 3D) of Second Life or World of Warcraft. The invisible, non-representational nature of new media falls outside the scope of Zizek's theory. Zizek is not the only theorist we can blame for the confusion between cyberspace and virtual reality. But twenty years on, you would think that someone could have given Zizek a basic update on what has happened in the world of new media.

Libertarians are indeed featured (Ayn Rand), but the Silicon Valley techno-libertarian religion is not an object of study for Zizek. It is in particular the dark, apocalyptic side of Ray Kurzweil that interests Zizek, not Google. An interesting example of his blind spot for the networked nature of capitalism is on display in Zizek's visit to Google's Mountain View headquarters, where he spoke during the Authors@Google lunch series in October 2008. Zizek is the perfect example if you want to show how little cultural studies and film theory have to say about the internet. As Zizek recently admitted to The Guardian: "I am a good Hegelian. If you have a good theory, forget about the reality." The problem in this case is that Zizek does not even have a basic set of critical notions, let alone a theory. This could be the reason why he remains silent about it in his books.

All the more interesting, then, that in Living in the End Times we can find at least five references to Wikipedia (always without URL). The book also refers to internet sources in thirteen footnotes, in which he does point to actual web locations but forgets to mention dates or author names. The editors at Verso Books did not include Wikipedia in the index. They did include 'internet', with three page references, but none of them are significant, idea-wise. "He is very much a thinker for our turbulent, high speed, information-led lives," Sophie Fiennes remarks in the same Guardian piece. Sure, but it is a pity that when Zizek eventually slows down to write his real magnum opus, its topic will be Hegel and not the internet.

WikiTrust reduces the 'oracle' illusion

Posted: July 7, 2010 at 10:22 am  |  By: julianabrunello

WikiTrust is open-source software that can be added to Firefox as an extension. It shows you "the origin and author of every word of a wiki, as well as a measure of text trust that indicates the extent with which text has been revised".

Jaron Lanier points out in his essay Digital Maoism that "In the last year or two the trend has been to remove the scent of people, so as to come as close as possible to simulating the appearance of content emerging out of the Web as if it were speaking to us as a supernatural oracle." His critique becomes clear if you take the platform design of Wikipedia into account. Because there is no visible indication of who, rather than what, has authored an article, the information presented on an article's main page takes on an 'oracle' quality: it simply looks a lot more authoritative than it would if you could see the work of the many individuals who were actually behind its authoring.

Using WikiTrust allows the Wikipedia user to gain a better understanding of how Wikipedia articles come into existence, at least reducing the 'oracle' illusion generated by the site's design. The information no longer comes out of 'the web' or 'Wikipedia'; it can be traced back to individuals.
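To make the idea of word-level provenance concrete, here is a rough sketch of my own; it is not WikiTrust's actual algorithm (which also weighs author reputation), and the two-revision history it uses is invented. Each word of the latest revision is simply attributed to the earliest revision that introduced it.

```python
# Rough illustration of word provenance (not WikiTrust's actual algorithm):
# attribute each word of the latest revision to the revision that introduced it.
import difflib

def word_origins(revisions):
    """revisions: list of (author, text) pairs, oldest first.
    Returns (word, author) pairs for the words of the latest revision."""
    origins = []          # attribution for each word of the current text
    current_words = []
    for author, text in revisions:
        new_words = text.split()
        matcher = difflib.SequenceMatcher(a=current_words, b=new_words)
        new_origins = []
        for op, i1, i2, j1, j2 in matcher.get_opcodes():
            if op == 'equal':
                new_origins.extend(origins[i1:i2])          # keep earlier attribution
            else:
                new_origins.extend((w, author) for w in new_words[j1:j2])
        origins, current_words = new_origins, new_words
    return origins

# Invented two-revision history, purely for demonstration.
history = [
    ("alice", "VJing is the realtime manipulation of imagery"),
    ("bob", "VJing is the realtime manipulation of imagery in synchronization with music"),
]
for word, author in word_origins(history):
    print(f"{word:>15}  <- {author}")
```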

For the add-on download go to https://addons.mozilla.org/en-US/firefox/addon/11087/ .

Manuel Schmalstieg and the Wiki-Sprint

Posted: June 29, 2010 at 10:48 am  |  By: julianabrunello

by Juliana Brunello

Manuel Schmalstieg recently directed an event called Wiki-Sprint. The sprint concept derives from the code sprints of the FLOSS communities, in which a team of developers comes together to engage in some serious code-writing. Only this time there would be no code-writing, but article-writing for Wikipedia.

For this, a team of contributors was gathered to take part in the event's workshop, which consisted of rewriting and improving the Wikipedia article on VJing. I asked Schmalstieg about this experience:

Most Wikipedia articles are written in collaboration by people who have not met. Why did you choose to make it a face-to-face event? What are the benefits of writing an article this way?
I should make clear that my main target was actually not the improvement of this article… That was the alibi, but the actual objective was to explore the performative act of collective writing, in the tradition of Surrealism… and also informed by the "reading performances" of artists such as Arnold Dreyblatt or Rainer Ganahl, as well as the recent practice of collaborative technical "writing-sprints" that has emerged from the free software scene, exemplified by the Flossmanuals project. The public reading of the article, and its inclusion in Wikipedia (as an audio article), was the crowning of this performative aspect. To answer your question, the benefits of this method of writing are: a) a much faster writing process, b) strict time management, and c), the unique experience of human interaction that derives from such an intensive work situation.

Were the people involved in the sprint already involved with Wikipedia?
Most of them were not. When searching for volunteers for this project, I targeted different groups: specialists in the field (audiovisual performance and VJing) who had already written on that topic, and heavy contributors to the existing Wikipedia articles (in the English and French versions). Of the 11 people who participated, 3 had some previous editing experience on Wikipedia (one of them, Sleepytom, was a major contributor to the VJ article in 2006).

How was it to work with the previous editors of this Wikipedia article, who did not belong to the sprint-group?
As far as I am aware, the article has practically no regular editors. It is the result of initial work by a handful of wikipedians in 2006-2007, who aren't active anymore. The rest is the result of "drive-by editing". So we didn't have any response from the original editors of the article (with the exception of Sleepytom).
One exception: during the writing-sprint, I had the chance to meet Anthere (Florence Devouard), who had contributed photos from Pixelache festival to the French version of the article. But she isn't a specialist of visual art, so she didn't contribute to the text of the article.

Have you been following the changes on the VJing article in Wikipedia? Were there any? How do you feel about them?
Yes, I have been watching the changes - a bit like a gardener who planted vegetable seeds, and observes the slow growing process. There were some small corrections, minor additions, a bit of cleaning up. I think it's a good sign - it would prove that a "solid" article with consistent references can act as a barrier against spammy self-referential edits (which were very frequent on the previous version).

How difficult was it to organize such an event? Do you recommend it and could you give us any tips?
The project was organized in a very short timespan, which was a problem for getting institutional funding partners (the fact that it doesn't fit into any category did not help either). In the end, everything was done on a shoestring budget, with all the logistics handled by the Mapping Festival team, who loved the project. On the other hand, it was great to see how easily people from the "general public" understood the idea and how positively they responded to it. We had a lot of enthusiastic feedback.
However, I wouldn't repeat the project in this format, as it really was a context-specific experiment.

Anything else you would like to add? Comments, ideas, thoughts?
The most recent news: we are currently preparing a print publication of the article, with some statements and reflections from our participants. This very weird relationship between Wikipedia content and print distribution is something I'm looking forward to working on in the future (the next planned step is a printed edition of my favorite Wikipedia article: The KLF).
For more background information on the wiki-sprint, here is a FAQ page that I wrote during the preparation phase: wiki.greyscalepress.com/FAQ
Finally, if after this interview you want to actively engage with Wikipedia, I suggest creating some of the missing articles on pioneering media artists, such as Kit Galloway and Sherrie Rabinowitz, for instance.