Triple the number of African elephants, so that when they do go extinct we still have ourselves a 'wikiality'. A jolly suggestion from Stephen Colbert, which got embraced by Wikipedia critics. What followed on the encyclopedia was essentially vandalism to make a point. What actually happened, though, was that the elephant page was protected: for about two weeks it was editable only by administrators before being opened to the public again.
Who is in control of Wikipedia? Jimmy Wales? Wikilawyers? It's a question rooted in fear: the idea of a Leviathan (he shows the image), a totalitarian ruler applying body politics, one of severe oppression.
But this is not the most interesting question, Stuart Geiger argues. He is a researcher at Georgetown University in the Communication, Culture, and Technology program. People are important, he says, but not solely: technology is important too. The technical structure makes the social possible. With this in mind, the question becomes: what is in control of Wikipedia?
Order in Wikipedia is increasingly produced through technical means. There are bots, deployed to ban users, enforce policy and inform admins about debates. Tools, such as specialised scripts that automate various social actions, like nominating an article for deletion. Code, in the form of wiki software, with for instance 'flagged revisions'. And analytics, which are used as arguments in themselves, making judgements about the future.
Geiger gives a quick definition of bots: actors who perform repetitive and mundane tasks. But they are most definitely important. Bots are becoming more and more sophisticated, checking language use and censorship, moving into the sphere of admins. It has become quite a complex phenomenon, and the number of bots is still growing. Therefore, more research is needed (also considering the different language versions).
His main example, or case study if you will, is HagermanBot: a bot that added the {{unsigned}} template to comments that were not signed. It was meant to let people know they had forgotten to sign their comment, which was not a controversial guideline. But because the bot fixed comments in real time, signatures became one of the most strictly enforced policies. The thing to remember is that bots are not allowed to act by default; they have to be approved.
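The mechanism Geiger describes can be sketched roughly as follows. This is a hypothetical illustration, not HagermanBot's actual source code: the signature pattern and the function name are assumptions, based only on the fact that Wikipedia signatures end with a `[[User:...]]` link and a `(UTC)` timestamp, and that the bot appends an {{unsigned}} template when no signature is found.

```python
import re

# Rough approximation of a Wikipedia signature: a [[User:...]] or
# [[User talk:...]] link followed by a timestamp ending in "(UTC)".
SIGNATURE_RE = re.compile(r"\[\[User([ _]talk)?:[^\]]+\]\].*\(UTC\)\s*$")

def tag_if_unsigned(comment: str, username: str, timestamp: str) -> str:
    """Return the comment, appending {{unsigned}} when no signature is found."""
    if SIGNATURE_RE.search(comment.strip()):
        return comment  # already signed, leave untouched
    # Append the template crediting the author, as HagermanBot did.
    return f"{comment} {{{{unsigned|{username}|{timestamp}}}}}"
```

The point of the sketch is how small the intervention is: a pattern match and a string append, applied in real time, is enough to turn a loose social norm into an enforced rule.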
The case of HagermanBot is interesting because it seems a non-controversial bot, yet it can show us some of the mechanics of power and policy. There is a bot policy and a bot approval group: to deploy a bot, people must submit a proposal, which then gets approved or declined. HagermanBot was approved quickly.
Because new software reveals inconsistencies, a big discussion commenced on Wikipedia. People did not want to sign their comments; a loose social norm, they said, had been turned into a very strong law. Geiger shows some of this debate on slides. In the end, the introduction of HagermanBot triggered a debate about the role of bots in general.
The solution to these complaints came in the form of an opt-out list. But such solutions to controversies are 'black boxed', Geiger argues: the opt-out mechanism became the standard reply to every objection. It exposed the technical structure.
So, does society dominate technology? Or does technology dominate society?
Determinist narratives are easy, he says, but the dialectics of socio-technical systems are hard. The kinds of things people talk about change as technology changes. Going beyond "code is law", Geiger notes that social structures and technological systems are co-productive: compromises are forced to happen.
The opt-out list as a compromise, for example, turns out to become a standard. What we need to think about, then, is how technology and society affect each other.
During the Q&A and via Twitter, Stuart Geiger notes that he had wanted to incorporate Kelty's theory into his talk, proposing it as a more contemporary frame.