Resources

Public Debate: Future of the Public Domain in Europe

Posted: November 14, 2010 at 11:12 pm  |  By: morgancurrie

Friday session, 20.30–22.30

Documents and sources on the Public Domain

Paul Keller from Kennisland opened the session with a bit of historical context: the 1990 Proposal for a Hypertext Project by Sir Tim Berners-Lee. From the very beginning the internet has been a place of debate about what should and what shouldn’t be in the public domain – an influential text was the discussion started by Eric Kluitenberg on the nettime mailing list, Frequently Asked Questions About the Public Domain.
James Boyle’s influential book from 2008, The Public Domain, has been the groundwork for anyone thinking, talking or writing about the subject since.
In the framework of the Communia project, the Public Domain Manifesto was published, which led to an official charter for the European Library Project Europeana: the Public Domain Charter.

Creative Commons has meanwhile introduced the Public Domain Mark, but many questions remain, such as who should take care of this public domain and what infrastructures we can rely on.

James Boyle: Problems of the Public Domain

In his Skype session, James Boyle, Professor of Law at Duke Law School, laid out three main problems in the public domain debate – and a number of possible solutions to them:

On the conceptual level, an essential task is to make politicians, institutional bodies and citizens aware of the ecology of knowledge, in which a key driver of creativity is the interaction of the free and the controlled – in culture, science, politics, etc. More common, however, is an understanding that takes a universal stand only for the free; on the basis of such a conceptualization, one risks neglecting the balance between the two realms. Boyle illustrates this with the example of a lawyer who believed that every breach of copyright should be understood as a violation of human rights, and who was shocked by the idea that some people see this very differently.

The second problem is a cultural one. When copyright terms were extended, we applied the most speech-restrictive set of laws to most of 20th-century culture. Since there is no speech-enhancing part of copyright law that would allow access and translation, we are denying ourselves access to most cultural expression – even to orphaned works. Currently, 90% of creative and scientific material is commercially unavailable yet still under copyright; the benefit of royalties for authors applies only to a very small fraction of historically produced documents. More often, there is no benefit to anyone.

Meanwhile, with e-culture rapidly growing and researchers looking less and less at offline sources, the pyramid of knowledge seems to have been inverted: books have become the realm of the inaccessible. Where inaccessibility was once a matter of spatial distance, actors such as Google have redefined access as immediate and disconnected from the physical location of cultural expressions.

The choice of where to publish what rests persistently with the author – and without the conscious choice of an author, none of us will have access to a work produced by a contemporary in our lifetime. Free culture, public domain culture, will not contain any work made by our contemporaries unless they actively stipulate it – work is copyrighted by default. In this way we have cut ourselves off from our collective heritage, even though generative production has always been made by remixing.

The last problem identified by Boyle lies in the realm of science. The public domain is an essential component of scientific undertakings. One might assume that these issues play out better here, given the relevance of technological progress and the resulting shorter patent term of 20 years (compared to copyright terms of the author’s lifetime plus 70 years), but this seems not to hold true.

Referring to Berners-Lee, Boyle points out that the web was envisioned for science: a tool to link and share scientific material, forming sets of hypertext links – a web of connections that would enable human knowledge to flourish. What we are confronted with now, however, is that the web works great for consumption and personal interests, yet for science it hasn’t progressed very much: most literature is locked up behind firewalls or paywalls, which makes a dense set of connections to other online material impossible. Yet the power of the internet lies in these connections. Further, access is now restricted even to items that are not covered by copyright law in the first place, such as footnotes; they are regulated merely by a technological accident, made exclusive by walls of payment.

Next to this, we see an expansion of the range of scientific subject matter covered by intellectual property. In the EU, the Database Directive has had no empirical benefit for the database industry while imposing economically inefficient structures on scientists and citizens. At the same time, we see an expansion of patent rights to cover new developments such as gene sequencing or synthetic biology, raising fears that these expanded realms of intellectual property will inhibit the growth of new scientific fields. Could foundational truths established in new areas be protected under patent law?

Now what can be done to alleviate these processes? In the political sphere, orphan works legislation could be feasible, since the expansion of copyright over material that is economically inaccessible is an embarrassment to the cultural industries. Other stimuli lie in private hacks – privately created solutions such as general public licenses in software, Creative Commons licenses through which individual authors open their copyrighted work into a commons, or maybe even Google Books as an example of a private initiative. Alongside these political and privately constructed commons, there is an enormous role for public education. Initiatives such as the Public Domain Manifesto and Communia are extremely valuable, and in more domains – from music sampling and software development to libraries and the sciences – people need to realise what the public domain means, and what it means if it’s taken away from them.

Bas Savenije: Challenges for libraries in the digital era

Following James Boyle’s talk, Keller notes that librarians may have become the keepers and custodians of material that is generally difficult to access, opening the podium for Bas Savenije, Director General of the Dutch Royal Library, the Koninklijke Bibliotheek. In his talk, Savenije reflects on the changing role of libraries and the challenges they face in connection with current developments regarding the public domain.

Savenije observes that the current generation increasingly seems to perceive knowledge that is not digitally accessible as non-existent. Documents that have not yet been digitised therefore risk being forgotten. To counter this, libraries are increasingly turning to digital content and the digitisation of their holdings. The National Library of the Netherlands currently preserves about 4 million items and is aiming for their full digitisation. However, Savenije points out that current calculations estimate that by 2013 only about 10% will have been covered. What is hindering the digitisation process?

The first obstacle is the lack of funding for such undertakings, as grants are often made available only for specific purposes, such as the digitisation of Dutch parliamentary papers or of newspapers for research purposes. On the European level, there is money available to build infrastructure or improve access, but when it comes to the actual digitisation of books, funding is lacking.

One way of dealing with these circumstances is to seek public-private partnerships, as recently happened with Google. This cooperation, however, was based on three conditions: 1) everything that is in the public domain in print should be in the public domain digitally, forever; 2) there should be no exclusivity of the documents to Google as a contractor; and 3) there would be no non-disclosure agreements. On the basis of this agreement, the digital material is now available for almost any educational or research purpose, as long as it is not commercial. A dilemma remains: old manuscripts are not digitised by Google because of insurance issues around these vulnerable materials, and public-private partnerships with companies that do take care of them often run under different conditions that may create exclusivity.

A specialised company like ProQuest, which handles such projects for, among others, the National Library of Denmark, grants free access to the documents only within the country – access from anywhere else is locked behind a paywall for 10–15 years. Yet without such commercial partnerships, it is questionable to what degree the necessary progress towards digitisation can be accomplished.

A second obstacle, of course, is copyright. Legal solutions, e.g. around orphan works, are being developed in various EU countries in the form of extended collective licensing. A case which helped draw attention to this issue was the Google Books Settlement, as it brought discussions about copyright and open access to scientific information onto the European agenda.

Born-digital content presents another challenge to the workings of libraries, as it demands quite different approaches to collection and preservation. Is the traditional task of libraries to cover everything ‘published’ – operationally defined as any document published with an ISBN – still valid? Now that the Library of Congress has moved towards collecting tweets, should the National Library of the Netherlands collect tweets as well? Or would that rather be the task of the National Archive? And what about scientific blogs? Common definitions of ‘publication’ fall short under the current wealth of data creation. Connected to this are the implications of the organisational diversity of heritage bodies facing these developments. Current publications sometimes work with annotated data models, integrating text with research-relevant data and audio and visual files in different media. How can multimedia be integrated when the different media are divided over different organisations? Since partial, media-based collection would ruin the data, how does one arrange cooperation and build inclusive infrastructures?

Further, different types of libraries, serving different parts of society, are funded by different sources. As they are consequently separate systems, how do their users get access to data that is not available in ‘their’ specific library? An approach is needed that grants integrated access to data across this territorial separation. The trend thus seems to be towards a National Digital Library with one common back office, through which every library provides access to its own community. While we have great examples such as Europeana, a big challenge is envisioning a ‘Nederlandeana’ with a common infrastructure that responds to the changes induced by the Internet across all domains. Another open issue is securing the sustainability of such undertakings; for reasons of time, however, this was not further elaborated upon.

James Boyle responds to the apparent dilemma that increasing access to data comes with a reintroduction of territoriality into the international public domain. How can these developments be addressed? According to Boyle, the first-best solution would be to shorten copyright terms to about 8 to 17 years, which seems to be the optimal range. However, that does not essentially remove territoriality. The second-best solution would then be private or public-private initiatives, which, however, would also likely be territorial. An interesting case is Google, as the Google Book Search Settlement may open up a market for competitors and thereby introduce new challenges for the public domain aside from territoriality. Adopting the second-best solution seems to Boyle the more reasonable course, given the potential of public licensing to achieve great things.

Bas Savenije adds that on the European level, issues such as territoriality are addressed several times per year in meetings of the different national and research libraries. Conditions for public-private partnerships have been translated into a draft paper that is still being worked on. Responding to a question from the audience about libraries’ access to the interfaces and search functions developed by private partners, Savenije mentions that the libraries’ own data holdings are larger than the database of scans produced by Google and thus need to be developed independently: “I hope we can be as good as Google is in that”.

Lucie Guibault: The European Dimension

Guibault addressed the European copyright directives and the recent discussion of the public domain – including at the World Intellectual Property Organisation (WIPO).

The Digital Agenda plays a decisive role in stimulating discussion of the public domain. The piling up of rights can be counterintuitive and counterproductive, which is why the European Union plays an important role in a new wave of public domain discussions – focused in the thematic network COMMUNIA, which discusses what the public domain means for science, for the public and for the general interest.

A working group has been working on an adaptation of the Public Domain Manifesto, which is meant to take a bold and provocative stand against copyright law. When attempting to define the public domain, we notice that much of the writing on it is US-based (Duke University et al.); Communia puts it on the map of European discussions.

The manifesto proposes a split between the structural public domain (works whose protection has expired as well as all works that aren’t copyrightable) and the voluntary sharing of information (Creative Commons licences, etc.). It proposes the adoption and development of the Public Domain Mark and includes a number of general principles:

  • We should be able to build freely upon all information that is available out there: the public domain should be the rule and copyright the exception.
  • Most information is not original enough to be copyright-protectable and should therefore flow freely.
  • What is in the public domain should remain in the public domain.
  • Copyright is a right limited in time.

Simona Levi: Citizens’ and artists’ rights in the digital era

Simona Levi, Director of Conservas and involved in the annual oXcars, shares her point of view on public domain issues with a stronger focus on the position of contemporary producers of cultural goods, reflecting on the immediate challenges and contributions of the artist in relation to the public domain. Levi is connected to the FCForum, a platform and think tank that understands itself as an international, action-oriented space for building and coordinating tools that enable civil society to respond to urgent political changes in the cultural sector. The FCForum brings together voices from liberal culture interest groups, yet explicitly also reaches out to the general audience to prevent absorption by institutional bodies. In 2009, the FCForum set up the first Charter for Innovation, Creativity and Access to Knowledge, a legal companion supporting work in the cultural domain by addressing copyright legislation in the digital era.

In 2010, the main focus of the forum was how to generate and defend new economic models for the digital era. Issues of the public domain are thereby approached especially from the understanding that artists’ work is situated in shared spaces. The current charter 2.0.1, ‘Citizens’ and Artists’ Rights in the Digital Age’, has a particularly practical focus, trying to challenge and influence political decision-making at the local and European level. While the points addressed in the charter are obvious and logical to those working in the artistic field, they sadly may not be to political bodies.

Some of the points mentioned by Levi:

  • Copyright terms should not exceed the minimum term set by the Berne Convention (30 years); in the long term they should be shortened to about 8–17 years.
  • The law should allow every publication to enter the public domain directly.
  • The results of work and development funded by public money should be made accessible to everyone.
  • Research funded by educational institutions should be made accessible to the public.
  • There should be no restriction on the freedom to access, link to or index any work that is already freely accessible to the public online, even if it is not published under a shareable licence – an issue touching on private/non-private copying legislation.

According to Levi, another problem is posed by the legal framework around quoting, which in many parts of Europe is not allowed unless it serves pedagogical or research purposes. Even if content creators support the quoting of their work, these limitations remain in force.

One major problem is connected to collecting societies: there is little oversight of these bodies. They collect money in a public manner, yet the redistribution of this money to their members works in a problematic way, since only a fraction of the members can vote on these decisions, with voting power based on the royalties brought into the organization. This means that artists who bring fewer financial assets into the group are essentially excluded from decision-making. As a last point, Levi notes that collecting societies restrict the application of free licensing in the cultural industries and thereby silence any potential interest of artists in engaging with the public domain.

Respondents

Charlotte Hess: Protection of access to knowledge – in need of a movement

Charlotte Hess, Associate Dean for Research, Collections & Scholarly Communication at Syracuse University Library and internationally renowned commons theorist, briefly reacts to the different positions mapped out by the previous speakers.

While she recognizes that there is still much to do about issues such as open access, Europe seems to be on a good track concerning these developments. In 2001, the first conference ever on the public domain was organised by James Boyle, and Hess points out how important and influential his contribution has been, also through his work on intellectual enclosure. What is needed now is a movement similar to the environmental movement – something that could draw together all sorts of different people to protect our access to knowledge.

While many of the issues we face in this context are based in the realm of law, there is certainly also a general lack of awareness, leading people to neglect negotiating and fighting the legal restrictions. Yet in a world where the dominance of corporations is so strong, the youth needs to be encouraged to go into the political arena instead of being swallowed by corporate entities.

Marietje Schaake is a member of the European Parliament for D66, a member of the Committee on Culture, Media and Education, and co-founder of the European Parliament’s Intergroup on New Media.

In the closing part of the public debate, she discussed what the European Parliament can do for the public domain and what the sentiment in the Parliament towards it is. Overall, due to heavy lobbying, the suggestion is raised that counterfeiting and breaches of copyright are to be the next war after terrorism. Currently, the odds are against reform of copyright law – there’s a strong lobby in favor of keeping and strengthening the status quo, and a severe lack of knowledge about public domain issues.

A lot can be done, though, to influence the existing wave:

  • present facts and studies about the impact of new technologies
  • have artists speak out in support: the conservative lobby currently appears to be the one defending creativity
  • present data: appearing neutral helps counter the image of “squatters of the internet who want to kill innovation”

If we want to secure an internet and knowledge culture that relies on principles of the public domain, we need to find a way to open up a polarized climate in which it is safer to side with the establishment.

Towards a Radical Archive: De Balie's Eric Kluitenberg

Posted: September 9, 2010 at 3:06 pm  |  By: morgancurrie

Eric Kluitenberg is a well-traveled theorist, writer, and lecturer who has produced media events in the Netherlands, Moscow, and Estonia, and currently heads the media program at De Balie, a cultural and political hotbed in Amsterdam. I’ve had the luck to attend some of Eric’s events, such as 2010’s Electrosmog festival, and to witness Eric speak eloquently about the digital commons in a lecture inspired by his 2008 Economies of the Commons conference. That event’s essential question – how will we support our cultural archives in the digital age? – remains largely unanswered, or at least in an unfolding state, and Eric has taken an active role in seeing that the cultural heritage sector is represented in the fallout.

When I approached him for an interview, Eric asked to focus the discussion on the Living Archive project at De Balie, a work-in-progress that neatly exposes the role played by theory in the technical design of online archives. The Living Archive, in its very architecture, stresses the importance of ephemera, dissenting messages and mutable, collaborative scaffolds for producing conversations around the objects we transmit into the future.

MC: What is the Living Archive? Does it exist yet?

EK: The Living Archive is really a theory, founded on the problem that most traditional archives are organized through selection, inclusion and exclusion. There is a strong tendency in these traditional models to leave out what is called ephemera – for instance flyers or temporary productions, like the Prelinger Archive’s industrial films, made for one particular purpose and then expected to disappear. Ephemera are considered noise, irrelevant, and as a result a large aspect of living culture is often excluded.

This is the topic of The Order of Things by Foucault, who says that dominant powers ultimately determine the structures of discourse and consequently what should be preserved in the archive. Everything that falls outside is automatically irrelevant. This classical notion of archiving excludes too much, a problem increasingly recognized within the archiving world itself and even more pressing now that digital media allow countless people to put weird stuff online. The official archiving world doesn’t have an effective way to deal with all this ephemera. Foucault also critiques the archive as a static collection of dead phrases no longer part of living culture, because it is already enshrined in a system of power. You have to dig out the power structures underneath, figure out who created the rules, the political motives and material conditions behind it all. That’s why he calls it archeology. A static archive is a completely closed thing, in contrast to the multiple, dispersed discourses of present, living culture. For Foucault there are dominant forces that try to control this dispersal and order it in a particular way, making the archive immutable.

The Living Archive, then, is a theoretical model that makes discursive practice its active component. It refuses the canon of collected statements that Foucault critiqued and doesn’t accept any kind of necessary outcome. It emphasizes active discursive production, a continuous discussion and debate about everything in the archive, using the archive as a material for the discussion itself.  Wikipedia is an example of this, maybe the best at it so far.

Obviously you can’t store everything; this tension operates on many levels. An artist found this wonderful quote from Nietzsche: “in order to imagine it is necessary to forget.” It’s a classical archival problem: if you store everything, you lose the space for imagination or thinking or reflection, for active, living culture. So there is a healthy tension all the time.

The digital nature of archives has unique potential to challenge older ideas of the repository. Can you talk about how the material properties of digital media make this the case?

If you store things in a digital format, you can always reprocess them. They remain in an unclear state – is the text ever finished? You could see this as a threat or a chance to make materials publicly available to be worked upon. That’s why Wikipedia is important – not only can you work on the documents stored in the system, you can also track the document history. In that sense Wikipedia, with all its shortcomings, is the most sophisticated model of the living archive. The process is revealed as open-ended, rather than left to a professional clan of archivists who have their established systems and abhor the idea of public participation.
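To make that property concrete – this is an editorial sketch under stated assumptions, not De Balie’s or Wikipedia’s actual implementation, and all names are hypothetical – an append-only revision log is enough to keep a document permanently “unfinished” while preserving the full history of who changed what:

    // Minimal sketch of an append-only revision history, in the spirit of
    // Wikipedia's document history. All names are hypothetical.
    interface Revision {
      author: string;
      timestamp: Date;
      text: string;       // full text of the document at this revision
      comment?: string;   // optional note on what changed and why
    }

    class LivingDocument {
      private history: Revision[] = [];

      // Every edit is appended; nothing is overwritten or deleted, so the
      // document is never "finished" and every earlier state stays recoverable.
      edit(author: string, text: string, comment?: string): void {
        this.history.push({ author, timestamp: new Date(), text, comment });
      }

      current(): string | undefined {
        return this.history.at(-1)?.text;
      }

      // The full trail of revisions: who changed what, and when.
      revisions(): readonly Revision[] {
        return this.history;
      }
    }

The design choice is the point: because old states are kept rather than replaced, the archive stays open to reprocessing instead of closing into the static collection Foucault critiqued.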

What specific archiving projects are you working on at De Balie?

When I first came to work here, there was no archive whatsoever, only a huge pile of flyers and announcements stored in big folders in the basement. We introduced a database-driven website in ’99 to kick-start a digital archive. Around that time we also began streaming live events, and when the technology became available, we created the online video archive.

The real aim is to capture live discussion and debate as it unfolds over the years. So we created a web-based annotation system that allows you to annotate who is speaking in the videos and to link the videos to web resources or to articles on De Balie’s site. As a theme runs over the years, the results cluster into dossiers. There’s still an editorial hand that makes certain selections, but this whole process started a living archive trajectory.
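As a rough illustration of what such an annotation layer might look like as data – a hypothetical sketch with invented field names, not De Balie’s actual schema – each annotation ties a segment of a video to a speaker, to external resources, and to the recurring themes around which dossiers cluster:

    // Hypothetical data model for time-based video annotations linking
    // speakers and moments in a recording to web resources and themes.
    interface Annotation {
      videoId: string;
      startSeconds: number;  // where in the recording the segment begins
      endSeconds: number;    // where it ends
      speaker?: string;      // who is speaking in this segment
      links: string[];       // URLs to web resources or articles on the site
      themes: string[];      // recurring themes used to cluster dossiers
    }

    // A dossier is simply the set of annotations sharing a theme,
    // accumulating as that theme recurs over the years.
    function buildDossier(annotations: Annotation[], theme: string): Annotation[] {
      return annotations.filter(a => a.themes.includes(theme));
    }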

Another project is the Tactical Media Files, a documentation resource for tactical media practices worldwide. Today we do not yet have an active discussion deciding what to include and exclude, but we want to open it up to a collaborative editorial model in which many people can be invited to edit – a collective, open editorial forum. If you can fuse a documentation resource with an active, open discussion extended in time, a form that Wikipedia allows, then you get closer to a living archive.

As these archives challenge traditional notions of authorship and hence copyright and power structures, do you think the economic structures of traditional institutions will evolve as well?

That’s not certain. It’s important to look at this from a historical perspective. Consider the history of radio. Technically, any radio receiver can be turned into a transmitter; Brecht recognized the enormous potential of a decentralized, distributed two-way medium, later echoed in Howard Rheingold’s early euphoric description of the Internet as a distributed structure and virtual community. But legislation turned radio into a one-way medium, and it became an authoritarian instrument, as in Rwanda, where violence was largely organized by radio. In the same way, copyright legislation can very easily and effectively be turned into a tool of extreme censorship, used to push the Internet the way of radio. This open space could be shut down by regulation, and the Internet would become the next mass medium, with some paraphernalia on the edges for people to play around with. Dissident, sub-cultural, and political messages would be left without a decent audience.

On the other hand, the question of sustainability isn’t immediately addressed by open access and copyleft practices. If you want to move this discussion forward, even beyond less restrictive copyright policy, it becomes inevitable to consider the economic sustainability of these resources. But for the most part, we’re completely without a clear solution. State funding is not in all cases forthcoming or desirable. Donation models only work for famous projects, and even Wikipedia has trouble sustaining itself. The advertisement model still doesn’t go far. Becoming another commercial media operator is not good for the independence of the message.

One exciting model comes from the open source world, where, because of their self-motivated activity, people move into well-paid jobs or become supported by institutions – so there is a derivative economy. But this for me is the main problem: on the one hand, copyright turning into the ultimate censorship instrument, and on the other, the absence of a clear, sustainable revenue model to support our digital archives.