Public Debate: Future of the Public Domain in Europe

Posted: November 14, 2010 at 11:12 pm  |  By: morgancurrie

Friday session, 20.30–22.30

Documents and sources on the Public Domain

Paul Keller from Kennisland opened the session with a bit of historical context: the 1990 Proposal for a Hypertext Project by Sir Tim Berners-Lee. From the very beginning the internet has been a place of debate about what should and what shouldn't be in the public domain - an influential early text was Frequently Asked Questions About the Public Domain, a discussion started by Eric Kluitenberg on the nettime mailing list.
James Boyle's influential 2008 book, The Public Domain, has provided the groundwork for anyone talking, thinking or reflecting on the subject since.
In the framework of the Communia project, the Public Domain Manifesto was published, which led to an official charter for the European Library Project Europeana: the Public Domain Charter.

Creative Commons licensing has become the de facto mark of the public domain, but many questions remain, such as who should take care of this public domain and which infrastructures we can fall back on.

James Boyle: Problems of the Public Domain

In his Skype session, James Boyle, Professor of Law at Duke Law School, laid out three main problems in the public domain debate, along with a number of possible solutions:

On the conceptual level, an essential task is to make politicians, institutional bodies and citizens aware of the ecology of knowledge, in which a key driver of creativity is the interaction between the free and the controlled: we get creativity from the interplay of the controlled and the free - in culture, science, politics, etc. More common, however, is an understanding that takes a universal stand for the free alone, a conceptualization that neglects the balance between the two realms. Boyle illustrated this with the example of a lawyer who believed that every breach of copyright should be understood as a violation of human rights, and who was shocked by the idea that some people might see this very differently.

The second problem is a cultural one. When copyright terms were extended, we applied the most speech-restrictive set of laws to most of twentieth-century culture. Since there is no speech-enhancing part of copyright law that would allow access and translation, we are denying ourselves access to most cultural expression - even to orphaned works. Currently, 90% of creative and scientific materials are commercially unavailable yet still under copyright; the benefit of royalties for authors applies only to a very small fraction of historically produced documents. More often, there is no benefit to anyone.

Meanwhile, with e-culture rapidly growing and researchers looking less and less at offline sources, the pyramid of knowledge seems to have been inverted: books have become the realm of the inaccessible. Where inaccessibility once followed from spatial distance, actors such as Google have redefined access as immediate and disconnected from the spatial fixation of cultural expressions.

The choice of where to publish what remains in the hands of the author - and without an author's conscious choice, none of us will have access in our lifetime to a work produced by a contemporary. Free culture, public domain culture, will not contain any work made by our contemporaries unless they actively stipulate it: work is copyrighted by default. In this way we have cut ourselves off from our collective heritage, even though generative production has always been a matter of remixing.

The last problem identified by Boyle lies in the realm of science. The public domain is an essential component of scientific undertakings. While it is often assumed that copyright issues play out better in this realm, given the relevance of technological progress and the resulting shorter patent term of 20 years (compared to copyright terms of life plus 70 years), this does not seem to hold true.

Referring to Berners-Lee, Boyle points out that the web was envisioned for science: as a tool to link and share scientific material, forming sets of hypertext links, a web of connections that would enable human knowledge to flourish. What we are confronted with now, however, is a web that works great for consumption and personal interests, while for science it hasn't progressed very much: most literature is locked up behind firewalls or paywalls, which makes a dense set of connections to other online material impossible. Yet the power of the internet lies in these connections. Further, access restrictions now govern items that are not even covered by copyright law in the first place, such as footnotes; they are regulated merely by a technological accident, made exclusive by walls of payment.

Next to this, we see an expansion of the range of protected scientific subject matter. In the EU, the Database Directive has had no empirical benefit for the database industry while imposing economically inefficient structures on scientists and citizens. At the same time, we see an expansion of patent rights to cover new developments such as gene sequencing and synthetic biology, raising fears that these expanded realms of intellectual property will inhibit the growth of new scientific fields. Could the foundational truths established in new areas end up protected under patent law?

Now what can be done to alleviate these processes? In the political sphere, orphan works legislation could be feasible, since the expanses of copyrighted material that are economically unavailable are an embarrassment to the cultural industries. Other stimuli lie in private hacks, privately created solutions: general licenses in software, Creative Commons licenses through which individual authors open their work into a commons, or perhaps even Google Books as an example of private initiative. Alongside these political and privately constructed commons, there is an enormous role for public education. Initiatives such as the Public Domain Manifesto and Communia are extremely valuable, and in more domains, from music sampling and software development to libraries and the sciences, people need to realise what the public domain means - and what it means if it is taken away from them.

Bas Savenije: Challenges for libraries in the digital era

Following up on James Boyle's talk, Keller notes that librarians may have become the keepers and custodians of material that is generally difficult to access, opening the podium for Bas Savenije, Director General of the Dutch Royal Library, the Koninklijke Bibliotheek. In his talk, Savenije reflects on the changing role of libraries and the challenges they face in connection with current developments around the public domain.

Savenije observes that our current generation increasingly perceives knowledge that is not digitally accessible as non-existent. Documents that have not yet been digitised therefore risk being forgotten. To counter this, libraries increasingly turn to digital content and to digitising their stock. The National Library of the Netherlands currently preserves about 4 million items and is aiming for their full digitisation. However, Savenije points out that current calculations estimate that digitisation up to 2013 will cover only about 10%. What is holding the digitisation effort back?

The first obstacle is the lack of funding for such undertakings, as grants are often made available only for specific purposes, such as the digitisation of Dutch parliamentary papers or of newspapers for research purposes. At the European level there is money available to build infrastructure or improve access, but when it comes to the actual digitisation of books, funding is lacking.

A way of dealing with these circumstances is to seek public-private partnerships, as recently happened with Google. This cooperation, however, was based on three conditions: 1) everything that is in the public domain in print should be in the public domain digitally, forever; 2) there should be no exclusivity of the documents to Google as a contractor; and 3) there would be no non-disclosure agreements. On the basis of this agreement, the digital material is now available for almost any educational or research purpose, as long as it is not commercial. A dilemma remains: old manuscripts are not digitised by Google because of insurance issues around these vulnerable materials, while public-private partnerships with companies that do handle them often run under different conditions that may create exclusivity.

A specialised company like ProQuest, which takes on such projects, for example for the National Library of Denmark, grants free access to the documents only within the country; access from anywhere else is locked behind a paywall for 10 to 15 years. Yet without such commercial partnerships, it is questionable to what degree the necessary progress towards digitisation can be made.

A second obstacle, of course, is copyright. Solutions to legal problems, e.g. around orphan works, are being developed in various EU countries in the form of extended collective licensing. A case which helped draw attention to this issue was the Google Books Settlement, as it put discussions about copyright and open access to scientific information on the European agenda.

Born-digital content presents another challenge to the workings of libraries, as it demands quite different approaches to collection and preservation. Is the traditional task of libraries, to cover everything 'published' (operationally defined as any document published with an ISBN number), still valid? With the Library of Congress's move towards collecting tweets, should the National Library of the Netherlands collect tweets as well? Or would that rather be the task of the National Archive? What about scientific blogs? Common definitions of 'publication' fall short under the current wealth of data creation. Connected to this are the implications of the organizational diversity of the heritage bodies facing these developments. Current publications sometimes work with annotated data models, integrating text with research-relevant data and with audio and visual files in different media. How can organizations that divide their collections along media lines integrate such multimedia publications? Since partial, media-based collection would ruin the data, how does one arrange cooperation and build inclusive infrastructures? A sketch of the problem follows below.
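
To make the notion of such an annotated or 'enhanced' publication concrete, the following minimal sketch (illustrative only, not from Savenije's talk; the record structure, fields and URIs are hypothetical examples) models one publication whose components are held by different heritage bodies - exactly the kind of object that media-based division of collecting would tear apart.

```python
# Illustrative sketch of an "enhanced publication": one logical work
# whose component files are curated by different heritage bodies.
from dataclasses import dataclass, field

@dataclass
class Component:
    media_type: str  # e.g. "text/pdf", "dataset/csv", "video/mxf"
    holder: str      # institution curating this component
    uri: str         # location of the component (examples only)

@dataclass
class EnhancedPublication:
    title: str
    components: list = field(default_factory=list)

    def holders(self) -> set:
        """Every institution that must cooperate to keep the work whole."""
        return {c.holder for c in self.components}

pub = EnhancedPublication(
    title="A born-digital study with annotated data",
    components=[
        Component("text/pdf", "Koninklijke Bibliotheek", "https://example.org/text"),
        Component("dataset/csv", "a data archive", "https://example.org/data"),
        Component("video/mxf", "Sound and Vision", "https://example.org/video"),
    ],
)

# Collecting only "your own" medium orphans the rest: preserving this
# record intact requires all of pub.holders() to coordinate.
print(pub.holders())
```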

Further, different types of libraries, serving different parts of society, are funded from different sources. Since they are consequently different systems, how do the users of these libraries get access to data that is not available in 'their' specific library? An approach is needed that grants integrated access to data across this separation. The trend thus seems to point towards a national digital library with one common back office, through which every library provides access to its own community. While we have great examples such as Europeana, a big challenge is envisioning a 'Nederlandeana' with a common infrastructure, responding to the changes the internet has induced across all domains. Another open issue is securing the sustainability of such undertakings; for reasons of time, however, this was not further elaborated upon.

James Boyle responded to the apparent dilemma that increasing access to data comes tied to a reintroduction of territoriality into the international public domain. How can one address these developments? According to Boyle, the first-best solution would be to shorten copyright terms to about 8 to 17 years, which seem to be the optimal terms. However, that does not in itself remove territoriality. The second-best solution would be private or public-private initiatives, which would likely also be territorial. An interesting case is Google, as the Google Book Search Settlement may open up a market for competitors and thereby introduce new challenges for the public domain aside from territoriality. Adopting the second-best solution seems more reasonable to Boyle, given the potential of public licensing to achieve great things.

Bas Savenije adds that at the European level, issues such as territoriality are addressed several times a year in meetings of the various national and research libraries. Conditions for public-private partnerships have been set down in a draft paper that is still being worked on. Responding to a question from the audience about libraries' access to the interfaces and search functions developed by private partners, Savenije notes that the libraries' own data collections are larger than the database of scans produced by Google and thus need to be developed independently: "I hope we can be as good as Google is in that".

Lucie Guibault: The European Dimension

Lucie Guibault addressed the European dimension: the European copyright directives and the recent discussions on the public domain, including at the World Intellectual Property Organisation (WIPO).

The digital agenda plays a decisive role in stimulating discussion of the public domain. The piling up of rights may be counterintuitive and counterproductive, which is why the European Union plays an important role in a new wave of public domain discussions - focused in the thematic network COMMUNIA, which discusses what the public domain means for science, for the public and for the general interest.

A working group has been working on an adaptation of the Public Domain Manifesto, which is meant to take a bold and provocative stand against copyright law. When attempting to define the public domain, we notice that much of the writing on it is US-based (Duke University, et al.); Communia puts it on the map of European discussions.

The manifesto proposes a split between the structural public domain (works whose protection has expired as well as works that are not copyrightable) and the voluntary sharing of information (Creative Commons, ...). It proposes the adoption and further development of the Public Domain Mark (illustrated in the sketch after the list below) and includes a number of general principles:

  • We should be able to freely build upon all information that is available out there: the public domain should be the rule and copyright the exception.
  • Most information is not original enough to be protectable by copyright and should therefore flow freely.
  • What is in the public domain should remain in the public domain.
  • Copyright is a right limited in time.
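
The manifesto's split maps onto two machine-readable instruments from Creative Commons: the Public Domain Mark for the structural public domain, and the CC0 waiver for voluntary sharing. Below is a minimal sketch of how a catalogue might record this distinction; the two URIs are Creative Commons' real identifiers, while the function and field names are hypothetical.

```python
# Minimal sketch: machine-readable public domain labelling.
# The two URIs are Creative Commons' real identifiers; the record
# structure around them is hypothetical.
PUBLIC_DOMAIN_MARK = "http://creativecommons.org/publicdomain/mark/1.0/"  # structural PD
CC0_WAIVER = "http://creativecommons.org/publicdomain/zero/1.0/"          # voluntary sharing

def rights_statement(work_id: str, voluntarily_waived: bool = False) -> dict:
    """Label the structural public domain (protection expired or never
    applied -> Public Domain Mark) or a voluntary dedication (-> CC0)."""
    return {
        "work": work_id,
        "rights": CC0_WAIVER if voluntarily_waived else PUBLIC_DOMAIN_MARK,
    }

# A work whose protection has expired sits in the structural public domain:
print(rights_statement("urn:example:novel-1850"))
# A living author voluntarily dedicating new work uses the waiver:
print(rights_statement("urn:example:essay-2010", voluntarily_waived=True))
```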

Simona Levi: Citizens' and artists' rights in the digital era

Simona Levi, Director of Conservas and involved in the annual oXcars, shares her point of view on public domain issues with a stronger focus on the position of contemporary producers of cultural goods, reflecting on the immediate challenges and contributions of the artist in relation to the public domain. Levi is connected to the FCForum, a platform and think tank that understands itself as an international, action-oriented space for building and coordinating tools that enable civil society to respond to urgent political changes in the cultural sector. The FCForum brings together voices from free-culture interest groups, yet explicitly also reaches out to the general audience to prevent absorption by institutional bodies. In 2009, the FCForum drew up the first Charter for Innovation, Creativity and Access to Knowledge, a legal companion supporting work in the cultural domain by addressing copyright legislation in the digital era.

In 2010, the main focus of the forum was how to generate and defend new economic models for the digital era. Issues of the public domain are thereby approached especially from the understanding that artists' work is seated in shared spaces. The current charter 2.0.1, 'Citizens' and Artists' Rights in the Digital Age', has a particularly practical focus, trying to challenge and influence political decision-making at the local and European levels. While the points addressed in the charter are obvious and logical to those working in the artistic field, they sadly may not be to political bodies.

Some of the points mentioned by Levi were:

  • Copyright terms should not exceed the minimum term set by the Berne convention (30 years); in the long term they should be shortened to about 8-17 years.
  • Legislation should allow any publication to enter directly into the public domain.
  • Results of work and development funded by public money should be made accessible to everyone.
  • Research funded by educational institutions should be made accessible to the public.
  • There should be no restriction on the freedom to access, link to or index any work that is already freely accessible to the public online, even if it is not published under a shareable licence - an issue touching on private/non-private copying legislation.

According to Levi, another problem is posed by the legal framework around quotation, which in many parts of Europe is not allowed unless it serves pedagogical or research purposes. Even if content creators support the quoting of their work, these limitations remain in force.

One major problem is connected to collecting societies, over which there is little oversight. They collect money in a public manner, yet the redistribution of this money to their members works in a problematic way, since only a fraction of the members can vote on these decisions, with voting power based on the royalties brought into the organization. This means that artists who bring fewer financial assets into the group are essentially excluded from decision-making. As a last point, Levi notes that collecting societies restrict the application of free licensing in the cultural industries and thereby silence artists' potential interest in engaging with the public domain.

Respondents

Charlotte Hess: Protection of access to knowledge - in need of a movement

Charlotte Hess, Associate Dean for Research, Collections & Scholarly Communication at Syracuse University Library and an internationally renowned commons theorist, briefly reacts to the different positions mapped out by the previous speakers.

While she recognizes that there is still much to do about issues such as open access, Europe seems to be on a good track concerning these developments. In 2001, the first conference ever on the public domain was organised by James Boyle, and Hess points out how important and influential his contribution has been, also through his work on the intellectual enclosure movement. What is needed now is a movement similar to the environmental movement, something that could draw together all sorts of different people to protect our access to knowledge.

While many of the issues we face in this context are based in the realm of law, there is certainly also a general lack of awareness, which leaves the legal restrictions unnegotiated and unchallenged. In a world where the dominance of corporations is so strong, the youth needs to be encouraged to go into the political arena instead of being swallowed by corporate entities.

Marietje Schaake is a member of the European Parliament for D66, a member of the Committee on Culture and Education, and co-founder of the European Parliament's Intergroup on New Media.

In the closing part of the public debate, she discussed what the European Parliament can do for the public domain and what the sentiment in the Parliament is towards it. Overall, due to heavy lobbying, the suggestion is raised that counterfeiting and breaches of copyright are to become the next war after terrorism. Currently the odds are against reform of copyright law: there is a strong lobby in favor of keeping and strengthening the status quo, and a severe lack of knowledge about public domain issues.

A lot can be done, though, to influence the existing wave:

  • present facts and studies about the impact of new technologies
  • have artists proclaim their trust: the conservative lobby currently appears to be the one defending creativity
  • present data: appearing neutral helps counter the image of "squatters of the internet who want to kill innovation"

We need to find a way to open up a polarized climate in which it is safer to side with the establishment, if we want to secure an internet and knowledge culture that relies on the principles of the public domain.

A contribution to a critique of free culture: From Anti-Copyright to the Creative Anti-Commons

Posted: November 14, 2010 at 4:38 pm  |  By: morgancurrie

Dmytri Kleiner is a software developer working on projects that investigate the political economy of the internet and the ideal of workers' self-organization of production as a form of class struggle. Born in the USSR, Kleiner grew up in Toronto and now lives in Berlin. He is a founder of the Telekommunisten Collective, which provides internet and telephone services and undertakes artistic projects that explore the way communications technologies have social relations embedded within them, such as deadSwap (2009) and Thimbl (2010).

Kleiner's latest project, however, is "The Telekommunist Manifesto", a book published by the Institute of Network Cultures in Amsterdam and launched at the Economies of the Commons 2 conference at De Balie, Amsterdam, on Friday 12 November 2010. Even though Kleiner introduced himself as a hacker and amateur writer rather than an academic, his work stimulated an interesting and rather intense discussion.

In his talk in the session "Critique of the 'Free and Open'", Kleiner followed the track from anti-copyright to the Creative Anti-Commons, presenting it to the audience as a tragedy in three acts, described below.

Kleiner opened his talk by claiming that copyright was not created to empower artists. Instead, it was created by the bourgeoisie to embed cultural production in an economic system that encourages the theft of surplus value. In this context, the notion of the "author" was invented just to justify turning cultural works into property.

Further on, he presented the three parts of the “tragedy”:

ACT 1: ANTI-COPYRIGHT – A proletarian movement

Anti-copyright is a proletarian, anti-capitalist movement, embraced by labor struggles, that mightily opposes the existence of the individual author. It is based on the ideal of a common culture with no distinction between producers and consumers - an ideal that makes it incompatible with the needs of dominant capitalism. Consequently, anti-copyright could never be seen as anything more than a threatening, radical fringe.

ACT 2: COPYLEFT – Invasion of the Bourgeoisie

Copyleft, on the other hand, an alternative form of dissent from copyright that emerged with the development of Free Software, is fully compatible with both the contemporary economic system and bourgeois capitalism. The reason is simple: software is capital. Producers depend on it so that they can produce and profit from the circulation of the consumer goods they generate. Free software's sustainability rests on the fact that it is largely funded by corporations, since it is cheaper and more flexible than software developed from scratch.

ACT 3: THE CREATIVE COMMONS – The author reborn as Useful Idiot

Both anti-copyright and copyleft celebrated the death of the author. In the Creative Commons model, however, boosted by the success of the Free Software movement, "the author is reborn as useful idiot". He can no longer reserve "all rights", as copyright suggested, but only "some rights", including the options "No Derivatives" and "Non-Commercial". The paradox of the Creative Commons, as presented by Kleiner, is that the consumer is deprived of the right to become a producer, and that the "free works" are not actually free but private. Thus the "commons" turns into an "anti-commons", where free sharing constantly runs into the barrier of incompatible licenses.
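
Kleiner's anti-commons point, that "some rights reserved" fragments free sharing, can be made concrete with a toy compatibility check. This is an illustrative simplification, not part of his talk: it encodes only the familiar rules that "No Derivatives" blocks remixing outright and that "ShareAlike" pins a derivative to the parent licence's own conditions.

```python
# Toy sketch: can two works be remixed into one derivative?
# Licences are modelled as sets of CC-style condition codes:
# "BY" (attribution), "NC" (non-commercial), "ND" (no derivatives),
# "SA" (share-alike). A gross simplification, for illustration only.
def can_remix(a: frozenset, b: frozenset) -> bool:
    if "ND" in a or "ND" in b:
        return False  # no derivatives allowed at all
    if "SA" in a and "SA" in b and a != b:
        return False  # two different ShareAlike licences conflict
    if "SA" in a and not b <= a:
        return False  # an SA derivative cannot honour b's extra restrictions
    if "SA" in b and not a <= b:
        return False
    return True

BY = frozenset({"BY"})
BY_SA = frozenset({"BY", "SA"})
BY_NC_SA = frozenset({"BY", "NC", "SA"})

print(can_remix(BY, BY_NC_SA))     # True: the derivative goes out BY-NC-SA
print(can_remix(BY_SA, BY_NC_SA))  # False: the anti-commons barrier
```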

COPY-JUST-RIGHT

Developing his thoughts on the Creative Commons, Kleiner claims that it is not an example of anti-copyright or of copyleft but a case of copy-just-right: the model is based on content distribution, but the "mechanical royalties" are eliminated. He then offers an antidote: copy-far-left.

COPY-FAR-LEFT: THE ANTIDOTE

Copy-far-left, acknowledging that neither anti-copyright nor copyleft can provide a sustainable solution for the economic support of cultural producers, brings a new perspective: the Non-Commercial clause used by some Creative Commons licenses can be retained, but with limitations. Copy-far-left suggests that commons-based commercial use should be explicitly allowed for co-operatives, collectives, non-profits and independent producers, but not for profit-seeking organizations. That way, free licensing remains a source of funding, while consumers regain the right to become producers, as long as they do not become exploiters.

In his epilogue, Kleiner points out that in order to have a free culture we have to assert a free society, and that cultural workers have to work in solidarity with other workers on that big idea.

By Ilektra Pavlaki

Hans Westerhof: Paying the Cost of Access

Posted: November 13, 2010 at 7:04 pm  |  By: morgancurrie

Hans Westerhof, deputy director at the Netherlands Institute for Sound and Vision and program manager of the Images for the Future project, spoke in the panel Materiality and Sustainability of Culture about the cost that access imposes on archives in a digital world.

The traditional archive of Sound & Vision consists of 21 vaults spread over 5 floors in a building that opened in 2006. In the digital domain, the institute collects over 1 petabyte a year, from daily broadcast ingest and from the results of the Images for the Future project. The physical archive is steadily starting to look very different: servers are replacing vaults (13-15 PB expected in 2014).

But what really weighs upon the budget is not necessarily the storage costs (although we, as archives, are at a firm disadvantage when negotiating server costs, as this is new terrain for us), but the cost of access. Broadcast professionals and public users expect immediate digital hi-res downloads, which brings along (see the sketch after this list):

  • robot tape-arms
  • proxies for all hi-res videos
  • software for creating proxies and handling restores
  • management system for data files
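
To give a sense of what "software for creating proxies" involves, here is a minimal sketch of deriving a low-resolution browse proxy from a hi-res master, assuming ffmpeg is installed; the paths and encoding settings are illustrative examples, not Sound and Vision's actual pipeline.

```python
# Illustrative sketch: derive a low-res browse proxy from a hi-res
# master with ffmpeg (assumed installed). Settings are examples only.
import subprocess
from pathlib import Path

def make_proxy(master: Path, proxy_dir: Path) -> Path:
    proxy = proxy_dir / (master.stem + "_proxy.mp4")
    subprocess.run(
        [
            "ffmpeg", "-i", str(master),
            "-vf", "scale=640:-2",            # 640px wide, keep aspect ratio
            "-c:v", "libx264", "-crf", "28",  # cheap, streamable H.264
            "-c:a", "aac", "-b:a", "96k",     # compact audio track
            str(proxy),
        ],
        check=True,
    )
    return proxy

# e.g. make_proxy(Path("masters/item001.mxf"), Path("proxies"))
```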

Sound and Vision is working hard on other forms of access through user-generated content and metadata (wiki, openimages, waisda, collaborations with Wikipedia) and through education programs, which tend to be project-based (academia, ed-it).

We can control the cost of access in numerous ways, but the bottom line is that by going digital we create a lot more (re)use, which is a costly success.

We (the cultural heritage institutions) need to become better at:

  • going digital (get real, get digital, understand and own the subject matter, which is often new to our institutions)
  • collaborating (think and act beyond institutional boundaries, share platforms, create economies of scale)
  • negotiating (with service providers and private companies)
  • arguing the value and benefits of our case (we are creating monetary value for others and should start thinking within the frameworks of the people who can help us out)


Economies of the Commons 2 coming up!

Posted: October 5, 2010 at 9:49 am  |  By: admin

This conference is the second in a series on the political economy of new media and its consequences for the cultural sector. The first, Economies of the Commons, was held in Amsterdam and Hilversum from April 10-12, 2008, organized by De Balie, the Netherlands Institute for Sound and Vision, and Kennisland. For the 2010 event, the organising committee is strengthened by the participation of the Institute of Network Cultures and the UvA Faculty of Media Studies.

  • 11 November - OVC Europe at Beeld en Geluid, Hilversum
  • 12 & 13 November - De Balie, Amsterdam

The past 10 years have seen the rise of a variety of online public domain and open access information, knowledge and media resources attracting millions of users, sometimes on a daily basis. The overwhelming success of Wikipedia is without doubt the most striking case in point. Also known as 'digital commons', these resources signal a remarkable adoption of the collaborative production models of free and open source operating systems such as Linux, which years earlier had already altered dominant practices of software development. No longer left to the exclusive domain of digital 'insiders', open content resources are rapidly becoming widely used and highly popular.

But while protagonists herald the naissance of 'free culture', and specifically its low cost barriers and accessibility, proprietary content producers worry that these open resources severely compromise their market position and opportunities. Skeptics speak vehemently against open content, claiming it undermines the established "gatekeeping" functions of authors, the academy and professional institutions while lacking any reliable business model of its own. And even as they storm the bastions of institutional content producers, the sustainability of these open content resources remains entirely unclear: after all, they are not free of costs!

The traditional public finance model can offer support to protect and nurture these open access resources, but it is not reliable in a context of shrinking government and public budgets. Derivative or spin-off economies can also help, but they do not seem to create the economies of scale needed to sustain high-quality, labor-intensive production and preservation. Donations prove unreliable in times of economic slump, especially for less prominent initiatives, while new-style media conglomerates of the Yahoo!, Apple iAd and Google type increasingly dominate the online advertising market. Many of these services rely on user (consumer) profiling, which is undesirable from a privacy point of view.

Instead of waiting for an innovative breakthrough revenue model (the 'killer-mod'), what should be evaluated is how novel hybrid solutions for content distribution and preservation can create both viable markets and open access resources serving the public interest. Beyond a critical discussion of new business approaches and revenue models, this also requires a redesign of the existing legal and political frameworks (national and international) in which content producers and archivists operate.

As intellectual property rights law becomes a new breed of 'rocket science', it is clear that a political intervention is needed to open up innovative markets and serve the public interests of a 21st-century information economy and digital culture - neither free nor proprietary, but open and competitive, creating a global level playing field.