JILT 2009 (1) - Hoeren
The European Liability and Responsibility of Providers of Online-Platforms such as ‘Second Life’
Prof. Dr. Thomas Hoeren
Institute for Information Law
University of Muenster
Virtual platforms like Linden Lab’s ‘Second Life’, which simulate real life, have become very popular recently. As the number of participants grows, the possibility of misuse grows as well. This article sheds light on possible provider liability if crimes or misdemeanours are committed by users, taking child pornography as an example.
Liability may arise from three different possible violations of duties committed by the provider. First of all, the provider must refrain from designing the platform so as to use pornographic content himself or to enable users to display pornographic content. Secondly, he must technically ensure that no minors may access the platform, and finally he may be obliged to control the user-provided content.
In order to determine which duties the providers have to fulfil, it is necessary to define what exactly ‘Second Life’ is. To this end, the article gives an overview of the liability of providers under the E-Commerce Directive 2000/31/EC in the European Union. According to the author, ‘Second Life’ is a completely new phenomenon. Since most of the content is user-provided, Linden Lab is classified as a host provider. It therefore does not have to control all the content by itself but is obliged to remove content upon notification. In that case a duty to hinder the reappearance of similar content may follow. The implementation of the related duties in Germany, France and the UK is presented and critically assessed.
This is a refereed article published on 28 May 2009
Citation: Hoeren, T., ‘The European liability and responsibility of providers of online-platforms such as ‘Second Life’’, 2009(1) Journal of Information, Law & Technology (JILT), <http://go.warwick.ac.uk/jilt/2009_1/hoeren>
Keywords: Second Life, liability, Directive 2000/31/EC, youth protection, age verification system, notice and take down procedure.
Online platforms such as ‘Second Life’ are quite a new phenomenon in the virtual world. Although a first version was made publicly available in 2003 and competitors such as ‘Active World’ released their platforms as early as 1995, it was not until 2006 that ‘Second Life’ received widespread attention as an online platform. In March 2007 ‘Second Life’ registered 5 million accounts; 5 million people potentially use ‘Second Life’ by now to chat with friends, participate in virtual adventures or do business. In April 2007 ‘Second Life’ received its first negative publicity. Participants, so-called ‘Avatars’, had used the online platform to offer pornography, including child pornography, and had engaged in virtual paedophile activities, so-called ‘age play’ between an adult and a childlike ‘Avatar’.
Two types of ‘cybercrimes’ need to be distinguished with regard to this recent scenario. The first type involves real-world child pornography: pornographic content involving a real minor is exchanged or shown using facilities or services of ‘Second Life’, e.g. via screens in virtual cinemas. In the second scenario, one ‘Avatar’ has chosen the shape of a child and engages in sexual activities with another adult ‘Avatar’. Both ‘Avatars’ in this case are probably adults, as ‘Second Life’ is restricted to players above 18. Whether the virtual presentation of sexual intercourse with a child amounts to child pornography is disputed among the different jurisdictions. The cases have been the subject of permanent debate in the media worldwide.1 The public was concerned that minors in particular were not sufficiently protected from harmful content and could be exposed to pornographic material. The legal debate centred around two main issues: the legal assessment of virtually committed acts, e.g. the rape of a virtual character by a virtual character; and the legal accountability of the provider of an online platform which provided the framework for engaging in such a virtual activity. For the following analysis only the second issue will be of interest. The liability or punishability of an ‘Avatar’ is a separate issue which does not need to be taken into consideration when determining the liability and duties of the provider.
For the provider of an online-platform, such as ‘Second Life’, liability could potentially arise from the violation of duties on three different layers:
The design of the platform: If the design of the platform itself induces violent or sexually explicit activities or permits these activities to be covertly performed, the provider could be obliged to change the design or the structure of the game itself.
The access to the platform: If an online platform provides violent or sexually explicit content, access must be restricted to adults only. Providers must install a mechanism which successfully bars minors from accessing the website or platform.
The control of the platform: Most of the information presented on a forum has not been uploaded by the provider himself. Nevertheless, as the exchange of information takes place within a framework provided by him, he might have certain duties to control the exchange of information on his platform.
Since the implementation of the E-commerce Directive 2000/31/EC there has been a lot of case law on the nature and scope of the legal obligations and duties of Content, Host or Access Providers. But since online platforms such as ‘Second Life’ are a new phenomenon which has not yet been subject to relevant litigation, their legal assessment is particularly challenging. The crucial question is: What exactly is ‘Second Life’? Is it a chat forum combined with elements of an online game? Is it a multiplayer online game focusing on socialization? Is it a business platform and an online game at the same time? Or is it an entirely new ‘virtual world’?
Only after the nature and functioning of the ‘Second Life’ platform has been defined can the duties and obligations of its provider be assessed. It is obvious that the legal obligations imposed on the creator of an online game will differ substantially from those imposed on a mere host provider.
2. What exactly is ‘Second Life’? – A legal appraisal of the nature and functioning of the ‘Second Life’ platform
In everyday language, ‘Second Life’ is often referred to as an online computer game. ‘Avatars’ are frequently called ‘players’ and the conditions set up by Linden Lab are considered the ‘rules of the game’.
What exactly are the characteristics of an online computer game? Classic computer games focus on a specific goal; to achieve it, the players have to score a certain number of points or overcome certain obstacles, and in the end there are winners and losers. Of course, modern online games are often more complex than the classic game based on one score system or one goal common to all players. Games often take place in a relatively complex world and allow the players a wide variety of activities. ‘The Sims Online’, for example, is a Multiplayer Online Social Game which is non-linear and does not stipulate an ultimate objective. Its focus does not lie on combat or strategy but basically on social life in a virtual world. Some argue therefore that ‘Second Life’ is also a Massive Multiplayer Online Social Game (MMOSG) focusing on socialization.2 Instead of being based on combat, it focuses on the interaction of the participants and the creation of virtual objects, including models and scripts.
However, ‘Second Life’ does not have most of the characteristics of a game. It has no points, scores, winners or losers, no levels and no ultimate strategy. Instead it allows the participants to interact in any way they choose to build up a ‘virtual life’.3 Although ‘The Sims Online’ is quite complex, it cannot be compared to the complexity of ‘Second Life’. A major difference is that ‘The Sims Online’ is used by all participants to play and socialize. The socializing effect is a by-product of participating in a game, traditionally a means to interact with other people. Almost all online games nowadays have a chat option to talk to other players and to interact. There is, however, no link between activities in ‘The Sims Online’ and activities in the real world.4 People enter the game with only one purpose: to play and socialize as their Sims character.
‘Second Life’, in contrast, is used by participants with widely varying motivations. The motivations behind registering as a ‘Second Life’ user vary almost as much as does the motivation to use the Internet itself. Nowadays, ‘Second Life’ is not only used to play a game as a leisure-time activity. It is also used to exchange opinions, political or other; it is used as a platform for distance learning; and for some it has become a serious source of income.5 The following list gives an overview of the activities pursued on ‘Second Life’:
Arts and Creativity: Many of the participants of Second Life have a creative background. There is a large virtual community of artists and designers who use ‘Second Life’ to demonstrate and advertise their real-world art and to create new virtual art. Almost all the virtual objects in ‘Second Life’ are user-generated content. Linden Lab has installed a 3D modelling tool that allows a participant to build virtual buildings, landscapes, furniture etc.
Politics: As it gains importance in all fields of everyday life, ‘Second Life’ has also developed into a forum for politics. Demonstrations have been held in front of the virtual White House protesting against the Iraq war, with the virtual press ‘Avastar’ reporting the event. There was a gathering to commemorate the victims of the Blacksburg incident, and in the 2007 elections in France the candidate and current president Nicolas Sarkozy extended his campaign to the Internet when he set up a virtual headquarters for his political party in ‘Second Life’.
Businesses and Organization: For many, ‘Second Life’ has become an important market place to conduct business. Not only do they trade virtual objects for Linden Dollars, which can then be converted into real dollars, but they also conduct all sorts of online businesses that can be integrated into ‘Second Life’. Adidas, for example, advertises its shoes via a virtual shoe store in ‘Second Life’ and considers this campaign an important part of its online marketing.6 Furthermore, one can easily link the virtual shop to the website of an online shop where real-world goods can be ordered.7 Linden Lab is permanently improving the connectivity between ‘Second Life’ and regular websites.
Education: ‘Second Life’ has recently emerged as one of the cutting-edge virtual classrooms for major colleges and universities.8 Linden Lab has sold more than 100 islands for educational purposes, including projects from Harvard, New York University, Stanford University etc.
Play: Some participants only enter ‘Second Life’ to engage in playful activities and have no further purposes such as making new friends or doing business. Some islands are created as fantasy worlds, and participants are not restricted to any chosen shape - they can appear as humans but also as animals.
This list of different purposes demonstrates that Linden Lab’s online-platform hosts a huge variety of virtual activities and fulfils a different function for each player. ‘Second Life’ is a new kind of online platform that combines most of the features of previous platforms. It can function as a chat forum, a web community, an online shop or a Multiplayer Online Game and is therefore almost as versatile as the Internet itself.
To conclude, ‘Second Life’ cannot simply be classified as a Massive Multiplayer Online Game.9 Rather it is an online platform which, amongst other functions and services, provides the opportunity to engage in virtual role games or adventures with other ‘Avatars’. The future social and economic function or importance of ‘Second Life’ in everyday life can hardly be predicted from today’s vantage point.
The above analysis makes it clear that Linden Lab cannot be seen as the creator of an online game. Because of the complexity of ‘Second Life’ and the many different information services it integrates, it has taken on the role of a provider of information on the Internet. To be able to assess the liability of Linden Lab in the European context, it is necessary to examine Linden Lab according to the European system of provider liability.
Articles 12-15 of the European Directive on E-commerce 2000/31/EC10 introduced guidelines according to which the Member States could regulate the liability of information providers. A directive is not directly applicable in the Member States; it needs to be implemented by the national legislature.11 It thereby leaves a margin of discretion to the Member States and only provides a basic framework for the implementation. After analyzing the European system it is therefore necessary to examine the individual national implementations, although the European attempt to harmonize the law regarding provider liability has been quite effective.
Regarding provider liability, a major distinction is made between providers of their own information and providers of foreign information. Without stipulating this directly, the Directive provides special rules on liability only for providers of foreign content. Someone who presents or transmits his or her own information will be held liable according to the general principles of liability.
In contrast, for all types of providers of foreign information there is neither a general obligation to monitor the information which they transmit or store, nor a general obligation to actively seek facts or circumstances indicating illegal activity (Article 15, paragraph 1). The Directive distinguishes between several types of providers of foreign information according to their conduct. It speaks of four categories of providers: Access Providers, Caching Providers, Host Providers and Content Providers. However, it should be noted that these different categories only serve as a systematization of the law. Depending on a provider’s conduct, he could be Host and Access Provider at the same time. Therefore, in order to classify a provider, one must focus on the service he actually offers.
3.1.1. The Access Provider, Art. 12 2000/31/EC
A provider offering the transmission of foreign information in a communication network is usually referred to as an Access Provider. Frequently, an Access Provider or Internet service provider is a business or organization that provides consumers with access to the Internet and related services. In the past, most ISPs were run by phone companies. By definition, the Access Provider takes care of the technical aspects of access to information in a communication network. Article 12 of the E-commerce Directive stipulates that an Access Provider is not liable for the information transmitted, on condition that he does not initiate the transmission, does not select the receiver of the transmission and does not select or modify the information contained in the transmission.
3.1.2. Caching, Art. 13 2000/31/EC
Caching is provided where a service consists of the transmission of information in a communication network, as in Article 12, but involves the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making the information’s onward transmission to other recipients of the service more efficient. Since a provider which only temporarily stores information in order to forward it has no control over its content, a provider that undertakes the caching of information is also exempted from liability according to Article 13, paragraph 1 (a-e).
3.1.3. Host Provider, Art. 14 2000/31/EC
The information service provider that stores foreign information on its server and makes it available to third parties is usually referred to as a Host Provider. Content or information is entered by users who use the webspace to communicate, to store information, to do business etc. Prominent examples are providers of chat forums or of online auctions such as eBay.12
Many websites store or transmit huge amounts of data every day. It is therefore impossible for a Host Provider to be aware of all the information that is transmitted via the website or displayed or offered for downloading on it. To expect the host to control all content on his website would render the operation of huge business platforms such as eBay impossible; the E-commerce Directive, however, aimed at facilitating electronic commerce on a large scale.13
Article 14, paragraph 1 of 2000/31/EC therefore exempts the Host Provider from liability for any hosted content if the provider has no actual knowledge of illegal activity or information or upon obtaining such knowledge or awareness, acts expeditiously to remove the information or to prevent access to it. The Host Provider is therefore not obliged to actively monitor all the content transmitted or stored by his service. This privilege does not affect the possibility of requiring the service provider to terminate or prevent an infringement, or of establishing procedures governing the removal or disabling of access to information (Article 14, paragraph 3).
Obviously, the impact of the privilege of paragraph 1 depends a lot on the Member States’ use of their powers under paragraph 3. A wide interpretation of paragraph 3 could seriously undermine the scope of the privilege provided by paragraph 1 (Freytag, 2003, p.144). The duties imposed on Host Providers concerning the termination or prevention of an infringement will be the subject of the next paragraph. Especially the ‘prevention’ of an infringement could be problematic, since it addresses future violations of law which can hardly be avoided without imposing an obligation to actively monitor the information traffic, an obligation which the privilege provided by paragraph 1 seeks to avoid.
3.1.4. Content Provider
The E-commerce Directive does not directly mention the Content Provider. However, it is the implicit counterpart to an Access or Host Provider. If someone presents foreign content on his website, there must be a Content Provider providing this content. Since a Content Provider presents his own information which is completely under his own control there is no privilege or exemption regarding liability for him.
3.1.5. The horizontal approach
By determining provider liability independently from any specific field of law, the Directive takes a horizontal approach (Freytag, 2003, p.144). The guidelines on provider liability form a basis for liability in all fields of law, no matter whether the issue at hand falls within the area of criminal, civil or public law. A Host Provider for example should not be held criminally liable for libel because of a libellous entry in the chat forum provided by him if he was not aware of this entry. So before the liability according to the criminal, civil or public law is determined, in a first step the type of provider needs to be defined according to the implemented Articles 12-15 of the E-commerce Directive (Frydman and Rorive, 2002, p.54). They act as a filter that predetermines whether the Provider can be liable at all or whether he is exempted.
The Directive is silent on the provider’s obligation to retain and store information about his users. If a member of a chat forum is guilty of libel and is subject to state prosecution, the Host could be obliged to provide the authorities with all relevant information to permit the identification of the user. As a consequence, the Host would be required to operate a reliable registration system that allows the identification and location of a user when necessary. It is again up to the individual Member State to impose such a duty on a provider.
The German legislature implemented the provisions on provider liability first in §§ 6-9 MDStV and §§ 8-11 TDG. The two statutes have now been fused without any material changes into the Telecommunication and Media Law (TMG),14 which regulates provider liability in its §§ 7-10. From its structure and wording, the TMG remains very close to Articles 12-15 of the E-commerce Directive. The major distinction is made between providers of their own and of foreign content, as becomes clear from § 7 entitled ‘General Principles’: a provider of his own information is liable according to these general principles. Only the provider of foreign information as defined in §§ 8-10 TMG is not obliged to actively monitor its service for illegal content, as stated in Article 15 of 2000/31/EC.
The TMG then equally differentiates between the different services according to their activities. German literature and jurisprudence have adopted the same systematization, distinguishing between Access, Host and Content Providers and using the same definitions and privileges. § 8 deals with the conduct of the Access Provider, § 9 with Caching and § 10 with the conduct of the Host Provider.
In addition, the German legislature has chosen the same horizontal approach. The test as to whether a Provider is exempted from liability or not serves as a filter to liability in all fields of law (Schwarz and Nelles, 2007, Ch. 20-G, par. 42; Sieber and Höfinger, 2007, Ch. 18.1. par. 20-29; Sieber, 1999, p.114; Critical: Sobola and Kohl, 2005, p.445). Whether a Provider can be held liable e.g. according to public youth protection laws or criminal law depends on a first assessment of whether he is considered a Content Provider of the information in question or whether he is dealing with foreign information.
As is common in modern UK legislation, the UK legislature implemented the E-commerce Directive almost one to one in the Electronic Commerce (EC Directive) Regulations 2002.15 Articles 17-22 deal with the providers of foreign information. The definitions of the functions of Access Provider, Caching Provider and Host Provider are exactly the same, as is the regulation of liability. Article 17 provides for the same definitions and exemptions for the transmission of information in a communication network, as does Article 18, which addresses Caching. Article 19 contains the same exemption from liability for the storage of foreign information by a Host Provider. Moreover, it does not exclude the right of any party to apply to a court for relief to prevent or stop an infringement of any rights, notwithstanding this exemption.
Furthermore, the UK legislature followed the encouragement of the European Union and implemented a ‘notice and take down’ procedure in Article 22. The precise requirements for a valid notification will be pointed out below.
The French legislature implemented the Directive in a slightly different systematic manner. The whole system of liability for foreign content was incorporated into only one Article without any changes to its material content. In Article 6 paragraph 1 Nr. 1 and 2,16 the ‘loi pour la confiance dans l'économie numérique’17 defines Host and Access Provider and exempts them from liability for illegal content unknown to them. The French statute thereby refers to the same general principle as stated in Article 15 2000/31/EC, which exempts the intermediaries from the obligation of actively searching their websites for illegal content. In Article 6 paragraph 1 Nr. 5 the French statute contains the same provision to implement a ‘notice and take down’ procedure as the UK regulation.
After having discussed the different definitions, the question is as what kind of provider Linden Lab would qualify. As pointed out above, all European legal systems use identical criteria when it comes to distinguishing the different types of providers or their conduct. The crucial question therefore is how the activities and functions of Linden Lab would be classified under the European system of provider liability.
The first question is whether ‘Second Life’ is a provider of foreign information or whether the content it stores or displays can be attributed to it as its own information. ‘Second Life’ evolves according to the initiatives of the participants. The appearance of the ‘Second Life’ world is entirely determined by the registered users. They ‘build’ houses and landscapes; they determine their own appearance and their mode of interaction according to the options provided to them in the virtual world. Many functions ‘Second Life’ fulfils today, such as the use of virtual classrooms or universities as distance learning centres, had not been foreseen by the creators of the platform.18 And many new uses, not foreseeable at this point, are yet to be discovered. Linden Lab as the creator of ‘Second Life’ does nothing more than offer a webspace that gives the widest possible freedom of interaction and creativity to the user. To this end, Linden Lab is continually improving the different functions and the operability of ‘Second Life’; for example, the connectivity between the ‘Second Life’ platform and other websites has been improved in order to strengthen ‘Second Life’ as a market place.19 From this approach it becomes clear that Linden Lab can by no means be classified as a Content Provider. Suing Linden Lab for any content displayed, for example in a virtual classroom, would come close to suing a house owner who rented his property to a third party for any illegal activities of this party in his house.20
3.5.1. Host or Access Provider?
As Linden Lab cannot be classified as a Content Provider, it could either qualify as an Access or a Host Provider. To fall into the category of Access Provider, Linden Lab would have to transmit information in a communication network. Although one can argue that Linden Lab offers services to transmit different kinds of information, it is questionable whether ‘Second Life’ constitutes a ‘communication network’. The term ‘communication network’ encompasses communication services from different layers of Internet communication (Freytag, 2002, p.116). Providing a ‘communication network’ can mean providing a network to transfer data in a technical sense, but it can also mean providing a service for a user in an existing network, such as a mail server. In both cases, the provider offers a communication infrastructure, in one case on a technical level and in the other on a user level (Freytag, 2002, p.116). ‘Second Life’ basically is software that can be downloaded by everyone online. It does not connect recipients by providing some sort of communication infrastructure; it simply gives access to one coherent piece of software. The infrastructure needs to be there first in order to install and use ‘Second Life’. It is therefore comparable to communication services such as ‘MSN Messenger’ or to voice-over-IP services such as ‘Skype’, and these services are not considered ‘Access Providers’.
It is undisputed among the different European jurisdictions that providers of platforms for online auctions allowing third parties to auction their goods are classified as Host Providers (for Germany: Sobola and Kohl, 2005, p.444; Leible and Sosnitza, 2004, p.3225). Linden Lab therefore falls into the category of a Host Provider, since it combines the functions of chat forums, online shops or auctions etc. When the provider has no knowledge of an illegal activity taking place, holding him liable for those activities would be completely exaggerated.21 Linden Lab will therefore be subject to all legal rights and duties that are traditionally imposed on Host Providers in Europe.
4. Control of the platform: How to manage an online-platform and avoid Host Provider liability? – The ‘notice and take down procedure’
The fact that Linden Lab qualifies as a Host Provider under European law means, first of all, that it does not have a general obligation to monitor the content transmitted or stored via its platform or to actively search for illegal activities. Under all national laws analyzed so far, the Host Provider is privileged if he did not have actual knowledge of the illegal content and if, upon obtaining such knowledge or awareness, he immediately removed it or disabled access to it. The conditions of the privilege mean that a Host Provider does indeed have a restricted duty to control his platform and to manage it responsibly. Once the Host Provider has been notified of any illegal activity occurring on his platform and has positive knowledge of its location, a whole set of duties arises with regard to taking down the content and preventing future access to similar content.22
Article 15 paragraph 2 of Directive 2000/31/EC furthermore stipulates the duty of information service providers to inform competent public authorities of any illegal activities. In addition, Hosts could be obliged to operate sophisticated registration systems which would permit the identification of users when necessary. The European legislator refrained from including any details on a ‘notice and take down’ procedure in the E-commerce Directive, although the Member States are encouraged to implement such a system.23 The duties of Host Providers have therefore developed differently in the individual Member States, with the judiciary exerting a major influence on their scope and nature.
The German legislature did not expressly implement a ‘notice and take down’ procedure in the TMG, as was suggested by the European Commission. The statute itself does not stipulate how a valid notification should be drafted and in what manner the provider should react. Legal literature and court decisions have provided several guidelines. The notification has to be sent by a non-anonymous sender and has to clearly identify the illegal content, i.e. its nature and its precise location, e.g. by providing a link.24 The notification has to contain precise information and must not vaguely refer to the general appearance of illegal content on a website. The provider does not have to act upon vague or imprecise notifications. Basically, the stipulations are identical to those required by the UK or the French statute, as pointed out below.
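The criteria developed by the courts can be illustrated with a short sketch. This is purely illustrative: neither the TMG nor the case law prescribes any data format for a notification, and all field names and checks below are hypothetical assumptions, not taken from any statute or judgment.

```python
# Hypothetical sketch of the validity criteria for a takedown notification:
# non-anonymous sender, precise location of the content (e.g. a link), and
# a clear statement of the nature of the alleged illegality.
from dataclasses import dataclass

@dataclass
class TakedownNotice:
    sender_name: str     # the notification must not be anonymous
    sender_contact: str  # some way to reach the sender back
    content_url: str     # precise location, e.g. a link to the content
    description: str     # nature of the alleged illegality

def is_actionable(notice: TakedownNotice) -> bool:
    """A provider need not act upon vague or anonymous notifications."""
    if not notice.sender_name.strip() or not notice.sender_contact.strip():
        return False  # anonymous sender: no duty to act
    if not notice.content_url.startswith(("http://", "https://")):
        return False  # no precise location of the content given
    if not notice.description.strip():
        return False  # nature of the illegality unspecified
    return True
```

A notice stating merely that "your site hosts illegal content", without a link or sender identity, would fail these checks, which mirrors the rule that the provider need not react to imprecise notifications.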
4.1.1. The ‘eBay-judgments’
In addition, the nature and the scope of the legal duties of a Host Provider have been shaped by the so-called ‘eBay-judgments’.25 Users had offered plagiarized versions of Rolex watches on eBay and had been discovered by the owner of the Rolex trademark. The owner sued not only the actual user and offeror of the plagiarized goods but also ‘eBay’ for having provided a market place for illegal goods and having facilitated their distribution. The case went all the way up to the Supreme Court, which commented on the liability of the provider of ‘eBay’ and on the duties ‘eBay’ has to control the content on its platform. With reference to the E-commerce Directive, the Court pointed out that ‘eBay’ as a Host Provider has no obligation to actively monitor every offer before it is displayed. Such a duty would render the operation of the whole platform impossible.26 However, the interest of the Host Provider in operating his platform does not outweigh the interest of the owner of a trademark right in selling his goods.27 To balance the two interests fairly, the Host Provider has the obligation to react upon being notified of the illegal content. The Court established two duties:
to take down the content immediately
to prevent similar illegal content being displayed in the future
In the case at hand, ‘eBay’ was obliged to check all future offers of Rolex watches for trademark infringements. Unfortunately, the judgment remains vague with regard to further technical details of the provider’s duty to control content. It was left to ‘eBay’ to install particular filters or take other measures to control its platform. The precise legal basis for this duty to prevent future infringements remained unclear. The language of the court with regard to the preventive duties is therefore rather cautious and ambiguous.28
A similar judgment by the District Court of Hamburg involving plagiarized perfume29 shed a little more light on the monitoring duty by pointing out that any filter used to detect illicit offers must function in a preventive manner. The auctioneer had installed a filter that detected offers of plagiarized perfume. The filters were held to be insufficient because they were only capable of detecting an offer after it had been displayed.30 Since, however, the Host Provider has a duty to prevent the reappearance of similar offers, the filter would have had to detect offers before they were displayed on the website in order to be effective.
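The distinction the Hamburg court drew can be sketched as follows. This is a hypothetical illustration only: simple keyword matching stands in for whatever detection technique a real platform would employ, and the function and term names are invented, not drawn from the judgment.

```python
# Hypothetical contrast between a reactive filter (which scans listings only
# after publication, held insufficient) and a preventive filter (which checks
# an offer BEFORE it is displayed, as the court required).

BLOCKED_TERMS = {"replica rolex", "rolex copy"}  # example terms only

def violates_policy(offer_text: str) -> bool:
    """Stand-in for a real detection technique: naive keyword matching."""
    text = offer_text.lower()
    return any(term in text for term in BLOCKED_TERMS)

def submit_offer(offer_text: str, published: list) -> bool:
    """Preventive check: a violating offer is rejected and never displayed."""
    if violates_policy(offer_text):
        return False  # blocked before display; reappearance prevented
    published.append(offer_text)  # only clean offers reach the platform
    return True
```

The key design point is where the check sits: a filter that merely scans `published` afterwards would, on the court's reasoning, fail the duty to prevent the reappearance of similar offers.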
4.1.2. Libel in chat forums
A second group of cases focused on libellous statements in chat forums.31 Again, the Host Provider was obliged to remove the libellous content and to monitor his forum for similar content in the future, and again the technical means for doing this were not specified.32 Regarding chat forums, it has also been discussed whether the provider should be obliged to set up a registration system allowing the identification of users guilty of libel. So far this question has not been decided conclusively. The Higher Regional Court of Düsseldorf ruled that a provider can be requested to operate such a registration system.33 If he is unable to provide any information about the user, he will be the only party subject to injunctive relief by the court, and he cannot pass the liability on to the user. Unfortunately, the Court did not make clear whether it considers such a registration system mandatory, nor has this point been decided by any higher court.
The only statement that can safely be made regarding user registration is that a non-anonymous and reliable registration system is advisable, so that the Host Provider is not the only entity the court can get hold of in case of a legal dispute.
4.1.3. Which duties would Linden Lab have?
The crucial question now is how the rules developed for chat forums or for ‘eBay’ as an online auction house can be transferred to ‘Second Life’. It is undisputed that, upon receiving a valid notification, Linden Lab would be obliged to take down the illegal content. But what about the duties, pointed out in the ‘eBay’ judgments, to prevent future infringements? To date, their scope and nature have not been fully clarified. Once Linden Lab has been notified of illegal content involving pornography or child pornography, does this mean it would have to scan its platform for all future pornographic offers in order to avoid liability?
Especially since the legal basis of the preventive duties is unclear in view of the wording of the TMG, they should not be overemphasized or extended further than the Supreme Court judgment requires. In the judgment, the duty to control future content was limited to offers of very similar content, in that case other offers of Rolex watches.34 The structure of the ‘eBay’ platform easily allows the installation of filters, since the offers displayed are of a very homogeneous nature. The judgment emphasizes several times that a duty to monitor is limited by the technical possibilities of the provider.35 This double limitation, to similar content and to the technical options available, would also apply to Linden Lab.
It follows that Linden Lab would not be obliged to search the whole platform for pornographic content, as this would effectively invalidate its privilege as a Host Provider. It would probably have to control the webspace of the user who had displayed the illegal content, or any location where similar content would be likely to appear again. The measures used, such as filters, would depend on the technical possibilities Linden Lab has, although of course a provider cannot hide behind a technical problem to avoid liability.36 The duty of preventive control formulated in the ‘eBay’ judgment should rather be understood as ‘not turning a blind eye to illegality’ than as a drastic extension of the existing duties. It simply means that the Host Provider must not hide behind his privilege if it is obvious to him that future infringements might occur.
In contrast to the German statute, the French legislature implemented a provision on the ‘notice and take down’ procedure. The French legislature was obviously inspired by the US Digital Millennium Copyright Act, which serves as a model for all such ‘notice and take down’ procedures. Before the implementation of this procedure, providers of foreign content were obliged to remove illegal content only upon receipt of a judicial order (Renard and Barberis, 2003, p.135). Article 6 paragraph 1 Nr. 7 restates the principle of Article 15 of the E-commerce Directive that the Host or Access Provider has no obligation to actively monitor content transmitted or stored on his website. No obligation to prevent future infringements can be deduced from the language of the statute.
However, apart from these general features, the French legislature has added a few new aspects which are not part of the German or UK statute. Pursuant to Article 6 paragraph 2, for example, the Host Provider is legally obliged to obtain and keep data on its users that allows their identification in case of a dispute.37 This provision, if implemented effectively, will be a great asset for the handling of liability cases involving a Host Provider. Otherwise, if the Host cannot be held liable because he is privileged and the Content Provider cannot be identified, the injured party or the public prosecution would have to close proceedings. To deter third parties from filing incorrect notifications merely in order to stop the diffusion of information, the filing of abusive notifications is considered an offence and can be punished with a year of imprisonment and a fine of 15.000 Euros. According to Article 6 paragraph 7 sub-paragraph 4, the Provider of foreign content has to inform the public authorities when he receives a notification. In addition, he has to install a mechanism that makes it easier for users to send a notification.38
In conclusion, the French implementation does contain a few elements which are different from the UK and German statutes but with regard to the duties of Host Providers it presents no material differences.
The Electronic Commerce (EC Directive) Regulations 2002 also deviate from the Directive in that they are much more precise in defining when the Host Provider can be considered notified. The UK legislature followed the European Commission’s initiative39 and implemented a ‘notice and take down’ procedure for illicit information similar to that provided in the US Digital Millennium Copyright Act. Similar to the French statute, Regulation 22 sets out the precise conditions of a notification that obliges the provider to take down the content. Although the German statute contains no comparable provision, the German courts have essentially interpreted the requirements for a notification in the same way as the UK statute stipulates.
Recent occurrences demonstrate that virtual or real pornography, including child pornography, has been displayed via ‘Second Life’. Child pornography is banned outright, but pornographic content is not illegal in general, although access by minors is regulated by youth protection laws. Part of the media attention in recent weeks has centred on the question of whether the age verification system operated by Linden Lab fulfils the requirements of youth protection laws, in particular the strict conditions of the German legislation. Several institutions, such as JusProg e.V., an association dedicated to youth protection, have already put ‘Second Life’ on ‘black lists’ for providing content harmful to children and teenagers.40
As a reaction to the concerns regarding the inadequate protection of children and teenagers, Linden Lab has already improved its age verification system.41 In April 2009 Linden Lab published upcoming changes for adult content.42 Three measures are intended to ensure that only adults have access to adult-oriented content. Once the implementation process begins at the end of June, all regions will be maturity-rated, search results will be filtered for everyone, adult content on the mainland will be moved to the new ‘Adult Only’ continent, and access to adult regions and search results will be limited to Residents with verified accounts. To obtain an age-verified account, Residents must provide a few simple details about their identity, generally name, date of birth, and address. Additionally, Residents will be asked to provide specific identifying information, such as a driver’s licence number, passport or national ID card number, or the last four digits of a social security number (depending on where the Resident is geographically based). This is then cross-checked against pre-existing databases of public record to verify that Residents are of legal age.43
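The date-of-birth part of such a check can be sketched as follows. This is a hypothetical illustration only; Linden Lab’s actual cross-check against public-record databases is not reproduced here, and the function name is an assumption.

```python
# Hypothetical sketch of the age calculation behind an age-verified
# account: has the Resident turned 18 by the date of the check?
# (The real system additionally cross-checks ID numbers against
# public records, which this sketch does not attempt.)

from datetime import date

def is_adult(birth: date, today: date, majority: int = 18) -> bool:
    """True if the person has reached the age of majority by `today`."""
    years = today.year - birth.year
    # subtract one year if this year's birthday has not yet occurred
    if (today.month, today.day) < (birth.month, birth.day):
        years -= 1
    return years >= majority
```

Of course, a stated date of birth is only as trustworthy as the identity documents behind it, which is precisely why the German courts discussed below demand more than self-declared data.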
With the implementation of the ‘Treaty on youth protection in the media’44 (hereafter referred to as the JMStV), the competence to regulate youth protection in media and information services has been redefined in Germany (Hartstein, Ring, Kreile, Dörr and Stettner, 2005, Ch 1.3.1, p.1). The individual federal states now have the competence to regulate youth protection in the context of electronic media. The regulation of media also covers carrier media, such as films, videos, DVDs and all sorts of video games.
The Treaty lists content which is considered harmful to children and teenagers. The distribution or making available of such content via any electronic medium amounts to a summary offence and will be sanctioned with a fine of up to 500.000 €. § 4 distinguishes between content which is considered so severely harmful that under no condition may it be distributed or made available via electronic media (paragraph 1) and content which may be distributed or made available via an electronic medium, such as the Internet, but from which children and teenagers must be barred (paragraph 2). If such absolutely forbidden content, like obscene depictions involving children and teenagers or ‘hard pornography’, is detected on a website, liability can arise for the Content Provider or, after notification, for the Host Provider. Content listed in paragraph 2 may only be made accessible to a restricted group of users. In addition, the Provider is obliged to install an effective age verification system which ensures that only adult users can access the information.
The required standard for such a system to be considered effective is extremely high. Several state courts have decided that systems requiring the registrant to identify himself by typing in the digits of an ID number are insufficient, even in combination with a credit card number.45 To guarantee that no minors access the information, personal contact with the client to verify his age is required.46 The provider must contact his clients personally or must establish a so-called Post-Ident procedure whereby the client’s identity is verified by a local post office (Hartstein, Ring, Kreile, Dörr and Stettner, 2005, Ch.3. § 4 JMStV, p.60-66; Nikles, Roll, Spürck and Umbach, 2005, p.207). The age verification system currently operated by Linden Lab obviously does not fulfil these requirements. If pornographic or indexed content is found without the protection of an effective age verification system, the same liabilities arise as for the distribution or making available of forbidden content.
5.2.1. Who is obliged to install an age verification system?
As pointed out above, Linden Lab does not qualify as a Content Provider but only as a Host Provider. To guarantee the uniformity of the law, the legislature has taken a horizontal approach when regulating Provider liability. The privileges and exemptions for Host Providers are therefore applicable to the law of youth protection as well. The main message of § 10 TMG is that a Host Provider has no general duty to control foreign content on his website or platform. He will not be held liable for any illicit content on his website except when his attention has been drawn to it. This policy would be circumvented if he could be held liable for such content under the JMStV or criminal law. The function of §§ 7-10 TMG as a filter for liability in all fields of law is often overlooked but generally undisputed.47 As a consequence, Linden Lab, as a Host Provider, is not responsible for the installation of an effective age verification system as long as its offer cannot be qualified as one generally containing harmful content as listed in § 4 JMStV. The above analysis shows that the information service offered by ‘Second Life’ is a very content-neutral one. It is impossible to make a general statement regarding the harmfulness of ‘Second Life’ content because of the huge variety of information and services provided. To label ‘Second Life’ a pornographic offer or service simply because one user opened a virtual cinema for pornographic movies would not be in accordance with the ‘virtual reality’ of ‘Second Life’. As the Host Provider of a content-neutral information service, Linden Lab would not be responsible for installing an effective age verification system.48
5.2.2. The ‘Second Life’ dilemma
According to the logic of the JMStV, the Content Provider would then be the one who has to fulfil the requirements and install an effective barrier to keep children and teenagers from downloading or viewing illicit content. In this case the Content Provider is the individual user who displays or offers pornographic content for download. In the current system, the only means he has to block minors from viewing his content is to flag his island as ‘adult content’ so that it is not visible to minor users. But a user can only be identified as a minor by a verification system which, under German youth protection law, is unsatisfactory because it can be circumvented too easily. We therefore find ourselves in a dilemma: the party with the technical means to install such a system is not obliged to do so by law, while the party that would theoretically be obliged lacks the means, since he cannot define the criteria for accessing ‘Second Life’.
This problem can only be solved either by Linden Lab installing a strict age-verification system (although it is not obliged to do so) or by changing the structure of the platform so that each user can determine the access criteria for his island. Given the importance attached to the protection of minors, Linden Lab cannot expect to remain exempt from the duty to install a satisfactory age-verification system.
The youth protection laws of other European countries contain similar restrictions. However, it should be noted that the German youth protection law is one of the most stringent and most advanced with regard to the control of content on the Internet.49
As a Host and not a Content Provider, Linden Lab cannot be held liable for illicit content transmitted or made available via its website. However, it should avoid structuring its platform in a way that facilitates or even promotes the exchange of such content. The structure and the rules governing entrance to and activities in ‘Second Life’ are ‘content’ provided by Linden Lab. If anything in the structure or rules of the platform induces or facilitates the exchange of illegal content, Linden Lab might be held liable for the damage caused. Since ‘Second Life’ is a new phenomenon, the law has not yet found a way of appropriately distributing responsibilities within the platform. The individual user creates content more independently than in a chat forum or on a business platform; however, he does not act as independently as if he were operating his own website. Linden Lab is a bit of both: on the whole it is a Host Provider, but with regard to the presentation of the ‘virtual world’ and its basic features it also acts as a Content Provider.
Until the law has settled this issue, Linden Lab should do everything possible to prevent the exchange of illegal content on its platform. Adapting the design of the platform to create a safer virtual environment would be a good preventive measure. With regard to the problem of virtual child pornography, for example, it has been suggested that the platform be modified so that sexual interaction between child-like and adult ‘Avatars’ becomes impossible, e.g. by offering children’s shapes only without specific sexual attributes. There is, after all, no legitimate reason why child ‘Avatars’ should engage in sexual activities at all. Another option would be a restructuring of the platform so as to allow a better distribution of responsibilities. The options available for regulating access to one’s webspace could, for example, be extended instead of leaving the question of age-verification to the self-regulating procedures of the web community.
A system restricting or limiting anonymous registration in ‘Second Life’ could also signal disapproval of any illegal activities taking place there. None of the suggestions above is at present directly required by statute; they serve only as inspiration for future legal developments.
‘Second Life’ is a new phenomenon in the virtual world and, due to its complexity, cannot be compared to other online services, such as chat forums, or to online business platforms, such as eBay. It combines a whole range of different services which were previously provided separately on the Internet. Each ‘Avatar’ might have a different motivation for using ‘Second Life’ and might make use of different functions of the platform. Since most of the content in ‘Second Life’ is created by the users, Linden Lab should be seen as a Host Provider. In its role and function it is comparable to the Provider of chat forums or business platforms. The fact that ‘Second Life’ combines many services, or even adds formerly unknown ones, does not affect this classification. As a Host Provider, Linden Lab has no general obligation to monitor the content displayed in ‘Second Life’. To avoid liability, the Provider merely has the legal obligation to act on notification and remove the illicit content. Further legal obligations might arise to prevent the reappearance of similar content.
Problems could arise from European youth protection laws, especially from the strict requirements of German youth protection law on age-verification systems. Linden Lab, in its position as the Host Provider of a content-neutral platform, is not legally obliged to install an age-verification system. However, technically no one but Linden Lab is able to install such a system in ‘Second Life’; the individual user can only flag his content as ‘adult content’. No European youth protection law is going to be satisfied with a system of self-regulation with regard to age-verification. In the long run, this problem can only be solved by changing the structure of ‘Second Life’ and giving the individual user the option of installing a sufficient age-verification system.
1 http://news.bbc.co.uk/2/hi/technology/6638331.stm (last accessed: 24-04-09); http://news.scotsman.com/scitech.cfm?id=735282007 (last accessed: 24-04-09); http://www.tagesspiegel.de/weltspiegel/Welt;art118,1994755 (last accessed: 24-04-09).
3 Although Hopf and Braml (2007) p. 358 assume that ‘Second Life’ is an online game, they at the same time emphasize that the user is the actual creator of the game.
5 Anshe Chung became world famous by making a million dollars through real estate businesses in ‘Second Life’, see http://www.anshechung.com/include/press/press_release251106.html (last accessed: 24-04-09).
6 http://notizen.typepad.com/aus_der_provinz/2006/09/adidas_wie_brin.html (last accessed: 24-04-09).
7 http://notizen.typepad.com/aus_der_provinz/2006/10/virtuelle_welte.html (last accessed: 24-04-09).
10 Directive 2000/31/EC of the European Parliament and of the Council of June 8, 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market, OJ L 178/1, published July 17, 2000.
11 Art. 249 par. 3 Treaty establishing the European Community, OJ C 321 E/1-331, published December 29, 2006.
12 For the qualification of eBay as a Host Provider see the judgement from the German Supreme Court, BGH (2004) Az. I ZR 304/01.
13 Directive 2000/31/EC, s. 1-5.
14 Telemediengesetz, February 26, 2007, BGBl. I, p. 179.
16 Art. 6 was modified by loi no. 2007-297, March 5, 2007, Art. 40 I.
17 Loi pour la confiance dans l’économie numérique, loi no. 2004-575, June 21, 2004, consolidated version of March 7, 2007; accessible at: http://www.legifrance.gouv.fr/texteconsolide/PCEBX.htm (last accessed: 24-04-09).
18 Hopf and Braml (2007) p. 360 qualify ‘Second Life’ as an online game, but in the next paragraph they point out that the players themselves are the ‘creators of the game’, whose creations can even be protected by copyright. In terms of the legal assessment of liability, this classification is a contradiction in itself. Either Linden Lab is the creator of ‘Second Life’, in which case one could consider it an online game created by one person, or one sees the individual players as creators, which immediately turns it into an online platform providing foreign content.
19 http://notizen.typepad.com/aus_der_provinz/2006/10/virtuelle_welte.html (last accessed: 24-04-09).
20 BGH (2004) Az. I ZR 304/01, par. 46.
21 Directive 2000/31/EC, s. 40-42.
22 The use of the term ‘prevent’ in Art. 14 par. 3 2000/31/EC indicates that there might be an obligation with regard to future content, independent of the present notification. Indeed, a generous interpretation of Par. 3 could seriously undermine the privilege granted in Art. 14 par. 2.
23 The idea of a so-called ‘notice and take down’ procedure was introduced by the Digital Millennium Copyright Act (DMCA), s. 512, which provides clear guidelines on the nature of a notice and the information it must contain, Sec. 512(c) par. 3 (A). The European Commission consciously refrained from giving such guidelines in the Directive and encourages the Member States to draft them themselves, see Directive 2000/31/EC, s. 40; Rücker (2005) p. 354.
24 For a discussion of the requirements see: Sobola and Kohl (2005) pp. 446-447; OLG München (2006) Az. 6U 675/06 emphasizes that the notification also has to clarify the illegality of the content. The case dealt with the presentation of copyright-infringing content. The Host Provider could not be held liable because it did not become clear from the notification that the content really was infringing and thereby illegal.
25 BGH (2007) Az. I ZR 35/04; BGH (2004) Az. I ZR 304/01; OLG München (2006) Az. 29 U 2119/06; LG München (2006) Az. 21 O 2793/05; LG Hamburg (2005) Az. 312 O 753/04.
26 BGH (2004) Az. I ZR 304/01, par. 46.
27 With the same line of argument BGH (2004) Az. I ZR 82/01.
28 BGH (2004) Az. I ZR 304/01, par. 46.
29 LG Hamburg (2005) Az. 312 O 753/04.
30 LG Hamburg (2005) Az. 312 O 753/04.
31 OLG Düsseldorf (2006) Az. I-15 U 180/05; OLG München (2006) Az. 6U 675/06.
32 OLG München (2006) Az. 6U 675/06.
33 OLG Düsseldorf (2006) Az. I-15 U 180/05.
34 BGH (2004) Az. I ZR 304/01, par. 46.
35 BGH (2004) Az. I ZR 304/01, par. 46.
36 On the problems arising from the structure of ‘Second Life’ and its hostility to technical control measures, see section VI.
39 Directive 2000/31/EC, s. 40.
40 http://www.heute.de/ZDFheute/inhalt/28/0,3672,5264828,00.html (last accessed: 24-04-09).
41 https://blogs.secondlife.com/community/features/blog/2007/05/05/age-and-identity-verification-in-second-life (last accessed: 24-04-09).
42 https://blogs.secondlife.com/community/community/blog/2009/04/21/update--upcoming-changes-for-adult-content (last accessed: 24-04-09).
43 https://support.secondlife.com/ics/support/default.asp?deptID=4417&task=knowledge&questionID=6010 (last accessed: 24-04-09).
44 Jugendmedienschutz-Staatsvertrag (JMStV), entered into force April 1, 2003, available at: http://www.hessenrecht.hessen.de/gesetze/Staatsvertraege/71-G_JugendschutzSV/staatsvertrag/staatsvertrag.htm (last accessed: 24-04-09); Parallel to the introduction of the treaty, the federal youth protection law ‘Jugendschutzgesetz (JuSchG)’ was amended, see BGBl. I, p. 2730.
45 BGH (2007) Az. I ZR 102/05; VG München (2007) Az. M 17 S 07.144; KG Berlin (2005) Az. W 31/05; LG Krefeld (2004), Az. 11 O 85/04.
46 See for example: VG München (2007), Az. M 17 S 07.144.
47 Hopf and Braml (2007) p. 358 for example completely ignore the significance of the §§ 7-10 TMG when discussing the liability questions of ‘Second Life’.
48 The same answer would be given to the question of a criminal liability of Linden Lab according to §§ 184c, 184b par. 1 StGB for making available virtual or real pornographic content.
49 Traditionally, youth protection laws focus on media such as films, videos or computer games based on a carrier medium and restrict their sale to people above 18.
Freytag, S (2002), ‘Harmonisierung der Providerhaftung’ in Lehmann, M (ed.) Electronic Business in Europa (München: Verlag C.H. Beck).
Freytag, S (2003), ‘Zivilrechtliche Providerhaftung’ in Heermann, P W and Ohly, A (ed.) Verantwortlichkeit im Netz - Wer haftet wofür? (Stuttgart, München, Hannover, Berlin, Weimar, Dresden: Richard Boorberg Verlag).
Frydman, B and Rorive, I (2002), ‘Regulating Internet Content through Intermediaries in Europe and the USA’, Zeitschrift für Rechtssoziologie 23, p. 41.
Hartstein, R; Ring, W-D, Kreile, J, Dörr, D and Stettner, R (2005), Jugendmedienschutz-Staatsvertrag, Rundfunkstaatsvertrag Kommentar III (Heidelberg, München, Landsberg, Frechen, Hamburg: Verlagsgruppe Hüthig Jehle Rehm).
Hopf, K and Braml, B (2007), ‘Virtuelle Kinderpornographie vor dem Hintergrund des Online-Spiels Second Life’, Zeitschrift für Urheber- und Medienrecht (ZUM), 354.
Leible, S and Sosnitza, O (2004), ‘Neues zur Störerhaftung von Internet-Auktionshäusern’, Neue Juristische Wochenschrift (NJW), 3225.
Nikles, B W, Roll, S, Spürck, D and Umbach, K (2005), Jugendschutzrecht (München/Unterschleißheim: Luchterhand).
Renard, I and Barberis, M A (2003) in Spindler, G and Börner, F (ed) E-Commerce-Recht in Europa und den USA (Berlin, Heidelberg, New York: Springer Verlag).
Rücker, D (2005), ‘Notice and take-down Verfahren für die deutsche Providerhaftung?’, Computer und Recht (CR), 347.
Schwarz, M and Nelles, K (2007) in Schwarz, M and Peschel-Mehner, A (eds.) Recht im Internet – Der große Rechtsberater für die Online-Praxis (Frankfurt a.M.: Verlag Recht und Wirtschaft).
Sieber, U (1999), ‘Verantwortlichkeit im Internet’ (München: Verlag C.H. Beck).
Sieber, U and Höfinger, F M (2007), in Hoeren, T and Sieber, U (ed) Handbuch Multimedia Recht – Rechtsfragen des Elektronischen Geschäftsverkehrs (München: Verlag C.H. Beck).
Sobola, S and Kohl, K (2005) ‘Haftung von Providern für fremde Inhalte’, Computer und Recht (CR), 443.