The issue of responsibility for governing virtual worlds has implications on multiple levels. First, virtual worlds are not bound by the same lines of jurisdiction as other, more traditional environments. By the very nature of their existence, virtual communities transcend the physical limitations of geography. For example, five Second Life avatars occupying the same plot of simulated land, or sim, may be controlled by actual world residents located on five different continents. Determining what actually constitutes illegal behavior among those five, and governing their interactions accordingly, is no simple matter. If, for example, their interactions are sexual in nature, the question arises of which jurisdiction’s laws should apply. In the United States, First Amendment protections allow “age-play,” or virtual sexual encounters with childlike avatars. Other countries may not share the same view of freedom of expression and thus may deem such activity illegal. Furthermore, one or more of the residents involved in such interactions may actually feel harassed or offended by them. Boellstorff notes that, while “age-play” may not be a particularly uncommon occurrence in certain virtual circles, “for many residents virtual pedophilia [is] highly discomfiting, not least because in some jurisdictions even simulated sex with minors could be a crime” (Boellstorff 164). Even where simulated sex with minors is not illegal, concerns may arise about the kinds of individuals drawn to such behavior and whether this ostensibly victimless virtual behavior would or could translate into actual world behavior.
The second issue with governance is the gap between law and contractual agreement. Most virtual worlds have contractual agreements between users and governing bodies, such as Linden Lab, the San Francisco-based creator and maintainer of Second Life. Users must agree to the terms of service, which typically dictate the kinds of behavior that are deemed “illegal,” such as hacking, sexual and other forms of harassment, or threatening other players. Other rules may be irrelevant to anything outside of in-world activity, such as prohibitions on building certain types of virtual designs in restricted areas of Second Life sims; typical laws would not apply to these. With regard to the breaking of virtual rules, “the imposition of punishment for the breach of internal rules exists in a difficult conceptual gap between criminal law and the predominantly compensatory remedies of contractual doctrine” (Suzor §1). Punishment for an offense, then, is determined by which side of that gap the offense falls on. Whether a crime is actual, virtual, or both must be determined before any disciplinary action, should it be required, is enforced.
The last issue with governance is crime reporting. In the actual world, a victim, observer, or bystander typically must report a violation or crime after it occurs. Occasionally, a criminal may choose to turn himself in without law enforcement even being aware of a crime. The virtual world, however, offers a unique tool for rule enforcement that is absent from the actual world, at least in most places: continuous activity monitoring and logging. Because the whole of a virtual environment’s existence is possible only through the use of computers, that existence can be logged and monitored. Granted, some worlds, such as Second Life or World of Warcraft, are so large that the task is no small order. Yet the storage capacity of today’s servers is vast, and while manually reviewing logs can be tedious, mechanisms exist that allow automated scripts to seek out specific types of information. This ability is typically not afforded to law enforcement outside of electronic crime investigations.
The question remains, however, whether users would agree to continuous monitoring, even for security purposes. Because Internet users, at least those in the West, are accustomed to “freedom of information” and minimal censorship, many would choose a virtual option that does not track their activity over one that does. So, while the capability for continuous monitoring may exist, acceptance of such a process would likely be quite limited, even if it meant an increased sense of safety.
Perhaps the one area where most people would not argue against increased legal protections is crimes against children. Joan Shaughnessy points out that existing laws against child exploitation overlap with the virtual sex act of age-play. She found that “in many states, an adult who engages in virtual sex with a child may be subject to prosecution,” and that in some other cases, “the law has criminalized sexual behavior aimed at children, even if no touching is involved” (Shaughnessy). Laws such as these can be applied to virtual crimes against children, or wherever children can be harmed, even emotionally. Though these laws were not written specifically with virtual crimes in mind, they can be applied because of the actual world developmental effects that such activity can have on a child. Perhaps most important to note is that in virtual environments, sexual interaction between users is direct interaction, whereby users direct language and actions at a particular recipient or multiple recipients. In summation, and with regard to crimes against children, Shaughnessy notes that if such a recipient is a child, “the child has suffered from the very type of harm the criminal law intends to prevent” (Shaughnessy).
Boellstorff, Tom. Coming of Age in Second Life: An Anthropologist Explores the Virtually Human. Princeton: Princeton University Press, 2008.
Shaughnessy, Joan. “Protecting Virtual Playgrounds: Children, Law, and Play Online.” Washington and Lee Law Review 66 (Summer 2009): 995. Accessed April 28, 2012.
Suzor, Nicolas. “Order Supported by Law: The Enforcement of Rules in Online Communities.” Mercer Law Review 63 (Winter 2012): 523. Accessed April 28, 2012.