Imagine sitting in a seat and looking down at hands that are not your own. You’re wearing clothes that are not your own, and you’ve invaded someone else’s personal space. Next to you stands a person you’ve never seen before, stroking your shoulder, then slapping your face. If you were wearing a virtual reality (VR) headset and experiencing this simulation, research shows you’d likely experience the body transfer illusion, feeling the sting on your cheek and the cardiovascular response that comes from being struck. Other studies have shown the illusion works for all sorts of sensations, and because its effects can set in just moments after entering a simulation, VR is raising all sorts of ethical concerns for hardware and software developers across the tech sector and the entertainment industry.
VR has been promised to the public in some form for decades by tech companies, and the dream has been kept alive in books and movies, but the earliest sincere effort came in July 1995, when Nintendo released the Virtual Boy, its first 3-D console. The red-and-black color scheme was supposed to simulate depth as Nintendo classics and new games were adapted for the console, but it didn’t work as intended, and sales showed it. By December it was pulled in Japan, and by March 1996 it had disappeared from North America as well. It would take another 20 years of patience before viable alternatives hit the market, and once again it’s the video game industry that’s making the most of the concept. With this unknown technology making its way to consumers, the role of the developers of these platforms and their software comes into question. Do they have an inherent responsibility to their consumers and audiences? Are there interests that may seek to limit that responsibility in favor of some other motive? And what is the future of this technology should they choose to ignore their role while so much of it remains untested?
Tennis for Two
On January 1, 1993, Wired’s first issue featured “War is Virtual Hell,” penned by journalist, author, and cyberpunk authority Bruce Sterling. In the piece, Sterling wrote about his visit to the Combined Arms and Tactical Training Center at Fort Knox. There he witnessed soldiers train for operations and combat with video games. For readers of that first issue, it was arguably a radical idea to consider the United States military utilizing what’s still seen as a medium for diversion as a means to prepare soldiers for combat. In truth, the video game medium wouldn’t have formed without it.
Tennis for Two, created in 1958, is cited as the first video game in existence. This prototypical Pong, playable on an oscilloscope and available to the public, was designed by scientists at Brookhaven National Laboratory at Camp Upton, Long Island, New York. Leading the design team was physicist William A. Higginbotham, a name forever attached to the creation of the first nuclear bomb. Though Higginbotham later went on to denounce nuclear weapons alongside fellow members of the Federation of American Scientists, his other invention created an entirely new medium for the US military to develop.
Were Sterling to pen his Wired article today he’d likely head to the Center for Applied Strategic Learning at the National Defense University, where the Department of Defense foots the bill for soldiers of all ranks from allied nations to come and train for conflicts with video games. Or he’d head to the University of Southern California and watch members of the US Army help developers working in the industry design more believable video games and VR tech at the Institute for Creative Technologies, creating new training methods and sharing insights with development studios.
It’s a beneficial relationship for both parties. The military becomes more technologically adept to carry out its objectives through VR, and those developments help developers and publishers secure their piece of the $90 billion video game market. And one of the greatest success stories from this relationship is one of the most profitable franchises in video game history.
While there have been war video games since the early days of Atari, none has been as profitable or as visible as the Call of Duty (COD) series, particularly over the last 10 years. Since the release of Modern Warfare in 2007, COD has sold more than 220 million units of its base games, and more than 250 million over the franchise’s lifetime. Those figures do not account for sales of downloadable content, merchandise, and licensing.
Sledgehammer Games, one of several studios that work on the COD franchise, is an example of how this special relationship gets even closer than sharing technology. When developing Advanced Warfare, Michael Condrey consulted with the Pentagon. In fact, there’s a lot of open communication between the two. In an interview, Condrey laid out how well the military fits in with different entertainment mediums:
Often we are able to extend our network through existing relationships within the Call of Duty franchise. We worked with Mark Boal, writer of The Hurt Locker, and were put in contact with his retired Navy SEAL Team 6 adviser through shared contacts. Other times, we research experts in the field and reach out directly. Retired Delta commander Dalton Fury is an example. We read his book, Kill Bin Laden, and made an inquiry on his interest and availability.
So it’s clear the US military sees value in COD, even going so far as to invite one of its writers to advise on actual warfare. But why do gamers like it so much? It’s difficult to say.
Shooters are among the most popular genres of games, with a new big-budget title hitting shelves nearly every week, alongside updated content for online games supported by years’ worth of competition. But according to a 2015 survey by the Entertainment Software Association (ESA), after the cost of a game, the next deciding factor in a purchase is its story. The industry seems to understand this, which is why franchising is so popular and why spin-offs starring supporting characters are increasing in number. But more than familiarity with an IP, gamers seem to want a good story: the ESA found that when not playing video games, 50% of respondents spent their free time watching movies and 34% turned to TV.
There’s nothing special about COD’s stories. Though they’ve attracted talent like Kevin Spacey and Game of Thrones’ Kit Harington, these are very simple tales of explosive conflicts with thick borders that separate hero from villain and rarely leave a lasting impression.
COD isn’t alone in repeating its formula with different character and weapon skins while declining to introduce anything that would set it too far apart from what it has already done. It’s standard practice in the industry and has ensured positive numbers. Yet the drop in sales for Infinite Warfare compared to the rest of the series calls into question the allure of superficial shooters as games move to VR. But the prospect of doing something new doesn’t always pan out either.
Never Been So Much Fun
Sensible Software was a humble UK studio that might’ve been lost to time were it not for its 1992 hit Cannon Fodder. This satirical game poked holes in the seriousness surrounding the representation of war in movies and novels by showing that senseless violence deserves to be mocked. It caused one of the earliest and biggest video game controversies of its time and has lived on as a chapter of the industry’s collective history.
Though many, like the Royal British Legion, denounced Cannon Fodder’s use of the remembrance poppy as an insult to every fallen soldier, prompting the well-known disclaimer at the beginning of the game, it has since been reexamined as one of the first examples of the medium satirizing a serious topic and managing to make a point.
Other controversies would arise after Cannon Fodder went on to earn its place in gaming history, but 2009 brought a similar fight with a different outcome that illustrates the permanence of the attitude towards this medium.
Atomic Games, a studio with a long relationship with the US Marine Corps (USMC) that had developed several of its training simulations, announced the development of Six Days in Fallujah in 2009. The game would recreate Operation Phantom Fury, which took place in 2004, and was being developed with the procedural expertise and personal accounts of soldiers who had survived it.
From the outset, Six Days in Fallujah aimed to differentiate itself from other military shooters by developing mechanics that complemented the rules of engagement in use in Iraq at the time. This meant scenarios where suspected enemies were arrested rather than indiscriminately fired upon, and players had to continually communicate with other players or AI squadmates in order to progress through missions as a unit. If nothing else, the game certainly promised something new, and with veterans on staff, a working relationship with the USMC, and a story that aimed to honor one of the most involved campaigns of the War on Terror, Atomic Games assumed it had the support of the public and the military.
In reality, Six Days in Fallujah was hounded by controversy from the moment it was announced. Like Cannon Fodder, it was branded a crass representation of real-world events, wholly unfit to capture the essence of the real thing. Rather than back down, Atomic Games made its case to the media, inviting cameras into the studio to meet the veterans working with them and to show critics there was merit behind the story it wanted to tell. But after the media campaign, publisher Konami pulled Six Days in Fallujah from development. It never found another publisher.
The relationship between the US military, the video game industry, and the portrayal of war in video games has benefited both sides while creating a powerhouse of entertainment. At the same time, it has kept the medium constantly defending its legitimacy as respected media, because leaders in the industry preferred inoffensive content that maintained broad appeal, and mid-level studios tried to succeed by following that business model. But these boyhood power fantasies take on new significance in VR.
Though the research is still early, studies have found altruism spiking and prejudices being questioned when subjects were placed into convincing simulations. Knowing the effects VR can have on the player, producing an endless series of joyful jaunts through campaigns of war begins to seem both irresponsible and detrimental to the growth of the medium. But do developers have a responsibility to produce content that doesn’t treat industry relationships as sacrosanct, or that doesn’t simply generate the largest amount of chaos a console can handle? The answer depends on whether video games are considered art.
In 2012 the Smithsonian entered the debate with its Art of Video Games exhibit, showcasing the evolution of the medium and its rapid maturation alongside technology. In the years since, games like Resident Evil 7 have shown that VR can use embodiment to both tell a story and entertain; That Dragon, Cancer chronicled actual tragedy and offered players examples of genuine healing; The Last of Us mastered the forms of the classical arts and married them with gameplay conventions, earning awards and accolades at an unmatched rate. Despite this, the medium continues to be deemed frivolous, or accused of making gamers cruel if not outright more violent, keeping bad attitudes circulating within the industry.
Marketing firms, those who develop trailers for all the big AAA studios, don’t have the same hangups.
Trailers from AAA studios are often cut to fit comfortably alongside Hollywood blockbusters and low-budget indies. Main characters are put front and center, with the characters most important to them doling out relevant story beats in the couple of minutes allotted to sell the premise of the world and the game, often long before combat, weapons, or leveling systems are ever mentioned.
The business of selling a game is like that of a movie, putting the story above all else. Which makes sense considering that both are collaborative efforts.
Video games are made by programmers, writers, directors, modelers, actors, martial artists, stunt coordinators, athletes, cultural experts, animators, photographers, illustrators, musicians, and even singers. It’s a high-demand medium in need of artists with transferable skills and talents to relay the story to players as required.
While the artistic merit of video games may be contested, what is beyond dispute is that they are collaborative efforts by artists of many disciplines, which means the work they produce is subject to scrutiny.
In her essay “Social Responsibility and the Place of the Artist in Society,” found in Zones of Contention, Carol Becker, Dean of the Columbia University School of the Arts, explored the cost societies pay when artists merely entertain rather than reflect their surroundings:
In our reluctance to ask how work engages within the larger social context, we are attempting to protect art and the artist from censorship. In practice, however, we are participating in the bourgeois notion of the isolation of the artist from society and of so-called high culture from the debates of representation and plurality current in popular culture. Instead of healing the split between the flatness of mass media and the complexity of the art world, we are allowing the split to become an abyss. In our refusal to contextualize the work historically–not art historically, but world historically–we contribute to the relegation of art to the sphere of entertainment and commodification.
Becker’s argument accurately describes the morass of self-censorship that has helped the video game industry rise as a financial institution while keeping it regarded as little more than toys for decades. COD is just one example of how that self-censorship manifests, managing to avoid conversations of substance while pleasing as many consumers as possible. But VR changes the calculus: Becker notes that the emergence of new technologies changes the effectiveness of that approach and gives the artist a platform to create something with permanence.
In the case of military representation, moving away from this special relationship in the US, we have examples like the complex philosophy and politics set in the improbable theaters of war of Metal Gear Solid: Hideo Kojima’s series of war games moves away from power fantasies to explore the effects of combat across generations and the political implications when nations act unconcerned with the lives of their people. The historical accounts in Valiant Hearts communicate the personal cost of being forced into war. The personal experiences of displacement in Sarajevo, as told by the developers of This War of Mine, show how war can creep up on ordinary people who never expected it. Even divisive arguments can be made over the very nature of war, what it does to those who fight it, and the very complicated act of calling anyone a hero after the fact, as in Spec Ops: The Line.
But these make up a minority of the content available. Permissive attitudes in the industry ensure that impotent works of fiction circulate and multiply, communicating nothing and, in time, failing even to entertain. That bodes well for developers and studios that have found success by standing apart and maturing along with the medium rather than chasing focus-group-tested demos. They now have the opportunity to further the medium’s maturation into VR by developing software with an understanding of the responsibility they hold to their audience.
VR is technology made for video games, and games will dominate the first several generations of the technology’s lifespan, if not indefinitely. Though Facebook’s Oculus Rift has been on shelves since March 2016, and new social media features are now making their way to VR, it has sold only 240,000 units. Similarly, HTC has invested heavily in “experiences” and 360° video, but the Vive has nabbed only 420,000 sales in over a year of availability. Sony, however, moved more than 920,000 PlayStation VR units in its first four months.
It’s clear who’s the interested demographic when it comes to this hardware.
With how little is understood about the effects of VR, there still exists no ethical framework for deciding what is and isn’t harmful software on a platform that commands so much of the user’s sensory input.
The one code given some authoritative attention was pieced together by Michael Madary and Thomas K. Metzinger of Johannes Gutenberg University Mainz in their paper “Real Virtuality: A Code of Ethical Conduct.” Alongside a wealth of information covering the known effects of VR on the brain and the vast territory left to cover, the paper offers six rules meant to protect the mental wellness, privacy, and consent of the user while pointing out the responsibility of developers not to abuse the trust they are given.
– No experiment should be conducted using virtual reality with the foreseeable consequence that it will cause serious or lasting harm to a subject.
– We recommend that informed consent for VR experiments ought to include an explicit statement to the effect that immersive VR can have lasting behavioral influences on subjects, and that some of these risks may be presently unknown.
– VR researchers aiming at new clinical applications should therefore work slowly and carefully, in close collaboration with physicians who may be better situated to make informed judgments about the suitability of particular patients for new trials.
– Overall, scientists and the media need to be clear and honest with the public about scientific progress, especially in the area of using VR for medical treatment.
– Torture in a virtual environment is still torture. The fact that one’s suffering occurs while one is immersed in a virtual environment does not mitigate the suffering itself.
– We leave the implementational details open, but urge the scientific community to take steps to avoid the abuse of informed consent with this technology, especially in the interest of preserving public trust.
Scientists must understand that following a code of ethics is not the same as being ethical. A domain-specific ethics code, however consistent, developed, and fine grained future versions of it may be, can never function as a substitute for ethical reasoning itself.
What is understood in the early days of this technology is that the bulk of VR content will be video games, and that what dominates the industry now can prove harmful when it transitions to the realm of substituted reality. That is the minefield developers find themselves in: in accepting VR as the next logical platform, they have unknowingly agreed to become the laboratories that will define the ethical boundaries of a technology that could well reach beyond their medium. But if the intrepid voices that have used art responsibly and applied technology creatively stray from prevailing industry attitudes, they may very well define the future of VR.
You can read Carol Becker’s essay, “Social Responsibility and the Place of the Artist in Society,” in the collection Zones of Contention.