Thompson returned to Miami for the birth of his first son, whom he and his wife named John Daniel Peace. Three weeks later, on August 24, 1992, Hurricane Andrew bore down. As his windows rattled and lightning slashed the sky, Thompson braced himself at the door in a scuba mask, holding it tight so that the glass wouldn’t blow through. His wife stood behind him holding little Johnny in a blanket. Thompson relished the biblical imagery and equated it to his own fight against what he called the “human hurricane” of rappers, pornographers, and shock jocks.
He survived the storm—and won the battle against Ice-T, who was dropped from Time Warner soon afterward. The ACLU voted Thompson one of 1992’s “Censors of the Year,” a title that made him proud. “Those on the entertainment ship were laughing at those on the other vessel,” he later wrote. “I felt that I had grabbed the wheel of the decency ship and rammed that other ship, convinced that the time for talk about how bad pop culture had become was over. It was time for consequence. . . . it was time to win this culture war.”
“COME ON, come on, come on, come on, take that, and party!”
Sam Houser stared into the smiling white faces of five clean-cut boys singing these words onstage. The group was Take That, a chart-topping boy band from Manchester, Britain’s answer to New Kids on the Block. In his new job as a video producer for BMG Entertainment, Sam was directing their full-length video, named for their debut hit, “Take That & Party.” For a kid weaned on crime flicks and hip-hop, the scene could not have been further from his own rebellious influences. The footage showed the boy band break-dancing, chest-bumping, and leaping from Jacuzzis. But it was a job—a creative job that fulfilled Sam’s lifelong ambition of working in the music industry.
By 1992, Sam had successfully retaken his lackluster A-Level tests and enrolled at the University of London. Between classes, he headed over to intern part time at BMG’s office off the Thames on Fulham High Street. After his fateful lunch in New York, Sam had gotten his break interning in the mailroom at BMG—an accomplishment he took to heart, considering the obnoxious way he got in. Yet it epitomized his style: risking everything, including pissing people off, if it meant achieving his goals. “I got my first job by abusing senior executives at dinner tables,” he later recalled.
Sam already had his eyes elsewhere: the Internet. Though the World Wide Web had not yet become mainstream, Sam saw an opportunity to bring the kind of DIY marketing approach pioneered by Def Jam into the digital age. He convinced the BMG bosses that the best way to promote a new album by Annie Lennox was with something almost unheard of at the time: a website. They relented, and Sam got to work. When Diva hit number one on the UK charts, it bolstered his cause.
BMG soon made waves in the industry by partnering with a small CD-ROM start-up in Los Angeles to create what the Los Angeles Times heralded as “the recording industry’s first interactive music label.” The newly formed BMG Interactive division saw the future not only in music CD-ROMs, but in a medium close to Sam’s heart, video games.
In 1994, the game industry was bringing in a record $7 billion—and on track to grow to $9 billion by 1996. Yet culturally, games were at a crossroads. Radical changes had been sweeping the industry, igniting a debate about the future of the medium and its effect on players. It started with the release of Mortal Kombat, the home version of the ubiquitous street-fighting arcade game. With its blood and spine-ripping moves, Mortal Kombat brought interactive violence of a kind never seen before in living rooms.
Compared to innocuous hits such as the urban-planning game SimCity 2000 or Nintendo’s Super Mario All-Stars, Mortal Kombat shocked parents and politicians, who believed video games were for kids. The fact that the blood-soaked version of the game for the Sega Genesis was outselling the bloodless version on the family-friendly Super Nintendo Entertainment System three-to-one only made them more nervous.
The Mortal Kombat panic reached a sensational peak on December 9, 1993, when Democratic senator Joseph Lieberman held the first federal hearings in the United States on the threat of violent video games to children. While culture warriors had fought similar battles over comic books and rock music in the 1950s and over Dungeons & Dragons and heavy metal in the 1980s, the battle over violent games had an urgently contemporary ring. It wasn’t only the content that concerned them; it was the increasingly immersive technology that delivered it.
“Because they are active, rather than passive, [video games] can do more than desensitize impressionable children to violence,” warned the president of the National Education Association. When a spokesperson for Sega testified that violent games simply reflected an aging demographic, Howard Lincoln—the executive vice president of Nintendo of America—bristled. “I can’t sit here and allow you to be told that somehow the video game business has been transformed today from children to adults,” he said.
Yet video games had never been only for kids in the first place. They rose to prominence in the campus computer labs of the 1960s and 1970s, where shaggy geeks coded their own games on huge mainframe computers. From there, the Pac-Man fever of home consoles and arcade machines lured millions into the fold. By the early 1990s, legions of hackers were tinkering with their own PCs at home. A burgeoning underground of darkly comic and violent games such as Wolfenstein 3-D and Doom had become a phenomenon among a new generation of college students.
At the same time, Sam’s peers were riding a gritty new wave of art. Films such as Reservoir Dogs and music like Def Jam’s shunned cheesy fantasy for gutsy, pop-savvy realism. These works trained a lens on a world that had not been portrayed before. When Los Angeles erupted in riots after the Rodney King beating, Sam watched—and listened—in awe to the music that reflected the changing times. The fact that Time Warner had dropped “Cop Killer” only seemed to underscore how clueless the previous generation had become.
Now the same battle lines were being drawn over games. To ward off the threat of legislation in the wake of the Lieberman hearings, the U.S. video game industry created the Interactive Digital Software Association, a trade group to represent its interests. The industry also launched the Entertainment Software Rating Board to voluntarily assign ratings to games, most of which fell under E for Everyone, T for Teen, or M for Mature. Less than 1 percent of titles received an Adults Only or AO rating, the game industry’s equivalent of an X—and, effectively, the kiss of death, because major retailers refused to carry AO games.
Yet with Mortal Kombat still burning around the world, the media eagerly fanned the flames. Nintendo, which ruled the industry, had sold a Disneylike image of gaming to the public, but this was now in jeopardy. Video games were “dangerous, violent, insidious, and they can cause everything from stunted growth to piles,” wrote a reporter for the Scotsman, “. . . an incomprehensible fad designed to warp and destroy young minds.”
While the medium was being infantilized by politicians and pundits, however, one of the biggest corporations in the entertainment business was taking up the fight. In 1994, in Japan, Sony was working to release its first-ever home video game console, the PlayStation, built on the idea that gamers were growing up. Phil Harrison, a young Sony executive tasked with recruiting European game developers, thought the game industry was being unfairly portrayed as “a toy industry personified by a lonely twelve-year-old boy in the basement.” Sony’s research told another story—gamers were older and had plenty of money of their own to spend.