We see it.
Gliding through the sky, long neck undulating, great, ridged wings beating, the dragon looks ... beautiful. Until it lands.
Thumbs working the controller, Matt Fries, a freshman at American University in Washington, D.C., throws fireballs at it with both hands. The dragon lifts off, and lands again. It belches out a stream of yellow and orange flame.
"He's done a lot of damage," Mr. Fries mutters. But it's early in the game.
As in video game. Sitting in his tiny Washington apartment, Fries is doing what millions – actually, 10 million – have done over the last few months: fighting dragons in the celebrated new game Skyrim.
Since its November release, Skyrim has won award after award and led reviewers to call it the "greatest role-playing video game ever made." In its first month, it made $650 million, almost double the entire year's gross in the United States for "Harry Potter and the Deathly Hallows: Part 2," the bestselling movie of 2011.
Gamers know this. Why don't you?
C'mon. You don't. One surprising thing about the video game industry is that while adults play – in fact, 25 percent of players are over age 50 – most are unaware of how prevalent it has become in American culture.
For many parents, video games are what our kids love – and we fear. One antigame blogger describes an avid user this way: a kid who "rarely goes outside, showers, or interacts with the opposite sex." The American Academy of Child and Adolescent Psychiatry warns that children playing violent games "can imitate the violence they see."
We have esthetic complaints, too. A few years ago film critic Roger Ebert infuriated gamers by arguing that video games "can never be art."
Skyrim is a useful starting point to examine that view precisely because it has won so much praise.
"We design worlds," says industry legend and Skyrim director Todd Howard.
Mr. Howard means that instead of giving players the simple, gobble-up-the-bad-guys goal of the 32-year-old video game icon Pac-Man, games like Skyrim allow players to explore richly textured worlds, full of choice.
Well, the video game industry is itself a world worth exploring. Just how big is it? How many play? What makes games so popular? Can they do harm? Are they useful? Or – as a Wall Street Journal headline put it – "Are Violent Videogames a Threat to Society? Or Works of Art?"
* * *
First, the big picture. In 2011, the American video game industry says it:
• Recorded $25 billion in sales.
• Accounted for about 120,000 American jobs directly or indirectly.
• Paid workers an average of $90,000 a year, mostly in five states: California, Texas, Washington, New York, and Massachusetts.
Since 2005, the industry has grown eight times faster than the US economy. This is enough to have earned it the ultimate status symbol in Washington: a bipartisan congressional caucus to support games.
Undergirding these numbers is a zealous and global fan base. According to game designer and writer Jane McGonigal, a half-billion people on earth play video games an "hour a day," of whom 183 million are American. In fact, 97 percent of American young people ages 12 to 17 play video games. Five million Americans play at least 40 hours a week.
What they play runs the gamut: games for arcades – what you see at truck stops, in storefronts, at amusement centers – as well as for consoles like the Xbox 360, or for PCs, smart phones, and iPads. And they play things we forget are video games.
"The most popular video game?" asks David Johnson, a professor at American University. "Solitaire."
Of course! The pastime mesmerizing millions of Americans, including one woman I interviewed who could only stop playing at work if she turned her computer to face the hall, where her boss could see her screen. And there's Angry Birds, which has millions discovering the joy of shooting birds at pigs with a slingshot.
These are the "casual games." They don't take much skill. You can play them for five minutes on your phone.
Professor Johnson, armed with degrees in both divinity and anthropology, finds this segment of American culture fascinating. He teaches American University's only course on video games, which includes some history.
"You guys get to make a game today," he says one morning, rushing into class.
Fries is in the class. He's excited. Johnson doesn't mean any game. He means the result of that seminal moment in 1972, when Atari cofounder Nolan Bushnell asked an engineer named Allan Alcorn to create a simple game people might play in bars for quarters.
Mr. Alcorn did, and set it up in a local tavern. Soon, though, it broke down.
What went wrong? When Alcorn looked, the answer was clear. Nothing. Players couldn't stop. They had poured in quarters until the machine jammed.
Alcorn had invented ...
* * *
It's a game Bruce Nesmith remembers well. Now 52, with three daughters, Mr. Nesmith was about 11 when his dad brought Pong home. "I thought, 'Hey! Games aren't just played with little pieces of cardboard.' "
Nesmith, lead designer on Howard's Skyrim team, sits in the headquarters of Bethesda Game Studios, the Maryland company that produced Skyrim, with two teammates. They also got hooked early.
Matt Carofano, Skyrim's lead artist, 34, got turned on at age 5 by the Atari he and his brother got for Christmas.
"I was a latchkey kid," remembers production director Ashley Cheng, 38. "When my grandmother came home, she'd feel the TV. If it was warm, that meant I was playing games – instead of practicing piano."
The three of them typify one part of the video game world: its creators. For 10 years, they have worked together under Howard on a series of role-playing games called The Elder Scrolls. In RPGs, players create characters, assign them a role, and direct them on quests. The Elder Scrolls have been very popular.
With Skyrim, the fifth in the series, the team wanted to go beyond what they had ever done.
Which means ... what? They definitely wanted to include dragons. "It's like the holy grail," says Mr. Cheng. But the team didn't want ordinary ones.
"We want to produce suspension of disbelief," Nesmith says. He looks at me to make sure I know what he means.
Like any good English major, I do. He's quoting Samuel Taylor Coleridge, writing in 1817 of what he wanted to achieve in "The Rime of the Ancient Mariner": the "semblance of truth" that might make readers forget he'd made it up.
And so, in the quest for realism, the Skyrim team invented a language for dragons. They studied film of bats to give dragons the qualities that would make them look familiar. Mr. Carofano remembers with pleasure how, as he previewed Skyrim for reviewers, they applauded when the first dragon appeared.
I want to understand what absorbs game designers. I ask Nesmith to describe something he'd obsess about with Skyrim.
"I wanted our magic system to get a face-lift," he says. In other games, when a character threw a magic fireball at the enemy, it was just a little red ball. "We wanted it to have a tail. Scatter flames! Leave a footprint!"
To watch Skyrim confirms Howard's vision. The obsessions Nesmith describes, the technical advances spurred by the industry, the years Bethesda was willing to allot to it – all combine to produce a beguilingly varied world.
Now the Skyrim team is sending out "patches" – ways to fix the inevitable bugs players have reported. So far, not only have 10 million played, but those who have done so on a PC, which the company can track, average a total of 75 hours each.
What's so compelling?
* * *
Fries sits in his parents' living room in Virginia, wearing a faded green Peace&Love T-shirt, controller in hand. He's got Skyrim up on the big screen. His father and a friend, Tom Harvey, watch.
No dragons this time. His character has a more limited quest: making his way across frozen tundra toward a town. As he travels, Fries makes choices for him.
Fight or retreat? Enter a cavern or choose another way? Walk slowly or run ahead? It's what Fries has done playing video games for more than a decade.
"The average young person racks up 10,000 hours of gaming by the age of 21," says Ms. McGonigal. Ten thousand hours. It's a number made famous recently by Malcolm Gladwell's book "Outliers." In it, he offers the 10,000-hour rule, based on research by a Swedish psychologist who argues that it might take that much time to become really good at something.
Pianists do it. Why not gamers? Forty-hour-a-week gamers might seem scary. Fries and Harvey are more typical. They met when Harvey managed one of the 6,500 stores in the GameStop chain, now the largest American retail outlet for video games. When Fries turned 16, Harvey hired him as his assistant.
Games don't totally dominate their lives. Fries keeps up with schoolwork. Harvey now manages a clothing store. They are articulate, funny, and take showers. But they've both put in their 10,000 hours – including entire days on weekends.
And Harvey illustrates something else. The average American gamer is about 37 and has played for 12 years. Harvey is 29. He's played for 13 years. This isn't something kids outgrow.
Why not? Yale professor Paul Bloom, author of "How Pleasure Works," points out that Americans find many products of the imagination – games, movies, TV – more interesting than real life.
"Why would individuals ... watch the television show 'Friends,' " he quotes one psychologist as saying, "rather than spending time with actual friends?"
Among other things, Dr. Bloom says, the adventures of fictional characters are usually "much more interesting" than ours.
Fries sees the relevance to games. "You can't walk around on giant tundra with a sword," he says about real life. "You can't swim next to a submarine. I could go skydiving, but I'm horribly afraid of heights. If I hit the ground in a game, I won't die."
Bloom offers another reason, quoting television and literary critic Clive James. "Fiction is life with the dull bits left out."
Nesmith confirms that. In real life, he says, "we often say nothing of consequence. You don't want that in a game."
When you ask what's unique to games, though, designers or players all mention one thing. "You get to interact," says Carofano. "There's something rewarding about that."
Clearly, interaction – choice – separates video games from other forms of storytelling. You don't just read about someone killing a dragon. You do it.
Howard offers yet another attraction. Wearing jeans and sneakers, one hand in his pocket, the other holding a remote, he is the keynote speaker at a conference in Las Vegas, which has named Skyrim "game of the year."
"What can games give you that nothing else can?" he asks.
Against a black screen behind him, the answer appears. PRIDE.
"Pride in something you did," he says.
"Definitely true," Fries says. "Sure, you get a feeling of pride reading a book. With games you're participating. You work towards beating the game."
Finally, critics of games point to another allure: their violence. Clearly there's something to the charge: When companies release violent and less violent versions of the same games – one famous example is Mortal Kombat – the violent ones sell better. But does that make players more violent in real life?
This possibility alarms people – and politicians. In 2005, despite an industry rating code, California Gov. Arnold Schwarzenegger (R), star of some of the most violent movies of all time, tried to ban the sale of violent video games to minors. The move launched a lawsuit.
It wound up in the US Supreme Court.
* * *
"California asks this court to [permit] states to restrict minors' ability to purchase deviant, violent video games ... harmful to the upbringing ... "
Justice Antonin Scalia doesn't let California's lawyer finish. "What's a deviant? As opposed to what? A normal violent video game?"
It's Nov. 2, 2010. The Supreme Court is hearing California's argument.
"Yes, Your Honor. Deviant would be departing from established norms."
"I mean, some of the Grimm's fairy tales are quite grim, to tell the truth," continues Mr. Scalia. "Are you going to ban them, too?"
California's lawyer remains deferential. "The interactive nature ... is especially harmful to minors," he says a little later, citing studies.
Justice Sonia Sotomayor: "One of the studies says the effect is the same for a Bugs Bunny episode."
Is she right?
Two leading researchers, Iowa State professors Douglas Gentile and Craig Anderson, believe otherwise. Citing 130 studies, they found "consistent evidence" that violent games promote aggressive "thoughts, feelings ... and behaviors."
The question is, how much? That's something researchers hotly debate. Most agree on at least one point: When a mass shooting occurs at a school, and the kid who did it turns out to have played video games, the reason for the outburst was probably not the time spent in front of a computer. All kids play. There's some evidence that shooters play video games even less than average.
Harvard University researcher Lawrence Kutner spent two years studying the effects of violent games. He acknowledges that kids who play them more than 15 hours a week seem more likely to get into trouble. But that, he quickly adds, doesn't mean the games caused it. His advice for worried parents: "Relax."
In the end, mostly because of free speech issues, the high court ruled 7 to 2 against California. It found that studies trying to connect violent games and actual violence "were not persuasive." That's not the same as finding there's no connection. Still, it might reassure worried parents that both Ms. Sotomayor and Mr. Scalia, who agree on practically nothing, agreed on that.
* * *
Aside from what the court ruled about violence, the California case illustrates something else about video games: the degree to which the industry finds itself inextricably bound up with government. One issue in particular absorbs Rep. Jim McGovern (D), the 10-term congressman from Worcester, Mass., who is a member of the video game caucus that was formed last year.
"My question," says Mr. McGovern, "is what will we make five, 10 years from now." Then he asks two more. "How can we have a well-trained workforce? How can we be an incubator?"
The video game industry interests him because two schools in his district, Worcester Polytechnic Institute (WPI) and Becker College, both appear on the Princeton Review list of colleges with the 10 best game-design programs in the country. In the national debate about whether government can create jobs, McGovern's stance is clear. He believes it can. He's gotten Massachusetts game companies a grant to bolster them.
"I don't want [video game] jobs [to go] overseas," he says. "I want them here."
But then McGovern talks about an issue that isn't just economic. In January, Congress bitterly debated the Stop Online Piracy Act (SOPA), intended to protect intellectual property rights. It drew protests from online giants like Wikipedia and Craigslist, which temporarily shut down their sites.
The issue split the video game industry: Its trade association supported the measure, while many small companies, dependent on untrammeled Internet access, opposed it. McGovern opposed it.
Shouldn't we stop online piracy?
"You have to be careful of unintended consequences," McGovern says. He worries about the effect on dissidents in countries who need the Internet to communicate. "You can't just design a bill and drop it in," he says about piracy. "We'll have to deliberate – something we're not good at."
McGovern isn't just a member of Congress, of course. He's a parent. Does he worry about the effects of games on his kids?
"I've made trips to GameStop with my son," McGovern says. "I'd be lying if I said I watch every one of his games."
Then he kind of gauges things, pulls out his cellphone, and calls home. "Patrick, what games do you play? Just tell me the acceptable ones."
"A reporter," McGovern says.
His son knows the score.
"Madden 12. OK," McGovern relays, talking about the successful John Madden football video game that includes violence Americans don't mind.
Another pause. Patrick is mentioning another one.
"That doesn't sound acceptable," McGovern says.
He looks around to make sure I know he's kidding. He trusts Patrick. He's relaxed. Mr. Kutner would approve.
* * *
"Ten years ahead? Impossible," Brian Moriarty says. I'm sitting in a conference room with him and Dean O'Donnell, both teachers at one of the schools in McGovern's district: the 147-year-old WPI. I've made the mistake of repeating McGovern's question. "Five years ago we didn't have the iPad," Mr. Moriarty reminds me.
He and Mr. O'Donnell will be happy if students get jobs after graduation – and five years out, have the training to adapt to what's new.
If it was ever appropriate to call colleges ivory towers, those days are gone. Colleges fight to develop marketing niches. Game design is one. Invisible on campuses a decade ago, it now appears as a major in more than 300 college course catalogs.
And the future of the industry holds more than games like Skyrim, Moriarty and O'Donnell point out. They mention a buzzword: gamification. Inspired by the magnetic pull of games, developers are trying to incorporate video game elements into everything: simulators to teach pilots, instructional DVDs for people who've bought a washing machine, or games allowing people to explore moral dilemmas like fighting a nuclear war. These are what are called "serious games."
"Hate that term!" O'Donnell says.
You can see why. Can't a richly imaginative game be "serious"? Yet O'Donnell values so-called serious games, too. Making them might be where his students wind up. And isn't there something thrilling about the idea that a game might help a pilot fly, a kid learn algebra, or a wounded veteran, back from Iraq, deal with injuries? This is a big and expanding part of the video game world.
I've looked forward to meeting Moriarty, a 25-year industry veteran. Five years after Mr. Ebert's games-can't-be-art article created a storm of criticism, the controversy began again. At last year's biggest video game convention, the Game Developers Conference, Moriarty gave a speech in Ebert's defense. That also sent the video game world into a tizzy. He was one of them!
Ebert had two main arguments. First, that games aren't art because they are interactive. Great art, he said, is contemplative. Tolstoy and Mozart didn't want listeners or players making choices.
The counterargument? It's not true. Yes, Skyrim players have choices, but only along lines carefully thought out by its designers. Besides, who says those engaging with works of art have no choice? When pianists play a Mozart sonata, they choose tempo, dynamics, and pedaling. When jazz musician Dave Brubeck improvises on "Take the 'A' Train," he makes thousands of choices. In 2009, the White House gave Mr. Brubeck a medal for excellence in "performance arts."
Moriarty does defend Ebert's first point. He devotes more time, though, to a second one. Why, Ebert had asked, couldn't anyone cite a game worthy of comparison with the works of great dramatists, poets, filmmakers, novelists, and composers?
"Nobody could answer that," Moriarty says, explaining why he gave his speech.
Moriarty argues that a great book offers things games don't. "When I feel the need for reflection, for insight, wisdom, or consolation, I turn my computers off," he said in his speech. There are many things competing for his time. If he finds a great book, he says now, he would rather read it than play games.
Otherwise he'd suffer "gamer guilt" – the moment when "you wake up and say, I've just wasted 40 hours of my life."
O'Donnell can hardly sit still. "The idea that all game time is wasted time!"
"I never said that!" Moriarty replies. "I said I'd rather read Proust."
"Just because [a game's] not sublime doesn't mean it's not art. It's a beautiful thing!" O'Donnell says. "Michael Jordan was an artist!"
"He was an athlete!"
Back and forth they go. To me, Moriarty isn't arguing about whether games can be art but a more limited question: Can they contain the complex characters and moral dilemmas of novels or films? Later, when they've calmed down, Moriarty admits the "Are games art?" question isn't over. "I'm an optimist," he says. "Maybe we just have to wait for our Mozart."
* * *
Soon, we're talking to four students in the WPI program, one of whom might turn out to be the next Mozart. These are students with dreams tempered by realism. Wisconsin native Beth Kunkel knows that most of her classmates may not even wind up in the industry. Virginian Jeff Thomas thinks he might "get a job in computer science to make sure I have a house."
But Connecticut junior Nick Konstantino still seems focused on games. Later, he'll show me the "mocap" project he's working on – a high-tech way to capture motion from live actors that designers use to develop animation. But he's been building games since middle school. The MMO – massive multiplayer online game – he created in high school had 40,000 subscribers.
I mention Curt Schilling, the former Red Sox pitcher who has put $20 million of his own money into a video game company and just released a big game. "What if someone gave you that much money? What would you do?"
"The perfect MMO," says Mr. Konstantino. "I love multiplayer."
MMOs are enormously popular. Last November, more than 12 million people around the world were playing just one of them: World of Warcraft. Single-player games are hardly the only option for gamers. In fact, when I ask about Skyrim, this group sounds restrained. They admire it more than they love it.
But then Konstantino mentions something interesting. After Bethesda released Skyrim, Howard gave his team one week to create whatever addition they wished the game had. Howard called the result "Game Jam" and put it online.
Konstantino loved what he saw. "Amazing things!" he says.
I don't expect to like it. I can't help myself. To a remix of Martin Solveig's "Hello," Game Jam cuts from one tableau to the next: foliage that changes with the seasons, warriors mounting dragons to ride off across the countryside, footprints appearing behind them while characters walk across the snow. It's imaginative and fun to watch – and not just for kids.
* * *
Nesmith sits and reflects on the changes the industry has seen since his dad brought Pong home four decades ago. The Skyrim designer is aware of how unusual he is: a grown-up who knows more about games than kids do.
He doesn't make a big deal of it. A few years ago his daughter mentioned what he did in class. One of her classmates jerked around and started shaking her desk. "YOUR DAD MADE FALLOUT! OH. MY. GOD!" Since then, Nesmith's family doesn't make a big deal of it either.
After 40 years, he still loves games. He's impatient with people who think of them as "just" entertainment. "The importance of play cannot be overstated," he says.
Fun, satirist Tom Lehrer once lamented, was "unfortunately not something guaranteed by the Constitution." Nesmith reminds us of something useful: Even in a country founded by Puritans, there's nothing wrong with having a good time.
You'll get no argument from Fries. By now, he has allowed his Skyrim character to imbibe a magic healing potion. He's ready to fight again. The dragon belches another jet of flame. Fries heaves fireball after fireball at it. The fireball tails stream behind just like Nesmith wanted. The dragon heaves itself up off the ground, then collapses.
Fries feels ... pride.
What will his kids play when the advances of Skyrim seem as primitive as Pac-Man does now? Will the games be a source of reflection, insight, wisdom? Controversy and conflict?
And there's one other question that, like Skyrim, also involves choice: What's the opportunity cost of games – what else might we do if we weren't spending 10,000 hours in front of a computer screen?
Hard to predict. Because in 2012, the world of video games turns out to be as complicated and uncertain as the real world. It is dazzling, imperfect – and unfinished. Like Skyrim, the industry needs patches. It needs its Mozart.
Relax. For most of us, it offers little to fear. There's much to like. And it's early in the game.
• Robert A. Lehrman, who owes much of the insight in this story to tutoring by his son, Michael, is a novelist and former White House speechwriter for Vice President Al Gore. Author of 'The Political Speechwriter's Companion,' he teaches at American University and co-runs a blog, PunditWire.