NextGen’s Top Ten Years In Gaming History

  • Post last modified: Saturday, March 27th, 2021
  • Reading time: 30 minutes

by [name redacted]

Originally published in some form by Next Generation. I was asked not to include 1999 or 2000, because the Dreamcast was perceived as a low mark in the industry rather than a high one. I was also asked to include the previous year, to suggest that we were in the middle of an upswing. So… that explains some of the selections.

In videogames, as in life, we tend to get things right about a third of the time. There’s one decent Sonic game for every two disasters; one out of every three consoles can be considered an unqualified success; the Game Boy Advance remake of Mother 1 + 2 was released in one out of three major territories. With the same level of scientific accuracy, one can easily say that, out of the thirty years that videogames have existed as a consumer product, there are maybe ten really excellent milestones, spaced out by your 1984s and your 1994s – years when maybe we were all better off doing something out-of-doors.

It kind of makes sense, intuitively: you’ve got the new-hardware years and the innovative-software years, spaced out by years of futzing around with the new hardware introduced a few months back, or copying that amazing new game that was released last summer. We grow enthusiastic, we get bored. Just as we’re about to write off videogames forever, we get slapped in the face with a Wii, or a Sega Genesis – and then the magic starts up all over again, allowing us to coast until the next checkpoint.

If we’re to go chronologically, our history of delights will look something like this:

1972: The Institution

As the saying goes, the player is the ping; the game is the Pong. Today, we use the phrase metaphorically: Pong as the prototypical videogame. In the early ’70s, it would have driven Ralph Baer up the wall. In early 1972, Baer was ramping up promotion for the final result of six years of his life – the world’s first consumer game console, which Magnavox dubbed the Odyssey. The Odyssey had interchangeable game cards, and even a light gun; although analog-based and battery-run, the system was basically the model for all modern game consoles.

That spring, a University of Utah graduate named Nolan Bushnell attended one of the Odyssey’s early pre-release demonstrations. Already familiar with (cloning) the earlier SpaceWar!, he played Baer’s ping-pong game with rapt interest, his mind abuzz with ideas of how to “fix” its gameplay. Soon after, Bushnell was the head of a new company called Atari, testing the waters with its own video ping-pong product. The difference: while the Odyssey’s manufacturer, Magnavox, had no clue how to market Baer’s system, even going so far as to suggest that it would only operate on Magnavox televisions, Bushnell was a master of publicity. And, for the time being, Pong cost just a quarter to play. Thus, “Pong” became the first generic videogame brand – coining not only a colloquialism but also setting a trend that we will see repeated throughout this article.

1977: The Definition

For all its questionable origins, Atari quickly put its influence to broader use, both leading the first generation of true multi-game consoles and inspiring the first generation of gaming PCs. Five years after the Pongsplosion, Atari backpedaled a bit, looking back to the Odyssey again and refining the basics. The result was the Atari Video Computer System, or VCS. The system would be digital, and far more sophisticated than Baer’s. Games would be stored on tough, kid-friendly cartridges rather than frail cards, much like the new videocassettes introduced by Sony and JVC only two years earlier; much like a Betamax tape, all you had to do was jam your favorite Atari “tape” into the machine, and you were set to go. People were already used to the motions, so at last all the zeitgeists were in a row. Cue the new generic: “Atari”.

Just prior to this, while Atari was still fussing with the tail end of Pong, an Atari employee named Steve Jobs called on an old chum of his, Steve Wozniak, to help him design a one-player Pong derivation. Wozniak came up with a byzantine yet cost-effective logic design for a game that was ultimately called Breakout. Following this pairing, Jobs noticed a personal computer that Wozniak had invented for his own use; he called it the “Apple”. Before the summer of ’77, Wozniak had refined his clunky hobbyist system into a cheap consumer-oriented design, labeled the Apple II. With its powerful processor and color graphics, it was only a matter of time before computer games began to appear on the market…

Incidentally, it was also around this time that a Taito employee named Tomohiro Nishikado became enthralled with Wozniak’s game, Breakout, and decided to make his own tribute – except he would alter the “bricks” broken with the player’s ball, so they would move around the field. Since they were moving, he shaped them like aliens. And to make the game easier to play, Nishikado took away the ball physics and had the player’s paddle simply shoot the “ball” toward the oncoming aliens. Cue the birth of the Japanese game industry.

Simultaneously with Nishikado’s adventures, a man named Ed Logg joined up with Atari; his first duty: an update of Breakout that would be known as Super Breakout.

Also in 1977, a young man named Toru Iwatani joined Namco under the illusion that he would be allowed to design pinball games. Since pinball was on the way out and videogames were on the way in, Iwatani bided his time making clone after clone of Breakout, incorporating more complex level designs, flashing lights and colors, bringing the game ever closer to pinball, nearly making a maze out of his game boards…

1982: The Boom

By now, all of the potential suggested in the preceding paragraphs had finally exploded. Nishikado’s Space Invaders was so successful it caused a coin shortage in Japan; with his Asteroids (to SpaceWar! what Breakout was to Pong) and Centipede (to Space Invaders what Space Invaders was to Breakout), Ed Logg came to establish Atari as a world leader in game design. With Pac-Man, Toru Iwatani expanded the audience for videogames to women, children, and casual players who had until that moment never shown an interest in the medium. Inspired by the broad appeal and narrative premise of Pac-Man, a Nintendo employee named Shigeru Miyamoto refitted a bunch of failed Space Invaders clones to tell a story of a carpenter and an ape. Then Activision followed up with its own Jumpman game, placing him in a land of crocodiles and pitfalls. And by now, all this stuff was becoming available for home systems.

Speaking of Activision, Atari finally settled a long-standing suit, opening the way for third-party development. And develop they did. In the short term, it was great – anyone could publish anything for the Atari 2600. The downside, as would become evident within a year or so, was that anyone could publish anything.

Videogames had really made it – and though Atari was still at the crest of the wave, there seemed plenty of room for all. Magnavox had a slightly more successful second go with its Odyssey2, a significant improvement on the Atari VCS (now known as the 2600) hampered mostly by a lack of outside software and Atari’s firm domination of the market. Later, Mattel broadened the market a bit more, by chasing after a more adult audience with its Intellivision. And in 1982, Coleco introduced the cream of the crop: the ultra-powerful ColecoVision, which also served as a kind of all-in-wonder console, with its support of Atari 2600 cartridges via an expansion module. With a ColecoVision at hand, you had both power and compatibility. What more could you need?

In the face of all of this competition (including, thanks to Activision, on its own system), and encouraged by its own ridiculous success, Atari responded with its own super-powered console, the 5200. The problem here, besides the awkward controllers, was that Coleco’s wisdom was a bit lost on Atari: the 5200 could only play its own games, and there weren’t many of them. What games it did have were mostly “enhanced” games that everyone already owned for the 2600. The bubble was stretched as far as it would go; when it burst, Atari never really got a chance to recover.

As for what Atari had to offer for its old console, you might have heard of the 2600 version of Pac-Man. How about E.T.: The Extra-Terrestrial? Still, though the omens were out there, disaster was still months and months away. On its own merits, 1982 was the high before the great depression.

Speaking of the “great crash”, the one thing that did ride it out okay, from here to the next paragraph, was the personal computer; as computers also had a practical use, there was no sense in simply chucking them in the bin. People kept using them; therefore, games kept being made for them. Whatever didn’t end up on the Apple II was created for the brand-new, hyper-inexpensive Commodore 64 – bolstered by one of the most powerful sound chips to date and the recent innovation of MIDI-sequenced music. Many friends’ doors were knocked down. With Atari out of the way, the computer age was here.

1986: The Resurrection

If 1982 was the Industry in 2000, 1986 was the Internet in 2002: shoulda seen it coming. Anyone with a brain did, actually. Can’t stop an angry freight train, though. And when the bottom fell out, the investors went running.

Somewhere amidst this noise, that Japanese playing card company with the gorilla game decided it was time to try its hand at the consumer market. Nintendo put its energy into developing a product with broad appeal that reflected all of its hardware and software innovations to date, from the character-based games that longtime employee and toy inventor Gunpei Yokoi’s R&D studio had been dreaming up to Yokoi’s “Game & Watch” LCD games. When the product was ready, Nintendo made a trip across the Pacific, to offer Atari a distribution deal. Atari, for its part, was busy leaving the console business, thank you very much, putting much of its energy into the upcoming Atari ST computer. This was, after all, the computer age.

Nintendo forged on, undaunted. Since game consoles were no longer chic, its Famicom system was redesigned so it more resembled an inexpensive VCR than an expensive toy (harking all the way back to 1977). The company hired some of the top marketing agents in the U.S., put into place some stringent “quality control” measures – whereby third parties required a license and all cartridges were manufactured by Nintendo – and installed its trademark gorilla in the retail channels. In late 1985, the Nintendo Entertainment System debuted at FAO Schwarz in New York, and took the world by storm. By early 1986, the system was available everywhere. And like that, “Nintendo” was our new generic. Even if you bought a Sega Master System, or a resurrected Atari 7800, both of which had plenty to offer (with the exception of third-party support).

Oh, yeah. Then there’s this “Super Mario Bros.” game the kids seemed to like…

1989: The Division

And then there was choice – real choice. Different kinds of systems with different kinds of games, for different kinds of players. Ninja Gaiden popularized the now-dreaded cutscene. Phantasy Star II brought a sense of literature to console games. Ys Book I & II showed what optical storage could do for a game. Electrocop was… in color, and portable. And then there was Tetris!

And amongst this wealth, suddenly generics began to break down. When everyone has a different alliance, it’s no longer so easy to use one blanket name. The NES was “Nintendo”; the Genesis was “Sega”; and any portable system was “Game Boy”. That’s safe, right? Lynx and TurboGrafx fans were left to their own devices.

It was this factionalization, more than any technological issue, that was Sega’s real influence on the industry. Sega had something distinct to offer, something that did what the competition didn’t – and yet in no way obviated the competition. True, at this point Sega had only a half-dozen games worth playing. Still, add in some backward compatibility and you’ve got some wiggle room until Sega really finds itself.

Although the TurboGrafx never made much of an impact in North America, its CD-ROM add-on fueled the whole multimedia craze that would immerse and distract the industry in years to come, inadvertently helping to lead Sega off a cliff there was no climbing back up. At the time, though, it was super cool, and threatened to make NEC’s system halfway popular.

And then there is portability. In the same way the VCS followed Pong, the Game Boy is – in a sense – a truer successor to Gunpei Yokoi’s legacy than Nintendo’s line of home consoles. Add in cartridges and a dot-matrix screen, and “personal” games are once again viable. Stereo headphones make the experience all the more intimate, and this Tetris thing is both an excellent example of Glasnost and the “killer app” a pocket system needs. Just imagine being able to pull out an addictive game at a bus stop, on the train – anywhere you’ve a spare moment – and playing for just a few minutes. It was a whole new model of game design, and of thinking of videogames.

This is also the year that computer games changed forever. Over the eighties, Intel kept chugging away at its x86 line of processors; IBM-clone screens went four-color, then sixteen-color, then full-on too-good-to-be-true 256-color VGA. IBM clones were cheap, and well-supported. Then, all at once, came the kicker: the 486 processor, paired with Sound Blaster digital audio and Super VGA graphics. Also, around this time, modems started to speed way up, eventually reaching the ridiculous speed of 14.4 kbps.

Suddenly, IBM clones were legitimately appealing as gaming machines, and would be the primary focus of most US-based development. Europe was another story, and Japan barely even used computers at this point. From now on, the IBM PC would be the target of a “reverse generic” – in that “Personal Computer” would automatically mean a DOS/Win Intel-based system; everything else had to go by name.

This year, everything seemed possible. You could feel it in the air – videogames were right on the verge of something amazing. The Nintendo myth was still in full force. Tom Kalinske was building the Sega dream. And just look at all this technology – CDs, thousands of colors, digital sound, 16-bit processors in sleek black casing, portable videogames (some even in color!). Surely, at this point, there are no more limits to what videogames can do. It’s hard to even conceive of what to do with all this power…

1991: The War

And Nintendo wanted in, darn it – what’s the deal with all these interlopers? Well, they’d show ’em. Out came the Super NES, with the biggest and best of what everyone already knew they wanted: a flashier version of Super Mario 3, a… less than ideal version of that Final Fight game that everyone’s talking about, and hey – look at this Mode-7 stuff. Can’t get that on the Genesis, can you?

Sega’s response: “What, like this?” Cue Sonic the Hedgehog and its rotating bonus levels. Since the first generation of SNES games suffered from some pretty atrocious slowdown and flicker (much like the “jaggie” business with the PS2, for those who remember that), and since indeed the SNES had a much slower processor than the Genesis, Sega chose to capitalize on all its strengths, and devise a mascot that stood for everything Sega: faster, sleeker, hipper, with a certain sassiness that comes from fighting the system. Oh, and blue. As a zeitgeist, Sonic was basically ideal – and his game, though not as “deep” as Super Mario World, had an energy and mischievous charm lacking in the Nintendo myth. Cue five years of fuzzy mascot games.

Nintendo sold a bunch, because it was Nintendo. Sega sold more, because it had proved a reliable and appealing alternative and because the Nintendo kids were getting older. Big kids can’t keep playing with little kids’ toys, and Sega gave them the message that their hobby was growing up with them. Sega was hip; Nintendo wasn’t. Furthermore, everything Nintendo had to show, Sega one-upped. Final Fight was slow, censored, and only one-player? Here’s a game called Streets of Rage, with the coolest soundtrack in the world. Gradius III buggy and slow? Hey, the Genesis was made for shooters! And so on and on. Meanwhile, ActRaiser sort of slipped under everyone’s radar.

The press bought into it too. EGM named Sonic its game of the year – third year in a row for Sega, after Strider and Ghouls ‘N Ghosts – commenting that, though Mario was nice, they felt they’d played the game already. And Sonic… well, he just captured their hearts.

Of course, Nintendo was still gearing up. All Sega’s success did was kick Nintendo into tiger mode. Over the next two years, the one-upsmanship between the two companies would rise to ludicrous levels, resulting in such eccentricities as the Super Scope, the Sega Menacer, and “blast processing”. Right now, Nintendo was all potential and Sega was all action. Stuff was happening, and it was exciting.

Sega also introduced its own portable Master System called the Game Gear. It was a battery monster, and not many games were ever released for it. Still, hey, it was in color – and it was by Sega! For anyone who actually owned a Master System, there was also a dongle to allow play of Master System games. That made it much more useful. In 1991, though, who the heck knew how little would come of it.

1993: The Great Experiment

Things continued as they had. Nintendo kept chugging along; in the wake of Sonic 2 and a few million Happy Meals, Sega took a solid lead, both in market and in mindshare. Nintendo hit back with game-of-the-moment Street Fighter II, the console’s first real system-seller, leaving Sega reeling ever so slightly.

And yet, such was Sega’s influence that Nintendo stalwarts Capcom and Konami began to grow restless within their exclusivity contracts. Before long, Sega had brand-name third-party support. The first major release? Why, Street Fighter II, of course. Despite some early concerns about the console’s potential muscle power, color depth, and sound quality, the Genesis version was every bit as good as the SNES one. Some people even cite its superior controls. To further the impression, Sega revised its lousy controller, giving it six face buttons, a comfortable shape, and an excellent D-pad; some people still cite this controller as one of the best ever made.

Inspired by NEC, Sega chose to take the war up a notch with its own CD add-on. It was expensive, slow, and by the time manufacturing had ceased, it had virtually nothing worth playing on it. Still, multimedia was the buzz-word – and Sega was hardly alone in buying into it. Electronic Arts founder Trip Hawkins was enthusiastic enough to come up with the 3DO; although the CD-i had been around since 1991, people started to pay some vague attention to it around now. Even Nintendo started to work with Sony on a CD add-on for its SNES. When it became obvious what a corner Sega had painted itself into, Nintendo abandoned all plans for its PlayStation system, irking Sony to no end.

Another trend that the media often linked with multimedia was the “virtual reality” craze. The idea was, in the future we would all be playing video games in virtual 3-D space. It wouldn’t just be flat pictures on a screen; we’d be wearing goggles and gloves and wandering around computer-generated spaces – sort of like a holodeck on Star Trek. Although the goggles never went anywhere, the idea of 3-D space captured everyone’s imagination. Cue trend-setter Sega, with its Model 1 arcade board and “Virtua” series: Virtua Racing, Virtua Fighter, Virtua Cop. Cue Nintendo, with its Super FX chip and Star Fox. The polygons might be flat, unshaded, and hideous-looking – yet they were the future. You could play the games, squint, and imagine what it would be like when real VR arrived.

On the PC end, shareware had slowly begun to take hold as a distribution method. Small, amateur teams of developers – often only one to four people – could gather resources, develop an idea, upload it to the local bulletin board, and let word of mouth do its work. The dial-up boards at the time had download ratios: you could snag any file you liked, so long as you uploaded a certain amount of data in return. This kept a healthy give and take of files going: whenever you had something neat to share, you would upload it to earn more download credits. Whenever you saw something that seemed interesting, you would grab it in hopes of exchanging it elsewhere.

By 1993, there were already some large shareware publishers – Apogee, Epic MegaGames – with a large stable of developers and a bunch of influence in distribution. Even so, shareware was very much a meritocracy, guided by the tastes and whims of individuals: the best, most interesting downloads spread. As a result, shareware got really good, really fast. It had to be programmed well, so it could fit into as compact a download as possible. It had to push the limits of what was expected on modern PCs, so people would spread it along. It had to be entertaining, so people would actually write out a check and pay for more.

While Intel released its Pentium processor, shareware developers figured out how to crack the basic memory barrier that had so long plagued DOS-based games. DOS extenders like DOS/4GW allowed programs to run in protected mode and address memory well beyond the old 640K limit, without any need for the user to fiddle with his computer’s settings and reboot half a dozen times. The moment the Pentium arrived, all of its power was available for manipulation; it was almost as if a light switch had been flipped. Suddenly PC games could be “as good” as Genesis or SNES games – meaning as flashy, as complex, as fast to respond.

Into all of this came id Software. Carmack, Romero, and company had already put publisher Apogee on the map with the Mario-inspired Commander Keen series. In 1992, they cobbled together a primitive yet elegant pseudo-3D engine and decorated it with Nazis, for extra shock value. Although in many ways inferior to Origin’s Ultima Underworld, Wolfenstein 3D was free to download and share – and therefore reached the widest audience, illustrating for the first time the potential of shareware as a distribution model, and showing many people their first glimpse of virtual 3D space.

This year, id went indie to publish its pseudo-sequel to Wolfenstein – a little opus named “Doom”. And just like that, just as the PC was starting to take hold as a platform, to explore its boundaries, everything began to change.

On the commercial end of PCs, you had the left hand to the right hand’s 3D: multimedia. CD-ROM drives were now available, and people bought them en masse, bolstered to a large extent by Nintendo PlayStation almost-ran Myst – a game which would become, and remain for some years, the highest-selling PC game of all time. Between Myst and The 7th Guest, the template was essentially set for mass-market PC games: actors, big budgets, ray-traced graphics, and arbitrary puzzles. It all seemed neat at first. They were almost like movies!

Oh yeah – then there was the Atari Jaguar.

1996: A New Dimension

By now it was clear that multimedia was a dead end. Whoops. So that left the Virtua kids as the fathers of the next generation. The PlayStation had been out for a few months, and by now it seemed like it might stick around for a little longer than the 3DO or CD-i or who knows what other junk the last few years had seen. Sega’s Saturn launched in Japan to rapturous applause and in the US to the sound of crickets. As both stumbled to find some direction for themselves, Nintendo finally got around to thinking about its next console. The Virtual Boy had been such a flop that Gunpei Yokoi – the man who turned Nintendo into a videogame company – was forced to resign; a year later he would be dead. So whatever Nintendo had to say, it would be all on Miyamoto’s terms.

The N64 came out with a bang. With it came analog control and Mario 64, the pair of which gave the first real hint at how to use this 3D space that everyone was so excited about in principle. Sega said “Oh yeah” and released NiGHTS. Dyed-in-the-wool Sega fans swooned. Everyone else furrowed his brow and turned to Lara Croft.

Tomb Raider appeared, cross-platform, at the same time as Mario 64 – and like Mario 64, it offered an example of a third-person adventure through 3D space. Lara was to Mario as Sonic had been, five years earlier: a rung up the age ladder. She had boobs; Nintendo kids were now interested in boobs. She offered a grittier, more adult world and more highbrow game design – based more on Prince of Persia than on Donkey Kong. And, at least with the first game, you could play with her whether you owned a Saturn, a PlayStation, or a PC.

Lucky that Quake had just come around and created a market for 3D graphics cards. Now, officially, the 3D age had begun – and there was no turning back. Every game had to press the new hardware in unprecedented ways, signaling a need for more powerful graphics cards, which had to be pressed just as hard. If you weren’t part of the race, you weren’t in the running.

On the less idiotic front, Quake also came around at the same time as the World Wide Web became navigable, opening up the Internet almost overnight. And with this new world of IP addresses and packets came online multiplayer games – starting with Quake deathmatches and Warcraft, soon leading into Ultima Online.

Within the next year, Sony will have paid Eidos to keep Lara off of Sega’s console. Final Fantasy VII and Symphony of the Night will have arrived. “PlayStation” will become the new generic.

In Japan, Nintendo continued to profit thanks to the seven-year-old Game Boy and a new monster collection game that got people communicating with somewhat shorter wires.

2001: Introspection

And then… nothing. Sega released an excellent little follow-up to the Saturn; it had the biggest launch of any console in history. For a year and a half, Sega and its third parties kept pumping out some of the best software on any system ever – and nobody really cared. Sony told them to wait, because the PS2 would be so much better – so they waited. And Sega disappeared under a rock. And the PS2 hit, and at first it… kind of sucked. It wasn’t really any more powerful than the Dreamcast. There weren’t many games, and what games it did have were either ugly, terrible, or both. Sony began to bleed money as everyone in Japan bought the PS2 as a cheap DVD player and ignored any actual game software out for it. Sega fans caught up on the dozens of games they’d been missing; Sony fans drummed their fingers, waiting for the fun to start.

In 2001, that began to happen – albeit in a form nobody quite predicted. While Microsoft and Nintendo dumped their systems on the market, to middling reception (and even less astounding long-term effect), and Sega set about shooting off its remaining toes, a handful of… strange games began to creep onto the market, practically under the radar compared to what everyone was meant to care about.

Toward the end of the year, Sony managed to sneak a little gem called Ico out the door. Critics loved it; nobody bought it. Though not exactly fun, it was really observant. It took Prince of Persia or Tomb Raider, then stripped it down to the barest essence of its design, with the intent that each element of the design should somehow support the basic themes and emotions that the game was trying to express. There were no hit points, no gamey devices, because fighting and getting hurt wasn’t the point. The levels were constructed more to give a feeling of scope than for convenience. The result was a subdued, understated game – high on the concept, low on the game design. The development community collectively said “hmm”, raised an eyebrow, and jotted down the name Fumito Ueda.

On the exact same day, the PS2 received a sequel to the late-era PlayStation horror game Silent Hill. The original game was renowned for its subjective approach to horror, and for how it used the hardware’s limitations as a strength. Under the guidance of Takayoshi Sato, the sequel seemed like more of the same. If anything, it seemed to kind of miss the point of Silent Hill. And yet, even more so than Ico, there was more going on. Every element in the game, from the inventory to the monsters, reflected an element of the main character’s personality. Every action the player took – even inaction – was tracked and analyzed in psychological terms. Depending on what the player’s actions said about the main character’s state of mind, the game ultimately read a different motive into the protagonist, and therefore gave him a different conclusion to his journey.

Sega’s own Tetsuya Mizuguchi produced one of Sega’s first cross-platform games since the early ’80s, taking what he saw as the most fundamental form of a videogame – the rail shooter – and building on top of it, in an attempt to see how complex a concept he could explore within as simplistic a framework as possible. The result, Rez, uses music, rhythm, force feedback, and visual cues to produce a sense of euphoria and “oneness” in the player. Thematically, the game explores the evolution of life and the human mind from primordial soup, through the great civilizations of Earth, to enlightenment – and all the player does is move a cursor around the screen.

Even the one immense blockbuster game – the game, if any game, that convinced people to hold off on the Dreamcast for Sony’s wonderbox – turned out differently than anyone expected. It turned out that, for over a year, Konami’s Hideo Kojima had misrepresented the contents of Metal Gear Solid 2 – for much the same reason that lay behind many of his design decisions within the game: he wanted to mess with people’s perceptions. The original Metal Gear Solid was already kind of silly – intentionally so. It did everything it could to break the fourth wall and force its audience to notice how absurd it was. The problem was, nobody noticed; the existing gaming audience simply accepted the game at face value and thought it was awesome. For his sequel, therefore, Kojima simply turned up the heat. He put the player in the role of an effete, emasculated “gamer” who yearns to meet up with Snake. He put far more polygons than necessary into Snake’s buttocks. He put Snake in a questionable relationship with his scientific advisor Otacon, and turned Otacon into a complete weirdo. Fans screamed bloody murder and stormed out of the building. Kojima began to attract a completely new base of fans.

Those new fans were mostly people who had been growing restless with the brainless, essentially unquestioning nature of videogames to date – those Nintendo fans who had grown older still and were now looking for some deeper meaning in their hobby: not so much legitimacy as an art form as just some kind of actual inspiration, some emotional or intellectual meat to keep them interested. And this was the year that they started to get their wish; the year that, in place of real hardware or design innovations, videogames began to innovate in the realm of mature expression.

People started to think differently of videogames: thus the (frankly kind of misguided) “games as art” and “new games journalism” movements. People began to write about them differently, analyze them differently. Take them more seriously – because games were starting to appear that were worth taking seriously. And there was much conflict.

The gaming community – players, developers, the press – began to split into two camps: the “technologists”, who wanted videogames to remain essentially as they were – except bigger, better, more awesome – and the “expressionists”, who continued to look for ways videogames could better convey meaning, however unconventional the method.

At the same time, MMO games began to go nuts. Phantasy Star Online introduced online multiplayer to game consoles, while EverQuest became the bane of hard-working spouses across the world.

Also, the Game Boy Advance seemed pretty neat at first.

2005: The Re-Evaluation

A few years later, progress was minimal. Games kept arriving that were worth considering past their surface level – Katamari Damacy, Silent Hill 4 – even gamey games like OutRun2 and Gradius V, which are cerebral in their gameyness. A jazzy new handheld and a powerful new home console appeared, both way too expensive and kind of under-supported. Despite all the mockery, game writing continued to change. The market continued to change. Perceptions of game design, the game industry, and the ultimate purpose of videogames began to change. Issues like EA Spouse got everyone pondering this monster we were now a part of. Introspection began to give way to a re-evaluation of everything we thought we knew about videogames.

Into all this, Nintendo released the DS – a seemingly bizarre successor to the Game Boy with features nobody asked for, and which did not seem immediately conducive to any existing game that anyone wanted to play. Then it came out, and it was glorious. It soon became one of the biggest hardware successes in Japanese history, and it didn’t do too badly over here, either. Recently-appointed Nintendo president Satoru Iwata began to speak at length about the need to simplify; to strip away the layers of junk that had been built up over the last twenty years, and to appeal to a mass audience again.

Indeed, people who never played videogames before were attracted to the DS – in part due to the interface, in part due to the mix of software. Frivolous games sat next to experimental games, which sat below applications that held some significance in people’s day-to-day lives. Hardcore gamers still chortled, while the disenfranchised grew inspired.

And then came the build-up to the Revolution. What the DS had done for handheld games, refining everything the Game Boy had ever meant, the Wii was meant to do for console games – except more so. Most of the rest of the year was filled with speculation: how would it work? Would it suck? Would it really be a revolution? Who does Nintendo think it is, anyway? The upside: at least it got people thinking – seriously thinking – about what the purpose of the last thirty years of growth has been.

The Next Generation

Today, the future looks bright. More than at any time in the past, the next generation looks like it will be a defining generation in the history of videogames. The hardware’s just a part of it. The Revolution – or rather the Wii – should be interesting. It should be especially curious to compare its progress with the PlayStation 3. What’s more interesting is how engaged the gaming culture has become – and in such a short time. We have thirty years of hindsight, millions of creative minds, and an Internet at our disposal. We’re mostly adults now. We have things to say, and videogames are the medium of our generation. And the thing is, now we know it. And the control, therefore, is in our own hands.