Every now and again The Well-Red Mage poses a real head-scratcher to the gaming community. He calls this event in which he asks big questions of the gaming community: Asking Big Questions. How on-the-nose! How brilliant in its simplicity! He’s touched on all kinds of hotly debated topics in gaming, and this time around his question focuses on old versus new, or at least the progress from old to new. Specifically, he’s asked us this:
Is Game Quality Improving?
Now on his website, Red chose to address a specific facet of the topic. That is to say, he answered one possible interpretation of the question – is the average video game today of higher quality than the average video game of the past? If that’s the version of the question you want answered, then by golly I don’t think you can do better than Red’s response. From a mathematical standpoint, you can’t really argue against it. There were fewer games back in the day, and those games were filtered through a quality control process by a select few companies for whom gaming was their bread and butter. Now, anyone with half an understanding of software design can make a game – accessibility leads to saturation, and the chance of reaching your hand into a pile of old games and selecting something decent is a lot higher than accomplishing the same feat with a pile of newer titles.
In my view, though, there are other interpretations of this question. “Is Game Quality Improving” may very well be purposefully vague so that each of us in the community can find our own angle. For me, the angle I want to talk about is a comparison of the typical design elements of older games versus newer ones. While no two games are driven by the exact same design philosophy, a combination of technological limitations and beliefs about what works in games leads to trends that we can identify and discuss. I particularly want to hit on two topics: accessibility and difficulty.
For the purposes of this discussion, I’m using “accessibility” to mean the ease with which anyone can pick up a game and learn to play it. Accessibility in the sense of gaming for folks who have disabilities is a topic for another day. A common criticism of modern games, particularly by those who grew up during the 16-bit era, is that modern games are oversaturated with tutorial content.
“You can’t just pick up a game and play it anymore,” they say. “You have to get through a three-hour tutorial first.” The instructional portions of video games certainly seem to have increased in recent years. Let’s look at some specific examples. The original The Legend of Zelda drops you in the middle of a field close to an open cave and gives you little else to work with. You can walk over to the cave and pick up a sword, but you are totally free to skip it entirely and wander the world. Of course, if you do so you’re probably gonna die, but the fact remains that the game doesn’t make you do anything. It doesn’t tell you what items do when you find them, it doesn’t explain how to find secret entrances – you just have to wander around and figure out what’s going on yourself.
Compare that to Breath of the Wild. The first few hours of the game are essentially a tutorial. They are a more open tutorial than most, sure, but if you want to explore the wide world then you have a pretty set path to travel for the first portion of the game. Every time you pick up a new item, a window interrupts the game flow to tell you exactly what that item does, or what buttons to press in order to open the quick-select menu and equip it. You don’t have the freedom to go where you want, and a ton of information is shoved down your throat all at once.
These examples for the most part capture the typical perception of each era. Older games are seen by detractors as having too little guidance. You can’t just pick up and play because the game doesn’t tell you what’s going on. Those in favor of this style cite the freedom to choose, the fact that you have to learn things the hard way instead of having it all handed to you on a silver platter. There’s a value in learning things manually rather than being spoon-fed, and as a bonus, your immersion is never broken because you’re not dealing with out-of-context tutorial boxes interrupting your every move.
Newer games are seen by those who prefer an older style as too rigid and as holding the player’s hand too much. You can’t just pick up and play because the game spends so much time telling you what’s going on. Between the twenty-minute cutscene exposition, the twenty minutes of text dialogue, and then the hour of navigating menus as you learn every single technique you may ever need in the game, you lose an entire play session to the minutiae of a game without experiencing any substance. But those in favor would say that this style increases accessibility, that it’s easier for anyone to play a game when the game gives you the tools you need to succeed. A game with no guidance or direction becomes frustrating, and there’s nothing worse than getting stuck because you’re not sure what button combination to press to use the move you need.
Another subject that tends to come up when comparing games then to games now is the level of difficulty you tend to see. I chose the photo here quite intentionally – Mega Man is notorious for punishing platforming levels that are tricky to navigate. Some of the challenges can feel downright unfair when you’re playing through them. “How was I supposed to know a laser was gonna blast me as soon as I landed in that spot?” The answer to that question is “you’ve played it before” – older platformers often relied on memorization as the measure of mastery. A game gave you ridiculous circumstances with little warning of what was coming. You play a level over and over again, slowly learning the patterns of the platforms and the attacks of the bosses until eventually you finish the level successfully. These techniques make a small amount of game go a long way.
Modern games, though – those suckers are too easy. Bosses have glowing weak points and your tutorial companions tell you exactly how to win the fight if you don’t figure out the secret fast enough. If you die too many times, a magical golden mushroom makes you invincible and gives you permanent flying. Platforms are wide and the game telegraphs lasers to you twenty minutes before they’re going to hit. However, the lax difficulty makes it more likely that any given gamer will make it to the end of the game, and there’s a lot more world to explore in modern titles compared to older games as a result. How many of us have played a classic title and never beaten it because of the difficulty? The idea is that modern games don’t have that problem because the difficulty is less punishing.
These topics – accessibility and difficulty – easily become intertwined. Modern games are typically seen as being too easy in favor of increasing accessibility, while older games are seen as more difficult and less accessible but having a greater degree of quality as a result. Someone who completes an older game is in a special class of gamer, more accomplished because they invested the time to learn all the tricks and secrets, spending hours of their time to memorize every platform and level so they could finally overcome the game.
Another argument for why older games might be better is how they conquered technological limitations in order to increase accessibility. For example, every level design decision in Super Mario Bros. was intentional. The open space at the beginning of the level gives the player room to try out the movement controls and press the jump button. There’s tons of room to jump on top of the first Goomba and earn points. Question marks signify special blocks that give coins or items. The first mushroom appears right next to a pipe underneath a row of blocks, leaving the player narrow space to move around and making the mushroom difficult to avoid – this gives the player the opportunity to learn that they don’t want to avoid mushrooms at all, because touching one makes their character gain points and grow larger. Clever techniques like these allow the first level of Mario to be a tutorial without a single box of text telling you how the controls work or giving you a lengthy exposition.
Of course, techniques like these work for new games as well as they do for older games. While earlier I used Breath of the Wild as an example of an overlong tutorial, this is a game which also uses environmental cues very well. After you get off of the Great Plateau the game gives you a single quest point to head to Kakariko Village. You can run straight there, of course, but environmental cues guide you along a path that allows you to see your first Guidance Tower, a bunch of shrines, a stable to capture your first horse, and the Korok which upgrades your inventory pouch. How many of you who have played the game, after leaping off of the Plateau, immediately made it your mission to run through the middle of the mountain split in two? You don’t have to go there, but it’s such an appealing location that serves as a great landmark – it’s no accident that all of the early-game tutorials are positioned along that path.
Still, it’s easy to talk about specific examples – it’s a whole ‘nother challenge to compare the style that was generally popular with older games with the one that is generally popular with newer titles. Has game quality improved because it’s much easier for anyone to pick up the game, understand what is happening, and complete that game successfully? Or were games better when it took a special kind of perseverance and determination to memorize the levels and complete the game?
My cop-out answer is that it depends on your perspective. In my view, gaming and gaming criticism are subjective hobbies. While The Well-Red Mage strategically posed his question in such a way that an objective answer is possible and makes sense, I intentionally phrased mine in a way where there isn’t really a right answer. In general, I think that gamers who grew up in the 8- or 16-bit eras are going to prefer the games of that time, and gamers who have cut their teeth on modern titles are more likely to prefer them. I believe there’s no correct answer to this question – but that doesn’t mean I won’t share my opinion!
In my opinion, modern games can be executed poorly, but in general they are moving in the right direction. Game quality is improving – or at the very least, the potential for quality is improving. In my view, the value of a game rises and falls on how much fun I have playing it, or how much the story resonates with me. Older games, which rely on repetition and memorization to create challenge, are more frustrating than fun, and their small worlds and weak stories often leave me wanting more. Sure, the typical modern game might hit you with too many tutorials or over-explain, but once you push through that problem the game is a lot more enjoyable. Things like grinding for EXP in RPGs or repeating the same segment of a platformer over and over again to try and master the level leave me dissatisfied. I think older games did the best they could with the resources available at the time – but that still doesn’t mean that if you hand me almost any older game and almost any newer game, I won’t prefer the newer one.
Part of the reason I’m okay with looking at things this way is because I want games to get better as we go on. If the classics I played as a kid are the best video games ever made – if quality never improves beyond that – then why in the world would I keep playing video games? Quality has to be improving for this hobby to continue to grow and change. Perhaps someday, the popular trends will swing in a direction which causes me to feel like games have plateaued, and I too will pine for the good old days. But I hope when that day comes that there is a young whippersnapper out there, typing away on a little gaming blog, posting their defense for why the new games of the era have improved upon the old.