Whatever Happened To Player 2?

Playing Video Games

It is an understatement of almost irresponsible proportions to say that video games have changed over the course of the medium’s history. This is Big Bang Theory-joke-writing level of obvious. The graphics have improved immensely. Arcades, once a vital and necessary component of playing games, are mostly irrelevant. The actual gameplay has improved, and the controls needed to interact have mutated countless times. Storytelling and narrative are expected as a norm now, as opposed to being a unique exception, and it is easier than ever to argue that video games are an art form.

But the most important change has come in how we play with each other. The reason arcades used to be “the scene” (or just the place to be, if you don’t want me awkwardly trying to fit in my hip slang) is because that was where you had to go if you really wanted to play video games, especially with other people.

As computing power became more portable, arcades became less and less necessary. Just have your friends come over to your basement to play Street Fighter. No quarters required. Then the Internet came, and with it a massive PC game culture where you could play Counter Strike and World of Warcraft with your friends AND complete strangers from all over the world, with no need to ever leave the house. Then the platform consoles started offering online gameplay, and it’s now how the majority of us play games with one another.

Pants Are Now Optional

Pants are now optional.

Unless you have Nintendo.

Whereas the other gaming platforms have essentially done away with the concept of sitting on the couch and playing a game together with your friends, Nintendo is the last bastion of hope for slumber parties.

Nintendo has gotten a lot of shit over the past few years. “Serious” gamers have scoffed at the Wii and the Wii U as gimmicky novelty items with less computing power than their competitors and next to zero third-party support, and also it totally sucks your mom’s dick. These are all true (except for the part about your mom, she’s way too classy to let just anyone suck her dick). The Wii systems are largely gimmicky with their controls, and the lack of third-party options is a big reason why my Wii is still at my parents’ house and I never bothered with the Wii U. But for all the talk of how Nintendo has become a joke when it comes to the console wars, they got one thing right, and it’s a mighty important thing.

It’s that games are the most fun when you have people to play with, and even more fun when you are really playing together in the same space.

I mean, I get it. There are a lot of great things about being able to play games with people regardless of their geographic location. It allows people to stay in touch with friends who have moved away, or even forge new relationships. But somewhere along the line this became the de facto option. Most multiplayer games on Xbox and PlayStation now require that each participant play on their own system with their own copy of the game. Outside of fighting and sports games (and the occasional Little Big Planet), it’s harder and harder to find multiplayer games that allow a shared room experience. Remember how awesome split-screen Goldeneye was? Xbox and PlayStation sure don’t.

To be fair, there’s a lot Microsoft has tried to forget

There is something magical about sharing a space to play a game, to trash-talk someone who is right there to trash-talk back/knock the controller out of your hand. The ability to give your friend the stink eye when they chose Oddjob is at least half the reason Goldeneye is remembered so fondly. Playing games actually together makes it a better experience. If you don’t believe me, take a look at Wii U sales after Nintendo released Mario Kart 8 and Super Smash Bros. These are games that work specifically because they are designed for you and your friends to hang out together and play, and they’re selling like gangbusters. Not too bad for a system that was written off as dead a year ago.

What sucks the most about this aspect being brushed aside is that it’s done primarily for greedy reasons. Why offer a split-screen option on Destiny when forcing all of your friends to buy a copy as well makes for a larger profit? Having split-screen means your free-loading friends would be able to play with you but Microsoft and Sony wouldn’t get any money from it. You could argue that this is a necessity, a result of games becoming more complicated and costly. Yet Nintendo hasn’t lost sight of that, and while Super Smash Bros. surely didn’t require a Grand Theft Auto V budget, it wasn’t exactly cheap, either.

You don’t always get what you pay for anyway.

I tend to be very wary of waxing nostalgic, because nostalgia has a tendency to sugarcoat the past in a way that prevents us from being anywhere close to objective about it. It also produces terrible, terrible BuzzFeed posts. But this is one thing I genuinely do get nostalgic about. For as much as I love a lot of the directions modern gaming has gone (for example, I will kidney punch anyone who shit-talks The Last Of Us), I miss eating pizza rolls at my friend’s house while we played Gauntlet Legends.

I miss resetting the system when my brother beat me at NFL Blitz so that it didn’t affect my streak. I miss video games as a catalyst for actually interacting with other people, not just talking into a headset. For lack of a better phrase, I miss the old days. I like the new days, too, but since when has that ever stopped nostalgia?

Author: Tim Gaydos

Tim is a contributor for Robot Butt and is not hosting a parasitic xenomorph inside him, so just don't worry about it, ok? You can disagree with his opinions on Twitter @timthinksthings.
