How Microtransactions Are No Longer “Optional”

“Microtransactions” has become one of the most hated terms in today’s gaming ecosystem, trumping even “Day 1 DLC,” the buzz term that ran riot in 2012. The idea of spending extra real-world money to earn a special item or skill has earned its share of detractors, yet it remains a hugely popular concept among developers. Companies like EA, Activision and even PC juggernaut Valve have made microtransactions a big part of their games. You can buy new weapons, skills and enhancements instead of spending countless hours playing the game to unlock them. It may sound like sheer convenience at first glance, but the way this idea is entering the mainstream gaming market is anything but subtle. Developers insist that you aren’t required to use them, but anyone who has actually played these games in today’s ecosystem knows how hollow that claim rings. The point is this: developers are slowly making microtransactions mandatory for gamers.

The logic behind microtransactions is that they’re an option for players who can’t (or won’t) play the game long enough to unlock every mode, skin or ability. They are not a mandatory feature; no one is forcing you to pay real-world money to unlock something. And the developers are right, it’s not mandatory…from their side of the fence. Paying real money to unlock a powerful new weapon or stat boost is optional in theory, but when so many people are given the opportunity to do so, what are the odds that everyone will simply pass on it? If someone pays real money for a super powerful skill, many gamers will cave and take the same shortcut just to keep up. This is especially apparent in competitive multiplayer games like Forza Motorsport 5 on Xbox One. If there is an option to one-up your opponent quickly and without repeated gameplay, someone will take it. It’s all an effort to become better than your rivals. To rise up the ranks, every single advantage needs to be taken; it’s the nature of competition. Microtransactions are that simple advantage that lets you progress instantly, so to keep pace with your rivals online, paying real money for better skills or weapons isn’t just advised: it’s required.


Multiplayer environments are where microtransactions have hit their highest level of notoriety. If you pay extra cash for an advantage in the single-player campaign of Dead Space 3, that’s not something that calls your character into question on a widespread level. You aren’t necessarily “cheating” anyone (maybe yourself, but I won’t judge). Bringing that under-the-table advantage into a mass competitive environment is where the idea becomes an epidemic. The world of competitive online networks has changed the way we play games. If you dive into a game of Slayer on Halo 4 or a Free-for-All on Call of Duty, you’re in a mass competition against other people, people who’ve practiced and earned a significant amount of skill through that practice. In a competitive environment, that skill is (most of the time) rewarded. Microtransactions lower the integrity of competition. They blur the line between hardened practice and illegitimate victory, dropping the game into a murky limbo where it becomes harder and harder to know what constitutes a “fair match.” Giving everyone the ability to pay money for an advantage sounds fair, but the gaming community’s mantra has always been about practice and playing the game effectively. It’s how many of us grew up as gamers. Challenging that mantra with microtransactions is not something everyone will accept: some will pay, some won’t. That split creates an immense amount of imbalance when the game actually gets going.


What is especially troubling is that many games are now being built with microtransactions in mind. Games like World of Tanks and free-to-play browser games have progression systems that don’t just accommodate microtransactions but are centered on them. Some games simply couldn’t function without the ability to jump the line, practically forcing you to pay up. Probably the worst offender is Final Fantasy: All the Bravest, a game that makes you wait three minutes for each fallen party member to revive unless you’re willing to pay for an instant revive. It can lock you out of the game for a dozen or so minutes (depending on when each party member dies and how many there are). To continue playing All the Bravest at that point, you need to pay up.
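To see why a mechanic like this is monetization rather than design, consider how little it takes to build: a pay-to-skip timer is just a timestamp and a flag. The sketch below is a minimal, hypothetical Python illustration of the kind of revive cooldown described above; all names and numbers are my own assumptions, not the game’s actual code.

```python
class ReviveTimer:
    """Hypothetical sketch of a pay-to-skip cooldown, loosely modeled on
    the revive mechanic described above. Illustrative only."""

    COOLDOWN_SECONDS = 180.0  # the "three minutes" per fallen party member

    def __init__(self, knocked_out_at: float):
        self.knocked_out_at = knocked_out_at  # timestamp the member fell
        self.skipped = False                  # set True by a purchase

    def is_revived(self, now: float) -> bool:
        # Revived either after the full cooldown elapses or after a purchase.
        return self.skipped or (now - self.knocked_out_at) >= self.COOLDOWN_SECONDS

    def pay_to_revive(self) -> None:
        # A real game would charge the player's wallet here; this sketch
        # only flips the flag that unlocks play immediately.
        self.skipped = True


timer = ReviveTimer(knocked_out_at=0.0)
print(timer.is_revived(now=60.0))   # False: only one minute has passed
timer.pay_to_revive()
print(timer.is_revived(now=60.0))   # True: payment skips the wait
```

The point of the sketch is that the wall between the player and the game is entirely artificial: nothing is being computed or loaded during those three minutes, and the only purpose of the flag is to sell the player a way around a delay the developer created.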

The inherent issue with the recent surge in microtransactions is the gaming community’s fear that the basic structure of games will change to accommodate them. More and more games (many of them competitive) are using microtransactions, and developers keep experimenting with new ways to work them into their games. But why? Why has this become such a fascination? The most common answer is money-grabbing: a way to continuously earn money from gamers without producing anything truly substantial like story-based DLC or a patch with improved mechanics. Microtransactions are easy and inexpensive to develop, so they’re a sure-fire method of squeezing out at least a little extra profit. But for the community itself, they serve no purpose. It’s a confusing idea that blurs the distinction between practice, payment and performance. Games like All the Bravest prove that the idea can easily be abused, and it’s becoming even more problematic in competitive multiplayer games, games that at their absolute essence are made to show who has the better skill, not who can pay the most.


Microtransactions really don’t serve any purpose from the gamers’ side of things. From where we sit, there is no reason for them to exist. The once clear-cut environment of competitive and even social gaming has become dark, foggy and muddled, and the competitive nature of games is morphing in unsettling ways. Being able to spend money on gameplay advantages does sound like a simple convenience, but the widespread way it’s being used is why it has become such a hated idea. It’s one thing to jump straight to the best weapon in a single-player game; using payment to gain an advantage over thousands of online players is something else entirely. Microtransactions do more damage to online gaming than developers expect, padding profits while muddying the waters of legitimacy and competitive integrity.

Microtransactions have the potential to be a worthwhile inclusion to games, one that isn’t mandatory or game-breaking. However, right now, they’re destroying competitive gaming.

Do you agree? Disagree? Are microtransactions really that dangerous to gaming? Sound off in the comments below!