In recent years, video game developers have focused heavily on the multiplayer experience. Long-time gamers will know this wasn’t always the case, however – there was a time when video games were almost exclusively single-player experiences. Some games featured split-screen multiplayer, such as the classic N64 first-person shooter GoldenEye 007, but these modes were meant to be played with all players in the same physical location (no online play) and were usually limited to four players; the single-player experience remained the focus. That is no longer true. According to Forbes, only two of the ten best-selling video games of 2017 were single-player games, continuing the trend of the past decade, in which just two or three single-player games have held top-selling positions.
Why are video games becoming more focused on multiplayer?
Spil Games’ 2013 “State of Online Gaming” report showed that nearly 700 million people around the world played online games – around 45 percent of all online users. The Entertainment Software Association’s 2016 annual report found that 54 percent of Americans play video games online. Once a subset of the gaming experience, online gaming has become very popular and moved to the forefront. In many ways, the culture of gaming has changed, and the shift away from single-player games may be driven by commercial concerns. Game developers have taken notice and have centered their business models on the multiplayer experience. What was once an additional game mode to be played occasionally with friends has become the bread and butter of most video games. DLC expansions and embedded microtransactions also allow developers to extend the life span of their games, enabling them to make more money from each title than before.
Recurrent consumer spending aims to keep games fresh through new content, and this is more easily done through multiplayer modes (though some games do offer expansions of the single-player story). For example, GTA V’s online mode has brought in $500 million from microtransactions alone, at a nearly 100 percent profit margin according to Forbes. Rockstar, GTA’s developer, need only add new vehicles, weapons, or missions to a game that has existed since 2013. Numbers like these are virtually unheard of for single-player games. While titles like GTA may be the exception rather than the rule, it is no secret that most game developers are following the trend of including microtransactions and “pay-to-win” models. The single-player mode had been the crux of the video game experience since the very beginning, but from an economic standpoint it is not surprising that developers have moved away from it: they can generate millions by sprinkling new content every so often into a project that already exists, rather than spending the time and money to create entirely new experiences and write new scripts.
What does this mean for consumers?
Many gamers feel that if developers over-use this model, it can result in unfinished products being brought to market, with content sold down the line that should have been included in the initial release. GTA Online was a boon for both players and developers because it was a two-way street: while Rockstar built a business model that would almost print money, it delivered a fully functional product that consumers enjoyed. Of course, there is still a market for single-player games, and many titles do well. The CD PROJEKT Group reported 584 million in sales revenue and 251 million in net profit for 2016 on the strength of its single-player-only title The Witcher 3. According to Eurogamer, the developers went on to produce DLC packs with over 50 hours of new content and new areas to explore, as well as 7,000 new animations. The multiplayer experience is currently in the driver’s seat, but single-player games can still be economically viable if done right.