If you own a smartphone or have a Facebook account, we have no doubt that you have played a free-to-play game. Maybe you scrimmaged in Clash of Clans, grew crops in FarmVille, or erected a mighty kingdom in Elvenar. Whichever free-to-play game you chose, the point is this: it is nearly impossible to have even a mild interest in video games these days without free-to-play games swallowing your spare time.
Odds are you haven’t actually paid anything in the free-to-play games you have played. The format, which makes money from purchases inside the game — new skins, characters, abilities, mounts, and so on — after it has been downloaded for free, only sees purchases from about 2.2 percent of its players, according to a 2014 report by Swrve. This year’s Swrve report has an even grimmer statistic: nearly half of free-to-play revenue comes from just 0.19 percent of players.
That is staggeringly low. The players willing to shell out money for upgrades and perks are known as “whales” within the industry, and it doesn’t take a business genius to see that banking on whales alone is unsustainable.
So why are free-to-play games still touted as the release model of the future if only 0.19 percent of the people playing them are paying?
The simplest answer for why the free-to-play monetization model is being so rampantly adopted despite the cracks in the veneer is that developers and publishers know an upfront price of $60 or more is a big deterrent. Realistically, the only people willing to take that gamble are dedicated gamers. Making a game free-to-play at least gives developers and publishers a shot at the much larger market of people who simply have spare time to fill.
Are free-to-play games sustainable? Only time will tell. But for now, you had better believe the industry is banking on there being plenty of whales out there to ensnare.