Gauging Player Response To Your Game

  • I originally posted this on my blog. The link is in my signature. The direct link to the article is kurie.us/2013/11/the-power-of-nps-and-customer-reviews in case anyone would like to read it with better formatting. Otherwise, the complete text is below. I wanted to get feedback on what other devs thought about this idea.

    NPS is a powerful tool that can be converted into very useful metrics for measuring a game's performance. That's a broad statement, but it's true. The best part is that we already have the tools to do this: star reviews on the mobile markets gather the information for us. It seems like a very "duh" thing to say, that star ratings mean something, but I would argue they aren't really being utilized properly.

    First, let me start by explaining NPS.

    Net Promoter Score® is a customer loyalty metric developed by (and a registered trademark of) Fred Reichheld, Bain & Company, and Satmetrix. It was introduced by Reichheld in his 2003 Harvard Business Review article "One Number You Need to Grow". NPS can be as low as -100 (everybody is a detractor) or as high as +100 (everybody is a promoter). An NPS that is positive (i.e., higher than zero) is felt to be good, and an NPS of +50 is excellent (Wikipedia, 2013).

    NPS is such an important metric because it's one of the first to quantify the customer experience and, more importantly, the value of word-of-mouth exposure. Businesses have always known that word-of-mouth advertising is the best anyone can get, but we've never been able to easily place a dollar value on it. We've also had very little insight into how negative word of mouth can impact a product or a business's image. In short, NPS lets us quantify the customer experience and how it affects our bottom line. Put into perspective, that is a really powerful metric, and we should be paying attention to it.

    Did you know that it takes 5 good responses to negate a single negative response?

    NPS is typically measured on a 0-10 rating scale, and in some cases (depending on the business) a 1-5 scale. On the typical 0-10 scale, 9s and 10s are considered promoters, 7s and 8s are considered passives, and anything below is considered a detractor. Promoters will (obviously) promote the product and the business, and they tend not to go anywhere. Detractors will badmouth the product or business and jump ship as quickly as possible. Both promoters and detractors tend to be vocal and can offer great feedback. Passives are harder to gauge. They tend not to be vocal, and there is a bit of an art to raising their allegiance to a product or business. They are also opportunists: they will stick with a product or business because they are comfortable, but if they are offered something better, they will jump ship.

    As the quote states above, NPS can vary from -100 (all detractors) to +100 (all promoters). The metric isn't really a percentage, although some companies display it as one to make it easier to conceptualize. The score is the percentage of promoters minus the percentage of detractors. So, if a product receives 5 promoters and 3 detractors out of 8 responses, the score would be 62.5% - 37.5% = 25. Some companies will phrase this as 2 net promoters out of 8 responses (a 25% "happiness rating") to make the metric easier to grasp. I typically measure the metric in this way as well.
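    To make the arithmetic concrete, here is a minimal sketch of the calculation in Python (my own function name, not part of any official NPS tooling):

        def nps(promoters, passives, detractors):
            # Net Promoter Score: % promoters minus % detractors, ranging from -100 to +100.
            total = promoters + passives + detractors
            if total == 0:
                raise ValueError("need at least one response")
            return 100.0 * (promoters - detractors) / total

        # The example above: 5 promoters, 0 passives, 3 detractors out of 8 responses.
        print(nps(5, 0, 3))  # 25.0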

    NPS is typically measured with 2 or 3 questions.

    How likely are you to recommend this product to friends/family/colleagues?

    (Optional) How likely are you to recommend this company?

    Why?

    The first and third questions are the important ones; the second is typically thrown in for good measure to rate the company as a whole. The first question boils the equation down to something easy and intuitive. It measures the whole package of customer happiness. It's unique because it offers very specific insight while staying easy enough to answer to garner more responses. The more responses, the more accurate the results (which is what NPS is designed to offer). The third question completes the feedback loop and gives the customer a chance to explain why they are happy or upset. Anything above a 0 score isn't too shabby. Anything above 50 is typically considered great. Anything below 0 needs to be examined closely and fixed.

    How does that relate to star reviews though?

    Game developers obviously aren't going to send customers a questionnaire, and very few games have a system in place to do this (mostly MMOs or social games). Still, the process has to be friction-free for the customer to participate. That's where we have it easy: app stores already engage the customer and ask that magical question for us. The very act of having a star rating system is basically asking, "Would you recommend this app?" The stars ask the question while the comments close the feedback loop. As application developers, we are lucky to have some of the most vocal and responsive customers. Compare app reviews to just about any other product; the response rate is typically much higher.

    So how do we boil those star ratings down to an NPS? That's easy enough. Businesses already use an established system with 1-5 scale ratings, which translates directly to 5-star reviews. A 5 is considered a promoter, a 4 is considered a passive, and 3 and lower are all considered detractors. The comments close the feedback loop and explain why the customer rated the app the way they did.
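    Under that mapping (5 stars = promoter, 4 stars = passive, 3 stars and below = detractor), turning a store's review breakdown into an NPS is trivial. A sketch, assuming the star counts have already been copied out of the store dashboard:

        def nps_from_stars(star_counts):
            # star_counts maps a star value (1-5) to the number of reviews at that value.
            promoters = star_counts.get(5, 0)
            passives = star_counts.get(4, 0)
            detractors = sum(star_counts.get(s, 0) for s in (1, 2, 3))
            total = promoters + passives + detractors
            return 100.0 * (promoters - detractors) / total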

    Let's use JetPack Joyride as an example (mostly because it's one of my favorite games). Specifically, I'm using the Android version, though an accurate analysis would use all versions of the app on all ecosystems (sometimes segmented markets like the Apple App Store, Google Play, and the Microsoft Store require some independent gauging). At the time of writing, Jetpack Joyride has an average rating of four and a half stars (out of five), broken down as follows (Studios, 2013):

    321,905 five star reviews: Promoters

    40,234 four star reviews: Passives

    18,862 three star reviews: Detractors

    7,119 two star reviews: Detractors

    23,915 one star reviews: Detractors

    That means JetPack Joyride has 321,905 promoters and 49,896 detractors (you can't please everyone). That works out to roughly a 66% NPS, which is considered really great! Keep in mind, anything above 0 is trending in the right direction.
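    Plugging those counts into the nps_from_stars sketch from earlier reproduces that figure:

        jetpack = {5: 321905, 4: 40234, 3: 18862, 2: 7119, 1: 23915}
        print(round(nps_from_stars(jetpack)))  # 66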

    Let's compare that to an app called Flight Track 5. I specifically picked this app because its ratings are a bit deceiving. At the time of writing it has a 3-star average rating from a total of 64 responses. Those responses break down to (Mobiata, 2013):

    26 five star reviews: Promoters

    3 four star reviews: Passives

    3 three star reviews: Detractors

    5 two star reviews: Detractors

    27 one star reviews: Detractors

    That means Flight Track has 26 promoters and 35 detractors. Yikes! That gives Flight Track an NPS of -14 (negative fourteen). Remember, NPS scores swing from -100 to +100; the percent sign is only added to make things easier to conceptualize and doesn't mean much. Looking at this data, Flight Track has a major reputation problem. I'm sure the developers could read the comments to find out why. Those comments complete the feedback loop and offer great insight into what the developers of Flight Track need to improve. Not every customer response will offer specific insight, but if a lot of people complain about the same thing, improving that one thing would drastically change customer perception. The comments are a good place to start.
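    The same sketch confirms the number:

        flight_track = {5: 26, 4: 3, 3: 3, 2: 5, 1: 27}
        print(round(nps_from_stars(flight_track)))  # -14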

    An Added Bonus

    The cool thing about using NPS this way is that customers can go back into the app stores and change their ratings. The data is dynamic. Developers can look at segmented time scales (perhaps before and after changes were implemented) as well as the entire life of the app. Think about how powerful that is for a moment: it gives the iterative life of an app, and its potential revenue, far more room to grow.
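    As a rough illustration of that kind of segmenting, assuming you can export reviews with dates from the store dashboard (the data below is hypothetical), you could bucket them by month and reuse the nps_from_stars sketch:

        from collections import defaultdict

        # Hypothetical export: (year-month, star rating) pairs.
        reviews = [("2013-10", 5), ("2013-10", 1), ("2013-10", 1),
                   ("2013-11", 5), ("2013-11", 5), ("2013-11", 4)]

        by_month = defaultdict(dict)
        for month, stars in reviews:
            by_month[month][stars] = by_month[month].get(stars, 0) + 1

        for month, counts in sorted(by_month.items()):
            print(month, round(nps_from_stars(counts)))
        # 2013-10 -33
        # 2013-11 67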

    During my travels around the interwebs, I haven't read much of anything about gauging and quantifying customer reaction and experience. Perhaps this is an easy and cost-efficient (the data already exists) way for developers to do exactly that.

    Works Cited

    Wikipedia. (2013, 8 13). Net Promoter. Retrieved from en.wikipedia.org/wiki/Net_Promoter

    Mobiata. (2013, 11 21). Flight Track 5. Retrieved from Google Play: play.google.com/store/apps/details

    Studios, H. (2013, 11 21). JetPack Joyride. Retrieved from Google Play: play.google.com/store/apps/details

  • You probably won't find much about that online, from developers at least: this is a marketing metric, which as it stands is more abstract than tracking the average score of a game from version to version.

    I would also prefer the "happiness rating" metric with relatively small audiences, simply because I doubt NPS translates well from a small data sample. Nor does it always work on bigger samples (see the Battlefield user scores on Metacritic, for example, or the Call of Duty ones).

    Anyway, why not, but I'll leave that to PR/marketing people.

  • Valerien I think that's part of the problem though: devs don't talk much about it. A lot of devs and games don't have dedicated marketing departments or people dedicated ONLY to that. Typically the devs, or start-ups, have to wear multiple hats. Understanding these metrics and having these tools is powerful.

    I won't deny that NPS is a debated topic. For many people, it doesn't seem to offer enough information. The problem is that many consumers won't fill out longer surveys, or even surveys with only a few questions. To understand a product, you need as much feedback as possible. NPS works well because it asks so little: the first question is crucial and the "why" closes the feedback loop. There is no point in asking customers what they like and don't like about a game if they won't answer you, or if you only get two responses.

    NPS is very boiled down, but the information is extremely powerful. I would read The Ultimate Question or look up the NPS spec on its website; the methodology is open and documented in detail there.

    Anyway, I offer this only as a tool and a suggestion, to bring some light to the matter. Making games is fun and can make a really great hobby, but those looking at turning it into a business need to consider all aspects of that business.

  • For us indies, quick metrics like the NPS are just way less important than good PR and other reliable data like conversion rates or, to some extent, tracking in-game stats. The main reason is that the error margin on small samples is simply huge.

    There are other reasons too: the rating of a game (the source of the NPS) doesn't predict how much your community will grow, nor does it give you a clue as to how sales will evolve or how well the next game will sell.

    This is still a potential tool though.

    However, I totally second you on that part: small teams should worry about the marketing, PR and legal aspects of video game creation. It's just as important as making good games in the first place, and doesn't require as much experience.

    Cheers,

    Nathan

  • Valerien Agreed. There are many metrics to look at, some better than others. Applying NPS to a game is only one tool; no one should depend entirely on a single tool.

    With that said, with freemium-type games and game trends moving toward games as a service (the best example would be WoW, but even Angry Birds has a hint of this), gauging existing customers is important too. If customers are responding with star reviews on the various markets, I think it makes sense to utilize that free and already organized data. The caveat, as you mentioned, is that a very small sample pool is prone to more error: having only 3 star-review responses against well over 2,000 downloads is not very indicative. But even a pool as small as a 20% response rate can be helpful (although obviously the more the better).

    On the flip side, this would have very little impact on a current game like 'The Last of Us'. That game is built entirely around its narrative; Naughty Dog isn't going to go in and change things after the fact or add gameplay. It would be useful for changing practices for future games, or DLC, but it will have little impact on the current game. (I still wouldn't dissuade someone from using this research for future projects; for this type of game it's more of a hindsight thing: learn and move on.)
