With J.U.L.I.A.: Among the Stars out in the wild, I’ve decided to use some of my post-crunch time to revive my blog feature. Atypically, I won’t continue with our crowdfunding journey – I plan to return to that in my next entry. Instead, I would like to focus on an interesting phenomenon in the game industry – review aggregators.
Let’s state the obvious. Sites like GameRankings or Metacritic are based upon a really good idea – to save time for potential gamers, they aggregate all the reviews in one place and calculate a meta-score to give you an impression of the overall quality of the game.
Before I explain further, it’s important to understand how the big majority of gamers treat game reviews. In an age when too many games are competing for their wallets, they don’t have the time to dig deeper and understand the reasoning behind a score. So they just scroll down to the bottom to read the final verdict, despite the eternal wish of every reviewer to have their text actually read.
Sites like Metacritic make this easier because they are mainly about the numerical representation of overall quality. With a significant number of reviews this should do the trick, right?
The main problem is that Metacritic ignores the vast majority of review sites and accepts only a closed group of selected affiliates. Sure, everyone can apply to be part of this, but in reality only the largest or best-known sites get in. This of course works for AAA titles, where everyone fights to get the game reviewed. But what about small indies?
We struggle to be featured by these sites, battling thousands and thousands of other games trying to do the same. In the end, reviewers are flooded, and if they finally decide to review our game, they are often tired or don’t have the time to research our design goals and what is actually in the game. And when they write their review, we find paragraphs about how our two-man game, developed on a mere $8k budget, lacks perfect lip-sync. Such things are never the focus of more thorough reviews, which – and you’ve guessed it – are not normally part of Metacritic or GameRankings.
My second issue is that this isolation creates an illusion that reviews by writers for those large sites are somehow better and more accurate, while the smaller outlets are not as good. Nothing could be further from the truth: the people who contact us for a game key for review purposes usually spend much more time studying, researching and playing our games than those we have to hunt down.
In my experience there are incredibly talented and educated reviewers who are handicapped by the mere fact that their site doesn’t belong to the “elitist” group. Does that make their opinion less valuable or professional?
Another thing is the laziness of some of these sites. Would you believe that instead of creating a new Metacritic entry for J.U.L.I.A.: Among the Stars, someone simply swapped in the new game logo on the page for our old game from 2012, while keeping all the related data, links and scores? People reading the page would assume those reviews were written about Among the Stars, unless they happened to notice the incorrect screenshots and article dates. [UPDATE] Fortunately, the situation has just now been rectified, but should I have to actively check everything? The same thing happened with another article (written about our original J.U.L.I.A. release) where the reviewer just added “Among the Stars” to the old title instead of reviewing the actual new game, and there has been no response to my requests for a correction. My experience is that smaller review sites rarely commit such errors, because they pay far more attention to what they do, even with small developers like us.
So what to do?
The very first idea is to dismiss aggregators altogether, but that would be a bit dishonest to gamers. How should they decide whether our game is worth their time? We could do what the majority does – simply plaster our page with “box quotes” and glowing review links while silently ignoring the rest. In an age of rapid and easy access to facts, this is very short-sighted. Everybody will find the unfavorable ones, and the news will spread quickly.
And here comes the thing we are going to try now. We’ve decided to make a clear split between “showcase reviews” and all the reviews. While we still want to boast about great articles and nice verdicts, we want YOU to decide if our game is worth it.
For that reason we are doing our own J.U.L.I.A. aggregation from all the reviews we find out about. The only parameters we are using are: the review must have a score (be it a convertible letter grade or a value on a numeric scale), and the review should look like a review (i.e. it must be readable and contain accurate information about the game). We will treat each and every review equally, providing our readers with two values:
- Trimmed arithmetic mean – we eliminate the single highest and single lowest score, then divide the sum of the remaining values by the number of remaining reviews.
- Median – we simply give you the central value of all the scores when sorted, to give you an idea of a typical score without it being skewed by outliers.
Clarification: All reviews, including those without a score, will be listed in the All reviews section. They just won’t be part of the numeric aggregation.
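For the curious, the two values above can be sketched in a few lines of code. This is a minimal illustration, not our actual site code; it assumes the scores have already been converted from letters or other scales to a common 0–100 scale, and the example numbers are made up.

```python
from statistics import median

def aggregate(scores):
    """Aggregate review scores normalized to a 0-100 scale.

    Returns (trimmed_mean, median_score):
    - trimmed mean: drop the single highest and single lowest score,
      then average the rest;
    - median: the central value of ALL scores.
    """
    if len(scores) < 3:
        raise ValueError("need at least 3 scored reviews to trim both ends")
    trimmed = sorted(scores)[1:-1]  # remove one lowest and one highest score
    trimmed_mean = round(sum(trimmed) / len(trimmed), 1)
    return trimmed_mean, median(scores)

# Six hypothetical review scores: the trimmed mean ignores the 40 and the 95.
print(aggregate([40, 70, 75, 80, 85, 95]))  # → (77.5, 77.5)
```

Trimming one score from each end keeps a single outlier review (in either direction) from dragging the average, which matters a lot when the total number of reviews is small.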
I believe that this new method will actually be beneficial to potential gamers and reviewers alike. Reviewers will get exposure regardless of the outcome. While we still love good box quotes and nice scores, we deeply respect personal opinions, and there is no way our game could match everyone’s taste.
I am taking a big leap of faith, but at the end of the day I strongly believe it will show that we want to be as honest as possible with our community and that we accept that not everyone will love our games.
So let’s do it! Let’s remove Metacritic and Gamerankings from the equation and let gamers decide over a reasonable set of data rather than over just a few selected bread crumbs.
And it’s up to you, our readers, to trust that we are not cheating you, because CBE software is built upon mutual trust between our two-man team and you – our community.
You can see our new system in action here:
Thanks for reading and your comments are always welcome,