Game reviews are an interesting part of games journalism. They're articles with Schrödinger's importance. Reviews rarely get traffic on the level of features, guides, or news, and they're often too all-encompassing to offer the deep criticism more targeted features can. They also require a huge playtime investment, up against the clock, which means they're not only shallow and largely unread, they're also rarely your best work. And still, our industry is defined by them. A game's Metacritic score is used as a talking point in fan circles, or, when it differs from the User Score (which skews hard toward 0 = don't like it, 10 = do like it), as a cudgel against our integrity. They're 1,000 words, often significantly more, offering the first critical analysis of a game and the only criticism the game will ever receive in a vacuum. They're also just a number. We write them pretty rarely, in the grand scheme of how many people write for TheGamer and how busy our jobs are, yet when you tell someone you're a games journalist, they'll almost exclusively ask, 'Oh, so you review video games for a living?'

Even when we don't write reviews, we get emails complaining about our 'biased reviews', usually aimed at pieces like 'the lighting in the original Halo game isn't as good as I remembered'. You might be wondering what this has to do with anything. Well, join the club. I write about toys for a living; sometimes the nonsense just spills out. In this specific case, though, I've decided our review process could do with some grounding, so this will be our hub for what review scores from TheGamer mean. We have a large review team and have previously debated back and forth over what score to award any given game, so this helps put that in perspective.


Previously, I have moderated reviews with two simple rules: a) your score is the right score, and b) if you're torn between two scores, pick the lower one. These are still basically how I feel, and, naturally for such a subjective medium, any two staff at the site might score the same game differently. I've never understood the complaint of "TheGamer gave X game a six and Y game an eight, they suck!", which mostly makes no sense because our scores are out of five. But also, those were probably two different reviewers, and why do you care what scores a site you hate gave to two video games anyway? Still, it's important to have consistency, so going forward, these are the metrics all reviewers for TheGamer will use when they review any given video game. If we're wrong, at least you'll know why.

0.5 (out of 5) - An offensively bad game. The world would be a better place if this did not exist.

1 - Broken beyond all repair, completely dreadful. We gave Babylon's Fall a 1/5.

1.5 - Mostly broken but enjoyable in places, or working perfectly fine but with few redeeming qualities.

2 - A bad game. It just about works and mostly does what it sets out to do, but I would never encourage anyone to play it.

2.5 - Never quite executes its ideas properly, and it's boring. I resent that I have to review this.

3 - There is nothing wrong with this game, but I am never going to play it again once the review is over. We gave Mario Strikers: Battle League a 3/5.

3.5 - Fun and competent, but without many layers to it. Good for a laugh, but it's never going to get replayed.

4 - A very good game, extremely enjoyable but perhaps without the critical spark that makes it special. It won't trouble your yearly top five, but you would tell people to play it. We gave Horizon: Forbidden West a 4/5.

4.5 - A great game, guaranteed to make it onto my GOTY list.

5 - I'll be talking about this game for years, a rare title that will come to define the generation. Executes its vision perfectly and sets a new benchmark for its genre. We gave Elden Ring and The Forgotten City a 5/5.
