Life is hard. I recently blogged about the Little Wars TV review methodology and pointed to apparent mistakes in their rating system. Although I praised the show – I really do love their intelligent chat – a few commenters on Facebook tore me to pieces. I was writing bullshit. How dare I? Clickbait! Etc. Well – that’s Facebook dynamics.
But I made a promise in my blog: to devise a review system and write one or more reviews that are logical, sharp, honest (objective is not the word) and informative. Here are my first thoughts, after consulting BGG and other sources.
Do’s and don’ts
1) The review should make comparisons. Why is Chain of Command better or worse than Bolt Action? A reviewer must play several games before reviewing – I set the bar at five at least. Too many reviewers just open the box and write about how much they like their new toys. They are very happy to share their positive opinion of a new system they bought, or to give a general impression after just one game.
2) The review should be totally independent. Several bloggers that I follow receive games for free from the publisher. They are fair and disclose that. However, those reviews still risk being subconsciously too positive (after all, it’s a gift), focusing on first impressions and what’s-in-the-box articles. Often these blogs lack price and system comparisons. I will pay for games myself, with my own hard-earned money. Or borrow the rules and test them thoroughly.
3) The reviewer should play and review the market leader. In SF, for example, the market leader is 40K. That’s a truth I hold to be self-evident; thus, not all SF games are created equal. 40K is in many respects the benchmark of the SF genre. Is any other sci-fi game that I want to review faster to learn than 40K? How does the art compare to GW art? What’s the price of the models? The game mechanics?
Ditto with Flames of War – Flames of War might be good, or bad, but only in comparison with lesser-known games like Spearhead, and not “because I read in many blogs about the car park rules and that the Germans always win”.
4) Don’t review only the games that you like. Reviews are personal opinions. I might not like game X because it’s simple IGOUGO. I might like game Y because it isn’t. Readers of the review should know my taste, the anchors that I use. The games that I review negatively are such anchors. A good review contains links to similar (own) reviews.
5) A review should be well-researched. Not only my own opinion matters, but also the opinion of others. A quality review links and mentions how blogger X and Y rated the game, and why, and why I think the same or different.
6) Mechanics, in particular dice mechanics, should be discussed in the review. I don’t mean that every review should contain a full chapter of boring dice statistics. But if a certain mechanic results in a lot of dice rolling without much effect, that should be mentioned.
For example: the popular Black Powder series has a Command Value test with two dice. Researching the statistics of two dice, I discovered that the outcomes are not equally spread: a total of 7 is six times as likely as a 2 or a 12. So the question arises: is this procedure, played this way, a ‘good’ procedure?
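To make the point concrete, here is a small sketch that tabulates the 2d6 distribution and the pass chance of a roll-equal-or-under test. The “pass if 2d6 ≤ command value” reading and the example value of 8 are my assumptions for illustration, not a quote from the Black Powder rulebook:

```python
from itertools import product
from collections import Counter

# The 36 equally likely ordered pairs of two six-sided dice.
# The sums are NOT equally likely: they form a triangular curve.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))

for total in range(2, 13):
    print(f"{total:2d}: {counts[total]}/36 = {counts[total] / 36:.1%}")

def pass_chance(command_value):
    """Chance of passing an (assumed) 'roll 2d6 equal or under' test."""
    return sum(c for total, c in counts.items() if total <= command_value) / 36

# One point of command value shifts the odds unevenly because of the curve:
print(pass_chance(8))  # 26/36, roughly 72%
```

The interesting consequence for a reviewer: the step from a value of 7 to 8 adds 5/36 to the pass chance, while the step from 10 to 11 adds only 2/36, so ‘+1 to command’ modifiers are not worth the same everywhere on the scale.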
7) The reviewer should bear in mind what the goal of the design/ designer is. The GW games were from the moment of creation a marketing tool to sell more fantasy miniatures to hobbyists. 40K and WHFB/AoS are collectible miniature games: every month more miniatures with more special powers are added that you need to collect to remain competitive in the miniature tournament scene. Same with X-Wing. Same with Warmachine. Same with the Collectible Card Game Magic the Gathering. Every few years the rules need a reboot to counter minmaxing and to clean up the mess with all the special rules and powers created for the earlier miniature waves.
Many reviewers blame companies for this policy, but that’s how capitalism works. So don’t review Age of Sigmar badly solely because it replaced the WHFB universe. Check whether it attains its desired goal: simple fastplay in a fantasy world where there is only war. Don’t blame Warmaster/Black Powder-influenced rulesets for not letting you move your wing. The designers believe that this mechanic reflects friction and miscommunication in battle. Does this design exaggerate historical friction? That’s the only relevant question when reviewing. Same with the target group: if a game is designed for advanced wargamers, the more snobbish ones, then don’t blame the rules for not being fastplay.
8) The reviewer should abstain from simple rating systems: 1-10, three or five stars, etc. Commercial games are often average to good, but maybe not to my taste. I think all rating systems simplify the small differences between average, above average and ‘quite good’ too much. The LWTV vlogs with their ratings clearly suffer from that problem. Besides, I believe that thorough rule reviews should be written, not vod- or podcasted – that’s superficial chat, always. Video is about pictures, not about depth.
9) A thorough review should be balanced and based on many aspects of the game, not on a few defining ones. Often a game is rated as ‘good’ by bloggers/vloggers because it’s a boxed set with beautiful figures, or the rules are reviewed positively because the reviewer regards them as innovative. I have doubts about that approach.
Try to compare, for example, David Ensteness’ Et Sans Resultat! and Sam Mustafa’s Blücher, both corps-level Napoleonic games. I play Blücher and it’s an excellent game: concise, not too expensive. ESR (which I will try at some point) is said to be slower and more expensive, but author Ensteness is playing in a different league. Mustafa joyfully writes a new ruleset every two years, sold as cheap PDFs. Ensteness is Flames-of-Warring Napoleonics: one period, and he’s marketing a full package including 10mm figures, battlepacks, relatively expensive but well-researched illustrated supplements, and scenery.
So what is better? Mustafa sells a relatively complete ruleset with rules for pick-up campaigns and pick-up battles. Ensteness sells beautiful books with detailed historical orders of battle, uniform painting guides included.
Same with Ancients. Back in the nineties, DBA was a very innovative system. What is better: fastplay tournament DBA in formal English with very simple illustrations, or the nicely illustrated and well-written hardcover Hail Caesar book made by and published for beer-and-pretzels gamers?
10) A thorough reviewer compares games with the help of a topic list and standard situations, like, but not limited to: attacking a hill, defending against superior numbers, crossing difficult terrain while charging, attacking from behind, etc. The Heretical Gaming blogger played the same Mons Graupius battle with three different rulesets. That’s what I call quality.
I still have to devise the topic list. Clarity of the rules of course, but also the consistency of the dice mechanics.
I know it sounds ambitious. I’m a lawyer IRL, and some of my concepts are derived from my law background – what’s the goal of the law, what do other lawyers think, is it an effective rule? I might be forced to compare a few more rulebooks than my usual range and play many more games than just two if I follow my own review rules. Well, Brutus says I am ambitious – and Brutus is an honourable man.
I just hope that I will live long and prosper!
Earlier thoughts in my ongoing review project: Part I here, part II here, part III here, part IV here.
10 thoughts on “The Amsterdam Acid Test for Wargames, Part V: 10 Reviews Do’s & Don’ts”
Good post! Looking forward to your reviews. Two points to consider:
1) “Video is about pictures, not about depth.” I think it’s possible to craft thoughtful video reviews. I’ve found the board game space is more advanced in this regard. Have you seen the video reviews from sites like The Dice Tower and Shut Up and Sit Down? I’ve found them helpful in deciding which games to purchase or avoid. We may not have video reviews that reach their level of refinement, but I think it’s possible.
2) I think there’s an aspect that reviews tend to miss: solo fun vs group fun. When you think about the hours invested, most of our wargaming time is done by ourselves. Hours spent reading rules, crafting army lists, writing scenarios, painting figures all to serve playing the game with friends for a fraction of that time. I think most reviews spend time focused on the play aspect and don’t consider that solo time (is generating an army list fun? Are the models easy to assemble? Fun to paint? Is the book a chore to read or enjoyable?)
Vlog reviewers can be pretty good, and the quality can indeed be better than that of a mediocre written review. Correct. However, a thoughtful written rule review will be more informative. Analysis, detailed comparison, footnotes – I might be old school, but that’s how I am.
You have a valuable point about pre-play and solo fun. I will work it into my topic list somehow. Tnx.
If you didn’t know that two dice give a bell-curve instead of a linear distribution, then should you be reviewing games? Seriously, that’s a pretty basic principle of game design.
Back in the day I didn’t know. I just played games and read rules and reviews to decide what (not) to buy. I’ve read several good bloggers who criticise dice mechanics since then, so I think I know a lot more about it now. I’m not a virgin in statistics, I just never applied it to gaming until I started to think about it.
I love this article; we’re thinking about the same issues on our blog, where we do two reviews of each game: the first impression and then the full review. They can produce two different results. For example Frostgrave: the first impression was just Wow. Later it was Meh.