As someone who has himself written dozens (please tell me it's not hundreds...) of product reviews over the years, I was fascinated to read Brian's piece. It's an accurate discussion of the limitations of product reviews, though I'm not sure the problems can ever truly be fixed.
First, a recap. Brian bought a well-reviewed Samsung oven, and it broke. At this point, he fell into a bottomless pit of bad customer service--and realized that despite all his diligent research, including reading numerous product reviews, he never really learned anything about how difficult Samsung would be to deal with if his oven ended up being defective.
Defects happen. Anytime a company mass-produces a product, some percentage will be lemons. It doesn't mean the overall product is necessarily bad--though the higher that defect percentage, the more you've got to wonder--but that's cold comfort to the owner of a lemon. Getting a lemon is a bit like winning the lottery (the Shirley Jackson kind). But if the company you're dealing with has a strong warranty policy and offers good service and support, you should be fine.
What's a reviewer to do?
As Brian realized, most of the product reviews out there, including the ones here at Macworld, are focused on the product itself, and its performance fresh out of the box. There are practical reasons for that. People want to read about products when they're new--a review written after a year or two of living with a product would be vastly superior to one written after a day or two, but nobody would read it (and the product would probably have been discontinued by then).
Then there's the issue of trying to measure service and support. This one's brutally hard to do, and that's why most organizations don't even try. In my early days at MacUser and Macworld we debated this one a lot, and made several false starts at trying to evaluate support. At one point, Macworld actually considered the vendor's available hours of telephone tech support when calculating a product's rating!
But what if you really want to test customer service? First, you have to invent a fake complaint--and those can be difficult to gin up. If the product you're using was provided by the vendor, they may figure out that the request is coming from a reviewer and give you a different experience than a regular customer would get.
And even if you do manage to make up a fake problem and get a single, normal customer-service interaction, is that really enough to make a judgment? What if you got the company's best or worst customer service rep, or just an average rep on a bad day? A single experience really doesn't tell you much, and pretty much no editorial budget is going to survive trying to expand customer-service testing to the point where you've got a reasonable sample size.
This is why we punt.
Now, this isn't to say that there aren't good attempts out there to quantify a company's customer-service abilities. Research companies like J.D. Power and Associates do surveys of consumers in an attempt to find out which companies tend to be good or bad at customer service. Consumer Reports polls its own subscribers about long-term support issues with products as well as the quality of a company's support efforts.
These can be helpful, but there's still so much variation. Does this year's model break more than last year's? Do people in New York get a different set of customer-service reps than people in California? Does Samsung do a terrible job of supporting ovens, but a good job of supporting phones, because the two are managed by entirely separate divisions?
I bought a Samsung washing machine a few years ago. The reviews were good, I got it for a good price, and I've been very satisfied with its operation. But about a year in, it sprang a leak. And that's when we discovered that, unlike most washer brands we could've bought, there was no local appliance-repair person qualified to work on Samsung washers. We had to get someone from an hour away to drive over and look at the problem. (He fixed it, and the washer's worked great since then.) I'd never considered the availability of qualified repair personnel a major issue in reviewing a product, but when my washer failed, it turned out to be a big deal.
One person's opinion
In the Times, Brian writes, "Product reviews are broken. They are great at telling you about the speed of a computer or the brightness of a screen. But there's a big gaping hole in evaluations of most products." He's right about the gaping hole, but he's not right about reviews being broken. I'd argue that the state of product reviews today is, in fact, better than it's ever been.
When I started reviewing tech products in the '90s, it was the era of the big publisher. You could literally name every publication that would review Mac or PC products, and the list wouldn't be very long. The attitude of these publications--including my own--was that they were offering the definitive pronouncement about the quality of a product. The stone tablets came down from the mountaintop, and inscribed on them were the appropriate number of mice or stars or whatever that told the true story of the quality of that product.
The people I worked with back then--writers, editors, and analysts in testing labs the likes of which simply don't exist anymore--tried very hard to live up to the ideal of finding the absolute truth about a product's quality.
But the truth is that product reviews have always been personal, biased, and idiosyncratic. Scoring systems are made by people with their own opinions about what aspects of a product are more or less important. Reviewers bring their own usage history to the party. In those days, we cloaked all of that behind a veil of utter impartiality, but of course all our biases were baked in.
Product reviews are, ultimately, just one person's opinion, a lesson I learned from Rick LePage, my predecessor as editor-in-chief of Macworld. A review that shows off the voice and experience of the writer is more truthful and more valuable than one of those old-school "impartial" reviews. When I write a review today, I bring two decades of history and a whole lot of consideration with me, but in the end, a review is still my opinion. It's based on my experiences and biases, and readers should know that. (In fact, I think readers prefer reviews that are open and have a distinct voice.)
People who grew up in an era where reviews seemed like absolute declarations might find it terrifying that we now live in an age where publication reviews are more personal and even professional product reviewers admit that they check Amazon user reviews before they buy anything. But those Amazon reviews, the good ones and the bad ones, can be incredibly instructive in painting a picture of a product. And then there are new sites like The Wirecutter--disclosure: I've written a couple of reviews for them--that are adding thoughtful, practical voices to the product-review conversation, and trying to boil down the consensus of editorial reviews and user reviews in order to find the right buying advice.
So are product reviews broken? No, but they're also not perfect. Admitting that, and that every product review is a voice in a larger conversation, is a big step in the right direction.