How analytics are misused to support an agenda
You see this most obviously in politics and other contentious arenas, where the desire to prevail significantly exceeds the desire to do the right thing. One of the most painful instances to watch was the last presidential election: both Democrats and Republicans used analytics heavily, but the Republican process was so flawed that it gave them a false expectation of victory and materially contributed to their loss.
Then I was reading a Washington Post story on gun control that looked at a report from Columbia University, which in turn reviewed a whopping 130 studies conducted over 64 years. The report found that many controls (like gun buyback programs) are a waste of money, and that the ones that work (like denying guns to mentally ill people) should be pretty obvious.
A key part of any kind of research has to be a willingness to accept the facts, because if management doesn’t want to accept them, the analytics effort is likely to fail and the people responsible for it are at job risk. Most important, if you work for a company whose executives actively participate in falsifying reports, even if those reports are “just” market analysis, you’ll likely be laid off at some point as the company fails around you.
I’m using gun control as the subject here because it is topical and because the dispute has little to do with the 2nd Amendment or with saving lives. It is about one side attempting to control the other, which means both sides tend to use statistics as a blunt instrument, and the statistics are often compromised as a result.
So let’s jump to this massive study.
On the gun control side, one of the popular practices is to put gun buyback programs in place and then showcase how many guns were taken off the street. Everyone feels good because they accomplished something. But the study concluded that this activity has no significant impact on gun deaths, which means the money, which could have been spent on things that actually made a difference, was wasted.
Years ago I had a meeting with a large software company that told me it was killing a program that let employees take a copy of a popular business application home and use it for free, as long as they had a legal license to use it at work. They said the decision was made because IT demanded it. This seemed insane. It turned out the company had told its clients that if their employees made illegal copies they’d be sued, and the clients said they didn’t want to be sued. The firm made the leap that clients didn’t want the work-at-home program, but what the customers were really saying was simply that they didn’t want to be sued. That distinction could have led to a decision that preserved the program while limiting customer liability.
In the case of gun buybacks, because the payments are a fraction of what a good gun is worth, people who intend to use their guns, or to sell them to others, don’t participate. As a result, there is no impact on the actual problem. The programs simply eliminate guns that weren’t in use and apparently weren’t a danger, yet they create the false impression that progress is being made.
This is why it is important to make sure the results you measure are consistent with the program’s goals. However, because doing so may make the decision-maker look bad, that effort is often never made and bad decisions aren’t corrected.
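The buyback example boils down to the difference between an output metric (activity you can count) and the outcome metric the program was actually meant to move. A minimal sketch of that comparison, using invented figures purely for illustration:

```python
# Hypothetical illustration: a program can post impressive numbers on an
# output metric (guns collected) while barely moving the outcome metric
# it was created to change (gun deaths). All figures below are invented.

def pct_change(before, after):
    """Percent change in an outcome relative to its baseline."""
    return (after - before) / before * 100

# Invented figures for a city running a buyback program
guns_collected = 2_400   # the number the press release touts
deaths_before = 310      # gun deaths the year before the program
deaths_after = 306       # gun deaths the year after the program

outcome_shift = pct_change(deaths_before, deaths_after)

print(f"Guns collected: {guns_collected}")
print(f"Change in gun deaths: {outcome_shift:.1f}%")
# A big collection count next to a roughly 1% move in deaths is the
# pattern the study describes: visible activity, no measurable impact.
```

The point of the sketch is simply that the metric reported (guns collected) and the goal metric (deaths) have to be evaluated side by side before anyone claims success.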
In reading through the study, one thing is painfully clear: most of the gun control laws implemented in the U.S. have had very little impact on anything but suicide rates. The findings clearly conflict, but the one constant is that restrictions do seem to have a positive impact on people who would kill themselves.
One interesting result is that for all the emphasis on assault weapons bans, there appears to be no additional benefit from them. On the other hand, New Zealand’s aggressive program, which both performs background checks and requires training, has had a massive positive impact across the board. This suggests you either implement a New Zealand-type program or you take the money and spend it someplace else where it would have greater impact.
Often market analysis will showcase what the market requires of a product; to be successful you need a certain set of elements. A big consumer product targeting the iPod comes to mind: the company bringing it to market had a list of things it knew it needed to do, and on that list was the ability to migrate iTunes users’ music. The firm launching this potential “iPod killer” decided that feature was too risky, and even though it knew this was a primary requirement, and had already built the feature, it didn’t release it. The product failed spectacularly.
This kind of thing may eventually drive me insane, but market analysis done right will tell you what you absolutely need to accomplish, and yet some managers will treat that list as if it were discretionary, which it clearly isn’t. Much as finishing 90 percent of a race isn’t the same as winning it, not doing what the analysis says is a requirement generally assures failure.
When you get to the final conclusions of the Columbia study, you see the researchers echo my main point: a massive number of these studies are compromised by one side or the other, so you can’t trust their conclusions. Meanwhile, both sides use them as foundational arguments in an effort to beat the other, the goals of protection and saving lives almost completely lost in the process. After what appears to be millions of dollars spent on studies, the only sustained conclusions are that if you can keep guns away from people likely to commit homicide or suicide, and from unsupervised children under 17, you will reduce gun-related deaths. I doubt there is a person on the planet who couldn’t, off the top of their head, come to that same conclusion without spending a dime.
Here is the real irony: for years the NRA aggressively blocked the CDC from doing gun studies. The President did an end-run and was able to get CDC research funded. Most of what resulted actually supported the NRA (I chose a pro-NRA site for the citation) and was generally consistent with what Columbia University found. Sadly, I expect gun control advocates now think the CDC study was a bad idea rather than concluding that much of what they advocated was wrong. Clearly the NRA was just as thick-headed. (By the way, there is a funny segment on The Daily Show about this; the most interesting part is that the congressman who sponsored the legislation blocking the CDC from doing this research now supports the effort.) Go figure...
Analytics should be about finding the truth, not about supporting an already-taken position. That’s the only way you’ll get to a valid decision. Something to noodle on over the weekend.