Reminiscent of the scene in Alfred Hitchcock's “The Birds,” when the sky turns black with attacking avians, I often feel as if I'm being attacked by throngs of bad statistics when I read today's insurance and business press. “The Birds” was just a movie, but bad statistics can draw real blood: they lead to bad strategies and bad decisions. For proof, look no further than our economic malaise. While some people rely on statistics, others avoid them at all costs. Some believe Benjamin Disraeli's famous saying, “There are three kinds of lies: lies, damned lies and statistics.” I would qualify that: bad statistics are damned lies, while good statistics are key to intelligent management and a huge competitive advantage. That makes it essential to discern between the two. Sometimes advanced education is required to separate the good from the bad, but usually common sense will suffice. Here are some examples:

Check the data source
I recently examined a PowerPoint from an agency consultant. I was jealous of his beautiful, artistically crafted graphs. After looking more closely, though, I realized they were just art. There was no real data behind them. The consultant had started with how he wanted the charts to look, then created the data that would generate his desired charts. He had not gathered any real data. The lesson: Always check the data source. The best data comes from a random sample; if the sample isn't random, be suspicious. Be particularly cautious if the organization that created the study is also selling something, because its study is more likely to be biased.
Beware of averages
Studies presenting only averages are of little use because the information regarding what creates better or worse performance is absent, as is variance. Unlike medians, averages can be greatly skewed by just a few extreme scores. And even if you see averages showing the numbers for the top performers or worst performers, that data often is meaningless because the numbers are still just averages. Many factors affect averages; without more information, how do you know what to do about it?
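A minimal sketch, using hypothetical producer book sizes, shows how a single outlier can push the average far away from what a typical producer actually writes:

```python
from statistics import mean, median

# Hypothetical producer books: one large outlier drags the average
# well above what the typical producer writes.
books = [180_000, 200_000, 220_000, 240_000, 1_660_000]

print(mean(books))    # 500000.0 -- "the average book is $500K"
print(median(books))  # 220000   -- yet the typical producer writes less than half that
```

Judging the four ordinary producers against the $500,000 "average" would be unfair; the median tells a very different story.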
For example, an agency was comparing its producers with a study showing the average producer's book was $300,000. Was its $250,000 producer a poor performer? Possibly, except she built her $250,000 book in half the time. How about another producer with $350,000? Is he great? Perhaps, except his book was given to him. Possibly the biggest problem with averages is the general assumption that every statistic can be plotted along a normal curve. Some sophisticated users exacerbate this mistake by assuming normal levels of confidence and variation at either tail. The truth is that not all distributions are normal, and we know that in reality, especially in financial markets, the extreme tails do not always behave the way a normal curve predicts. The lesson: When a study shows only averages, place very little weight on the results if you are trying to determine cause and effect.

Context is critical
Changing the context enables people to purposely mislead others. For example, if I wanted to pump up an agency owner, I could expound on how his agency's 88 percent retention rate was great, without mentioning that most of his competitors are doing even better. At 88 percent retention, the reality is the agency is likely doing something wrong. Another great example is, “Our producers are awesome! They each wrote more than $200,000 in new commissions last year!” Are the producers new or established? Is this a small or large agency? Does this include program business? And most importantly, what difference does it make how much new business the agency writes if all of it goes out the back door? I've met awesome new business producers who could write the equivalent of 30 to 40 percent of their books in new business each and every year. Of course, their retention rates were around 65 percent. New business only counts if it is retained. These are important issues that greatly affect the terrific-sounding “$200,000 in new business” statistic. The lesson: Check the context of the statistic. Do factors exist that might mitigate the statistic's usefulness?

Mixing & matching
Be cautious of studies that compare apples to oranges. A common and very misleading mismatch is to use EBITDA (earnings before interest, taxes, depreciation and amortization) to compare companies growing organically to those growing by acquisition. The problem is while EBITDA excludes almost the entire cost of acquisition growth, it does not exclude the cost of organic growth. I do not have space to go into all the details here, but suffice it to say that when comparing the two types of growth using EBITDA as a measure, the acquiring firm's EBITDA is virtually guaranteed to look hugely better. The key is to look at cash flow. When firms grow organically, their cash flows and profitabilities usually are very close to the same. When a firm grows by acquisition, cash flow often is much less than profits. But because too many people do not understand the implications of EBITDA versus cash flow, a lot of bad acquisitions, growth decisions and even loans have been made.
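The mismatch can be made concrete with a simplified numeric sketch. All figures below are hypothetical, and real acquisition accounting is far more involved, but the asymmetry is the point: EBITDA leaves the cost of organic growth in and takes the cost of acquired growth out.

```python
# Hypothetical figures. For the organic grower, the cost of growth
# (producer compensation) is a cash expense that EBITDA does NOT exclude.
# For the acquirer, the cost of growth shows up as amortization (excluded
# from EBITDA by definition) and debt service (ignored by EBITDA entirely).

def ebitda(revenue, cash_expenses):
    # Interest, depreciation and amortization never enter here:
    # EBITDA excludes them by definition.
    return revenue - cash_expenses

def cash_flow(revenue, cash_expenses, debt_service):
    # The acquisition debt must actually be paid,
    # even though EBITDA ignores it.
    return revenue - cash_expenses - debt_service

# Organic grower: pays producers to write the new business.
organic_ebitda = ebitda(5_000_000, 4_250_000)             # 750,000
organic_cash = cash_flow(5_000_000, 4_250_000, 0)         # 750,000

# Acquirer: lower cash expenses (no producer payout on purchased renewals),
# but the purchase was financed with debt.
acquirer_ebitda = ebitda(5_000_000, 4_000_000)            # 1,000,000
acquirer_cash = cash_flow(5_000_000, 4_000_000, 900_000)  # 100,000

print(acquirer_ebitda > organic_ebitda)  # True: the acquirer "looks" better
print(acquirer_cash < organic_cash)      # True: but it generates far less cash
```

On an EBITDA basis the acquirer appears more profitable; on a cash basis it is much weaker, which is exactly the comparison a lender or buyer should be making.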
The lesson: The statistic or measure must match the purpose. Do not compare apples to oranges!
Assuming cause and effect
A statistical correlation does not guarantee a cause-and-effect relationship. For example, a recent study found that the shorter life spans of Russians correlated with industry privatization, as if privatization were killing Russians at an earlier age. Further research revealed that while privatization was highly correlated with shorter life spans, the actual cause was vodka: with privatization, Russian workers can drink as much as they want, and many are drinking themselves to death.
An example closer to home is revenue per person and profits. Many in the industry assume greater revenue per person leads to greater profits. The truth is, it doesn't. In fact, there is zero correlation between revenue per person and profits. Take a look at best practices and you will find that the agency sizes with the highest revenue per person have the worst profitability.
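The privatization-and-vodka pattern is easy to reproduce with simulated data: a hidden third variable can make two otherwise unrelated quantities correlate strongly. The sketch below uses made-up numbers purely for illustration; neither x nor y influences the other, yet they track each other because both are driven by the confounder z.

```python
import random

random.seed(1)

def pearson(xs, ys):
    # Pearson correlation coefficient, computed from scratch
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hidden confounder drives both observed variables.
z = [random.gauss(0, 1) for _ in range(1000)]
x = [zi + random.gauss(0, 0.3) for zi in z]  # driven by z, not by y
y = [zi + random.gauss(0, 0.3) for zi in z]  # driven by z, not by x

print(round(pearson(x, y), 2))  # close to 1, yet x never touches y
```

Finding the real driver, whether vodka or a confounding business factor, requires looking past the correlation itself.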
The lesson: Don't assume cause-and-effect relationships. Are there other factors that may be causing the result?

Incorrect assumptions
The best example of this is the assumption that insurers pay more for larger books. It's true that bigger books of business generally result in larger contingency checks. But do companies really pay more for larger books? In dollar terms, yes: larger books generally mean bigger bonus checks. On a percentage basis, however, the effect of a larger book is very small (except for books less than approximately $1.2 million). Based on my analysis of dozens of contingency contracts, once a book gets past about $1.2 million in premium, the bonus percentage does not increase much in most contracts for bigger volumes. For example, at $3 million the bonus percentage might be 2 percent, and at $8 million it might be 2.1 percent. The dollar bonus is much bigger, $168,000 versus $60,000, but all of that difference except the 0.1 percentage point comes from the bigger book itself; it has nothing to do with the company paying more money. The percentage is the key here, not the dollar amount. It often is not to an agency's advantage to build a bigger book with a specific company just to earn more contingencies, because it will not earn a better rate unless the company is one of the few that pay a materially higher percentage for bigger books.

The lesson: Always check the underlying assumptions. Are you, or is the study, making any assumption that would cloud the true results?

Statistics are easily manipulated to mislead. But when statistics are done well and used correctly, they can provide huge competitive advantages and opportunities. I am not recommending a formal education in statistics. I am recommending you think about the statistics you see, apply common sense, and work with people who will present an accurate and unbiased perspective.
© 2024 ALM Global, LLC, All Rights Reserved. Request academic re-use from www.copyright.com. All other uses, submit a request to [email protected]. For more information visit Asset & Logo Licensing.