When Simple Rules of Thumb Beat Statistics

In highly complex and uncertain situations, simple heuristics can outperform complex models. Heuristics are rules of thumb applied either consciously (simple rules) or unconsciously (intuition). They are based on experience, and their effectiveness depends on how well they are adapted to the environment, that is, to the business problem or choice situation at hand. An example of a heuristic is identifying an inactive customer by the time since the last purchase instead of relying on a complex optimization model. While the goal of prediction models is to optimize, the goal of heuristics is to satisfice, that is, to reach a good-enough solution rather than the best one. In addition, the flexibility and subjectivity of heuristics foster creativity and innovation.
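
To make this concrete, here is a minimal sketch of the inactive-customer rule mentioned above; the 90-day cutoff is my own assumption for illustration, not a recommendation:

```python
# A minimal sketch of a one-reason heuristic: flag a customer as inactive
# based only on the time since the last purchase (90-day cutoff is assumed).
from datetime import date

def is_inactive(last_purchase: date, today: date, cutoff_days: int = 90) -> bool:
    """Ignore everything except recency of the last purchase."""
    return (today - last_purchase).days > cutoff_days

print(is_inactive(date(2019, 1, 15), date(2019, 6, 1)))  # -> True (137 days ago)
```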

[Figure: Best data analysis techniques based on the type of data and the level of uncertainty]

The advantage of heuristics is that they ignore part of the complexity, and this means not only less effort but also a lower estimation error in uncertain environments. An example is the experiment performed with German and U.S. students[1] who were asked which city is larger, Detroit or Milwaukee. Here, 90% of German students versus 60% of U.S. students answered correctly by using the “recognition heuristic”: they chose Detroit simply because they had heard of it.
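
As a decision rule, the recognition heuristic fits in a few lines; a rough sketch of my own, where the set of recognized cities is an assumed input:

```python
# Recognition heuristic: if exactly one of two options is recognized, infer
# that it has the higher value (here, the larger population); otherwise guess.
import random

def recognition_heuristic(option_a, option_b, recognized):
    a_known, b_known = option_a in recognized, option_b in recognized
    if a_known and not b_known:
        return option_a
    if b_known and not a_known:
        return option_b
    return random.choice([option_a, option_b])  # no discriminating cue

# A German student who has heard of Detroit but not Milwaukee:
print(recognition_heuristic("Detroit", "Milwaukee", recognized={"Detroit"}))
```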

Usually, when we compare prediction models, we talk about the effort-accuracy tradeoff. This is only partly true when you use heuristics under uncertainty. Under risk (where the probability of outcomes can be estimated), heuristics can be an alternative to more complex models because they require less time and fewer resources, at the expense of some accuracy. Under genuine uncertainty, however, this tradeoff no longer holds and is replaced by the less-is-more principle: limiting information and search can lead to higher accuracy.[2]

To explain this principle, I need to introduce the components of prediction error: bias, variance, and noise. Bias is the deviation from reality (in mathematical terms, the difference between the predicting function and the function that represents reality); variance is caused by the peculiarities of the specific sample and increases as the sample size decreases; noise is the irreducible error in the observations themselves. Simpler models tend to have higher bias, since they are less able to capture small variations. But if you try to reduce bias by using a more complex model, variance increases, because the model starts to overfit the data, and overfitted models predict future events less accurately. Besides, more information often also means more noise. Heuristics accept a certain amount of bias, and precisely this simplification keeps variance under control: it is the very act of ignoring part of the information that produces a better result. Since heuristics offer no advantage over other models in terms of bias, their power lies in reducing variance.
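
To make the less-is-more effect concrete, here is a small simulation of my own (not taken from the paper cited above): a straight line (more bias, less variance) is compared with a flexible polynomial (less bias, more variance) on small, noisy samples.

```python
# Simulation: with few, noisy observations a simple model can beat a flexible
# one out of sample, because its lower variance outweighs its higher bias.
import numpy as np

rng = np.random.default_rng(0)

def true_signal(x):
    # The "reality" we are trying to predict (mildly nonlinear).
    return 0.3 * x**2 + x

def experiment(n_train=10, complex_degree=6, noise_sd=5.0, n_trials=2000):
    x_test = np.linspace(0, 10, 50)
    y_test = true_signal(x_test)                      # noise-free test targets
    simple_mse, complex_mse = [], []
    for _ in range(n_trials):
        x = rng.uniform(0, 10, n_train)
        y = true_signal(x) + rng.normal(0, noise_sd, n_train)
        p_simple = np.polyfit(x, y, 1)                # straight line: more bias
        p_complex = np.polyfit(x, y, complex_degree)  # flexible: more variance
        simple_mse.append(np.mean((np.polyval(p_simple, x_test) - y_test) ** 2))
        complex_mse.append(np.mean((np.polyval(p_complex, x_test) - y_test) ** 2))
    return np.mean(simple_mse), np.mean(complex_mse)

simple, flexible = experiment()
print(f"out-of-sample MSE, simple model:   {simple:.1f}")
print(f"out-of-sample MSE, flexible model: {flexible:.1f}")
```

With small samples and a lot of noise, the flexible model typically shows a much larger out-of-sample error, and the gap is due almost entirely to variance.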

At this point, after praising the advantages of heuristics, I have to stop and be clear on one thing. Heuristics are not better than fact-based decisions or statistical models when data is available and the level of uncertainty is manageable. You can’t just “guess” how many customers are happy with your product when a small sample and an inexpensive survey can give you quite an accurate answer. Moreover, heuristics don’t perform better when strong biases are in place. These can be personal biases (based on people’s experience) or information biases, for example those produced by the news: the frequency of a certain type of news makes us think a phenomenon is larger than reality. For example, if during a certain period several news stories are published about immigrants committing crimes, people will tend to overstate the number of immigrants in the country or the share of crimes they commit compared to nationals. In conclusion, use heuristics when no data model suits the situation, and even when heuristics are the best option or complement data models, stay aware of possible biases and overconfidence.

Two examples of heuristics are intuition and the 1/N rule.

Intuition

Intuition is based on the recognition of past events and the application of what we learned from them to a new situation. When intuition is based on real experience and skill, it is very useful, for example for firefighters or chess players. However, when intuition is the result of substitution (answering a simpler question in place of the complex one) and overconfidence, the result is inaccurate. So when can we trust intuition? Daniel Kahneman and Gary Klein wrote a paper on this.[3] According to them, we can trust intuition when the environment is sufficiently regular to be predictable, and when these regularities can be learned over a prolonged period of time. The second point implies that, after a decision is made, feedback has to be available with a short delay. Sport is an example where intuition works: you get immediate feedback and learn what works and what doesn’t. In economics, by contrast, intuition tends not to work because the environment is quite unpredictable and feedback is ambiguous and delayed. When dealing with business issues, pure intuition usually won’t work, so you have to apply some rules and numbers if you want to use heuristics. Daniel Kahneman proposed a method to avoid extreme predictions based on weak evidence. Imagine that you have to predict the outcome of a promotional campaign (a small sketch of the calculation follows the list):

  1. Identify the criteria for defining a campaign similar to this one and select similar campaigns.
  2. Calculate the average outcome of these campaigns. This represents the prior probability (also called statistical base rate or outside view), which is the starting point of your estimation. Say an average ROI of 10%. If you had no more information, 10% would be your best estimate.
  3. Estimate the outcome based on the evidence you have, namely the specific information about the campaign. You are investing a lot more than other campaigns and you estimate a 50% ROI.
  4. Estimate the correlation between the evidence you have and the outcome, that is, the impact investment has on success, let’s say 25%.
  5. Move from the prior probability toward your estimation proportionally to the size of the correlation. In our example, you move 25% * (50% – 10%) = 10%; your final estimate is 20% ROI.
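
A minimal sketch of this adjustment, using the numbers from the example above (my own wording, not Kahneman’s notation):

```python
# Base-rate adjustment: start from the outside view and move toward the
# intuitive estimate in proportion to the estimated correlation.
def adjusted_estimate(base_rate, intuitive_estimate, correlation):
    """Shrink an intuitive prediction toward the statistical base rate."""
    return base_rate + correlation * (intuitive_estimate - base_rate)

# Campaign example: 10% base-rate ROI, 50% intuitive estimate, correlation 0.25.
print(f"{adjusted_estimate(0.10, 0.50, 0.25):.0%}")  # -> 20%
```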

Another interesting method Kahneman suggested relates to interviews. Instead of asking interviewers for an overall judgment about a candidate, ask them to evaluate different attributes separately, preferably on a predefined scale (e.g., from 1 to 5). Then use simple statistics, calibrated on the results of previous interviews, such as the number of attributes rated above 4 or the average score. Intuition is valuable for assessing individual attributes, but by avoiding a single overall judgment we avoid biases such as the halo effect, confirmation bias, or representativeness.
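
A small sketch of this scoring approach; the attributes and ratings below are invented for illustration:

```python
# Structured interview scoring: rate attributes separately on a 1-5 scale,
# then combine the ratings with simple statistics instead of a gut judgment.
scores = {"communication": 4, "technical_skill": 5, "reliability": 3,
          "teamwork": 4, "motivation": 5}

average_score = sum(scores.values()) / len(scores)
strong_attributes = sum(1 for s in scores.values() if s > 4)  # rated above 4

print(f"average score: {average_score:.1f}")             # -> 4.2
print(f"attributes rated above 4: {strong_attributes}")  # -> 2
```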

1/N rule

This is a very simple heuristic for allocating resources under high uncertainty, with little information, small samples, and many alternatives. According to the 1/N rule, each alternative receives the same share of the available resources. It has proven fairly competitive with more complex models in allocating financial assets.
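
A minimal sketch of the 1/N rule; the budget and asset names are made up for illustration:

```python
# 1/N rule: split the budget equally across all available alternatives.
def one_over_n(budget, alternatives):
    """Allocate the same share of the budget to every alternative."""
    share = budget / len(alternatives)
    return {name: share for name in alternatives}

print(one_over_n(10_000, ["stocks", "bonds", "real_estate", "cash"]))
# -> {'stocks': 2500.0, 'bonds': 2500.0, 'real_estate': 2500.0, 'cash': 2500.0}
```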

In my forthcoming book “The Art of Data Analysis” you will find more heuristic techniques that you can use in data analysis.


[1] Daniel G. Goldstein and Gerd Gigerenzer, “Models of Ecological Rationality: The Recognition Heuristic,” Psychological Review 109, no. 1 (2002): 75–90.

[2] Henry Brighton and Gerd Gigerenzer, “Homo Heuristicus and the Bias–Variance Dilemma,” Action, Perception and the Brain (2012): 68–91.

[3] Daniel Kahneman and Gary Klein, “Conditions for Intuitive Expertise: A Failure to Disagree,” American Psychologist 64, no. 6 (2009): 515–26.
