Even clever people think irrationally
There is a temptation to believe that cognitive biases affect smart folk less than they do people of average intelligence. We like to think that others are much more likely to fall prey to illusions and errors in thinking than we are. This is an example of what social psychologists call the primus inter pares effect, and we are all susceptible to it. For example, the majority of us think that we’re better-than-average drivers (studies show that most people score themselves 7 or above when asked to grade their driving ability on a scale of 1 to 10). Mathematically, of course, we can’t all be better than average at something!

David Dunning, a psychologist at Cornell University, has studied the phenomenon, known as illusory superiority, for decades. He suggests that although most people will score themselves highly for most positive traits such as IQ, memory and immunity to bias, they will simultaneously recognise the frailties in other people’s abilities, realising that the actions and abilities of others are influenced by personality traits and external circumstances: “but when it comes to us, we think it’s all about our intention, our effort, our desire, our agency – we think we float above these kinds of constraints.”

So most people would be shocked to discover that everyone, including those whose abilities and intellect we respect most, is just as susceptible to biases, faulty analysis and irrational conclusions as the rest of us. In a famous 1977 study, 94% of college professors rated themselves above average relative to their peers! Statistically, that makes the 6% who didn’t very dumb indeed!

Project managers inhabit a complex world and have to navigate a treacherous landscape strewn with pitfalls and traps: inaccurate data, competing priorities and complex diagnostics, not to mention contradictory opinions, unrealistic estimates and faulty assessments. In this sense they experience a similar working environment to medical doctors, and like their counterparts in healthcare, the most successful practitioners have developed the ability to think clearly and logically even under pressure, thereby ensuring that their analysis of a problem is as sound and reliable as possible. Bear this in mind when considering the following puzzle, and be aware that when presented with this challenge, over 80% of General Practitioners arrived at the wrong answer ...

A problem of multiple probabilities

In this example doctors were asked to consider a test used to screen for a disease. The test is quite accurate, with a false positive rate of just 5%. In other words, 5% of the people who don’t have the disease will nevertheless test positive for it. (This error rate would be considered more than acceptable in real-life situations. For example, in the case of breast cancer screening the American Cancer Society’s website states that ‘About half the women getting annual mammograms over a 10-year period will have a false-positive finding.’) For this puzzle, it was explained that the disease is relatively rare, striking 1 in 1,000 of the population, and that people would be tested at random, regardless of whether they were suspected of having the disease or not. Finally, the test produces no false negatives, so it will never indicate that a person doesn’t have the disease when they do. The question was posed as follows: assuming that a given patient’s test is positive, what is the probability of this patient actually having the disease?
Most GPs answered 95% – in other words, they thought the chances were that the patient did have the disease. At first glance this answer seems logical; after all, the test is 95% accurate, isn’t it? So if the test shows positive then the sad news is that the person is much more likely to have the disease than not. However, the correct answer is 1/50, or 2%. The chances are that they don’t have the disease. If you thought otherwise, then you may have just informed someone that they’re going to suffer from a terrible illness when they probably aren’t. It’s the sort of thing we ought to be capable of getting right – but most of us, including most practising doctors, get wrong. Why?

At first the answer of 2% seems counter-intuitive. It just doesn’t sound right. That is because we didn’t engage our analytical faculties when calculating the solution; in fact we didn’t do any calculating of probabilities at all, relying instead on mental short-cuts, or heuristics, to arrive at what seemed to be the obviously correct answer. If you’re one of the minority of people who got the answer right you can skip the next couple of paragraphs, but for the rest of us, here’s how we should have approached the problem.

The key piece of data we should have factored into our thinking is the rate at which the disease affects the population. In this case it was explained that the illness strikes 1 in 1,000 people. So, imagine that the screening programme took place in a small city and that 10,000 people were tested. We would expect about 10 people in that population to have the disease, leaving 9,990 who don’t. When we screen those 10,000 people, 5% of the healthy ones – roughly 500 people – will produce false positives, and since there are no false negatives, the 10 genuine cases will test positive too, giving around 510 positive results in all. Yet of all those positives, only 10 people actually have the disease. Ten out of roughly 500 is 1/50, or 2%. In other words, the positive reading made it 20 times more likely that the person tested has the disease – but 20 times a very small number is still a small number. (A short sketch at the end of this post reruns the arithmetic, if you’d like to check it.)

One of the mistakes we make when dealing with puzzles like this is to skimp on carrying out a thorough analysis of the question. Ironically, the smarter we are and the more confident we are in our own abilities, the more likely we are to rush to a conclusion that ‘feels right’. In this case we didn’t pay attention to the two most important statistics: i) the fact that only 1 in 1,000 people will have this disease, and ii) the fact that the test wrongly flags 5% of healthy people while never missing a genuine case. Instead, we focused solely on the fact that the test was ‘95% accurate’; we latched onto this statistic alone and allowed it to influence our thought process, which led us, inevitably, to the wrong answer.

Looking at this example in the context of project management, it is most relevant when thinking about risk analysis: what are the chances that this or that will happen and impact the project for the worse? Most project managers are not experts in the day-to-day operational aspects of the businesses they’re working in; usually we work with subject matter experts – engineers, surveyors, developers – whom we consult to help us populate our risk registers and develop mitigation strategies. The lesson from this example is this: all of the smart people you work with are prone to making errors of this kind, even when analysing processes and events that are within their sphere of expertise.
Never assume that they haven’t succumbed to a cognitive error or an invalid mental shortcut when they’re providing you with their assessment of risk, impact or probability. Try asking them to talk you through their thought process; you may discover something interesting.

The bigger moral of the story is that we are all susceptible to those tempting mental shortcuts. We use them without being aware that we’re doing so, which is why our success rate when dealing with risk analysis is usually poorer than we would admit. Our intelligence, expertise and experience are no defence – unless we are constantly vigilant and on the lookout for cognitive traps.

The author and broadcaster Garrison Keillor created the town of Lake Wobegon as the setting for his long-running radio show A Prairie Home Companion in 1974. In this fictional town it was said that ‘All the women are strong, all the men are good-looking and all the children are above average.’ It’s a pity that our projects aren’t all based in Lake Wobegon, that most unlikely of places.
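If you’d like to check the arithmetic yourself, here is a minimal sketch in plain Python – no libraries needed. The figures are taken straight from the puzzle; the variable names and the print-out are simply mine.

```python
# A back-of-the-envelope check of the screening puzzle, using only the
# figures given in the post above.

population = 10_000            # the imaginary small city from the worked example
prevalence = 1 / 1_000         # the disease strikes 1 in 1,000 people
false_positive_rate = 0.05     # 5% of healthy people wrongly test positive
false_negative_rate = 0.0      # the test never misses a genuine case

sick = population * prevalence                      # 10 people
healthy = population - sick                         # 9,990 people

true_positives = sick * (1 - false_negative_rate)   # all 10 genuine cases
false_positives = healthy * false_positive_rate     # roughly 500 healthy people

# Of everyone who tests positive, what fraction is actually ill?
p_disease_given_positive = true_positives / (true_positives + false_positives)
print(f"P(disease | positive) = {p_disease_given_positive:.1%}")  # about 2%

# The same answer via Bayes' theorem, without the imaginary city:
p_positive = prevalence * 1.0 + (1 - prevalence) * false_positive_rate
print(f"Bayes' theorem gives    {prevalence / p_positive:.1%}")   # about 2%
```

Both routes print roughly 2%. The counting version earns its keep by making the 9,990 healthy people – and therefore the base rate – impossible to overlook.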
8 Comments
Joshua Rosenberg
6/1/2017 01:17:08 pm
It is easy to consider oneself superior to the people one encounters on a day-to-day basis. It is also easy to assume that answers should come quickly and easily to those of us who consider ourselves to be very intelligent. This goes doubly for people who use words like 'oneself' and 'one' when referring to themselves. In reality no one is above error or fault, and we must learn to sacrifice our own egos for the sake of the project and the client, since it is their needs that must be met and not our own.
Reply
James Beresford
6/1/2017 01:35:38 pm
We all make errors. This should not be an issue so long as you take ownership and make a best effort to correct the mistake – what more can you do? Make a mistake once – you're human. Make the same mistake twice – you lack diligence. Make the same mistake three times – are you really suited to the task?
Reply
Ben Roberts
6/1/2017 02:22:49 pm
Certain pressures in working environments can lead to these kinds of mistakes, be they time or resource pressures. The UK health system must be a prime example of this: doctors probably don't have the time to bounce ideas off other team members, or the team members with the required skills are simply not available.
Reply
Gordon Huxtable
6/1/2017 04:24:55 pm
A nice puzzle – I failed, but the lesson is clear! It would be good to see a similar comparison on risk management when using an Agile project methodology versus traditional waterfall. The success rate of risk identification is likely to be higher for short-term, smaller deliverables than for a long-term project. The immediacy of a discrete function delivered over a short space of time usually means you have more robust mitigation strategies as well – resources are on hand and knowledgeable about potential solutions to problems, whereas a risk identified six months before implementation may prove trickier to resolve.
Reply
Mike
9/1/2017 12:33:37 pm
Thanks for the comment, Gordon, look out for a future blog touching on my experiences re: Agile UX vs. Lean UX. I think you'll find it interesting based on your feedback here.
Reply
Darren Saunders
7/1/2017 09:27:33 pm
A really interesting blog and a good lesson. When I look back over my career I certainly recognise the scenarios described. Most of us believe we can accomplish tasks more quickly than often turns out to be the case.
Reply
Zoe Barnes
31/1/2017 10:33:03 am
A really interesting read, MM! I'll try harder to think things through rather than accept an initial understanding at face value. It's interesting to note that our brains lean towards a subconscious pattern, or learning that has previously suited us and kept us safe, so it may be that patients/clients will want to see a worst-case scenario because it fits their reality. The real gem here was the clear and concise explanation.
Reply
Richard Evans
18/2/2017 08:16:10 am
Good read. I've also seen many resources, mainly labour and travel, burned on projects where there is 'Emperor's new clothes' syndrome, i.e. a senior leader who makes judgements like this but cannot be questioned.
Reply