Even clever people think irrationally
There is a temptation to believe that cognitive biases affect smart folk less than they do people of average intelligence. We like to think that others are much more likely to fall prey to illusions and errors in thinking than we are. This is an example of what social psychologists call the primus inter pares effect, and we are all susceptible to it. For example, the majority of us think that we're better-than-average drivers (studies show that most people score themselves 7 or above when asked to grade their driving ability between 1 and 10). Mathematically, of course, we can't all be better than average at something! David Dunning, a psychologist at Cornell University, has studied the phenomenon, known as illusory superiority, for decades. He suggests that although most people will score themselves highly for most positive traits such as IQ, memory and immunity to bias, they will simultaneously recognise the frailties in other people's abilities, realising that the actions and abilities of others are influenced by personality traits and external circumstances: "but when it comes to us, we think it's all about our intention, our effort, our desire, our agency – we think we float above these kinds of constraint."

So most people would be shocked to discover that everyone, including those whose abilities and intellect we respect most, is just as susceptible to biases, faulty analysis and irrational conclusions as the rest of us. In a famous 1977 study, 94% of college professors rated themselves above average relative to their peers! Statistically, that makes the 6% who didn't very dumb indeed!

Project managers inhabit a complex world and have to navigate a treacherous landscape strewn with pitfalls and traps: inaccurate data, competing priorities and complex diagnostics, not to mention contradictory opinions, unrealistic estimates and faulty assessments. In this sense they experience a working environment similar to that of medical doctors, and like their counterparts in healthcare, the most successful practitioners have developed the ability to think clearly and logically even under pressure, ensuring in the process that their analysis of a problem is as sound and reliable as possible. Bear this in mind when considering the following puzzle, and be aware that when presented with this challenge, over 80% of General Practitioners arrived at the wrong answer ...

A problem of multiple probabilities

In this example doctors were asked to consider a test used to screen for a disease. The test is quite accurate, with a false positive rate of just 5%. In other words, 5% of the people who take the test will show as having the disease even though they don't. (This success rate would be considered more than acceptable in real-life situations. For example, in the case of breast cancer screening the American Cancer Society's website states that 'About half the women getting annual mammograms over a 10-year period will have a false-positive finding.') For this puzzle, it was explained that the disease is relatively rare, striking 1 in 1,000 of the population. People will be tested at random, regardless of whether they are suspected of having the disease or not. Finally, the test does not produce false negative results, so it won't indicate that a person doesn't have the disease when they do. The question was posed as follows: assuming that a given patient's test is positive, what is the probability of this patient actually having the disease?
Most GPs answered 95% – in other words, they thought that the chances were that the patient did have the disease. At first glance this answer seems logical; after all, the test is 95% accurate, isn't it? So if the test shows positive then the sad news is that the person is much more likely to have it than not. However, the correct answer is about 1 in 50, or 2%. The chances are that they don't have the disease. If you thought otherwise, then you may have just informed someone that they're going to suffer from a terrible illness when they're probably not. It's the sort of thing we ought to be capable of getting right – but most of us, including most practising doctors, get wrong. Why?

At first the answer of 2% seems counter-intuitive. It just doesn't sound right. This is because we didn't engage our analytical faculties when calculating the solution; in fact, we didn't do any calculating of probabilities at all, relying instead on mental short-cuts, or heuristics, to arrive at what seemed to be the obviously correct answer. If you're one of the minority of people who got the answer right you can skip the next couple of paragraphs, but for the rest of us, here's how we should have approached the problem.

The key piece of data which we should have factored into our thinking is the incidence at which the disease affects the population. In this case it was explained that the illness strikes 1 in 1,000 people. So, imagine that the screening programme took place in a small city and that 10,000 people were tested. We would expect there to be 10 people in that population who had the disease. However, when we screen those 10,000 people we know that we are going to get 5% false positives and no false negatives; in other words, we will end up with roughly 500 false positives, or around 510 positive results in total. Yet we know that of those 510 or so, only 10 people actually have the disease. So 10 out of roughly 510 is about 1 in 50, or 2% – in other words, the positive reading made it 20 times more likely that the person tested has the disease, but the odds are still heavily in their favour.

One of the mistakes we make when dealing with puzzles like this is to skimp on carrying out a thorough analysis of the question. Ironically, the smarter we are and the more confident we are in our own abilities, the more likely we are to rush to a conclusion that 'feels right'. In this case we didn't pay attention to the two most important statistics: i) the fact that 1 in 1,000 people will have this disease, and ii) the fact that the screening is 95% accurate for positive outcomes and 100% accurate for negative results. Instead, we focused solely on the fact that the test was '95% accurate'; we latched onto this statistic alone and allowed it to influence our thought process, which led us, inevitably, to the wrong answer.

Looking at this example in the context of project management, it is most relevant when thinking about risk analysis: what are the chances that this or that will happen and impact the project for the worse? Most project managers are not experts in the day-to-day operational aspects of the businesses that they're working in; usually we work with subject matter experts – engineers, surveyors, developers – whom we consult to help us populate our risk registers and develop mitigation strategies. The lesson from this example is this: all of the smart people you work with are prone to making errors of this kind, even when analysing processes and events that are within their sphere of expertise.
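Incidentally, if you ever want to sanity-check this kind of reasoning – yours or an expert's – it takes only a few lines of code. Here is a minimal sketch in Python that simply counts the expected true and false positives for the figures used in the puzzle; change the numbers and watch how the answer moves:

def prob_disease_given_positive(prevalence, false_positive_rate, population=1_000_000):
    """Estimate P(disease | positive test) by counting expected cases.

    Assumes the test never produces false negatives, as in the puzzle.
    """
    sick = population * prevalence                     # people who have the disease
    healthy = population - sick                        # people who don't
    true_positives = sick                              # no false negatives
    false_positives = healthy * false_positive_rate    # healthy people flagged anyway
    return true_positives / (true_positives + false_positives)

# The figures from the puzzle: 1-in-1,000 prevalence, 5% false positive rate.
print(round(prob_disease_given_positive(0.001, 0.05), 3))   # prints 0.02, i.e. about 2%

The value is not in the code itself but in the discipline of writing the calculation down: once the 1-in-1,000 prior is sitting in front of you, it is much harder to ignore.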
Never assume that the experts you work with haven't succumbed to a cognitive error or an invalid mental shortcut when they're providing you with their assessment of risk, impact or probability. Try asking them to talk you through their thought process; you may discover something interesting.

The bigger moral of the story is that we are all susceptible to those tempting mental shortcuts. We use them without being aware that we're doing so, which is why our success rate when dealing with risk analysis is usually poorer than we would admit. Our intelligence, expertise and experience are no defence – unless we are constantly vigilant and on the lookout for cognitive traps.

The author and broadcaster Garrison Keillor created the town of Lake Wobegon as the setting for his long-running radio show A Prairie Home Companion in 1974. In this fictional town it was said that 'All the women are strong, all the men are good-looking and all the children are above average.' It's a pity that our projects aren't all based in Lake Wobegon, that most unlikely of places.
Microsoft Excel has flooded the world with data. So how do you build an information ark to survive the deluge?

There is a complaint that I hear often in the corporate corridors and meeting rooms that I frequent: "We're data rich but information poor!" The cry is inevitably from some beleaguered executive attempting to make sense of yet another voluminous spreadsheet comprising hundreds of rows replete with undecipherable formulae and references. The concerns are always the same: have I got the latest version? How do I know if these formulae are correct? Where did this data come from? How reliable is it? Who else has been editing this document? Which tabs are the 'live' ones? Where do I make my changes? Who do I send this to next?

Excel is ubiquitous. It's used in every organisation that I've ever worked for: as a calculator, as a modelling tool, as a notepad, a messaging system, a scheduling app, a planner, a business intelligence tool, and yes, as a database too. Consultants like me are fond of saying 'Microsoft Excel is the most popular database on the planet'. Which is an odd thing to admit to, especially as Excel isn't a database at all, and more especially because Microsoft is also the author of two of the most popular actual database products on the planet: SQL Server and Access. One wonders how frustrated the product architects must be to know that 99 times out of 100, users will choose to implement Excel rather than Access as their desktop database solution.

Yet in my experience, wherever there is a business process whose data handling requirements are supplied by an Excel spreadsheet, there is a high probability of waste being generated, or of the process adding less value than it could. Excel – when used as a database – is almost always an indicator of underlying or latent inefficiency. That's not to say that Excel doesn't have a place on the corporate desktop. It does, and I am actually a big fan. When deployed appropriately, Excel adds value and reduces cost; the problem is that it is rarely deployed appropriately. Instead, it's usually the 'go to' solution for every project irrespective of its suitability for the task at hand.

Why is Excel so popular?

The reasons for Excel's emergence as the de facto solution for data handling are manifold. It is by any measure a superb application, well-engineered and more or less bug free. In the right hands, it provides more functionality per $ than any other piece of software that I can think of. Building a case for its deployment is remarkably straightforward:
There appears to be a compelling case in favour of the spreadsheet-as-database. But the benefits and advantages of Excel camouflage significant problems which inevitably present themselves once their parent projects and processes develop and mature. The case against:
It's no coincidence that the pitfalls of deploying Excel as a database mirror the advantages. This is because the problem isn't Excel per se. The problem is how and when it's used. The adage that 'when the only tool you have is a hammer, everything starts to look like a nail' could have been composed for these business cases. While the Excel application itself is reliable and robust, the spreadsheets that people create using it are too often fragile. There is no easy way to trace where the data has come from, and with no concept in Excel of a transaction or an audit trail it's too easy to overtype the contents of a cell without realising it. You may never discover that a colleague was inadvertently typing dates into a value field. Furthermore, reconciling and testing a large spreadsheet is a daunting task that few people are prepared or qualified to undertake. Where you find spreadsheets you're invariably in the presence of errors – it's just that you've no way of discovering where they lie.

What is the alternative?

Over the last two or three years a number of powerful, low-cost, cloud-based tools have emerged which offer businesses the opportunity to build real applications to meet the needs of their departments and workgroups. The solutions that can be created with this new generation of on-line tools are robust, secure and reliable, but they also offer many of the same advantages that spreadsheets do, without the corresponding disadvantages. Like spreadsheets, these tools are simple to use, low cost and quick to deploy. They empower non-technical users to create sophisticated applications without learning to code. Being cloud-based, they can be deployed seamlessly on any device from desktop to tablet to mobile phone. Most importantly, they require little or no input from the IT department because there is no software to install, no servers to configure and no local data to back up. The solutions are genuinely multi-user in that they can support many users editing, creating and viewing the data simultaneously, and they are scalable: suitable for one user or a thousand users. The only application a user needs to have installed on their device is a web browser.

Two of the most successful players in this market sector are Knack* and QuickBase. Of the two, Knack* is perhaps the more interesting success story: the company was formed just over three years ago in New York by a small team of frustrated web developers. Fed up with constantly writing and re-writing the same application under a different guise, they asked themselves a simple question: "What if we could build a web application that could build web applications?" This simple idea would result in the creation of a real killer app which within 18 months would be adopted by thousands of businesses around the world. Knack* is now used by blue chips and start-ups alike, including many household-name corporations such as:
A typical Knack* project

The gestation of a Knack* project will start in much the same way as an Excel database project. Someone needs to collect and manipulate data and then communicate the results. No one wants to create a software development request, as that's likely to sit in the IT Department's pending folder for longer than the project will take to complete. Besides, there probably isn't time to scope out everything that is necessary at this stage; the team may not yet be ready or qualified to specify what they need in detail. What is required is something to 'get started with', something that can then be developed and fine-tuned as the project progresses. This is usually where Excel enters, stage right, and this is where Knack* can provide a superior solution – and by superior, I mean more flexible, more cost effective, less prone to error, easier to deploy, secure and simpler to build and use.

I should declare an interest at this point. I have been using Knack* for over a year now and I am a huge admirer of the product. I have spoken to the founders, Eric and Brandon, via Skype and have found them to be smart, dedicated, customer-focused and passionate about their application. They have been hugely successful and deserve every accolade that they are presented with. Knack* is a game changer, so much so that it has already started to spawn copy-cat and 'me too' competitors. The press is referring to this as the emerging 'age of the citizen developer' and the beginning of a new era in 'frictionless data'. It sounds grandiose, but there is real potential in Knack* and products like it to change the way businesses deal with data that doesn't form part of the corporate data repository.

A typical Knack* project will be concerned with data collection, usually via input forms on a web-based application (see the example below), followed by data presentation and analysis. Like Excel, Knack* provides a suite of standard charts and pivot table views to visualise information, though the key differentiator is that all of the analysis and 'drilling down' is done via an intuitive user interface rather than a complex spreadsheet. A key advantage of using a real database to store information is that changes are handled as transactions and can be audited, so that you know who added, deleted or edited a record and when it was done. This permits far greater data integrity than could be achieved with a shared spreadsheet.
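To make the idea concrete, here is a minimal sketch of what 'transactional, audited' data handling looks like in practice. It uses Python's built-in sqlite3 module rather than Knack or any other particular product, and the table and column names are purely illustrative; the point is that the database, not the person typing, decides what a valid record is and remembers who changed it.

import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")  # a throwaway in-memory database for the example
conn.executescript("""
    CREATE TABLE risks (
        id          INTEGER PRIMARY KEY,
        description TEXT NOT NULL,
        probability REAL NOT NULL CHECK (probability BETWEEN 0 AND 1),
        owner       TEXT NOT NULL
    );
    CREATE TABLE audit_log (
        changed_at  TEXT NOT NULL,
        changed_by  TEXT NOT NULL,
        action      TEXT NOT NULL,
        risk_id     INTEGER NOT NULL
    );
""")

def add_risk(user, description, probability):
    """Insert a risk and its audit entry together, in a single transaction."""
    with conn:  # commits both rows together, or rolls both back on error
        cur = conn.execute(
            "INSERT INTO risks (description, probability, owner) VALUES (?, ?, ?)",
            (description, probability, user),
        )
        conn.execute(
            "INSERT INTO audit_log VALUES (?, ?, 'INSERT', ?)",
            (datetime.now(timezone.utc).isoformat(), user, cur.lastrowid),
        )

add_risk("margaret", "Key supplier misses delivery date", 0.2)

try:
    add_risk("jake", "Typo in the probability column", 12.5)  # rejected by the CHECK constraint
except sqlite3.IntegrityError as err:
    print("Rejected:", err)   # a shared spreadsheet would have accepted this silently

print(conn.execute("SELECT changed_by, action, risk_id FROM audit_log").fetchall())

This is not a suggestion that project teams should hand-roll their own databases; it simply shows the kind of guarantee – validated inputs and a change history – that a shared spreadsheet cannot give you.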
The authors of Knack* have also given a good deal of thought to the typical projects that are likely to be supported by a Knack* application and have created useful 'wizards' that can import a spreadsheet and build a basic Knack* database, complete with input forms and views, completely automatically. This allows new Knack* developers to get up and running quickly. Mindful of the advantage of customers being able to integrate Knack* into their existing websites, the developers have also made it very easy to embed a Knack* application into a standard web page. An embedded Knack* application also 'inherits' the style sheet of the host website so that it takes on the appearance of the client's corporate identity with regard to fonts and colours. There are also powerful integration features to enable Knack* applications to 'talk with' other on-line systems. While these capabilities are generally for the more advanced or technically experienced user, they do allow Knack* to operate seamlessly with other hosted systems such as SalesForce, XERO, Google Docs, MailChimp, DropBox, PayPal, Facebook etc. as well as leading commercial databases such as Oracle and Amazon Dynamo.

The rise of 'frictionless data' and the 'citizen developer'

The emergence of simple, cloud-hosted database applications such as Knack* has led to many businesses re-evaluating their approach to data collection and presentation. Excel is no longer the only option, and certainly shouldn't be considered the default option. In addition to the clear technical, security and productivity advantages, there are other benefits to the adoption of cloud technology which shouldn't be overlooked. Teams given the creative freedom to design their own solutions, or at least to influence the way in which their applications are built, experience a significant productivity increase. Part of this comes from the empowerment that using 'the right tool for the job' brings (the hammer is no longer the only tool in the box), but much of the morale-boosting effect is a result of being liberated from the strength-sapping delays that are part and parcel of having to deal with an over-worked, under-resourced IT department. Many companies allow their 'citizen developers' to build non-core solutions entirely independently of their IT function. As cloud-based applications require only a web browser and an internet connection, there may be no compelling need to initiate an 'IT project' at all.

It seems that the revolution in cloud applications has started and that Excel may be the first casualty. This is a good thing for most businesses. Excel still has its place, but that place is as a niche spreadsheet application: a calculating, analysing, modelling 'what if' tool, not a database. Data collection and management is best handled by a real database, one that can enforce integrity, validate inputs effectively and secure corporate information reliably. In the past, systems like this were beyond the reach of most businesses; only the IT Department had the necessary skills and underlying technology to create and support such a solution. Fortunately, things have moved on and there are leaner, more creative and more productive options available.

You wanna bet?

Most of us don't go about our business as Project Managers assuming that we are irrational, biased or easily fooled. Quite the opposite, in fact. We feel that we make decisions based on reason, cold analysis and a rational assessment of the available evidence.
We believe that one of the skills that makes a Project Manager good at their job is the ability to make thoughtful, coherent and balanced decisions most of the time, in any given situation, irrespective of stress levels, pressure from stakeholders, looming deadlines or stretched resources. But whether we realise it or not, our decisions are often skewed or biased or just downright wrong, because we regularly fall prey to cognitive biases, subconscious influences and things that psychologists like to call decision traps.

Of course, in everyday life most of us manage OK. We usually don't see the effects of our irrational tendencies, and in fact there is evidence that the short-cuts, or heuristic thinking processes, that we use day-to-day evolved in the past to improve our ancestors' survival chances. But these evolutionary adaptations of our brains took place a million years ago, when our ancestors needed very different skills and abilities to survive. So what about now: how does this legacy of mental capability affect us in our modern work lives? What if our inability to think and act rationally impacts the decisions that we take while we're engaged in our business of project management? Is it conceivable that some of the problems we face when coordinating teams, assessing risk, estimating tasks, devising schedules or communicating with stakeholders could be down to errors in the way that our brains actually work?

If you're anything like me you're probably thinking, "Well, I can see how that might be an explanation for the issues some of my colleagues face, but I don't think it applies to me. I'm fairly sure that I approach things in a pretty rational way, I'm a 'reason and logic' kind of person, I don't think I'm easily fooled or misled." And you may well be right, but statistically it's unlikely, for reasons that I explain more fully in this book. (See the chapter No one is immune.) For now, let's look at a couple of simple examples that might just catch you out.

In the preface to his best-selling book Inevitable Illusions, Massimo Piattelli-Palmarini illustrates just how easily we find ourselves believing bizarrely incorrect facts without realising it – indeed, without giving these 'facts' much forethought at all. He uses examples from geography to illustrate what he likes to refer to as mental tunnels. Here are a couple of them to think about.

Losing my sense of direction

These examples work best for people with a basic knowledge of the geography of the United States, Italy or the United Kingdom. Let's start with the American example: imagine you're in Los Angeles and you take off in a helicopter to head to Reno, Nevada. What direction do you think you would need to fly in? Most people, even most Californians, guess that it would be North and East, perhaps a heading of 10 or 20 degrees East. (Don't look at a chart just yet.) Now, for the Brits, imagine again taking off in a helicopter, this time from Bristol. If you head immediately North and fly towards Scotland, what major Scottish city would you end up flying over first? If you're a Londoner, imagine taking off from Heathrow and heading directly North: what is the first Northern English city you'd fly over?

The prevailing answers are Glasgow and Leeds, respectively. But, as you might suspect, they are entirely incorrect, as is the answer that most Americans give in response to the Reno question. The correct answers are as follows: Reno is actually North and West of Los Angeles, 20 degrees West to be precise.
The first city you'd fly over if you were heading North from England's 'West Country' city of Bristol is actually Edinburgh, on Scotland's East Coast. Similarly, flying due North from London has your helicopter passing Hull on the left before crossing the coast onto the North Sea. You'd barely get within 100 km of Leeds. Native-born Italians are used to seeing the map of Italy drawn at an angle running from North West to South East, but even so they are generally fooled into thinking that Trieste is some 20 or 30 degrees East of Naples, it being on Italy's Eastern border with Slovenia and Naples being on the West Coast. But, of course, they'd be wrong. Trieste is, in fact, just West of Naples.

Now you can check these out on the charts; Google Maps will do. While you're at it, amaze yourself with these other facts: the first country you'd fly over if you headed directly South from Detroit is Canada. Rome is North of New York City.

Don't be disappointed if you answered all of these questions incorrectly; most people do. It's not because you're bad at geography or because you don't know your East from your West. The actual explanation is that your mind played a trick on you. Without your knowing, your brain rotated the mental image of these maps to align any more-or-less vertical land masses with a North–South axis. No one fully understands why this is. We don't know, for example, why no one rotates the land mass of Italy horizontally; it's always straightened vertically in our minds, as are the British Isles and the Western coastline of the United States. Piattelli-Palmarini refers to these as cases of tunnel vision, and I'm indebted to him for these examples. They illustrate very elegantly how our brains sometimes work on a problem in our subconscious and present us with false conclusions without our even realising it. But these instances are relatively benign and need not alarm us unduly. For more examples that perhaps ought to alarm us, read on.

Do your eyes deceive you?

If we're fortunate enough not to suffer from a serious visual impairment, we probably trust our eyes more than any of our other senses. They are generally in use from the minute we wake up to the time we close them again to go to sleep. Most of us are highly resistant to the idea that our vision might regularly play tricks on us; even though we've seen stage magicians fool people on television many times, we have an almost unshakable faith in our own ability to discern reality from illusion.

So what colour was that dress?

If you pay attention to such things, you may remember a furore on social media about a photograph of a dress that appeared to be different colours to different people. Some thought it was black and blue, others white and gold. I recall asking myself at the time how it could possibly be that different people had an entirely different impression of the same photograph. I wondered if it was their eyes or their brains that were interpreting the visual data differently. In the example below there are two images of a woman wearing a dress. Do they appear to be the same colour to you? For most observers, the dress on the left is a shade lighter than the one on the right. Similarly, the olive brown colour at the top of the dress is slightly darker on the right than it is on the left. The two dresses are, in fact, exactly the same colour. But it doesn't seem to matter how long you stare at the picture: even if you believe it to be true, you can't force your eyes to see them as being identical.
The blue and brown of the left-hand dress always seem lighter than those of the one on the right. All good project managers are somewhat sceptical in nature (and you have every right to be), so for those of you who don't believe it, here is a modified version of the picture to help you convince yourself of the facts of the matter.

Fifty Shades of Grey?

In another well-known version of a related visual illusion we see two squares, in this case labelled A and B, which are clearly different shades of grey. You would probably find it very difficult to believe that the squares are exactly the same colour and that your eyes – or, more correctly, your brain – are fooling you into thinking that one is lighter than the other. And yet that is precisely what is happening. In this example of the shadow illusion, squares A and B are identical in both colour and shade. Yet knowing that this is true is of no help when you try to see them as the same colour. The presence of the cylinder casting an apparent shadow over a chequerboard helps to fool your brain into creating a visual perception of the squares in which square A is significantly and obviously darker than square B. In fact, the only way you can be convinced that they are the same colour is if you completely mask out the other parts of the image to reveal only the two squares in question. Try printing this page and cutting squares A and B out, then laying them side by side; I promise you that you'll be amazed.

Rubik's Magic Cube

The final example is perhaps the most astonishing of all. Below is an illustration of a 5-row version of Ernő Rubik's famous cube. Two squares are highlighted, and I think you already know what's coming ... Yes, the 'yellow' square and the brown square are actually the same colour, and it's brown. The illusion is exquisite in that your brain has made the square on the shaded face of the cube yellow. But it's not; it is, in fact, brown (R145 G89 B32, to be precise). The bars extending out to the side of the squares in the version of the illustration below help us to see the colours as they 'really are', but take care: if you permit your eyes to wander back towards the centre of the picture you may notice the bar on the left lighten again, and by the time you find yourself looking at the lower square it may even look yellow once more!
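If you don't trust scissors, or your own judgement, you can ask the computer instead: save the illusion image and sample the pixels. The short sketch below uses the Pillow imaging library; the filename and the pixel coordinates are placeholders that you would replace with the real image and the centres of the two highlighted squares.

from PIL import Image  # the Pillow imaging library (pip install Pillow)

img = Image.open("cube_illusion.png").convert("RGB")   # placeholder filename

# Placeholder coordinates: the centre of the 'yellow-looking' square and the
# centre of the brown one. Read the real values off any image editor's cursor.
top_square = img.getpixel((150, 80))
front_square = img.getpixel((150, 230))

print("top square  :", top_square)     # e.g. (145, 89, 32)
print("front square:", front_square)   # the same RGB values, however different they look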
Although we may use the expression 'our eyes have deceived us', we know that it is, in fact, our brains that are the cause of the problem. Our eyes are just the sensors that detect electromagnetic radiation in the visible spectrum and convert it into electrical impulses. These electrical stimuli are then sent along the optic nerve to the visual processing part of the brain. This is the primary visual cortex – a thin sheet of tissue located in the occipital lobe at the back of the brain.

Thanks to technological advances such as fMRI scanning, we now know that different areas of the brain deal with different aspects of visual interpretation such as colour, shape, lines and motion. Just how the brain deciphers this information to present us with our experience of the 'real world' remains poorly understood. It would seem that our brains have a lot to answer for, and, as project managers, visual illusions are probably the least of our concerns. Yet they serve as useful metaphors for the challenges that our brains face when trying to make sense of the complex world we live and work in – a reminder that we need to be constantly vigilant if we are to avoid being fooled.

Why does brainstorming remain so popular despite there being little evidence that it actually works?
Projects are often beset with problems. One of the things that sets good project managers apart is their ability to tackle problems in a methodical, creative and insightful way. As a project manager who likes to embrace challenges, I am often frustrated by the number of colleagues and clients who feel the need to convene a committee meeting every time a new problem raises its head. But I didn't always feel this way. Like many people, I used to think that problem-solving in a group was a better way of arriving at the optimum solution. After all, many hands make light work, as they say. Recently, though, I have been more inclined to believe that it's more a case of too many cooks spoiling the broth.

'No one takes a decision on their own unless it's impossible to form a committee'

In his 1948 book Your Creative Power, advertising executive Alex F. Osborn first set out the ideas for group-thinking sessions. He outlined his ideas in a chapter titled 'How to Organize a Squad to Create Ideas', describing a process that would later become known as brainstorming. Osborn's key idea was to invite a group to contribute any and every idea that they could think of, irrespective of how unconventional it might be. He advised that the group should defer judgement on any idea that they might initially think was too wild, and instead give the idea an opportunity to be developed and refined by all members of the team.

The idea of brainstorming for ideas is an attractive one. It seems feasible, or perhaps even probable, that the more people involved in the search for a solution, the more likely it is that the optimum solution will be found. However, more recent research suggests that this might not be the case, and that individuals could be capable of arriving at more and better solutions to a given problem. In their paper Productivity Loss in Brainstorming Groups: Toward the Solution of a Riddle, published in the Journal of Personality and Social Psychology, Michael Diehl and Wolfgang Stroebe reviewed the results of 22 studies which showed that group brainstorming consistently produced fewer ideas than individuals working on their own.

So why are we inclined to choose brainstorming as the 'go to' methodology for solving project and business problems? On the face of it, there do seem to be several advantages to the approach: different members of the team will bring different skills and experiences and, as no one can be expected to be an expert on everything, this should be a good way to cross-pollinate ideas. There is also a sense of comfort in being part of a group, the feeling of a burden being shared. Besides, it's fun to work together towards a common aim, and the sense of camaraderie when a solution is eventually arrived at can be both exciting and gratifying. It seems counter-intuitive that the research should show this method as being less effective, so what actually happens when we brainstorm and why does it so often lead to fewer and/or inferior solutions?

The brainstorming meeting

I had a client once who loved to organise brainstorming meetings. He used them whenever the business was faced with a complex or important decision. As CEO of the company, his preferred approach was to assemble a cross-departmental, multi-functional team of around 20 people and remove them from the company's site, usually to a country hotel for a day or sometimes longer.
On one such occasion I was invited to speak to a high-powered team of executives about the fundamentals of project management in order that they could better understand the work that their various product development teams were undertaking. Following my presentation, the executive team (which included several main board members) brainstormed alongside engineers, product directors, project managers, heads of marketing and finance executives. Every aspect of the launch of a new product was discussed, from the sourcing of components to the optimum quality assurance policy, from the content of the launch marketing materials to the channel fulfilment strategy. Everyone had their say. People even spoke up about aspects of the product launch that they knew little about: finance specialists opined on the benefits of outsourced manufacturing, procurement specialists ventured forth about what the marketing approach should be, and the engineering representatives shared their views on how the various stages of development ought to be funded.

The brainstorming session was proclaimed a great success. At the end of the day spirits were high and almost everyone agreed that much had been accomplished. Things had been thrashed out, and no stone had been left unturned. The success of the new product introduction seemed a foregone conclusion. The CEO was delighted with the outcome and took the entire team, including me, out to dinner at a splendid restaurant.

It was several weeks before the actual outcomes of the meeting began to manifest themselves. As the agreed milestones started to loom it became apparent that very little was actually being achieved and that progress was painfully slow. Eventually it began to occur to the team that much of the agreed timescale was unachievable. As time progressed it became obvious that the costing budgets were over-optimistic, the sales forecasts were grossly ambitious and that inadequate provision had been made for testing, debugging and rework. The project inevitably stalled, heading inexorably over budget and overdue.

With the benefit of hindsight, it is apparent that much of what had gone on during the brainstorming was theatre. Executives, mindful that the CEO was watching their performance, had concentrated on their own behaviour, on appearing to look engaged and creative. Substance was replaced by showboating. The ideas discussed were broad-brush, blue-sky thinking. They all sounded plausible and exciting but were light on detail. The team, which actually comprised many intelligent and competent executives, was caught up in a wave of positivity. No one challenged the ideas, and even the obviously bad ones were greeted with much enthusiasm. After a while, a curious phenomenon started to manifest itself: one or two members of the team started to 'police' the tone of the meeting. Anyone adopting a critical tone was censured, and the language used soon became universally upbeat, affirmative and complimentary. Without its becoming apparent, the brainstorming session had disintegrated and reformed into a kind of cheerleading exercise. Hardly anyone was thinking critically any longer, and the ability of the team to rationally analyse the various proposals on the table had mostly evaporated. All that was left was a cloistered feeling of good will and a table full of unchallenged 'great ideas'.
Furthermore, dissenters – the few who had spoken up against some of the overly ambitious or unrealistic decisions – were marginalised, not just in the meeting itself but also in subsequent sessions and discussions. They were seen as naysayers, as being overly negative, or worse, not team players. One engineer who had suggested that more time was required to test certain novel features of the product was described as having a "can't do attitude" when what was needed were "can do" people. The dissenters became what behavioural psychologists refer to as an 'outgroup'.

The meeting described above is an almost textbook example of how brainstorming can go wrong. A number of lessons can be learned from the story:
What seemed to emerge from this particular brainstorming meeting was an example of what psychologists call Groupthink. Much of the pioneering work on the phenomenon of Groupthink was carried out in the 1970s by the researcher Irving Janis at Yale. Janis started his career in the US Army studying military morale and other factors that affected decision making. He was particularly interested in the conditions that gave rise to irrational complacency, apathy, hopelessness and rigidity. He went on to write that: 'The more amiability and esprit de corps there is among the members of a policy-making ingroup, the greater the danger that independent critical thinking will be replaced by groupthink, which is likely to result in irrational and dehumanizing actions directed against outgroups.'

Having experienced many subsequent brainstorming meetings in numerous and diverse projects and situations, I am left thinking that this technique, when conducted under the leadership of an effective (but not domineering) chairperson, can be a good way to encourage novel and imaginative ideas. Particularly when used in pursuit of new concepts in a creative context (marketing, advertising, promotional ideas etc.), brainstorming can provide a great way to initiate several threads of thinking for further development. But when there are specific problems to be resolved that require more immediate resolutions, then the brainstorming process may not be the right one, especially if there is significant extra workload or the potential for additional costs and disruption contingent on the outcome. More importantly, if the stakes are high then there is a good chance that the members of the team will perceive there to be a risk to their status in the group and their standing in the business. In these circumstances individuals will seek shelter and protection in the 'herd', this best being achieved – in their minds – by following the leader, however unlikely that particular course is to succeed.

Talking about brainstorming

"I'm not going to convene a meeting to discuss this issue because it's apparent we have the expertise, experience and the authority to resolve it ourselves."

"We need to stimulate creativity, and the best outcome would be a range of different ideas and concepts for us to go away and consider at our leisure. It seems that a brainstorming session would be the best way forward."

"As CEO I've called this brainstorming meeting because I'm not sure of the best solution to the problem we're facing, so I'd like to start by encouraging everyone to challenge the ideas I'm going to put forward. If you think they're unworkable I'd really appreciate it if you said so and helped me to understand why."

"I'm going to resist the brainstorming route because it's very likely that we'll spend too much time trying to justify the work that our departments have already done."

"Let's make sure we invite Rita to the brainstorming meeting; her opinions are usually insightful and she is not afraid to stand up and say something won't work if that's how she sees it."

How framing can change everything

As project managers we spend a lot of time asking questions. It is important that we understand that how we ask a question can have a profound influence on the answer we will be given.
When we ask a colleague for a progress update, or to assess a risk, or even to let us know whether they are happy with the manner in which the project is being managed, we are providing information – in the form of the words that we use – which may influence the answer that we receive.

Framing effect

The framing effect is a type of cognitive bias whereby people can be influenced into giving different answers depending on how a question is presented. Some of the most interesting examples of the framing effect can be found when decisions are being made about risk, and it is for this reason that a basic understanding of this psychological phenomenon is crucial for project managers and business executives. The prevailing theory suggests that people are more wary of risks when a positive frame is presented but are more accepting of risks when a negative frame is presented. To better understand what this means in practice, consider this example from a paper published by Amos Tversky and Daniel Kahneman in 1981.

The study looked at how different question phrasing affected participants' decisions in a hypothetical life and death scenario. The participants were asked to choose between two medical treatments for 600 people who had been affected by a lethal disease. The treatment options were not without risk. Treatment A was predicted to result in 400 deaths, whereas treatment B had a 33% chance that no one would die but a 66% chance that everyone would die. This choice was then presented to the participants in two different ways: one with 'positive framing' (how many people would live), the other with 'negative framing' (how many people would die). The results were as follows: treatment A was chosen by 72% of respondents when it was framed positively, but treatment B became the overwhelming preference in the negative frame, demonstrating that people are much more accepting of risks when the decision is framed negatively. Of course, in this example, there was no material difference between the expected outcomes of the two treatments: treatment A saves 200 of the 600 for certain, while treatment B offers a one-in-three chance of saving all 600 and a two-in-three chance of saving no one, an expected 200 survivors either way!

Fooled by framing – always 'do the math'

Question framing can also be the cause of curious cognitive difficulties. For example, we are easily confused by questions that are framed using inappropriate units of measure. Consider the following: Margo trades in her Corvette, which manages just 12 miles per gallon, for a Mustang that does 14 mpg. Jake trades in his old Volvo, which does 30 mpg, for a new model that does 40 mpg. Both of them drive 10,000 miles a year. Who will save more fuel as a result of their upgrade?
The answer to the question seems obvious; both Margo and Jake are driving 10,000 miles per year. Margo is only going to achieve an additional 2 miles per gallon, whereas Jake is going to improve his consumption by 10 mpg. It seems that we don't even have to 'do the math' – Jake is clearly going to be the winner and save the most. Except he isn't. Margo is. We have to do the math.

In driving 10,000 miles in her new Mustang, Margo uses 714 gallons of fuel. Previously her gas-guzzling Corvette used 833 gallons to cover the same distance. So she's saved 119 gallons in her first year. Jake, on the other hand, will save only 83 gallons. His old Volvo used 333 gallons to cover 10,000 miles and his new, improved Volvo will use 250. So although Jake's fuel consumption has reduced by 25% and Margo's by only 14%, Margo is still ahead in terms of savings. This is because her cars are so inefficient in comparison to his that a mere 14% improvement still amounts to a lot more fuel!

Most people are caught out by the question, so don't feel too bad if you were too. But the more interesting discovery is this: if the question is re-framed so that instead of presenting the fuel consumption measure as miles per gallon it is shown as gallons per mile, we instantly see that Margo is going to save the most fuel. Try it – or let the short sketch below do the arithmetic for you.

When discussing business issues involving KPIs (Key Performance Indicators) we should always bear in mind the importance of framing and measures. The health of a business (or of a project, for that matter) is often communicated to stakeholders in the form of KPIs. You will be familiar with system dashboards that show, for example, sales growth, sales per employee, percent of target reached, debt to equity ratio, margin by product and so on. When reviewing this sort of data and discussing KPIs, bear in mind that the perception of performance can be profoundly influenced by the measures used. You may not want to promote Jake as a result of his upgrade decision – after all, Margo saved the company the most money!
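For anyone who wants to take up the 'try it' invitation without a calculator, here is a minimal sketch in Python. The mileage figures are the ones from the example above; expressing consumption as gallons per 100 miles makes the right answer obvious in a way that miles per gallon never does.

ANNUAL_MILES = 10_000

def gallons_per_100_miles(mpg):
    """Reframe fuel consumption: how many gallons does 100 miles cost?"""
    return 100 / mpg

def annual_saving(old_mpg, new_mpg, miles=ANNUAL_MILES):
    """Gallons saved per year by switching from old_mpg to new_mpg."""
    return miles / old_mpg - miles / new_mpg

for name, old, new in [("Margo (Corvette to Mustang)", 12, 14),
                       ("Jake (old Volvo to new Volvo)", 30, 40)]:
    print(f"{name}: {gallons_per_100_miles(old):.1f} -> "
          f"{gallons_per_100_miles(new):.1f} gal/100mi, "
          f"saves {annual_saving(old, new):.0f} gallons a year")

Seen as gallons per 100 miles, Margo's modest-sounding 2 mpg improvement is clearly the bigger saving, which is exactly the point about choosing the right measure for a KPI.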
The Challenger Shuttle Disaster

Every now and again a project failure occurs which is so devastating that it attracts the attention of the entire world. On January 28, 1986 the NASA space shuttle Challenger took off from a launch pad at Kennedy Space Centre. 73 seconds later it exploded, instantly killing its crew of seven. The night before the launch a number of engineers at NASA contractor Morton Thiokol had tried to stop the launch on safety grounds. Their analysis of the weather conditions and their understanding of the temperature sensitivity of the booster rockets' seals had resulted in the assessment that it was too risky to proceed and that the launch should be delayed. The management team asked the engineers to reconsider and to look at the potential costs, both financial and in public relations terms, of the launch not proceeding. The managers set out a number of factors for the engineering team to consider, including the fact that President Ronald Reagan was set to deliver a State of the Union address that evening and was planning to tout the Challenger launch. The engineers reconsidered. With the question now re-framed to highlight the negative consequences of not launching, they eventually agreed that the launch should proceed. NASA's management team had succeeded in changing the frame of the question so that the cost of not launching carried more weight than the risk of launching.

Despite it being the coldest weather Kennedy Space Centre had ever experienced for a shuttle launch, it went ahead. It seems that the engineers were very wary of the risks when the launch was presented in a positive frame but were persuaded to be more accepting of the risks when they were set out in a negative frame. This tragic example highlights how even the most experienced and qualified professionals can be influenced to assess hazards differently when the risks are framed in a certain way. This is not to understate the immense pressure the engineers faced from their superiors (who had a vested interest in achieving the launch date). The lesson for project managers is that if this can happen at NASA it can happen in any project. The more we understand about how people arrive at their assumptions with regard to risk, the more we can do to ensure that their conclusions are logical and rational.

How we are prone to making bad decisions

As project managers we are used to making decisions, and most of us would acknowledge that the ability to make the right decisions at the right time is one of the most important factors influencing the outcome of a project. Of course, not all of our decisions turn out to be the right ones – we are only human after all – but we'd like to think that we make more correct decisions than wrong ones. More importantly, we like to think that we get the big decisions right most of the time, so we pay special attention to mission-critical aspects of the project such as risk management, task estimates, team selection and milestone planning to ensure that we make those decisions logically and rationally. But what if making rational and logical decisions is more challenging than we think? What if our choices could be influenced in one direction or another without our even realising it?

Let us step out of the world of project management for a moment and into an environment that will be instantly recognisable: the supermarket. You may be familiar with product price labels that look something like this:

Sauvignon Blanc – Was £7.99, Now £4.99

And despite our best efforts to employ our critical faculties, when we see a label with this sort of offer we cannot help but think that we would be buying a premium product at a discounted price. Retail marketing and product placement gurus have long understood how important and influential our desire to make comparisons is to the decision-making process. The technique is employed in many situations which the supermarkets hope will assist us in making the 'right' decision but which actually highlight just how irrational our decision-making can sometimes be. Consider the following set-up: when presented with three comparable products at different price points, most of us tend to pick the mid-price product. Some of us may pick the cheaper or the more expensive options, and these choices will be influenced by a number of very rational considerations, including how wealthy we feel and what our standards and quality expectations might be. However, an interesting phenomenon occurs when a fourth product is included in the selection: research has shown that when another, more expensive, option is introduced, a significant number of shoppers who previously would have bought the mid-price product instead opt for the next most expensive one. Somehow the addition of a premium option – which is usually significantly more expensive – tricks us into thinking that the next most expensive product offers the best value for money.
If we were happy to spend a certain amount on a bottle of wine, why should the addition of a very costly bottle to the shelf cause us to change our minds and spend more money? This does not seem rational, but the example does provide a clue about the processes that our brains employ when making comparative decisions.

Comparisons, relativity and default settings

In his ground-breaking 2008 book Predictably Irrational, Dan Ariely showed how easily a person's choices could be manipulated without their being aware. In a chapter entitled The Truth About Relativity he introduces a concept known as the 'decoy effect' using an example like this one: you are offered two options for a romantic vacation for two, a free, all-expenses-paid trip to Paris or an equivalent all-expenses-paid trip to Rome. Which would you choose? Your answer would most likely depend upon your preferences in terms of art, culture, food and entertainment. Some people may opt for Paris and some for Rome, and in a large enough sample group we might expect the split to be about half and half. However, if a third option is added – in this case also a trip to Rome, but without a free breakfast included – a strange thing happens: more people choose the all-expenses-paid trip to Rome over the all-expenses-paid trip to Paris. No one chooses the inferior trip to Rome without breakfast, of course, but why does the presence of a third, inferior option cause people to gravitate towards Rome rather than Paris?

The explanation may be that the presence of a 'decoy' makes it easier to compare the two options for Rome than to compare Paris with Rome. This decoy effect (also called the asymmetric dominance effect) is a phenomenon that causes people to have a change in preference between two options when presented with a third option that is related to, but also markedly inferior to, one of the original options. The presence of a decoy option results in an irrational decision. In this case Rome now looks like a better option than Paris despite no new information about Rome or Paris becoming available to the consumer.

This kind of irrational decision making is not limited to the ordinary consumer; it is apparent that we are all prone to subconscious influence by decoys. Even expert professionals, when evaluating critical decisions, are susceptible to making illogical choices. In a study carried out in 1995, researchers Donald Redelmeier and Eldar Shafir illustrated how experienced medical doctors could, under certain conditions, be influenced into selecting an inferior treatment course for patients suffering from chronic pain.

The first scenario presented to the doctors was as follows: you are reviewing the case of a patient who has been suffering from hip pain for some time and has already been scheduled for a hip replacement. However, while examining the case notes ahead of the surgery you realise that you have not yet tried treating the patient with ibuprofen. What should you do: (a) leave the patient to undergo the surgery, or (b) delay the operation until a course of ibuprofen has been tried? In most cases (you will be pleased to know) the doctors delayed the surgical intervention and recommended that the patient be prescribed a course of ibuprofen. In the second scenario, another group of physicians was presented with a similar case, but this time they were told that two different medications had yet to be tried: ibuprofen or piroxicam. This time, most physicians opted to allow the patient to continue with the hip replacement!
It seems that because another decision factor was added and the choice made more complex, many more doctors allowed the default option to stand. To better understand this example it is important to realise the role played by the default option. We need to bear in mind that the patient was already scheduled to have a hip replacement. The choice was not 'Which of these many treatment options is best?' but rather 'Do I make the decision to change the current course of action in order to try something different?'. It appears from numerous experiments like this one that if multiple or complex choices are available then the tendency is to leave things as they are. Only when a simple alternative is presented is it likely to be selected – even if the alternative choice would have been the more logical thing to do.

As a project manager and business owner I have seen decision situations like these arise many times. While directing a large ERP (Enterprise Resource Planning) system installation some years ago, my implementation team was presented with a sudden resourcing crisis after half of our implementation consultants were removed from the project at short notice. With the scheduled 'go-live' three months away, and with little prospect of completing all of the application testing and user training in time, we were faced with a simple decision: (a) should we proceed with the project as planned and hope that, by redoubling our efforts and with a fair wind (and some luck), we are able to deliver a workable system, or (b) do we change the scope of the project so that only a subset of the overall functionality is delivered to begin with, and thereafter initiate a new project to implement the remaining features?

At a core team meeting we discussed both courses of action in detail and decided on the latter option: to reduce the scope of the project in order to meet the scheduled date, albeit with a reduced set of system functions. I duly presented this as my recommendation to the CEO of the business at the next project board meeting. There was much discussion between the various stakeholders, after which another option was suggested: why not recruit implementation specialists from outside of the business so that the go-live date could still be met without the scope of the project being reduced? I didn't relish the thought of trying to find a handful of consultants with the appropriate expertise in our chosen ERP application at short notice, but I agreed to take the idea back to the core team for further consideration.

The core team representatives now discussed the three options and very quickly arrived at a decision: the team preferred to redouble their efforts, plough on, and hope for the best. I was astounded! It seemed that the complexity of the decision-making process had now increased to the point where it was just too difficult to weigh up the pros and cons of each option. No one knew how best to go about recruiting additional implementers, and some members of the team decided to reconsider the feasibility of dropping some functional elements from the scope. The easiest option seemed to be to do nothing, and to let the project unfold according to its original schedule – in other words, to accept the default position. What originally seemed like a simple decision had turned into a complex one by the addition of a challenging, third option.
The validity of the choice to proceed with a reduced scope had not altered, but the team's perception of the choices available had changed significantly enough for them to retreat and backtrack. (Of course, for the record, I did not allow the decision to stand, but that's not the point of the story.)

Dan Ariely provides another salient example of the importance of default options in a case study featuring an analysis of organ donor programmes in Europe from 2009. The chart above shows the percentage of the eligible population who agreed to carry an organ donor card when applying for their driving licence. On the left of the chart are the countries with the lowest participation rate: Denmark, the Netherlands, the United Kingdom* and Germany. On the right are the countries which subscribe to the scheme on an almost universal basis. It is worth noting that in order to persuade its population to carry more donor cards, the Dutch government wrote a personal letter to every single eligible citizen and asked each one to participate in the scheme. The result was a significant increase in membership, to 28%.

It is not obvious from an initial analysis of the data why there should be such a marked variation between countries. Cultural and social differences do not seem to explain it; after all, one would expect Germany and Austria to have similar values, and it is hard to imagine attitudes in the Netherlands being dramatically different from, say, Belgium. But participation in the organ donor scheme is dramatically different between these countries. So why do only 4% of Danish drivers carry organ donor cards when 86% of their near neighbours in Sweden carry them? The answer lies not in the ethical or moral attitudes of the respective populations but in a simple difference in the design of an administrative form. The countries with the poorest participation in the scheme gave driving licence applicants a form that asked them to tick a box if they wished to join the organ donor programme, whereas the countries with the best participation used a form that asked them to tick a box if they did not wish to join. In both cases the default option is to do nothing by not ticking the box. By doing nothing in the first case you opt out of the scheme; by doing nothing in the second case you opt in to the scheme.
When faced with a choice, people are strongly drawn to the default option, the default usually being the one that requires the least effort or consideration. That this sort of question framing should influence people's decision-making to such a degree is fascinating to say the least, especially given the nature of the choice being made in this particular example. It also raises an important question: if framing can so easily influence us when it comes to making such important decisions, then how easy must it be to sway us when we don't perceive the stakes to be so high?

* From 1st December 2015 the system in Wales was changed to an 'opt-out' for this very reason.