Dissenting Views Can Make Us Smarter Decision-Makers

September 28th, 2016

Noreena Hertz is a visionary economist, strategist, best-selling author and thinker whose economic predictions have consistently been accurate. She advises some of the world’s top CEOs on economic, geopolitical and technological trends and business decisions. Most recently she has been appointed ITV’s new economics editor.

 
We are drawn to those who echo what we already believe. When presented with confirming data we get a dopamine rush similar to the one we get when we eat chocolate or fall in love. On Facebook we defriend those whose political views differ from our own. On Twitter we follow people just like us.

 
Yet a vast body of research now points to the importance of contemplating diverse, dissenting views: not just in making us more rounded individuals, but in making us smarter decision-makers. Dissent, it turns out, has significant value.

When group members are actively encouraged to express divergent opinions openly, they not only share more information, they consider it more systematically, and in a more balanced and less biased way. When people engage with those whose opinions and views differ from their own, they become much more capable of properly interrogating critical assumptions and identifying creative alternatives.

 
Studies comparing the problem-solving abilities of groups in which dissenting views are voiced with those of groups in which they are not find that dissent tends to be a better precondition for reaching the right solution than consensus. Yet how many leaders actively seek out and encourage views alien and at odds with their own?

 
All too few.

 
President Lyndon Johnson notoriously discouraged dissent, and many historians now believe that this played a significant role in the decision to escalate U.S. military operations in Vietnam.

 

Excessive groupthink is now recognized to have underpinned President Kennedy’s disastrous authorization of the CIA-backed landing at Cuba’s Bay of Pigs. Former employees of the now-defunct Lehman Brothers have talked about how voicing dissent there was considered a career-breaker. Yale economics professor Robert Shiller has explained that, when it came to warning about the bubbles he believed were developing in the stock and housing markets just before the financial crisis, he did so only “quietly”, because: “Deviating too far from consensus leaves one feeling potentially ostracized from the group with the risk that one may be terminated.”

 
Is this the feeling that the “clubby” environment in your boardroom is inadvertently engendering? Or are you actively signaling that you want to hear views that are diverse, different from, and even in opposition to your own?

 
We need to have the confidence to allow our own ideas and positions to be challenged.

 
Eric Schmidt, the Executive Chairman of Google, has talked about how he actively seeks out people with dissenting opinions in meetings. Abraham Lincoln’s renowned “team of rivals” comprised people whose intellect he respected and who were confident enough to take issue with him when they disagreed with his point of view. Stuart Roden, co-fund manager of Lansdowne Partners’ flagship fund, one of the world’s largest hedge funds, tells me he sees one of his primary roles as being the person who challenges his staff to consider how they could be wrong, and then to assess how this might affect their decision-making.

 
Who in your organization serves as your Challenger in Chief?

 

Interrogating the choices you are considering? Making you confront the uncontemplated, the unimaginable, and that which contradicts or refutes your position?

 
And also challenging you?

 
For we are not the robotic, emotionless decision-makers of economics textbooks, bound always to make the rationally best choice. Instead we are prone to a whole host of thinking errors and traps.

Did you know that when we are given information that is better than we expected (for example, that our chance of being targeted for burglary is actually only 10% when we thought it was 20%) we revise our beliefs accordingly, whereas if it is worse (for example, that rather than having a 10% chance of developing cancer we actually have a 30% chance) we tend to ignore the new information?