Neuron Culture

David Dobbs on science, nature, and culture.


Bad financial decisions: Low-balling risk, high-balling certainty

Posted on: March 10, 2009 11:06 AM, by David Dobbs

"How We Decide" author Jonah Lehrer, fresh from a book tour of the UK, offers what he calls a "spluttering answer" (it's really quite lucid) to a question he says he's getting a lot these days: What decision-making errors were involved in our current financial meltdown??

The short version of his answer -- well worth reading in its entirety -- is that we (and big investment outfits in particular) succumbed to an abhorrence of uncertainty.

We hate not knowing, and this often leads us to neglect relevant information that might undermine the certainty of our conclusions. I think some of the most compelling research on this topic has been done by Colin Camerer, who has played a simple game called the Ellsberg paradox with subjects in an fMRI machine. To make a long story short, Camerer showed that players in the uncertainty condition - they were given less information about the premise of the game - exhibited increased activity in the amygdala, a center of fear, anxiety and other aversive emotions. In other words, we filled in the gaps of our knowledge with fright. This leads us to find ways to minimize our uncertainty - we can't stand such negative emotions - and so we start cherry-picking facts and forgetting to question our assumptions.
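To see why the Ellsberg choice counts as a bias rather than sensible caution, here's a minimal sketch of the textbook version of the paradox. The urn numbers below are the standard ones; Camerer's actual experimental game may have differed.

```python
# Textbook Ellsberg urn: 90 balls, 30 red, and 60 that are black or yellow
# in unknown proportion. Bet A pays off on red (known probability 1/3);
# Bet B pays off on black (anywhere from 0 to 2/3). Most people take A,
# yet if you are genuinely agnostic about the black/yellow split, B is
# exactly as good.
import random

random.seed(0)
trials = 200_000
wins_red, wins_black = 0, 0

for _ in range(trials):
    # Agnostic prior: every black count from 0 to 60 is equally plausible.
    n_black = random.randint(0, 60)
    urn = ["red"] * 30 + ["black"] * n_black + ["yellow"] * (60 - n_black)
    draw = random.choice(urn)
    wins_red += draw == "red"
    wins_black += draw == "black"

print(f"P(win betting red):   {wins_red / trials:.3f}")   # ~0.333, known odds
print(f"P(win betting black): {wins_black / trials:.3f}") # ~0.333, ambiguous odds
# Same expected odds; the strong preference for red is driven by the
# discomfort of not knowing, not by the numbers.
```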

In other words, we look for false certainty. And in the case of the financial meltdown, much of that false certainty was found in fancy financial "instruments," like mortgage-based derivatives, that promised to encapsulate and contain risk -- but which have turned out to be so risky they're bringing down the whole system. The dynamics these instruments claim to represent and control are almost impossibly arcane and complex -- but they got boiled down to formulas that, while flummoxing to normal people, had just the right combination of complexity and simplicity -- complexity apparently solved -- to convince mathematical investor types that they solved essential problems and put risk in a bottle.

Felix Salmon's recent Wired article describes one such instrument masterfully. Jonah cites another piece, by Dennis Overbye, which I haven't read. In both cases, overconfidence in these models, which were supposed to virtually eliminate risk, encouraged catastrophic risk-taking. As Jonah puts it,

Because everybody at LTCM believed in the state-of-the-art model, few were thinking about how the model might be catastrophically incorrect. The hedge-fund executives didn't spend enough time worrying about the fat tail events that might disprove their theories. Instead, they pretended that the puzzle of the marketplace had been solved - the uncertainty of risk had been elegantly quantified.

Jonah is dead right about this, and I highly recommend Salmon's article (as Jonah does Overbye's) for a look at how this sort of elegant techy solution can breed a false confidence.
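Salmon's piece centers on the Gaussian copula formula used to price mortgage-backed securities; without reproducing that model, here is a toy sketch of the narrower fat-tail point Jonah raises. The distributions and parameters below are mine, chosen purely for illustration, not anyone's actual risk model.

```python
# Toy illustration of the "fat tail" problem: a model that assumes bell-curve
# (normal) returns puts a vanishingly small probability on extreme moves,
# while a heavier-tailed Student-t distribution scaled to the same variance
# does not.
import math
from scipy import stats

sigma_move = 5   # how extreme a daily move, in standard deviations
df = 3           # degrees of freedom for the fat-tailed alternative

p_normal = 2 * stats.norm.sf(sigma_move)
# Scale the t threshold so both distributions have unit variance.
p_fat = 2 * stats.t.sf(sigma_move * math.sqrt(df / (df - 2)), df)

print(f"Normal model:     one {sigma_move}-sigma day every "
      f"{1 / p_normal / 252:,.0f} trading years")
print(f"Fat-tailed model: one {sigma_move}-sigma day every "
      f"{1 / p_fat / 252:,.1f} trading years")
# The normal model calls such a day a once-in-several-thousand-years event;
# the fat-tailed model expects one roughly every year or two. A model built
# on the first assumption looks "elegantly quantified" right up until the
# tail hits.
```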

But I wanted to note a strong parallel in today's medicine: our tremendous faith in high technology, and particularly in imaging technology. These images are so detailed and granular that we tend to think they see everything, but they don't. They often get things wrong, producing both false positives and false negatives; but the allure is so great, and the process so satisfyingly neat and unmessy, that both the public and most doctors have far too much faith in them. When I did a story on the death of the autopsy a few years ago, I asked every doctor I knew whether they ever asked for autopsies. One of my own doctors told me, "No. When someone dies we generally know why." And many doctors cited imaging technology as the reason they knew the cause of death. For this reason, we hardly ever do autopsies anymore -- the U.S. used to autopsy 50% of deaths; now we do under 5%. But every time someone compares autopsy reports to the pre-autopsy declared causes of death, the autopsies turn up contributing problems that were missed in about 15% of the deaths.

I'm no Luddite; I love my technology, and I'm as thrilled with the judicious use of imaging as anyone. But as we try to rework our health-care system to make it more efficient, we need to realize that we seem almost congenitally overconfident in high-tech answers. This isn't a reason to toss out all the scanners. But it's a great reason to subject high-tech procedures and tools to rigorous comparative effectiveness studies.

PS For a bracing look at certainty's problematic allure, check out Robert Burton's On Being Certain.



Comments

1

Those are some fantastic-looking books; thanks for the heads up on those. I think it was Dawkins I was reading who said something to the effect of "I'd rather not know, and know that I don't know, than cling to some false sense of absolute certainty."

I worked in financial services for several years. There are a whole lot of people in that field who don't have the most basic understanding of how any of the financial instruments they sell work, and they're perfectly happy not knowing, because their boss told them it's a good instrument to sell and they make a nice commission on it. I think this contributes to the problem as much as the upper-level actors who put such faith in models that have been so highly tweaked that they bear little or no relation to the reality they are meant to represent.

Posted by: Rev Matt | March 10, 2009 11:37 AM

2

Our movie and novel heroes take risks. We, as a culture, like to think of ourselves as risk takers, but we have become highly risk averse, often taking irrational precautions against exotic risks (the arcane regulations in reaction to theoretical terrorist attacks are one example).

Insurance as a concept works only when applied to a very limited set of risks. As you try to cover larger and larger risks, the cost of insurance must exceed the value of the risks, and its value becomes negative.

I have a speculation that the extreme oil run-up of last summer may have been a part of this. There were huge buying frenzies on oil futures, but no obvious buying pattern, and no one individual or company seemed to be cornering the market. But what if there were, as seems likely, many bad debts that were at some point tied to derivatives involving oil futures? As the debt market started to crack, these would be called in, driving oil prices up spectacularly, but with no single visible actor or group of actors being responsible.

Posted by: jay | March 10, 2009 12:52 PM

3

I don't think he's hit it quite out of the park yet because, as nearly everyone forgets, there are two sides to every financial action, a buyer and a seller. Free markets function so that the price always sits at or near the 50% certainty mark. For each person who thought the MBS were solid at the given price, there was someone else who thought they were overvalued.

What we really have here with the current crisis is an uneven distribution of those guesses. If half the banks out there had been betting that housing prices would go down, then they would simply be the new wealthy banks, available to produce liquidity and capital for loans. Instead we got a very lumpy, poorly distributed allocation of the winners and losers in this economic event, with the losers in key financial positions in our economy.

Therefore, I think the answer is closer to something social rather than individual psychology. Mr. Lehrer doesn't completely miss this point: "Because everybody at LTCM believed in the state-of-the-art model, few were thinking about how the model might be catastrophically incorrect." The question isn't why person A might be more inclined to ignore conflicting facts, but why whole institutions, and even whole financial sectors, were made up of only people who thought like person A.

There are a number of theories I could posit: self-selection of certain types of people for the job, a smothering group-think mentality, or organized dependence on a biased source of information across a sector. Notice that the last two are slightly different, and would have different cures. I'm sure there are some other bright ideas out there.

Posted by: Kevin H | March 10, 2009 7:17 PM

