Decision Bias

When I took an introductory microeconomics class at Princeton my freshman year, I was not only bored to tears (sitting in the front row with freshly brewed coffee was not enough to keep me awake), but I also found myself completely disenchanted with the discipline of economics. 

To assume that people were perfect, optimizing machines struck me as absurd. I lived in a suite with seven other freshman girls, and we were anything but optimizers! One roommate routinely missed deadlines in her classes and was later racked with regret, another was lovesick over a new boy each week, and of course we were nearly all hung-over wrecks every Sunday morning (and many other mornings, too) after too much drinking (and in some cases, abuse of other substances). In short, I saw lots of evidence that passion and impulsivity drove human behavior, but little evidence that maximizers roamed the earth.

So I dismissed economics and its assumption that people are perfectly rational decision-making machines as preposterous. In fact, I disliked economics so passionately that I eventually subjected myself to summer school classes in physics and chemistry in order to accumulate the necessary credits to switch out of the bachelor of arts program at Princeton (where I had planned to major in economics) and into the engineering program (where I studied operations research, a mathematical discipline based on objective assumptions like “the distance between factories A and B is X miles”).

But, as luck would have it, the Ph.D. program I stumbled into at Harvard, after falling in love with research while working on my senior thesis, required me to take a whole year of microeconomics at the graduate level. Happily, at Harvard, a new—and some would say subversive—subfield was blossoming in the economics department. Dubbed “behavioral economics,” it acknowledged human shortcomings and attempted to quantify and model the systematic and predictable mistakes we make as a result of passion, impatience, and limited cognitive capabilities, among other flaws. It was love at first sight for me with behavioral economics. Everything about it appealed—its youth and vast sets of unanswered questions, its obsession with data, its openness to multi-method research, its acknowledgment of human foibles, and its cross-disciplinary nature. At its heart, behavioral economics is a field focused on documenting the systematic errors we make, and so I became interested in studying these “decision biases.” 

Many amazing collaborators have been a part of my research in this area—a particularly frequent partner in crime being John Beshears. My research has examined biases such as escalation of commitment among Wall Street stock analysts and overconfidence exhibited by software developers when estimating the ease with which teamwork can be divided and conquered (a phenomenon my co-authors—Brad Staats and Craig Fox—and I dubbed “the team scaling fallacy”).

Related media content:

☛ Stubborn About Stocks: When Analysts Refuse to Admit They’re Wrong
Brian O'Connell