Book Report: You Are Not So Smart by David McRaney

I’ve read quite a few books that build on the content of Dan Ariely’s Predictably Irrational, and I have often found myself rejecting them as too derivative. A few days into You Are Not So Smart by David McRaney, I was starting to complain about the book for fitting that mold, but a friend “politely” convinced me that I was being an ass.*

With an adjusted attitude, I got right into the rest of the book, enjoyed the hell out of it, and almost certainly learned more than a few useful things.

At times the book can be a bit frustrating in its failure to address how to overcome the tendencies it describes (which probably explains the sequel: You Are Now Less Dumb).

This book isn’t marketed as a business book, but it could easily be. Tell me you haven’t sat in a meeting like the one described in this excerpt:

When a group of people come together to make a decision, every demon in the psychological bestiary will be summoned.
Conformity, rationalization, stereotyping, delusions of grandeur — they all come out to play, and no one is willing to fight them back into hell because it might lead to abandoning the plan or a nasty argument. Groups survive by maintaining harmony. When everyone is happy and all egos are free from harm it tends to increase productivity. This is true whether you are hunting buffalo or selling televisions. Team spirit, morale, group cohesion — these are golden principles long held high by managers, commanders, chieftains, and kings. You know instinctively that dissent leads to chaos, so you avoid it.
This is all well and good until you find yourself in a group your brain isn’t equipped to deal with — like at work. The same mind that was formed to deal with group survival around predators and prey doesn’t fare so well when dealing with bosses and fiscal projections. No matter what sort of job you have, from time to time everyone has to get together and come up with a plan. Sometimes you do this in small groups, sometimes as an entire company. If your group includes a person who can hire or fire, groupthink comes into play.
With a boss hanging around, you get nervous. You start observing the other members of the group in an attempt to figure out what the consensus opinion is. Meanwhile, you are simultaneously weighing the consequences of disagreeing. The problem is, every other person in the group is doing the same thing, and if everyone decides it would be a bad idea to risk losing friends or a job, a false consensus will be reached and no one will do anything about it.
Often, after these sorts of meetings, two people will talk in private and agree they think a mistake is being made. Why didn’t they just say so in the meeting?
Psychologist Irving Janis mapped out this behavior through research after reading about the U.S. decision to invade southern Cuba — the Bay of Pigs. In 1961, President John F. Kennedy tried to overthrow Fidel Castro with a force of 1,400 exiles. They weren’t professional soldiers. There weren’t many of them. Cuba knew they were coming. They were slaughtered. This led to Cuba getting friendly with the USSR and almost led to nuclear apocalypse. John F. Kennedy and his advisers were brilliant people with all the data in front of them who had gotten together and planned something incredibly stupid. After it was over, they couldn’t explain why they did it. Janis wanted to get to the bottom of it, and his research led to the scientific categorization of groupthink, a term coined earlier by William H. Whyte in Fortune magazine.
It turns out, for any plan to work, every team needs at least one asshole who doesn’t give a shit if he or she gets fired or exiled or excommunicated. For a group to make good decisions, they must allow dissent and convince everyone they are free to speak their mind without risk of punishment.
It seems like common sense, but you will rationalize consensus unless you know how to avoid it. How many times have you settled on a bar or restaurant no one really wanted to go to? How many times have you given advice to someone that you knew wasn’t really your honest opinion?

*I’m better for the conversation; it’s not as if I remember Predictably Irrational cover-to-cover. I’ll be less likely to reject by default the next book that reminds me of it. I may not be so smart, but I’m a little bit less dumb. Thanks, Adam!
