17 March, 2015

Why can the NSA do what it does?

This is part one of two blog posts about how we make decisions, and how our lack of rationality results in much of the mess we have today. I'll start by addressing the title of this post - spying.

Much has been said about the NSA by very qualified people like Bruce Schneier, who compares the NSA to the Maginot Line in the years before World War II: ineffective and wasteful.
The costs in terms of civil liberties and money have yielded one confirmed case of somebody being caught thanks to NSA spying (and probably unfairly at that).

Like most technical people, I'm not impressed, but I am very worried about the erosion of our civil rights, through NSA spying and in other ways. And I am sure I share with others the impression that if only politicians and the general public knew more about the problem, we wouldn't make such bad decisions.

At the same time, I know I'm probably wrong about that. Like most people, I also care about global warming, health care, poverty, war, and the countless other things arguably Wrong With The World. And collectively, we know all there is to know about them.

Somebody, or a small cabal, must be causing this, then - an argument you often hear about many things gone wrong.

Is there a cabal? Let me invoke Hanlon's razor:
Never attribute to malice that which is adequately explained by stupidity.
Because I think it's the human condition that got us here, not malice of anybody in particular. We just, collectively AND individually, fail at making the right decisions.

So the question should be: what makes us so unreliable? So easy to lie to, especially in groups? Why do we believe conspiracy theories that often require us to believe far more fantastic things than the reality they try to disprove? Why do electric fans (supposedly) kill people only in South Korea?

I'd like to dig into that a bit in this post - more of an essay than a blog post, I suppose. The immediate reason is the mess surrounding the NSA (probably not news to most readers of my blog), where one-liners and the inherent complexity of the issue have ensured that most people I talk to don't see the problem.

The reality is that the more you know, the harder it is to have a firm opinion. Often, conflicts are like Israel vs Palestine - if you pick a side, you're wrong. The complexity of real-life issues makes it easy for governments and companies to play people - I feel an urge to point to Russia, but how do we know they are not right in claiming that Israel-backed Neo-Nazis used US-supplied weapons to shoot down Flight 17? You can point out that historically, (neo-)Nazis and Jews haven't gotten along very well. That's a fact. But so is the support of the EU for 'political reform' (overthrowing a legitimate, democratically elected government) in Ukraine. How valuable are facts and reason in a complicated situation?

I don't want to talk about the facts of any specific situation here. I'm not an expert in most of the relevant domains, and neither are most of you! But we do have decision power - that is how democracy works. Luckily, with the Internet today, it is possible to have the most important facts at hand. Unfortunately, that does not lead to better decisions. So I want to look at this from my background in psychology, and talk about how we deal with knowledge and how we make decisions.

Most of the time, we make decisions using 'common sense': our gut feeling, which was useful when we were still in Africa. But what works when you're hunting antelope might just not do the trick when you have to make decisions for an insurance company.

Why we don't get it

The problem lies in cognitive biases, described by Wikipedia as a "pattern of deviation in judgment". That's a nice understatement if you ask me.

A cognitive bias is a remnant of our time in Africa, setting us up for certain errors. You'd be surprised how much of our thinking is in our genes. It is why most people are afraid of snakes and spiders: somehow, our genes have programmed our brain to quickly learn to fear snake- and spider-like objects, but be totally fine with bunnies and flowers.

To understand how limited our ability for logical thinking is, let's remind ourselves of optical illusions. I'm sure you've seen many by now, and you should realize how flawed our vision is. Yet we have been evolving to 'see stuff' for millions of years. We have a big part of our brain dedicated to it, and we use our eyes all day long, every day of our life. Yet these images keep confounding us. To top it all off, there is change blindness, showing it isn't just bad - we are terrible.

Logic, on the other hand, is quite new. While some brilliant people were inventing math and logic thousands of years ago, most humans kept busy hunting, and later growing food. And most of the time, in day-to-day life, we run on autopilot. There's a reason we learn to recognize objects with little effort as babies, yet have to be taught the mere basics of math in excruciatingly long sessions at school! Humans are naturally horrible at math and logic, to the point where somebody with a reasonable knowledge of probability could have won half the riches of the ancient world by gambling - probability theory was only developed in the 17th century.

And if you think you did well in those 16 years of school, answer this:
A bat and ball together cost $1.10. 
The bat costs $1.00 more than the ball.
How much does the ball cost?
See the answer two chapters further, just before the end.

It shouldn't be a surprise, then, that the list of cognitive biases is as long as it is. Most of the time, instead of 'proper' logical thinking, we use heuristics: shortcuts to 'good enough' solutions. They work great - if you're hunting antelopes.

To state the obvious: we're not. To go back to the NSA - we're letting them go wild. And global warming, poverty, war... We're not doing that well, as Douglas Adams pointed out:
“This planet has - or rather had - a problem, which was this: most of the people living on it were unhappy for pretty much of the time. Many solutions were suggested for this problem, but most of these were largely concerned with the movement of small green pieces of paper, which was odd because on the whole it wasn't the small green pieces of paper that were unhappy.”

Some biases explained

You might have clicked some of the links above and seen some examples already, but let's bring this to life with a familiar example: why are so many people afraid of flying?

Let's turn the question around. SHOULD you be afraid of flying? There's a risk, certainly. But it is a well known fact (I won't even link to anything) that the chances of an accident when traveling by plane are larger on the way to and from the airport than while in the air! So why do many people still feel flying is dangerous? It is due to the way our brain does statistics. It's nothing like a calculator...

Our brains have ways of estimating how likely something is. They do that by digging in our memory: the easier it is to recall multiple instances of something, the more likely it is judged to happen. That makes sense - if you found food three times when walking a certain path, there is a high likelihood of finding more there rather than on a path where you never found anything to eat. It's clear where you should go if you are really hungry.

But our memory isn't terribly precise. First of all, it stores information in relation to emotion. This means that when you're sad you remember sad things better, and when you're happy, happy events come to mind more easily. It also stores things better when they are associated with strong emotions. This makes sense: events which upset, anger or scare you are probably more important than those which don't elicit any particular emotional response. You'll remember fewer details from your commute of a week ago Tuesday than from that crazy roller coaster ride, even though the latter didn't even take you anywhere special.

This effect causes us to make massive errors in estimating things like the likelihood of getting into a car accident, slipping in the shower, or getting killed by a terrorist. And flying, of course - plane accidents feature prominently in the press and cause a lot of anxiety. But let me remind you:
Never attribute to malice that which is adequately explained by stupidity.
These accidents are worth talking about - don't blame the media for reporting the news. It's just that we fail at judging how likely it is to happen to us.
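This "estimate frequency by ease of recall" mechanism can be sketched as a toy model. All numbers below are made up purely for illustration; the point is only that when recall is weighted by how memorable an event is, the 'felt' probability drifts far from the actual one:

```python
# Toy model of the availability heuristic: we judge how likely an event is
# by how easily instances come to mind, and emotionally charged events come
# to mind far more easily. The counts and salience weights are invented.

TRUE_COUNTS = {"car accident": 10000, "plane crash": 10}   # hypothetical yearly counts
SALIENCE = {"car accident": 1.0, "plane crash": 200.0}     # how memorable each event is

def availability_estimate(true_counts, salience):
    """Estimate relative frequency from salience-weighted 'recalled' instances."""
    recalled = {e: true_counts[e] * salience[e] for e in true_counts}
    total = sum(recalled.values())
    return {e: recalled[e] / total for e in recalled}

true_total = sum(TRUE_COUNTS.values())
for event, felt in availability_estimate(TRUE_COUNTS, SALIENCE).items():
    actual = TRUE_COUNTS[event] / true_total
    print(f"{event}: actual {actual:.1%}, 'felt' {felt:.1%}")
```

With these made-up weights, the plane crash - a thousand times rarer than the car accident - ends up 'feeling' like a sizeable fraction of the risk, which is roughly what the fear of flying looks like from the inside.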

And this is just one flaw in just one aspect of how we think - statistics. We're not too good when it comes to causality either, as XKCD points out below.

Our failure to handle statistics can be abused quite easily, and not just in gambling or other situations where statistics obviously play a role. In marriage counseling, a therapist might ask you to come up with five good things about your partner. This wouldn't be too hard, and would put you in a more positive mood. If the therapist were evil, however, he or she could ask for ten good things instead. While you might get to ten - a number that should inspire more confidence than a mere five - paradoxically, the difficulty of coming up with that many good properties is not going to be good for the relationship.

And there's (again) no malice needed. Stereotypes, for example, are known to cause behavioral changes without awareness of the subject ("you do something but don't know why you are doing it"). In an interesting experiment, it was shown that if you let a group of white people and people of color complete a standardized test, they can either do equally well, or not - depending on whether you tell them the test is "diagnostic of intellectual ability" or not!

That's right: when they know it is an IQ test, African-Americans score significantly worse than white people, while they actually do a little better when the same test is administered without any mention of its diagnostic abilities. They let the stereotype come true, unintentionally and unconsciously.

An awesome example of how prejudices influence us on a fundamental level can be found in this great video of a talk by Mahzarin Banaji, who shows an audience of liberal, well-educated scientists the gender bias in their brains. Watch the section with the test below, and while she does the test, participate! Say 'left' and 'right' out loud so you can experience the effect for yourself.

If you participate, you will notice that you easily do the first three tasks - but the fourth is inexplicably harder. This shows the fundamental bias in your brain: you take twice as long to answer a question that isn't congruent with your unconscious gender prejudice! Professor Banaji notes that she, herself, exhibits this same bias - about 75% of men and women do. Prejudice exists even in people with the best of intentions.

Nobody can be blamed for this - it is how we, humans, function; just another one of our many flaws. Stereotypes are central to how we function, influencing our behavior at every corner of our lives. They ARE the biases, the shortcuts of thought, themselves!

What does this all add up to? For one, you can't judge how dangerous something is. That is why we spend lots of time and money reducing risks that are tiny - see also this and this.

Confirmation bias

Let's discuss one more bias, confirmation bias. This is one particularly nasty bugger:
Confirmation bias, also called myside bias, is the tendency to search for, interpret, or recall information in a way that confirms one's beliefs or hypotheses. It is a type of cognitive bias and a systematic error of inductive reasoning.
Remember, the inductive reasoning Wikipedia mentions here is the kind based on probability: after you've seen over 50 expensive cars in a new place you're visiting, you might be inclined to think it is a rich city. Maybe. But remember what I said earlier: how does your brain assess probability? Memory recall.

Wikipedia explicitly mentions that your brain has a tendency to remember things that fit the theory you are trying to verify! That means it will fail to bring to mind the slum around the airport you saw from the airplane. Yes, your brain will actively take away your ability to properly judge a situation.

It sure makes you feel more confident about your decision. But not any less wrong.

And that is just the memory side of this bias. Countless studies have shown we explicitly look for evidence that fits our expectation. A truly impressive example is in the video below:

A little background on this experiment is here, and an opinion piece by the LA Times connects it to how white people demonize people of color in the aftermath of Ferguson. Of course, when it comes to minorities there are many biases in play; this is just one of them.

About the bat and the ball: no, the ball isn't 10 cents with the bat being 1 dollar - that would make the bat only 90 cents more expensive than the ball. The ball costs 5 cents, and the bat 1 dollar and 5 cents. Of course, if you did the math, you wouldn't get it wrong. But if you think quickly, you will - because you're then using a shortcut. Sadly, you do that most of the time, and not just when math is involved.
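For those who want to see the 'slow' route spelled out, here is the two-line algebra done explicitly (working in cents to keep the numbers exact):

```python
# Bat-and-ball puzzle, solved deliberately rather than by intuition.
# Let ball = x; then bat = x + 100, and x + (x + 100) = 110.
total = 110       # $1.10 in cents
difference = 100  # the bat costs $1.00 more than the ball

ball = (total - difference) // 2   # from 2x = total - difference
bat = ball + difference

print(f"ball: {ball} cents, bat: {bat} cents")  # ball: 5 cents, bat: 105 cents
assert bat + ball == total and bat - ball == difference
```

The intuitive "10 cents" answer fails the second check: 110 - 10 = 100, but 100 - 10 is only 90 cents of difference, not 100.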

The human intellect is particularly good at overestimating itself. We even consider unrealistic positivity about oneself a healthy attitude - the people who are more realistic tend to be clinically depressed. There's a sad list of self-deceptions here. And if you think that high intelligence means fewer cognitive biases - think again. It might even be that the opposite is true!

In a way, this video gives a philosophical view on the subject, inspired by Plato's cave:

I hope that with some of the facts and thoughts above, I've made you a little bit more humble, dear reader. Because if we're to solve the problems we have, we need exactly that...

Read part two for some thoughts on the consequences and dealing with our limitations. Feel free to comment below!