23 March, 2015

Dealing with our flaws in thinking

This is a follow-up to a post about our limited human rationality. In that post I described some facts - just a few - that perhaps left you a little more in doubt about your cognitive abilities. Or at least more aware of the limitations our human condition comes with!

Consequences

Unfortunately, these and the many other flaws in our thinking have consequences for decision making in our society, especially when there's money to be made. The weapons-manufacturing lobby is rather stronger than that of companies making anti-slip shower mats; and for car manufacturers, safety is merely a factor that increases the cost of a car, so there's little incentive for them to hammer on that issue either. The combination of our innate inability to judge how likely these and other things are to harm us and the financial pressure on politicians results in massive over-spending on what is in essence irrelevant, or even dangerous and harmful to our society. The NSA is one example, the stupidity around net neutrality another, and the war on drugs a third rather prominent one. And now Ebola, of course - a disease so unlikely to kill you that you're probably more likely to be killed by a falling piano.

I think it is pretty clear, as I mentioned above, that politics and business happily abuse our lack of rationality. But probably more often, 'the system' causes issues by itself, as the insanely huge political divide in the US shows. It pays off for the media to put extreme people in front of their audience - and today, we have a country where you can't discuss politics at the office, because people see the world so differently that only conflict can come out of a conversation. Think of the biases I discussed earlier: these world views aren't likely to get fixed easily, either.
Never attribute to malice that which is adequately explained by stupidity.
I don't think anybody set out to create this divide - but it is with us now.

Now indeed, the media are part of a system working against us. They get rewarded for creating an impression of problems; and they are often uninformed and biased themselves. As John Oliver pointed out, we don't put a believer in regular human abductions by aliens in front of a camera to debate a scientist, attempting to give both sides of the debate an equal chance! We pity the few who still don't get that this, and many other issues, are settled.


Yet this is what often happens in areas where science has long come to a conclusion. Not just the moon landing but also vaccinations, global warming, evolution and a great many more things. Take the recent "Snowden wants to come home" media frenzy!

I don't think any of that is intentional. It's the system rewarding the wrong things. We are part of that 'system': we prefer news that supports our viewpoint; and we prefer new and exciting things - a balanced point of view is boring.

Quality decision making gets harder and harder.

Dealing

One way of dealing with disagreement has been to essentially declare all facts 'up for discussion'. It all depends on your point of view, proponents of this idea say. But reality isn't as malleable as relativists make it out to be. You can choose to leave your house through the front window on the 3rd floor, but gravity's a bitch. It's nice that some want to value everybody's opinion, but the universe imposes limits on that.

We have to realize that the world is real. People can be right or wrong about it and the choices we make matter!


As a society, we need to find new ways to make decisions in a healthy way. We've done good things - we eradicated smallpox and have nearly eliminated polio, diseases that were around for many thousands of years. River blindness is hopefully next, and others will follow. But we also drove half the world's animals toward extinction and are abusing this planet to the point where it will become a much more hostile place in a century or two unless we change something. You can guess I'm not much into libertarianism - it is clear that we can and do impact the world, and going it alone does not solve the tragedy-of-the-commons-style issues we have. There's a problem - and the inherent complexity of the world is certainly part of it, as is our lack of rationality.

How do we deal? I used to be an optimist - when I discovered the Internet, I thought it would democratize knowledge (it has) and news (not so much). Social media, sites like Digg where people vote on what the 'best news' is - it seemed a new and improved reality. No more single points of failure. No journalists who could be bought or oppressed. And then there were open source communities, with their flat structures of decision making and ideals of equality and meritocracy. Democracy would thrive!

Reality was harsh. The Internet has allowed us to hide in our corners with like-minded people. It has lots of good stuff (if you're not into economics or net neutrality, this is a good read on both) but the Internet didn't kill conspiracy theories, it fuelled them. And open source works for some, but has its own issues of inequality (and that is just one problem).

Perhaps technology can help - Google has apparently found ways to determine what's true and what isn't. I'm not so sure. I wonder what it would do with sites like this one, which prove that even with mere facts you can create conspiracy nonsense.

Methods to the rescue

I think the solution has to be found in a system or process akin to the scientific method. Wikipedia describes it as follows:
The scientific method is a body of techniques for investigating phenomena, acquiring new knowledge, or correcting and integrating previous knowledge.
I prefer to call it a process, rather than a 'body of techniques'. The key is that, left to common sense, humanity decided for more than 2,000 years that life emerges from lifeless matter - until Louis Pasteur showed it really, really doesn't (except for this). The scientific method thus aims to take human judgment out of the equation, or at least to rigorously deal with the biases that cloud it. Lots of books have been written on the philosophy of science - I got my portion by way of Chalmers, worth reading.

However it works, the key is that while science relies on people and thus makes mistakes, it has a process for dealing with those mistakes, correcting them over time. Confidence is gained over long periods, and the result is that since the scientific method became widely used we have largely been refining the knowledge we gained, rather than rewriting the world as we know it over and over again. Yes, Newton's theories of physics still stand - quantum mechanics and Einsteinian physics merely refine them, providing better results in areas Newton's can't reach. Uncertainty exists in science, but only at the 'edges', where new knowledge is created. While many details of evolution are still debated, since the modern evolutionary synthesis we've settled on a core which is as solid as Newton's ideas about gravity; climate models might be imperfect today, but much of what we discovered does not have to be debated over and over again.

Method for decision making

We have methods, systems, and processes for decision making, too. Democracy is one, the trias politica part of it. But it has flaws and needs some refinement, ideally in the opposite direction of Citizens United. I don't think we can make a Philosopher King system work, so whatever we come up with has to be a bureaucracy, evolved from today's system. I think decentralization is part of the solution (mayors should rule the world?), but we live together on this planet, so there have to be over-arching structures, too.

I know there is research being done on the topic. And we've already come up with some strategies, like the devil's advocate.

What exactly the solution should look like - don't ask me. I'm a psychologist, I can merely tell you not to trust people and their gut instincts. If this feels like an anti-climax, well, it should. We will have to come up with solutions together - not one blogger alone!

Soon?

But we should hurry.

I believe, with self-described plutocrat Nick Hanauer, that the pitchforks are coming. Perhaps the militarization of police and the efforts of governments (the NSA in particular) to disrupt online security are attempts to prepare for social unrest.

Maybe.

What I'm certain about is that humanity can't continue functioning the way it does now. If social inequality doesn't put a stop to it, the depletion of natural resources or religious fundamentalists will. The Dutch would say: the wall will turn the ship.

Let's see how hard we'll hit it.

17 March, 2015

Why can the NSA do what it does?

This is part one of two posts about how we make decisions and how our lack of rationality results in much of the mess we have today. I'll start by addressing the title of this post - spying.

Much has been said about the NSA by very qualified people like Bruce Schneier, who compared the NSA to
the Maginot Line (..) in the years before World War II: ineffective and wasteful.
The costs, in civil liberties and money, bought exactly one confirmed case of somebody being caught thanks to NSA spying (and probably unfairly at that).

Like most technical people, I'm not impressed, but I am very worried about the erosion of our civil rights, through NSA spying and in other ways. And I am sure I share with others the impression that if only politicians and the general public knew more about the problem, we wouldn't make such bad decisions.

At the same time, I know I'm probably wrong about that. Like most people, I also care about global warming; health care; poverty; war; and the countless other things arguably Wrong With The World. And collectively, we know all there is to know about them.

Somebody, or a small cabal, must be causing this, then - an argument you often hear about many things gone wrong.

Is there a cabal? Let me invoke Hanlon's razor:
Never attribute to malice that which is adequately explained by stupidity.
Because I think it's the human condition that got us here, not malice of anybody in particular. We just, collectively AND individually, fail at making the right decisions.

So the question should be: what makes us so unreliable? So easy to lie to, especially in groups? Why do we believe conspiracy theories that often require us to believe far more fantastic things than the reality they try to disprove? Why does 'fan death' kill people only in South Korea?

I'd like to dig into that a bit in this post, more of an essay than a blog, I suppose. The immediate reason is the mess surrounding the NSA (probably not news to most readers of my blog), where one-liners and the inherent complexity of the issue have ensured most people I talk to don't see the problem.

The reality is that the more you know, the harder it is to have a firm opinion. Often, conflicts are like Israel vs Palestine - if you pick a side, you're wrong. The complexity of real-life issues makes it easy for governments and companies to play people - I feel an urge to point to Russia, but how do we know they are not right in claiming that Israel-backed neo-Nazis used US-supplied weapons to shoot down Flight 17? You can point out that historically, (neo-)Nazis and Jews haven't gotten along very well. That's a fact. But so is the support of the EU for 'political reform' (overthrowing a legitimate, democratically elected government) in Ukraine. How valuable are facts and reason in a complicated situation?

I don't want to talk about the facts of any specific situation here. I'm not an expert in most of the relevant domains, and neither are all of you! But we do have decision power - that is how democracy works. Luckily, with the Internet today, it is possible to have the most important facts at hand. Unfortunately, that does not lead to better decisions. So I want to look at it from my background in psychology and talk about how we deal with knowledge and how we make decisions.

Most of the time, we make decisions using 'common sense'. Our gut feeling, which was useful when we were still in Africa. But what works when you're hunting antelope might just not do the trick when you have to decide for an insurance company.

Why we don't get it

The problem lies in cognitive biases, described by Wikipedia as a "pattern of deviation in judgment". That's a nice understatement if you ask me.

A cognitive bias is a remnant of our time in Africa, setting us up for certain errors. You'd be surprised how much of our thinking is in our genes. It is why most people are afraid of snakes and spiders: somehow, our genes have programmed our brain to quickly learn to fear snake- and spider-like objects, but be totally fine with bunnies and flowers.

To understand how limited our ability for logical thinking is, let's remind ourselves of optical illusions. I'm sure you've seen many by now, and you should realize how flawed our vision is. Yet we have evolved to 'see stuff' for millions of years. We have a big part of our brain dedicated to it. And we use our eyes all day long, every day of our life. Yet these images keep confounding us. To top it all off, there is change blindness, showing it isn't just bad - we are terrible.

Logic, on the other hand, is quite new. While some brilliant people were inventing math and logic thousands of years ago, most humans kept busy hunting and, later, growing food. And most of the time, in day-to-day life, we run on automatic pilot. There's a reason we learn to recognize objects with little effort already as babies, but have to be taught the mere basics of math during excruciatingly long sessions at school! Humans are naturally absolutely horrible at math and logic, to the point where somebody with a reasonable knowledge of probability could have won half the riches of the ancient world by gambling - probability theory was only developed in the 17th century.
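A concrete illustration of my point (this example is mine, not something science invented yesterday): the gambling puzzle of the Chevalier de Méré, which in the 1650s prompted Pascal and Fermat to work out probability theory in the first place. Gamblers' intuition said the two bets below were equally good; a few lines of arithmetic show one wins money in the long run and the other loses it.

```python
from fractions import Fraction

# Bet A: at least one six in 4 rolls of a single die.
p_a = 1 - Fraction(5, 6) ** 4

# Bet B: at least one double-six in 24 rolls of two dice.
# Intuition says this is the "same" bet scaled up; it isn't.
p_b = 1 - Fraction(35, 36) ** 24

print(round(float(p_a), 3))  # 0.518 - a (slightly) winning bet
print(round(float(p_b), 3))  # 0.491 - a (slightly) losing bet
```

For millennia, nobody at the gambling table could tell these two apart by gut feeling - which is exactly the kind of money a person with a little probability theory could have quietly collected.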

And if you think you did well in those 16 years of school, answer this:
A bat and ball together cost $1.10. 
The bat costs $1.00 more than the ball.
How much does the ball cost?
See the answer two chapters further, just before the end.

It shouldn't be a surprise to learn that the list of cognitive biases is as long as it is. Most of the time, instead of 'proper' logical thinking, we use heuristics: shortcuts to 'good enough' solutions. They work great - if you're hunting antelope.

To state the obvious: we're not. To go back to the NSA - we're letting them go wild. And: global warming, poverty, war... We're not doing that well, as Douglas Adams pointed out:
“This planet has - or rather had - a problem, which was this: most of the people living on it were unhappy for pretty much of the time. Many solutions were suggested for this problem, but most of these were largely concerned with the movement of small green pieces of paper, which was odd because on the whole it wasn't the small green pieces of paper that were unhappy.”


Some biases explained

You might have clicked some of the links above and seen some examples already, but let's bring this to life with a familiar example: why are so many people afraid of flying?

Let's turn the question around. SHOULD you be afraid of flying? There's a risk, certainly. But it is a well-known fact (I won't even link to anything) that the chance of an accident when traveling by plane is larger on the way to and from the airport than while in the air! So why do many people still feel flying is dangerous? It is due to the way our brain does statistics. It's nothing like a calculator...

Our brains have ways of estimating how likely something is. They do that by digging in our memory: the easier it is to recall multiple instances of something, the more likely it is to happen. That makes sense - if you found food three times when walking a certain path, there is a high likelihood of finding more there rather than on a path where you never found anything to eat. It's clear where you should go if you are really hungry.


But our memory isn't terribly precise. First of all, it stores information in relation to emotion. This means that when you're sad you remember sad things better, and when you're happy, happy events come to mind more easily. It also stores things better when they are associated with strong emotions. This makes sense: events which upset, anger or scare you are probably more important than those which don't elicit any particular emotional response. You'll remember fewer details from your commute to work a week ago Tuesday than from that crazy roller coaster ride, even though the latter didn't even get you anywhere special.

This effect causes us to make massive errors in estimating things like the likelihood of getting into a car accident, slipping in the shower or getting killed by a terrorist. And flying, of course - plane accidents feature prominently in the press and cause a lot of anxiety. But let me remind you:
Never attribute to malice that which is adequately explained by stupidity.
These accidents are worth talking about - don't blame the media for reporting the news. It's just that we fail at judging how likely it is to happen to us.

And this is just one flaw related to just one aspect of how we think - statistics. We're not too good when it comes to statistics and causality either, as XKCD points out below.



Our failure to handle statistics can be abused quite easily, and not just in gambling or other situations where statistics obviously play a role. In marriage counseling, a therapist might ask you to come up with five good things about your partner. This wouldn't be too hard, and it would put you in a more positive mood. If the therapist were evil, however, he or she could ask for ten good things instead. While you might get to ten - a number that should inspire more confidence than a mere five - paradoxically, the difficulty of coming up with that many properties is not going to be good for the relationship.

And there's (again) no malice needed. Stereotypes, for example, are known to cause behavioral changes without awareness of the subject ("you do something but don't know why you are doing it"). In an interesting experiment, it was shown that if you let a group of white people and people of color complete a standardized test, they can either do equally well, or not. It depends on whether you tell them the test is "diagnostic of intellectual ability" or not!


That's right: when they know it is an IQ test, African-Americans score significantly worse than white people, while they actually do a little better when the same test is given without any mention of its diagnostic abilities. They let the stereotype come true, unintentionally and unconsciously.

An awesome example of how prejudices influence us on a fundamental level can be found in this great video of a talk by Mahzarin Banaji, who shows an audience of liberal, well-educated scientists the gender bias in their brains. Watch the section with the test below, and while she does the test, participate! Say 'left' and 'right' out loud so you can experience the effect for yourself.


If you participate, you will notice that you easily do the first three tasks - but the fourth is inexplicably harder. This shows the fundamental bias in your brain: you take twice as long to answer a question that isn't congruent with your unconscious gender prejudice! Professor Banaji notes that she, herself, exhibits this same bias - about 75% of men and women do. Racism exists even in people with the best of intentions.

Nobody can be blamed for this - it is how we, humans, function; just another one of our many flaws. Stereotypes are central to how we function, influencing our behavior at every corner of our lives. They ARE the biases, the shortcuts of thought, themselves!

What does this all add up to? For one, you can't judge how dangerous something is. That is why we spend lots of time and money reducing risks that are tiny; see also this and this.

Confirmation bias

Let's discuss one more bias: confirmation bias. This one is a particularly nasty bugger:
Confirmation bias, also called myside bias, is the tendency to search for, interpret, or recall information in a way that confirms one's beliefs or hypotheses. It is a type of cognitive bias and a systematic error of inductive reasoning.
Remember, the inductive reasoning Wikipedia mentions here is the kind based on probability: after you've seen over 50 expensive cars in a new place you're visiting, you might be inclined to think it is a rich city. Maybe. But remember what I said earlier: how does your brain assess probability? Memory recall.

Wikipedia mentions explicitly that your brain has a tendency to remember things that fit the theory you are trying to verify! That means that it will fail to bring to mind the slum around the airport you saw from the airplane. Yes, your brain will explicitly take away your ability to properly judge a situation.
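To make the mechanism concrete, here is a toy simulation of the 'rich city' example - every number in it is invented for illustration, not taken from any study. The 'world' contains 20% expensive cars, but memory keeps observations that confirm the theory far more often than ones that don't, so the probability estimate your brain computes from recall comes out far too high:

```python
import random

random.seed(42)

P_EXPENSIVE = 0.2        # true fraction of expensive cars (made up)
RECALL_CONFIRMING = 0.9  # chance of remembering a car that fits the theory (made up)
RECALL_OTHER = 0.3       # chance of remembering a car that doesn't (made up)

remembered_expensive = remembered_total = 0
for _ in range(10_000):                       # cars you drive past
    expensive = random.random() < P_EXPENSIVE
    recall = RECALL_CONFIRMING if expensive else RECALL_OTHER
    if random.random() < recall:              # did this car make it into memory?
        remembered_total += 1
        remembered_expensive += expensive

estimate = remembered_expensive / remembered_total
print(f"true fraction: {P_EXPENSIVE}, fraction you remember: {estimate:.2f}")
```

With these (hypothetical) recall rates, the remembered fraction lands around twice the true one: the biased estimate isn't produced by bad reasoning at the end, but by the skewed sample your memory hands you to reason from.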

It sure makes you feel more confident about your decision. But not any less wrong.

And that is just the memory side of this bias. Countless studies have shown we explicitly look for evidence that fits our expectation. A truly impressive example is in the video below:

A little background on this experiment is here, and an opinion piece in the LA Times connects it to how white people demonize people of color in the aftermath of Ferguson. Of course, when it comes to minorities, there are many biases in play; this is just one of them.

About the bat and the ball. The bat was a dollar more expensive than the ball - and no, the ball isn't 10 cents with the bat being 1 dollar. That would make the bat only 90 cents more expensive. The answer is 5 cents for the ball and 1 dollar and 5 cents for the bat. Of course, if you did the math, you wouldn't get it wrong. But if you think quickly, you do - because you're using a shortcut. Sadly, you do that most of the time, and not just when math is involved.
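If you want to see the math spelled out rather than trusted, here is the two-line algebra as code (my sketch of the standard solution, nothing more):

```python
# The puzzle says:  bat + ball = 1.10  and  bat = ball + 1.00.
# Substituting: (ball + 1.00) + ball = 1.10, so 2 * ball = 0.10.
ball = (1.10 - 1.00) / 2
bat = ball + 1.00
print(f"ball: ${ball:.2f}, bat: ${bat:.2f}")  # ball: $0.05, bat: $1.05
```

The intuitive answer (10 cents) fails the second condition: a $1.00 bat is only 90 cents more expensive than a 10-cent ball.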

The human intellect is particularly good at over-estimating itself. We even consider unrealistic positivity about oneself a healthy attitude - people with a more realistic self-image tend to be clinically depressed. There's a sad list of self-deceptions here. And if you think that high intelligence means fewer cognitive biases - think again. It might even be that the opposite is true!

In a way, this video gives a philosophical view on the subject, inspired by Plato's cave:


I hope that with some of the facts and thoughts above, I've made you a little bit more humble, dear reader. Because if we're to solve the problems we have, we need exactly that...

Read part two for some thoughts on the consequences and dealing with our limitations. Feel free to comment below!

13 March, 2015

Why open source works

Trying to explain why open source works, you can of course point to The Cathedral and the Bazaar by Eric S. Raymond. But the kernel development process shows it happening 'in real time', every day, and that's a major reason why I so enjoy reading the weekly LWN.

Competition FTW

A recent article explained how two patch sets under development both impact the huge TLB feature. One improves reference counting in the handling of huge pages, to allow a huge page to be mapped in 'huge page mode' and in 'normal page mode' by different processes at the same time. This is a first step towards being able to use transparent huge pages with the page cache, giving it access to the performance benefits of huge pages. It has been in the works for a while, as it is a very complex change. But an alternative has recently surfaced: adding transparent huge page support to the tmpfs filesystem. This also has the pieces needed to support huge pages in the page cache, but in a very different way.

Why on earth would you want developers to spend so much time on two competing solutions for a problem? Isn't there One to Rule Them All?

Finding that perfect solution

I am sure that there is a Perfect Solution. I would simply suggest that nobody knows it! Like in economics, where an equilibrium exists but no single person can define it, software development has become complicated enough that competition of ideas often leads to the best (or at least better) solutions.

Companies, building their own cathedral in-house, would put product managers and developers in a room and let them come up with the Perfect Solution. But they won't, because of Joy's law:
"No matter who you are, most of the smartest people work for someone else"

Impressive results

This is the key to why open source works, and works so well. No single development project in the world even approaches the size of the Linux kernel project: no fewer than 1,400 people contributed in under six weeks. With millions of lines of code developed by tens of thousands of people over 20 years' time, and with an extremist attitude towards backwards compatibility, you wouldn't expect the trend to be "go faster". It is.

Bringing together almost 12,000 different people from 1,200 different companies in the last decade, the kernel competes with Wikipedia for the title of humanity's largest collaborative effort. Oh, I'm sure the crowds building the pyramids were big, too - they just didn't integrate the experience of all their collaborators into the architecture as well as the kernel and Wikipedia do ;-)

Cookie licking

Many other projects struggle to learn from the kernel's success. That success has many aspects, one of which is exemplified here: a rule Frank sometimes talks about - no cookie licking!

In short, cookie licking is preventing others from working on something by claiming you are working on it (but not really making progress). Extending it a bit, I would say that, while collaboration is good, if you feel you can do better - or just want to try a different approach:
"never be deterred to build an alternative solution to a problem"
Because it's space for competition within projects that makes open source work better.