“Do I contradict myself? Very well, then, I contradict myself; I am large, I contain multitudes.”

– Walt Whitman

 

 

Social science study number 2:

 

Confirmation Bias –

Confirmation bias is a term often used to explain the motivations behind people’s cognitive reasoning. It describes our tendency to favour information that confirms a belief or value we already hold.

Since cognitive psychologist Peter Cathcart Wason identified the phenomenon in the 1960s, numerous studies have been conducted to uncover why we behave this way – a way that expresses an innate, stubborn self-righteousness, but simultaneously a virtuous loyalty and diligence in upholding our personal standards. What researchers have found is that when confronted with a new viewpoint or truth, we almost always place higher importance on the evidence we choose to believe, seeking proof that backs up our pre-held ideas and actively discarding the information that doesn’t confirm them. The same pattern shapes the way we interpret and recall new stories – we seek out non-opposing information to support the vision we hold for how we think the story should go.

Going beyond the theory and looking at examples from everyday life, perhaps this bias is behind the strange sense of disappointment we can find ourselves entertaining after a trip to the doctor. Say you went in complaining of some ambiguous health woe that had been plaguing you mysteriously for weeks, only to receive test results that come back perfectly clear and indicate no real imbalance anywhere in your system. It’s not that you’d rather be ill-fated and handed a grim diagnosis; it’s that the all-clear still doesn’t give you an answer for your health struggle – you feel as in the dark as you were before. Confirmation bias, applied to this scenario, is the wish that the test results would line up with whatever narrative you had devised and diagnosed yourself with before the visit, based only on your own knowledge and ideas.

Looking at this economically, we all know that the consumer choices we make can have real ethical consequences, yet we often willingly avoid confronting those consequences in complete truth. That is to say, as much as we like to tell ourselves and others that we stand for something moral, thanks to our existing biases our walk doesn’t always match our talk.

Researcher Neeru Paharia once wrote a paper titled ‘Sweatshop labour is wrong… unless the shoes are cute.’ In it she explored a phenomenon called motivated reasoning. This occurs when we as consumers begin our moral reasoning after we have already decided whether we like the item or product, rather than starting from a morally neutral place. It starts when we’ve already got the items in our hands, or on our feet, and suddenly feel as though life would be oh so wrong without them. We momentarily sway in our ethical staunchness in order to rationalise the purchase and avoid the reality of it. Neeru explained that our moral high-horse behaviour only comes back into play if the ‘shoes’ are ugly. “We often just decide what is moral based on how much we want something, really…”, she says. “We’ll often willingly go along with a product that is ethically problematic so long as we can come up with a way to distance ourselves from the ethical behaviour or action that produced it.”

This is related to another theory called wilful ignorance, which asks: if we had access to the information, the cold, hard ethical truths, would we look at them? Or would we wilfully ignore them, because our bias is that we love x, y, z, and if we found out how wrong it really was we would feel guilty? Because you can’t un-know what you know, this creates a deep irony: research showed that those who care the most about ethics may be the most willing to turn a blind eye to unethical business practices, because they know that if they found out what really went on, they would feel obliged to do something about it.

 

“Most ignorance is vincible ignorance. We don’t know because we don’t want to know.”

– Aldous Huxley

 

“Research in motivated reasoning suggests that people are likely to arrive at conclusions they prefer, as long as they maintain an ‘illusion of objectivity.’ By engaging in this rationalisation process, people may be able to consciously think of themselves as moral even as they engage in unethical behaviours by justifying the ethical burden away.”

She explained how things might change dramatically if we lived in an ‘on demand’ economy, where things were made (either ethically or unethically) only once you decided you wanted them. This would give us ownership of the decision and make us question whether it’s really a genuine need, and whether it really sits right with us ethically. “There would be a stronger ‘cause and effect’ connection, making us think about our role in enabling these harms. We wouldn’t be able to so easily distance ourselves from the ethical consequences of our economic actions.”

 

 

Well Societe, as with study number one and the endowment effect, it seems these social phenomena have a grip on us all, even as you read this article and told yourself you weren’t someone who behaves in such a way. That’s probably just your confirmation bias talking…

How’s that for a confusing human inception moment?

 

 
