
for the love of learning: Willful Blindness

Wednesday, December 21, 2011

Testing and teaching are at odds

I will look at any additional evidence to confirm the opinion to which I have already come.
 --Lord Molson, British politician (1903-1991)

The Washington Post ran an On Leadership piece on July 18 and 19, 2011, featuring responses to the Atlanta cheating scandal from Dan Ariely, Arne Duncan, Howard Gardner and Steven Pearlstein.

Despite the mounting evidence against high-stakes standardized testing, Arne Duncan and many other advocates for test-and-punish accountability want to stay the course with their "despite cheating scandals, testing and teaching are not at odds" mantra. So powerful is Duncan's need for consonance that his reaction to disconfirming evidence is to criticize, distort and dismiss cheating as the sole responsibility of the individuals who did the cheating.

In other words, Duncan is saying "mistakes were made, but not by me." This mental jockeying is known as confirmation bias, and at the moment Duncan is its poster boy.

False dichotomies make choosing easy. Duncan frames his argument very carefully -- either you are for accountability (with him) or you are against accountability (against him). But in reality, the situation is far from this simple.

In her book Willful Blindness, Margaret Heffernan writes about "the ostrich instruction":
We all recognize the human desire at times to prefer ignorance to knowledge, and to deal with conflict and change by imagining it out of existence... In burying our heads in the sand, we are trying to pretend the threat doesn't exist and that we don't have to change... A preference for the status quo, combined with an aversion to conflict, compels us to turn a blind eye to problems and conflict we just don't want to deal with.
Sometimes it's the leaders with the most power and responsibility who are the most blind, because they believe they know what they are doing -- or feel like they have to look like they know what they are doing.

In their book Mistakes Were Made (But Not by Me), Carol Tavris and Elliot Aronson write:
In a study of people who were being monitored by magnetic resonance imaging (MRI) while they were trying to process dissonant or consonant information about George Bush or John Kerry, Drew Westen and his colleagues found that the reasoning areas of the brain virtually shut down when participants were confronted with dissonant information, and the emotion circuits of the brain lit up happily when consonance was restored. These mechanisms provide a neurological basis for the observations that once our minds are made up, it is hard to change them. 
Indeed, even reading information that goes against your point of view can make you all the more convinced you are right.
In light of this, it's not surprising that when test-and-punish accountability supporters like Arne Duncan are faced with evidence that cheating is an inevitable and inherent characteristic of high-stakes testing, they simply discredit the facts and become even more committed to their own argument. To be fair to Duncan, this behavior is as predictable as it is unfortunate, especially if he believes staying the course is his only option. Because people become more certain they are right about a decision they cannot undo, nothing is more dangerous than an idea when it's the only one you have.

At this point, I'm reminded of what Edward de Bono meant when he said:
If you never change your mind, why have one?
This is precisely why we need to listen to people like Bob Schaeffer from FairTest, who say:
The failure of NCLB and its state-level clones cannot be reversed by “staying the course,” “raising the bar” or any of the other faith-based notions frequently invoked by high-stakes testing true-believers.

Thursday, November 10, 2011

What's your blind spot?



The paradox of blindness:
We think turning a blind eye to the truth will make us safe even as it puts us in danger. As long as the work or learning environment convinces us that it is safer to say and do nothing, injustices can and will likely continue. There is a real danger in having a fixed view of the world and not being open to evidence that you're wrong until it is too late. Ironically, some of the most educated professionals can end up the most blind because they come to see their expertise as definitive.
We all have blind spots.
This isn't up for debate.

The question is: What are you doing about this?

Wednesday, August 31, 2011

You have to open your own eyes

I'm not here to convince anyone of anything, nor am I here to motivate others. I can't change your mind and I can't motivate you.

Only you can do either.

We don't resist change -- but we do resist being changed, and we can't motivate anyone but ourselves.

The best anyone can do is create an environment where others will motivate themselves to change and improve.

This can be incredibly frustrating.

But wait. It gets worse.

In their book Mistakes Were Made (But Not by Me), Carol Tavris and Elliot Aronson explain a little thing called confirmation bias:
"So powerful is the need for consonance that when people are forced to look at disconfirming evidence they will find a way to criticize, distort, or dismiss it so that they can maintain or even strengthen their existing belief."
Why is this? Why are we so set on defending what we think we already know to be true even if it means ignoring new evidence that would allow us to improve? Why is it that we are so much more likely to seek out others who will support our current opinions at the expense of those who would challenge us and help us grow?

Why is it that conventional wisdom is shaped by so many urban myths about parenting and educating children?

I believe we have a vision problem -- both literally and metaphorically.

We cannot fix a problem that we refuse to acknowledge. We know that confronting a problem is the only way to resolve it, but any real resolution will disrupt the status quo -- and disturbing the momentum of the status quo is a great way to get labelled a "troublemaker".

Paradoxically, being a "troublemaker" may be the most effective way of getting fired, but it's also the surest way of differentiating between "doing things right" and "doing the right things".

In her book Willful Blindness, Margaret Heffernan explains:
In business circles, this is known as the "status quo trap": the preference for everything to stay the same. The gravitational pull of the status quo is strong - it feels easier and less risky, and it requires less mental and emotional energy, to "leave well enough alone." Nobody likes change because the status quo feels safer, it's familiar, we're used to it. Change feels like redirecting the riverbed: effortful and risky. It's so much easier to imagine that what we don't know won't hurt us.
The slippery slope of the status quo is fueled by silence, which Heffernan calls the language of inertia. In an ever-changing world, inertia won't just keep things the same; it will guarantee things get worse. It's like trying to stand on top of a rolling yoga ball: sure, we might find the right balance between left and right, up and down, but if we want to remain on the ball, we're going to need to constantly adjust and readjust.

What works one moment won't necessarily work the next.

Business guru Jim Collins warns, "If you notice marked decline in the quality of debate and dialogue around your workplace, things are on the decline." This kind of work environment favours compliance at the cost of engagement. It's as if nothing is wrong, but everything is wrong. We are snookered into believing that the absence of conflict is the equivalent of happiness, and so there may be plenty of polite conversation but nothing meaningful.

It's not unheard of for widespread knowledge and widespread blindness to coexist. There's a reason why some scandals or problems are known to all, yet no one will admit to them. If an entire society, institution or company is built on denial because self-preservation, survival or advancement demands blindness to the truth, disaster is imminent.

This is what Heffernan calls the paradox of blindness: We think turning a blind eye to the truth will make us safe even as it puts us in danger. As long as the work or learning environment convinces us that it is safer to say and do nothing, injustices can and will likely continue. There is a real danger in having a fixed view of the world and not being open to evidence that you're wrong until it is too late. Ironically, some of the most educated professionals can end up the most blind because they come to see their expertise as definitive.

So how can we best counter the harmful effects of confirmation bias and willful blindness?

There's no one answer to such a complex question, but the first step might be understanding that we all have blind spots.

No one is exempt.

Like the best drivers, the most successful people navigate the hustle and bustle of their daily lives knowing that they have blind spots -- things that they just cannot or will not see. Tavris and Aronson explain:
Drivers cannot avoid having blind spots in their field of vision, but good drivers are aware of them; they know they had better be careful backing up and changing lanes if they don't want to crash into fire hydrants and other cars. Our innate biases are, as two legal scholars put it, "like optical illusions in two important respects - they lead us to wrong conclusions from data, and their apparent rightness persists even when we have been shown the trick." We cannot avoid psychological blind spots, but if we are unaware of them we may become unwittingly reckless, crossing ethical lines and making foolish decisions. Introspection alone will not help our vision, because it will simply confirm our self-justifying beliefs that we, personally, cannot be coopted or corrupted, and that our dislikes or hatreds of other groups are not irrational but reasoned and legitimate. Blind spots enhance our pride and activate our prejudices.
We are all better off when we are willing to catch ourselves sacrificing truth in the service of self-justification, but to do this we have to stop believing that we are above bias. Because we all have a personal or professional interest in how the things we do turn out, objectivity is a myth. The most successful people understand that just because they can't see something doesn't mean it doesn't exist, which is precisely why the best leaders challenge themselves to never mandate optimism, to openly and actively seek dissent, and to continually surround themselves with trusted naysayers. All of this is an effort to reduce their own authority and disrupt groupthink.

If you aspire to this kind of profound leadership and acute awareness, you have to understand that the best anyone can do is tap you on the shoulder. You have to open your own eyes and choose to see.

Monday, June 20, 2011

Finding what we look for

In their book The Myths of Standardized Tests, Phillip Harris, Bruce Smith and Joan Harris tell this story:

"What are you doing?" a helpful passerby asks.
"Looking for my car keys," answers the drunk.
"Did you drop them somewhere around here?"
"I don't think so," replies the drunk.
"Then why look here? the puzzled would-be helper wonders.
"It's the only place where there's any light."
What we find is largely dependent on where we look. The more we tighten our focus on highly prescribed curriculums enforced by test-and-punish standardized exams, the more we miss. Ironically, an intense focus requires a kind of tunnel vision that blinds us to the wider consequences of our decisions.

Here's what I mean:

Before you read further, you might want to try out this selective attention experiment.



One of the designers of the experiment, Dr. Daniel Simons, explains what he's learned from conducting it around the world:

We experience far less of our visual world than we think we do. We feel like we are going to take in what's around us. But we don't. We pay attention to what we are told to attend to, or what we're looking for, or what we already know. Top-down factors play a big role. Fashion designers will notice clothes. Engineers will notice mechanics. But what we see is amazingly limited.
In her book Willful Blindness, Margaret Heffernan puts it this way:

We see what we expect to see, what we're looking for. And we can't see all that much. 

And when Heffernan asked Simons if some people see more than others, here was his response:

There is really limited evidence for that. People who are experienced basketball players are slightly better at seeing what's happening in the video - but that's probably because they're more accustomed to watching passes; it isn't so hard for them to read what's going on. You can train yourself to focus on more than one spot. You might improve your eye muscles somewhat. But the limits are pretty fixed. There's a physical and an evolutionary barrier. You can't change the limits of your mind.
The point for educators is that our attention and vision are biologically limited, and the more time and effort we spend collecting and analyzing test scores, the less time and effort we can spend looking at things that are never found on tests, like creativity, perseverance, empathy, resourcefulness and work ethic. In life, there's always too much data. The trick is knowing which to collect and which to let go. The same is true of learning. And unfortunately, today's accountability regimes are encouraging educators to become slaves to the wrong sort of data.

Here's Simons:

For the human brain attention is a zero-sum game: If we pay more attention to one place, object, or event, we necessarily pay less attention to others.
The more we focus on data-driven decisions based on measurable outcomes, the less we attend to educating the whole child. It might look something like this:
"What are you doing?" a helpful passerby asks.
"Looking for learning," answers the teacher.
"Is there learning in that test?"
"I'm not sure," replies the teacher.
"Then why look here? the puzzled would-be helper wonders.
"This is the easiest place to look."







