
Earlier this week, I asked for examples of oversimplified expectations: when administrators reduce teaching to whatever is easiest to observe and document…

…even if that means lower-quality instruction for students…

…and downright absurd expectations for teachers. 

And wow, did people deliver. My favorite example so far:

The main push this year is “where is the teacher standing?” (with the implication that “at the front” = bad).

Teachers now lecture from the back of the room (with the projection up front), which is resulting in a diminished learning environment for the students, even while earning more “points” for the teacher from the roaming administrators.

Students have even complained that they have to turn around just to listen well…

…the teachers miss out on many interactions with the students because they can't see the students' faces and reactions to the (poor) lectures.

You can't make this stuff up!

But here's the kicker: at least this school is trying!

The administrators are getting into classrooms, and emphasizing something they think will be better for students. 

That's more than most schools are doing! But we can do better.

Having clear expectations is great.

Getting into classrooms to support those expectations is great. 

Giving teachers feedback on how they're doing relative to shared expectations is great. 

But the “how” matters. It matters enormously. 

So why are schools taking such a reductive, dumbed-down approach to shared expectations? 

I have a one-word answer and explanation: data.

I blame the desire for data. 

To collect data, you MUST define whatever you're measuring reductively. 

If your goal is to have a rich, nuanced conversation, you don't have to resort to crude oversimplifications.

If you talk with teachers in depth about lecturing less and getting around the classroom more as you teach, the possibilities are endless.

But if your goal is to fill out a form or a spreadsheet—well, then you have to be reductive.

In order to produce a check mark or score from the complex realities of teaching and learning…oversimplifying is the only option. 

So here's my question—and I'd love to have your thoughts on this:

What if we stopped trying to collect data?

What if we said, as a profession, that it's not our job as instructional leaders to collect data?

As a principal and teacher in Seattle Public Schools, I interacted with many university-trained researchers who visited schools to collect data. 

I myself was trained in both qualitative and quantitative research methods as part of my PhD program as well as earlier graduate training. 

I knew how to collect data about classroom practice…

But as a principal, I realized that I was the worst person in the world to actually do this data collection in my school.

Why? Because of what scholars have identified as one of the biggest threats to quality data collection:

Observer effects.

When the principal shows up, teachers behave differently.

When teachers know what the observer wants to see, the song-and-dance commences. 

You want to see students talking with each other? OK, I'll have them “turn and talk” every time you walk into the room, Justin. Write that down on your little clipboard.

You don't want me to lecture from the Smartboard all day? OK, I'll stand at the back, and lecture from there, Colleague.

The late, great Rick DuFour—godfather of Professional Learning Communities—used to tell the story of how he'd prepare his students for formal observations when he was a teacher.

I'm paraphrasing, but it went something like this:

OK, kids—the principal is coming for my observation today, so whenever I ask a question, you all have to raise your hands.

If you know the answer, raise your right hand. If you don't know the answer, raise your left hand, and I won't call on you.

The principal needed “data” on whether students were engaged and understanding the lesson…so the teacher and students obliged with their song-and-dance routine.

Across our profession, in tens of thousands of schools, we're engaged in a conspiracy to manufacture data about classroom practice.

It's not a sinister conspiracy. No one is trying to do anything bad. 

We're all behaving rationally and ethically:

—We've been told we need data about teacher practice
—We have a limited number of chances to collect that data from classroom visits
—Teachers know they'll be judged by the data we collect

So they show us what we want to see…

…even if it results in absurd practices like lecturing from the back of the room. 

So here's my suggestion: let's stop collecting data from classroom visits.

We already get plenty of quantitative data from assessments, surveys, and other administrative sources. 

We already have enough hats to wear as instructional leaders. We don't need to be clipboard-toting researchers on top of everything else. 

Instead, let's focus on understanding what's happening in classrooms. 

Let's gather evidence in the form of rich, descriptive notes, not oversimplified marks on a form.

Let's talk with teachers about what they're doing, and why, and how it's working. 

Let's stop trying to reduce it all to a score or a check mark. 


{"email":"Email address invalid","url":"Website address invalid","required":"Required field missing"}