Sunday, February 24, 2013

Teacher Quality and Student Evaluations

The question of how to improve our schools has been hung up for a decade now on the question of teacher "quality" and how to measure it. Poring over the mass of data from standardized tests, education experts have noticed a huge difference from teacher to teacher in how much they raise the test scores of their students. One group of experts says that this is the right measure of teacher quality, but this is resisted by others, especially teachers, who hate the thought of evaluating teaching by that one metric. Now testing guru Thomas Kane is back with a new study of 3,000 elementary school teachers that looked at test scores, student evaluations of the teachers, and how teaching experts judged videos of these teachers in the classroom. The study showed that all of the measures pointed in the same direction: the teachers rated highly by their students, and whose videos were scored highest, were also the ones who raised test scores the most.

I find it especially interesting that the most effective teachers were also rated as more "enjoyable" by their students. Observing my own children, I have noticed a strong correlation between academic problems and whether they are enjoying school; real educational crises only appear when the child is miserable.

The importance of observation is also worth noting. In my teaching career I have only been observed once, and I found it very helpful; I would very much like to be observed again, but universities don't seem to make much use of it.

Interesting interview with Kane about the study; the study itself is here.

1 comment:

pootrsox said...

I would hope that what the videos revealed was that teachers who get the kids actively involved in learning, making their own meanings, etc., were the ones whose students not only enjoyed the teachers but did well on tests.

I agree that observations are essential.

I spent well over a decade as a trainer of mentors for novice educators, a mentor myself, and also as an evaluator for the state of CT of novice teacher performances (which required documenting a unit of instruction day by day with teacher materials, student work, assessment materials, and several videos of lessons, along with teacher reflection on the effect of instruction on student learning).

What I learned, watching videos of exceptional second year teachers and (to be candid) abysmal second year teachers, was that only when the students are active participants in the teaching/learning process-- and only when that process requires higher order thinking, not just concrete level Q&A-- will student work show growth and progress.

So often, the teacher's written documentation would describe lessons as higher-order and students as deeply engaged but the video would show bored students while the teacher did drills or lectured or focused only on plot-level discussion: "what happened next?"

We did not see standardized test scores on the students being documented; we did see their final assessments, however: the teacher "test" and the student product in response.

While I am somewhat suspicious of someone who's a test advocate producing this study rather than someone w/o an agenda, I do find the results interesting.

I will need to read the actual study to see if he actually randomized his sample. Did he account for differing socio-economic statuses? For differing student skill levels? For learning disabilities? For teacher experience?

I am certain I could have hand-picked colleagues with whom I taught to come up with precisely the same results :)