Ranking Teachers: NC Bets Big On A Complicated Stats Model
Next month, a million or so North Carolina public-school students between third and twelfth grade will start taking tests. Lots of them. Reading and math tests for the younger kids; biology, algebra, and English for the older kids.
Their scores will be tabulated and run through some servers at SAS Institute, a private company in Cary. There, software called EVAAS will compare the test score the student earned to one a statistical model predicted the student should get.
From there, the difference between the two – called a “value-added” score – will be matched back to the student’s teachers. If the actual score is higher than the predicted score, that number will be positive and the teacher will be deemed to have added value; if it’s lower, then that number will flicker like a scarlet letter in the teacher’s digital file.
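In its simplest terms, the comparison described above is a subtraction: the actual score minus the model’s prediction. The sketch below is purely illustrative – the real EVAAS model is proprietary, uses years of data, and is far more complex than this:

```python
# Illustrative sketch only: the actual EVAAS model is proprietary and
# statistically sophisticated. This shows only the basic idea of a
# "value-added" number: actual score minus predicted score.

def value_added(actual_score: float, predicted_score: float) -> float:
    """Positive means the student outperformed the model's prediction;
    negative means the student fell short of it."""
    return actual_score - predicted_score

# Hypothetical numbers: a student predicted to score 450 who scores 462
print(value_added(462, 450))  # prints 12 -> teacher deemed to have "added value"
print(value_added(440, 450))  # prints -10 -> counts against the teacher
```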
It’s all part of a growing movement to use data to evaluate teachers.
“You may be the best person in the world, but if your students aren’t making gains for the time that you have them, then we have a serious problem,” says State Senator Jerry Tillman, the Republican Chair of the Joint Legislative Oversight Committee on Education.
The value-added score started showing up a year ago, in a section of the teacher evaluation instrument called “Standard Six.” It was implemented as part of the $400 million federal Race to the Top grant awarded to North Carolina.
Now, Republicans in the Legislature are considering numerous plans to use the evaluations to create merit-based pay plans for teachers.
“And I’m saying teachers you ought to embrace that,” says Tillman. “With the end of tenure and the advent of contracts, you’re going to see lots of chances to make very good money when you can perform and your kids can perform at a high level.”
Most teachers are not embracing it.
“I do have some mixed feelings about it,” says Karyn Dickerson, last year’s North Carolina Teacher of the Year. “Just seeing how that EVAAS data can go up and down depending on the students you get each year, based on the assessments.”
Dickerson teaches English at Grimsley High School in Greensboro. She has both high-achieving students in advanced classes, and some at-risk students in basic courses. Those students are going to have very different test scores, of course, but more importantly for value-added, they have very different potential for growth.
EVAAS is based on that student growth, not the test score itself. And the software is complicated – and, some say, largely secret. Teachers, principals, even administrators at the state level don’t know everything that goes into the model.
“Now the statisticians, and I’m not a statistician - I’m not the smartest guy in the world - they would say that stuff should even out, and I think they are correct, I’m sure it does even out, when you look at statewide data,” says Jim Key, an assistant superintendent in Durham. “But within a particular classroom? You could have more than a normal share of students who are going through some challenges with their personal lives.”
More about the software
North Carolina currently pays SAS about $3 million per year for EVAAS. It uses five years of student test data, but does not take into account the socio-economic status of a student, a factor often linked to student performance.
“The software is proprietary to SAS,” says William Sanders, the University of Tennessee professor who created EVAAS. “That doesn’t mean the methodology has not been reviewed. That also doesn’t mean the validation of the accuracy of the results coming out of our software hasn’t been checked.”
SAS also sells EVAAS to a number of other states, including Ohio and Pennsylvania. Other value-added models are being used in dozens of other places, with bipartisan support.
President Obama is a fan of value-added, but some of the President’s education advisers have jumped ship.
“The National Research Council has recently come out to say that value-added should not be used as a tool to make decisions about teachers because it’s very unstable, it’s unreliable, turns out that it’s biased,” says Linda Darling-Hammond, one of President Obama’s education advisers during his first campaign and a professor of education at Stanford.
One reason educators say “value-added” and Standard Six of the teacher evaluation can be unreliable is that it’s entirely based on children – some as young as third grade – taking long, boring tests.
“I’m a big believer that how students perform on an individual day is really outside of your control,” says Jim Argent, the principal of Lake Myra Elementary School in Wake County. “How you can prepare them, how you can have sound practices, how you can do everything that is in Standards One through Five. More times than not, if you do Standards One through Five well, you are going to see good results in Standard Six.”
Standard Six and “value-added” scores, of course, apply directly only to teachers whose students are tested. That eliminates tens of thousands of social studies, art, music, and kindergarten through second grade teachers. Currently, those teachers’ Standard Six scores are based on school-wide data.
Some teachers in these grades and subjects say a data-driven standard devalues their efforts and creates a two-tier evaluation system.
“I have evaluated teachers for 40-some years,” says Tillman, a former educator and superintendent. “And chorus, band, art, and music – why not say this is a subjective thing with the principal? You don’t have test scores, but how well do you perform in your choral groups? Are they welcome, do the nursing homes want to hear them? Do your art students have their art on display at the banks?”
Tillman is one of the driving forces in education reform in Raleigh. And he’s got a message for teachers who wonder what the future holds: “I do think that paying the best teachers and the worst teachers all the same is a thing that will phase out. I do think that day is coming to an end and it’s probably not very far away. Maybe a year.”
These reports are part of American Graduate: Let’s Make It Happen!, a public media initiative to address the dropout crisis, supported by the Corporation for Public Broadcasting.