Monday, June 17, 2013

Value-added fever in Ohio


Ohio is enthusiastically pushing value-added measures on a website called "State Impact."  The most recent post is "Grading the Teachers: 'Most Effective' Teachers Say High Scores Happen By Focusing on the Kids" (featured on ASCD Smart (sic) Brief): http://stateimpact.npr.org/ohio/2013/06/16/grading-the-teachers-most-effective-value-added-teachers-say-high-scores-happen-by-focusing-on-the-kids/. This is only the most recent in a series of posts praising value-added.

I posted this comment:

Not mentioned is the fact that a number of studies have shown that rating teachers using test-score gains does not give consistent results. Different tests produce different ratings, and the same teacher's ratings can vary from year to year, sometimes quite a bit.

Sources:
Different tests produce different ratings: Papay, J. 2010. Different tests, different answers: The stability of teacher value-added estimates across outcome measures. American Educational Research Journal 47(2).

Vary from year to year: Sass, T. 2008. The stability of value-added measures of teacher quality and implications for teacher compensation policy. Washington, DC: CALDER (National Center for Analysis of Longitudinal Data in Educational Research).
Kane, T., and Staiger, D. 2009. Estimating teacher impacts on student achievement: An experimental evaluation. NBER Working Paper No. 14607. http://www.nber.org/papers/w14....

