Wednesday, January 15, 2014

Why I'm off the Data Bandwagon

I'm a math guy.  Numbers appeal to me, and I tend to see the world in mathematical terms.

It's a blessing and a curse.  There are lots of problems I face in my professional and personal life where thinking mathematically lets me see things from a point of view that makes analysis, problem solving, and innovation easier.  I naturally see the connections between the beauty of art, music, architecture, science, literature, and math.

But very few other people I know yell at their kids in the car when the windshield wipers are on, because trying to calculate the number of wipes per minute for no reason while the kids are talking normally can be incredibly distracting.  And I learned a long time ago that discussing math at a party is more dangerous to one's social standing than discussing religion or politics.  Partygoers rarely care that my favorite number is Phi.  It's as if my going on about it forever makes them think I'm irrational or something.

So, you can see why I was naturally attracted to the obsession with data that we've had in education for the past decade and a half.  Playing with numbers was fun.  Plus, there had to be a way to organize and analyze standardized test data into something meaningful that was good for students.

For years I was a data fanboy.   I've left that bandwagon behind, though.

What I've come to realize is that all the analysis, organization, debate, and discussion of data is meaningless if you are looking at the wrong data for what you are trying to accomplish.  Police departments don't check to see how many library books were checked out each month to reduce speeding on Main Street.

Here are a few of the realizations I grew into over time:

  • No matter what the standardized test data says, the best remedy for any shortcomings will always be better teaching and/or helping kids with rough home situations get their basic needs met.  Always.  As a teacher, there's a whole lot I can do about making myself a better teacher.  Unfortunately, despite my best efforts, there's often little I can do about a student's home situation.
  • Looking at individual student scores on standardized tests is pretty much worthless.  If I'm Johnny's teacher and I don't know his weaknesses long before the state assessments, I'm not doing my job effectively.  Daily, in-lesson informal formative assessments should be giving me that information for every student so that I can meet each one's needs.  If I do know Johnny's weaknesses and strengths before the state tests, the assessment data won't tell me anything I don't already know.
  • If teachers in a district/school aren't regularly using the data from daily in-lesson informal formative assessments to make course corrections to their teaching and their students' learning, that's where a district should be focusing its resources: teachers should be discussing best practices, how to replicate amazing lessons, analysis of awful lessons, etc.  And they'll know which lessons are amazing and which ones bomb by whether or not kids learned, as evidenced by the daily formative assessments they are giving.  Discussing test data until the cows come home isn't going to help those teachers learn better pedagogy.
  • Standardized test data does have a use: it's good for spotting curriculum gaps, large trends, and other more global issues in a school or district.  That said, focusing on standardized test data usually comes at the expense of focusing on formative assessment and pedagogy.  And the latter are the whole ball game when it comes to student learning.
  • Value Added Models used to measure student growth are junk science.  I've never met anyone who could explain to me in clear terms the formula behind such models (my own rough sketch of the generic form appears after this list).  More importantly, anyone who's ever done a science experiment knows that your data isn't valid unless you can isolate a variable, and it is impossible to isolate a variable with these models.  You can't isolate a teacher's effectiveness when you can't account for home situation, hunger, hormones, apathy, drug addiction, abuse, etc.  So, basing decisions on this data is absurd.  Value added models are an attempt to quantify the unquantifiable.
  • You value what you measure.  State tests do not and cannot measure the things our students most need to learn in school:  critical thinking, innovation, empathy, adaptability, and learning to love learning.  Focusing on standardized test data moves our values away from that which is most important for our students.
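
For the curious, the generic shape these models take (as I understand it; this is my own rough sketch, not any state's actual formula) is a regression along the lines of

    \text{score}_{ij} = \beta_0 + \beta_1\,\text{score}_{i,\text{prior}} + \theta_j + \varepsilon_{ij}

where \theta_j is the estimated "teacher effect" for teacher j and \varepsilon_{ij} is an error term that quietly absorbs home situation, hunger, hormones, apathy, and everything else nobody measured.  The teacher effect is only as trustworthy as the assumption that everything in that error term averages out the same way in every classroom, and that's exactly the assumption no one can verify.
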
I guess I'm not truly off the data bandwagon.  I'm just off the test data bandwagon.  I still believe that data is incredibly valuable and should guide our decisions.  I just believe that the data we need to be looking at most is in-lesson formative assessment data that allows us to help each student grow in the best possible way.  Using that data to guide our decisions maximizes student learning.  

I know that fact is inconvenient for those still on the big data bandwagon.  You can't put every teacher's formative assessment data in a spreadsheet for all to see and discuss the way you can with state test scores.  And, trusting teachers to do their job seems like an unpopular position in today's educational climate.

Growth is hard, and change is messy.  If we really care about students learning more and being prepared for their futures, we'll start shifting our focus toward the data that really matters.