Wednesday, January 15, 2014

Why I'm off the Data Bandwagon

I'm a math guy.  Numbers appeal to me, and I tend to see the world in mathematical terms.

It's a blessing and a curse.  There are lots of problems I face in my professional and personal lives in which thinking mathematically allows me to see things from a point of view that makes analysis, problem solving, and innovation easier.  I naturally see the connections between the beauty of art, music, architecture, science, literature, and math.

But, very few other people I know yell at their kids when the windshield wipers are on in the car because trying to calculate the number of wipes per minute for no reason while the kids are talking normally can be incredibly distracting.  And, I learned a long time ago that discussing math at a party is more dangerous to one's social standing than discussing religion or politics.  Partygoers rarely care that my favorite number is Phi.  It's as if me going on about it forever makes them think I'm irrational or something.

So, you can see why I was naturally attracted to the obsession with data that we've had in education for the past decade and a half.  Playing with numbers was fun.  Plus, there had to be a way to organize and analyze standardized test data into something meaningful that was good for students.

For years I was a data fanboy.   I've left that bandwagon behind, though.

What I've come to realize is that all the analysis, organization, debate, and discussion of data is meaningless if you are looking at the wrong data for what you are trying to accomplish.  Police departments don't check to see how many library books were checked out each month to reduce speeding on Main Street.

Here are a few of the realizations I grew into over time:

  • No matter what the standardized test data says, the best remedy for any shortcomings will always be better teaching and/or helping kids with rough home situations get their basic needs met.  Always.  As a teacher, there's a whole lot I can do about making myself a better teacher.  Unfortunately, despite my best efforts, there's often little I can do about the latter.
  • Looking at individual student scores on standardized tests is pretty much worthless.  If I'm Johnny's teacher and I don't know his weaknesses long before the state assessments, I'm not doing my job effectively.  Daily, in-lesson informal formative assessments should be giving me that information on a regular basis for every student so that I can meet each one's needs.  If I do know Johnny's weaknesses and strengths before the state tests, the assessment data won't tell me anything I don't already know.
  • If teachers in a district/school aren't regularly using the data from daily in-lesson informal formative assessments to make course corrections to their teaching and their students' learning, that's where a district should be focusing its resources - teachers should be discussing best practices, how to replicate amazing lessons, analysis of awful lessons, etc.  And they'll know which lessons are amazing and which ones bomb by whether or not kids learned, as evidenced by the daily formative assessments they are giving.  Discussing test data until the cows come home isn't going to help those teachers learn better pedagogy.
  • Standardized test data does have a use: it's good for seeing curriculum gaps, large trends, and other more global issues in a school or district.  That being said, focusing on standardized test data usually comes at the expense of focusing on formative assessment and pedagogy.  And the latter are the whole ball game when it comes to student learning.
  • Value Added Models used to measure student growth are junk science.  I've never had one person be able to explain to me in any kind of clear terms what the formula is for figuring out such models.  More importantly, anyone who's ever done a science experiment knows that your data isn't valid unless you can isolate a variable.  It is impossible to isolate a variable using these Value Added Models.  You can't isolate a teacher's effectiveness when you can't account for home situation, hunger, hormones, apathy, drug addiction, abuse, etc.  So, basing decisions on this data is absurd.  Value Added Models are an attempt to quantify the unquantifiable.
  • You value what you measure.  State tests do not and cannot measure the things our students most need to learn in school:  critical thinking, innovation, empathy, adaptability, and learning to love learning.  Focusing on standardized test data moves our values away from that which is most important for our students.
I guess I'm not truly off the data bandwagon.  I'm just off the test data bandwagon.  I still believe that data is incredibly valuable and should guide our decisions.  I just believe that the data we need to be looking at most is in-lesson formative assessment data that allows us to help each student grow in the best possible way.  Using that data to guide our decisions maximizes student learning.  

I know that fact is inconvenient for those still on the big data bandwagon.  You can't put every teacher's formative assessment data in a spreadsheet for all to see and discuss the way you can with state test scores.  And, trusting teachers to do their job seems like an unpopular position in today's educational climate.

Growth is hard, and change is messy.  If we really care about students learning more and being prepared for their futures, we'll start shifting our focus toward the data that really matters. 

Saturday, January 11, 2014

Testing: The Enemy of Life-Long Learning

This morning, as I was watching my wife make pancakes for breakfast, I started thinking about the absurdity of testing.  Lori is an incredible cook, but has never once taken or passed a cooking test.  She just learned because she enjoyed it.  She tries new recipes, and we give her feedback on those recipes, which she then uses to learn more.

This got me thinking about how harmful a testing culture is for our students.  I'm not just talking about standardized high-stakes tests, but all tests that are designed to measure learning out of the context for which that learning needs to be used.

People always say, "We need tests because life is full of tests."  That's nonsense.  Life isn't full of tests.  It's full of assessments.  As an adult, I can count the number of tests I've had to take since college on one hand.  As adults, we do stuff and either succeed or learn to do something differently the next time.  If I make pancakes for my kids, I don't need to pass a test first to do it.  I just make them.  And if they are awful, I either fix the recipe or get asked to make omelets next time.

That's life.  Trial and error.  

The vast majority of adults don't take tests, but we are constantly tested (assessed would be a better word), and we get lots of feedback. When we prepare kids for a world of taking tests, we don't prepare them for the real world, which will require them to process the feedback they constantly get from those assessments. Testing forces them to see things as black and white, success or failure. It teaches them that either they know something, or they don't. It doesn't teach them to learn what they don't know.

Testing teaches our students not to be the life-long learners that we so often preach about in schools, but upon which we so rarely focus our efforts.  

If we really care about our students being life-long learners, we need to start assessing them in a way that encourages them to learn from their mistakes.  We need to move away from a culture of testing and towards a culture of meaningful, relevant assessment that mirrors what students will see when they leave our schools.