Data: It Keeps Teachers Busy
Take your pick. But these cats at my school really have to be stopped.
As you may suspect, we here at my school are “data-driven”. That’s right. There is no substitute for data. And the best thing about it, from an administrator’s point of view, must be that you don’t have to worry about how long it takes teachers to collect the data, or whether it is of any value in the first place. Just collect that data, tell everybody you are collecting it, and say you are using it to make data-driven decisions… for the kids. The rest, my friend, will fall into place. No worries.
Here is our scenario:
At the beginning of every course we give a “diagnostic” exam covering the content of the course. Thirty or so multiple-choice questions. Each question is matched to a standard; there may be more than one question per standard. After the exams are graded, each student’s answer to each question is entered into the “standards mastery tracker” spreadsheet. This is a “tool”, if you know what I mean. (Can you just imagine how excited they were when they found this? It must have been something to see.)
Over the course of the course we are to “track” each student’s mastery of each standard and create reteaching “action plans” and all manner of whatnot, driven by the data, to ensure student mastery yadda yadda yadda…
I will comment on the general stupidity of this in a moment. But first I want to mention this semester’s addition, which is sure to close the Achievement Gap very soon: we may no longer simply code answers as, say, 1 = correct, 0 = incorrect. We must now also record which of the three possible wrong answers each child chose for each question.
Now the general critique.
1) When I test students on the content of a course they have not yet taken, and then I find that they score poorly, I am not sure what I have learned. For example, the last time I did this, the overall rate of correct answers was 30.2 percent. On a multiple-choice exam with four choices, that is barely above the 25 percent you would expect if people were just guessing. So I have learned that the students do not yet know the content of the course they have not yet taken. Is that about it? And all I had to do was enter 2,500 data points.
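For the skeptics who like their data driven both ways, here is a back-of-the-envelope check on that claim. The student count is my assumption (roughly 83 students times 30 questions gets you near the 2,500 data points mentioned above); the 30.2 percent figure is from the exam.

```python
# Sanity check: how does 30.2% correct compare with pure guessing on a
# four-choice exam? Figures are assumed: ~83 students x 30 questions.
import math

n_students = 83           # assumed; 83 * 30 = 2490, close to the 2,500 entries
n_questions = 30
p_guess = 1 / 4           # four choices per question

n_responses = n_students * n_questions
expected_correct = n_responses * p_guess            # chance baseline
observed_correct = round(n_responses * 0.302)       # the reported 30.2%

# Spread of the correct count if everyone guessed at random (binomial sd)
sd = math.sqrt(n_responses * p_guess * (1 - p_guess))

print(f"chance baseline: {expected_correct:.0f} of {n_responses} responses")
print(f"observed:        {observed_correct} (sd under guessing ~ {sd:.1f})")
```

Either way you slice it, the class as a whole sits a whisker above chance, which is exactly what you would predict before teaching the course.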
2) Then I was required to create an “action plan” based on this data. Seriously. OK…you dummy, my action plan is to now teach the course. Jeezus. Where did you come from?
3) Why does it matter how every student does on every standard? Isn’t that what quizzes, midterms, projects and finals are for? Do any of us really need to be absolutely sure that every student understands federalism or foreshadowing or folic acid? The point of looking at all of these assessments as a group is that at the end of the class we can look at a student’s work and see if he or she basically got it or not, how well, etc. If there is a systemic problem, like 80 percent of the class thinks folic acid comes from farts and methane is needed to produce red blood cells… THEN we need an action plan. Not before.
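To be fair, there is exactly one analysis that might justify recording which wrong answer each kid picked: tallying the distractors to spot a shared misconception like the one above. Here is a minimal sketch; the answer list and the 50 percent flag threshold are made up for illustration.

```python
# Tally wrong-answer choices on a single question to spot a systemic
# misconception (most of the class converging on one distractor).
from collections import Counter

correct = "B"
# Hypothetical: one question, ten students' chosen answers
answers = ["C", "C", "B", "C", "A", "C", "C", "D", "C", "C"]

counts = Counter(answers)
wrong = {choice: n for choice, n in counts.items() if choice != correct}

# Flag any distractor chosen by at least half the class (threshold assumed)
for choice, n in wrong.items():
    if n / len(answers) >= 0.5:
        print(f"Systemic: {n}/{len(answers)} chose {choice}. Reteach this one.")
```

Note that this takes about five lines and one question, not a semester of hand-entering every answer from every student into a “tracker.”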
4) These questions from which we glean our precious data may or may not be good questions. They are just questions pulled from websites, written on the train on the way to work or after a couple of glasses of wine at dinner. There is no quality control or field-testing of these questions that would give us any confidence that answering them correctly constitutes “mastery” of anything.
I have worked for three years at a university survey research center and I bet I have taken more college and graduate courses in statistics and research methodology than all of my administrators combined. What we do to and in the name of data here might be legal, but I’m sure it’s a sin.