Testing + PD =
Nuclear Powered Bullshit
Of all the PDs in all the towns in all the world, the ones purporting to be about “data” are easily the most absurd. More from Answer Key:
So if meetings are mostly bullshit, and PDs are bullshit, and, as I've written before here, and here, and here, and here (among other places), data--specifically data from "Big Standardized Tests" (h/t Peter Greene, Curmudgucation)--is bullshit, then…
PDs about data are nuclear powered bullshit.
These are the toughest to take without going completely insane. Besides all the philosophical reasons for opposing the toxic testing-industrial complex, and all the moral reasons for fighting against the reduction of human beings to data points, the numbers are phony. It’s all a fraud. The data do not mean what they say the data mean.
Now if you are tasked with administering these tests and you happen to be arithmophobic (an arithmophobiac?), you may not notice, or may not want to, but the tests do not measure what they purport to measure. They are not reliable. They are not valid. First of all, if you’ve ever given a ten-question multiple-choice quiz to your students, you know that even if you test exactly the same material again a week later, the same student will sometimes score higher and sometimes lower.
Performance on any task, an examination included, depends on any number of factors, blood sugar level and the quantity and intensity of available distractions among them. That is why no teacher worth the name would ever rely on a single assessment, given a single time, to evaluate a student in any significant way.
We give tests, but we also give projects and writing assignments, short and long, formal and informal. We assign debates, we listen to conversations and questions, and we try to get a comprehensive, meaningful picture of what each student knows and can do, along with some idea of what they need--particularly from us--in order to know and do more. In other words, it's the opposite of the testing-industrial imperative. And it's not likely to be featured in a PD.
But it’s even worse than that. In addition to not measuring what a student has learned, the tests don’t deliver even within their own narrow objectives. They do not measure what they say they do.
I’ll give you an example: My school hired a group of education consultants to boost our test scores. (Hilariously, we were not allowed to acknowledge that’s what we were doing.) We had days and days and days of meetings with these “experts”--only the math and English departments of course, because (sotto voce) That’s what we test. Meanwhile, everyone else could work on what they needed to do to prepare for actual teaching.
And when we were through with these literal weeks of PDs, we had created five exams to give in each class, one every five weeks, each test assessing five distinct standards representing five discrete skills. That’s five, then five weeks later a different five, then a different five, until we get to twenty-five. Why twenty-five? Because those were the standards most frequently on the test.
And after each test we would look at how dismal our scores were and compare them to the previous test to see if they went up or down. In a PD about data we would assemble in the elementary cafeteria and show each other our dismal scores (our scores? our students’ scores?) and make up a story about why our scores might have gone down or up, and we’d make posters and cut out student names and put up the posters around the cafeteria and go one-by-one around the room and tell our stories and stand for our sins. That is, those of us in the math and English departments. All others adjourn to your rooms to work on teaching stuff.
The tests were all different and "tested" different standards and skills. We were told we could not test the same things twice even if just to see if we had made progress. We compared scores from different kinds of tests and had to pretend that the improvement or decline from one test to the next meant something. “Wow, those kids really got it this time.” “Oh, (downward inflection) that’s disappointing. What did you do differently?” “I taught different stuff!” And we did this for years.
I tried several times during the early days of the catastrophe to point out that we were not actually measuring progress or the lack of it. “It’s apples to orangutans!” “Five answers to five questions is not a valid measure of proficiency!” “Who chose four correct answers as the benchmark for proficient? And why?” “Skills? Standards? I looked at the data and my student scores tracked their reading levels. Aren’t we really just assessing their reading?” I was... unheard.
And there’s more. We were required to give the exam--in English--to every single student. Boost scores? Shhh! We were required to prep for and administer the exams to seniors who would never take another test in high school. Our ELD teacher was forced to give the exam--in English--to newcomers to the U.S. who had been in the country for a week and spoke zero English. For these students and others, and for the teachers in their classrooms, the tests were a deliberate insult.
So why doesn’t someone put a stop to it? Because testing-for-ratings-for-rewards-and-punishments is big business, that’s why. Hundreds of thousands--maybe bazillions--of people are deeply invested in its preservation.
But doesn’t your administration understand how destructive it is? Why spend so much time prepping for, administering, and "analyzing" tests whose data are so meaningless?
“It is difficult to get a man [sic] to understand something, when his [sic] salary depends upon his [sic] not understanding it.”* Words by which to solve the puzzle and unriddle the world.
So what do teachers do? Do what you always do: use your experience, expertise, and best judgment to assess and support your kids. Be honest: Why does this matter to us? It doesn’t, but do your best anyway.
Students, not skills. Students, not curriculum. Students, never ever Big Standardized Test scores. Start with the kids and what you want for them. Work backward from there to the curriculum. Figure out the standards afterward, and leave the "data" for the bosses to figure out.
Whatever you do, do not take them seriously--not the scores, not the administrators, and certainly not the PDs about data. Give the tests. Look busy at the bullshit PDs. Keep your job. Do not internalize their madness.
*I prefer accurately quoting a source even if they are trapped in the sexist language of their day, but I didn’t want you to think I hadn’t noticed. Feel free to translate to the 21st century: It’s hard to get someone to understand something if their stock options depend on their not understanding it.