Alderton starts by citing data about teachers considering leaving the profession, says that Covid is certainly partly to blame, and then pivots to this:
Okay. First, Jake Bryant is a former teacher like I am a former athlete. I pitched for the playground softball team when I was 16. Bryant taught at a KIPP charter for one year after graduating from Harvard with a degree in social studies and teaching in Yokohama. Then he went into the consultant biz; I find no proof that he was a Teach for America product, but his career follows the same trajectory as TFA insta-experts in education. Bryant moved on to the Gates Foundation, then landed at McKinsey and Company, where he leads "research focused on improving educational outcomes." Aka raising test scores.
Bryant's not wrong when he notes that teaching can involve some annoying clerical work, but this piece will go south rapidly. He cites some McKinsey research claiming that teachers spend 40% of their time on "activities that could be automated," a "report" from January of 2020 (aka The Time Before This Damn Pandemic) that features the usual McKinsey angle, which is that we really ought to be able to cut teaching positions and replace them with lower-skilled humans and computers. The areas in which technology can "reallocate" teacher time are preparation, evaluation and feedback, administration, student instruction, and--bizarrely--professional development. The whole "research report" is aimed at promoting personalized [sic] learning, aka computer-directed education. The report actually says "20 to 40" percent of teacher hours could be automated, but Bryant (who co-wrote the report he's referring to) chooses now to go with the 40% figure, which makes sense, because the pandemic has simply accelerated the goals that McKinsey had back When We Were All Maskless.
As always, when dealing with technology "research," it's important to understand that these are not scientific attempts to predict the future; they're marketing attempts to shape it. So when Alderton drops in phrases such as "experts like Bryant," he's just helping power the smoke machine.
So how does Alderton think robots and software are going to "help" teachers?
Streamlining administrative tasks
We turn now to Eric Wang, a senior director at Turnitin. He's here to beat the drum for Gradescope, yet another AI product that claims it can provide assessment and feedback for student papers. No, no, and also, no. We've been over the problems with this many times in the past, but for the moment I'll offer just this objection--what does it do to student engagement to be told, "I'm not actually going to look at this--just run it past the gradebot"? Does anybody imagine that wealthy and well-connected parents will not demand that teachers had damned well better actually look at their child's work?
Say it with me: computer software cannot assess student writing. See here, and here.
Also, the article brings up Ashok Goel's creation of virtual teaching assistant Jill Watson to handle "basic" questions (like the kind you could have answered yourself if you logged on and read the website, but okay).
The Power of Personalization
McKinsey's favorite product--computer-directed education. The big win is supposed to be that the computer can "personalize" the "instruction" by using "adaptive learning." He offers Thinkster and Knewton; Knewton once predicted that it would be able to tell you what to eat for breakfast to get a good math score and would "solve the global education crisis," but instead was broken up and sold for parts two years ago, having not actually solved the global education crisis. This piece of Knewton is owned by Wiley, repped here by Matthew Leavy, who used to work for Pearson. Thinkster Math founder/CEO Raj Valli offers "We've married man with machine." Here's his metaphor:
If you tell me to jump in the pool and swim back and forth, I’m never going to be a good swimmer. But if you jump in the pool with me and point out that I’m not kicking my right leg or using my left arm, then you can make me better. That’s the kind of observations our tutors are able to make using our technology.
These are not the only two possible approaches for coaching a swimmer, coaches do not actually use the second one, and none of this is what he's really proposing, which is to throw the computer in the pool with the swimmer and have it report back to the coach sitting in an office somewhere.
You do not make education more personal by taking the persons out of it.
Finally, we get Microsoft's new "tool" for assessing reading fluency. Just have the student read into the camera, and the bot will tell the teacher how well the student reads. Anthony Salcito, the Microsoft VP pitching this, is correct in pointing out that doing this kind of assessment can suck up huge amounts of teacher time. That is an excellent argument for smaller classes; it is not an argument for getting young readers to perform for a computer.
In the future, says Alderton, AI "might optimize not only individual curriculums [sic], but also entire classrooms." And Goel offers this scary picture of the future: "AI could be used for “matchmaking” — pairing students with the teachers and schools that are best suited to them based on their learning style." Whatever "learning style" means, exactly.
And from McGraw-Hill, Sean Ryan makes a plug for student grouping based on mastery learning, along with McGraw-Hill's own adaptive personalized [sic] learning software to "create personalized [sic] learning paths for students in kindergarten through college." In one of the great understatements in ed tech marketing, Ryan notes that "That can be hard to embrace because of social components." But with "more education taking place in hybrid and online environments"--in other words, in systems that have already stripped education of social components--why not put an eighth grader in pre-calc if they're ready, says Ryan, as if no schools already do that.
Writes Alderton, "It’s the beginning of a new era wherein learning is a journey instead of a destination. That makes teachers navigators — which is precisely what most of them want to be." Are there teachers who don't already know about the whole journey thing? (How many years have we been talking about life-long learners?)
And we end with this:
“Teachers become teachers to help children maximize their potential,” Ryan concludes. “By allowing them to focus more on the social components of learning, technology helps them have the kind of impact they got into the profession to have.”
This seems to play off an assumption embedded in the McKinsey report cited back at the top--that teachers are only really working when they are in front of students. The teachers I know are at least as interested in the academic impact as the "social components," though I can't be 100% certain I know what Alderton means by that. I also know that doing the assessments, the feedback, the breakdown of actual student performance--and not getting a second-hand report on those things--is part of how a teacher gets to better know and serve students.
Alderton could have better served his audience by talking to actual teachers, or to any of the many critics of all these education-flavored money-gathering programs, instead of serving as an amplifier for the ed tech biz. Or perhaps he could have consulted the folks who would explain that all of these time-consuming elements are part of why teachers and parents want smaller class sizes and less time-wasting junk: the Big Standardized Test, endless reportage to prove they're doing the job, hours lost trying to log small humans into inadequate websites.