Real Stupid Artificial Intelligence (Personalized Learning's Missing Link)

Good lord in heaven.



Intel would like a piece of the hot new world of Personalized [sic] Learning, and they think they have an awesome AI to help. And they have concocted a deliberately misleading video to promote it.

In the video, we see a live human teacher in a classroom full of live humans, all of whom are being monitored by some machine algorithms "that detect student emotions and behaviors" and they do it in real time. Now teachers may reply, "Well, yes, I've been doing that for years, using a technique called Using My Eyeballs, My Ears, and My Brain." But apparently teachers should not waste time looking at students when they can instead monitor a screen. And then intervene in "real time," because of course most teachers take hours to figure out that Chris looked confused by the classwork and a few days to respond to that confusion.

Oh, the stupid. It hurts.

First, of course, the machine algorithm (copywriters will be damned if they're going to write anything like "students will be monitored by computers") cannot detect student emotions. They absolutely cannot. They are programmed to use certain observable behaviors as proxies for emotions and engagement. How will Intel measure such things? We'll get there in a second. But we've already seen one version of this sort of mind-reading from NWEA, the MAP test folks, who now claim they can measure engagement simply by measuring how long it takes a student to answer a question on their tests. Because computers are magical!

Turn it around this way-- if you had actually figured out the secret of reading minds and measuring emotions just by looking at people, would your first step be to get in on the educational software biz?

In fact, Intel's algorithm looks suspiciously unimpressive. They're going to measure engagement with three primary inputs-- appearance, interaction, and time to action. A camera will monitor "facial landmarks," shoulders, posture. "Interaction" actually refers to how the student interacts with input devices. And time to action is the same measurement that NWEA is using-- how long do they wait to type. Amazing. And please notice-- this means hours and hours of facial recognition monitoring and recording.
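Stripped of the marketing language, what's described here reduces to collapsing three crude behavioral proxies into one number. Here's a minimal sketch of what that kind of scoring looks like-- every name, threshold, and weight below is hypothetical (Intel has published no such code), but it illustrates how little "emotion detection" is actually involved:

```python
# Toy sketch of a proxy-based "engagement" score. All weights and
# thresholds are invented for illustration -- this is NOT Intel's model,
# just the general shape of combining three behavioral proxies.

def engagement_score(posture_upright: float,
                     input_events_per_min: float,
                     seconds_to_first_action: float) -> float:
    """Collapse three proxies into a single 0-1 'engagement' number.

    posture_upright: 0-1 value from pose estimation ("appearance").
    input_events_per_min: keystrokes/clicks ("interaction").
    seconds_to_first_action: latency before responding ("time to action").
    """
    # Normalize interaction: assume 30 events/min counts as fully engaged.
    interaction = min(input_events_per_min / 30.0, 1.0)
    # Normalize latency: under 5 seconds counts as fully engaged,
    # over 60 seconds counts as zero.
    if seconds_to_first_action <= 5:
        promptness = 1.0
    else:
        promptness = max(0.0, 1 - (seconds_to_first_action - 5) / 55.0)
    # Arbitrary weights -- and that's the point: nothing here measures
    # an emotion, only behaviors somebody decided to treat as proxies.
    return 0.3 * posture_upright + 0.4 * interaction + 0.3 * promptness
```

Note what such a function would actually say about a student who sits slumped and still because she's thinking hard: low score, flagged as disengaged.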

Intel is ready to back all this up with some impressive argle-bargle:

Computers in the classroom have traditionally been seen as merely tools that process inputs provided by the users. A convergence of overlapping technology is making new usages possible. Intel and partners are enabling artificial intelligence at the edge, using the computing power of Intel CPUs to support artificial intelligence innovations with deep learning capabilities that can now know users at a higher level – not merely interpreting user commands but also understanding user behaviors and emotions. The new vision is a classroom PC that collects multiple points of input and output, providing analytics in real-time that lets teachers understand student engagement.

This just sounds so much more involved and deep than "the computer will watch how they hold their lips and tell the teacher what the algorithm says that lip position means."

Who is the market for this? I want to meet the teacher who says, "Yeah, looking at the students is just too challenging. I would love to have a software program that looked at them for me so I could just keep my eyes on my screen." Who the hell is that teacher, standing in front of a classroom looking not at her students, but at her tablet? Who is the administrator saying, "Yes, the most pressing need we have is a system to help teachers look at students."

Of course, there are applications I can think of for this tech.

One would be a classroom with too many students for a teacher to actually keep eyes on. Monitoring a class of 150 is hard for a human (though not impossible-- ask a band director) but easy for a bank of cameras linked to some software. Another would be a classroom without an actual teacher in it, but just a technician there to monitor the room.

Here's Intel's hint about how this would play out:

Students in the sessions were asked to work on the same online course work. Instructors, armed with a dashboard providing real-time engagement analytics, were able to detect which students required additional 1:1 instruction. By identifying a student’s emotional state, real-time analytics helped instructors pinpoint moments of confusion, and intervene students who otherwise may have been less inclined or too shy to ask for help. In turn, empowering teachers and parents to foresee at-risk students and provide support faster.

In a real classroom, teachers can gauge student reaction because the teacher is the one the students are reacting to. But if students are busy reacting to algorithm-directed mass-customized content delivered to their own screens, the teacher is at a disadvantage-- particularly if the teacher is not an actual teacher, but just a tech there to monitor for student compliance and time on task. Having cut the person out of personalized [sic] learning, the tech wizards have to find ways to put some of the functions of a human back, like, say, paying attention to the student to see how she's doing.

The scenario depicted in the video is ridiculous, but then, it's not the actual goal here. This algorithmic software masquerading as artificial intelligence is just another part of the "solution" to the "problem" of getting rid of teachers without losing some of the utility they provide.

Intel, like others, insists on repeating a talking point about how great teachers will be aided by tech, not replaced by it, but there is not a single great teacher on the planet who needs what this software claims to provide, let alone what it can actually do. This is some terrible dystopian junk.
