Emotion-Tracking Software Could Flag Your Child for Getting Bored in Math Class

Intel is learning a hard lesson after partnering with Classroom Technologies to develop a face-reading AI that detects student emotions during Zoom calls.

The student engagement technology, created by Intel and integrated with Classroom Technologies’ Class software, captures images of students’ faces through their webcams and combines them with computer vision and contextual information to predict engagement levels based on detected emotions.

The goal is to provide educators with data about students’ emotional responses that they can use to personalize lessons and improve engagement. The AI can detect, for example, that students are getting confused during a specific section of a lesson and send that information to teachers so they can reassess how that particular topic is being taught.

“Intel is committed to ensuring educators and students have access to the technologies and tools needed to meet the challenges of a changing world,” said Michael Campbell, Intel’s global director for consumer and education segments. “Through technology, we have the ability to set the standard for impactful, synchronous online learning experiences that empower educators.”

Classroom Technologies CEO Michael Chasen says teachers have struggled to engage with students in virtual classrooms throughout the pandemic, and that the insights offered by this AI technology could help educators communicate better. Classroom Technologies plans to test the emotion-reading technology, which Intel hopes to develop into a widely distributed product.

As detailed in a Protocol report, this face-reading AI already has its detractors, who argue that using facial recognition technology on students is an invasion of privacy and that the technology oversimplifies human emotion, which can lead to harmful results.

As learning shifted from the classroom to the home, schools desperately sought new ways to engage with students. An early debate revolved around the use of webcams: proponents argued that face-to-face interaction enhanced learning and held students accountable, while opponents said requiring webcams violated privacy and could increase stress and anxiety. Reading students’ faces and analyzing them with AI adds another layer to the problem, critics say.

“I think most teachers, especially at the university level, would find this technology morally objectionable, like the panopticon,” Angela Dancey, a senior lecturer at the University of Illinois at Chicago, told Protocol. “Frankly, if my institution offered it to me, I would refuse it, and if we were required to use it, I would think twice about continuing to work here.”

These criticisms come at a time when schools are abandoning the invasive monitoring software that exploded in use during the pandemic, when students were forced to learn remotely. Often used to discourage cheating, these tools use webcams to monitor eye and head movements, tap microphones to listen to the room, and record every mouse click and keystroke. Students across the country have signed petitions arguing that the technology invades privacy, discriminates against minorities, and punishes people with disabilities, as Motherboard reports.

There is also the question of whether facial expressions can accurately gauge engagement. Researchers have found that people express emotions in innumerable ways; as such, critics argue that emotions cannot be determined from facial expressions alone. Assuming a student has checked out of a lesson just because they don’t look interested by an algorithm’s metrics reduces the complexity of emotion.

“Students have different ways of presenting what’s going on inside of them,” Todd Richmond, a professor at the Pardee RAND Graduate School, told Protocol. “Being distracted at that moment in time may be the appropriate and necessary state for them at that point in their life.”

There are also concerns that the analytics provided by the AI could be used to penalize students. If, for example, a student is considered distracted, they may get poor participation scores. And teachers might feel incentivized to use the data if a school system rated educators based on their students’ engagement scores.

Intel created the emotion-analysis technology with data captured in real classrooms using 3D cameras, and worked with psychologists to categorize facial expressions. Some teachers have found the AI useful, but Chasen says he doesn’t think Intel’s system is “mature yet” and that it needs more data to determine whether the results the AI produces actually correspond to student performance. Chasen says Intel’s technology will be just one piece of a larger puzzle in student assessment.

Intel and Classroom Technologies say their technology was not designed as a surveillance system or to be used as evidence to penalize students, but as we see so often in the tech industry, products are frequently used in ways their creators never intended.

We’ve reached out to Classroom Technologies for comment and will update this story when we hear back.
