Trendspotting: AI-equipped technologies are flowing into society to sense and differentiate objects, people, and human emotions.
- Intel has partnered with Classroom Technologies to create Class, a software program that integrates with Zoom to detect students’ emotional states. It’s designed to give teachers insights when learners are bored, distracted, or confused, per Protocol.
- Amazon Alexa can alert users whether a person or a package is waiting at the door. New software that can reportedly differentiate objects from people integrates with video doorbells and security cameras from Ring, Google Nest, and Abode.
- Ambient.ai has a computer vision intelligence platform that monitors the physical security of entire buildings. It’s a “brain” behind security cameras and door locks that’s intended to replace the need for security guards watching multiple cameras.
- Elliptic Labs recently struck an agreement with Lenovo to integrate its AI Virtual Smart Sensor Platform into mass-market laptops to detect human presence. The technology can reportedly sense when someone walks away from a computer screen and distinguish between someone returning versus just passing by the screen.
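The presence-sensing behavior described above can be sketched as a small state machine: brief detections are treated as someone passing by, while sustained detections count as the user returning. This is a toy illustration, not Elliptic Labs' actual algorithm; the distance threshold and dwell-time values are invented for the example.

```python
from dataclasses import dataclass

NEAR_CM = 80      # hypothetical: readings closer than this count as "near the screen"
DWELL_FRAMES = 3  # hypothetical: consecutive near frames required to count as "returned"

@dataclass
class PresenceMonitor:
    """Tracks whether a user is at the screen or away, ignoring passers-by."""
    state: str = "present"
    near_streak: int = 0

    def update(self, distance_cm: float) -> str:
        near = distance_cm < NEAR_CM
        if self.state == "present":
            if not near:
                # User walked off: a laptop might dim or lock here.
                self.state = "away"
                self.near_streak = 0
        else:  # state == "away"
            if near:
                self.near_streak += 1
                if self.near_streak >= DWELL_FRAMES:
                    # Sustained presence: the user has returned.
                    self.state = "present"
            else:
                # A single near frame followed by absence was just a passer-by.
                self.near_streak = 0
        return self.state
```

Fed a one-frame blip, the monitor stays in "away" (someone passing by); only several consecutive near readings flip it back to "present".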
Omnipresent AI: Educators and social justice groups are raising concerns about invasions of privacy, the effect of classroom surveillance, and whether the technology works as advertised.
Last year, the digital civil rights group Access Now pushed Spotify to halt development of technology that would detect emotion, gender, and age using speech recognition.
A good opportunity: Technology promises to make life easier, yet managing multiple IoT devices and manually interacting with computers and smartphones can be time-consuming.
- By integrating AI with sensing functionality into devices, computers will, in theory, do more work while people do less. For example, instead of keeping an eye on an entire classroom of students, AI can let teachers know who needs more attention.
- Devices that respond to hand gestures cut down on the need to manually type and press buttons.
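The classroom example above amounts to a simple ranking step: surface the students whose inferred signal crosses a threshold. A minimal sketch, assuming the system has already produced per-student confusion scores between 0 and 1 (the names, scores, and threshold here are invented; real systems infer such signals from video, which is far harder and, as noted below, contested):

```python
def students_needing_attention(scores: dict[str, float],
                               threshold: float = 0.7) -> list[str]:
    """Return names of students whose confusion score exceeds the
    threshold, most confused first."""
    flagged = [name for name, s in scores.items() if s > threshold]
    return sorted(flagged, key=lambda n: scores[n], reverse=True)

# Hypothetical scores a dashboard might display to the teacher:
scores = {"ana": 0.9, "ben": 0.4, "chi": 0.75}
students_needing_attention(scores)  # → ["ana", "chi"]
```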
A creepy problem: Although these technologies are alluring, careless implementation could do more harm than good.
- Whether AI emotional state detection tech is deployed in the classroom, boardroom, workplace, home, or public sphere, it means that facial expressions and body language are not only being surveilled, but interpreted by a machine.
- Facial expressions and body language vary across cultures. Given AI’s propensity for bias, errors are likely and will disproportionately affect certain demographics.
- Awareness that a computer is constantly watching and analyzing one’s actions could be unhealthy and distracting, and will inherently alter behavior.
- Rather than adopting AI cavalierly, society needs to regulate the technology and devote resources to addressing its bias problems.