Several applications in the works combine sensors and computing to make smart, connected devices that tune in to our deepest emotions.

Any sci-fi fan, or really anyone familiar with the phrase “I’ll be back,” can cite the fictional Skynet from “Terminator” as the point at which technology becomes too smart. Well, we haven’t achieved sentient technology with its own set of emotions just yet (though a Russian program reportedly passed the Turing Test back in June), but technology can already read and respond to our human feelings.

“As we approach the year 2020, the size of computational power begins to approach zero and you can turn anything into a computer,” said Intel futurist Brian David Johnson. “You don’t have to ask yourself, ‘Can you do it? Can you augment your clothes or your body?’ The question becomes what do you want to do, and why do you want to do it.”

That’s exactly the kind of questioning that has led to this year’s release of robust, real-time life trackers, such as Basis Peak, Apple Watch and a growing number of applications for Google Glass. But to succeed beyond health trackers and smartphone extensions, wearable technology will have to get to know us better.

Asking and answering the questions Johnson poses can be a challenging first step, but he believes the next step is getting easier because of new highly integrated, compact computing technologies. One of them is Intel Edison, a compute module built around a 22-nanometer dual-core processor that runs Linux and open-source platforms, with built-in Wi-Fi, Bluetooth and support for third-party apps.

Edison is the brain behind a number of wearable prototypes designed by Intel that monitor behavior, such as an infant’s sleep pattern via a onesie or heart rate and EKG data courtesy of an attractive bicycle jersey.
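For a sense of how little device-side code such a prototype needs, here is a minimal sketch of the kind of loop a heart-rate-monitoring wearable might run, assuming a hypothetical analog pulse sensor wired to Edison’s first analog pin and read through Intel’s open-source mraa I/O library (the pin and threshold values are illustrative, not Intel’s actual design):

```python
import time
import mraa  # Intel's open-source low-level I/O library for Edison/Galileo

# Hypothetical setup: an analog pulse (heart-rate) sensor on analog pin A0.
pulse_pin = mraa.Aio(0)

SAMPLE_HZ = 50    # how often we poll the sensor
THRESHOLD = 550   # raw ADC value treated as a "beat" (sensor-specific guess)

prev_raw = 0
last_beat = None
while True:
    raw = pulse_pin.read()   # 10-bit reading, 0-1023
    now = time.time()
    # Count a beat on the rising edge through the threshold, with a 0.3 s debounce.
    if raw > THRESHOLD and prev_raw <= THRESHOLD and (last_beat is None or now - last_beat > 0.3):
        if last_beat is not None:
            bpm = 60.0 / (now - last_beat)
            print("Heart rate: %.0f bpm" % bpm)
        last_beat = now
    prev_raw = raw
    time.sleep(1.0 / SAMPLE_HZ)
```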

Beyond physiological responses, Intel’s RealSense camera captures 3D depth imagery and can render it for a wide variety of uses, such as augmented-reality games, 3D video and real-time environment mapping, essentially recreating how humans visually sense the world around them.
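Getting at that depth data is straightforward. The sketch below, written against Intel’s open-source librealsense Python bindings (pyrealsense2) rather than the SDK Intel shipped at the time, grabs one depth frame and reports how far away the object at the center of the image is:

```python
import pyrealsense2 as rs  # Python bindings for Intel's librealsense

# Start a default streaming pipeline on an attached RealSense camera.
pipeline = rs.pipeline()
pipeline.start()

try:
    frames = pipeline.wait_for_frames()  # block until a frameset arrives
    depth = frames.get_depth_frame()
    if depth:
        w, h = depth.get_width(), depth.get_height()
        center_m = depth.get_distance(w // 2, h // 2)  # meters to the center pixel
        print("Object at image center is %.2f m away" % center_m)
finally:
    pipeline.stop()
```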

Other designers have taken more whimsical approaches to wearable technology by using emotion sensors — microchips that respond to skin conductivity, heart rate and other indicators — to broadcast mood or illuminate gaze.
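The logic inside such a mood broadcaster can be surprisingly simple. A purely hypothetical sketch, with invented threshold values, might map skin conductance and heart rate onto a handful of moods to display:

```python
def classify_mood(skin_conductance_uS, heart_rate_bpm):
    """Rough, hypothetical mapping from two physiological signals to a mood label.

    skin_conductance_uS: galvanic skin response in microsiemens
    heart_rate_bpm: heart rate in beats per minute
    """
    aroused = skin_conductance_uS > 8.0  # sweatier skin conducts more
    racing = heart_rate_bpm > 100

    if aroused and racing:
        return "excited/stressed"
    if aroused:
        return "alert"
    if racing:
        return "active"
    return "calm"

# Example: values a wearable might report during a tense meeting.
print(classify_mood(skin_conductance_uS=9.5, heart_rate_bpm=112))  # excited/stressed
```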

Much like the concept of the connected home, in which multiple devices with different functionalities work in tandem, the applications for emotion-sensing technology require various chips and gadgets playing in concert.

While technology that adapts and responds to human emotions continues to roll out in R&D labs and (soon) the market, what about technology that synthesizes this sort of data and anticipates human behavior?

“People are being sensed in so many different ways,” said Lama Nachman, principal engineer and manager of the Anticipatory Computing Lab at Intel Labs. “[With] wearable devices, continuous heart rate monitoring is becoming something that is very attainable. People are innovating and finding new ways to use existing machine-learning algorithms, and so we’re seeing major advancements in deep learning.”
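Nachman’s point is that the building blocks already exist: off-the-shelf machine-learning libraries can turn a continuous heart-rate stream into a crude stress detector. The toy sketch below uses scikit-learn with synthetic training windows, purely to illustrate the shape of the problem rather than any Intel pipeline:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Each row is a one-minute window of heart-rate data reduced to two features:
# [mean beats per minute, standard deviation of beat-to-beat intervals].
# The values and labels are synthetic, purely for illustration.
X_train = np.array([
    [62, 0.09], [68, 0.08], [71, 0.07],    # relaxed windows
    [96, 0.03], [104, 0.02], [110, 0.02],  # stressed windows
])
y_train = np.array([0, 0, 0, 1, 1, 1])     # 0 = calm, 1 = stressed

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X_train, y_train)

# A new window from the wearable: elevated rate, low variability.
new_window = np.array([[101, 0.025]])
print("stressed" if model.predict(new_window)[0] == 1 else "calm")
```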

Enter AutoEmotive, a project from the MIT Media Lab that aims to equip automobiles with emotion sensors so the car can respond to the driver’s state of mind and, in effect, prevent accidents. Sensors nested in the steering wheel and doors pick up electrical signals from the skin, while a windshield-mounted camera analyzes facial expressions.

When it senses stress, the system could soften the cabin lighting and music. If the sensors detect eye strain, it will broaden the headlight beams to compensate. Most impressive, AutoEmotive can alter the color of the vehicle’s conductive paint as a literal red flag that the driver is particularly distressed, so other drivers can avoid a potential accident.
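Once the sensing is in place, those responses amount to little more than a set of rules over the fused readings. A hypothetical sketch of that decision layer (the signal names and thresholds are invented for illustration, not taken from the MIT project):

```python
from dataclasses import dataclass

@dataclass
class DriverState:
    stress_level: float  # 0.0-1.0, fused from skin sensors and facial analysis
    eye_strain: float    # 0.0-1.0, from the windshield camera

def respond(state: DriverState) -> list[str]:
    """Map a fused driver state to the car's calming interventions."""
    actions = []
    if state.stress_level > 0.6:
        actions.append("dim cabin lighting")
        actions.append("switch to calmer music")
    if state.eye_strain > 0.5:
        actions.append("widen headlight beams")
    if state.stress_level > 0.85:
        actions.append("shift conductive paint toward red")  # warn nearby drivers
    return actions

print(respond(DriverState(stress_level=0.9, eye_strain=0.3)))
```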

Emotient, a company that specializes in “human awareness devices,” uses similar facial-recognition algorithms for less life-and-death matters. Its software, currently in beta at various retail stores, employs emotion sensing to gauge customer interest and satisfaction as well as employee morale, allowing management to study trends and step in when needed.
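Under the hood, that kind of retail analytics boils down to aggregating per-frame emotion scores over time and place. A hypothetical sketch (the data shapes are invented; Emotient has not published its API in this form):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical per-frame output of a facial-expression analyzer:
# (store area, emotion scores in the range 0.0-1.0).
frames = [
    ("checkout", {"joy": 0.2, "frustration": 0.7}),
    ("checkout", {"joy": 0.3, "frustration": 0.6}),
    ("entrance", {"joy": 0.8, "frustration": 0.1}),
]

# Average each emotion per store area so managers can spot trouble spots.
by_area = defaultdict(lambda: defaultdict(list))
for area, scores in frames:
    for emotion, value in scores.items():
        by_area[area][emotion].append(value)

for area, emotions in by_area.items():
    summary = {e: round(mean(v), 2) for e, v in emotions.items()}
    print(area, summary)
```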

By 2015, hundreds of reality-augmenting technologies will move from testing to the market, including sensors that monitor your food and UV intake, and an “invisible” bicycle helmet that deploys on impact. Though such tech may not necessarily predict the future, it does help shape our lives in positive ways by using emotion sensing to prevent accidents, misdiagnoses and unhealthy habits.

No, computers aren’t human just yet, but using technology to help us understand each other and our emotions is both ironic and already happening. As Johnson said, “I think having this relationship-based interaction with technology actually should make this interaction much more human.”