Silicon Valley is getting emotional

Technology like the iPhone X's new camera system and Face ID will increasingly figure out how you feel, almost all the time.

Apple’s shiny new iPhone X smartphone became available for pre-order on Friday.

Packed with bells and whistles and dominating the field in speeds and feeds, Apple’s hotly anticipated iPhone X will be considered by some the world’s greatest phone.

The technology in the iPhone X includes some unusual electronics. The front-facing camera is part of a complex bundle of hardware components unprecedented in a smartphone. (Apple calls the bundle its TrueDepth camera.)

The top-front imaging bundle on the iPhone X has some weird electronics, including an infrared projector (far right) and an infrared camera (far left).

The iPhone X has a built-in, front-facing projector that casts 30,000 dots of invisible infrared light. A dedicated infrared camera then photographs those dots to see where they land in 3D space. (This is basically how Microsoft’s Kinect for Xbox works; Apple bought PrimeSense, a company behind Kinect’s sensor technology, years ago. Microsoft discontinued Kinect this week.)
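The math behind this kind of structured-light sensing is simple triangulation: a projected dot appears shifted in the camera’s view by an amount that depends on how far away the surface is. Here is a minimal sketch of that relationship; the baseline, focal length and disparity values are invented for illustration, since Apple has not published the TrueDepth sensor’s parameters.

```python
# Toy illustration of structured-light depth sensing.
# All numbers are made up; this is the general technique,
# not Apple's actual implementation.

def depth_from_disparity(baseline_m, focal_px, disparity_px):
    """Classic triangulation: depth is inversely proportional to the
    shift (disparity) of a projected dot between where the projector
    aims it and where the infrared camera actually sees it."""
    return baseline_m * focal_px / disparity_px

# A dot that shifts 20 pixels, given a 2.5 cm projector-camera
# baseline and a 600 px focal length, lies about 0.75 m away.
print(depth_from_disparity(0.025, 600, 20))
```

Repeating that calculation for all 30,000 dots yields a depth map of the face, which is what makes the system robust against being fooled by a flat photograph.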

Out of the box, this Kinect-like component powers Apple’s Face ID security system, which replaces the fingerprint-centric Touch ID of recent iPhones, including the iPhone 8.

A second use is Apple’s Animoji feature, which enables avatars that mimic the user’s facial expressions in real time.

Some iPhone fans believe these features are revolutionary. But the real revolution is emotion detection, which will eventually affect all user-facing technologies in business enterprises, as well as in medicine, government, the military and other fields.

The age of emotion

Think of Animoji as a kind of proof-of-concept app for what’s possible when developers combine Apple’s infrared face tracking and 3D sensing with Apple’s augmented reality developer kit, called ARKit.

Animoji’s cuddly cartoon avatars will smile, frown and purse their lips every time the user does.

Those high-fidelity facial expressions are data. One data set ARKit enables on the iPhone X is “face capture,” which records facial expression in real time. App developers will be able to use this data to control an avatar, as with Animoji. Apps will also be able to receive the relative positions of various parts of the user’s face as numerical values. ARKit can also enable apps to capture voice data, which could in the future be further analyzed for emotional cues.
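ARKit delivers those numerical values as named expression coefficients (“blend shapes”) ranging from 0.0 to 1.0. The sketch below, in Python for readability, shows what an app could do with such data; the coefficient names mirror ARKit’s, but the thresholds and classification rule are an invented toy, not anything Apple ships.

```python
# Sketch of consuming ARKit-style face-capture data.
# Coefficient names (e.g. "mouthSmileLeft") follow ARKit's naming,
# but this classifier is a hypothetical illustration.

def classify_expression(blend_shapes, threshold=0.5):
    """Map per-frame expression coefficients (0.0-1.0) to a label."""
    if blend_shapes.get("mouthSmileLeft", 0) > threshold and \
       blend_shapes.get("mouthSmileRight", 0) > threshold:
        return "smiling"
    if blend_shapes.get("browDownLeft", 0) > threshold and \
       blend_shapes.get("browDownRight", 0) > threshold:
        return "frowning"
    return "neutral"

# One frame of face-capture data, as an app might receive it.
frame = {"mouthSmileLeft": 0.8, "mouthSmileRight": 0.7, "browDownLeft": 0.1}
print(classify_expression(frame))  # smiling
```

A real app would receive a fresh set of coefficients many times per second, which is exactly the millisecond-by-millisecond stream of expression data described below.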

Apple is not granting developers access to security-related Face ID data, which is stored beyond reach in the iPhone X’s Secure Enclave. But it is allowing all comers to capture millisecond-by-millisecond changes in users’ facial expressions.

Facial expressions, of course, convey user mood, reaction, state of mind and emotion.

It’s worth pointing out that Apple last year acquired a company called Emotient, which developed artificial intelligence technology for tracking emotions using facial expressions.

My colleague Jonny Evans points out that Emotient technology plus the iPhone X’s face tracking could make Siri a much better assistant, and enable richer social experiences inside augmented reality apps.

It’s not just Apple

As with other technologies, Apple may prove instrumental in mainstreaming emotion detection. But the movement toward this kind of technology is irresistible and industrywide.

Think about how much effort is expended on trying to figure out how people feel about things. Facebook and Twitter analyze “Like” and “Heart” buttons. Facebook even rolled out other emotion choices, called “reactions”: “Love,” “Haha,” “Wow,” “Sad” and “Angry.”

Google tracks everything users do on Google Search in an effort to divine results relevance — which is to say which link results users like, love, want or have no use for.

Amazon mines purchase activity, repeat purchases and wish lists, and, like Google with Google Search, tracks user activity on Amazon.com to find out how customers feel about various suggested products.

Companies, research firms and other organizations conduct surveys. Ad agencies do eye-tracking studies. Publishers and other content creators conduct focus groups. Nielsen uses statistical sampling to figure out how TV viewers feel about TV shows.

All this activity underlies decision-making in business, government and academia.

But existing methods for gauging the public’s affinity are about to be blown away by the availability of high-fidelity emotion detection now being built into devices of all kinds — from smartphones and laptops to cars and industrial equipment.

Instead of focusing on how people in general feel about something, smartphone-based emotion detection will focus on how each individual user feels, and in turn will react with equivalent personalization.

Researchers have been working to crack the emotion-detection nut for decades. The biggest change now is the application of A.I., which will bring high-quality sentiment analysis to the written word, and similar processing to speech, weighing both vocal intonation and word choice to gauge how the speaker feels at every moment.
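To make “sentiment analysis” concrete, here is the simplest possible form of it: scoring text against lists of emotionally loaded words. Production systems use machine-learned models rather than hand-built lexicons; the tiny word lists here are invented for illustration.

```python
# Minimal lexicon-based sentiment scoring, the most basic version
# of the text analysis described above. Word lists are toy examples.

POSITIVE = {"love", "great", "wonderful", "happy"}
NEGATIVE = {"hate", "awful", "terrible", "sad"}

def sentiment(text):
    """Count positive words minus negative words and label the result."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I love this wonderful phone"))  # positive
```

Modern A.I. approaches go far beyond word counting, picking up sarcasm, context and phrasing, but the goal is the same: turn raw language into an emotional signal.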

Most importantly, A.I. will detect not only broad and bold facial expressions like dazzling smiles and pouty frowns, but even “subliminal facial expressions” that humans can’t perceive, according to a startup called Human. Your poker face is no match for A.I.

A huge number of smaller companies, including Nviso, Kairos, SkyBiometry, Affectiva, Sighthound, EmoVu, Noldus, Beyond Verbal and Sightcorp, are creating APIs that let developers build emotion detection and tracking into their apps.

Research projects are making breakthroughs. MIT even built an A.I. emotion detection system that runs on a smartwatch.

Numerous patents by Facebook, as well as acquisitions by Facebook of companies such as FacioMetrics last year, portend a post-“Like” world, in which Facebook is constantly measuring how billions of Facebook users feel about every word they read and type, every picture they scan and every video that autoplays on their feeds.

The auto-detection of mood will no doubt prove superior to, and eventually replace, the current “Like” and “reactions” system.

Right now, Facebook’s “Like” system has two major flaws. First, the majority of people don’t “engage” with posts a majority of the time. Second, because sentiment is both conscious and public, it’s a kind of “performance” rather than a true reflection of how users feel. Some “Likes” happen not because the user actually likes something, but because she wants others to believe she likes it. That doesn’t help Facebook’s algorithms nearly as much as face-based emotion detection that tells the company how every user really feels about every post every time.

Today, Facebook is the gold standard in ad targeting. Advertisers can specify the exact audience for their ads. But it’s all based on stated preferences and actions on Facebook. Imagine how targeted things will become when advertisers have access to a history of facial expressions reacting to huge quantities of posts and content. They’ll know what you like better than you do. It will be an enormous benefit to advertisers. (And, of course, advertisers will get fast feedback on the emotional reactions to their ads.)

Emotion detection is Silicon Valley’s answer to privacy

Silicon Valley has a problem. Tech companies believe they could serve up compelling custom advertising, as well as addictive, personalized products and services, if only they could harvest personal user data all the time.

Today, that data includes where you are, who you are, what you’re doing and who you know. The public is uncomfortable sharing all this.

Tomorrow, companies will have something better: how you feel about everything you see, hear, say and do while online. A.I. systems behind the scenes will constantly monitor what you like and don’t like, and adjust what content, products and options are presented to you (then monitor how you feel about those adjustments in an endless loop of heuristic, computer-enhanced digital gratification).

Best of all, most users probably won’t feel as if it’s an invasion of privacy.

Smartphones and other devices, in fact, will feel more “human.” Unlike today’s personal information harvesting schemes, which seem to take without giving, emotionally responsive apps and devices will seem to care.

The emotion revolution has been in slow development for decades. But the introduction of the iPhone X kicks that revolution into high gear. Now, through the smartphone’s custom electronics combined with tools in ARKit, developers will be able to build apps that constantly monitor users’ emotional reactions to everything they do with the app.

So while some smartphone buyers are focused on Face ID and avatars that mimic facial expression, the real revolution is the world’s first device optimized for empathy.

Silicon Valley, and the entire technology industry, is getting emotional. How do you feel about that?
