Ten years ago the iPhone changed human behavior forever. Millions of new applications emerged from nowhere, enabling people to communicate in new ways and live smarter. With the iPhone X, the world is about to change once again.
Named for the 10th anniversary of the device, the iPhone X is more than just an incremental improvement on previous versions. In fact, Apple CEO Tim Cook proclaimed:
"The iPhone X will set the path of technology for the next decade."
Now we all know Apple executives are master showmen, and this statement has a tinge of marketing fluff, especially when new features like Face ID and the edge-to-edge display don't seem new or even game-changing. So what's the big deal?
To understand the true potential of the new iPhone X, you need to look beyond the 'cool' features to the underlying new technologies within the device. It's the combination of these technologies that will evolve how people use their mobile devices and change the way you, as an analyst or developer, think about user behavior.
Face ID is Much More Than a Security Feature
Face ID is the most talked-about feature of the iPhone X: you can unlock your phone just by looking at it, skipping the extra step of entering your passcode. At face value, it seems like a vanity feature (pun intended).
The reality is that Face ID is more than just a biometric authentication method. The technology behind Face ID, which Apple calls TrueDepth, uses facial mapping and deep learning to first identify the user and then track their facial features as their expressions change.
TrueDepth uses a combination of cameras, sensors, a flood illuminator, and a dot projector to map out the orientation of your face in 3D, including the position of your eyes, eyebrows, and mouth. Here's a quick explanation of how it works:
- You hold the phone up to your face with your eyes open (it will not work if your eyes are closed).
- The flood illuminator detects your face and the infrared camera logs the image.
- The dot projector is activated, scanning the structure of your face with over 30,000 invisible dots.
- An infrared image with the dot pattern is created and analyzed through Apple's neural networks.
- Finally, a mathematical equivalent of your face is created that enables face tracking.
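The final step, matching a new scan against the enrolled "mathematical equivalent" of your face, can be sketched in simplified Python. Apple has not published its actual neural network or matching logic, so the embedding function, vector values, and threshold below are all invented for illustration:

```python
import math

def embed(face_scan):
    # Stand-in for Apple's neural network: reduce a raw dot-pattern
    # scan to a fixed-length numeric representation of the face.
    # Here we simply normalize the values (illustrative only).
    norm = math.sqrt(sum(v * v for v in face_scan)) or 1.0
    return [v / norm for v in face_scan]

def matches(enrolled, candidate, threshold=0.95):
    # Compare two normalized embeddings by cosine similarity
    # and "unlock" only if they are close enough.
    similarity = sum(a * b for a, b in zip(enrolled, candidate))
    return similarity >= threshold

enrolled = embed([0.2, 0.8, 0.4, 0.1])     # captured at setup
attempt = embed([0.21, 0.79, 0.41, 0.1])   # new unlock attempt
print(matches(enrolled, attempt))
```

The key design idea is that the phone never compares raw images; it compares compact numeric representations, which is also what makes continuous expression tracking cheap enough to run in real time.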
Apple touted this sophisticated technology by launching a fun Animoji feature where you can create personalized characters that mimic your facial features. However, the broader implications go beyond entertainment and will revolutionize how you develop applications and analyze user behavior.
What Facial Tracking Means For Developers
The facial tracking capabilities of the iPhone X will enable developers to create applications that can respond to a user's facial cues.
Apple has given developers permission to use TrueDepth technology in app development and provides them with data on the user's facial expressions. For the first time, applications can detect when you smile, frown, or raise your eyebrows.
This opens the door to a whole new kind of interaction between humans and their devices, where changes in your face dictate how an application behaves. In the same way the App Store changed mobile behavior ten years ago, a new era of face-controlled apps will emerge, creating a new category of products.
Imagine an application that can sense when you get confused and send you a help message. Or a role-playing game where your facial expressions and words are replicated in your avatar. Or a shopping app that sends you a coupon when it notices a positive facial reaction to a specific product.
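ARKit exposes expression data to developers as "blend shape" coefficients between 0.0 and 1.0, one per facial feature. The sketch below shows how an app might turn those coefficients into a reaction signal; the key names mirror ARKit's, but the thresholds, the dictionary format, and the classifier itself are invented for illustration:

```python
# Thresholds are arbitrary choices for this sketch, not Apple values.
SMILE_THRESHOLD = 0.6
FROWN_THRESHOLD = 0.6

def classify_expression(blend_shapes):
    # Average the left/right smile coefficients, and use brow
    # lowering as a simplified proxy for a frown.
    smile = (blend_shapes.get("mouthSmileLeft", 0.0)
             + blend_shapes.get("mouthSmileRight", 0.0)) / 2
    frown = blend_shapes.get("browDownLeft", 0.0)
    if smile >= SMILE_THRESHOLD:
        return "positive"
    if frown >= FROWN_THRESHOLD:
        return "negative"
    return "neutral"

print(classify_expression({"mouthSmileLeft": 0.8, "mouthSmileRight": 0.7}))
```

A real app would evaluate this on every tracked frame and debounce the result before acting on it, since single-frame expressions are noisy.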
The possibilities are endless for developers to create a completely new user experience, and as a result, a new form of human analytics and product optimization will come with it.
Facial Recognition for Analytics
Behavioral analytics has come a long way in the past few years as technology has enabled companies to track every click and movement through their application. But one aspect of behavior that has remained elusive — human emotion — could be the key to deciphering why users act a certain way.
The leap between tracking what a user does in your app and what emotions a user feels is a huge advancement in behavioral analytics. If analysts are able to associate emotional triggers with actions, they can close the loop on understanding the thoughts behind the behaviors.
Let's say users avoid a feature within your app, but you have no idea why. Without understanding what is going on in the user's mind, you are limited to making hypotheses and testing different scenarios until you see a change. Instead, what if you knew that the majority of users got frustrated with the step right before the feature and just gave up?
It wasn't the actual feature that they didn't like, but the interaction beforehand that ruined their experience. This type of emotional insight adds a new layer to the current behavioral analytics that will help answer more 'why' questions and alleviate the need for blind optimization testing.
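The frustration scenario above is, at its core, a join between two event streams: what users did and what they appeared to feel. A minimal sketch of that analysis, with entirely hypothetical event and step names:

```python
# Hypothetical merged event stream: each event carries a user, an
# action step, and an inferred emotion label for that moment.
events = [
    {"user": "a", "step": "configure", "emotion": "frustrated"},
    {"user": "a", "step": "abandon", "emotion": "neutral"},
    {"user": "b", "step": "configure", "emotion": "frustrated"},
    {"user": "b", "step": "abandon", "emotion": "neutral"},
    {"user": "c", "step": "configure", "emotion": "neutral"},
    {"user": "c", "step": "use_feature", "emotion": "positive"},
]

def frustration_before_dropoff(events):
    # Count users who showed frustration at a step and then
    # abandoned on the very next step.
    by_user = {}
    for e in events:
        by_user.setdefault(e["user"], []).append(e)
    hits = 0
    for steps in by_user.values():
        for prev, nxt in zip(steps, steps[1:]):
            if prev["emotion"] == "frustrated" and nxt["step"] == "abandon":
                hits += 1
    return hits

print(frustration_before_dropoff(events))  # → 2
```

Here two of three users were frustrated at the step right before abandoning, which is exactly the kind of "why" signal that pure clickstream data cannot surface.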
iPhone X is an Augmented Reality Mobile Platform
One of the most important features that most consumers missed in the iPhone X announcement was the new A11 Bionic chip and its dedicated neural engine for neural network computing. Why should people care?
The heightened computing power, coupled with TrueDepth technology, lays the foundation for the iPhone X to become the first mainstream augmented reality (AR) mobile platform.
For those new to AR, it is a method of digitally manipulating the real world with computer-generated images, creating a new composite view. The technology is not new, but the launch of the iPhone X has helped bring it to the forefront of consumer interest.
Empowering Developers To Drive AR Growth
Before the launch of iPhone X, Apple already set the stage for mobile AR at their developer conference when they announced their AR development tool, ARKit.
The new framework helps developers build AR applications on iPhone and iPad, and several companies have already jumped in to create some interesting and useful apps. One of the more commercial examples of AR that has gained popularity is IKEA Place, an app that lets people see how furniture would look in whatever space they're in.
ARKit is opening the floodgates for developers to experiment and build AR apps. With a potential audience of millions of iPhone and iPad users, developers have a huge distribution channel to test their AR applications, figure out what works, and learn how users will embrace AR in the future.
Implications of Augmented Reality on Analytics
When a data visualization contains several branches of information, it is hard to take in all of the data on a two-dimensional screen. AR can solve this problem.
AR for visualization allows analysts to work in a third dimension, which would be a valuable new way to present data. It would be possible to see how different datasets connect and uncover insights that were once hidden on a flat screen.
The AR capabilities of iPhone X also create a more viable option for mobile data visualization. As the workforce becomes increasingly mobile, viewing analyses on a smaller screen is problematic. The smaller your screen, the more data you are likely to miss. AR can solve this by putting several screens within your augmented reality.
By simply moving your iPhone around your environment, you can see different analyses at the same time to get a more holistic view of your data. The world around you becomes a giant screen of analyses.
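The "giant screen of analyses" idea comes down to anchoring virtual panels at fixed positions around the viewer, so that turning the phone reveals a different analysis. A minimal sketch of the placement math, with invented panel names and an arbitrary radius:

```python
import math

def panel_positions(n_panels, radius=1.5):
    # Spread n virtual dashboard panels evenly in a circle around
    # the viewer; returns (x, z) offsets in meters on the floor plane.
    positions = []
    for i in range(n_panels):
        angle = 2 * math.pi * i / n_panels
        positions.append((radius * math.cos(angle),
                          radius * math.sin(angle)))
    return positions

# Hypothetical dashboards placed around the viewer.
for name, (x, z) in zip(["revenue", "funnel", "retention"],
                        panel_positions(3)):
    print(f"{name}: x={x:.2f}, z={z:.2f}")
```

In a real ARKit app these offsets would become anchor transforms relative to the device's starting pose; the point of the sketch is simply that the viewer's physical rotation, not screen real estate, selects which analysis is in view.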
Whether you are visualizing your data in 3D or creating multiple screens within your smaller device, applying AR to analytics helps you see more information and identify the relationships between the data that will lead to new insights.
Taking Behavioral Analytics To The Next Level
The iPhone X is not a revolutionary product that will instantly change human behavior like the first iPhone. Tim Cook was more accurate when he said that the iPhone X is setting a path for the future.
The underlying technologies hold the clues to how humans will interact with their devices and the world around them. At Interana, we designed our technology to analyze anyone and anything that interacts with an application or service, but until now the data was limited to what we could actually track.
Now, the physical world is becoming a trackable series of events that include both user action and user emotion. When the two are combined, we will see a vast improvement in products and services.