Augmented Reality Isn’t Just Visual

Xperiel
Virtual Reality Pop
7 min read · Jan 7, 2019


By Alex Hertel, Co-founder and CEO of Xperiel

Augmented reality is awesome, no question. Think of all the things you can do, the extreme experiences you can have, the trips through time you can take simply by overlaying and viewing 3D models in the real world. But it’s going to get even better. So far, AR has focused on using pixels to enhance visual experiences, but many different sensors and technologies are going to converge to enhance all of our senses — not just sight — creating far richer interactions, augmenting and enhancing what it means to be human and how we interact with the world.

Without a doubt, people are visually oriented creatures, so we shouldn’t be surprised that early AR use cases have focused on enhancing our sight. This is all thanks to a mobile device’s camera, gyroscope, and screen, which together overlay digital objects on top of the real world and anchor those models using markers or SLAM mapping. The majority of today’s best AR — Apple’s ARKit, Google’s ARCore, Magic Leap, HoloLens, and other AR wearables — falls into this “visual AR” category.

Disney’s Magic Bench

AR wearables are here, and in the future someone may even develop AR lenses that can be worn as contacts or surgically implanted in the human eye. Here’s a short sci-fi film illustrating a) what that user experience might be like, and b) some of the dangers of going down that road. All of this seems like pure fantasy right now until you read articles like this; in fact, you can already have a similar AR “trip” via hardware installed in a physical space. A good example is Disney’s Magic Bench.

But the potential for AR is truly vast and much of it is yet to be explored. Almost all public attention on this technology has focused on visual use cases, but that’s misleading. Pixels are read-only, and if we think only in those terms, we’re limiting the possibilities of what this amazing new technology can do and where it may take us. It’s called “augmented reality,” not just “augmented vision” or “augmented sight,” and we don’t want AR to be read-only. AR is much more than just pixels and it’s already enhancing some of our other senses. As new applications and technologies develop, AR will expand how we experience the material world in new, previously unimagined ways by bolstering our other senses as well.

This isn’t as far-fetched as it sounds. Humans have been using technology to enhance our senses and physical abilities for hundreds of years, and augmented reality is just the latest incarnation of this principle. Ever since the invention of the spyglass in 1608, the same year as the first refracting telescope and 18 years after the first patented microscope, we’ve been augmenting our sense of sight. Later examples include X-ray machines, radio telescopes, night-vision goggles, radar, and sonar.

These seem quaint in light of the AR platforms and wearables that dozens of companies are working on today. Many new cars include the kind of heads-up display on the windshield previously seen only in fighter jets, enhancing the driving experience. AR glasses with a similar capability provide a digital overlay wherever you look.

We’ve been improving on some of our other senses, not just our visual abilities, for quite a while, too. The first telephones and microphones arrived in 1876, giving us the ability to speak to and hear people at great distances. (When the Lone Ranger whistled for his horse, was he doing anything fundamentally different from using an app to summon an Uber?) Similarly, listening devices such as hearing aids augment our auditory abilities. In the AR arena, MixHalo and Bose have produced good examples of augmented audio. In fact, augmented audio has been available to consumers for decades in the form of products such as the Whisper 2000.

What about our sense of feeling or touch? Haptic devices have been around for some time, providing tactile feedback through hardware like our phones or exoskeletal gloves that mimic touch. Imagine the possible educational uses of haptics in the highly-specialized training of surgeons or gem-cutters.

But why should you have to wear a complex device? Smartphone screens have become, in some respects, just as sensitive as human skin, and maybe even more so. Fundamentally, there are two kinds of augmented touch. One “makes the real world clickable” using a mobile device’s sensors; the other, in essence, lets the real world “click on you.” GPS-based triggers are an example of the first kind: you provide an input signal simply by going to a physical location, replacing input from a keyboard or a mouse. Pokémon Go, Google Maps, and Uber are all examples of apps that use a phone’s GPS sensor as an input or trigger.
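For the technically curious, a GPS trigger is conceptually simple: compare the device’s reported position against a target location and fire an action when the user is close enough. Here’s a minimal sketch in Python — the coordinates, radius, and action are invented for illustration, not taken from any real product:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def gps_trigger(position, target, radius_m, action):
    """'Click on the real world' by walking into a circular geofence."""
    if haversine_m(*position, *target) <= radius_m:
        action()

# Hypothetical example: fire when the user is within 50 m of a stadium gate.
gate = (37.4030, -121.9700)
gps_trigger((37.4031, -121.9701), gate, 50,
            lambda: print("Welcome to the stadium!"))
```

Real mobile platforms expose this same idea as built-in geofencing APIs, so an app registers a region once and gets woken up when the device crosses its boundary.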

There are many other ways that AR is replacing computer input devices to augment our senses by making the world “clickable”:

  • Vision recognition: Using a mobile device camera to scan a QR code or other image via computer vision technology. These can be put virtually anywhere in the physical world.
  • Audio recognition: This relies on the microphone of a mobile device to recognize a sound or tune. A familiar example is the app Shazam.
  • Near-field communication: While there aren’t many NFC-based products yet, you may have tapped your phone on a smart poster that comes alive in an AR world. You can achieve similar effects with an NFC tag inside, say, a beer coaster or key fob, and NFC is the basis of tapping to pay with your smartphone at a cash register.
  • iBeacons: You’ve probably entered a stadium, a store, or an airport and been notified on your phone to take some kind of action relevant to your location and the activity at hand. iBeacons work something like GPS triggers but rely on low-energy Bluetooth instead.
  • Gestures: Augmented reality glasses, headsets — and now, through technology provided by companies like ManoMotion, even mobile phone cameras — recognize and capture your hand gestures and other motions, translating them into 3D digital hands, pixie dust or, perhaps, a few dabs of color on Michelangelo’s Sistine Chapel.
  • AR triggers: These can project a virtual button that isn’t really there onto a wall. You click on it, and the technology in your AR device captures whether you touched the virtual button via hit testing.
  • Eye-tracking: Developed over a couple of decades, this technology can be integrated into EEG and motion-tracking systems, and related assistive technology is what enabled Stephen Hawking to type. Apple recently bought the German eye-tracking company SensoMotoric Instruments, and Apple CEO Tim Cook has described AR as a civilization-changing technology.
  • Haptic feedback: This works in the opposite direction of most AR, letting the world “click” on you. One familiar example: your phone buzzes in your pocket when you get a text message, telling you to check your device without resorting to an audible or visual cue. It’s like someone tapping you on the shoulder to get your attention. Disney’s Magic Bench also uses haptic feedback, vibrating the bench to give you feedback in the physical dimension.
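Taken together, these input channels all reduce to the same pattern: a sensor event arrives, it is matched against registered triggers, and a handler runs. Here’s a toy dispatcher sketch in Python that makes the pattern concrete — the channel names, payloads, and handlers are all invented for illustration:

```python
class ARInputDispatcher:
    """Routes sensor events (QR scan, NFC tap, beacon, gesture...) to handlers."""

    def __init__(self):
        self.handlers = {}  # maps (channel, pattern) -> callback

    def register(self, channel, pattern, callback):
        """Declare that a given real-world 'click' should run this callback."""
        self.handlers[(channel, pattern)] = callback

    def dispatch(self, channel, payload):
        """Called when a sensor fires — run the matching handler, if any."""
        callback = self.handlers.get((channel, payload))
        return callback() if callback else None

# Hypothetical wiring: one app reacting to a QR scan and a beacon region.
dispatcher = ARInputDispatcher()
dispatcher.register("vision", "qr:beer-coaster-42", lambda: "show AR beer ad")
dispatcher.register("beacon", "gate-7", lambda: "offer seat upgrade")

print(dispatcher.dispatch("vision", "qr:beer-coaster-42"))  # show AR beer ad
```

Production AR platforms are far more sophisticated, of course, but the registration-and-dispatch shape is the same whether the “click” comes from a camera, a radio, or a fingertip.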

Bringing AR to our senses of smell and taste is a harder challenge because these sensations are, at root, chemical: they require contact with the appropriate receptors. So far, no one has figured out how to connect technology elegantly with these chemical-based senses, but a few years ago a British computer science professor designed a device that slipped into the iPhone’s headphone jack and released a whiff of bacon-scented spray along with the sound of sizzling.

This device is crude, but future technological breakthroughs could make even augmented smell and taste possible. This will have far-reaching consequences for how we think, feel, and remember because taste and smell are connected to the limbic system, which governs emotions and memory. All sorts of new, non-drug therapies for depression, anxiety, and other conditions might well emerge.

AR is pushing out in many directions and one day will augment all five of our senses, not just our ability to see. It won’t just be read-only pixels; it will make the whole physical world richly and digitally interactive. No one knows where the boundaries are. But because this exciting new technology is already pushing at the limits of our perception, it will spark all kinds of new discoveries about our relationship to the physical universe and, by enhancing our senses, will stimulate new discussions of what it means to be human.

The co-founder and CEO of Xperiel, Alex Hertel is an inventor who earned his Ph.D. in computer science at the University of Toronto and is an expert on using immersive technologies to make the physical world digitally interactive. He previously co-founded Walleto, which was acquired by Google and became Google Wallet.
