Augmented reality is technology that enables you to overlay digital objects onto the real world. And it’s pretty freaking cool.
I’ve been following augmented reality and virtual reality closely since the beginning of 2016, but it wasn’t until the AR in Action conference last week at the MIT Media Lab that I got to try out an augmented reality headset for the first time. Above is a short demo of me using the Meta 2. On the computer screen, you could see what I was seeing. I played around with a digital Earth floating in the room, spinning it around and stopping it to look at different parts. There was also a demo of the human brain and a gesture-controlled user interface.
To everyone else, it looks like I’m just standing there waving my hands around in the air. But from my point of view, I’m moving digital objects around the room, spinning and zooming in on them with different gestures.
Throughout the conference, I met lots of interesting augmented reality researchers and startup founders. Many of them were MIT grad students working on AR/VR tech in the Media Lab. The main organizer of AR in Action is John Werner, the former Head of Innovation and New Ventures in the Media Lab’s Camera Culture Group. He co-founded the Media Lab’s Emerging Worlds and Ideas in Action, and he curates TEDxBeaconStreet – all organizations I’d volunteered for before I even knew who he was. He is currently the Vice President of Strategic Partnerships at Meta, and his vision is to make Boston the augmented reality capital of the world.
I also got to try Microsoft’s HoloLens. The field of view was a lot narrower and the headset was less comfortable, but it was wireless and entirely self-contained. With the Meta, I had to be tethered to a computer the whole time; with the HoloLens, I could walk freely because the computer was built into the headset. When I pulled up a 3D graph of stock market forecasts, I could walk around and look at it from different angles as if it were an actual object in the room. Other people who put on the headset could see the same object. I’m really excited thinking about how this could be used to create “digital worlds” overlaid on the real one. Imagine the future! Imagine if your computer “space” were your room, where you could walk to your drawers and pull out digital work files, or find your movie files floating around when you hop into bed.
From what I’ve seen, the two main ways augmented reality exists today are through headsets and through cameras. In my blog post Virtual Reality, Startups, and Donuts, I shared an example of smartphone augmented reality:
The way this works is by matching digital objects to certain visual cues. Then, when the camera passes over the visual cue (in this case, the cover of this New Yorker issue), a digital object is overlaid on the screen. You can move the camera around to view the object from different angles, but if the visual cue disappears from view, the object disappears too.
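The detect-and-overlay loop described above can be sketched in a few lines. This is just an illustrative sketch, not any particular AR library’s API: `detect_marker` is a stand-in stub for a real computer-vision detector (which would match camera frames against the known visual cue), and frames are represented as plain dictionaries for simplicity.

```python
# Sketch of marker-based ("visual cue") AR tracking. A real system would
# run feature matching on camera frames; detect_marker is a stub here.

def detect_marker(frame):
    """Return the cue's position in the frame, or None if it isn't visible."""
    return frame.get("marker_pos")  # stub: frame is a dict in this sketch

def render_overlay(scene, marker_pos):
    """Anchor the digital object to the detected cue position."""
    scene["object_visible"] = True
    scene["object_pos"] = marker_pos
    return scene

def hide_overlay(scene):
    """Cue lost, so the digital object disappears from the screen."""
    scene["object_visible"] = False
    return scene

def process_frame(scene, frame):
    pos = detect_marker(frame)
    if pos is not None:
        return render_overlay(scene, pos)  # cue in view -> show the object
    return hide_overlay(scene)             # cue out of view -> hide it

# Simulate a short camera stream: the cue is visible, then leaves the frame.
scene = {"object_visible": False, "object_pos": None}
scene = process_frame(scene, {"marker_pos": (120, 80)})
print(scene["object_visible"], scene["object_pos"])  # True (120, 80)
scene = process_frame(scene, {})
print(scene["object_visible"])  # False
```

Running the same loop on every camera frame is what makes the object seem pinned to the cue: move the camera and the detected position moves with it, lose the cue and the object vanishes.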
We saw smartphone augmented reality briefly hit the mainstream during the Pokemon Go craze of last summer: digital Pokemon (Digimon? lol) overlaid on the real world, popping up and “standing” on whatever surface the camera could identify.
Another quirky example of smartphone AR is, interestingly enough, tattoos. At the after-party, we met a guy who showed us his wrist. It had a tattoo of a gecko. Then he put his phone camera over it and a digital gecko sprang to life on his wrist. His company, HoloTats, makes a wide variety of these augmented reality tattoos. He had to leave in a hurry, but he gave me one to try out – Horace the pig.
Horace became my virtual pet for the weekend that I could summon every time I held my camera over my wrist. I almost felt sad when the tattoo finally washed off and Horace faded away to the great beyond. I’ll miss you, Horace.
Thank you to everyone at the AR in Action conference and to the organizers. I got to try both the Meta 2 and Microsoft’s HoloLens and expand my mind to the possibilities of augmented reality.
It’s only a matter of time before we go from this: