“HoloLens is the Wave of the Present” — Revisiting HoloLens 1 First Impressions in 2016

Tim Stutts
6 min read · Aug 30, 2023

Preface: I wrote this entry in early 2016 after trying HoloLens 1 for the first time. It was a pivotal experience for me, both in understanding what was then state-of-the-art augmented reality technology and in coming to believe in the future of head-mounted displays.

I got the demo at Microsoft in Austin, Texas. At the time I had been working on data visualizations for IBM Watson for a few years and was in the interview phase for a potential next position at Magic Leap, which was ultra-secretive about the mixed reality headset it was building.

Seeing what was possible on HoloLens ultimately gave me the confidence to take that leap of faith and move to Florida to join Magic Leap full-time in early 2016. That work culminated in the release of Magic Leap One Creator’s Edition in the summer of 2018.

Years later, starting in late 2021, I was fortunate to have the opportunity to work on head-mounted display applications at PTC Vuforia that ran on both HoloLens 2 and Magic Leap 1 (ML2 had not quite landed yet).

Fast forward to the present day, 2023: the technology for pulling things off in mixed reality has advanced greatly since the original writing, yet a lot of the ‘wow’ moments are in a similar vein, only with better execution.

The following entry includes minor edits, a few reductions, and clarifications that “HoloLens” means HoloLens 1, but is otherwise left intact from early 2016. I hope you find it entertaining.

Original Microsoft HoloLens Mixed Reality Headset. Source: Microsoft

I awoke early this morning unable to sleep, and decided to write about my first experience with HoloLens [1] yesterday.

Before I talk about my experience using this device, I will say that in the past year, as a point of reference, I have tried the Oculus [nowadays Meta] Rift, HTC Vive and Google’s Tango Tablet. Of the three, only Tango truly augments reality, with its handheld screen / camera / IR sensor combo acting as a window into the world in front of the user. In one demo I tapped points on two different walls, and it drew a line between them and displayed the distance between the two points (pretty cool!), but it is not headworn, and thus not particularly immersive. Nonetheless, it is impressive in its own right, and might be an easier way for most users to obtain and experience mixed reality than a HoloLens (or the currently extremely-difficult-to-demo Magic Leap).

My HoloLens experience began as follows. I was brought into a mid-sized, dimly lit conference room containing a number of desks and chairs. I had watched tutorial videos beforehand, but did not recall how to put on the device, with its outer ring (which contains the cameras, sensors, CPU, etc.; I was super-impressed by how much the hardware engineers have been able to pack in) and hinged inner ring (which fits the device onto the user’s head), and I found this a little challenging. Once the HoloLens was on comfortably, I spent a couple of minutes repositioning the outer ring so that the field-of-view area was completely visible to me. After that, I spent another several minutes letting the device calibrate to my eyes and finger and scan the surrounding room. For the most part, the device’s guidance through this process (a mix of voice, text, infographics and GUIs, all floating/sounding seemingly about six feet from my face) was intuitive.

The finger gesture for selecting an object took a little while to master, but coupled with head-tracking as a pointer, and reinforced by Cortana-powered voice interactions (while looking at an object, a user could simply say “select,” which functioned similarly to the finger gesture), it made for a powerful multi-modal interaction trio. “Bloom,” a flowering-like gesture made with the hand to take the user back to the home-screen, took a bit more time for me to master, but whenever the device struggled to recognize this gesture, I just used voice interaction to get back instead.
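[2023 aside: the gaze-plus-gesture-plus-voice trio described above generalizes well beyond HoloLens. Below is a minimal, hypothetical sketch in Python of how a “select” action might be fused from head-gaze targeting confirmed by either an air-tap gesture or a spoken command, with voice doubling as a fallback when the tap gesture is not recognized. The GazeState and InputEvent types and their fields are illustrative assumptions, not HoloLens or Magic Leap APIs.]

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical stand-ins for platform input; real headsets expose their own
# gaze, gesture, and voice APIs rather than these simplified types.

@dataclass
class GazeState:
    target_id: Optional[str]   # object currently hit by the head-gaze ray, if any

@dataclass
class InputEvent:
    kind: str                  # "air_tap" or "voice"
    utterance: str = ""        # recognized phrase when kind == "voice"

def resolve_select(gaze: GazeState, event: InputEvent) -> Optional[str]:
    """Return the id of the object to select, or None.

    The gaze ray supplies the *what*; either modality supplies the *confirm*.
    Voice acts as a peer to the tap gesture, so it also works as a fallback.
    """
    if gaze.target_id is None:
        return None  # nothing under the gaze cursor, so nothing to select
    if event.kind == "air_tap":
        return gaze.target_id
    if event.kind == "voice" and event.utterance.strip().lower() == "select":
        return gaze.target_id
    return None

if __name__ == "__main__":
    gaze = GazeState(target_id="delete_button")
    # The gesture path and the voice path resolve to the same selection.
    print(resolve_select(gaze, InputEvent(kind="air_tap")))
    print(resolve_select(gaze, InputEvent(kind="voice", utterance="Select")))
```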

Speaking of the home-screen, that initial screen composed of Windows tiles, containing the applications and so forth, perceptually feels like a three-foot-tall, scaled-up, disembodied mobile device screen. I did not take issue with the screen’s familiarity though, especially since having it as a reference point instantly helped me realize that familiar interactions, like selecting a tile, could be performed with some learning. What is even more remarkable about the screen is that it seemingly floats in mid-air. The user can walk around it, even underneath it, stare at it, and it tracks beautifully to the room. Furthermore, the resolution is sharp and individual pixels aren’t particularly perceptible (Magic Leap makes a similar claim).

From that initial screen, you can launch other windows and place them anywhere you want in the room space. I first launched an app of 3D models, placing it to the left of the home-screen, then proceeded to extract various animals and objects and position and scale them around the room. Removing them was managed by staring at a delete button and saying “select,” though I left a shark lingering in front of my home-screen just for the fun of it. Next I opened the flat plane of a web browser window, positioned it adjacent to the home-screen (an instant three-foot secondary display, pretty powerful) and navigated it with the previously learned select gestures. After that I opened an application that allowed me to explore a galaxy and eventually a solar system. This application took up more of the room’s z-space and caused the home-screen and other previous items to be temporarily hidden. Entering our own solar system, the line work of the orbits and the labels for the planets appeared incredibly sharp and popped out just as much as the planets. I selected Jupiter, and a narrated voice instructed me to find its red storm. I shuffled past a couple of chairs and a table in the conference room to get to a vantage point, momentarily remembering where I was physically. A mixed reality experience indeed.

In terms of the view into the HoloLens world, one of the things that is immediately apparent, especially when walking around the room, is the narrower-than-reality field of view. It’s a bit like staring through a large window that moves with your head. The HoloLens doesn’t give you peripheral vision, yet. But make no mistake, it’s still pretty incredible, and more appropriately immersive applications on the device, like games, place sounds in the periphery via spatial audio technology embedded in the headworn device to key users into the parts of the world they can’t immediately see. I played a robot invasion game that took advantage of this, keying me in sonically as to where my enemies were firing from, so that I would rotate my field of view manually. Those cues gave me enough warning to duck beneath fireballs en route. It was mesmerizing staring up from a crouched position as they passed over my head, leaving trails of flame that quickly dissipated along their paths.

Another interesting thing about games on this device is their ability to spatialize elements in accordance with real-world objects, taking advantage of walls (e.g. robots breaking in from the outside) and floors (e.g. in another game, detective scenes unfolding). An added function to aid in spatial awareness was being instructed to look around, giving the system more detail about the space. When I did so, I witnessed line-of-sight meshes rendering on furniture and other surfaces, a metaphorical loading graphic of accumulating metadata for a previously mysterious metaverse. So meta. For those who have experienced the Tango tablet, it felt similar to the demo that reconstructs a user’s world out of voxels.

So, in summary, my first direct impression of the HoloLens [1] is that it is amazing, and a huge first step for a head-worn, upscale consumer device in the mixed reality space. A 45-minute experience managed to blow my mind even more (and also made me far less nauseous) than the cumulative hours of musing and experimenting with DeepDream last year, if that’s any basis for comparison. If you’re a designer or developer, definitely try a HoloLens [1] as soon as you have the opportunity. It really is the wave of the present.

Spring, 2016


Tim Stutts

Product designer, leader and innovator writing about emerging technologies, moonshots and lessons learned. Occasionally sharing science fiction. http://stutts.io