Bringing XR into a Design Practice — My Own Story

Tim Stutts
7 min read · Aug 18, 2023

A product designer with entry-level experience in UI/UX design recently asked me how to get a job in AR/VR/XR. Great question!

For part one of this article, I'll share my own trajectory in this area. For me it happened in a roundabout way, in a previous era, but there are useful takeaways from my story that I believe still apply today. I'll summarize these at the end, along with a bit of foreshadowing of a pending part two. Here's my story.

My expansion into the spatial computing realm began over a decade ago, when most AR/VR/XR technology was still deep within the research labs of large tech companies and academic institutions. Around that time I was doing traditional, flat UI/UX design work for desktop, web, and mobile, first for a big tech company and then a startup. There were no commercially available AR/VR headsets. The core technology necessary to drive augmented reality content on phones and tablets didn't exist yet either. However, other technologies and forms of expression of the day allowed a designer to start exploring a spatial direction, if they were so inclined. Game engines like Unity let designers develop 3D prototypes, even if the end projection was on a 2D TV, computer monitor, or portable LCD. In my case, I held off on Unity at first and instead explored creative coding platforms like Processing and OpenFrameworks, which offered designers an easier entry point into coding up exciting 3D applications. I had loved using these platforms in my design master's program, where I made hundreds of code-based animations and little interactive applications, but post-graduation, in my Interaction Designer day job, I hadn't yet found a way to put them to use. I eventually left the full-time work world for a bit and struck out on my own as an independent design consultant, billing myself as a hybrid interaction designer, prototyper, and sound designer. This led to some traditional, more stable UI/UX gigs, which gave me the economic safety to search for other projects that were more on the fringe.

One such project, and what in retrospect I would call my breakthrough into augmented reality themes, was an IBM Smarter Planet commercial that used Processing and OpenFrameworks to overlay abstract visualizations of energy flowing through a city: intricate, code-based, animated vector graphics that traditional 3D animation tools of the day, like Maya and Cinema 4D, could not easily achieve. There was no 3D display technology in place per se, but getting this content to track to real-life video footage as the camera panned around involved meticulously tracking the world mesh for anchoring the content, often manually and frame by frame (thankfully that part was not my job). Years later that process would become real-time, automated, and taken for granted with the inside-out tracking of mixed reality headsets.

IBM Smarter Planet “Energy” commercial featuring augmented reality visualizations (2010)

Another leap into an augmented reality theme occurred a couple of years later, on the input side. Microsoft had recently released the Kinect, an RGB/depth camera system that allowed for, among other things, body tracking. Early Kinect games on the Xbox platform tracked the position of the hands, letting a user drift a hand over a target and dwell to select a button, or punch opponents in a boxing match, all displayed on a flat screen. But Kinect wasn't limited to Xbox: Microsoft exposed the data so that developers could work with it on a variety of platforms. Oblong Industries, a company whose founder had worked with Spielberg on conceptualizing the gestural technology in the film ‘Minority Report’, was exploring commercially viable ways to use hand tracking, rather than gloves, as input for a variety of applications. Though Kinect out of the box didn't track individual fingers at the time, Oblong put a team of computer vision engineers on the problem. I was brought on to define, design, and develop a prototype on top of their technology that inspired joy with hands. After settling on building a music sequencer, I put my traditional UX/UI skills to work wireframing a flow of screens and interactions. Eventually, with the kind support of engineers at Oblong, I developed a Cinder prototype that allowed a performer to move and play back audio samples on a timeline via hand position and gestures. While the graphical content was still flat and TV-bound, the input was embodied, hinting at the hand-tracking future that mixed reality headsets would one day make possible.

Oblong “Airborne Beats” hand gesture controlled music sequencer application (2012)
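For readers who like concrete details: the dwell-to-select pattern described above boils down to accumulating hover time while a tracked hand stays over a target, then firing a selection once a threshold passes. Here is a minimal C++ sketch of that logic, in the spirit of the Cinder/OpenFrameworks tooling mentioned in this story. It is my own simplified illustration, with made-up names and numbers, not Oblong's or Microsoft's actual code.

    // Minimal sketch of dwell-to-select: hover a tracked hand over a
    // target and hold it there to select. Illustrative only; names and
    // thresholds are hypothetical, not from any shipped Kinect title.
    #include <cmath>
    #include <cstdio>

    struct DwellTarget {
        float x, y, radius;      // screen-space position and hit radius
        float hoverSeconds = 0;  // time the hand has dwelled so far
        bool  selected = false;

        // Call once per frame with the tracked hand position and frame delta.
        void update(float handX, float handY, float dt, float dwellTime = 1.0f) {
            bool over = std::hypot(handX - x, handY - y) < radius;
            hoverSeconds = over ? hoverSeconds + dt : 0.0f; // reset when hand exits
            if (!selected && hoverSeconds >= dwellTime) {
                selected = true;
                std::printf("target selected after %.2fs of dwell\n", hoverSeconds);
            }
        }
    };

    int main() {
        DwellTarget play{100.0f, 100.0f, 40.0f};
        // Simulate ~1.5s at 60fps of a hand resting near the target's center.
        for (int frame = 0; frame < 90; ++frame)
            play.update(105.0f, 98.0f, 1.0f / 60.0f);
    }

A real system would pair this with visual feedback, such as a ring that fills as the dwell progresses, so the user knows a selection is underway.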

The following year, one decade before the publication of this writing, is when I finally landed what I would consider my first UI/UX design project using augmented reality display technology. A friend whom I had initially worked with on the IBM Smarter Planet commercial campaign a few years prior connected me with a research engineer at Honda who led a team developing software for a head-up display (HUD) projected onto an automobile windshield. The plan was to use spatial camera technology and computer vision to track the position of objects around a car, and a separate camera system inside the car to track the driver's head position, so that augmented reality overlays could be positioned on the windshield aligned to the driver's line of sight. I got to work sketching out designs for situational awareness use cases: specifically, visualizing objects intercepting the driver's path (e.g. deer), braking vehicles ahead, and speeding vehicles approaching from behind. I then created a fully tunable HUD prototype in OpenFrameworks to showcase these concepts, some of which the engineers incorporated into a prototype in an actual car.

Sketch of Honda automotive windshield HUD “Situational Awareness” prototype (2013)
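For the technically curious, the line-of-sight alignment described above reduces to a classic bit of geometry: find where the ray from the driver's eye to a tracked object crosses the windshield, and draw the overlay there. Below is a minimal C++ sketch of that ray-plane intersection under simplifying assumptions (a flat windshield plane and no guard against a ray parallel to the glass); the names and values are hypothetical, not Honda's implementation.

    // Minimal sketch of line-of-sight HUD registration: project the point
    // where the ray from the driver's eye to a tracked object crosses a
    // flat windshield plane. A simplification for illustration; a real
    // system would also handle glass curvature and calibration.
    #include <cstdio>

    struct Vec3 { float x, y, z; };

    static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // Intersect the eye->object ray with the plane defined by a point on
    // the glass and its normal; the result is where to draw the overlay.
    Vec3 hudPoint(Vec3 eye, Vec3 object, Vec3 planePoint, Vec3 planeNormal) {
        Vec3 dir = sub(object, eye);
        float t = dot(sub(planePoint, eye), planeNormal) / dot(dir, planeNormal);
        return {eye.x + t * dir.x, eye.y + t * dir.y, eye.z + t * dir.z};
    }

    int main() {
        Vec3 eye   = {0.0f, 1.2f, 0.0f};   // driver head position (meters)
        Vec3 deer  = {2.0f, 0.8f, 20.0f};  // tracked hazard ahead of the car
        Vec3 glass = {0.0f, 1.0f, 0.8f};   // a point on the windshield plane
        Vec3 n     = {0.0f, 0.0f, 1.0f};   // plane normal (vertical glass, simplified)
        Vec3 p = hudPoint(eye, deer, glass, n);
        std::printf("draw overlay at (%.2f, %.2f, %.2f)\n", p.x, p.y, p.z);
    }

Because both the head position and object positions update every frame, this intersection would be recomputed continuously so the overlay stays locked to the driver's view as they move.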

In the three years that followed, I left contracting and worked full-time at IBM Watson as a concept designer, first within an innovation lab and eventually in a different lab focused specifically on data visualization. Leadership of the former was dying for solid AR/VR technology, but nothing could yet deliver on that promise, so we resorted to huge 88” touch screens for interactions and tried, at the insistence of our director, to achieve as much realism in 3D content on a flat display as we could muster via Unity and Three.js prototypes. They even brought in an olfactory projector! Towards the end of my time at IBM, my contact at Honda reached out to me, having recently moved to a startup called Magic Leap, which, if you're reading this article, needs no introduction. I applied, got a position as a prototyper, and over the next four years shifted more in the design direction, working on input, the operating system, and other aspects of the platform. It was my first full-on augmented reality job that led to a tangible product, and the rest is history.

Magic Leap 1 Lumin OS “World Understanding” object recognition visualization (2019)

Now for those useful takeaways I said I'd get to at the end, which I believe still apply to UI/UX designers trying to find their way into AR/VR/XR today. Here they are:

  • Be open in terms of the platforms you work on to get from point A to point B. You will find many more job options if you consider projects that involve other emerging technologies, like AI, custom hardware, even novel web technologies. The UI/UX does not always need to be spatial. You will also meet talented people at those companies who are pushing boundaries and may connect you to other opportunities down the road. Some of them might be in spatial computing. That's the way it worked for me.
  • Don't pigeonhole or limit yourself to just being an AR/VR/XR designer. I have a lot of experience in this realm, but it's not all I've done, am doing, or will do. These days, though I'm mostly directing and managing design teams, when I work as an individual contributor I think of myself as a Product UI/UX designer. That's a broader bucket than AR/VR/XR, though not too broad. Spatial computing can be just one of your focal points. I also still enjoy working on flat applications, which satisfy the vast majority of software application use cases.
  • Learn tools that are unorthodox to UI/UX and find ways to bring them into your design practice. For me, back in the day, these were creative coding platforms. Most people I knew when I started out who dabbled in both UI/UX and creative coding kept those worlds separate. I worked to bring them together, because I saw the value they could provide to my design projects and processes. Find or develop something unique in your design practice aside from traditional UI/UX. It can set you apart from other job candidates and lead to exciting opportunities.

To get back to the original question: for those working in UI/UX design who want more specifics on how to get into AR/VR/XR in today's world, I promise to interview a designer who is newer to the space in a follow-up article. Stay tuned.


Tim Stutts

Product designer, leader and innovator writing about emerging technologies, moonshots and lessons learned. Occasionally sharing science fiction. http://stutts.io