Microsoft HoloLens – September 2013 to December 2015 (2 years, 4 months)
After Ascension, I was feeling a bit burnt out and sensed the studio was headed for major changes. During my break, I looked for something fresh, something that could reignite my curiosity and push me in new ways. That led me to a secret AR project in Redmond that felt genuinely revolutionary. Jorg Neumann, known for leading much of Kinect’s content, brought me onto the New User Interface team. I reunited with my old Oddworld boss, Jeff Brown, and joined a crew of incredibly talented senior folks I was eager to learn from as we tackled an ambitious vision together.
Microsoft operated quite differently from Sony. The development process in publishing leaned heavily into executive- and production-led decision-making, rather than the bottom-up creative flow I was used to at Santa Monica. I found myself spending more time pitching ideas to leadership and proving their value – a shift that felt a bit grounding, especially since those calls used to fall under the Creative Director’s domain. In hindsight, it made sense; my last project may have had a little too much creative freedom. Also, fun bonus: we had actual offices. I hadn’t had one to myself since my days in the old Atari building at Midway. Being able to shut the door and focus? That was a welcome change.
Fragments – Design Director (2013 – 2014)

When I joined NUI, my first team was very small: just me and Matthew Hoesterey designing, with Matt Turnbull as our producer. Asobo Studio in Bordeaux did all the heavy lifting on the implementation front. Our job was to figure out what the game was, how you played it, what its mechanics were, and how to get it running on a totally different type of device with less power than a cell phone. In the early going, this was tough across all the projects. Most people collaborated on whiteboards, wrote design docs, visualized in Photoshop – you know, the normal office suite of collaboration and creative tools.
What we ended up doing was super lo-fi, but it allowed us to figure out the game. Matthew and I would have legendary brainstorm sessions. We often opened an editor to see things in 3D, but when we were trying to figure out how you play, we frequently stuck notes on the table or the wall while discussing a concept. Finally, while trying to show how the audio tool worked, I decided to mock it up with nothing but sticky notes, placing them all over one of the meeting areas on our floor and acting as if getting close to one would trigger the activation range of its positional sound. This turned out to be a breakthrough – this was how we could mock up all of the crime scenes! We started by placing sticky notes around a room and snapping photos, each note representing a piece of evidence the player would use to build a solution. Once Hoesterey devised a clever filtering system to help make sense of it all, the core gameplay fell into place. And honestly? It turned out to be a solid experience. Too bad only a few hundred people ever got to play it!
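The behavior we were faking with sticky notes boils down to a simple distance check against an activation radius. Here’s a minimal, illustrative sketch in Python (the class and names are hypothetical, not anything from our actual tools) of a positional sound that turns on when the player steps inside its range:

```python
import math

class AudioTrigger:
    """A positional sound source with an activation radius –
    the digital version of a sticky note on the meeting-room wall."""

    def __init__(self, position, radius):
        self.position = position  # (x, y, z) in meters
        self.radius = radius      # activation range in meters
        self.playing = False

    def update(self, player_position):
        """Toggle the sound based on whether the player is in range."""
        distance = math.dist(self.position, player_position)
        self.playing = distance <= self.radius
        return self.playing

# A tour-guide clip that activates within 1.5 meters
trigger = AudioTrigger(position=(2.0, 0.0, 3.0), radius=1.5)
print(trigger.update((0.0, 0.0, 0.0)))  # too far away -> False
print(trigger.update((2.0, 0.0, 2.0)))  # within range -> True
```

Walking around the room with sticky notes was, in effect, debugging this check with our feet before anyone wrote a line of code.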
Here are a few learnings from my time on Fragments:
- Developed the core loop, mission, and gameplay design for a crime-solving AR experience
- Part of an incubation team that prototyped new and challenging ideas on the HoloLens platform
- Designed the investigation tools players used to produce evidence
HoloTour – Design Director (2014 – 2015)

I jumped over to HoloTour, an existing experience developed by another team. It was the only VR-on-AR experience for the device, which proved quite challenging to make work given the device’s very tight field of view – most VR experiences need a far wider field of view than we could provide. That said, the team before me had already mocked up an experience where the player goes on a guided tour of Rome and learns some things along the way, air-tapping on objects in the environment to play animations or hear audio excerpts from a tour guide. It was interesting, just a little hard to see.
My job was to do the next tour, and we moved on to Machu Picchu. Despite never having traveled there myself, we put together a tour based on the footage the prior owners had captured on their visit to the ancient site. We placed areas of interest in a 3D editor (Unity) where players would interact, wrote a few informative yet fun lines for each interaction, and had the robovoice read them back to the player. I think we ended up with four panoramic captures with tour information blocked out. It worked, but something was missing – thankfully we figured it out, thanks to a little 3D butterfly in the Aguas Calientes pano. The trick wasn’t the pano, it was the near-space AR objects – kind of an aha moment for people developing on an AR device, but it was true. With this knowledge in hand, we leveraged it as much as possible, surrounding the player with interesting AR objects they could get close to and inspect from different angles. We put the player in an AR balloon and let them look straight down through its floor – the reactions were priceless. Finally we capped the tour with an epic storm over Machu Picchu, courtesy of the player calling on the gods to make it rain.
In the end, here is what I did on the project, encapsulated in a handy bullet-point list:
- Led an internal team and external partners, shipping the first VR experience on HoloLens
- Directed VO sessions, wrote script and refined the narrative with a writing partner
- Designed and implemented the UX/UI, tour locations, and scripting for the Machu Picchu tour