ROLE: EXPERIENCE DESIGNER, MAY 2016 - AUGUST 2016
As the only company in the world that lives and breathes the entire VR ecosystem, from hardware to platform to content, Oculus is leading the charge in making VR the best version of itself it can possibly be. I spent a summer at Oculus enhancing design processes and building prototypes that explored different VR interactions.
Design teams in VR are typically made up of two categories of designers: product designers (with a background in 2D design) and product specialists (game engine experts with a design sense). This split produces two sources of truth for a fleshed-out design, one in 2D and one in 3D. That is why there has never been a greater need to build VR tools that fit seamlessly into the organizational workflow, so that the entire team can speak a common language and work directly in 3D space.
For the most part we view tools as a medium of expression; that is, we think of the designer as being in control of the tool. However, for better or worse, that control flows in the opposite direction as well: the tools we use influence not only how we design but also the design itself. If we concede that influence, it becomes almost critical that designers work in 3D space, because it is no longer just a workflow optimization. The tools shape the product itself, because they shape the way we think.
As I started to explore what a useful tool would look like, I realized that anything we build constrains what a designer can express. Moreover, there is an inverse relationship between how easy a tool is to use and how much you can accomplish with it. Any centralized, library-driven tool that abstracts away the game engine entirely runs the risk of imposing too many constraints and failing to scale.
Instead, the tool should build on the development paradigm that Unity already encourages: a component-driven approach. At a high level, every product can be viewed as a set of interaction-effect couplings. By interaction I mean a user event, such as a gaze or a touchpad swipe. Every event has an associated effect, such as highlighting an image or moving to the next page. However, rather than hiding the wiring of interactions and effects inside a library, a better approach is to expose each of them individually as components.
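To make the coupling idea concrete, here is a minimal, language-agnostic sketch of the component model described above. In Unity these would be C# components; every name here (Interaction, Effect, HighlightEffect, and so on) is illustrative, not an actual Oculus or Unity API. The point is that each event and each effect is its own small component, and the designer wires them together directly.

```python
# A hypothetical sketch of interaction-effect couplings as components.
# An Interaction is a user event (e.g. gaze, touchpad swipe); an Effect
# is what happens in response (e.g. highlight, go to next page).

class Effect:
    """Base component: something that changes an object's state."""
    def apply(self, target):
        raise NotImplementedError

class HighlightEffect(Effect):
    def apply(self, target):
        target["highlighted"] = True

class NextPageEffect(Effect):
    def apply(self, target):
        target["page"] = target.get("page", 0) + 1

class Interaction:
    """A user event that can be wired to any number of effect components."""
    def __init__(self, name):
        self.name = name
        self.effects = []

    def wire(self, effect):
        self.effects.append(effect)
        return self  # allow chained wiring

    def trigger(self, target):
        for effect in self.effects:
            effect.apply(target)

# Wiring in the designer's terms: "on gaze, highlight" and
# "on touchpad swipe, move to the next page".
button = {}
on_gaze = Interaction("gaze").wire(HighlightEffect())
on_swipe = Interaction("touchpad_swipe").wire(NextPageEffect())

on_gaze.trigger(button)
on_swipe.trigger(button)
print(button)  # {'highlighted': True, 'page': 1}
```

Because each interaction and each effect is exposed as its own component, new couplings are added by composing existing pieces rather than by extending a monolithic library, which is the scaling property argued for above.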