Ryan Betts
Virtual Reality Pop
18 min read · Oct 7, 2016


Practical VR: A Design Cheat Sheet. Last updated: Jan 6th, 2017. Added link to Andy Stone’s VR typography piece.

Preamble: my work heavily skews towards untethered / mobile platforms, and practical applications rather than entertainment, so this cheat sheet will be biased that way. Oh, and if you want some professional help, there’s info on how to hire my brain at the end of the article :)

Why I made this.

There are a ton of great talks and blog posts about design for VR out there in the world. Watching them all and distilling the most important info is time consuming. I wanted to share my notes. I welcome feedback. I intend to keep this as up to date as I can, as things change, and would love all the help I can get.

If you have something you think I should add, or would like more information on, please don’t hesitate to write a response. There’s a lot more that I haven’t had a chance to put into this cheat sheet yet. Your feedback will help me prioritize.

In this cheat sheet:

  1. A Quick Primer
  2. Guidelines
  3. Process
  4. Tools
  5. Glossary
  6. Platforms
  7. Other resources
A Quick Primer: a short afternoon lesson plan

New to VR?

This is a quick 1–1.5hr itinerary to help orient you to the most important things you need to know about VR. Unfortunately, there is no way to truly get oriented without access to a VR system directly, and the most accessible and soon-to-be popular platforms require an Android phone. I recommend getting one that is Daydream Ready. Anything you attempt to learn about VR before trying it is going to be wasted time.

Talking about VR is like dancing about architecture. You need the experiential framework to hang that learning on.

1. Foundational Design Principles (~20 mins)

Cardboard Design Lab is a first person approach to experiencing foundational principles of VR design

The Google Cardboard Design Lab app on Android is hands down the best way to internalize the foundational principles of good VR design as we understand them today.

2. Thinking about Ergonomics and Process (~20 minutes)

Mike Alger is one of the best voices for UX in VR.

Mike Alger covers the important things you need to know about ergonomics in VR, has some examples of ways to think about design solutions, and includes a few thoughts on design process and documentation for teams.

3. Beyond the Basics (~30 min)

Lastly, take a look at some of the really creative exploration being done by the Daydream Labs team. Broken into 3 parts: Interactions, Immersion, and Social. These experiments hint at the amount of ground still to be covered in developing mature design patterns. Lots of fertile ground.

x. Coda

So, welcome to design for VR.

Before you continue, I think it is important for me to emphasize that a lot of the above content is focused on presence-dependent applications: entertainment and gaming. And at the risk of stating the obvious, I want to reinforce:

There are fundamental differences between presence-dependent applications and task-focused applications.

The UI you design for a game is never going to have the same needs as the UI you design for a productivity tool. My guess is that misapplying conventions from one to the other will be the source of most friction in VR user experiences during these early days.

Interaction Design Guidelines to keep you on the path.

General Principles

From Designing for Daydream, Google I/O 2016: https://youtu.be/00vzW2-PvvE


  1. Optimize for performance.
    Keeping the frame rate above 60fps is paramount, and ensuring the frame rate stays stable is key. Without this, you risk motion sickness. The human vestibular system is a fickle beast. This means discussing with your development team what the aesthetic limits of your world are. Lower fidelity and higher stability is better than higher fidelity and lower stability. PSVR, being lower resolution but running at 120fps, is a good example of this at a hardware level.
  2. Prioritize comfort.
    Fitts’s law is in full force in VR. Ensure that users can employ an economy of motion: cluster actions that are used together (i.e. next/prev), make objects magnetic or snap-to-grid, etc. Know beforehand whether your application is better suited to seated or standing use, or requires full 360° rotation.
  3. Prioritize ease of learning.
    Because there are no precedents for how people should interact with things in VR, there is little room for UI shorthand. This means ensuring that everything provides clear feedback, interactions are explained through action rather than text instruction, and that key concepts are introduced in a timely fashion. I know I said I’m “not interested in games” but Land’s End is the best example I’ve seen of this so far; the game is incredibly easy to learn and there is only one line of text to instruct you on what to do.
  4. Avoid being overly literal.
    We’ve spent the better part of a decade undoing over the top skeuomorphism. The first instinct many people have in VR is to make everything behave like real world counterparts. We need not recreate all the minutiae of everyday existence — a pickle jar doesn’t need to be as difficult to open in VR as it is in real life. Use cues from the real world where helpful, but take advantage of the fact that the physics and character of a VR environment are flexible.
  5. Sound is a critical detail.
    This is hard to overstate, and it is the largest departure from what we’re used to in software. People will not be multi-tasking in VR (at least not yet) and they will certainly not be doing anything else in the real world while they are in VR. They are going to be wholly consumed by VR, with their attention fully dedicated to the task at hand. Sound helps them situate themselves and focus on that task. It is also one of the fundamental ways that you can provide feedback to the user.

“Hearing is a form of vision that gives you the entire scene all at once”
— Alex Faaborg (Google), Voices of VR episode #423

The Camera (The user’s perspective)

  • Do not attach things to the camera.
    It gets uncomfortable if you cannot look away from something. Think of what it’s like when there’s a smear on your glasses.
  • Do not accelerate or decelerate the camera.
    Keep the camera moving at a constant speed; acceleration and deceleration will make a user feel uncomfortable. As a rule of thumb (read “>” as “is preferable to”): forward > backward, up / down > strafe left / right, fast camera cuts > gentle camera rotations. See the sketch after this list.
  • Try to match the height of the user’s eyes.
    Is the person short or tall? Think of the perspective difference between a young child and the world’s tallest person. This is much easier with positionally tracked systems, but can be considered in advance when positional tracking isn’t available.
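To make the constant-speed rule concrete, here’s a minimal Three.js-flavoured sketch (TypeScript; the function name and speed constant are my own) that advances the camera at a fixed velocity each frame, with no easing curve:

```
import * as THREE from 'three';

const clock = new THREE.Clock();
const CAMERA_SPEED = 1.5; // metres per second, fixed, with no easing

// Call once per frame from the render loop.
function updateCamera(camera: THREE.PerspectiveCamera): void {
  const delta = clock.getDelta(); // seconds since the last frame
  const forward = new THREE.Vector3();
  camera.getWorldDirection(forward);
  // Constant velocity: the distance moved scales only with frame time,
  // never with an acceleration curve or a lerp/ease function.
  camera.position.addScaledVector(forward, CAMERA_SPEED * delta);
}
```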

UI Elements

Positioning:

The goldilocks zone for UI elements, ripped from Mike Alger.

Anything closer than 0.5m is difficult to focus on. Anything further than 20m will lose its depth or “3D-ness”. Due to the optics in current screen-based VR displays, the eyes are focused at a fixed distance of about 2m. Objects will feel most comfortable 2m–10m away from the user.

A great template for the different content zones, by Mike Alger.

As far as head rotation goes, Samsung Interaction Designer Alex Chu has done research into comfortable ranges of motion (there’s a config sketch of these numbers below):

  • Left/right: up to 30° for comfort, 55° maximum.
  • Up: up to 20° for comfort, 60° maximum.
  • Down: up to 12° for comfort, 40° maximum.

These guidelines are excellently visualized below by Google’s Mike Alger.
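If it helps to have these numbers in code, here’s a hedged sketch (the naming is mine) that combines Chu’s rotation ranges with the 0.5m/2m/10m/20m distance zones above:

```
// Comfort ranges from Alex Chu's research plus the distance zones above.
// Angles are in degrees, distances in metres. Naming is my own.
const COMFORT = {
  rotation: {
    horizontal: { comfortable: 30, max: 55 },
    up: { comfortable: 20, max: 60 },
    down: { comfortable: 12, max: 40 },
  },
  distance: { min: 0.5, idealNear: 2, idealFar: 10, max: 20 },
};

// Is a target at (yaw, pitch, distance) inside the comfortable zone?
function isComfortable(yawDeg: number, pitchDeg: number, distanceM: number): boolean {
  return (
    Math.abs(yawDeg) <= COMFORT.rotation.horizontal.comfortable &&
    pitchDeg <= COMFORT.rotation.up.comfortable &&
    pitchDeg >= -COMFORT.rotation.down.comfortable &&
    distanceM >= COMFORT.distance.idealNear &&
    distanceM <= COMFORT.distance.idealFar
  );
}
```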

And for roomscale experiences, here is a comparison of recommended ‘room’ dimensions across the Rift and Vive:

http://www.roadtovr.com/oculus-touch-and-htc-vive-roomscale-dimensions-compared-versus-vs-visualized

Interaction:

  • Focus and active states are necessary.
    I shouldn’t have to point this out by now, but nothing in VR is obviously actionable, so either the object or the reticle should indicate when something actionable is focused upon. Sounds are helpful to support this, especially on the active state.

Motion:

  • Avoid lateral motion with close/large objects, and at high speeds.
    If you think motion is appropriate, short/slow steps towards and away from the camera are much easier to stomach.
  • Avoid fast motion toward the user. Remember what you do when something is moving quickly towards your face: you duck. Especially for practical applications, this is probably not the instinct you want to trigger in your users.
  • Be careful about moving the user. If you need to move the user in 3d space (which really means move the camera), don’t use acceleration; linear motion only. The best option is to instantly move them from one position to the other (called “teleporting” them). The same preferences as in the camera section apply: forward > backward, up / down > strafe left / right, fast camera cuts > gentle camera rotations.

Reticles:

  • Should be rendered at the same distance as the content the user is focusing on (see the sketch after this list).
  • Have a hover state and an active state.
  • In most cases, they should be shown only when near, or on top of, an interactive entity.
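Here’s a minimal raycasting sketch of the first rule (TypeScript with Three.js; the names are mine): cast a ray from the centre of the view and push the reticle out to the depth of whatever it hits.

```
import * as THREE from 'three';

const raycaster = new THREE.Raycaster();
const CENTER = new THREE.Vector2(0, 0); // middle of the view, in NDC
const RESTING_DISTANCE = 2; // metres; the comfortable default focal distance

// Call once per frame. `interactables` should contain only objects that can
// be acted upon (and never the reticle itself, or it will occlude the ray).
function updateReticle(
  camera: THREE.Camera,
  interactables: THREE.Object3D[],
  reticle: THREE.Object3D
): void {
  raycaster.setFromCamera(CENTER, camera);
  const hits = raycaster.intersectObjects(interactables, true);
  const distance = hits.length > 0 ? hits[0].distance : RESTING_DISTANCE;

  // Render the reticle at the same depth as the content so both eyes
  // converge on it; otherwise the user sees it doubled.
  const forward = new THREE.Vector3();
  camera.getWorldDirection(forward);
  reticle.position.copy(camera.position).addScaledVector(forward, distance);

  // Only show the reticle when it is over something interactive.
  reticle.visible = hits.length > 0;
}
```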

Text:

https://medium.com/emerson-stone/designing-user-interfaces-for-virtual-reality-ea04d4935f6a

Larger is better.

  • Avoid showing text on white and translucent backgrounds.
  • Because current HMDs are about 13 PPD, text should be ~1.5° tall, or roughly 20px on most current displays.
Again, from the amazing work done by Mike Alger.

Text size can be calculated using the equation below. I won’t try to explain it here. Hopefully someone (maybe me) builds a handy tool that does the calculation, so I can just link to it.

This is a calculation you can use to determine a comfortable text size.
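Until that tool exists, it’s worth knowing that the underlying math is the standard angular-size relation, nothing VR-specific: height = 2 × distance × tan(angle / 2). A minimal sketch, assuming the ~1.5° guideline above:

```
// World-space height (metres) for text that should subtend `angularDeg`
// degrees when viewed from `distanceM` metres away.
function textHeightMetres(distanceM: number, angularDeg: number): number {
  const angularRad = (angularDeg * Math.PI) / 180;
  return 2 * distanceM * Math.tan(angularRad / 2);
}

// The ~1.5° guideline at the 2m resting focal distance:
textHeightMetres(2, 1.5); // ≈ 0.052m, i.e. text roughly 5cm tall
```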

Environment

Grounding:

  • Keep users grounded. This is meant literally. Create a floor surface that confirms for the person that they are on solid ground.
  • Frame the important content. The environment should be designed to direct a user’s gaze towards the important content.

Wayfinding:

  • Default the user’s orientation to the most important item of interest.
    You’ll often find that you need to transition a user from one scene to another. Ensure that when you do, you reorient the new scene to line up with where the user is looking.
  • Use sound, motion, light and colour to direct the user’s attention.
    We know from the language of cinema that motion, contrast and colour are the best visual ways to direct someone’s attention. But in VR, we have the unique problem of someone potentially having their back turned to the important element. Ideally, you’ve designed to reduce the chances of this happening, but if it does occur, sound is the best method for coaxing the user into turning their head to face it (see the sketch below).
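Positional audio makes this cue almost free to implement. A hedged sketch with Three.js (TypeScript; the file name is a placeholder):

```
import * as THREE from 'three';

// Attach an audible cue to the object we want the user to turn toward.
// Positional audio pans and attenuates as the head rotates, so the user
// can locate the source even with their back turned to it.
function addAttentionCue(camera: THREE.Camera, target: THREE.Object3D): void {
  const listener = new THREE.AudioListener();
  camera.add(listener); // the listener rides on the user's head

  const cue = new THREE.PositionalAudio(listener);
  new THREE.AudioLoader().load('chime.ogg', (buffer) => {
    cue.setBuffer(buffer);
    cue.setRefDistance(1); // distance at which volume begins to fall off
    cue.play();
  });
  target.add(cue);
}
```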

The sky / background:

Your 3d environment will need a sky; this is the equivalent of the background. It will usually be either a sphere (with a panorama texture in an equirectangular or similar projection) or a cube with a texture in 6 parts “folded” onto the inside of the box.
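For reference, here’s roughly what each approach looks like in Three.js (TypeScript; the texture file names are placeholders):

```
import * as THREE from 'three';

// Option 1: an equirectangular panorama on the inside of a large sphere.
function addSkySphere(scene: THREE.Scene): void {
  const texture = new THREE.TextureLoader().load('panorama.jpg');
  const sky = new THREE.Mesh(
    new THREE.SphereGeometry(500, 60, 40),
    // BackSide renders the inside faces, since the camera sits at the centre.
    new THREE.MeshBasicMaterial({ map: texture, side: THREE.BackSide })
  );
  scene.add(sky);
}

// Option 2: a cubemap, six textures "folded" onto the inside of a box.
function addSkyCube(scene: THREE.Scene): void {
  scene.background = new THREE.CubeTextureLoader().load([
    'px.jpg', 'nx.jpg', 'py.jpg', 'ny.jpg', 'pz.jpg', 'nz.jpg',
  ]);
}
```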

Check out Tessa Chung’s excellent in-depth series on skyboxes for more information.

Design Process in VR requires a lot of experimentation.
Do More, Plan Less. From Alex Chu’s Samsung Developer Conference talk.

Watch this video from the Google VR team from 23:00–34:00 to hear about Daydream Labs’ approach to VR product design process:

  1. Hand sketching.
    This should be the only solely 2D step in your layout and interaction design process.
  2. Volumetric Layouts.
    Bring your ideas into the VR world as fast as possible to test the comfort of your volumetric layouts. It is okay to cheat and use flat 2D designs from Sketch as textures to approximate UIs. Do not use your laptop screen as a proxy; actually evaluate these in an HMD. At the earliest phases, you should probably use a technique called “grey boxing”, where you use flat-shaded objects to rough in the layout, similar to wireframing for 2D interfaces (see the sketch after this list). Once you are confident in the comfort of the layout, then you can focus on improving the fidelity.
  3. User Test!
    Test early and often with real users. It is important to test a cross-section of people, not just users with existing VR experience. It is likely that someone trying your app for the first time will also be trying VR for the first time.
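Grey boxing is cheap to retrofit onto an existing scene, too. A minimal sketch with Three.js (TypeScript; naming is mine) that strips every mesh down to flat grey so you can judge the layout without being distracted by fidelity:

```
import * as THREE from 'three';

const greyMaterial = new THREE.MeshBasicMaterial({ color: 0x888888 });

// Walk the scene graph and swap every mesh's material for flat grey.
function greyBox(scene: THREE.Scene): void {
  scene.traverse((obj) => {
    const mesh = obj as THREE.Mesh;
    if (mesh.isMesh) {
      mesh.material = greyMaterial;
    }
  });
}
```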
Design Tools unique to VR.

The design tool kit for VR is incredibly nascent. There is no silver bullet to accomplishing a clean design process where all of the tools work together. Be prepared to get your hands dirty. Much tedium lies ahead.

Prototyping & Layout

90% of your designing, at least in these early days, is going to have to be prototyping. There’s just not enough shorthand. So here are some tools I’ve found to be good and bad for those purposes.

Note: This list is definitely incomplete. I don’t yet own a Vive, which has some tools great for the design process such as Tilt Brush. On Oculus, there are Medium and Quill. Both Unity and Unreal have immersive construction environments for room-scale VR platforms that they are in the process of releasing as well.

Great

Nothing meets my definition of a great prototyping / layout tool yet. Hope this changes soon.

Here’s an article about the latest news regarding Unity’s in-VR editor: http://uploadvr.com/unity-finally-dates-vr-authoring-tool-next-month/

Expected launch is at the end of the year. No specific dates yet.

Good

  • A-Frame: Library for quickly mocking up simple interactive experiences in VR, built by Mozilla atop ThreeJS. Super valuable, with a very gentle learning curve to get started (see the sketch after this list). Hands down the best prototyping tool for small, cross-functional teams. In my case, I am a team of one with a background in web technologies, so this put everything within reach. The downside is that for phone VR use cases it is not low-latency enough to be used reliably for user testing.
  • Dayframe: A small boilerplate project I made to enable faster prototyping of Daydream VR experiences. It uses websockets so you can emulate the Daydream controller with a spare smartphone.
  • Unity: A full environment for creating cross-platform VR experiences. Pretty complex relative to the above 2 options, but much more complete and powerful. Because it’s good enough for most actual production work, this is for your high-fidelity prototypes. Massive learning curve, though.
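To give a taste of why A-Frame’s learning curve is so gentle, here’s a hedged sketch of a custom component (TypeScript against the global AFRAME object; the component name and behaviour are my own invention), which you can then attach to any entity in markup:

```
// Assumes A-Frame is loaded, which attaches a global AFRAME object.
declare const AFRAME: any;

// A tiny component that spins its entity at a configurable speed.
// Used in markup as: <a-box spin="speed: 45"></a-box>
AFRAME.registerComponent('spin', {
  schema: { speed: { type: 'number', default: 30 } }, // degrees per second
  tick(this: any, time: number, deltaMs: number) {
    const radians = (this.data.speed * Math.PI / 180) * (deltaMs / 1000);
    this.el.object3D.rotation.y += radians;
  },
});
```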

Less Good

  • Sketch-to-VR: Simple plugin for laying out ideas in Sketch and then quickly exporting them to an A-Frame scene. In practice, I’ve found this to be a very poor indicator of final impact, because it’s just images projected on a single flat cylindrical plane. Once you have a well developed intuition of what works and doesn’t work, this may be a bit more valuable as a quick first step for blocking out a scene. I have my doubts, though, as it removes far too much of the dynamism that VR enables.
  • FramerJS VRComponent: If you’re used to FramerJS, the best scriptable prototyping tool around, you can leverage what you already know and start prototyping “VR” experiences in a familiar environment. Biggest (and ultimately fatal) drawback is that it doesn’t actually support a stereo camera, so you can’t try your prototypes in actual VR — yet. I have my fingers crossed that will change soon.

Modelling

Sometimes you’ll need to generate models quickly. Most simple geometries are included in any tool you’ll be using (cubes, spheres, etc) but you will need more than that. I’ve found that .obj is the most portable format. Here is a list of good modelling tools to get you started.

  • MagicaVoxel: great for simple lego/minecraft-like modelling. Voxel-based models are to 3d what pixel-art is to 2d.
  • TinkerCAD: I’ve found this tool to be excellent for really quickly generating simple compound 3d shapes.
  • Blender: Higher learning curve, but great for when you want to get serious about creating complex objects.
  • SketchUp (note: you need an additional plugin in order to export .obj files)

Also, you can find plenty of models online. I’m sure that more asset stores will emerge to fill the needs not yet met. I’ll keep adding to this list.

  • Sketchfab: Great library of VR content that is actually browsable in WebVR. This is basically the Flickr of 3D content at the moment.
  • Thingiverse: this is largely for printable objects, but it’s still a rich source of 3d models. You’ll need to convert from .stl to .obj format.
  • Sketchup 3D warehouse: Tons of objects in both .skp (SketchUp’s file format) and .dae formats.
  • Unity / Unreal Asset Stores

User Testing

So, now that you have an idea, how do you user test it? There is only one VR focused platform for this right now: Fishbowl VR

There are some DIY setups that I’ll document later. You will want to record two things, though, and synchronize them:

  • A video of the real world user, interacting with the system, so that you can see the physical motions they are making
  • A screen capture of what they are seeing in the HMD

Some companies have developed what are called Mixed Reality capture setups so that the footage of the real-life person is composited into the virtual world. There are no easy solutions for this right now. Here is a good post on it:
http://www.kertgartner.com/making-mixed-reality-vr-trailers-and-videos/

Glossary: inside-baseball acronym-y shit that you gotta know

Equipment

HMD: Stands for Head Mounted Display. These are the VR goggles.

Tracking

Tracked: a tracked VR system is one where the position of the headset in 3d space is known, rather than just the orientation of the head. This allows the user to lean side to side / forward and backward, and crouch or stand taller.

Room scale: a room scale system is one where the tracking area is large enough that a user can move around freely in 3d space. ie. they can walk around the room.

Inside-out tracking: positional tracking where the sensors live on the HMD itself and look out at the environment, so no external base stations or cameras are required.

Optics

Display: the screen in the headset. All of the major systems are using OLED screens at the moment, hitting at least 400ppi.

Lenses: all HMDs require lenses to distort the image displayed on the screen and make it more comfortable for your eyes. Imagine how hard it would be to focus on a screen so close to your face!

Barrel distortion: the lenses introduce pincushion distortion, so the rendering engine pre-warps each eye’s image with the opposite, barrel distortion. The two cancel out, and the image looks correct by the time it reaches your eye through the lens.
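For the curious, this pre-warp is typically modelled as a radial polynomial. A sketch of the idea; the k1/k2 coefficients are lens-specific and come from the platform SDK, so the values here are placeholders:

```
// Classic radial distortion model: remap each pixel's distance r from the
// lens centre. k1/k2 are per-lens coefficients supplied by the headset SDK;
// the defaults below are placeholders, not real values for any device.
function barrelDistort(r: number, k1 = 0.2, k2 = 0.2): number {
  return r * (1 + k1 * r ** 2 + k2 * r ** 4);
}
```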

IPD: Inter-pupillary distance
The distance between a person’s pupils. Every individual’s is slightly different, and it affects the stereoscopic effect, which is what produces the illusion of depth.

FOV: Field of view
Measure of how much of your vision is occupied by the VR screen, measured in degrees, horizontally and vertically.

Units of Measure

PPD: Pixels per degree
In print we had DPI (dots per inch), on screens we have PPI (pixels per inch), and in VR we have PPD (pixels per degree). Knowing this per platform will help you ensure your designs are legible.
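A rough way to estimate PPD, assuming you know a platform’s per-eye resolution and FOV (this ignores lens distortion, so treat the result as a ballpark figure):

```
// Ballpark pixels-per-degree: horizontal pixels per eye spread across the
// horizontal field of view. Lens distortion means real PPD varies across
// the image, so this is an approximation.
function pixelsPerDegree(pixelsPerEye: number, fovDegrees: number): number {
  return pixelsPerEye / fovDegrees;
}

// e.g. a 1080px-per-eye panel behind a ~90° lens:
pixelsPerDegree(1080, 90); // = 12 PPD, in line with the ~13 PPD above
```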

Meters:
Everything you define in a scene will be measured in meters. This is true for WebVR (and thus A-Frame), Unity and Unreal. Sorry imperial countries, but metric is the standard for the metaverse.

Display Rates

In VR, you are concerned with the Frame Rate of the software, the Refresh Rate of display, and the Sampling Rate of the position, motion and orientation tracking of the headset and any controllers.

Frame rate: 60fps minimum, 120fps target
How many frames are rendered per second by the software. This is not constant for a piece of software; it depends significantly on the capabilities of the CPU and GPU running it. You want to write your software so that it hits a baseline of 60fps on any target hardware. At the moment, this is a serious limitation of VR in Android web browsers: Chrome is capped at 30fps.

Refresh rate: 60hz minimum, 120hz target
How many times per second the display refreshes. All modern VR platforms achieve this baseline. For smartphone VR, this is a spec that can help you define which devices you don’t support.

Sampling rate: ~100hz minimum, 1000+hz target
How many times per second is positional and orientation data sampled. Poor performance here is one of the root causes of motion sickness. All modern VR platforms achieve this baseline. The iPhone 6’s IMU (inertial measurement unit) has a maximum sampling rate of 100hz. Android phones vary wildly. To make matters worse, in Chrome for Android, the sampling rate seems throttled and experiences that require fast motion can quickly lead to motion sickness.

Time to Photons / Motion to Photons: 50ms at worst, 20ms recommended, 2ms target
The combination of the frame rate, refresh rate and sampling rate will give you the bulk of what can be called total system latency or Time To Photons: the time between the user doing something, and the results being shown on the display. Anything slower than 50ms can be disorienting. Anything 2ms or faster is entirely imperceptible.
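A back-of-envelope sketch of the worst case, counting one full interval of each stage end to end (real pipelines overlap stages and use tricks like asynchronous timewarp, so actual latency is usually better than this):

```
// Worst-case motion-to-photons: one full sampling interval, plus one full
// frame, plus one full display refresh. Treat this as an upper bound.
function worstCaseMotionToPhotonsMs(
  samplingHz: number,
  frameRateFps: number,
  refreshHz: number
): number {
  return 1000 / samplingHz + 1000 / frameRateFps + 1000 / refreshHz;
}

// e.g. a 100Hz IMU feeding a 60fps app on a 60Hz display:
worstCaseMotionToPhotonsMs(100, 60, 60); // = 10 + 16.7 + 16.7 ≈ 43ms
```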

More Resources

Check out Steve McCarthy’s VR Glossary project. Chock full of terms and visualizations of key concepts.
http://www.vrglossary.org/

Platforms

The onboarding process, FOV, resolution, frame/refresh rate and input methods are drastically different across the platforms, just to name a few core things. Below, I’ve listed critical info along with links to official design guidelines if available.

Portable Systems

These systems have no cables, and all computation is done on the headset, where a mobile phone acts as the screen and compute brick.

Google Daydream

Looks to be the most complete and accessible platform. Daydream phones, headsets and controllers have only just started shipping, and I am bullish on adoption. The SDK is ready to work with right now.

Platform Guidelines: Developer overview
Release date: November 2016
Marketplace: Android Play Store
OS: Android (limited to Daydream Ready phones)
Input: gaze, non-tracked 3DOF controller w/ trackpad & 2 buttons
FOV: ~90°

Daydream is the best untethered system

Cardboard

I imagine Cardboard VR quickly being supplanted by other far more practical low cost options. That said, it’s still the only iOS focused VR platform, so it can’t be ignored.

Platform Guidelines: Cardboard Design Lab (Android app), Designing for Cardboard
Release date: continuous incremental releases to the app and headset spec
Marketplace: iOS App Store, Android Play Store
OS: iOS/Android
Input: Gaze, 1-button on HMD
FOV: 85–100°

Gear VR

Given Daydream’s OS-level integration, it’s tough to imagine Gear VR’s smartphone+headset combo remaining viable. That said, it does provide access to any content that happens to be Oculus-exclusive.

Platform Guidelines: ui+input & navigation
Released: November 27, 2015 (new release expected to be announced Q4 2016)
Marketplace: Oculus Store
OS: Android (limited to Samsung Galaxy series phones)
Input: gaze, d-pad + 1 capacitive action button on HMD (back btn + home btn on new version)
FOV: 96–101°

Tethered & Tracked Systems

Tethered and tracked systems are more powerful than their untethered counterparts and offer positional tracking, but are substantially more expensive, require some amount of installation in a room, and have cables you need to be careful not to trip over.

HTC Vive (room scale!)

Among the tracked-but-tethered options, the HTC Vive is far and away the best. It has great potential value to you as a design tool, and is very well positioned as a top-tier practical VR platform for professionals. The user base is pretty small and gaming-focused.

Released: April 5, 2016
Marketplace: Steam
OS: Windows
Input: tracked controllers
FOV: 110°

The Vive is the best tethered system. I promoted the Rift to this spot temporarily, but it sounds as though its room-scale support isn’t proven stable yet.

Oculus Rift (room scale?)

Roughly the same market size as the Vive, but lower quality and only fractionally cheaper. If you need something like the Rift, get the Vive. The only difference is access to content, which is irrelevant when we’re looking at VR platforms for either practical applications or as design tools.

Platform Guidelines: ui+input & navigation
Released: March 28, 2016
Marketplace: Oculus Store
OS: Windows
Input: Touch (tracked controllers — starting in Oct 2016), gamepad (though I imagine this will be supported by fewer titles after Touch launches)
FOV: 110°

Windows Holographic (room scale?)

A late entry by some people’s standards: at the end of 2016, Microsoft announced that it and its partners would be offering tethered headsets with inside-out tracking. At CES 2017, Lenovo became the first company to announce a Windows Holographic headset that it intends to ship this year. The biggest appeal is the price point (under $400 USD) and the fact that it may be capable of running on reasonably priced Windows laptops, putting it in the same total-system-cost category as Daydream and Gear VR, while retaining the advantages of positional tracking.

Platform Guidelines:
Released: Q2 (???) 2017
Marketplace:
OS: Windows
Input: ???
FOV: ???

Playstation VR

PSVR is a dice roll at the moment. It is definitely the lowest quality of the tracked + tethered experiences, but it also is substantially cheaper and has the biggest installed base of 40+ million PS4s sold. It looks like it will be a hit, but don’t imagine it will be of interest beyond gaming and entertainment.

Platform Guidelines:
Release date: Q4 2016
Marketplace:
OS: Playstation
Input: Gamepad, Tracked Controllers. Note: Sony recently announced that every title will have to support the game pads, which complicates matters.
FOV: 100°

Other resources

  • Immersive Design: https://backchannel.com/immersive-design-76499204d5f6

  • UX of VR: Pretty exhaustive list of videos, articles and code resources to aid you on your quest. It’s where I sourced most of the things for this cheat sheet.
  • Design Practices in Virtual Reality: another good primer article on VR design principles. More academically researched and chock full of information.
  • Voices of VR: Kent Bye does fantastic almost daily episodes about the VR industry. Some of them happen to be very rich with the latest insights on designing immersive experiences.
