Develop Mobile VR with Oculus & Gear VR

This (very long) blog post covers developing for mobile, untethered VR: design considerations and pitfalls to look out for, followed by performance optimisation, testing and Oculus Home store submission. But before getting into all that, let's look at why you should.

There are currently two main strands of VR development — tethered and untethered, or PC-based and mobile-based. PC-based, tethered VR utilises the powerful graphics cards within desktops to run high-end VR headsets such as the Oculus Rift and HTC Vive. Mobile, untethered VR slots a user's phone into a plastic, cardboard or foam headset, relying solely on the processing power of that particular model, which varies wildly from handset to handset.

Mobile VR Considerations

Mobile VR is currently the cheapest way for consumers and enthusiasts to get into Virtual Reality, with a range of mobile handsets supported and low-cost headsets available. Whilst this comes with some issues (more on that later), current and future mobile handsets are powerful enough to render convincing VR experiences that are comfortable for most (at a solid 60Hz on Samsung Gear VR). Google Cardboard has shipped over 5 million units, whilst Samsung claims over 1 million concurrent users within the Gear VR ecosystem.

Headsets range from the cheap Google Cardboard v2 (often free at events), to more durable plastic variations, to foam versions like Merge VR, to Samsung Gear VR, with a whole range of other device-connected options releasing soon. Whilst having options is good, to develop VR you ideally need a clear set of known hardware limitations and capabilities to ensure a solid, smooth, comfortable experience for end users. Therefore I'll concentrate on the current best-in-class options: Samsung Gear VR and Google Daydream VR (releasing later in 2016, but you can start developing now; see my previous blog post about it here).

Because of the low cost of entry for end users and consumers, the mobile VR market is rapidly growing with hundreds of new apps becoming available each month. 2016 has seen a real increase in general awareness of VR amongst consumers, buoyed by the launches of Oculus Rift and HTC Vive and imminent release of Sony PlayStation VR in October, along with VR appearing at many public facing events and shows.

However, being based on Android (although Cardboard now supports iOS too), it is easy to publish titles on the Play Store, and already we are seeing lots of cheap, poorly designed VR applications that don't provide great experiences. The Gear VR and Daydream marketplaces are therefore curated for quality control (or will be, in Daydream's case, once it launches).

Whilst this means there are additional barriers to publishing your apps, for end users it helps guarantee a level of quality and, overall, will boost VR adoption, since comfortable experiences see users return or want to try others. So in the short term it's slightly more of a headache for developers, but in the long term it's better for everyone, as VR is helped towards mass-market adoption.


There are limitations to mobile VR though, which need to be considered before deciding which platform you're going to support. Firstly, as the name suggests, mobile VR content is driven by mobile devices, not high-end powerful PCs. Whilst newer devices certainly provide more processing power than ever before, there are still limits to how large and complex a 3D-rendered scene can be (if you choose to go down that route). That doesn't make it ineffective, though, since a 3D-generated scene doesn't have to be photorealistic in order to be convincing or immersive. Low-polygon worlds can be just as fun and effective within VR, and are far easier to render on mobile with great performance.

Another aspect to consider is that, despite being untethered and allowing the user freedom of movement without being cabled to a PC, it's currently not possible to track the position of the headset, meaning that no matter how far the user wanders, their viewpoint only ever updates in three rotational directions. This means that a lot of the full VR experiences they may have tried on tethered VR, where they can lean in and get up close and personal with characters from any angle, or walk around within a room-scale area, are not possible with mobile VR. However, hardware is being worked on that will eventually allow this.

So with less processing power available and no positional head tracking currently possible, it's understandable why mobile VR excels at, and is well supported for, 360º photo and video applications. Without any form of real interaction though, if you are just looking around at or watching something, there is an argument that this isn't proper VR. Others would also state that, with the advances being made in mobile VR and available processing power, developers should target full, high-end VR, since the gap will only narrow and whatever you develop today for mobile VR will have a limited shelf-life as the hardware advances rapidly over the coming months and years.

Setting Up Development

Whilst it’s easier to publish apps on the Google Play Store for Android, and a great place to start if you are happy to release free content, I’ll mostly be focussing on developing for the curated content store offered by Oculus for the Samsung Gear VR, which will also be relevant for the Google Daydream VR ecosystem when it releases later this year.

Development environment, SDKs and being an Oculus developer

To make a 3D VR app, you need a 3D development environment that supports the necessary Software Development Kits (SDKs) for VR, in this case the Oculus Mobile SDK. The best 3D engine to use for this is Unity, which is available for free or via a paid subscription. The free option has some limitations and usage requirements, but the paid subscription now includes everything you need to publish mobile VR on Android devices (i.e. Samsung Gear VR and Google Daydream VR) and iOS (which doesn't really have a strong VR footing yet, although iPhone VR apps are now supported by Google Cardboard), rather than having to pay for these platform add-ons separately as you did with older versions.

Once you have Unity, you need to get the Oculus Mobile SDK for Unity and set yourself up as an Oculus developer. This is a simple registration to create an Oculus ID to log into the website with, which gives you a dashboard where you will eventually submit your app for review before publishing. The Mobile SDK has gone through a number of iterations and is now straightforward to work with, being very well documented. Alongside the Mobile SDK, you'll also want to get the Audio SDK to make best use of the Oculus positional audio offerings that really add immersion to your VR apps. Similarly, this is now robust and well documented.

Now that you have an Oculus ID set up, you can also join the Oculus forums (and the Unity forums are great too) to find answers to any questions, queries or problems you run into, and be part of the wider VR development community.


So that's the software side; to develop professionally and create great apps, you will need the hardware as well, so you can build, test and run your apps locally before submission to the store. If money is no object, get as many of the supported Samsung handsets as you can afford, to check performance across the range of devices. If money is tight, aim for the lowest common denominator, the Samsung Galaxy S6, to ensure that your app runs on at least the minimum specification handset supported. Having the hardware for testing also means you can take your app out on the road before release and have users test it and give you feedback. Usability and user comfort are key elements of VR design and it is very important to make sure your VR apps do not make users feel ill! But that is a subject for another blog post another day…

Make sure your mobile device has both types of developer mode enabled and allows installation of apps from unknown sources:

  • Android developer mode: go to Settings > About device > Build number and tap Build number seven times.
  • Oculus Gear VR developer mode: go to Settings > Application Manager, select Gear VR Service, then Manage Storage. Tap VR Service Version a number of times until the Developer Mode toggle shows up, then slide it over to enable it.
  • Unknown sources: go to Settings > Security, check the option Unknown sources and OK the warning prompt, so the device can install and run apps that aren't downloaded from the Oculus or Google Play Stores, i.e. your test builds.

Now your mobile device can launch in-development apps that you side-load onto it for testing and demoing. NB. If you can't see Gear VR Service as an option, make sure you have put the device into a Gear VR headset previously to start the installation and setup process of the Oculus Gear VR software, drivers and home app — yes, this means you need to buy one!

One final piece of the hardware-to-software link is obtaining an Oculus Signature (OSIG) file for your Samsung handset and inserting it into your test builds, so that they run on the Oculus Gear VR system without being a verified app. To do this, you will first need to download a Device ID app from the Google Play Store, then run it to get that specific handset's Device ID. Once you have that, go to the Oculus OSIG generator website and enter the Device ID to get your unique OSIG file. Once downloaded, place it into your app package within Unity to allow built apps to run on that device.
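As a rough sketch of that last step (the helper name is my own, and Python is used here purely for illustration; the destination folder is the one the Oculus Mobile SDK documentation specifies for signature files at the time of writing, so check the docs for your SDK version):

```python
from pathlib import Path
import shutil

def install_osig(osig_file: str, unity_project: str) -> Path:
    """Copy a downloaded OSIG file into the folder that Unity
    packages into the built .apk.

    NOTE: Assets/Plugins/Android/assets is the documented location
    at time of writing -- verify against your Mobile SDK version.
    """
    dest_dir = Path(unity_project) / "Assets" / "Plugins" / "Android" / "assets"
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / Path(osig_file).name
    shutil.copy(str(osig_file), str(dest))
    return dest
```

You need one OSIG per test handset, so if you test on several devices, run the generator (and a copy like the above) once for each Device ID.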

That's the software and the hardware; what to design and develop?

Of course, not every app you develop has to be released or destined for public consumption, so it's a good idea, especially with VR, to create a series of prototypes and simple ideas before committing to the full development effort of a polished experience. Google spent some time making loads of simple, effective interaction prototypes for Daydream VR to help developers understand what would be possible with the new hardware and input controller. If you're starting out in VR development, you will need to do this as well, first to understand what does and, importantly, doesn't work in VR, then again to understand the current limitations of mobile VR. If your app performs poorly or is uncomfortable to use, it will not get past submission.

Keep it simple

Simple ideas and simple interactions work best for mobile VR because of the limited input options available. Yes, bluetooth Android controllers are available and supported by Samsung Gear VR, and the Google Daydream VR ecosystem states that a device must come with the controller to be compliant, but the majority of users presently do not own bluetooth gamepads. Therefore, if you design an app (typically a game) that only works with a bluetooth gamepad, rather than the built-in touchpad on the side of the headset, you will be dramatically reducing your potential buying audience.

The touchpad on the side of the Gear VR (v1 retail edition) is bumpy to make it feel like a gamepad D-pad, which reduces the ease of diagonal swipes but makes it clearer for new VR users where to swipe forwards, backwards, up and down, with a nubbin in the centre to mark a button tap area. The newly announced Gear VR 2, available to pre-order for release later in the year, has reverted to the flat touchpad that the early Innovator Editions had. This is a good design decision, since more complex movements can be tracked from users' fingers.

The main downside of the touchpad on the side of the device is that new VR users tend to get headset-grabby, holding the headset with their hands as they get used to the sensation of having their senses taken over by VR. This often results in users accidentally quitting or pausing the app, depending on how it's designed to operate, adding to the confusion. It also makes demoing tricky, as unlike tethered full VR, you can't see what they're seeing unless the app is mirrored to a display somewhere.

Bearing in mind that most people who try your app could be new to VR (it may be the first VR experience they've ever tried), simple input makes it a lot easier for them to accept and adapt to the technology, which can be overwhelming for some in its own right. Thankfully, as VR adoption becomes more widespread and more people have access to VR, this concern will fade over time, and hopefully by next year we won't have to worry about it so much.

Rock solid performance

In order to pass submission and to provide a comfortable experience for users, your app will have to run at a rock-solid 60 frames per second (fps) on the Samsung Gear VR. This has been deemed the lowest framerate at which VR remains comfortable for most users. Any frame drops, even for the briefest instant, can make users feel ill, as the virtual world stutters and jitters, struggling to keep up with their head movements.

This can be quite a challenge if you are unused to 3D engine optimisation or geometry simplification. You will only have about 50,000 polygons (100,000 tops) to play with in any given frame of your VR scene, so you are going to have to be clever, think about the overall style and look, and use Unity tricks to get it looking its best whilst remaining stable.
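To put the 60fps target in concrete terms, here is the simple frame-budget arithmetic (the 1–3 ms script target comes from the Oculus guidance quoted below; the split itself is just illustrative):

```python
TARGET_FPS = 60  # the minimum comfortable framerate on Gear VR

# Total time available to simulate AND render each frame:
frame_budget_ms = 1000.0 / TARGET_FPS

# Oculus targets 1-3 ms for script execution; taking the worst case
# shows what's left for rendering, physics and everything else:
script_budget_ms = 3.0
remaining_ms = frame_budget_ms - script_budget_ms

print(f"frame budget: {frame_budget_ms:.2f} ms")   # 16.67 ms
print(f"after scripts: {remaining_ms:.2f} ms")     # 13.67 ms
```

Miss that ~16.7 ms deadline on even one frame and the headset displays a stale image against the user's head movement, which is exactly the stutter that causes discomfort.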

Thankfully, Unity 5.4 now supports single-pass stereo rendering, with the VR elements taken care of by the engine in a single pass rather than drawing everything twice (once per eye, each slightly offset to achieve the necessary 3D depth effect), so you can achieve the same result with less effort required from the hardware.

Also, John Carmack, co-creator of DOOM, is now at Oculus and spends most of his time focused on mobile VR development and tools. As a result, Gear VR has long supported asynchronous timewarp, a technique employed by the SDK to smooth out frame drops and allow developers to get away with the occasional jitter on mobile hardware. But this does not mean you should use it as a crutch and think that you do not need to optimise your codebase! It has limitations and won't always save you or your users.

Oculus states the following targets to aim for, and limitations to bear in mind, when developing for mobile VR with the Gear VR headset:

  • 50–100 draw calls per frame
  • 50,000–100,000 polygons per frame
  • As few textures as possible (but they can be large)
  • 1–3 ms in script execution
  • Throttle the CPU and GPU effectively to control heat, battery usage and scene performance

NB. Other Android APIs and SDKs (such as Google Cardboard) do not usually give you direct control over the CPU and GPU in the mobile device; this is something only Oculus offers, on specific Samsung devices, through the partnership that created the Gear VR and its mobile SDK.

General Best Practice & Design Considerations

Now it’s time to look at a series of useful tips and tricks centred around general best practice and VR design, to ensure you are providing a great VR experience for new and expert users alike.

General Guidelines

  • The VR marketplace is still fairly small, although it is now possible to sell upwards of 100,000 copies of a popular VR game on the Oculus Gear VR store. Don't expect to become a millionaire overnight; this isn't Angry Birds levels of adoption, yet.
  • The VR developer community is open and welcoming, friendly and helpful. If you get stuck, there are many forums for Unity, Oculus, Gear VR and Android, as well as VR community Slack channels to join. Find a local VR meetup in your area and attend to meet with other VR developers and discuss your findings or problems.
  • Don’t be afraid of the Unity 5 VR tutorials, there are some great, simple and easy to understand examples of VR design from objects, understanding scale, performance, interaction types and pretty much everything else you need to understand of the basics of VR development.
  • Similarly, the Oculus guidelines for VR design and usage of the mobile and audio SDKs are invaluable resources of information, tips and detailed code samples for great VR performance and optimisation.
  • You will need all target hardware available for development and testing before app submission, or at the very least the baseline minimum specification model, i.e. the Samsung Galaxy S6. (Technically, some models of the Gear VR support the Samsung Note 4, but only advanced developers should look to support it, as it takes a lot of performance optimisation to maintain a stable 60 frames per second in VR on that particular chipset.)

VR Design

  • Consider length of play time for new-to-VR users; design around keeping sessions short (around 15 minutes) and comfortable.
  • Short gameplay segments also help users manage battery drain and overheating of their mobile device, allowing them to recharge and cool off if necessary without fear of losing in-game progress.
  • If you’re going to make a horror title, clearly label it as so, especially when user testing, so that users can choose whether to continue with it or not. VR immersion feels very real and sudden surprises and shocks have a much more profound effect on users.
  • A scale of interaction, offering initially simple controls, allows newbies to enjoy the experience without being overwhelmed by figuring out what to do; they are already pretty overwhelmed by their first VR experience. Advanced or additional controls, unlocked later through progress within the experience, allow users to feel powerful and skilled as they master the finer controls.
  • The scene should respond to user head movement at all times, even in menus and cut-scenes to ensure the user has a comfortable experience.
  • Similarly, do not take control of the user’s viewpoint and move their head for them, this is incredibly uncomfortable to experience.
  • Depending upon your app theme, avoid moving a user through the environment unless it is at a constant, smooth velocity without acceleration or deceleration. If you have to move the user, place them in a cockpit of some description if possible, to ground them and give them something in the foreground to focus on.
  • Avoid changing user perspective. Do not pull the camera from first-person to third as the user will feel like the world has been dragged from beneath them.
  • Allow users to adjust the comfort level if possible, so that those prone to motion sickness can enable features designed to reduce discomfort, whereas more advanced users can disable them. Features that help reduce discomfort include:
    • Stepped turning of the user avatar
    • Look-and-teleport fades to a new location for movement
    • Tunnelling, darkening and fading out the peripheral visual data when turning
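The first of those comfort features, stepped turning, boils down to rotating the avatar in fixed increments instead of smoothly; a minimal sketch (the 30° step and function name are my own illustrative choices, not from any SDK):

```python
SNAP_DEGREES = 30.0  # fixed step size; smooth continuous rotation is
                     # what tends to cause discomfort in VR

def snap_turn(current_yaw: float, direction: int) -> float:
    """Turn the avatar one fixed step left (direction=-1) or right
    (direction=+1), keeping the yaw angle within [0, 360)."""
    return (current_yaw + direction * SNAP_DEGREES) % 360.0
```

For example, a right turn from 350° wraps round to 20° (`snap_turn(350.0, +1)`), so the user reorients in discrete, predictable jumps rather than a continuous sweep.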

User Interface

  • Remember you are developing a stereoscopic app! Everything has to be rendered twice and, to maintain immersion, should be embedded within the 3D world rather than floating over the top as a traditional UI would.
  • If you have to use traditional UI then project onto a surface rather than directly onto the screen, so that it has a sense of depth within the environment. Typically, setting it to appear 1–3m from the user is considered most comfortable.
  • Where possible, design UI into a logical, fitting 3D object within the world, i.e. onto a book, a scroll, a mobile phone or wrist display, so that the user is able to interact with it naturally.
  • Design your UI layout so that it fits naturally within a comfortable viewing area of the user’s vision, so that they do not have to move their head around a great deal to see all the menu options or to navigate; they can move their head to select items but straining to see items to their far left or right causes neck strain.
  • As mentioned previously, input is typically limited to touchpad interactions, as most owners will not have a bluetooth gamepad available with buttons to select menu items. Therefore you will have to consider alternative input mechanisms that utilise the nature of VR itself.
  • Gaze interactions are commonplace with mobile VR apps, where a virtual cursor is shown that follows the movements of the user’s head position. When enabled, the user just has to look at a menu item to interact with it. Typically there will be a radial progress bar that will fill up after a period of time where the user remains focused on one item, that will then be selected.
  • Whilst gaze interactions are easy to use, consider adding a tap-to-select override for advanced or impatient users, since multi-layer UI menus can be tiresome and slow to navigate using only the gaze wait-and-fill option.
  • If your UI appears over the application running, make it clear to the user that they are in the menus and not in the world. Depending upon purpose, you may want to pause the action when showing a menu, or at the very least adjust the lighting and focus to be on the menu.
  • Having a static icon in one corner that follows the user’s head movements provides a constant reminder they are in the menus if they were to look further to the left, right or behind them and lose sight of the panel. Even better, have the whole menu follow their head movements so they can never lose sight of it.
  • Use the physical [BACK] button on the Gear VR headset to allow users to navigate back up a level if you have multi-layer menus, or even front-to-back swipes on the touchpad, although this can get confusing if you have a library menu design where they can scroll through a lot of content cells i.e. a photo or video catalogue.
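The gaze wait-and-fill interaction described above can be sketched as a simple dwell timer (the class name and the 1.5 s default dwell time are illustrative assumptions, not values from any SDK):

```python
class GazeDwellSelector:
    """Fills a radial progress value while the user's gaze stays on
    one item, and fires a selection once the dwell time is reached."""

    def __init__(self, dwell_seconds: float = 1.5):
        self.dwell_seconds = dwell_seconds
        self.target = None
        self.elapsed = 0.0

    def update(self, gazed_item, dt: float):
        """Call once per frame with the currently gazed item (or None)
        and the frame's delta time. Returns the selected item or None."""
        if gazed_item != self.target:
            # Gaze moved to a new item: restart the dwell from zero.
            self.target = gazed_item
            self.elapsed = 0.0
            return None
        if gazed_item is None:
            return None
        self.elapsed += dt
        if self.elapsed >= self.dwell_seconds:
            self.elapsed = 0.0  # reset so the item isn't re-fired every frame
            return gazed_item
        return None

    @property
    def progress(self) -> float:
        """0..1 fraction, suitable for driving the radial progress bar."""
        return min(self.elapsed / self.dwell_seconds, 1.0)
```

The `progress` property is what you would bind to the on-screen radial fill, giving the user constant feedback that holding their gaze is doing something.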

Performance Optimisation & Testing

If you have read this far, through why you should design mobile VR apps for Samsung Gear VR (and, to a degree, prepare for Google Daydream VR), how to get your development environment set up, and general VR design considerations, and you're still here, then you must be pretty interested! From this point onwards it is assumed that you have at least a basic knowledge of 3D asset creation and app development, since I'm switching to a more specific set of technical terminology to relate to key aspects of design and development appropriately. You have been warned…

Performance Optimisation

Optimising the performance of your mobile VR app is key, since it ensures that the user has a comfortable experience (horror titles aside) and it passes the submission review as part of the curated store process.

There are a few areas to performance: overall app performance, 3D optimisation and battery lifetime. These all play a part in ensuring that users can enjoy your app for as long as possible, give great reviews and tell their friends about it to help spread the word.

  • Optimise for 60 frames per second. You cannot afford to drop a frame; Asynchronous TimeWarp will hide and smooth the occasional miss in more complex scenes, but do not rely upon it.
  • Don’t rely on the frame rate counter in the Unity editor since it is doing everything twice when you play a scene on your computer. Whilst it can give you a good indication of where your performance levels are at, build out and test on the target hardware to ensure smooth experiences.
  • Users often can't tell beyond a virtual distance of around 20m whether the image or scene they are looking at is stereoscopic or monoscopic. Use this to your advantage and swap far-off environments out for skyboxes to save rendering load on a mobile device.
  • Use the tools built into Unity to help you: the Profiler and Frame Debugger. These will show you where your app is lagging or overloading a scene, allowing you to go frame by frame and examine how the scene is constructed by stepping through the draw calls. You will likely find objects you don't need to render, reducing your overall draw call count.
  • Furthermore, batch your draw calls wherever possible using the Unity Static Batching and Dynamic Batching tools built into the editor.
  • Cull the faces from your 3D models' geometry that will never be seen, to remove wasteful polygons.
  • Similarly, use occlusion culling to ensure you are not rendering things that cannot be seen yet, i.e. the geometry of a room beyond a door that hasn’t been opened yet.
  • Simplify your 3D meshes as much as possible to ensure you have the lowest level of detail for objects without losing finer information.
  • Reduce overdraw as much as possible to ensure fewer objects are drawn over the top of one another. The Unity Scene View Control Bar will give you an understanding of what can be optimised.
  • Whilst there, use lightmapping to bake your shadows onto objects and scenes rather than using expensive dynamic shadows.
  • Beyond skyboxes, if you have to render a distant object in 3D then use a lower level of detail (LOD) model with fewer triangles, swapping in higher LOD models as objects get closer to the user's viewpoint.
  • Make sure CPU and GPU throttling is enabled, since failing to initialise these values will result in your app running in a down-clocked environment by default. Gear VR apps are typically CPU-bound, so favouring the CPU over the GPU will often get the best performance. However, if your app is well optimised, you might be able to down-clock both the CPU and GPU, increasing battery life and therefore session playtime.
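The LOD swap mentioned above amounts to picking a mesh by viewer distance; a minimal sketch of the selection logic (the thresholds and mesh names are illustrative, and in a real Unity project the built-in LODGroup component handles this for you):

```python
# (distance threshold in metres, mesh name) pairs, nearest first.
LOD_LEVELS = [
    (10.0, "rock_high"),    # full detail up close
    (30.0, "rock_medium"),
    (60.0, "rock_low"),
]
SKYBOX = "skybox_only"      # beyond the last threshold, don't render
                            # the mesh at all -- let the skybox carry it

def pick_lod(distance_m: float) -> str:
    """Return the mesh to render for an object at the given distance."""
    for threshold, mesh in LOD_LEVELS:
        if distance_m <= threshold:
            return mesh
    return SKYBOX
```

The 60 m cut-off here ties in with the earlier point that users can't perceive stereo depth much beyond ~20 m, so distant detail buys you nothing but draw calls.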


Testing

The key to a great app is to test regularly and iterate after each session, folding relevant suggestions and improvements into user flow, interface, process and design as you go, rather than saving all the effort until you think it is 100% done. This way, you can make continuous small adjustments that overall require far less effort than suddenly discovering, too late, a major flaw in your design that needs a huge amount of rework to fix.

As a developer you will be too close and involved with the app to see the issues and bugs, so user testing early on and throughout the development process is critical to ensure you aren’t missing something obvious that a first-time user can spot straight away. However there are still a number of tests that you can and should carry out yourself before unleashing it upon others.

The main types of testing you will carry out are functionality and performance related, to ensure that the app operates as it should at a basic level, and in a way that is comfortable for users. You could write unit tests for aspects of functionality, but sometimes you're just going to have to carry out manual testing and spot issues yourself.

If you have decided to manage the development process using an Agile methodology, you can create a series of test cases from your epics and user stories to ensure that the app functions and includes the features as intended. Otherwise, you will need to think of a series of test cases that effectively cover all possible conditions and uses, not just the expected behaviour and user journey, but also what a user could do to disrupt it and potentially end up in a locked state, e.g. not accepting a checkbox, or not meeting the score needed to progress to the next level without being offered the option to try again.
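As a toy illustration of testing for that "locked state" example (the function and pass-mark rule are hypothetical, not from any real app), a unit test can assert that failing a level always leaves the user a way forward:

```python
def next_action(score: int, pass_mark: int) -> str:
    """Decide what the level-end screen offers. A user who fails must
    always be offered a retry, never left stuck with no option."""
    return "advance" if score >= pass_mark else "retry"

# No possible score should leave the user without an option:
for score in range(0, 101):
    assert next_action(score, pass_mark=50) in ("advance", "retry")

assert next_action(49, 50) == "retry"    # failing offers a retry
assert next_action(50, 50) == "advance"  # meeting the mark progresses
```

The point is less the trivial function and more the habit: encode each "can the user get stuck here?" question as an assertion you can re-run after every change.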

Testing VR for functionality is harder than testing a normal flat app, since you are best running it on the device in the VR headset, which means you cannot quickly switch over to a spreadsheet or notepad to record issues you spot. Testing in pairs is therefore recommended, so that one person can interact and carry out the tests whilst the other notes down the issues described aloud.

Before you get to this stage though, you can run the app directly in the Unity Editor to check functionality and performance without having to make a build and deploy to a mobile device. As referenced above in the performance optimisation section, the Unity Profiler, Frame Debugger and Scene View provide a good starting point for performance testing, and the Editor itself will surface any edge-case exceptions and code errors.

User Testing

User testing requires more preparation and time to get right, to ensure you are getting valid feedback about your app and not about the technology. As mentioned in an earlier blog post, roughly 9 out of 10 people still haven't tried VR before, so using them as fresh test subjects, whilst necessary, needs managing to make their feedback useful to you.

When arranging a user testing session, have each tester carry out a period of VR familiarisation and acclimatisation before asking them to try out and test your app. If they are new to VR, this gives them a chance to be wowed by the technology and the experience of being immersed in another world, without that excitement colouring the usefulness of their feedback about your VR app specifically. Once they are aware of what VR can do and how it works at a basic level, they will be ready to test your app with a clearer understanding of how it should work and feel. Some good examples of familiarisation apps on Gear VR are the Samsung Introduction to Virtual Reality (free) and Welcome to Virtual Reality by SliceVR (paid).

Prepare a set of questions to ask after their test session with your app, to gather useful feedback about how they felt, how easily they understood what to do, where they struggled or got stuck, and any aspects that made them feel uncomfortable (from a performance perspective, not a scary-content angle).

Remember, it's harder to mirror the content shown on a mobile VR device, so you likely won't be able to see in real time what testers are looking at (and likely pointing at in thin air). Have a set of printouts of key screens and menus from your app so that, after their test session, testers can refer to them when describing the screens or panels they've seen, since they may not necessarily know or use the same names for them as you do.

If you have the budget available, there are companies now offering VR testing services to take some of the time, effort and strain off you. Testronic Labs now offer a paired VR testing service for functionality and compatibility, whereas Player Research are leaders in user research and user testing; both create and provide comprehensive post-test reports for you to take on board as part of the service.

So by now your app should be running smoothly at a solid 60 frames per second, is as bug-free as a piece of software can ever be, and has been tested and verified as comfortable and easy to use by a range of intended end users. It's time to submit your app to the Oculus Store and get ready for launch!

Store Submission

Store Submission Process

In order to be able to sell your app on the Oculus Store for Gear VR titles, you will need to have the app reviewed by the Oculus Store team for comfort, performance and general suitability before they will give the green light for it to be released.

This is a relatively straightforward process, but it does require a bit of legwork on your behalf to get everything set up in the backend first. The amount of effort required depends upon the features included with your app, i.e. whether or not you have IAP, matchmaking for multiplayer, achievements, leaderboards and so on. For many of these elements you will have to set up APIs and IDs in the backend, then go back into your app project in Unity to ensure that the appropriate values are used for each achievement unlock, IAP, etc.

You should have already set up an Oculus ID, as covered in part 2 of this blog series, but if not, it’s quick to do over on the Oculus Developer Centre. Under ‘Developers > Dashboard’ you’ll need to ‘Create new organisation’ before you can create an app profile. Ensure that all the important information for your organisation is correct, such as your address and banking information, so that you can receive hardware but, crucially, the monthly revenue payments for your app once it’s on sale.


Once you have set up your organisation you can then set up your app by going to ‘My Apps > Create New App’ and entering the information.

IMPORTANT! Currently there is no way to delete an app entry as a whole once you have created it, so ensure all details are correct at the time of creation. You can go back and edit app information at any time, but if, like me, you prefer a clean dashboard, get it right first time!

The first stage is to select the platform — we’re submitting a mobile VR app so select ‘Gear VR’ and enter the full app name to create the initial entry.

Once you have the initial app profile created with name and platform, Oculus will generate a unique App ID for you, which you will need within your Unity project to initialise any Oculus Platform APIs, especially those that relate to IAP or licence ownership (entitlement) checks in the final version.
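As a rough illustration, initialising the Platform SDK with that App ID and running the entitlement check in Unity might look something like the sketch below. This is a minimal example assuming the Oculus Platform SDK for Unity is imported into the project; the App ID value and class name are placeholders.

```csharp
using UnityEngine;
using Oculus.Platform;

public class PlatformInit : MonoBehaviour
{
    // Placeholder: replace with the App ID generated in your dashboard
    private const string AppId = "YOUR_APP_ID";

    void Awake()
    {
        Core.Initialize(AppId);

        // Entitlement (licence ownership) check for store builds
        Entitlements.IsUserEntitledToApplication().OnComplete(msg =>
        {
            if (msg.IsError)
            {
                Debug.LogError("User is not entitled to this app; quitting.");
                UnityEngine.Application.Quit();
            }
        });
    }

    void Update()
    {
        // Pump the Oculus Platform message queue so callbacks fire
        Request.RunCallbacks();
    }
}
```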

Similarly, once you have set up your financial information, you can create any IAP tokens and IDs to call from the Unity project for each appropriate action via ‘Edit Details > Platform > IAP’.
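For example, triggering a purchase for one of those IAP IDs from Unity might be sketched as follows. The SKU string here is a placeholder and must match an IAP ID created in the dashboard; the API names assume the Oculus Platform SDK for Unity.

```csharp
using UnityEngine;
using Oculus.Platform;

public class StoreActions : MonoBehaviour
{
    // Placeholder SKU: must match an IAP ID set up in the dashboard
    private const string PremiumSku = "PREMIUM_UPGRADE";

    public void BuyPremiumUpgrade()
    {
        // Opens the Oculus checkout flow for the given SKU
        IAP.LaunchCheckoutFlow(PremiumSku).OnComplete(msg =>
        {
            if (!msg.IsError)
            {
                Debug.Log("Purchase complete: " + PremiumSku);
            }
        });
    }
}
```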

App Store Info

The main information that will appear on the Oculus Store for your app is under ‘Edit Details > Submission Info’. Here you will enter the full and short description, set the genre, features, peripherals supported, any age ratings and price.

Some of these elements are up to you; others, like the age rating, may need review by an external body, while for price the Oculus Store team will work with you to agree a value deemed appropriate. Be aware that all apps are set to ‘Free’ by default, so if you want to charge for yours, be sure to change this before submission!

One time-consuming element, as with any store submission, is the art assets you need to create and provide to go along with the listing. There are a few different shapes and sizes of images required, depending upon where and how the listing will be displayed within the store, but the guidelines are easy to follow, with advice on where to place the logo (or not) to ensure that dynamic banners for sales etc. don’t cover it when enabled.

One cool asset that you can provide for Gear VR titles (that isn’t yet supported for Rift titles) is a cube map image, so potential customers can view a still from your title in 360º on their Gear VR whilst browsing the store listing.


Of course, information is all very well but you need to upload a build to be reviewed and ultimately, available for download post-purchase. It is wise to run the Oculus Submission Validator tool against your app APK file before you submit and ensure you have done the following:

  • Set the XML manifest file and install location correctly — Gear VR apps have to be set to install on internal device storage, not external storage.
  • A version code set — an integer, typically 1 if this is your first submission, incremented for each new build uploaded post-review from a previous submission
  • A signed APK so that once reviewed and confirmed ready for release, it can go live without further uploads necessary.
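The manifest-related requirements above can be sketched as the following AndroidManifest.xml fragment. The package name and version values are illustrative placeholders; the `vr_only` meta-data entry is what marks the APK as a Gear VR app to Oculus Home.

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.yourcompany.yourapp"
          android:versionCode="1"
          android:versionName="1.0"
          android:installLocation="internalOnly">
    <application>
        <!-- Declares VR-only mode so the app launches within Oculus Home -->
        <meta-data android:name="com.samsung.android.vr.application.mode"
                   android:value="vr_only"/>
    </application>
</manifest>
```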

The build management section allows you to upload builds to a variety of different channels: Alpha, Beta, Release Candidate and Live. Note that a number of journalists have access to the Release Candidate channel, so if you haven’t done any PR or marketing around your app before launch, be prepared for them to potentially stumble across it and publish a preview article without warning. It’s best to reach out before setting a build to this channel, so that you can preempt any issues if it’s not 100% ready for release yet!

Ready to Submit

Once you think you have everything entered, you can submit the information via ‘Submission Info > Submit’ where a handy checklist is shown with current status of each section needed. Once you have a nice row of green ticks, you have a final chance to review the full listing and details below before clicking that [SUBMIT FOR REVIEW] button. Once submitted, the Oculus Store Gear VR team will review your app information, play test it and get back in touch with any suggested amends before it will be confirmed ready for release.

THAT’S IT! You’ve reached the end — well done and good luck with mobile VR development, be sure to share apps you create below in the comments.

The tl;dr Steps

  1. Buy a Samsung mobile device, either S6 or S7:
  2. Buy a Samsung Gear VR HMD:
  3. Get Unity:
  4. Sign-up in the Oculus Developer Centre to get an Oculus ID:
  5. Get the Oculus mobile SDK:
  6. Get the Oculus audio SDK:
  7. Download an Android Device ID app from Google Play Store:
  8. Enter the Device ID into the Oculus OSIG generator:
  9. Download the OSIG file and embed into your Unity project
  10. Prototype ideas & have fun with mobile VR!
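For step 9, the generated OSIG file (named after the device serial you entered in step 8) sits inside your Unity project’s Android plugin assets folder, along these lines:

```
Assets/
  Plugins/
    Android/
      assets/
        oculussig_<device-serial>    (one OSIG file per test device)
```

Without a matching OSIG for the device in question, development builds will refuse to launch in the Gear VR headset.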

Useful Links For Design

Unity VR Tutorials:

Oculus Intro to VR:

Oculus Mobile SDK Documentation:

Unity Forums:

Oculus Forums:

Useful Links for Optimisation and Testing

Oculus Blog — Squeezing Performance Out Your Gear VR Title pt.1:

Oculus Blog — Squeezing Performance Out Your Gear VR Title pt.2:

Oculus Mobile SDK — Testing & Troubleshooting:

Unity — Optimisation for VR:

John Carmack Facebook post about anti-aliasing:

Useful Links for Submission

Oculus Developer Dashboard

Oculus Store — Gear VR Art Assets Guidelines:

Oculus Submission Validator

Oculus App Publishing Overview:

This blog post originally appeared as a 5-part series.