VR Development – BriteLites


I wrote my last post about VR technology with no real preface as to why I was writing about VR, other than that I wanted to, so this post serves (I should hope) as that preface. VR technology is not groundbreaking; it has been around for years, along with buckets of sci-fi tropes giving VR center stage. So what makes VR exciting?


The Oculus Rift, HTC Vive, and PlayStation VR headsets have positional tracking and specs to boast, and each comes with its own set of accessories, but these require an expensive, high-end host PC. These days, however, everyone with access to a supercomputer in their pocket has the option of buying one of the mobile headsets to begin their own VR experience.

With mobile headsets like the Google Daydream, Samsung Gear VR, and the various flavors of Google Cardboard, VR is an affordable option for anyone and everyone to develop and/or consume VR content. Game engines such as Unity and Unreal support VR application development, integrating the hardware SDK libraries and a wealth of developer tools to quickly build a VR app.



The next question, then, is what makes a great VR experience? I have the Samsung Gear VR without any of the peripheral accessories, so I brainstormed with simplicity in mind: how can I make a VR experience enjoyable using only the interaction supported by the headset, i.e. motion and the touchpad? I reflected on my early childhood playing Lite-Brite with my friends, and how much fun that was, and thought it would port to a great VR experience, so I have started prototyping.


I decided that the simpler the experience, the more it would immerse the user and ultimately give them an intuitive sense of presence. The controls use head movements to explore a world of spherical lites and the touchpad to select different colored lites and to clear them. BriteLites will be available to download for free through the Oculus store in the near future.



If you like these blog posts, or want to comment and/or share something, do so below and follow py-guy!


360 Image Viewer VR


This week I’m deviating from posting about data science and Python modules to explore virtual reality with my new Samsung Gear VR by creating a 360 image viewer application. This can be done with very little code, simply utilizing Unity3D and the OVR SDK, to some satisfying results. The rest of this post is an abstract walk-through of the steps to create your own 360 image viewer application.

For this tutorial you will need the latest version of Unity 5 and the OVR SDK.

First, make a new Unity project with the name “360Viewer”. Choose where you want to save the project on your computer, make sure 3D is selected, and click Create Project.
Select the OVR folder from the downloaded SDK, then drag and drop it into your Assets folder.
Before we jump into our 360Viewer application, make sure you’ve created a Plugins folder for your Oculus signature file with the structure Assets > Plugins > Android > assets, and place your Oculus signature file inside.



This file is necessary during development to access the low-level VR functionality of your device. You can download your Oculus signature file when you sign up as an Oculus developer at https://developer.oculus.com/.

Next we want to configure the Unity3D environment to develop for mobile VR. Navigate to File > Build Settings and select Android.


Leave the Development Build field unchecked for now; checking it later lets you build your application for testing with profiling and debugging support. Click Player Settings and navigate to Player Settings in the Inspector.
Make sure Virtual Reality Supported is checked and select the Oculus SDK. Navigate to Identification and set your package name.


You will want to target an API level that can run Unity3D and the Oculus SDK and that supports your device; I’ve selected Android 7.0 ‘Nougat’ (API level 24).
Select your minimum and target API levels. Finally, navigate to Edit > Preferences > External Tools and set the Android SDK and Java JDK paths.



That’s it, you’ve configured your mobile VR development environment!

Now we will create a 3D sphere GameObject: in the Hierarchy, right-click and select 3D Object > Sphere.


You will see the Sphere object appear in the scene. Let’s make sure it is centered at the origin of our scene at position (0, 0, 0), and set its scale to (100, 100, 100).
Make sure Light Probes is set to Blend Probes in your sphere’s Mesh Renderer.
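If you prefer to do this step from code, here is a minimal sketch (the component name is my own; attach it to any empty GameObject in the scene):

```csharp
using UnityEngine;

// Hypothetical alternative to the manual Hierarchy steps above: create the
// viewer sphere from code with the same position and scale.
public class SphereSetup : MonoBehaviour {
    void Start () {
        GameObject sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        sphere.transform.position = Vector3.zero;                    // centered at (0, 0, 0)
        sphere.transform.localScale = new Vector3(100f, 100f, 100f); // scale (100, 100, 100)
    }
}
```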

Then create a folder for your 360 images or panorama photos and add your images to it. These image files will be applied to our sphere’s texture later, but first we need to create a shader that will map the images onto the sphere. Create the following shader file and name it DoubleSided; the script further below will cycle textures through its material.

Shader "DoubleSided" {
    Properties {
        _Color ("Main Color", Color) = (1,1,1,1)
        _MainTex ("Base (RGB)", 2D) = "white" {}
        //_BumpMap ("Bump (RGB) Illumin (A)", 2D) = "bump" {}
    }
    SubShader {
        //UsePass "Self-Illumin/VertexLit/BASE"
        //UsePass "Bumped Diffuse/PPL"
        // Ambient pass
        Pass {
            Name "BASE"
            Tags {"LightMode" = "Always"}
            Cull Off // render back faces so the inside of the sphere is visible
            Color [_PPLAmbient]
            SetTexture [_BumpMap] {
                constantColor (.5,.5,.5)
                combine constant lerp (texture) previous
            }
            SetTexture [_MainTex] {
                constantColor [_Color]
                Combine texture * previous DOUBLE, texture * constant
            }
        }
        // Vertex lights
        Pass {
            Name "BASE"
            Tags {"LightMode" = "Vertex"}
            Material {
                Diffuse [_Color]
                Emission [_PPLAmbient]
                Shininess [_Shininess]
                Specular [_SpecColor]
            }
            SeparateSpecular On
            Lighting On
            Cull Off
            SetTexture [_BumpMap] {
                constantColor (.5,.5,.5)
                combine constant lerp (texture) previous
            }
            SetTexture [_MainTex] {
                Combine texture * previous DOUBLE, texture * primary
            }
        }
    }
    FallBack "Diffuse", 1
}

Create a C# script named “textureCycle” (the file name must match the class name below) and copy and paste the code that follows. This will enable us to browse our images using the Oculus touchpad. Attach the script to your sphere, then set the array size and assign your images in the Inspector.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class textureCycle : MonoBehaviour {

    public Texture[] myTextures = new Texture[4];
    int maxTextures;
    int arrayPos = 0;

    // Use this for initialization
    void Start () {
        maxTextures = myTextures.Length;
    }

    // Update is called once per frame
    // touchRight/touchLeft are flags exposed by the OVRPlayerController
    void Update () {
        // Swipe right: advance to the next texture, wrapping at the end
        if (OVRPlayerController.touchRight == true) {
            arrayPos = (arrayPos + 1) % maxTextures;
            GetComponent<Renderer>().material.mainTexture = myTextures[arrayPos];
        }
        // Swipe left: step back to the previous texture, wrapping at the start
        if (OVRPlayerController.touchLeft == true) {
            arrayPos = (arrayPos - 1 + maxTextures) % maxTextures;
            GetComponent<Renderer>().material.mainTexture = myTextures[arrayPos];
        }
    }
}


Next, delete the Main Camera, navigate to Assets > OVR > Prefabs, and drag and drop the OVRPlayerController onto your sphere. In the Hierarchy select the OVRPlayerController; if you expand its contents you will see the LeftEyeAnchor and RightEyeAnchor child objects. These are used to calibrate the virtual environment to the hardware’s optics. We want the OVRPlayerController to see the inside of the sphere: it sends a raycast in the direction the Samsung Gear VR is facing and returns the first object it hits.
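As a rough illustration of that gaze raycast (the controller prefab handles this internally; the component name here is my own and the anchor it is attached to is an assumption):

```csharp
using UnityEngine;

// Hypothetical sketch of the gaze raycast described above: cast a ray from
// the eye position along the direction the headset is facing and log the
// first collider the ray hits.
public class GazeRaycaster : MonoBehaviour {
    void Update () {
        RaycastHit hit;
        if (Physics.Raycast(transform.position, transform.forward, out hit)) {
            Debug.Log("Gazing at: " + hit.collider.name);
        }
    }
}
```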

That object is the sphere; we want to see the side of the sphere facing the camera, which carries the texture. Set Clear Flags to Skybox, Culling Mask to Default, and Field of View to 60. Do the same for both the left and right eye anchors.
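These camera settings can also be applied from a script; this is a sketch of my own, not part of the original steps, and assumes it is attached to each eye anchor (each of which holds a Camera component):

```csharp
using UnityEngine;

// Sketch only: applies the eye-camera settings above from code.
public class EyeCameraSetup : MonoBehaviour {
    void Start () {
        Camera cam = GetComponent<Camera>();
        cam.clearFlags = CameraClearFlags.Skybox;       // Clear Flags -> Skybox
        cam.cullingMask = LayerMask.GetMask("Default"); // Culling Mask -> Default
        cam.fieldOfView = 60f;                          // Field of View -> 60
    }
}
```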


Select your sphere object, check the Static checkbox, and apply it to all child objects. Add some in-game lighting so you can view the scene!

If you like these blog posts, or want to comment and/or share something, do so below and follow py-guy!