Biofeedback in VR: My Weekend at Microsoft Reactor

Lauren Bedal
Virtual Reality Pop
6 min read · Dec 27, 2016


Suitcases full of VR gear. Cartons of Peet’s Coffee. My Saturday morning was off to an early start at Microsoft Reactor in San Francisco. Small groups formed around tables as people unpacked bags in between sips of coffee. Quiet conversations grew louder as the eager hackers filed in. Over the next two days, teams would collaborate to build both VR and AR experiences. Enflux, the main sponsor of the event, was also offering use of its motion capture suit along with prizes for ‘Best Full-Body VR’. Since I had not been in town for the meet & greet, lightning pitches, and happy hours, I had one thing on my agenda: find a team. Quick!

As I floated around to different teams, I came across a designer and developer who had formed a team with two French students studying neuroengineering at UC Berkeley. The students were demoing a Muse headset they had brought with them. For those not familiar, the Muse device is an EEG neurofeedback headband that senses the electrical frequencies of brainwaves as well as muscular activity of the face (for more about what Muse reads, see here). With a partner app called ‘Calm’, users can view real-time feedback, record sessions, and set goals for concentration and meditation. As a previous owner of a similar biofeedback device, I was excited to collaborate with the group and see how frequency data could be integrated into a VR experience.

Muse headset

After a quick discussion and some brainstorming, we decided to create a VR experience driven by a user’s brainwave data in real time. By integrating the Muse data with a VR environment, we could replace traditional methods of input (such as taps, clicks, and sliders) with output from the Muse during a given user session. Two input methods the students were comfortable working with were concentration level (gamma wave activity, 25 Hz to 50 Hz) and eye blinking. While the students got busy with the Muse data, the rest of us divided up design and development tasks.

We decided to build our experience for Google Cardboard using the Google VR Toolkit. With this platform, our experience would be easily accessible to a wide audience, especially Muse wearers who do not have expensive VR equipment.

Humble Beginnings of a Unity Scene

I was tasked with building both the logic and the look and feel of the environment. Having previously started a winter scene in Unity, I offered to expand on this early direction; the team readily agreed. As the afternoon approached, our team began to lock down logistics: how could the Muse data be used in our winter environment?

Dream Team in Action: Me, Julian, Robin, Fabien, Jonathan, Steven

As a team, we settled on the following user interactions for this experience:

  • Gamma-band activity (concentration level) would determine the severity of snowfall within the scene.
  • A gaze-based hover (over a large snowflake object) coupled with an eye-blink would teleport the user to a different area of the scene.

After wrapping up the basic layout of the environment, I began adding large snowflakes to serve as teleportation points, bringing the user to different vantage points within the scene.

The basic teleport interaction would happen like this (a rough code sketch follows below):

  • User locates a large blue snowflake in the scene
  • User hovers the eye-gaze-based cursor over the snowflake, turning it red. The cursor increases in diameter to reinforce the hover state.
  • While the cursor is in its selected state over the red snowflake, the user blinks to teleport to that area of the scene.
Hover states
Teleportation Points and Flags
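
For anyone curious how this fits together in Unity, here is a rough C# sketch of the hover-and-blink teleport logic. The class and field names (SnowflakeTeleporter, MuseInput.BlinkDetected, and so on) are placeholders of my own, not our actual hackathon script, and it assumes a gaze pointer such as the one in the Google VR SDK is feeding standard Unity EventSystem pointer events.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Attached to each large snowflake. Assumes a gaze pointer (e.g. from the
// Google VR SDK) is sending standard Unity EventSystem pointer events, and
// that MuseInput.BlinkDetected is a hypothetical flag set by our
// Muse-reading script whenever a blink is detected.
public class SnowflakeTeleporter : MonoBehaviour,
    IPointerEnterHandler, IPointerExitHandler
{
    public Transform cameraRig;        // parent of the Cardboard camera
    public Transform teleportTarget;   // vantage point this snowflake leads to
    public Renderer snowflakeRenderer;

    private bool isHovered;

    public void OnPointerEnter(PointerEventData eventData)
    {
        isHovered = true;
        snowflakeRenderer.material.color = Color.red;   // hover state
    }

    public void OnPointerExit(PointerEventData eventData)
    {
        isHovered = false;
        snowflakeRenderer.material.color = Color.blue;  // default state
    }

    private void Update()
    {
        // Blink while gazing at a red snowflake -> teleport to its target.
        if (isHovered && MuseInput.BlinkDetected)
        {
            cameraRig.position = teleportTarget.position;
        }
    }
}
```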

We were able to get this working fairly quickly with help from unicorn developer Julian. Just out of high school, he quickly created a script allowing the game engine to read the Muse data. Working side by side with Julian, we were able to tweak the Unity scripts to give the correct output.
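
Julian’s script isn’t reproduced here, but the general shape of a Muse-to-Unity bridge can be sketched. Here is a minimal, hypothetical version that assumes a small helper app relays simplified comma-separated readings (e.g. "gamma,0.42" or "blink,1") over UDP to the machine running Unity; the real Muse tooling streams OSC, so treat this as a stand-in rather than our actual code.

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Threading;
using UnityEngine;

// Hypothetical Muse-to-Unity bridge, not the actual hackathon script.
// Assumes a helper app forwards simplified, comma-separated readings such
// as "gamma,0.42" or "blink,1" over UDP on port 5000.
public class MuseInput : MonoBehaviour
{
    public static float GammaLevel;    // latest gamma reading (arbitrary units)
    public static bool BlinkDetected;  // true for the frame after a blink

    private UdpClient client;
    private Thread listenThread;
    private volatile float latestGamma;
    private volatile bool blinkFlag;

    private void Start()
    {
        client = new UdpClient(5000);
        listenThread = new Thread(Listen) { IsBackground = true };
        listenThread.Start();
    }

    private void Listen()
    {
        var remote = new IPEndPoint(IPAddress.Any, 0);
        while (true)
        {
            byte[] data = client.Receive(ref remote);
            string[] parts = Encoding.ASCII.GetString(data).Split(',');
            if (parts[0] == "gamma" && parts.Length > 1)
                latestGamma = float.Parse(parts[1]);
            else if (parts[0] == "blink")
                blinkFlag = true;
        }
    }

    private void Update()
    {
        // Copy the background-thread values onto the main thread once per frame.
        GammaLevel = latestGamma;
        BlinkDetected = blinkFlag;
        blinkFlag = false;
    }

    private void OnDestroy()
    {
        if (client != null) client.Close();
    }
}
```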

Getting the snowfall synced with the concentration data required a bit more fancy footwork from all team members. The data from the Muse had to be normalized into a form a Unity script could handle; that normalized value then drove the creation and variation of snowfall via a particle emitter. This part of the project was a challenge not only because of the data normalization, but also because concentration levels vary from one user session to the next. Since the ‘highs’ and ‘lows’ of a given user’s gamma waves differ, each user demoing our project had to go through a quick calibration to get the most accurate results.
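
Conceptually, the mapping is straightforward: clamp the raw gamma reading between a per-user minimum and maximum, normalize it to a 0–1 value, and feed that into the particle emitter’s emission rate. A hedged sketch of that idea, reusing the hypothetical MuseInput class above and made-up field names:

```csharp
using UnityEngine;

// Maps the normalized concentration level onto snowfall intensity.
// minGamma/maxGamma come from the per-user calibration described below;
// all names here are illustrative, not the original hackathon script.
public class SnowfallController : MonoBehaviour
{
    public ParticleSystem snow;
    public float minGamma = 0f;          // user's relaxed baseline (from calibration)
    public float maxGamma = 1f;          // user's peak concentration (from calibration)
    public float maxEmissionRate = 200f;

    private void Update()
    {
        // Normalize the raw reading into 0..1 for this particular user.
        float t = Mathf.InverseLerp(minGamma, maxGamma, MuseInput.GammaLevel);

        // More concentration -> heavier snowfall.
        var emission = snow.emission;
        emission.rateOverTime = t * maxEmissionRate;
    }
}
```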

Calibration Process

Day 2 was spent calibrating users and running multiple sessions with the Muse and Google Cardboard in real time. The calibration served as a nice opportunity to onboard users and explain what the Muse headset was, how it works, and how they would navigate the Unity environment. It was great to finally see things coming together (and working!). We had just enough time to wrap up, create a video, and write about our process. Then it was time for presentations, judging, and awards!
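
The calibration itself boiled down to sampling a user’s gamma readings for a short window and recording the low and high ends of their personal range. A rough sketch of that idea (the coroutine, timing, and field names are illustrative assumptions, not our original script):

```csharp
using System.Collections;
using UnityEngine;

// Records a user's personal gamma range over a short window, then hands the
// min/max to the snowfall controller. Durations and names are illustrative
// assumptions, not the original script.
public class MuseCalibration : MonoBehaviour
{
    public SnowfallController snowfall;
    public float calibrationSeconds = 30f;

    private void Start()
    {
        StartCoroutine(Calibrate());
    }

    private IEnumerator Calibrate()
    {
        float min = float.MaxValue;
        float max = float.MinValue;
        float elapsed = 0f;

        // Ask the user to relax, then concentrate, while we record the extremes.
        while (elapsed < calibrationSeconds)
        {
            float g = MuseInput.GammaLevel;
            min = Mathf.Min(min, g);
            max = Mathf.Max(max, g);
            elapsed += Time.deltaTime;
            yield return null;
        }

        snowfall.minGamma = min;
        snowfall.maxGamma = max;
    }
}
```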

Adding finishing touches and submitting our work!

At exactly 1pm on Sunday, all teams submitted their projects.

The afternoon consisted of short presentations from all the teams, followed by an open house where the general public could demo the resulting projects. After the presentations and open house, three judges, along with our sponsor Enflux, determined the winners of the various prizes. The judges present included:

  • Livi Erickson, Developer Evangelist at Microsoft
  • Craig Cannon, Director of Marketing at Y Combinator
  • Tom Emrich, investor and partner at Super Ventures

I am proud to say that our team, “Full Dive”, walked away with two prizes, as well as some cash!!! We were awarded:

  • Best VR Experience
  • Best Full-Body Experience (for our second submitted project, which used the Enflux motion capture suit)

Thank you to Enflux and all the sponsors for the weekend! I met some wonderful people, including Fabien, with whom I collaborated the following weekend at Dance Hack Day with Kinetech Arts! Read more here.

The best tip I can give first-time VR hackers?

Dive In!
It might seem like everyone there has much more expertise, and that your time might be best spent on your own, getting up to speed. Let’s be real: hackathons are a social experience where we learn and grow from each other. Even those with much more experience are there to pass along their knowledge. I believe it is the best way to learn, especially for beginners! If you are new to VR and want to get a taste of the end-to-end design process, I recommend attending a hackathon.

Cheers! Send some love if you enjoyed this read.
Let’s collaborate! Please feel free to connect via LinkedIn.
