Wednesday, October 26, 2016

Interface Design and Unusual Platforms, Part 2

This is a continuation of part 1, and I would recommend reading that first for some background info.

My last post left off with the Wii Fit Balance Board and Kinect for a game played on hoverboards battling giant robot spiders. How can I possibly top that?

Well, up next is the Leap Motion (not to be confused with the AR company Magic Leap) - it bears some similarity to the Kinect in terms of hardware, but it is used to track your hands in great detail...every movement of every finger mapped to a skeleton!
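To make that concrete, here is a rough sketch of what consuming that skeleton data can look like. The get_tracked_hands() callable and the hand/finger layout are hypothetical stand-ins for illustration, not the actual Leap Motion SDK:

    # Hypothetical sketch of consuming per-finger skeleton data.
    # get_tracked_hands() stands in for whatever the real SDK provides;
    # assume it returns a list of dicts, one per visible hand.

    FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

    def describe_hands(get_tracked_hands):
        """Print the fingertip position of every finger on every tracked hand."""
        for hand in get_tracked_hands():
            for name in FINGERS:
                joints = hand["fingers"][name]  # list of (x, y, z) joint positions
                tip = joints[-1]                # last joint in the chain = fingertip
                print(hand["side"], name, "tip at", tip)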


I entered a Leap Motion Game Jam to get a free Leap Motion (one of the best ways to get free hardware!). As always, I started with the question: "what makes this hardware unique, and what can I do with it that I couldn't do with any other hardware?"

Other games have tried to simulate gardening and farming, but their shortcoming is that growing plants really involves working with your hands - something hard to replicate in a video game with a keyboard or controller. The Leap Motion is unique in that it lets you use your hands with the full range of dexterity we have in real life. So I made a garden simulator.

It's important to remember, though, that digital experiences do not need to perfectly match their real-world inspiration - sometimes it can be more fun to do something weird. So I took a weird twist on gardening: the player's interaction is to literally pinch the plant and pull it upward to make it grow. Then you can dip your hand in a water bucket to sprinkle the plant with water and make it bloom. But beware! If you pull the plant too quickly you will pull it right out of the ground!
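The core rule is simple enough to sketch in a few lines. The units and thresholds below are made up for illustration - they are not the values from the actual game:

    # Illustrative sketch of the pinch-and-pull growth rule.
    # Units and thresholds are assumptions, not the real game's values.

    PINCH_DISTANCE = 2.0   # thumb-to-index distance (cm) that counts as a pinch
    MAX_PULL_SPEED = 15.0  # upward hand speed (cm/s) above which the plant uproots

    class Plant:
        def __init__(self):
            self.height = 10.0    # cm
            self.uprooted = False

        def update(self, pinch_dist, upward_speed, dt):
            """Grow the plant while it is pinched; uproot it if pulled too fast."""
            if self.uprooted or pinch_dist > PINCH_DISTANCE:
                return  # nothing happens unless the player is pinching the stem
            if upward_speed > MAX_PULL_SPEED:
                self.uprooted = True  # pulled right out of the ground!
            elif upward_speed > 0:
                self.height += upward_speed * dt  # the plant grows with the pull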

I ended up making this game in about 24 hours, because I procrastinated until the very last moment. Here is a gameplay video. You can also download this one from itch.io if you have a Leap Motion and want to try it out!

Of course I'm going to get to Virtual Reality, but one more thing before I do - Augmented Reality. I haven't actually made anything for Augmented Reality yet, but this stuff is crazy cool. It covers basically any way hardware can be used to alter your experience of real physical space, including overlays with AR glasses, digital displays layered over a real-world camera feed (usually on a phone), and of course projections like this one:


This is a height-map projector: it projects a different color based on how close the surface is to the projector. I had a chance to play with one of these recently, and it's great - you can shape the sand with your hands, building mountains and valleys and lakes, and the colors update in real time as you play.
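The underlying idea is easy to sketch: read a depth value for each point on the sand and map it to a terrain color. The band edges below are invented for illustration - a real setup calibrates them to its camera and sandbox:

    # Sketch of the core idea: map each depth sample to a terrain color band.
    # Band edges are made-up values, not a real calibration.

    COLOR_BANDS = [           # (max distance from projector in cm, color)
        (90.0,  "snow"),      # tall peaks sit closest to the projector
        (110.0, "rock"),
        (130.0, "grass"),
        (150.0, "sand"),
    ]

    def color_for_depth(depth_cm):
        """Pick a color for one point on the sand surface."""
        for max_depth, color in COLOR_BANDS:
            if depth_cm <= max_depth:
                return color
        return "water"        # deep valleys read as lakes

    def colorize(depth_map):
        """depth_map: 2D list of distances (cm); returns the projected image."""
        return [[color_for_depth(d) for d in row] for row in depth_map]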

"AR games" very commonly refers to two different types of games - Augmented Reality, which I was just describing as digital content overlaid on the real world, and Alternate Reality, which is like a parallel universe with the same global coordinates as our universe, but instead inhabited by Pokemon.


Okay, so now we can finally get to Virtual Reality - I'm sure that's what you're all here for. I'd like to take this opportunity to draw a distinction between a head-mounted display (HMD), head-tracking, and a full VR headset.

A head-mounted display is like a monitor attached to your face - it's pretty cool that it lets you get up close and personal, but without head-tracking you are limited to looking at a fairly stationary display.


Head-tracking is similar to other forms of motion tracking (hand-tracking, body-tracking, motion-tracked controllers...). It does what you'd expect - it tracks the motion of your head and then uses that data for something useful in the experience. zSpace, which I mentioned earlier, uses head-tracking to create the illusion of a holographic display, though the glasses are clear, so you still see the real world when you look through them.
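Here is a rough sketch of that trick, with invented names and units - the real zSpace system also applies an off-axis projection, which I'm glossing over:

    # Sketch of the head-tracked "holographic window" idea.
    # head_pos and screen_center are (x, y, z) positions in room space (cm);
    # these names are illustrative, not zSpace's actual API.

    def camera_pose_from_head(head_pos, screen_center):
        """Aim the virtual camera from the viewer's eye through the screen,
        so on-screen objects shift with head motion, just like real parallax."""
        return {"position": head_pos, "look_at": screen_center}

    # Each frame: re-render the scene from the viewer's new eye position.
    pose = camera_pose_from_head(head_pos=(5.0, 40.0, 60.0),
                                 screen_center=(0.0, 30.0, 0.0))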



A full VR headset combines head-tracking with a head-mounted display to allow you to move your head around and experience a virtual world as if you were embedded in it.



Virtual Reality offers some pretty amazing opportunities for developers, because it is so different from the traditional gaming experience. Moving your head around in physical space is completely different from using a mouse to pan your view on a 2D monitor, and stepping forward with your feet (in room-scale VR) is quite different from using WASD to update your position in digital space.

In some sense we should be thinking about the interface for VR the way we think about the interface for mobile.



It took a while for developers to realize some amazing things about the touch-screen interface. In the early days, people thought about touch-screens in terms of mouse-and-keyboard controls: buttons that would have been clicked with a mouse were instead "tapped" with a finger. But touch-screens offered a really incredible advantage over the mouse and keyboard, which we discovered when we learned that toddlers could intuitively interact with a touch-screen long before they could use a mouse.

Part of this obviously comes from dexterity, but I think there is another key reason: moving a mouse introduces a disconnect between your action and the result you see - you move a physical object around on a surface, and then you see the result happen on a separate physical object, the computer screen. With a touch-screen there is no disconnect - the thing you are touching is the thing that displays the result of your touch, so the relation between your action and the result is much more intuitive.

VR offers a similar solution for navigation: where we previously used a mouse and WASD to move through a space, now we can use our head and body, which is a much more intuitive translation.
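That difference is easy to see side by side in a sketch (all names here are illustrative): desktop games translate mouse deltas into a camera rotation, while VR uses the tracked head pose directly.

    # Two ways of deciding where the camera looks. Names are illustrative.

    def desktop_look(yaw, pitch, mouse_dx, mouse_dy, sensitivity=0.1):
        """Mouse deltas are an indirect proxy for where the player wants to look."""
        return yaw + mouse_dx * sensitivity, pitch - mouse_dy * sensitivity

    def vr_look(tracked_head_rotation):
        """In VR there is nothing to translate: the head pose IS the camera pose."""
        return tracked_head_rotation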

In part 3 of this segment I will get into the meat of my presentation - how can we stop thinking about the mouse and keyboard when we design for mobile and VR (and mobile VR)?

Continue to part 3.
