Wednesday, October 26, 2016

Interface Design and Unusual Platforms, Part 3

This is the third and final installment of Interface Design and Unusual Platforms. If you haven't read the first two, please start here.

The first two posts were about the incredible variety of hardware available, and how each one provides a unique source of input from the user. In all of my design work I like to start by answering the question "what is unique about this hardware, and what can I do with it that I couldn't do with any other hardware?"

This post will be focused on what I would argue are the two most common pieces of "unusual" hardware - mobile and VR - and how we as designers need to step away from the mindset of designing for mouse and keyboard.

Not long ago I played a VR hangman game where the user would fill in the hangman letters using a digital keyboard floating out in space, reaching out with a motion-tracked controller to "click" on the keys with the controller's trigger. Many VR experiences have attempted to have the player "click" on things in the world, but this interface fell victim to using BOTH a mouse metaphor and a keyboard metaphor. Now I'm not sure if hangman is the best use of VR, but I am certain there is a better option out there than having the player click on letters from a digital keyboard.

On the complete other end of the spectrum is a game called Fantastic Contraption, which was popular many years ago and was recently ported to VR. In this game, every action you can take is about the physical space around you - switching between tools and items happens by reaching with the motion-tracked controller to various parts of your body, which makes it easy to remember where things are - "this tool is always behind my right shoulder." Fantastic Contraption also has an adorable "living" menu in the form of a cat that follows you around in the game. Have a look for yourself:



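That "this tool is always behind my right shoulder" trick is surprisingly easy to prototype. Here's a minimal Unity-style sketch - my own naming and offsets, not Fantastic Contraption's actual code - that anchors a tool "holster" at a fixed offset from the tracked head, using only the head's yaw so the holster doesn't swing around when you nod or look up:

```csharp
using UnityEngine;

// Hypothetical sketch: keep a tool "holster" anchored behind the player's
// right shoulder by following the tracked HMD, using only its yaw so the
// holster stays at body level while the player looks around.
public class ShoulderHolster : MonoBehaviour
{
    public Transform headTransform;  // the tracked HMD (assumed assigned in the Inspector)
    public Vector3 localOffset = new Vector3(0.25f, -0.15f, -0.2f); // right, down, behind

    void LateUpdate()
    {
        // Strip pitch and roll: rotate the offset by the head's yaw only.
        Quaternion yawOnly = Quaternion.Euler(0f, headTransform.eulerAngles.y, 0f);
        transform.position = headTransform.position + yawOnly * localOffset;
        transform.rotation = yawOnly;
    }
}
```

A motion-tracked controller entering this holster's trigger collider could then swap the active tool.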
Another thing to keep in mind with VR is that almost any kind of input from the user's hand risks feeling unnatural. If you use the motion-tracked controller to place a digital hand over something and then pull the trigger to pick it up, it feels unnatural to many users because the shape and texture of the object they see does not match the shape and texture they feel in their hand.

One of my games addressed this by not using hands at all - every interaction is completely gaze-based. Not only does this eliminate the unnatural feeling, but it also plays on a common human fantasy of being able to alter the world with your mind - staring at things to make them explode. You can see the demo video:



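If you're wondering how gaze interaction works under the hood, it usually boils down to a raycast from the center of the head plus a dwell timer. Here's a minimal Unity sketch - the names and the two-second dwell time are illustrative, not the actual code from my game:

```csharp
using UnityEngine;

// Hypothetical gaze-selection sketch: attach to the VR camera, raycast along
// the view direction, and "activate" whatever the player stares at long enough.
public class GazeSelector : MonoBehaviour
{
    public float dwellSeconds = 2f;   // how long the player must stare
    private GameObject currentTarget;
    private float gazeTimer;

    void Update()
    {
        Ray gaze = new Ray(transform.position, transform.forward);
        RaycastHit hit;

        if (Physics.Raycast(gaze, out hit, 100f))
        {
            if (hit.collider.gameObject == currentTarget)
            {
                gazeTimer += Time.deltaTime;
                if (gazeTimer >= dwellSeconds)
                {
                    // "Explode" is a placeholder for whatever the target does.
                    hit.collider.SendMessage("Explode", SendMessageOptions.DontRequireReceiver);
                    gazeTimer = 0f;
                }
            }
            else
            {
                currentTarget = hit.collider.gameObject; // new target, restart the dwell
                gazeTimer = 0f;
            }
        }
        else
        {
            currentTarget = null;
            gazeTimer = 0f;
        }
    }
}
```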
Mobile games have faced similar mouse-and-keyboard tendencies, and have also come up with some inventive alternatives.

One of the first issues that comes to mind for me is when a game has the user tap on something to select it, and then tap somewhere else to move it there. This is a common interaction with a mouse, but on a touch screen it makes so much more sense to drag it across the screen to the new location, because that is how we would move things in real life.
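For what it's worth, dragging is barely more code than tap-then-tap. A rough Unity sketch, assuming a 3D scene and an object with a collider (the class name is mine):

```csharp
using UnityEngine;

// Hypothetical sketch: let the user drag this object with a finger instead
// of tapping it and then tapping a destination.
public class TouchDraggable : MonoBehaviour
{
    private bool dragging;

    void Update()
    {
        if (Input.touchCount == 0) { dragging = false; return; }

        Touch touch = Input.GetTouch(0);
        Ray ray = Camera.main.ScreenPointToRay(touch.position);
        RaycastHit hit;

        if (touch.phase == TouchPhase.Began &&
            Physics.Raycast(ray, out hit) && hit.transform == transform)
        {
            dragging = true; // the finger landed on this object
        }
        else if (dragging && touch.phase == TouchPhase.Moved)
        {
            // Follow the finger: project the touch into the world at this object's depth.
            Vector3 screenPoint = new Vector3(touch.position.x, touch.position.y,
                Camera.main.WorldToScreenPoint(transform.position).z);
            transform.position = Camera.main.ScreenToWorldPoint(screenPoint);
        }
    }
}
```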

Another issue on mobile, which isn't a problem with mouse-and-keyboard, is that if something is in the top-left corner (or the top-right corner for left-handed users), the user's hand will completely block the screen when they try to touch it.

Some of the more intuitive interactions include the two-finger pinch and stretch to zoom in and out, which I would argue was a stroke of genius when it was first used. That one must have taken a fair amount of pondering, because there is no comparable interaction with the mouse and keyboard - it was completely unique to the touch-screen.
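Ironically, for all the pondering it must have taken to invent, pinch-to-zoom takes only a few lines to implement. A hedged Unity sketch, assuming an orthographic camera:

```csharp
using UnityEngine;

// Hypothetical pinch-to-zoom sketch: compare the distance between two fingers
// this frame against last frame and zoom the camera by the difference.
public class PinchZoom : MonoBehaviour
{
    public Camera cam;              // assumed orthographic
    public float zoomSpeed = 0.01f;

    void Update()
    {
        if (Input.touchCount != 2) return;

        Touch a = Input.GetTouch(0);
        Touch b = Input.GetTouch(1);

        // Where each finger was one frame ago.
        Vector2 aPrev = a.position - a.deltaPosition;
        Vector2 bPrev = b.position - b.deltaPosition;

        float prevDistance = (aPrev - bPrev).magnitude;
        float distance = (a.position - b.position).magnitude;

        // Fingers spreading apart -> zoom in; pinching together -> zoom out.
        cam.orthographicSize = Mathf.Max(1f,
            cam.orthographicSize - (distance - prevDistance) * zoomSpeed);
    }
}
```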

Most interactions where you swipe across the screen are also fairly intuitive, especially when you swipe towards the edge of the screen to throw something away. There is a VR equivalent to this which you can see in Job Simulator where the user throws objects toward the edge of the space to discard them.
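Detecting that kind of swipe is also straightforward: treat a fast touch that ends near a screen edge as a throw. A small sketch - the speed and margin thresholds are made-up starting points, not tuned values:

```csharp
using UnityEngine;

// Hypothetical swipe-to-discard sketch: a touch that ends near a screen edge
// with enough speed counts as "throw this away".
public class SwipeToDiscard : MonoBehaviour
{
    public float minSpeed = 1000f;   // pixels per second (untuned guess)
    public float edgeMargin = 100f;  // pixels from the left/right screen edge

    void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Ended) return;

        float speed = touch.deltaPosition.magnitude / Mathf.Max(touch.deltaTime, 0.0001f);
        bool nearEdge = touch.position.x < edgeMargin ||
                        touch.position.x > Screen.width - edgeMargin;

        if (speed >= minSpeed && nearEdge)
            Debug.Log("Discard the held item"); // replace with your own discard logic
    }
}
```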

I think part of the reason we end up defaulting to mouse-and-keyboard controls is that the majority of games are developed on a computer using a mouse and keyboard, regardless of the target platform. It can take a while before the developer is able to test on the target platform, and by that time they may have already gotten comfortable with the controls - the interface is intuitive to them, so they don't notice the problem. We might see this improve somewhat as VR developers become able to build their games while inside VR, a feature some of the bigger game engines are releasing.

So, next time you are developing for anything other than mouse-and-keyboard, here is a challenge for you. Before you build any interactions or think about any kind of input, first think about what is going to happen in the game - what will the in-game player do? Then imagine yourself using the target hardware, or go use it with nothing running, and imagine performing those in-game actions. Go ahead and pick up a blank phone or put on a dark VR headset, close your eyes, and feel the interaction with the hardware.

Thanks for reading! Feel free to leave thoughts and suggestions in the comments - especially if you have examples of interesting interfaces for interesting hardware (both good and bad). Also subscribe and follow :)


Interface Design and Unusual Platforms, Part 2

This is a continuation of part 1, and I would recommend reading that first for some background info.

My last post left off with the Wii Fit Balance Board and Kinect for a game played on hoverboards battling giant robot spiders. How can I possibly top that?

Well, up next is the Leap Motion (not to be confused with the AR company Magic Leap) - it bears some similarity to the Kinect in terms of hardware, but it is used to track your hands in great detail...every movement of every finger mapped to a skeleton!


I entered a Leap Motion Game Jam to get a free Leap Motion (one of the best ways to get free hardware!). As always I started with the question of "what makes this hardware unique and what can I do with it that I couldn't do with any other hardware?"

Other games have tried to simulate gardening and farming, but their shortcoming is that growing plants really involves working with your hands - something hard to replicate in a video game using a keyboard or controller. The Leap Motion is unique in that it allows you to use your hands with the full range of dexterity we have in real life. So I made a garden simulator. It's important to remember, though, that digital experiences do not need to perfectly match their real-world inspiration; sometimes it can be more fun to do something weird. So I took a weird twist on gardening - the player's interaction is to literally pinch the plant and pull it upward to make it grow. Then you can dip your hand in a water bucket and sprinkle the plant with water to make it bloom. But beware! If you pull the plant too quickly you will pull it right out of the ground!
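The logic behind the pull mechanic is simpler than it sounds: grow the plant in proportion to the hand's upward speed, and uproot it past a threshold. Here's a rough sketch - the callback is my own invention, standing in for whatever your hand-tracking SDK (Leap, in my case) reports about pinching and hand velocity:

```csharp
using UnityEngine;

// Hypothetical sketch of the pinch-and-pull growth mechanic. Assumes some
// hand-tracking layer calls OnPinchPull each frame with the pinching hand's
// velocity; this is illustrative wiring, not a real Leap Motion API.
public class PullablePlant : MonoBehaviour
{
    public float maxPullSpeed = 0.5f; // meters/second before the plant uproots (untuned)
    public float growthRate = 1f;     // growth per meter of upward pull

    public void OnPinchPull(Vector3 handVelocity)
    {
        if (handVelocity.y <= 0f) return; // only upward pulls grow the plant

        if (handVelocity.y > maxPullSpeed)
        {
            Uproot(); // pulled too quickly - out of the ground it comes!
            return;
        }

        // Grow in proportion to how far the hand moved up this frame.
        transform.localScale += Vector3.up * handVelocity.y * growthRate * Time.deltaTime;
    }

    void Uproot()
    {
        // Hand the plant over to physics so it pops free of the soil.
        gameObject.AddComponent<Rigidbody>();
        enabled = false;
    }
}
```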

I ended up making this game in about 24 hours, because I procrastinated until the very last moment to get started. Here is a gameplay video. This one you can also download from itch.io if you have a Leap Motion and want to test it out!

Of course I'm going to get to Virtual Reality, but one more thing before I do - Augmented Reality. I haven't actually made anything for Augmented Reality yet, but this stuff is crazy cool. It basically covers any way hardware can be used to alter your experience of real physical space, including overlays in AR glasses, digital layers on top of a real-world camera feed (usually on a phone), and of course projections like this one:


This is a height-map projector: it projects a different color onto each part of the surface based on how close that surface is to the projector. I had a chance to play with one of these recently, and it's great - you can shape the sand with your hands, building mountains and valleys and lakes, and the colors update in real time as you play.
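The core of that effect is just a function from surface height to color. A toy sketch of the mapping - the band thresholds and colors are made up, and a real rig evaluates something like this per pixel over a depth-camera feed:

```csharp
using UnityEngine;

// Hypothetical height-to-color mapping like an AR sandbox uses: low areas
// read as water, middle as grass and dirt, high as snowy peaks.
public static class HeightColorMap
{
    public static Color Evaluate(float height, float minHeight, float maxHeight)
    {
        float t = Mathf.InverseLerp(minHeight, maxHeight, height); // normalize to 0..1

        if (t < 0.3f)  // lakes
            return Color.Lerp(new Color(0f, 0.2f, 0.6f), new Color(0.2f, 0.5f, 0.9f), t / 0.3f);
        if (t < 0.7f)  // plains rising into hills
            return Color.Lerp(new Color(0.1f, 0.6f, 0.1f), new Color(0.5f, 0.4f, 0.2f), (t - 0.3f) / 0.4f);
        return Color.Lerp(new Color(0.5f, 0.4f, 0.2f), Color.white, (t - 0.7f) / 0.3f); // peaks
    }
}
```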

The term "AR games" commonly refers to two different types of games - Augmented Reality, which I was just describing as digital content overlaid on the real world, and Alternate Reality - which is like a parallel universe with the same global coordinates as our universe, but instead inhabited by Pokemon.


Okay, so now we can finally get to Virtual Reality - I'm sure that's what you're all here for. I'd like to take this opportunity to draw a distinction between Head-Mounted Displays (HMDs), head-tracking, and full VR headsets.

A Head-Mounted Display is like a monitor attached to your face - it's pretty cool that it lets you get up close and personal, but the lack of head-tracking means you are looking at a fairly stationary display.


Head-tracking is similar to other forms of motion tracking (hand-tracking, body-tracking, motion-tracked controllers...). It does what you'd expect - it tracks the motion of your head and then uses that data for something useful in the experience. zSpace, which I covered in part 1, utilizes head-tracking to create the illusion of a holographic display, though the glasses are clear, so you still see the real world when you look through them.



A full VR headset combines head-tracking with a head-mounted display to allow you to move your head around and experience a virtual world as if you were embedded in it.



Virtual Reality offers some pretty amazing opportunities for developers, because it is a radical departure from the traditional gaming experience. Moving your head around in physical space is completely different from using a mouse to pan your view on a 2D monitor, and stepping forward with your feet (in room-scale VR) is quite different from pressing WASD to update your position in the digital space.

In some sense we should be thinking about the interface for VR the way we think about the interface for mobile.



It took a while for developers to realize some amazing things about the touch-screen interface. In the early days people thought about touch-screens in terms of mouse-and-keyboard controls: buttons that would have been clicked with a mouse were instead "tapped" with a finger. But touch-screens offered a really incredible advantage over mouse-and-keyboard, which we discovered when we learned that toddlers could intuitively interact with a touch-screen long before they could use a mouse. Part of this obviously comes from dexterity, but I think there is another key reason: moving a mouse introduces a disconnect between your action and the result you see - you move a physical object around on a surface, and the result appears on a separate physical object, the computer screen. With a touch-screen there is no disconnect; the thing you are touching is the thing that displays the result of your touch, so the relationship between your action and the result is much more intuitive.

VR offers a similar improvement for navigation: where we previously used a mouse and WASD to move through a space, we can now use our head and body - a much more intuitive translation.

In part 3 of this segment I will get into the meat of my presentation - how can we stop thinking about the mouse and keyboard when we design for mobile and VR (and mobile VR)?

Continue to part 3.

Sunday, October 23, 2016

Interface Design and Unusual Platforms, Part 1

I'm going to do something a little bit different today. In a couple of weeks I'll be going to Unite LA (the big Unity3D conference) as a speaker, and my talk is on Interface Design for Mobile and VR (and other unusual platforms). I've been trying to work on my talk, but I keep finding myself with writer's block, and I think I find it easier to blog than to assemble a presentation, so I'm going to jot down my ideas here. Feedback is welcome :)

I'd like to start off with an overview of unusual hardware, and specifically give some examples of how I've designed for some of them. Then I will move on to talking specifically about designing interfaces.

In no particular order, here is a list of all of the variety of hardware I can think of (and I have developed on each of them): mouse, keyboard, gamepad, motion-tracking controller, head-tracking, touch-screen, Kinect, Wii balance board, heart-rate monitor, 3D stylus (zSpace), 360 treadmill, Leap Motion. You may notice that some of these are a type of hardware, whereas others are a specific brand. That is because I am lazy, and also some types of input are recognized more for one brand than for the type of input. Think of it like saying "Kleenex" instead of "facial tissue". People know what you mean.

Mouse, keyboard, and gamepad are probably the most common inputs people think of when they think of game development, so I am not going to talk about those. As I talk about each of the other types of hardware, I want to emphasize that my intention with design is to answer this question: "What is unique about this hardware that no other hardware can do?"

Let's start with the zSpace, the one you are most likely to have never heard of.




The zSpace is a semi-holographic display tablet with a 3D stylus and special glasses that have markers on them, allowing the display to be projected toward you for a full 3D effect. Without actually using one it can be very hard to imagine, so you will just have to trust me that when you use it, everything you see looks completely real. Not "real" in the sense that it is photorealistic - some things are very stylized - but real in the sense that you believe you could reach out and pick it up, like a toy.

When I was first presented with this device I had a team of artists and programmers working with me, and the only direction we were given was: "this device is used for tools, education, training, and medical applications - see if you can use it to make a game." And so we set out. We began by orienting ourselves to the hardware, using the existing apps, then building some prototypes. We were looking for "fun" in the interactions, but we were also looking at what made the device really unique, and we quickly discovered both in the form of "drawing" 3D shapes using the 3D stylus. In particular, drawing spheres was very satisfying and enjoyable.

So how can you make a game about drawing spheres? In parallel with our experiments, we had also been brainstorming game concepts, and had mostly settled on the idea of floating islands (to make full use of the 3D effect) that had bunnies on them, and the bunnies were being hunted by foxes. As the player, your goal was to keep the bunnies safe. We tied this to our sphere-drawing mechanic by having the player draw bubbles around the bunnies to make them float up out of harm's way and over fenced-in safe zones where foxes couldn't reach them (there was also a small random chance of a bubble blowing away in a wind gust, to add anticipation). In addition to drawing bubbles, the player could also use the stylus to pick up items - a carrot to lure the bunnies, a stick to pop bubbles, and a fan to move bubbles around.

In the end it was pretty enjoyable, but there was one thing missing...how do you represent the stylus in the 3D world in a meaningful way? That answer came from one of our artists, who decided to make it a pair of chopsticks. It did the trick of tying all of our interactions together. See it in action here.

Next up: 360 Treadmill. This is in essence a treadmill where you can walk in any direction while staying in place.



A couple of years ago I worked as a developer for Virtuix Omni, one of the first such devices. What makes this hardware unique is pretty obvious: the input is which way and how fast the player is walking. It is actually similar to input from a joystick on a gamepad, so it translates fairly well to most first-person experiences. The big difference is that you are physically walking, and it is usually paired with a VR headset, so it gives you full immersion in your experience. My first thought, which a lot of other developers have also had for VR, was virtual tourism. I think it is a bit more compelling on the Omni than in seated VR because the Omni lets you walk around the virtual destination. I did try to set mine apart a bit by thinking about what we usually do when we travel - we take pictures. So I made a virtual tour of Amsterdam where the only interactions are walking and taking pictures, and those pictures (screenshots, in this case) are saved to your computer so you have them to keep when you return from your virtual tour. You can check it out here.
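The picture-taking part is nearly free in Unity - it's essentially one call. A sketch (newer Unity exposes this as ScreenCapture.CaptureScreenshot; older versions used Application.CaptureScreenshot):

```csharp
using UnityEngine;

// Minimal sketch of the "take a photo" interaction: on a button press, save a
// screenshot with a timestamped filename.
public class TourCamera : MonoBehaviour
{
    void Update()
    {
        if (Input.GetButtonDown("Fire1")) // map this to your controller's shutter button
        {
            string filename = "vacation_" +
                System.DateTime.Now.ToString("yyyyMMdd_HHmmss") + ".png";
            ScreenCapture.CaptureScreenshot(filename);
        }
    }
}
```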

Now for a real treat: the Wii Fit Balance Board!



The Balance Board is a set of four pressure sensors, one in each corner, that together can tell very accurately which way you are leaning. When you first stand on the board it asks you to stand still (that's right, the board talks to you) while it calibrates to your weight. After that it can tell any direction you lean based on where you are putting pressure. If you pick up one foot and step forward or backward, it knows which foot moved and which way (based on the foot still on the board placing more pressure on the toe or the heel). I have tried unsuccessfully to trick it - it's a pretty smart board!
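For the curious, turning the four corner readings into a lean direction is just a weighted average - the center of pressure. A sketch (the class and layout names are mine):

```csharp
using UnityEngine;

// Hypothetical sketch: the board's four corner sensors reduce to a 2D lean
// vector by taking the center of pressure, normalized by total weight.
public static class BalanceBoardMath
{
    // Corner readings in kilograms: front-left, front-right, back-left, back-right.
    public static Vector2 CenterOfPressure(float fl, float fr, float bl, float br)
    {
        float total = fl + fr + bl + br;
        if (total <= 0f) return Vector2.zero; // nobody on the board

        float x = ((fr + br) - (fl + bl)) / total; // -1 = fully left, +1 = fully right
        float y = ((fl + fr) - (bl + br)) / total; // -1 = fully back, +1 = fully forward
        return new Vector2(x, y);
    }
}
```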

While in grad school I was assigned to a team working with the U.S. military to design a fitness game. Obviously a mouse and keyboard were not going to work for this, so we set out to try all of the hardware we could find that was used for fitness games. We experimented with phones, PS Moves, and devices that track biometric feedback before settling on a bizarre combination - we decided to pair the Balance Board with a Kinect to track both shifting weight and upper-body movement. Oh yeah, and we added in a heart rate monitor :D

Here's how the game worked - we used two balance boards so that two players could each control a hoverboard in the game by leaning to steer. With their arms they could reach out to collect things or punch to "fire" a projectile (a ball of plasma that would come rocketing out of your fist). We also strapped heart rate monitors onto the players - partly because we needed the data, but we also decided to add a gameplay element around them. When a player stepped onto the balance board we took an initial heart rate reading, and as the game progressed and the player's heart rate rose, we made their plasma balls bigger and more damaging. The game was cooperative, so the two players worked together to fight off giant robot spiders, but at the end we gave them a score that combined how many spiders they killed and how many items they picked up. A fascinating thing happened when we added the heart rate effect - it actually evened out the scores between more-fit and less-fit players. Since we were taking an initial reading and tracking the increase, players who were already in good shape had to work harder to get the positive effect. We found through testing that this leveling of the playing field made the game more fun for everyone. Want to see a gameplay video? Of course you do.
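The handicapping logic itself is tiny. A sketch of the idea - the 50% rise and the 2x cap are illustrative numbers, not the values we shipped:

```csharp
using UnityEngine;

// Hypothetical sketch of the heart-rate handicap: damage scales with each
// player's rise over their own resting baseline, so fitter players must work
// just as hard as less-fit players for the same bonus.
public class PlasmaScaler
{
    private readonly float baselineBpm; // read when the player steps on the board

    public PlasmaScaler(float baselineBpm)
    {
        this.baselineBpm = baselineBpm;
    }

    public float DamageMultiplier(float currentBpm, float maxBonus = 2f)
    {
        // A 50% rise over baseline maxes out the bonus (illustrative tuning).
        float rise = Mathf.Clamp01((currentBpm - baselineBpm) / (baselineBpm * 0.5f));
        return 1f + rise * (maxBonus - 1f);
    }
}
```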

I really want to tell you about the rest of the hardware, but this post is now fairly long and I need a break, so I will continue in part two! Also, this seems to have done the trick to beat my writer's block!

Don't forget to subscribe and follow!

Continue to part 2.

Wednesday, October 19, 2016

IndieCade Festival

Over the weekend I had the fantastic opportunity to hang out at the IndieCade Festival for Independent Game Developers. IndieCade is one of my favorite game dev events because it is really focused on Indies - it showcases the best indie games in development (or recently released), it offers indie devs a chance to network with each other and with potential publishers, and it has a great lineup of talks targeting the most useful information for development as an indie studio.

There were four talks I found particularly useful - the first was called "So you want to start a game company? Corporate Formation and IP Strategy" and was led by Jonathan Pearce. I've been operating Astire Games as a Sole Proprietor with a DBA (I hope...), but I have run into some limiting factors by not being incorporated (most notably I've been trying to become a Sony developer, and as far as I can tell you need confirmation of your corporate entity). Anyway, I've had a lot of concerns and questions about forming an LLC versus an Inc, issues with IP and with paying employees (or offering rev share), the potential for incorporating in a different state, and deep-seated fears about legal issues arising in my entrepreneurial endeavors. Jonathan's talk was both inspiring and informative, and laid to rest some of my biggest concerns, though I still have some more decisions to make - I am completely on the fence about what state to incorporate in.

The second talk I attended was actually a workshop called "Paper Prototyping" and was run by several experienced educators, one of whom I spent some time talking to - Michael Annetta (sadly I do not recall the names of the others involved). This two-hour session took us through the ins and outs of developing a game idea from scratch, and although a lot of it was stuff I already knew, it was awesome to see it from a new perspective and it gave me some great ideas for better ways to teach design and prototyping (as you know, I teach level design, UI design, and rapid prototyping at the Art Institute). This workshop was also very hands-on, and I made some new friends that I continued to see throughout the rest of the festival.

On the second day I finally ended up at a VR talk called "Non Photoreal VR" about the visual design side of VR, dealing with performance issues, and tricks for making an environment feel "more 3D" using parallax, lighting, and fog.

The last talk I attended was by a friend of mine - Chris DeLeon (@ChrisDeLeon) - called "Starting Meetups that Make Games." In his talk, Chris gave some background on his experience running several different game developer meetups/clubs where participants would form teams and create and publish games in their spare time (often either students or folks working in a non-game field, or game developers who want to try a different discipline outside of their normal job). Two of the biggest problems facing game dev students when they graduate are 1.) not having a shipped title and 2.) not having experience working in a cross-discipline team. As we learned in this talk, a game dev meetup can help solve both of these problems.

I was inspired by Chris' talk to set into motion something I have been considering for a while - I want to give my students at the Art Institute (as well as students at the other colleges in Austin) a chance to get some hands-on work in an interdisciplinary group on a game that will ship. This is now one of my top priorities - I have always been passionate about the idea of improving education, and here is something very tangible that I am well positioned to contribute.

Aside from the four fabulous talks, IndieCade also offered me one other tremendous opportunity. IndieXChange (one of the IndieCade tracks) offers something called "Speed Dating" where developers get five minutes to meet with a publisher and pitch their idea, then can follow up afterward with those publishers. I was lucky enough to get to meet with reps from Sony and a rep from Oculus, and discuss my VR project Sundown Arcadia. It was an exhilarating experience - I always love an opportunity to talk up my games, and having an avid audience of publishers listening was quite thrilling.

One of the best parts about IndieCade, in my opinion, is the quality of the connections being made. At your average game dev conference you walk away with a big stack of business cards from people you are probably not going to contact. I left IndieCade with just seven business cards, but I have a very specific follow-up email for each one of them.

Between the superb talks, the phenomenal pitch opportunity, and the unique networking connections, I would call my IndieCade experience a success!