The first two posts were about the incredible variety of hardware available, and how each one provides a unique source of input from the user. In all of my design work I like to start with answering the question "what is unique about this hardware, and what can I do with it that I couldn't do with any other hardware?"
This post will be focused on what I would argue are the two most common pieces of "unusual" hardware - mobile and VR - and how we as designers need to step away from the mindset of designing for mouse and keyboard.
Not long ago I played a VR hangman game where the user filled in the hangman letters using a digital keyboard floating out in space, reaching out with their motion-tracked controller to "click" on the keys using the trigger. Many VR experiences attempt to have the player "click" on things in the world, but this interface fell victim to mimicking BOTH a mouse input and a keyboard input. Now I'm not sure hangman is the best use of VR, but I am certain there is a better option out there than having the player click on letters from a digital keyboard.
On the complete other end of the spectrum is a game called Fantastic Contraption, which was popular many years ago and was recently ported to VR. In this game, every action you can take is about the physical space around you - switching between tools and items happens by reaching with the motion-tracked controller to various parts of your body, which makes it easy to remember where things are - "this tool is always behind my right shoulder." Fantastic Contraption also has an adorable "living" menu in the form of a cat that follows you around in the game. Have a look for yourself:
Another thing to keep in mind with VR is that almost any kind of input from the user's hand risks feeling unnatural. If you use the motion-tracked controller to place a digital hand over something and then pull the trigger to pick it up, the interaction feels unnatural to many users because the shape and texture of the object they see does not match the shape and texture of the controller they feel in their hand.
One of my games addressed this by not using hands at all - every interaction is completely gaze-based. Not only does this eliminate the unnatural feeling, but it also plays on a common human fantasy of being able to alter the world with your mind - staring at things to make them explode. You can see the demo video:
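For the curious, gaze-based selection like this is commonly built from two pieces: a ray test from the player's head, and a dwell timer so that merely glancing at something doesn't trigger it. Here is a minimal sketch in Python - the names, the 1.5-second dwell, and the sphere-shaped targets are my assumptions for illustration, not the game's actual implementation:

```python
import math

DWELL_SECONDS = 1.5  # assumed dwell time; real games tune this carefully


def gaze_hit(gaze_origin, gaze_dir, target_center, target_radius):
    """Return True if the gaze ray passes within target_radius of the target.

    gaze_dir is assumed to be a unit vector. We project the vector from the
    ray origin to the target onto the gaze direction, then measure the
    perpendicular distance from the ray to the target center.
    """
    to_target = [t - o for t, o in zip(target_center, gaze_origin)]
    along = sum(a * b for a, b in zip(to_target, gaze_dir))
    if along < 0:  # target is behind the player
        return False
    closest = [o + along * d for o, d in zip(gaze_origin, gaze_dir)]
    return math.dist(closest, target_center) <= target_radius


class DwellSelector:
    """Accumulates gaze time on a target and fires once the dwell elapses."""

    def __init__(self, dwell=DWELL_SECONDS):
        self.dwell = dwell
        self.timer = 0.0

    def update(self, looking_at_target, dt):
        """Call once per frame; returns True on the frame selection triggers."""
        if looking_at_target:
            self.timer += dt
            if self.timer >= self.dwell:
                self.timer = 0.0
                return True
        else:
            self.timer = 0.0  # reset as soon as the gaze wanders off
        return False
```

The dwell timer is what makes gaze input comfortable: without it, everything the player happens to look at would explode immediately.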
Mobile games have faced similar mouse-and-keyboard tendencies, and have also come up with some inventive alternatives.
One of the first issues that comes to mind for me is when a game has the user tap on something to select it, and then tap somewhere else to move it there. This is a common interaction with a mouse, but on a touch screen it makes so much more sense to drag it across the screen to the new location, because that is how we would move things in real life.
Another issue on mobile, which isn't a problem with mouse-and-keyboard, is that if something is in the top-left corner (or top-right corner for left-handed users), the user's hand will completely block the screen when they reach to touch it.
Some of the more intuitive interactions include the two-finger pinch and stretch to zoom in and out, which I would argue was a stroke of genius when it first appeared. That one must have taken a fair amount of pondering, because there is no comparable interaction with the mouse and keyboard - it was completely unique to the touch screen.
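Part of why pinch-to-zoom feels so natural, despite having no desktop analogue, is that the underlying idea is tiny: the zoom factor is just the ratio of the distances between the two fingers from one frame to the next. A rough sketch in Python - the function names and clamp limits are my own invention:

```python
import math


def pinch_scale(prev_touches, curr_touches):
    """Return the zoom factor implied by two touch points moving.

    Each argument is a pair of (x, y) finger positions. A factor > 1 means
    the fingers spread apart (zoom in); < 1 means they pinched together
    (zoom out).
    """
    prev_dist = math.dist(prev_touches[0], prev_touches[1])
    curr_dist = math.dist(curr_touches[0], curr_touches[1])
    if prev_dist == 0:  # degenerate: fingers started at the same point
        return 1.0
    return curr_dist / prev_dist


def apply_zoom(zoom, factor, lo=0.5, hi=4.0):
    """Fold a per-frame pinch factor into the camera zoom, clamped to limits."""
    return max(lo, min(hi, zoom * factor))
```

Applied every frame while two fingers are down, this gives the continuous zoom we all know; the clamping keeps the camera from zooming off to nothing.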
Most interactions where you swipe across the screen are also fairly intuitive, especially when you swipe towards the edge of the screen to throw something away. There is a VR equivalent to this which you can see in Job Simulator where the user throws objects toward the edge of the space to discard them.
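A swipe-to-discard gesture like that can be reduced to a couple of thresholds: how fast the finger was moving when it lifted, and how close to a screen edge it ended up. A hedged sketch in Python, handling horizontal throws only - the speed and edge thresholds are invented for illustration:

```python
def is_discard_fling(start, end, dt, screen_w, min_speed=1000.0, edge_frac=0.15):
    """Did a horizontal drag end fast enough, and close enough to a screen
    edge, to count as 'throwing it away'?

    start/end are (x, y) positions in pixels, dt is the drag duration in
    seconds, and min_speed is in pixels per second. Thresholds are assumed
    values a real game would tune per device.
    """
    dx = end[0] - start[0]
    speed = abs(dx) / dt if dt > 0 else 0.0
    near_left = end[0] <= screen_w * edge_frac
    near_right = end[0] >= screen_w * (1 - edge_frac)
    return speed >= min_speed and (near_left or near_right)
```

Requiring both speed and proximity to the edge is what separates a deliberate throw from an ordinary drag that happens to drift sideways.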
I think part of the reason we end up defaulting to mouse-and-keyboard controls is that the majority of games are developed on a computer using a mouse and keyboard, regardless of the target platform. It can take a while before the developer is able to test on that platform, and by then they may have grown so comfortable with the controls that everything feels intuitive to them and they don't notice the problem. We might see this improve somewhat as VR developers begin building their games while inside VR, a workflow some of the bigger game engines are releasing.
So, next time you are developing for anything other than mouse-and-keyboard, here is a challenge for you. Before you build any interactions or think about any kind of input, first think about what is going to happen in the game - what is the in-game player going to do - then imagine yourself using the target hardware, or go and use it with nothing running, and imagine making those in-game interactions. Go ahead and pick up a blank phone or put on a dark VR headset, close your eyes, and feel the interaction with the hardware.
Thanks for reading! Feel free to leave thoughts and suggestions in the comments - especially if you have examples of interesting interfaces for interesting hardware (both good and bad). Also subscribe and follow :)