
Saturday, May 6, 2017

Game Prototyping in Unity, Part 1: Creating an Environment

This 5-part blog series covers content that I recently made into a Pluralsight tutorial called Game Prototyping in Unity. Each post summarizes what you would learn in the tutorial, to help you decide whether it's right for you before subscribing to the Pluralsight course.

In Part 1 we will cover the process of Creating an Environment. The first step is to create a 2D layout which will be your plan for the level design, and you can do this with any 2D tool you are comfortable with - Photoshop, MS Paint, even paper and pencil!

It's important to create a 2D layout before you start building in 3D because it lets you explore multiple options, make changes quickly over a few attempts, get feedback early, and gives you a plan so you can stay on track once you begin building.

Benefits of creating a 2D layout:

  • Explore multiple options
  • Make changes quickly
  • Get feedback early
  • Have a plan

When I'm getting started on a 2D layout, I like to start with the player's start location, then fill in some details around it. Be sure to label everything as you add new elements, think about where to add interactions, and add a clear and distinct goal.

Once you have a top-down layout, it's time to move into Unity. Import the layout, apply it to a plane, and use it as a reference to build on top of.


Be sure to scale it so it is an appropriate size compared to a character controller. Most character controllers are 1 meter wide, so you can use a unit cube to measure while you scale your layout. 
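If you prefer to see the scale math spelled out, here is a minimal C# sketch (not from the course - the names LayoutReference and metersPerPixel are placeholders I made up) that drops a layout texture onto a plane, scales it so each pixel maps to a chosen number of meters, and spawns a unit cube as a 1-meter reference. In practice you would likely just do this by hand in the editor.

```csharp
using UnityEngine;

// Hypothetical helper: drops an imported layout image onto a plane
// and scales it so one layout pixel maps to a chosen number of meters.
public class LayoutReference : MonoBehaviour
{
    public Texture2D layoutTexture;      // the 2D layout you drew
    public float metersPerPixel = 0.05f; // assumed scale: 20 px = 1 m

    void Start()
    {
        // Unity's built-in plane is 10 x 10 units at scale (1,1,1).
        GameObject plane = GameObject.CreatePrimitive(PrimitiveType.Plane);
        plane.GetComponent<Renderer>().material.mainTexture = layoutTexture;

        float width  = layoutTexture.width  * metersPerPixel;
        float length = layoutTexture.height * metersPerPixel;
        plane.transform.localScale = new Vector3(width / 10f, 1f, length / 10f);

        // A unit cube next to the plane gives a quick 1 m reference
        // for checking the scale against a character controller.
        GameObject.CreatePrimitive(PrimitiveType.Cube);
    }
}
```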

The next step is Whiteboxing - the process of building a layout using 3D primitives such as cubes, spheres, cylinders, etc. 


For any part of the layout that would be interactive, we want to create Prefabs, which allow us to easily apply changes to a large number of objects at once and ensure that they all have the same visuals and behaviors.
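As a rough illustration (this exact component isn't part of the course), here is the kind of script you might attach to an interactive Prefab - every instance in the scene shares the behavior, and a change to the prefab updates all of them at once:

```csharp
using UnityEngine;

// Hypothetical component for an interactive prefab. Every instance placed
// in the scene shares this behavior, and editing the prefab updates them all.
public class Collectible : MonoBehaviour
{
    public int scoreValue = 10;    // tweak once on the prefab
    public float spinSpeed = 90f;  // degrees per second

    void Update()
    {
        // Simple idle spin so collectibles are easy to spot while whiteboxing.
        transform.Rotate(0f, spinSpeed * Time.deltaTime, 0f);
    }

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            // Award points here (scoring system not shown), then remove the object.
            Destroy(gameObject);
        }
    }
}
```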


Once you have your interactive Prefabs, you'll want to start placing them around the scene. Before you do, though, it's important to give some thought to the Level Flow - how the difficulty increases over time. You don't need to place your interactive objects exactly where you have them in the layout; this is the perfect time to try some different options and get a good feel for the progression. It can be tempting to make the difficulty increase linearly:


However, a better approach is to add a challenge, followed by a moment of rest, and oscillate between increasing and decreasing difficulty:


It's important not to increase the difficulty too quickly, or you will cause anxiety, but you also don't want to increase it too slowly, or you will cause boredom.
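If it helps to make that concrete, here is one hedged way to express that flow in code: an overall upward trend with an oscillation layered on top, so each spike in challenge is followed by a dip for rest. The specific numbers are just placeholders.

```csharp
using UnityEngine;

// Hypothetical difficulty curve: a steady upward trend with an oscillation
// layered on top, so each challenge is followed by a moment of rest.
public static class LevelFlow
{
    // progress: 0 at the start of the level, 1 at the goal.
    public static float Difficulty(float progress)
    {
        float trend = Mathf.Lerp(0.2f, 1f, progress);               // overall increase
        float rest  = 0.15f * Mathf.Sin(progress * Mathf.PI * 6f);  // peaks and valleys
        return Mathf.Clamp01(trend + rest);
    }
}
```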

Once you have given some thought to your level flow, you can place your interactive prefabs in the scene. 


Test regularly as you are adding interaction points, to make sure that your level flow feels good. I recommend testing with Unity's default first-person controller during this phase.

Now you have a whiteboxed scene in Unity. Part 2 will cover importing art and sound assets and integrating them into the whitebox. 

If you would like to follow along with the full video tutorial, please check it out on Pluralsight!

Wednesday, October 26, 2016

Interface Design and Unusual Platforms, Part 3

This is the third and final installment of Interface Design and Unusual Platforms. If you haven't read the first two, please start here.

The first two posts were about the incredible variety of hardware available, and how each one provides a unique source of input from the user. In all of my design work I like to start with answering the question "what is unique about this hardware, and what can I do with it that I couldn't do with any other hardware?"

This post will be focused on what I would argue are the two most common pieces of "unusual" hardware - mobile and VR - and how we as designers need to step away from the mindset of designing for mouse and keyboard.

Not long ago I played a VR hangman game where the user would fill in the hangman letters using a digital keyboard floating out in space, reaching out with their motion-tracked controller to "click" on the keys using the trigger. Many VR experiences have attempted to have the player "click" on things in the world, but this interface fell victim to mimicking BOTH a mouse input and a keyboard input. Now I'm not sure hangman is the best use of VR, but I am certain there is a better option than having the player click on letters on a digital keyboard.

On the complete other end of the spectrum is a game called Fantastic Contraption, which was popular many years ago but was recently ported to VR. In this game, every action you can take is about the physical space around you - switching between tools and items happens by reaching with the motion-tracked controller to various parts of your body, and this makes it easy to remember where things are - "this tool is always behind my right shoulder." Fantastic Contraption also has an adorable "living" menu in the form of a cat that follows you around in the game. Have a look for yourself:



Another thing to keep in mind with VR is that almost any kind of input from the user's hand risks feeling unnatural. If you use the motion-tracked controller to place a digital hand over something and then pull the trigger to pick it up, it feels wrong to many users because the shape and texture of the object they see does not match the shape and texture they feel in their hand.

One of my games addressed this by not using hands at all - every interaction is completely gaze-based. Not only does this eliminate the unnatural feeling, but it also plays on a common human fantasy of being able to alter the world with your mind - staring at things to make them explode. You can see the demo video:



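For anyone curious what gaze-based interaction looks like in code, here is a minimal sketch (not the actual implementation from my game): cast a ray from the headset camera every frame and activate whatever the player has stared at for long enough. The dwell time and the OnGazeActivated message are assumptions of mine.

```csharp
using UnityEngine;

// Minimal gaze-interaction sketch: cast a ray from the headset camera each
// frame and "activate" whatever the player has stared at long enough.
public class GazeSelector : MonoBehaviour
{
    public float dwellTime = 1.5f;  // assumed: seconds of staring to trigger
    private GameObject current;
    private float timer;

    void Update()
    {
        Ray gaze = new Ray(transform.position, transform.forward);
        if (Physics.Raycast(gaze, out RaycastHit hit, 50f))
        {
            if (hit.collider.gameObject == current)
            {
                timer += Time.deltaTime;
                if (timer >= dwellTime)
                {
                    // Hypothetical hook; e.g. make the object explode.
                    hit.collider.SendMessage("OnGazeActivated", SendMessageOptions.DontRequireReceiver);
                    timer = 0f;
                }
            }
            else
            {
                // Looking at something new; restart the dwell timer.
                current = hit.collider.gameObject;
                timer = 0f;
            }
        }
        else
        {
            current = null;
            timer = 0f;
        }
    }
}
```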
Mobile games have faced similar mouse-and-keyboard tendencies, and have also come up with some inventive alternatives.

One of the first issues that comes to mind for me is when a game has the user tap on something to select it, and then tap somewhere else to move it there. This is a common interaction with a mouse, but on a touch screen it makes so much more sense to drag it across the screen to the new location, because that is how we would move things in real life.
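A rough sketch of what that drag interaction could look like in Unity (simplified - a real version would first check that the drag started on the object, and cam and groundHeight are placeholder assumptions):

```csharp
using UnityEngine;

// Sketch of drag-to-move on a touch screen: instead of tap-to-select then
// tap-to-place, the object follows the finger for the whole gesture.
public class DragToMove : MonoBehaviour
{
    public Camera cam;             // assumed top-down camera
    public float groundHeight = 0f;

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);

        if (touch.phase == TouchPhase.Moved || touch.phase == TouchPhase.Stationary)
        {
            // Project the finger position onto the ground plane and follow it.
            Ray ray = cam.ScreenPointToRay(touch.position);
            Plane ground = new Plane(Vector3.up, new Vector3(0f, groundHeight, 0f));
            if (ground.Raycast(ray, out float distance))
                transform.position = ray.GetPoint(distance);
        }
    }
}
```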

Another issue on mobile which isn't a problem with mouse-and-keyboard is that if something is on the top-left corner (or top-right corner for left-handed users) then the user's hand will be completely blocking the screen when they try to touch it.

Some of the more intuitive interactions include the two-finger pinch and stretch to zoom in and out, which must have seemed like genius when it was first used. That one must also have taken a fair amount of pondering, because there is no comparable mouse-and-keyboard interaction - it is completely unique to the touch screen.
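For reference, the pinch gesture itself is only a few lines once you have two touches to compare - something like this sketch, where the camera and zoom limits are placeholder values:

```csharp
using UnityEngine;

// Classic pinch-to-zoom sketch: compare how far apart the two touches are
// this frame versus last frame and zoom the camera by the difference.
public class PinchZoom : MonoBehaviour
{
    public Camera cam;              // assumed perspective camera
    public float zoomSpeed = 0.1f;

    void Update()
    {
        if (Input.touchCount != 2) return;

        Touch a = Input.GetTouch(0);
        Touch b = Input.GetTouch(1);

        float prevDistance = Vector2.Distance(a.position - a.deltaPosition,
                                              b.position - b.deltaPosition);
        float currDistance = Vector2.Distance(a.position, b.position);

        // Fingers moving apart -> zoom in (narrower field of view), and vice versa.
        float delta = currDistance - prevDistance;
        cam.fieldOfView = Mathf.Clamp(cam.fieldOfView - delta * zoomSpeed, 20f, 80f);
    }
}
```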

Most interactions where you swipe across the screen are also fairly intuitive, especially when you swipe towards the edge of the screen to throw something away. There is a VR equivalent to this which you can see in Job Simulator where the user throws objects toward the edge of the space to discard them.

I think part of the reason we end up defaulting to mouse-and-keyboard controls is that the majority of games are developed on a computer with a mouse and keyboard, regardless of the target platform. It sometimes takes a while before the developer can test on the target hardware, and by then they may already be comfortable with the controls - the interface is intuitive to them, so they don't notice the problem. We might see this improve as VR developers start building their games while inside VR, a capability some of the bigger game engines are releasing.

So here is a challenge for the next time you are developing for anything other than mouse and keyboard. Before you build any interactions or think about input at all, first think about what is going to happen in the game - what the in-game player is going to do. Then imagine yourself using the target hardware, or better, go and use it with nothing running: pick up a blank phone or put on a dark VR headset, close your eyes, and feel what those in-game interactions would be like on the hardware.

Thanks for reading! Feel free to leave thoughts and suggestions in the comments - especially if you have examples of interesting interfaces for interesting hardware (both good and bad). Also subscribe and follow :)


Interface Design and Unusual Platforms, Part 2

This is a continuation of part 1, and I would recommend reading that first for some background info.

My last post left off with the Wii Fit Balance Board and Kinect for a game played on hoverboards battling giant robot spiders. How can I possibly top that?

Well, up next is the Leap Motion (not to be confused with the AR company Magic Leap) - it bears some similarity to the Kinect in terms of hardware, but it is used to track your hands in great detail...every movement of every finger mapped to a skeleton!


I entered a Leap Motion Game Jam to get a free Leap Motion (one of the best ways to get free hardware!). As always I started with the question of "what makes this hardware unique and what can I do with it that I couldn't do with any other hardware?"

Other games have tried to simulate gardening and farming, but their shortcoming is that growing plants really involves working with your hands - something hard to replicate in a video game using a keyboard or controller. The Leap Motion is unique in that it lets you use your hands with the full range of dexterity we have in real life. So I made a garden simulator. It's important to remember, though, that digital experiences do not need to perfectly match their real-world inspiration; sometimes it can be more fun to do something weird. So I took a weird twist on gardening - the player's interaction is to literally pinch the plant and pull it upward to make it grow. Then you can dip your hand in a water bucket and sprinkle the plant with water to make it bloom. But beware! If you pull the plant too quickly you will pull it right out of the ground!
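To give a feel for the mechanic without tying it to a particular SDK, here is a hedged sketch of the pinch-and-pull interaction. It assumes some hand-tracking layer (the Leap Motion SDK, in my case) already hands you a pinch strength and the hand's world position each frame; those inputs, and the speed threshold, are placeholders rather than the game's actual values.

```csharp
using UnityEngine;

// Sketch of the pinch-and-pull growing mechanic. Assumes a hand-tracking
// layer provides a pinch strength in [0,1] and the hand's world position.
public class PinchGrowPlant : MonoBehaviour
{
    public float maxPullSpeed = 0.5f;   // assumed: meters/second before uprooting
    public float pinchThreshold = 0.8f;

    private float lastHandHeight;
    private bool wasPinching;

    // Called each frame by whatever reads the tracking hardware.
    public void OnHandUpdate(float pinchStrength, Vector3 handPosition)
    {
        bool pinching = pinchStrength >= pinchThreshold;

        if (!pinching || !wasPinching)
        {
            // Not pulling yet; just remember where the hand is.
            lastHandHeight = handPosition.y;
            wasPinching = pinching;
            return;
        }

        float pullSpeed = (handPosition.y - lastHandHeight) / Time.deltaTime;
        lastHandHeight = handPosition.y;

        if (pullSpeed > maxPullSpeed)
        {
            Debug.Log("Pulled too fast - the plant came out of the ground!");
            // Uprooting behavior would go here.
        }
        else if (pullSpeed > 0f)
        {
            // Grow the plant upward by however far the hand moved.
            transform.localScale += Vector3.up * pullSpeed * Time.deltaTime;
        }
    }
}
```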

I ended up making this game in about 24 hours, because I procrastinated until the very last moment to get started. Here is a gameplay video. This one you can also download from itch.io if you have a Leap Motion and want to test it out!

Of course I'm going to get to Virtual Reality, but one more thing before I do - Augmented Reality. I haven't actually made anything for Augmented Reality yet, but this stuff is crazy cool. It basically covers any way hardware can be used to alter your experience of real physical space, including overlays with AR glasses, digital displays drawn on top of a real-world camera feed (usually on a phone), and of course projections like this one:


This is a height-map projector: it projects a different color based on how close the surface is to the projector. I had a chance to play with one of these recently, and it's great - you can play with the sand with your hands and build mountains and valleys and lakes, and the colors update in real time as you play.
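The core idea is simple enough to sketch: normalize the measured depth and look it up in a color gradient (this is just the concept, not the sandbox's actual code):

```csharp
using UnityEngine;

// Sketch of the sandbox projection idea: normalize how far each sampled point
// is from the sensor and map that to a color gradient
// (e.g. white for peaks, green for plains, blue for lake beds).
public static class HeightColor
{
    public static Color FromDepth(float depth, float minDepth, float maxDepth, Gradient gradient)
    {
        // 0 = closest to the projector (a peak), 1 = farthest (a lake bed).
        float t = Mathf.InverseLerp(minDepth, maxDepth, depth);
        return gradient.Evaluate(t);
    }
}
```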

The term "AR games" commonly refers to two different types of games: Augmented Reality, which I was just describing as digital content overlaid on the real world, and Alternate Reality, which is like a parallel universe with the same global coordinates as ours but inhabited by Pokemon.


Okay, so now we can finally get to Virtual Reality - I'm sure that's what you're all here for. I'd like to take this opportunity to draw a distinction between a Head-Mounted Display (HMD), head-tracking, and a full VR headset.

A Head-Mounted Display is like a monitor attached to your face - it's pretty cool that it lets you get up close and personal, but the limitation of having no head-tracking means you are looking at a fairly stationary display.


Head-tracking is similar to other forms of motion tracking (hand-tracking, body-tracking, motion-tracked controllers...). It does what you'd expect - it tracks the motion of your head and then uses that data for something useful in the experience. zSpace, which I covered in part 1, uses head-tracking to create the illusion of a holographic display, though the glasses are clear, so you still see the real world when you look through them.



A full VR headset combines head-tracking with a head-mounted display to allow you to move your head around and experience a virtual world as if you were embedded in it.



Virtual Reality offers some pretty amazing opportunities for developers, because it is so different from the traditional gaming experience. Moving your head around in physical space is completely different from using a mouse to pan your view on a 2D monitor, and stepping forward with your feet (in room-scale VR) is quite different from using WASD to update your position in digital space.

In some sense we should be thinking about the interface for VR the way we think about the interface for mobile.



It took a while for developers to realize some amazing things about the touch-screen interface. In the early days, people thought about touch screens in terms of mouse-and-keyboard controls: where buttons on the screen would have been clicked with a mouse, they would instead be "tapped" with a finger. But touch screens offered an incredible advantage over the mouse and keyboard, which we discovered when we learned that toddlers could intuitively interact with a touch screen long before they could use a mouse. Part of this obviously comes from dexterity, but I think there is another key reason: moving a mouse creates a disconnect between your action and the result you see - you move a physical object around on a surface, and the result appears on a separate physical object, the computer screen. With a touch screen there is no disconnect; the thing you are touching is the thing that displays the result of your touch, so it is much easier to understand the relationship between your action and the result.

VR offers a similar improvement for navigation: where we previously used a mouse and WASD to move through a space, now we can use our head and body, which is a much more intuitive translation.

In part 3 of this segment I will get into the meat of my presentation - how can we stop thinking about the mouse and keyboard when we design for mobile and VR (and mobile VR)?

Continue to part 3.

Sunday, October 23, 2016

Interface Design and Unusual Platforms, Part 1

I'm going to do something a little bit different today. In a couple of weeks I'll be going to Unite LA (the big Unity3D conference) as a speaker, and my talk is on Interface Design for Mobile and VR (and other unusual platforms). I've been trying to work on my talk, but I keep finding myself with writer's block, and I find it easier to blog than to assemble a presentation, so I'm going to jot down my ideas here. Feedback is welcome :)

I'd like to start off with an overview of unusual hardware, and specifically give some examples of how I've designed for some of them. Then I will move on to talking specifically about designing interfaces.

In no particular order, here is a list of all the varieties of hardware I can think of (and I have developed on each of them): mouse, keyboard, gamepad, motion-tracking controller, head-tracking, touch screen, Kinect, Wii balance board, heart rate monitor, 3D stylus (zSpace), 360 treadmill, Leap Motion. You may notice that some of these are a type of hardware, whereas others are a specific brand. That is because I am lazy, and also because some types of input are recognized more by one brand than by the type of input. Think of it like saying "Kleenex" instead of "facial tissue". People know what you mean.

Mouse, keyboard, and gamepad are probably the most common inputs people think of when they think of game development, so I am not going to talk about those. As I talk about each of the other types of hardware, I want to emphasize that my intention with design is to answer this question: "What is unique about this hardware that no other hardware can do?"

Let's start with the zSpace, the one you are most likely to have never heard of.




The zSpace is a semi-holographic display tablet with a 3D stylus and special glasses that have markers on them, allowing the display to be projected toward you for a full 3D effect. Without actually using one it can be very hard to imagine, so you will just have to trust me that when you use it, everything you see looks completely real. Not "real" in the sense that it is photorealistic - some things are very stylized - but real in the sense that you believe you could reach out and pick it up, like a toy.

When I was first presented with this device I had a team of artists and programmers working with me, and the only direction we were given was "this device is used for tools, education, training, and medical applications - see if you can use it to make a game." And so we set out. We began by orienting ourselves to the hardware, using the existing apps, then building some prototypes. We were looking for "fun" in the interactions, but we were also looking at what made the device really unique, and very quickly we discovered both in the form of "drawing" 3D shapes with the 3D stylus. In particular, drawing spheres was very satisfying and enjoyable.

So how can you make a game about drawing spheres? In parallel with our experiments we had also been brainstorming game concepts, and had mostly settled on the idea of floating islands (to make full use of the 3D effect) inhabited by bunnies, with the bunnies being hunted by foxes. As the player, your goal was to keep the bunnies safe. We tied this to our sphere-drawing mechanic by having the player draw bubbles around the bunnies to make them float up out of harm's way and drift over fenced-in safe zones where the foxes couldn't reach them (there was also a small random chance of a bubble blowing away in a gust of wind, to add anticipation). In addition to drawing bubbles, the player could use the stylus to pick up items - a carrot to lure the bunnies, a stick to pop bubbles, and a fan to move bubbles around.

In the end it was pretty enjoyable, but there was one thing missing: how do we represent the stylus in the 3D world in a meaningful way? The answer came from one of our artists, who decided to make it a pair of chopsticks. It did the trick of tying all of our interactions together. See it in action here.

Next up: 360 Treadmill. This is in essence a treadmill where you can walk in any direction while staying in place.



A couple of years ago I worked as a developer for Virtuix Omni, one of the first such devices. What makes this hardware unique is pretty obvious: the input is which way and how fast the player is walking. It is actually similar to input from a joystick on a gamepad, so it translates fairly well to most first-person experiences. The big difference is that you are physically walking, and it is usually paired with a VR headset, so it gives you full immersion in your experience. My first thought, which a lot of other developers have also had for VR, was virtual tourism. I think it is a bit more compelling on the Omni than in seated VR because the Omni lets you walk around the virtual destination. I did try to set mine apart a bit by thinking about what we usually do when we travel - we take pictures. So I made a virtual tour of Amsterdam where the only interactions are walking and taking pictures, and those pictures (screenshots, in this case) are saved to your computer so you have them to keep when you return from your virtual tour. You can check it out here.
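The "take a picture" interaction really is just a screenshot under the hood - something like this sketch, using Unity's built-in screenshot call (ScreenCapture.CaptureScreenshot in recent versions); the button name and file naming are placeholders, not the project's actual code:

```csharp
using System.IO;
using UnityEngine;

// Minimal "take a picture" sketch: save a timestamped screenshot whenever the
// player presses the camera button (button name is a placeholder).
public class TourCamera : MonoBehaviour
{
    void Update()
    {
        if (Input.GetButtonDown("Fire1"))
        {
            string file = Path.Combine(Application.persistentDataPath,
                "tour_" + System.DateTime.Now.ToString("yyyyMMdd_HHmmss") + ".png");
            ScreenCapture.CaptureScreenshot(file);
            Debug.Log("Saved photo to " + file);
        }
    }
}
```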

Now for a real treat: the Wii Fit Balance Board!



The Balance Board is a pressure sensor (in fact several pressure sensors, one near each corner) that can very accurately tell which way you are leaning. When you first stand on the board it asks you to stand still (that's right, the board talks to you) while it calibrates to your weight. After that it can tell which direction you lean based on where you are putting pressure. If you pick up one foot and step forward or backward, it knows which foot moved and whether you stepped forward or backward (based on the foot still on the board placing more pressure on the toe or the heel). I have tried unsuccessfully to trick it - it's a pretty smart board!
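Conceptually, working out the lean from corner sensors is just a center-of-pressure calculation - something like this sketch (reading the raw values from the board is hardware-specific and not shown):

```csharp
using UnityEngine;

// Sketch of how lean direction can be inferred from four corner pressure
// sensors: the "center of pressure" is the pressure-weighted offset of the
// readings from the board's center.
public static class BalanceBoard
{
    // tl, tr, bl, br: raw pressure readings from the four corners.
    public static Vector2 CenterOfPressure(float tl, float tr, float bl, float br)
    {
        float total = tl + tr + bl + br;
        if (total <= 0f) return Vector2.zero;

        // x: -1 = leaning left, +1 = leaning right;
        // y: -1 = back (heels), +1 = forward (toes).
        float x = ((tr + br) - (tl + bl)) / total;
        float y = ((tl + tr) - (bl + br)) / total;
        return new Vector2(x, y);
    }
}
```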

While in grad school I was assigned to a team working with the U.S. military to design a fitness game. Obviously a mouse and keyboard was not going to work for this, so we set out to try all of the hardware we could find that was used for fitness games. We experimented with phones, PS Moves, and devices that track biometric feedback, before settling on a bizarre combination - we decided to pair the Balance Board with a Kinect to track both shifting weight and also upper-body movement. Oh yeah, and we added in a heart rate monitor :D

Here's how the game worked: we used two balance boards so that two players could control hoverboards in the game by leaning to steer. With their arms they could reach out to collect things or punch to "fire" a projectile (a ball of plasma that would come rocketing out of their fist). We also strapped heart rate monitors onto the players, partly because we needed the data, but we also decided to add a gameplay element around them. When the players stepped onto the balance boards we took an initial heart rate reading; then, as the game progressed and a player's heart rate went up, we made that player's plasma balls bigger and more damaging. The game was cooperative, so the two players worked together to fight off giant robot spiders, but at the end we gave them a score that combined how many spiders they killed and how many items they picked up. A fascinating thing happened when we added the heart rate effect: it actually evened out the scores between players who were more fit and less fit. Since we were taking an initial reading and tracking the increase, players who were already in good shape had to work harder to get the positive effect. We found through testing that this leveling of the playing field made the game more fun for everyone. Want to see a gameplay video? Of course you do.
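The heart rate handicap boils down to comparing the current reading against the baseline - roughly like this sketch, where the numbers are placeholders rather than our actual tuning:

```csharp
using UnityEngine;

// Sketch of the heart-rate handicap: take a baseline reading when the player
// steps on, then scale the plasma ball's power by how much the current heart
// rate has risen above that baseline.
public class PlasmaPower : MonoBehaviour
{
    public float baselineBpm;                 // captured when the round starts
    public float maxBonusAtBpmIncrease = 40f; // placeholder cap

    public float PowerMultiplier(float currentBpm)
    {
        float increase = Mathf.Max(0f, currentBpm - baselineBpm);
        // 1x power at baseline, up to 2x at the cap (+40 bpm here).
        return 1f + Mathf.Clamp01(increase / maxBonusAtBpmIncrease);
    }
}
```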

I really want to tell you about the rest of the hardware, but this post is now fairly long and I need a break, so I will continue in part two! Also, this seems to have done the trick to beat my writer's block!

Don't forget to subscribe and follow!

Continue to part 2.