About Me


My goal is to make a difference in the world, and I think the way to do that is through games. My education is in programming and 3D animation, and I have worked in e-Learning and game development.

Sunday, August 5, 2018

Prototyping Game Systems, Part 2: Building a Prototype Play Space

This is Part 2 of a five-part series on my latest Pluralsight course, Prototyping Game Systems. If you missed Part 1, be sure to start there!

In Part 1 we covered the design-prototype relationship, how to assess design needs, and what game mechanics and systems are. Next we need a playable space to prototype in. When prototyping in an existing game, it's generally best to make a "safe" scene for prototyping the new mechanics, to avoid causing issues that might break the main playable scene. This is especially important if multiple developers are working on the project simultaneously.

In this course we are using Unity to develop the game, and Unity has an excellent built-in prototyping tool called ProBuilder.


ProBuilder allows you to create and manipulate 3D shapes in the Unity editor, without needing an external 3D modeling tool.


With the help of ProBuilder, you can block out a playable space in a matter of minutes. The manipulator tool allows you to adjust faces, edges, and vertices.


You can also extrude and bevel to make interesting shapes and crevices.


And when you are satisfied with your shape, you can apply textures, either to the entire shape or to individual faces. ProBuilder even has a simple UV editor to give you more precise control over the texture.

For this course I've created a large open space with stairs and an arch, so the walkable ground has various heights and there is an area to walk under where the character won't be visible.


Next we drop in our character model and add a nav mesh. This game is in third person, and the character controller has already been built for us; it uses the nav mesh to move around.



The nav mesh works by collecting information about any surfaces marked "Static" and generating a "walkable" area, shown with a blue overlay. The nav mesh tells the character where it can walk via a Nav Mesh Agent component attached to the character controller.
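
The course's character controller is already written for us, but as a rough sketch of how a Nav Mesh Agent can be driven from script (the class and field names below are my own, not from the course):

```csharp
using UnityEngine;
using UnityEngine.AI;

// Minimal sketch (not the course's controller): steer an agent across the baked nav mesh.
[RequireComponent(typeof(NavMeshAgent))]
public class MoveToTarget : MonoBehaviour
{
    public Transform target;   // e.g. an empty GameObject placed somewhere on the walkable area
    NavMeshAgent agent;

    void Awake()
    {
        agent = GetComponent<NavMeshAgent>();
    }

    void Update()
    {
        if (target != null)
        {
            // The agent finds its own path across the walkable (blue) area.
            agent.SetDestination(target.position);
        }
    }
}
```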


Now we have a playable space for our character to run around in that doesn't interfere with the main scenes of the game, so we are ready to start prototyping the new mechanics.


In Part 3 we will build new mechanics for the game, including digging, looting, and powerups. Thanks for reading, and if you enjoyed this, please check out the full course on Pluralsight: Prototyping Game Systems for Swords and Shovels.


Tuesday, July 31, 2018

Prototyping Game Systems, Part 1: Design-Prototype Relationship

In this five-part series I will give an overview of the topics covered in my new Pluralsight course, Prototyping Game Systems for Swords and Shovels.


This is an intermediate course for developers who have some experience with Unity and are ready to start building systems. If you are new to Unity, I recommend starting with my Game Prototyping in Unity blog series.

The Prototyping Game Systems course covers:

  • Understanding the Design-Prototype Relationship
  • Building a Prototype Play Space
  • Prototyping Non-Combat Systems
  • Adding Juicy Interactions
  • Playtesting and Iterating Based on Feedback
Let's get started with the Design-Prototype Relationship. 



Prototype is a broad term that is used in many industries to describe an early sample of a product used to test a concept or process. Within the game industry, we typically use prototype to describe a playable game concept that was created quickly to test new ideas. It’s important to note that nowhere in this definition does it say this needs to be created in a game engine, or even on a computer. The purpose of the prototype is to test an idea, and often the first test should be on paper. 


You might hear game developers refer to a paper prototype, which is a playable game concept created on paper or using physical objects like cards, dice, or game pieces. Basically anything you can use to test your idea without doing any coding or creating any digital content. Paper prototyping is very useful at the start of a project when you’re not really sure what you’re making and you don’t have any scripts or art assets to work with. This course specifically focuses on digital prototypes because it is working within the framework of an in-progress game, so there are already some art assets and scripts to work with. 


The three most common/useful times to prototype are during a Game Jam; when starting a new project; or when experimenting with new systems or mechanics for an existing project. This course covers the third option, creating and testing new systems within an existing project. Many aspiring game developers get their start working on games from scratch, either with student projects or side projects they take up on their own. Being able to integrate your work into an existing project is a major turning point for new developers, and one of the most desirable skills in a new entry-level hire. 

So now we know when to prototype, but what are we prototyping? Generally we prototype Game Mechanics, or Game Systems, which are collections of mechanics that work together. What is a Game Mechanic? Game mechanics are the rules that determine how the game functions, how the user interacts with the game, and how the game responds to the user.

A very simple example of a mechanic might be to click on an item to pick it up. Generally when you start to explain a mechanic, it will lead to questions which will help you work out the related mechanics. In this example my first question would be "What happens when I pick up an item? Does it go into my inventory? Do I hold it in my hands? Can I use it? Do I craft things with it?" And as you start answering those questions you’ll start to define your other mechanics. 
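
To make that concrete, here is a hedged sketch of that "click an item to pick it up" mechanic in Unity. The "Item" tag and the simple list-based inventory are placeholders of mine, not part of the course:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch: click an object tagged "Item" to move it into a simple inventory list.
public class ClickToPickUp : MonoBehaviour
{
    readonly List<GameObject> inventory = new List<GameObject>();

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit) && hit.collider.CompareTag("Item"))
            {
                inventory.Add(hit.collider.gameObject);   // "it goes into my inventory"
                hit.collider.gameObject.SetActive(false); // it disappears from the world
            }
        }
    }
}
```

Even this tiny sketch forces some of those follow-up questions: does the object stay in the list forever, can it be used or crafted, and what happens in the world when it is picked up?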


Now that we understand mechanics, let's talk about another common phrase in prototyping: Rapid Prototyping


Rapid Prototyping is the process of quickly creating, testing, and iterating on prototypes. The first key here is speed: you want to find a testable solution as quickly as possible. You also want a clear direction, so that you know what you are working towards and keep your goals in focus. And it is important to critically review everything you make during this process: test everything, and don't be afraid of failure.

The first part of the prototyping process involves assessing the design needs. Generally when you join an existing project there is a design document in place. 


The design document is full of useful information about the game, but it can be daunting and difficult to tell what is important and specific to the systems you are prototyping. 


Since this course focuses on prototyping the non-combat systems of the game, the most useful information to us is anything related to digging/looting and upgrading the tools and weapons. 

While assessing the design needs, it is important to keep the project resources in mind. 


Pretty much every game in development has limited time, money, and people. Understanding the available resources is an important part of prototyping.

In Part 2 we will dive into the game engine to build a Prototype Play Space. 

If you enjoyed this blog and want to learn more about these topics and follow along with hands-on examples, please check out my full course on Pluralsight - Prototyping Game Systems for Swords and Shovels.



Thanks for reading! 


Sunday, March 25, 2018

Game Dev Tutorials

Many game devs get their start by following tutorials, because even though there has been a surge of game-related classes and programs in colleges and schools, much of the content taught covers overarching principles of design and workflow. Hands-on how-to material is still largely dominated by online tutorials.

Tutorials are a great way to learn how to build specific mechanics within specific engines or frameworks, and can be beneficial even for seasoned developers. Often tutorials cover not only scripting, but also art asset creation, world building, animation, and effects... you can make a complete game from scratch by following a tutorial, or you can pick up bits and pieces to fill in knowledge gaps.

For the past year I've been working on tutorializing content that I teach in classes, workshops, and camps. This benefits anyone participating in my classes, because they can work at their own pace and continue to learn even when I am not there, and it also allows people from all around the world to experience this content, because it is available online (and much of it is free).

If you are an aspiring game developer, or if you teach game dev classes or workshops, I hope you will consider checking out my suite of tutorials. Here is a rundown of what I have to offer (all tutorials use the Unity game engine):


 

This tutorial covers everything you need to know about using 2D sprites in Unity, whether for 2D game objects animated for 2D gameplay or for UI elements in a 2D or 3D game. Topics covered include sprite sheets, color, sliders, and rotation/scale/position animations.



Unity has a built-in physics engine that allows you to quickly and easily animate objects using physical properties such as force, torque, friction, mass, and bounciness. In this tutorial you will learn how to use these different properties to create a wide range of movement and behavior for physics-based objects in both 3D and 2D.
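
For a flavor of what that looks like in practice (a minimal sketch of my own, not code taken from the tutorial), here is a script that applies an impulse force and torque to a Rigidbody:

```csharp
using UnityEngine;

// Minimal sketch (not from the tutorial): push and spin a Rigidbody with physics forces.
[RequireComponent(typeof(Rigidbody))]
public class PhysicsKick : MonoBehaviour
{
    public float pushForce = 5f;
    public float spinTorque = 2f;

    void Start()
    {
        Rigidbody body = GetComponent<Rigidbody>();
        body.mass = 2f;                                                   // heavier objects need more force
        body.AddForce(Vector3.up * pushForce, ForceMode.Impulse);        // instant upward kick
        body.AddTorque(Vector3.forward * spinTorque, ForceMode.Impulse); // instant spin
        // Friction and bounciness come from a Physic Material on the object's collider.
    }
}
```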



In addition to rigidbody physics, Unity also supports cloth physics. This tutorial covers the basics for setting up an object to behave like cloth, including constraints, stiffness, stretchiness, and animated force for wind and gravity.




This tutorial breaks down the most common features of Unity's Particle System component, including sprite sheets, velocity over lifetime, color over lifetime, trails, emissions, and bursts.
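
Most of these modules are configured in the Inspector, but they can also be driven from script. Here is a small sketch of mine (not from the tutorial) that sets up an emission burst on an existing Particle System:

```csharp
using UnityEngine;

// Sketch (not from the tutorial): configure an emission burst on an existing Particle System.
[RequireComponent(typeof(ParticleSystem))]
public class BurstOnStart : MonoBehaviour
{
    void Start()
    {
        ParticleSystem ps = GetComponent<ParticleSystem>();

        var emission = ps.emission;
        emission.rateOverTime = 0f;                                      // no steady stream
        emission.SetBursts(new[] { new ParticleSystem.Burst(0f, 30) }); // 30 particles at t = 0

        ps.Play();
    }
}
```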



One of the biggest distinguishing factors between a student game and a professionally developed game is the "juiciness" of the interactions. The hallmark of good game design is making the game "feel good," and so much of that good feeling comes from making things feel reactive to the player's actions. This tutorial specifically covers using particles and sound effects to make the interactions juicy.
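
As a rough illustration of the idea (my own sketch, not the tutorial's code), a pickup might spawn a particle effect and play a sound the moment the player touches it:

```csharp
using UnityEngine;

// Sketch (not the tutorial's code): particle burst + sound when the player grabs a pickup.
public class JuicyPickup : MonoBehaviour
{
    public ParticleSystem sparkleEffect;  // assign a particle prefab in the Inspector
    public AudioClip pickupSound;         // assign a sound effect in the Inspector

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Player")) return;

        if (sparkleEffect != null)
            Instantiate(sparkleEffect, transform.position, Quaternion.identity).Play();
        if (pickupSound != null)
            AudioSource.PlayClipAtPoint(pickupSound, transform.position);

        gameObject.SetActive(false); // the pickup itself disappears
    }
}
```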



This tutorial is available on Pluralsight and covers the full pipeline for creating your first game prototype. It covers whiteboxing, importing assets, building a level, adding animations, particles, lighting, and sounds, and making it a complete experience with a UI start screen and win/lose screens.


https://www.pluralsight.com/courses/unity-vr-fundamentals

This tutorial is specifically intended for aspiring VR developers and covers the basics of getting started with room-scale VR on the Vive as well as mobile VR.

Please subscribe to our YouTube channel for more tutorials, gameplay videos, and other fun things! Thanks for reading!

Tuesday, January 30, 2018

Year in Review 2017

January seems as good a time as any to reflect on the past 12 months of progress and growth, both personally and professionally. 2017 was a tough year for many with uncertainty surrounding political and regulatory changes. For me and my studio, the sociopolitical climate has had minimal impact, and overall 2017 was quite a success.

Astire Games made a reasonable profit from contracts and consulting work, and most of that profit was converted into hardware at the end of the year. We had several small contracts that went well and ended on a positive note, and one large project that continues into 2018. The contracts and consulting work fund our ongoing development of Cosmos Arena, which has been a slow and arduous project, but I am optimistic it will perk up in 2018.

Cosmos Arena itself saw some exciting updates in 2017 as we pushed hard to make it a presentable demo for the Intel Showcase at the Austin Game Conference. Some new gameplay elements have emerged, and we've made strides towards better player feedback and overall UX improvements.



I was also finally able to bring on a concept artist to help us work towards our own art direction (up until now we've been relying on the Asset Store, but with the new art direction we can begin internal development of our art assets to better match the style and feel of the game).



On a personal note, I was thrilled to accept an Adjunct Faculty position at the University of Texas for Spring 2018 teaching a course on AR and VR for games. I continue to teach at the Art Institute of Austin, but UT brings new and exciting challenges to the academic side of my career.

Here's hoping 2018 is filled with progress for everyone! Thanks for reading!

Sunday, January 21, 2018

VR Fundamentals

There is a common misconception that Virtual Reality began with the Oculus Kickstarter in 2012. Oculus launched the new wave of VR hype, but it was not the first occurrence. In the 1990s there was a boom of VR being integrated into arcades.

Nintendo even released the Virtual Boy as a VR device for home use.


Even before the 1990s, VR existed as a concept, with a number of early attempts including the Sensorama in 1962.


The 1830s saw the invention of the first stereoscope, which allowed the user to view a stereoscopic 3D image.


Virtual Reality has a long and rich history. Today, there are a number of VR devices on the market. At the top we have the Oculus Rift (now owned by Facebook) and the HTC Vive. Both come with a headset that tethers to your PC, hand-tracked controllers, and external tracking devices that allow for positional head-tracking.


The positional head-tracking allows for what we call 6 Degrees of Freedom, meaning x/y/z rotation and x/y/z position. It also allows for Room-Scale VR, where the user can walk freely around a VR room about 2 meters in each direction.

Beyond Oculus and HTC, there is also a growing market for mobile VR including the Samsung Gear VR, Merge VR, Google Cardboard, and Google Daydream.


Mobile VR generally involves inserting your smartphone into a headset. It offers 3 Degrees of Freedom (meaning rotation only), but it allows you to escape the tether to a powerful PC. You can put on a mobile headset anywhere - on the bus, at an event, while waiting in line, even in bed. The only thing needed to power mobile VR is the smartphone you already carry with you.

With all of these headsets on the market, there is a growing need for developers to create new and exciting content. This is a phenomenal time to get into VR development, because not only are there a ton of hardware options available to the public but there are also a ton of resources being made available to potential developers. The Unity game engine offers built-in VR support that will automatically work with a number of the headsets listed above. The hardware developers also make interfacing with their hardware very straightforward through SDKs that anyone can download.


There has never been a better time to become a VR developer. I've created a tutorial to help you get started on your way: https://www.pluralsight.com/courses/unity-vr-fundamentals




Thanks for reading! If you enjoyed this, please Follow and Subscribe!

Sunday, August 27, 2017

Game AI: Non-Human Behavior Part 5

This is part 5 of a series on Game AI for Non-Human Behavior. Here's what you might have missed!
Part 5 will be a deep dive into sensory input and the resulting behaviors.

Source: Atari
In 2015, an article published in Nature used Atari 2600 games to explore Reinforcement Learning in AI. Reinforcement Learning is the process of allowing an AI to explore different options and learn behavior through a reward system, as opposed to Supervised Learning, where sub-optimal behavior is explicitly corrected. The article, "Human-level control through deep reinforcement learning," examines the sensory input used to understand the surrounding environment. In their research, the sensory input corresponds to visual input (or sight), allowing the AI to look at the pixels that make up the current game state.

Source: The Hunt, Netflix
In contrast, these blind catfish living in underwater caves have adapted to survive completely without sight. When one sense fails, we rely on honing our other senses to adapt to a world without the missing sense.

Source: https://askabiologist.asu.edu/echolocation
As children, we learn the five senses as sight, sound, smell, taste, and touch. These are the senses we as humans understand, because these are the senses we experience. However, there are other senses present in nature that are outside our area of experience. Bats, for example, use sonar in place of sight to determine the positions of things around them.


A "sense" is defined as a system of a group of cell types that responds to specific physical phenomena, which correspond to a particular region of the brain that interpret those signals. Some plants have specialized cells which detect gravity, allowing the plants to grow upright with their roots growing down into the earth.

Source: http://www.defenders.org/sharks/basic-facts
Even with senses we understand, such as smell, non-human creatures have far superior uses of them in many cases. Sharks can determine the direction a smell is coming from based on which nostril received the scent first.

Source: nature.com/scientificamericanmind/journal/v19/n4/images/scientificamericanmind0808-22-I1.jpg
Though the focus of reinforcement learning is on changing behavior based on rewards, it relies on information about the environment being interpreted through senses. We can see something similar happen in the classic rat maze example - the hungry rat is placed in a maze that has a piece of cheese at the end and allowed to explore the maze until it finds the cheese. The rat is then placed back at the start of the maze, and this continues until the rat manages to navigate the maze perfectly. The cheese in this example is obviously the reward, but the rat needs a way to understand its environment in order to get the reward.
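
The reward loop itself is compact enough to write down. Below is a generic tabular Q-learning sketch of my own (not the deep network from the paper): values for state/action pairs are nudged toward choices that eventually lead to the reward, the way the rat's route converges on the cheese.

```csharp
using System;

// Generic tabular Q-learning sketch: states are maze cells, actions are moves.
// Values for (state, action) pairs drift toward choices that lead to the reward.
public static class QLearningSketch
{
    public static void Update(float[,] q, int state, int action, float reward,
                              int nextState, float learningRate = 0.1f, float discount = 0.9f)
    {
        // Best value achievable from the next state under the current estimates.
        float bestNext = float.MinValue;
        for (int a = 0; a < q.GetLength(1); a++)
            bestNext = Math.Max(bestNext, q[nextState, a]);

        // Nudge the estimate for (state, action) toward reward + discounted future value.
        q[state, action] += learningRate * (reward + discount * bestNext - q[state, action]);
    }
}
```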

Source: http://www.smithsonianmag.com/smart-news/were-terrible-distinguishing-real-and-fake-schools-fish-180953162/
Beyond just understanding the environment, the senses also allow organisms to understand the creatures around them, resulting in group-based behavior like flocking. To put flocking in simple terms, we can use an algorithm based on cohesion, alignment, and separation to simulate the behavior well enough for gameplay purposes.

Source: http://harry.me/blog/2011/02/17/neat-algorithms-flocking/
Each agent in the AI flock calculates its desired movement vector based on cohesion (the need to stay within the group), alignment (the need to face the same direction as the group), and separation (the need to avoid hitting other members of the group). For cohesion, the agent looks at the positions of its neighbors within a specific range, finds their center of mass, and moves towards it. For alignment, the agent looks at the direction each of its neighbors is facing and aligns itself to the average. For separation, the agent checks whether it is too close to any of its neighbors and adjusts accordingly. Combining the three into the agent's velocity vector results in simple flocking behavior.
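
A hedged sketch of that calculation in Unity terms (the names and the equal weighting are mine; a real implementation would tune the weights and ranges):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of the cohesion/alignment/separation calculation described above.
public static class FlockingSketch
{
    public static Vector3 DesiredDirection(Transform agent, List<Transform> neighbors,
                                           float separationDistance = 2f)
    {
        if (neighbors.Count == 0) return agent.forward;

        Vector3 centerOfMass = Vector3.zero;
        Vector3 averageHeading = Vector3.zero;
        Vector3 separation = Vector3.zero;

        foreach (Transform neighbor in neighbors)
        {
            centerOfMass += neighbor.position;
            averageHeading += neighbor.forward;

            Vector3 away = agent.position - neighbor.position;
            if (away.magnitude < separationDistance)
                separation += away;                        // push away from neighbors that are too close
        }
        centerOfMass /= neighbors.Count;

        Vector3 cohesion = (centerOfMass - agent.position).normalized; // toward the group's center of mass
        Vector3 alignment = averageHeading.normalized;                 // toward the group's average facing

        // Combine the three needs into a single desired direction (equal weights for simplicity).
        return (cohesion + alignment + separation.normalized).normalized;
    }
}
```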

Source: https://www.gizmodo.com.au/2014/03/what-happens-when-you-throw-four-sharks-into-a-giant-school-of-fish/
However, the flock's simple behavior may be disrupted by the presence of predators, causing the agents to need to assess their environment beyond just the position and orientation of their neighbors.

Source: https://www.wired.com/2011/12/the-true-hive-mind-how-honeybee-colonies-think/
Bees are another wonderful example of teamwork behavior. Pheromones are special scented chemicals that allow some organisms to communicate information with each other via scent. Bees use pheromones to share information throughout the hive, resulting in a hive that essentially thinks together as if it were a brain, with each bee acting as a neuron. I can imagine a similar system being used to make a group of friendly characters so coordinated that they act like a hive of bees with the player as their queen.

If you have enjoyed this series on AI for non-human behavior, please follow and subscribe!

Tuesday, August 22, 2017

Game AI: Non-Human Behavior Part 4

This is part 4 of a series on Game AI for Non-Human Behavior. Here's what you might have missed!
  • Part 1: Defining "Game AI" and "Non-Human Behavior"
  • Part 2: Making Decisions, Predators and Prey
  • Part 3: Weird Inspirations from Nature
 Today let's talk about how creatures in the wild go about hunting.

Living things need food to survive. There are different ways to get food - some plants make their own food, some animals eat plants, and some creatures hunt. Hunting generally refers to actively pursuing your food, but for the purpose of this post I'm going to expand it to include luring your food to come to you.

Source: https://www.youtube.com/watch?v=z5fOsgrAJiU
Here are a few of the tactics that predators use to catch their food:
  • Speed
  • Strength
  • Traps / Lures
  • Stealth / Camouflage
  • Teamwork
  • Built-in Tools (e.g., claws and teeth)
The Venus Fly Trap is a fabulous example of a lure / trap. Predatory plants have adapted this way of life because they live in environments where it is challenging to absorb nutrients from the ground using roots. So, they capture little packets of nutrients from the air in the form of insects. Predatory plants have different methods of luring prey, including bright colors and sticky nectar. Once a fly is lured into the danger zone, the Venus Fly Trap has special hairs inside its "mouth" that detect when something bumps against them. On the second bump, the trap closes. That's some interesting and clever behavior - if the trap started closing immediately, the fly could still be airborne and would have an easier time escaping, so by waiting for the second bump the fly is lulled into a false sense of security, thinking it can safely land. After the trap has closed, it waits for five more stimulations of the hairs before it begins digestion, to ensure the thing it trapped is actually an edible insect wriggling around.
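
Translated into game terms, that is simply a trap that arms on the first touch and springs on the second. A hypothetical Unity sketch (the tag name and timing window are my own assumptions):

```csharp
using UnityEngine;

// Hypothetical sketch of the flytrap's "second bump" rule: the first touch arms the trap,
// a second touch within the window springs it.
public class TwoBumpTrap : MonoBehaviour
{
    public float armedWindow = 20f;   // seconds before the trap "forgets" the first bump
    float armedUntil = -1f;

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Prey")) return;

        if (Time.time <= armedUntil)
        {
            Debug.Log("Trap closes on " + other.name);   // second bump: spring the trap
            armedUntil = -1f;
        }
        else
        {
            armedUntil = Time.time + armedWindow;         // first bump: arm the trap
        }
    }
}
```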

Source: keywordsuggest.org/gallery/242369.html
Another clever take on luring prey is the Angler Fish, which has a glowy lure on top of its head that attracts unsuspecting fish into its terrifying mouth. Using traps and lures can save a lot of energy, because you don't need to move around as much if your food comes to you.

Source: www.flickr.com/photos/uswildlifeimagescom/5733874585
Birds have some of the scariest built-in tools in the form of deadly talons. Talons can be used for a variety of killing purposes including dismemberment and squeezing to death, and can be used in combination with the beak to tear critters into bite-sized pieces (sometimes while the critter is still alive).

Source: http://tigers4kids.weebly.com/hunting--diet.html
Big Cats can cover the spectrum of strength, speed, stealth, and teamwork, but they are certainly not the only ones. Tigers are known for their tremendous strength which allows them to take down prey in their solitary lifestyle.

Source: http://southernafricatravel.com
Lionesses work together to bring down prey that would be much too large and fast to bring down solo. While working together, the lionesses also employ stealth to sneak up close to their prey before beginning the chase.

Source: https://s-media-cache-ak0.pinimg.com/originals/62/c0/d2/62c0d2b7d56679af09270d36e72a5f5c.jpg
The cheetah, being the fastest land animal, is the go-to example of using speed for hunting. Cheetahs can reach top speeds of around 60 miles per hour, though only for short sprints. Perhaps more impressive than their top speed is how quickly they can start and stop a sprint.

Source: wideopenspaces.com/10-animals-school-humans-camouflage/
Leopards are masters of camouflage; their patterns of spots help them melt into the shadows of trees and brush.

Many of these methods of hunting can also be used by prey to avoid being hunted. Prey rely on their speed, strength, teamwork, and stealth to stay alive in a dangerous world. And all of these methods can be applied to your characters when designing your AI's behavior in games.