About Me

My goal is to make a difference in the world, and I think the way to do that is through games. My education is in programming and 3D animating, and I have worked in e-Learning and game development. 

Sunday, March 25, 2018

Game Dev Tutorials

Many game devs get their start from following tutorials, because even though there has been a surge of game-related classes and programs in colleges and schools, a lot of the content taught is overarching principles of design and workflows. Hands-on how-to material is still largely dominated by online tutorials.

Tutorials are a great way to learn how to build specific mechanics within specific engines or frameworks, and they can be beneficial even for seasoned developers. Often tutorials cover not only scripting but also art asset creation, world building, animation, and effects. You can make a complete game from scratch by following a tutorial, or you can pick up bits and pieces to fill in knowledge gaps.

For the past year I've been working on tutorializing content that I teach in classes, workshops, and camps. This is beneficial to anyone participating in my classes because it means they can work at their own pace and continue to learn even when I am not there, and it also allows people from all around the world to experience this content because it is available online (and much of it is free).

If you are an aspiring game developer, or if you teach game dev classes or workshops, I hope you will consider checking out my suite of tutorials. Here is a rundown of what I have to offer (all tutorials use the Unity game engine):


This tutorial covers everything you need to know about using 2D sprites in Unity, whether for game objects animated for 2D gameplay or for UI elements in a 2D or 3D game. Topics covered include sprite sheets, color, sliders, and rotation/scale/position animations.

Unity has a built-in physics engine that allows you to quickly and easily animate objects using physical properties such as force, torque, friction, mass, and bounciness. In this tutorial you will learn how to use these different properties to create a wide range of movement and behavior for physics-based objects in both 3D and 2D.

In addition to rigidbody physics, Unity also supports cloth physics. This tutorial covers the basics for setting up an object to behave like cloth, including constraints, stiffness, stretchiness, and animated force for wind and gravity.

This tutorial breaks down the most common features of Unity's Particle System component, including sprite sheets, velocity over lifetime, color over lifetime, trails, emissions, and bursts.

One of the biggest distinguishing factors between a student game and a professionally developed game is the "juiciness" of the interactions. A hallmark of good game design is making the game "feel good," and much of that good feeling comes from making things react to the player's actions. This tutorial focuses specifically on using particles and sound effects to make interactions juicy.

This tutorial is available on Pluralsight and covers the full pipeline for creating your first game prototype. It covers whiteboxing, importing assets, building a level, adding animations, particles, lighting, and sounds, and making it a complete experience with a UI start screen and win/lose screens.


This tutorial is specifically intended for aspiring VR developers, and gives the basics of getting started with room-scale VR on the Vive, and mobile VR.

Please subscribe to our YouTube channel for more tutorials, gameplay videos, and other fun things! Thanks for reading!

Tuesday, January 30, 2018

Year in Review 2017

January seems as good a time as any to reflect on the past 12 months of progress and growth, both personally and professionally. 2017 was a tough year for many with uncertainty surrounding political and regulatory changes. For me and my studio, the sociopolitical climate has had minimal impact, and overall 2017 was quite a success.

Astire Games made a reasonable profit from contracts and consulting work, and most of that profit was converted into hardware at the end of the year. We had several small contracts that went well and ended on a positive note, and one large project that continues into 2018. The contracts and consulting work fund our ongoing development of Cosmos Arena, which has been a slow and arduous project, but I am optimistic it will perk up in 2018.

Cosmos Arena itself saw some exciting updates in 2017 as we pushed hard to make it a presentable demo for the Intel Showcase at the Austin Game Conference. Some new gameplay elements have emerged, and we've made strides toward better player feedback and overall UX improvements.

I also finally was able to afford to bring on a concept artist to help us work towards our own art direction (up until now we've been relying on the Asset Store, but with the new art direction we can begin internal development of our art assets to better match the style and feel of the game).

On a personal note, I was thrilled to accept an Adjunct Faculty position at the University of Texas for Spring 2018 teaching a course on AR and VR for games. I continue to teach at the Art Institute of Austin, but UT brings new and exciting challenges to the academic side of my career.

Here's hoping 2018 is filled with progress for everyone! Thanks for reading!

Sunday, January 21, 2018

VR Fundamentals

There is a common misconception that Virtual Reality began with the Oculus Kickstarter in 2012. Oculus launched the new wave of VR hype, but this was not the first occurrence. In the 1990's there was a boom of VR being integrated into arcades.

Nintendo even released the Virtual Boy as a VR device for home use.

Even before the 1990's, VR existed as a concept, with a number of early attempts including the Sensorama in 1962.

In 1838 came the invention of the first stereoscope, which allowed the user to view a stereoscopic 3D image.

Virtual Reality has a long and rich history. Today, there are a number of VR devices on the market. At the top we have the Oculus Rift (now owned by Facebook) and the HTC Vive. Both come with a headset that tethers to your PC, hand-tracked controllers, and external tracking devices that allow for positional head-tracking.

The positional head-tracking allows for what we call 6 Degrees of Freedom, meaning x/y/z rotation plus x/y/z position. It also allows for Room-Scale VR, where the user can walk freely around a VR space of roughly 2 meters in each direction.

Beyond Oculus and HTC, there is also a growing market for mobile VR including the Samsung Gear VR, Merge VR, Google Cardboard, and Google Daydream.

Mobile VR generally involves inserting your smartphone into a headset. It offers only 3 Degrees of Freedom (rotation only), but it allows you to escape the tether to a powerful PC. You can put on a mobile headset anywhere - on the bus, at an event, while waiting in line, even in bed... The only thing needed to power mobile VR is the smartphone you already carry with you.

With all of these headsets on the market, there is a growing need for developers to create new and exciting content. This is a phenomenal time to get into VR development, because not only are there a ton of hardware options available to the public but there are also a ton of resources being made available to potential developers. The Unity game engine offers built-in VR support that will automatically work with a number of the headsets listed above. The hardware developers also make interfacing with their hardware very straightforward through SDKs that anyone can download.

There has never been a better time to become a VR developer. I've created a tutorial to help you get started on your way: https://www.pluralsight.com/courses/unity-vr-fundamentals

Thanks for reading! If you enjoyed this, please Follow and Subscribe!

Sunday, August 27, 2017

Game AI: Non-Human Behavior Part 5

This is part 5 of a series on Game AI for Non-Human Behavior. Here's what you might have missed!
  • Part 1: Defining "Game AI" and "Non-Human Behavior"
  • Part 2: Making Decisions, Predators and Prey
  • Part 3: Weird Inspirations from Nature
  • Part 4: Modes of Hunting in Nature
Part 5 is a deep dive into sensory input and the behaviors that result from it.

Source: Atari
In 2015 an article was published in Nature that used Atari 2600 games to explore Reinforcement Learning in AI. Reinforcement Learning is the process of allowing AI to explore different options and learn behavior through a reward system, as opposed to Supervised Learning, where AI performing sub-optimal behavior is explicitly corrected. The article, "Human-level control through deep reinforcement learning," examines sensory input used to understand the surrounding environment. In their research, the sensory input correlates to visual input (or sight), allowing the AI to look at the pixels that make up the current game state.

Source: The Hunt, Netflix
In contrast, these blind catfish living in underwater caves have adapted to survive entirely without sight. When one sense fails, an organism relies on honing its other senses to adapt to a world without the missing sense.

Source: https://askabiologist.asu.edu/echolocation
As children, we learn the five senses as sight, sound, smell, taste, and touch. These are the senses we as humans understand, because these are the senses we experience. However, there are other senses present in nature that are outside our area of experience. Bats, for example, use sonar in place of sight to determine the positions of things around them.

A "sense" is defined as a system of cell types that respond to specific physical phenomena and correspond to a particular region of the brain that interprets those signals. Some plants have specialized cells that detect gravity, allowing the plants to grow upright with their roots growing down into the earth.

Source: http://www.defenders.org/sharks/basic-facts
Even with senses we understand, such as smell, non-human creatures have far superior uses of them in many cases. Sharks can determine the direction a smell is coming from based on which nostril received the scent first.

Source: nature.com/scientificamericanmind/journal/v19/n4/images/
Though the focus of reinforcement learning is on changing behavior based on rewards, it relies on information about the environment being interpreted through senses. We can see something similar happen in the classic rat maze example - the hungry rat is placed in a maze that has a piece of cheese at the end and allowed to explore the maze until it finds the cheese. The rat is then placed back at the start of the maze, and this continues until the rat manages to navigate the maze perfectly. The cheese in this example is obviously the reward, but the rat needs a way to understand its environment in order to get the reward.
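The rat-maze loop maps almost directly onto tabular Q-learning. Here's a hypothetical Python sketch (a Unity version would look different, and the "maze" here is simplified to a 5-cell corridor with cheese in the last cell; every name and parameter is invented for illustration):

```python
import random

# Tabular Q-learning sketch of the rat-maze example: explore the corridor,
# get rewarded only at the cheese, and let the reward propagate backwards.

def train(episodes=200, alpha=0.5, gamma=0.9, epsilon=0.1, n=5, seed=0):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(n) for a in (-1, +1)}  # actions: left/right
    for _ in range(episodes):
        s = 0                                   # rat starts at the entrance
        while s != n - 1:                       # episode ends at the cheese
            if rng.random() < epsilon:          # sometimes explore...
                a = rng.choice((-1, +1))
            else:                               # ...otherwise act greedily
                a = max((-1, +1), key=lambda act: q[(s, act)])
            s2 = min(max(s + a, 0), n - 1)      # walls clamp movement
            r = 1.0 if s2 == n - 1 else 0.0     # cheese is the only reward
            best_next = max(q[(s2, act)] for act in (-1, +1))
            q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])
            s = s2
    return q

q = train()
# The learned greedy policy in each non-goal cell, +1 meaning "move right".
policy = [max((-1, +1), key=lambda act: q[(s, act)]) for s in range(4)]
```

After enough episodes the rat "navigates the maze perfectly": the greedy policy moves right in every cell, because the cheese's reward has been bootstrapped back through the Q-table.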

Source: http://www.smithsonianmag.com/smart-news/
Beyond just understanding the environment, the senses also allow organisms to understand the creatures around them, resulting in group-based behavior like flocking. To put flocking in simple terms, we can use an algorithm that combines cohesion, alignment, and separation to simulate the behavior well enough for gameplay purposes.

Source: http://harry.me/blog/2011/02/17/neat-algorithms-flocking/
Each agent in the AI flock calculates its desired movement vector from cohesion (the need to stay within the group), alignment (the need to face the same direction as the group), and separation (the need to avoid hitting other members of the group). For cohesion, the agent looks at the positions of its neighbors within a specific range, finds their center of mass, and moves toward it. For alignment, the agent looks at the direction each of its neighbors is facing and aligns itself to the average. For separation, the agent checks whether it is too close to any of its neighbors and adjusts accordingly. Combining the three into the agent's velocity vector results in a simple flocking behavior.
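Those three rules fit in a few lines of code. This is a hypothetical Python sketch (a Unity implementation would do the same math with Vector3s per frame); the function, field names, and weights are my own:

```python
import math

# Minimal 2D boids step: combine cohesion, alignment, and separation
# into one desired velocity for a single agent.

def flocking_velocity(agent, neighbors,
                      cohesion_w=1.0, alignment_w=1.0,
                      separation_w=1.5, too_close=1.0):
    """agent and neighbors are dicts with 'pos' and 'vel' as (x, y) tuples."""
    if not neighbors:
        return agent['vel']
    n = len(neighbors)
    # Cohesion: vector from the agent toward the neighbors' center of mass.
    cx = sum(b['pos'][0] for b in neighbors) / n - agent['pos'][0]
    cy = sum(b['pos'][1] for b in neighbors) / n - agent['pos'][1]
    # Alignment: the neighbors' average velocity (their shared heading).
    ax = sum(b['vel'][0] for b in neighbors) / n
    ay = sum(b['vel'][1] for b in neighbors) / n
    # Separation: push away from any neighbor closer than `too_close`.
    sx = sy = 0.0
    for b in neighbors:
        dx = agent['pos'][0] - b['pos'][0]
        dy = agent['pos'][1] - b['pos'][1]
        if math.hypot(dx, dy) < too_close:
            sx += dx
            sy += dy
    return (agent['vel'][0] + cohesion_w * cx + alignment_w * ax + separation_w * sx,
            agent['vel'][1] + cohesion_w * cy + alignment_w * ay + separation_w * sy)
```

In a real game you'd normalize the result and clamp it to a max speed, but the raw weighted sum is enough to show the idea.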

Source: https://www.gizmodo.com.au/2014/03/
However, the flock's simple behavior may be disrupted by the presence of predators, forcing the agents to assess their environment beyond just the position and orientation of their neighbors.

Source: https://www.wired.com/2011/12/
Bees are another wonderful example of teamwork. Pheromones are scented chemicals that allow some organisms to communicate information with each other via scent. Bees use pheromones to share information throughout the hive, resulting in a hive that essentially thinks together as if it were a brain, with each bee acting as a neuron. I can imagine a similar system used to make a group of friendly characters so coordinated that they act as a hive of bees with the player as their queen.
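A pheromone-style system like that is surprisingly small to prototype. Here's a hypothetical sketch of a shared "pheromone map" that agents deposit to and read from instead of messaging each other directly; the class and all of its names are invented for illustration:

```python
# Stigmergy sketch: agents coordinate only through a shared, decaying
# signal map, the way bees coordinate through pheromones.

class PheromoneMap:
    def __init__(self, decay=0.5, cutoff=0.01):
        self.signals = {}              # position -> signal strength
        self.decay = decay
        self.cutoff = cutoff

    def deposit(self, pos, strength=1.0):
        """E.g. the 'queen' (player) marks a position of interest."""
        self.signals[pos] = self.signals.get(pos, 0.0) + strength

    def update(self):
        """Call once per tick: fade all signals, drop negligible ones."""
        self.signals = {p: s * self.decay for p, s in self.signals.items()
                        if s * self.decay > self.cutoff}

    def strongest(self):
        """Agents steer toward the strongest signal, if there is one."""
        return max(self.signals, key=self.signals.get) if self.signals else None
```

Because the map decays every tick, old orders fade on their own and the "hive" naturally shifts its attention to whatever the queen marked most recently.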

If you have enjoyed this series on AI for non-human behavior, please follow and subscribe!

Tuesday, August 22, 2017

Game AI: Non-Human Behavior Part 4

This is part 4 of a series on Game AI for Non-Human Behavior. Here's what you might have missed!
  • Part 1: Defining "Game AI" and "Non-Human Behavior"
  • Part 2: Making Decisions, Predators and Prey
  • Part 3: Weird Inspirations from Nature
 Today let's talk about how creatures in the wild go about hunting.

Living things need food to survive. There are different ways to get food - some plants make their own food, some animals eat plants, and some creatures hunt. Hunting generally refers to actively pursuing your food, but for the purpose of this post I'm going to expand it to include luring your food to come to you.

Source: https://www.youtube.com/watch?v=z5fOsgrAJiU
Here are a few of the tactics that predators use to catch their food:
  • Speed
  • Strength
  • Traps / Lures
  • Stealth / Camouflage
  • Teamwork
  • Built-in Tools (ie. claws and teeth)
The Venus Fly Trap is a fabulous example of a lure / trap. Predatory plants have adapted this way of life because they live in environments where it is challenging to absorb nutrients from the ground using roots, so they capture little packets of nutrients from the air in the form of insects. Predatory plants have different methods of luring prey, including bright colors and sticky nectar. Once a fly is lured into the danger zone, the Venus Fly Trap has special hairs inside its "mouth" that detect when something bumps against them. On the second bump, the trap will close. That's some interesting and clever behavior - if the trap started closing immediately, the fly could still be airborne and would have an easier time escaping, so by waiting for the second bump the fly is lulled into a false sense of security, thinking it can safely land. After the trap has closed, it waits for 5 more stimuli of the hairs before it begins digestion, to ensure the thing it trapped is actually an edible insect wriggling around.
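The trap's two-stage trigger is a textbook little state machine, and it translates straight into game AI. Here's a hypothetical Python sketch using the bump counts described above (the class and state names are my own):

```python
# The Venus Fly Trap as a state machine: close on the second hair bump,
# then digest only after 5 more bumps confirm live, edible prey.

class FlyTrap:
    def __init__(self):
        self.state = "open"
        self.bumps = 0

    def trigger_hair(self):
        """One stimulus of a trigger hair; returns the resulting state."""
        if self.state == "open":
            self.bumps += 1
            if self.bumps >= 2:          # close only on the second bump
                self.state = "closed"
                self.bumps = 0           # start counting fresh while closed
        elif self.state == "closed":
            self.bumps += 1
            if self.bumps >= 5:          # 5 more stimuli mean real prey
                self.state = "digesting"
        return self.state
```

A single airborne brush leaves the trap open; only a fly that lands and keeps wriggling walks the machine all the way to "digesting."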

Source: keywordsuggest.org/gallery/242369.html
Another clever take on luring prey is the Angler Fish, which has a glowing lure on top of its head that attracts unsuspecting fish into its terrifying mouth. Using traps and lures can save a lot of energy, because you don't need to move around as much if your food comes to you.

Source: www.flickr.com/photos/uswildlifeimagescom/5733874585
Birds have some of the scariest built-in tools in the form of deadly talons. Talons can be used for a variety of killing purposes including dismemberment and squeezing to death, and can be used in combination with the beak to tear critters into bite-sized pieces (sometimes while the critter is still alive).

Source: http://tigers4kids.weebly.com/hunting--diet.html
Big Cats can cover the spectrum of strength, speed, stealth, and teamwork, but they are certainly not the only ones. Tigers are known for their tremendous strength which allows them to take down prey in their solitary lifestyle.

Source: http://southernafricatravel.com
Lionesses work together to bring down prey that would be much too large and fast to bring down solo. While working together, the lionesses also employ stealth to sneak up close to their prey before beginning the chase.

Source: https://s-media-cache-ak0.pinimg.com/originals
The cheetah, the fastest land animal, is the go-to example of using speed for hunting. Cheetahs can reach top speeds of 60 miles per hour, though only for short sprints. Perhaps more impressive than their top speed is how quickly they can start and stop a sprint.

Source: wideopenspaces.com/10-animals-school-humans-camouflage/
Leopards are masters of camouflage; their spotted patterns help them melt into the shadows of trees and brush.

Many of these methods of hunting can also be used by prey to avoid being hunted. Prey rely on their speed, strength, teamwork, and stealth to stay alive in a dangerous world. And all of these methods can be applied to your characters when designing your AI's behavior in games.

Sunday, August 20, 2017

Game AI: Non-Human Behavior Part 3

This is part 3 of a series, if you are just joining in be sure to start at part 1!

Nature is weird, and full of constant surprises. There are a lot of things in nature that we still don't understand. Take for example these thousands of circles in the Kalahari desert:

Source: Africa, Netflix
We don't know what causes them... Scientists have suggested numerous explanations, including poisonous plants, insects, and magnetism, but have ruled all of them out. There is so much about our planet that is outside of our understanding, and so much to draw inspiration from.

Source: Bill Nye the Science Guy 
People think science is pretty cool - that's why a large chunk of the entertainment industry uses science to draw in audiences. As game developers, we can use that fascination to our advantage to get our players excited.

Source: Niche, Game
I've been playing a game recently called Niche where the player controls a herd of animals and tries to help them survive. The game does some really interesting things with reproduction and genetics, allowing the player to alter the chances of certain traits and to choose which animals mate with each other. This game does a really great job of taking elements of nature and science that people find fascinating, and then turning those elements into engaging gameplay mechanics.

Source: Factorio, Game
Another game that I think does a great job with this is Factorio. In Factorio, the player plays as a character stranded on an alien planet, and the player needs to collect resources to build a rocket to get off the planet. A lot of the resources can't be used as raw materials, so the player builds many different kinds of processing plants, essentially creating an entire factory from scratch. But the alien species living on the planet does not want to see their beautiful planet destroyed by pollution and over-harvesting. The more the player builds and expands, the more of the natural planet gets destroyed, and the angrier the aliens become. The aliens will attack the player and the base, because they are defending their planet. I really like the way this game makes you think about how your actions impact nature; finding more sustainable and less invasive ways to do things is more challenging, but it helps keep you at peace with the creatures inhabiting the planet.

I've designed characters and AI for a handful of games, and I've learned a few lessons from the process. Instinctively, it might seem that character design, combat, and behavior are all distinct design challenges that can be handled independently, but I've found that when it comes to the player interacting with the characters, these three things are very tightly intertwined, and they deserve to be designed together. There are games that have characters that just make sense because their character design feels perfectly aligned with their AI behavior, and the transition from behavior to combat seems completely natural and expected for how that character is perceived. On the other hand, I'm sure we've all played games where a character feels out of place, their behavior does not make sense for what that creature is, and the transition to combat feels clunky and unexpected.

I personally have fallen into this trap, and I'd like to think that I learned from my mistakes, and hopefully you can too. The very first game I worked on where I designed the AI and combat was a co-op exergame where two players work together to defeat giant robot spiders in an underground subway system.

We spent a lot of time designing and researching the exercise portion, and not a lot of time thinking about our enemies. We knew we needed something that would be immediately recognized as evil, we needed the player to be able to attack from a distance, and we wanted the enemies to swarm the player. A couple of interesting things that came from our design was that the spiders would come out of webs in the corners, so players could predict where they would come from, and some spiders could attack from a distance by spitting sticky webs at the player to temporarily trap them.

This project was over 5 years ago. I was the lead designer, so I was responsible for designing engaging enemies, but it was my first time designing anything related to AI, so I treated the character design, AI behavior, and combat completely separately, because I didn't know any better. I took the easy route - enemies mainly differed in their health and how much damage they could do, with the exception of the web-spitting spiders, and the only distinction for the boss was that she was huge and much stronger than the little minions. There were no interesting differences and no narrative reason why there was one enormous spider surrounded by a horde of tiny spiders.

Why are they robot spiders? What made them? The spiders are pretty scary, but their creator is probably pretty terrifying. Why are they attacking the player? Blood lust? Or are they defending their nest? Perhaps seeing a mother spider with an egg sac could help shed some light on their behavior. If I could go back and do it again, I would begin by considering why these characters are here and what their incentive is to attack the player. I would also draw inspiration from nature to come up with more interesting behavior than "run in and attack."

Source: The Hunt, Netflix
Spiders are bizarre but highly specialized hunters, with unique features and behavior adapted to any environment they live in.

Source: The Hunt, Netflix
This is a Portia Spider hunting another spider by dropping down on it from above. The Portia Spider specializes in hunting other spiders, so rather than building her own web she goes out on the hunt.

Source: The Hunt, Netflix
 This is a Spitting Spider that shoots poison out of its mouth at its prey. And this Spitting Spider could end up being a meal for Portia.

Source: Africa, Netflix
And here is a spider that is cartwheeling down a hill to escape a predator. Spiders are an amazing example of how their actual behavior in nature is probably more profound and well-designed than anything we could come up with when brainstorming a spider-based character for a video game. This world has wonders and terrors beyond our wildest imagination.

Source: Image By Jon Richfield
 If you are concerned that spiders are just not large enough in real life to inflict the kind of terror you are looking for, then allow me to present to you the Huntsman spider which can get up to one foot in diameter, and can move nearly a meter per second. That sounds pretty terrifying to me.

In part 4 I'm going to talk about modes of hunting (and avoiding being hunted) that can be found in nature.

Wednesday, August 16, 2017

Game AI: Non-Human Behavior Part 2

This is part 2 in my series on non-human game AI, if you missed part 1 check it out first!

In nature, survival and reproduction are the two biggest driving factors of decision-making. Let's start with survival.

Starting with the big-picture view, we know that all living things require energy to live, i.e. food, and organisms have evolved different techniques based on where they get their energy. One way to break this down is to figure out whether a creature is a predator or prey at various points in the food web.

At the top we have organisms that are always predators, and generally nothing hunts them. But moving down the chain, there are creatures that hunt but are also hunted. This is where we can see some interesting behavior trees. From the high-level goal of "survival" alone, these creatures need to decide which is more important - avoiding a predator or finding food. If food is plentiful, that decision is easy, but if a creature has gone a long time without food, it may take bigger and bigger risks to find food, encroaching into areas it knows to be dangerous.

From a design perspective, what's interesting is that we can use this information in two distinct ways. 1. Some games, especially hunting games and some survival games, attempt highly realistic simulated environments with a balance of creatures for the player to hunt; these games can use information about predator/prey relations to generate believable content. 2. Beyond realistic simulations, in any game with enemies we can regard the player as part of the predator/prey relation. The player wants to survive, so she must defend herself from enemies, either with stealth or armor or by attacking and killing the enemies first. However, most games have other objectives, and the player must decide how much risk they are willing to take to accomplish them.

In any scenario in games that have enemies, we can decide - do we want the player to feel like a predator here, or prey? Do we want the player to feel sneaky and clever and avoid getting caught/killed, or do we want the player to feel powerful and dominant and on the hunt? Both options create interesting dynamics, and a lot of games alternate between the two to create powerful exciting experiences.

As an example, in World of Warcraft if you encounter a high-level creature too early you will probably try to avoid it because you know it's stronger than you, but once you have leveled up you might return to fight it once you know you have a chance to defeat it.

Let's design an AI for a creature that is in the middle of a predator/prey situation. I like to think of AI decision-making systems as a sort of pro-con list:

This gives us an idea of the possible behaviors the AI might take and some of the factors that determine the decision. Based on our pro-con list, we know that the primary decision to focus on is "stay and eat" vs. "run and hide," and that the factors include how hungry the creature is, how plentiful the food is, how dangerous the predator is, how close the predator is, and whether the predator has seen it. Now we can prioritize these and convert them into a decision graph (or a behavior tree or state machine, depending on your approach).
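To make that concrete, here's a minimal, hypothetical scoring version of the "stay and eat" vs. "run and hide" decision. The function, weights, and factor names are invented for illustration; a shipped game would tune these and likely fold them into a full behavior tree:

```python
# Score the two candidate behaviors from the pro-con factors and pick
# whichever scores higher.

def choose_behavior(hunger, food_value, predator_threat,
                    predator_distance, seen_by_predator):
    """hunger, food_value, predator_threat are in [0, 1];
    predator_distance is in world units."""
    # Danger falls off with distance and doubles if we've been spotted.
    proximity = 1.0 / (1.0 + predator_distance)
    danger = predator_threat * proximity * (2.0 if seen_by_predator else 1.0)
    # Eating looks better the hungrier we are and the richer the food patch.
    eat_score = hunger * food_value
    return "stay and eat" if eat_score > danger else "run and hide"
```

A very hungry creature with a weak predator far away keeps eating; a well-fed creature spotted by a nearby strong predator bolts, which matches the risk-taking trade-off described above.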

This would obviously be different for different types of creatures, and it is also a very simplified solution - it only covers one very specific decision and two possible behaviors. Generally AI will have a lot of possible behaviors and different decision factors across the spectrum of possibilities. That is really what makes AI design such a challenging area to work in.

The complexity of the AI design depends heavily on how realistic the behaviors need to be. In theHunter: Call of the Wild, the designers knew that players wanted a realistic hunting experience and would often spend a lot of time watching an animal before taking a shot, so they had to do extensive research on how those creatures behave in order to ensure a believable experience for the player.

Most games are not held to quite such high standards of realism. Creatures in The Legend of Zelda do not have parallels in real life, so they have the flexibility to be weird and wonky and still be believable.

In part 3 I'm going to get into some of the stranger behavior in nature, and how we can use it as inspiration in AI design.
