Getting started with Bitbucket & Git

This post serves as quick and painless instructions to get the less technically inclined up and running with working version control as fast as possible. 🙂

First and foremost, download and install Git: http://git-scm.com/downloads. Default options should be fine for everything.

There are a number of Git clients out there, but the one I am currently using is TortoiseGit. It's easy to use and integrates well with Windows. Download and install with the default options: http://code.google.com/p/tortoisegit/. TortoiseGit adds a submenu to the context menu whenever you right-click on a file or folder. This is a very easy way to interact with your repositories.

So now we have everything we need to start working with Git. However, many of the benefits of version control come from hosting your repository on a server. This acts as a live access point for you and your team, and adds a measure of security by keeping your important files redundantly saved online. Bitbucket.org is a fantastic service that lets you host public and private repositories for free. Sign up for Bitbucket and create a private repository.

Bitbucket uses SSH to protect your private repository and grant you, and only you, remote access to your files. You need to generate a public and private SSH key pair. TortoiseGit comes with a tool called PuTTYgen. Open it and click the “Generate” button. It will ask you to move your mouse around a bunch to help create randomness. Save BOTH the public and private keys to a safe location on your computer; a folder named “.ssh” in your home directory is the usual spot. The private key is your identity. Never give it to anyone. It is yours, and it is as unique as you are. The public key is given to anyone you want to be able to verify your identity. For us, this is Bitbucket. Go to the SSH Keys section in your Bitbucket account settings and upload your public key there.
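If you'd rather use the command line, Git for Windows ships with OpenSSH's ssh-keygen, which does the same job as PuTTYgen (note that Bitbucket expects the OpenSSH public key format, while PuTTY stores private keys in its own .ppk format). The file name below is just an example; pick whatever you like:

```shell
# Make sure the usual key folder exists, then generate a key pair.
mkdir -p ~/.ssh

# -f picks the file name (an example name here); -N "" sets an empty
# passphrase -- use a real passphrase for better security.
ssh-keygen -t rsa -b 4096 -f ~/.ssh/bitbucket_key -N "" -C "you@example.com"

# The .pub file is the public key -- this is what you paste into
# Bitbucket's SSH Keys settings page. Keep the other file private!
cat ~/.ssh/bitbucket_key.pub
```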

Now, go to your repository on the Bitbucket web site and click the “Clone” button. Copy the URL; the command shown looks something like this: “git clone git@bitbucket.org:<your_user_name>/<your_repo_name>.git” (the URL itself is the part after “git clone”).

In Windows Explorer, right-click and you should see the TortoiseGit menu options. Select “Git Clone…”. All you need to do is paste your repository URL, tell it where to save your local copy of the repository on your computer, and supply the private key file that we generated earlier using PuTTYgen.
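If you're curious what TortoiseGit is doing under the hood, the same clone can be done from Git Bash. Here's a self-contained sketch that uses a local stand-in “remote” so it runs without any network access (the /tmp paths are made up for the example):

```shell
# Stand-in for the repository that would live on Bitbucket:
git init --bare /tmp/demo-remote.git

# Cloning works the same way as it would with
# git clone git@bitbucket.org:<your_user_name>/<your_repo_name>.git
git clone /tmp/demo-remote.git /tmp/demo-local
```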

That's it! You now have a working Git repository! The basic way to interact with your repository is to add and edit files like you normally would. When you are happy with your changes, right-click on your repository and select “Git Commit…”. A window will pop up showing you all of your changes. Make sure you check any new files you added. Write a comment describing the changes you made in detail, then click Commit. This commits the changes to your LOCAL repository; you probably want to upload them to your Bitbucket repo so your teammates can check out your awesome changes. Just press the “Push…” button on the next window. Now we have pushed all of our changes to the Bitbucket repo!
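The same edit → commit → push cycle looks like this on the command line. This is a self-contained sketch against a local stand-in remote, so the paths and file names are invented for the example:

```shell
# A throwaway "remote" and working copy, standing in for Bitbucket:
git init --bare /tmp/demo-origin.git
git clone /tmp/demo-origin.git /tmp/demo-work
cd /tmp/demo-work

# Git needs to know who is committing (normally configured once globally).
git config user.email "you@example.com"
git config user.name "Your Name"

echo "hello" > notes.txt              # edit files like you normally would
git add notes.txt                     # "check" the new file, as in TortoiseGit
git commit -m "Add notes file"        # commit to your LOCAL repository
git push origin HEAD                  # upload the commit to the remote
```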

These are the basics of getting set up and how you will work with Git the majority of the time. There is, of course, a lot more to working with Git, TortoiseGit, and Bitbucket than is explained here, but with a little googling you should be just fine.
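One everyday operation not covered above is downloading the changes your teammates have pushed. TortoiseGit exposes this as “Pull…” in the context menu; on the command line it's git pull. Here's a self-contained sketch (all paths invented for the example):

```shell
# Stand-in remote, plus a "teammate" clone that pushes a change:
git init --bare /tmp/pull-origin.git
git clone /tmp/pull-origin.git /tmp/teammate
cd /tmp/teammate
git config user.email "teammate@example.com"
git config user.name "Teammate"
echo "level notes" > design.txt
git add design.txt
git commit -m "Teammate's change"
git push origin HEAD

# Your clone, picking up that change:
git clone /tmp/pull-origin.git /tmp/you
cd /tmp/you
git pull   # fetches and merges any new commits from the remote
```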

You can learn more about Bitbucket features and SSH here if you are so inclined: https://confluence.atlassian.com/display/BITBUCKET/Bitbucket+101

Have fun with Git! 🙂


Setting up Unity for Git Version Control

As of Unity 4.2:

Navigate to Edit>Project Settings>Editor

Set Version Control Mode to “Meta Files”

Set Asset Serialization Mode to “Force Text”

This sets Unity to save all of its files in text format instead of binary format. You should do this as soon as possible; it decreases the chances that you ever lose any work as you interact with your version control system. Unity will now generate .meta files for nearly every file it works with. Make sure to add all of the .meta files to the repository.

Some of Unity’s files do not need to be added to the repository. We can create a “.gitignore” file to tell Git which files to ignore. Create a file named “.gitignore” in your repository root directory and add text along these lines (a typical set of entries for a Unity project):

# OS X
.DS_Store

# Unity
Library/
Temp/
obj/

# MonoDevelop/Visual Studio
*.csproj
*.sln
*.pidb
*.userprefs

This tells Git to ignore auto-generated files that would only clutter your repository and get in the way!
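You can verify your ignore rules with git check-ignore, which prints a path only when some rule matches it. A quick self-contained check (the repository path and entries here are examples):

```shell
git init /tmp/ignore-demo
cd /tmp/ignore-demo
printf 'Library/\nTemp/\n*.csproj\n.DS_Store\n' > .gitignore

# check-ignore exits successfully (and echoes the path) when it is ignored:
git check-ignore Library/assetDatabase3
git check-ignore .DS_Store
```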


Unity Debug Rendering

I’m working on a little project that requires very precise collision placement. I need to be able to visualize collision boundaries and how they relate to the graphics as I playtest in a fast-iterating environment. I have found that Unity really doesn’t have an acceptable answer to this problem, so I decided to come up with my own solution.

Unity allows you to render from multiple cameras at the same time. By being selective about which objects each camera renders, we effectively have a system where we can toggle the visibility of certain collections of objects by enabling and disabling the associated camera. I decided that for my purposes I wanted the background, characters, special effects, and collision visualization all on different layers. We can parent all of our new cameras to an empty game object so their orientations never get out of sync. I added a fifth camera to clear the screen. You can number your cameras in the order you want them to render so you never forget. You end up with something that looks like this:


There’s still a little bit more work to do before this setup will work any magic. We need to edit some camera properties. First is render order; that’s the Depth field. Set the clear camera to 1 and increment by one for each successive camera. Next is Clear Flags. For my purposes, I set the clear camera to “Skybox”. This clears the entire frame buffer for the next render frame. I set the background, character, and FX cameras to “Don’t Clear”. This ensures that we see everything each of these cameras renders. Lastly, we set the debug camera to “Depth Only”. That clears the Z buffer so our collision boundaries get rendered even if they are technically behind our actual game graphics. The last property we need to set is the “Culling Mask”. This is how we sort which objects get rendered by which camera. You need to create a Layer for each camera, then set the Culling Mask appropriately. Make sure to set your game objects to the appropriate layers as well!


So technically this works, but you’re going to want to create a debug material so you can actually see the collision boundaries in relation to the graphics. A solid transparent material works best. Unity’s “Transparent/VertexLit” shader is exactly what we want. Just set the main and emissive colors to the same color, and don’t forget to lower the opacity on the main color.


We now have a setup that allows us to toggle on and off different graphics layers, and most importantly, we have a GREAT way to visualize our complex collision data!

Here’s an in-game view with different cameras turned on and off:


So this is great, but we can do even better. It’s a pain digging through the hierarchy every time we want to change what gets rendered. Wouldn’t it be nice if we could just collapse our parent CameraData object and never have to worry about it? Thanks to Unity’s awesome editor scripting functionality, we can do just that.

We start by adding a simple script to our CameraData object to provide easy access to our Camera objects.

using UnityEngine;
using System.Collections;

// Holds references to each of our render cameras so editor scripts can reach them.
public class CameraData : MonoBehaviour {

    public GameObject bgCam;
    public GameObject charCam;
    public GameObject fxCam;
    public GameObject debugCam;
}

Now we can simply drag and drop our cameras from the hierarchy to the respective field in the inspector.


We can script a custom window in Unity to give us more intuitive controls over debug rendering.

using UnityEngine;
using UnityEditor;
using System.Collections;

public class DebugRenderWindow : EditorWindow {

    bool displayFoldout = true;
    CameraData cameraData;

    bool bgCameraToggle = true;
    bool charactersCameraToggle = true;
    bool fxCameraToggle = true;
    bool debugCameraToggle = true;

    void OnEnable(){
        // Grab the CameraData component from the scene's CameraData object.
        cameraData = (CameraData)GameObject.Find("CameraData").GetComponent(typeof(CameraData));
    }

    // Add a menu item named "Debug Renderer" to the Window menu.
    [MenuItem("Window/Debug Renderer")]
    public static void ShowWindow(){
        // Show the existing window instance. If one doesn't exist, make one.
        EditorWindow.GetWindow(typeof(DebugRenderWindow), false, "Debug Renderer");
    }

    void OnGUI(){
        displayFoldout = EditorGUILayout.Foldout(displayFoldout, "Render Settings");
        if (!displayFoldout)
            return;

        // Each toggle enables or disables the matching camera object.
        bgCameraToggle = EditorGUILayout.Toggle("BG Draw: ", bgCameraToggle);
        cameraData.bgCam.SetActive(bgCameraToggle);

        charactersCameraToggle = EditorGUILayout.Toggle("Character Draw: ", charactersCameraToggle);
        cameraData.charCam.SetActive(charactersCameraToggle);

        fxCameraToggle = EditorGUILayout.Toggle("FX Draw: ", fxCameraToggle);
        cameraData.fxCam.SetActive(fxCameraToggle);

        debugCameraToggle = EditorGUILayout.Toggle("Debug Draw: ", debugCameraToggle);
        cameraData.debugCam.SetActive(debugCameraToggle);
    }
}

Now we have a nice, intuitive little window that doesn’t get in the way, and best of all, we can place it anywhere we like, just like Unity’s built-in windows!

Painting With Polygons

I was inspired when I saw this video: http://vimeo.com/5660045. So much so that I just had to figure out how to implement it in Blender. It’s really not that hard.

Here’s what I was able to achieve with just a little bit of tinkering:




First, add an empty to the scene at the origin. Set a location keyframe. Move one frame ahead and translate the empty by one unit on the X axis. Add another location keyframe. Go to the Graph Editor, select all of the curves, then set the interpolation type to Linear. Lastly, add a Cycles modifier. This ensures the effect has no temporal artifacts. When you are done, you should have something that looks like this:


Once our empty is all set up, we need to add a Displace modifier to the object we want to apply the technique to. Add a displacement texture. I recommend starting with a Clouds texture, but feel free to experiment for different effects. Set the texture coordinates to Object, then set the Object field to our empty.


Lastly, we need to enable motion blur. As far as I know, this only works with Blender Internal at the moment. Set the motion samples to something greater than 1. The motion samples and the shutter value can both be played with; motion samples between 4 and 8 have worked well for me. Blender Internal’s motion blur is calculated by rendering out where objects would be on sub-frames and then combining them with transparency. The more samples, the longer it will take to render.


That’s it! I encourage you to play with the texture, displacement, and motion blur settings to achieve some really cool results!


Welcome to Tiki Island!



Meet the terrible Tiki Island Golem!


Tiki Island resident:



Tiki Island was featured in a competitive multiplayer children’s iPad game. Players took turns moving from island to island, enchanting tikis and avoiding the troublesome golem of the island. All the 3D assets were made in Blender. I was responsible for technical graphics such as particle effects and atmospheric perspective, ensuring optimal frame rates, game design, gameplay programming, lighting, rendering, batch scripting, bridging communication between artists and engineers, and training the team in tools and procedures. Working on such a restrictive system poses its share of difficulties, but nothing that cannot be overcome with critical thinking and perseverance!

Low Poly Volcano


This was such a fun project. Every problem was tackled from a procedural point of view. This is true programmer art. The notion that programmers cannot create beautiful art is ridiculous. When approached from a logical and methodical point of view, good art becomes possible even for those who claim not to have a trace of creativity.

The tree was created without ever going into edit mode! That’s right, I didn’t move a single vertex on my own. Honest, it’s all modifiers! The lava and pyroclastic flow of the volcano are nothing more than cubes spawned with a particle system. I used the grease pencil in projection mode to draw the paths of the lava directly onto the volcano; from there I spawned emitter geometry. The ground and water are procedurally displaced planes. The clouds are another particle system of cubes, all spawned at the same elevation. The cubes are given a transparent material to create the interesting look. I just browsed through the particle system seeds until I found one that made a nice composition. Procedural art is fun!