SJBOT & Patch Prototype 2.5

I had a good, though tiring, day at the BDC Entrepreneurship booth at the St. John’s Board of Trade Trade Show & Conference. I met a few new folks, like Ashley from Fundamental Inc, who was kind enough to talk about her work in renewable institution planning, and Armin from AS Works, who had a really cool drone on show.

Also saw a few familiar faces, including co-boothee Julie Lewis from SassyTuna Studio, which was nice – another game dev was good to see!

I wasn’t sure what to expect from the experience, but I talked to a number of folks about what I’m doing, and had a few people who wanted to connect either on the recruitment side or on the contracting side of things, both of which are encouraging signs – getting new devs is a challenge, and finding work is always good!

I spent a lot of my day, however, just showing off Beat Farmer and watching people play it, which led to the refinement of yet another prototype, which I’m calling 2.5. This is the first version guided by a substantial amount of user feedback, and I feel really good about the result.

Check it out!

Demo Reel: Patch Prototype

I got the art for the first prototype from Clay recently, and so I reached out to my music guy, Georgie, to get some sound to back the demo up. Georgie mentioned he’d like to get a demo video to help during composition, so I put something together last night.

I’m tempted to call this the Soul Patch, but that’s probably not a good idea, given Georgie borrowed a banjo recently.

I’m prototyping with this level because it requires most of the elements I expect to need for every level in the full game, without requiring many single-use graphics or effects.

This level marks my first exposure to the Animation system in Unity, and it’s been interesting. I found myself struggling at first to understand how to put things together, because I really wanted to manage state changes centrally, but once you give up on that notion, things become fairly simple – import a sprite, create a GameObject, drag a set of animation frames over the object, and the Animation editor takes it from there.

I’m using a couple of control variables to manage transitions to new states, and I think I’ll probably end up allowing the Animation to drive the rest of my gameplay.

I’m not sure that’s the most efficient solution here, but it’s certainly the most straightforward, and to be honest, Beat Farmer isn’t likely to be a technically demanding game. My primary concern is keeping it relatively manageable in terms of development effort.

It’s nice, though. My little beets are growing up so fast!

Theorycrafting a fully-integrated Eve MMOFPS

I’ve been doing a lot of game design of late, both for Beat Farmer and for other projects I’ve had in the back of my mind for a while. Having stumbled over Eve and its FPS tie-in ambitions again today, I kind of want to figure out what my approach to this might look like.

First, some background.

Eve Online is a massively multiplayer game with starship combat and player-driven economics and “geographic” (topological?) control. It is famous for, among other things, its difficulty curve:

Can’t find an author – if it’s you, HMU

Many moons ago now, CCP Games released Dust 514, a first person shooter aimed at giving new players an “in” to the Eve universe, as well as expanding the existing player base’s available experience.

Dust 514 was shuttered, and CCP has gone through a couple of stabs at a design for a successor. The first was Project Legion, which, judging by the name alone, may have contained some of the ideas I’m going to talk about below. The second will be Project Nova, which sounds like it will be a much more straightforward FPS, at least at first. After watching GunJack videos, that’s probably the right decision. Eve was never a game for the nice-UX crowd.

Integrated Game Worlds and Design

As fascinating as it is to read about Eve’s many born-in-real-life treacheries, the universe itself isn’t that compelling. The compelling part of Eve has always been its large-scale gameplay, where massive alliances conduct battles so fierce that time literally slows down within the game so that the servers (and, no doubt, the clients) can run the necessary calculations.

That seems like a great place to start for a planetary conflict FPS. Fleets need to bring their troop transports to bear because ground forces are the only forces that can take territory. That’s interesting from a space combat standpoint because at that point you need to have well-protected transport ships.

Luckily, Eve already has these gigantic transports, and shipping is its own deep specialization within the game. But.

Blowing Things Up With Smaller Guns

The thing about Eve is this: the whole idea is to make spaceships shoot at each other. Every piece of the player-driven economic system is, in the end, a way to let both hardcores and casuals put together ships and fleets and super-fleets and massive starships and starbases oh my. These heavy things necessarily dominate the gameplay, and there’s no reason to imagine that a planetary assault wouldn’t come down to who has the better fleet in every case.

So you’d have to introduce gameplay restrictions in order for this to work.

You could, for example, add planetary defence systems that are able to blow a Titan out of the sky in short order but, in keeping with the game’s ethos, can’t hit anything smaller than a supercapital. This could get even more interesting if planets can throw off their governance once in a while, leading to both sides having to dodge space flak.

(Side note: There are almost certainly Eve players who would enjoy an associated game wherein their only job was to manage the governance of planets, quashing rebellions and maximizing productivity. There could be a tie-in between this game and the FPS players as well, where FPS players lead rebellions and security forces in conflict with one another.

There could even be a competitive aspect to the management game, with “sleeper” agents embedded in the game exercising various levels of mismanagement to mess with the managers. I don’t know whether the management types would like this element, but I digress…)

Another, possibly complementary, approach would be to introduce the idea of orbital pollution, where intense spaceborne warfare in proximity to a planetary body would reduce the effective value of the planet while the orbit was “cleaned up”. You could still see skirmishes and raids with smaller ships to try to disrupt or destroy orbiting transports, but the heaviest ships would have to concentrate on controlling the supply line rather than directly participating in orbital dominance.

To my mind, it seems like introducing these limiting factors would allow the optimization of the play experience in each game for its intended players.

Ground Assaults, Economics, and Availability

As to the transports themselves, I feel like the Dust approach, which limited itself to orbital bombardments, was a poor solution. It would be much more Eve-ish, to my mind at least, to integrate the transports themselves into a resource equation for a particular planetary war. Eve players and FPS players would have to negotiate how many clones, resources, and buildings to supply to the war effort. FPS players could see – possibly depending on their rank – exactly what is left available on the planet and in orbit, which would raise the stakes on their individual and team performances.

From a technical standpoint, one of the harder things to get right about this situation would be matchmaking. I think there’s room here, too, for improvement.

The first thing would be ensuring the conflict doesn’t pause because FPS players go to bed. Bots could be brought in as part of the supply drop, ensuring that FPS and space gameplay are not over-reliant on one another. FPS players might face AI-controlled bots for a while, then see a gradual increase in clone troopers, who would be controlled by other players.

For those unfamiliar with the Eve Universe, there are grades of clones, which brings a whole different question into play – when do FPS players field their top-tier clones, and why? What payment do they require? How do they handle the collapse (or retreat; betrayal is a key part of the game, after all!) of their space-based support lines?

This appeals to me as a former Eve player. Judging exactly how many clones might be available for an assault and for how long would become a critical part of the supply line equation. Balancing different bot types – Titanfall comes to mind – would come into play. And it’s possible to tie in economic incentives by offering clone trooper contracts on a cross-game marketplace to improve your assault’s chances.


Allowing cross-game discussion and negotiation in some form would be critical, even if it’s more like email than Messenger in design. Eve players and FPS players probably have some crossover, but I would expect that in most cases you’d see two separate entities (Eve Corp/FPS Mercenaries) discussing terms via this system. From a community and design standpoint, this could be a great place to gather data about how players are actually doing what they do and how to improve the experience.


That’s kind of what I see being the best version of the tie-in idea. It’s obviously weighted towards Eve-first. That’s the starting point, and it’s also the only CCP game I’ve played. That also means there’s a recognizable and not necessarily positive bias in the design here, and I’m not sure I’ve solved the issues perfectly.

Regardless, I’ll be interested to see how CCP deal with Project Nova and to what extent they implement gameplay that melds these different genres in interesting ways.


Solo dev: Bizdev edition

I’ve been working on getting the non-gameplay aspects of Beat Farmer figured out of late, and that has meant sorting out some of the basics for Perfect Minute as a functioning business.

Before I did anything, I needed to commit to the business more heavily than I have been. I have an aversion to not paying people for their work, so I started putting away $100 per paycheque from my day job. There are a variety of opinions on funding game development, some of which encourage you to self-fund, others focused more on external investment, but as a rule I find that paying out of pocket helps me remember to look for the best possible value for my money, so that’s my preferred bootstrapping method.

With that tiny pot of money, my first order of business was finding an artist. I’m trying to hire locally where possible, so I sent out a call on this blog and on my friendly local game development Facebook community. I got a few portfolios right away, including an artist I was very interested in working with, Clay Burton.

Finding someone so quickly meant I had to scramble a bit to get the contract drawn up. I initially considered using Law Depot, but I didn’t feel confident that I would get something I could trust to legally enforce the rights I needed.

I looked around town to find a lawyer specializing in IP and media and settled on Lindsay Wareham at Cox and Palmer, whose focus areas include Intellectual Property and Startups, which seemed like a good fit. I’ve since discovered that Cox and Palmer have several folks working together in this area, as well as a helper program for startups in general, which gives me hope that I have, for once, made a pretty good call.

The drafting of the contract took a couple of weeks and wasn’t too expensive, as legal matters go. A lot of good questions came up during my conversation with Lindsay, though, stuff like:

  • Are you incorporating? (not yet)
  • What share structure do you intend to use for your corporation? (not sure, and I have conflicting information about the best structure to use)
  • Where will the copyright and moral rights reside? (with me until incorporation)
  • Do you foresee selling products other than games? (yes)
  • Do you need trademarks registered? (yes, when I have a bit more money)

Two weeks later I had a shiny new contract ready to fill out. I sent it over to my artist, who sent it back with his name on it…but no witness signature! This is my first time doing this, and I didn’t want to bug the guy more than necessary, but after chatting with Lindsay, I had to go back and beg him to get it witnessed as well. So that’s ready to go.

I also sent out a call a while ago for a music person for the game, and I use the word “person” on purpose there, because I don’t know much about doing music in a game.

One of the musicians I know in town recommended his buddy, Georgie Newman. Georgie and I had spoken briefly after that initial request, but never got around to talking further. I reached out and we decided to meet up and chat. That turned out to be really great for me, as Georgie knows what he’s at to a much higher degree than I do when it comes to game audio.

That conversation has now left me with a number of things I need to do (“action items”, as the cool fogies say):

  • Flesh out the design for Beat Farmer enough to do cost and marketing plans
  • Figure out how much Beat Farmer is going to cost to make and market
  • Figure out the best sales model for this game and its follow-ons
  • Figure out how I’m going to fund the first few Perfect Minute Games ($50/week ain’t gonna cut it forever, after all)

As error-prone dark-groping goes, this has actually been ok. I’m hopeful that I can get all the way to the publishing phase without destroying myself and/or the company financially or otherwise in the process.

I’ll keep you posted!

Small art contract

As I mentioned on Twitter,

I’m looking for a freelance artist to do a small job for Beat Farmer.

I’m looking for someone who can do clean 2d/3d work in a cute/cartoon style. If you happen to know anyone who might suit, please have them send a portfolio to

Super Simple Unity Surface Shader

As part of a project I’m involved with, I’ve been back at the shader business a little bit lately. In particular, I’ve been interested in how to provide input to a shader to allow dynamic displays of various kinds.

This post will be super-basic for those of you who already know how to write shaders, but if you’re just starting out with them and using Unity, it may provide a little extra help where you need it.

The shader explained below is a surface shader, which means that it controls the visual characteristics of particular pixels on a defined surface, and more particularly that it can interact with scene lighting. It also means that Unity does a lot of heavy lifting, generating lower-level shaders out of the high level shader code.

Doing this the way I am below is probably overkill, but since I’m learning here, I’m gonna give myself a pass (Shader Humour +1!).

Creating and Using a Surface Shader in Unity

In Unity, a Shader is applied to a rendered object via the object’s Material. As an example, in the screenshot below, a shader named “PointShader” is applied to a Material named Upstage, which is applied to a Quad named Wall.

You can see in the UI that the Upstage material exposes two custom properties, Color and Position (there’s actually a third, a dummy texture, which we can ignore). Here’s a simplified version of the shader code for PointShader.

Shader "Custom/PointShader" {
  Properties {
    _MainTex ("Dummy", 2D) = "white" {}
    _MyColor ("Color", Color) = (1,1,1,1)
    _Point ("Position", Vector) = (0, 0, 0, 0)
  }
  SubShader {
    // Setup stuff up here
    // More setup stuff

    sampler2D _MainTex;
    fixed4 _MyColor;
    float4 _Point;

    // Implementation of the shader
  }
}

That “Properties” block defines inputs to the shader that you can set via the material, either in the Unity editor or in script.

In this case, we’ve defined 3 inputs:

  1. We will ignore _MainTex below, because we’re not really using it except to ensure that our generated shaders properly pass UV coordinates. It is a 2D graphic (that is, a texture), it’s called “Dummy” in the editor, and by default it is a flat white texture.
  2. _MyColor (which has that My in front of it to avoid any possible conflict with the _Color variable that exists by default in a Unity Surface Shader) is a 4-component Color (RGBA). This type is basically the same as the Color type used everywhere else in Unity. This variable has the name “Color” in the editor, and defaults to opaque white.
  3. _Point is a 4-component Vector, which is slightly different from a Color in that it uses full floating point components, as you can see in the SubShader block. It’s referred to as Position in the Unity UI. The naming is up to you; I’m just showing you that you can use one name in code and a different one in the editor if you need to. It defaults to the origin.

As you can see in the screenshot above, you can set these values directly in the editor, which is pretty handy. The real power of this input method, however, comes when you start to integrate dynamic inputs via scripting.

PointShader was created as a sort of “selective mirror”. It allows me to apply an effect on a surface based on the location of an object in my scene. In order to do this, I have to update the _Point property of my material. The code below shows how I’m doing that in this case.

public class PointUpdate : MonoBehaviour {
  public Vector2 texPos;

  // Called by the tracked object whenever it moves
  public void Apply(Vector3 position) {
    // Map the world position into this object's local space
    var transformedPoint = this.transform.InverseTransformPoint(position);
    // Convert local coordinates to UV coordinates (see notes below)
    var tempX = .5f - transformedPoint.x / 10;
    var tempY = .5f - transformedPoint.z / 10;
    texPos = new Vector2(tempX, tempY);
    var material = this.GetComponent<MeshRenderer>().material;
    material.SetVector("_Point", texPos);
  }
}
Whenever my tracked object moves, it calls this Apply method, supplying its own position as a parameter. I then map that position to the local space of the object on which my shader is acting:

transformedPoint = this.transform.InverseTransformPoint(position);

Then I turn that mapped position into coordinates on my texture.

Three things you should know to understand this calculation:

  1. Texture coordinates are constrained to the range of 0 to 1
  2. My quad spans 10 units per side in local space, hence the divisions by 10 (a default Unity Quad mesh is 1×1, so adjust the divisor to match your surface)
  3. In this case my texture coordinates are inverted relative to the object’s orientation

var tempX = .5f - transformedPoint.x / 10;
var tempY = .5f - transformedPoint.z / 10;
texPos = new Vector2(tempX, tempY);
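To double-check the mapping, here’s the same arithmetic as a standalone function (plain Python rather than C#, purely as an illustration – the function name is mine):

```python
def local_to_uv(x, z, side=10.0):
    """Map a point in the quad's local space to UV coordinates.

    The surface spans [-side/2, side/2] on each axis, and the texture
    coordinates are inverted relative to the object, hence the subtraction.
    """
    u = 0.5 - x / side
    v = 0.5 - z / side
    return u, v

# A point at the local origin lands in the middle of the texture:
print(local_to_uv(0, 0))    # (0.5, 0.5)
# A corner maps to the diagonally opposite UV corner:
print(local_to_uv(-5, -5))  # (1.0, 1.0)
```

Note that the results only stay inside the 0-to-1 UV range while the tracked point is actually over the surface.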

Finally, I set the value of _Point on my material. Note that I use the variable name and NOT the editor name here:

material.SetVector("_Point", texPos);

With this value set, I know where I should paint my dot with my shader. I use the surf() function within the shader to do this. I’ve added the full SubShader code block below.

SubShader {
  Tags { "RenderType"="Opaque" }
  LOD 200

  CGPROGRAM
  // Physically based Standard lighting model, and enable shadows on all light types
  #pragma surface surf Standard fullforwardshadows

  // Use shader model 3.0 target, to get nicer looking lighting
  #pragma target 3.0

  sampler2D _MainTex;
  fixed4 _MyColor;
  float4 _Point;

  struct Input {
    float2 uv_MainTex;
  };

  void surf (Input IN, inout SurfaceOutputStandard o) {
    if (IN.uv_MainTex.x > _Point.x - 0.05
        && IN.uv_MainTex.x < _Point.x + 0.05
        && IN.uv_MainTex.y > _Point.y - 0.05
        && IN.uv_MainTex.y < _Point.y + 0.05) {
      o.Albedo = _MyColor;
      o.Alpha = 1;
    } else {
      o.Albedo = 0;
      o.Alpha = 0;
    }
  }
  ENDCG
}
The Input structure defines the values that Unity will pass to your shader. There are a bunch of possible element settings, which are described in detail at the bottom of the Writing Surface Shaders manpage.

The surf function receives that Input structure, which in this case I’m using only to get UV coordinates (which, in case you’re just starting out, are coordinates within a texture), and the SurfaceOutputStandard structure, which is also described in that manpage we talked about.

The key thing to know here is that the main point of the surf() function is to set the values of the SurfaceOutputStandard structure. In my case, I want to turn pixels “near” my object on, and turn all the rest of them off. I do this with a simple if statement:

if (IN.uv_MainTex.x > _Point.x - 0.05
    && IN.uv_MainTex.x < _Point.x + 0.05
    && IN.uv_MainTex.y > _Point.y - 0.05
    && IN.uv_MainTex.y < _Point.y + 0.05) {
  o.Albedo = _MyColor;
  o.Alpha = 1;
} else {
  o.Albedo = 0;
  o.Alpha = 0;
}
Albedo is the color of the pixel in question, and Alpha its opacity. By checking whether the current pixel’s UV coordinates (which are constrained to be between 0 and 1) are within a certain distance from my _Point property, I can determine whether to paint it or not.
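That branch is just an axis-aligned box test: a pixel gets painted when both of its UV components are within 0.05 of the corresponding _Point component. Here’s the same check in plain Python (an illustration of the logic only – the names are mine, and the real work happens in the shader):

```python
def painted(u, v, px, py, radius=0.05):
    """True when UV (u, v) lies inside the box centred on (px, py)."""
    return (px - radius < u < px + radius) and (py - radius < v < py + radius)

# With _Point at the centre of the texture, only nearby pixels light up:
print(painted(0.51, 0.49, 0.5, 0.5))  # True
print(painted(0.90, 0.10, 0.5, 0.5))  # False
```

Because UV coordinates run from 0 to 1, a radius of 0.05 paints a square covering about a tenth of the surface in each direction.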

At runtime, this is how that looks:

It’s a simple effect, and not necessarily useful on its own, but as a starting point it’s not so bad.

OSX & Kinect, 2017

So you have a MacBook (or something else that runs OSX) and you want to play with the Kinect sensor, but you’re having trouble because there are about 1 billion sets of wrong instructions on the internet for connecting it. Let me save you a little grief.


I have the Kinect “v2”, aka Kinect for Xbox One, aka Kinect for Windows, aka (in my case) Model 1520. The instructions below work for my version. The only serious difference if you have the older Kinect should be that you use a different version of libfreenect, but I haven’t tested that.


You have more than one option as far as software goes. If you’re a commercial developer, you might consider trying out Zigfu’s ZDK, which has an OSX-ready image and integrates with several modern packages, including Unity3d, out of the box.

If you’re more of a hobbyist (as I am at the moment) and don’t have the $200 for a Zigfu license, the lovely folks behind the Structure Sensor have taken on maintenance of the OpenNI2 library, including a macOS build. Your first step should be to download the latest version of that library and unzip it somewhere.

Unfortunately, their package isn’t quite complete, and you’ll also need a driver to connect the Kinect (I know, it’s getting old to me too). This is where our ways may diverge, gentle reader, for in my case I discovered that I needed OpenKinect’s libfreenect2, whereas an older sensor would require libfreenect.

Assuming that you’re using the XBox One sensor, you’ll want to read the README that comes with your copy of libfreenect2. It contains all the necessary instructions for getting the right tools + dependencies and building all the things.

There are two additional things that are currently left out of their readme file. The first is that when you want to use the OpenNI2 tools, you’ll need to copy the libfreenect2 driver from its build output into the OpenNI2 driver folder for whatever you’re running. So to run NiViewer, which is in the Tools folder, you’d copy the driver into the Drivers folder that sits alongside it.
I expected the “make install-openni2” command from libfreenect2’s readme would take care of that stuff, but it does not.

The second omission is the troubleshooting stuff on their wiki. In particular, for my specific MacBook, I had to plug the Kinect adapter into the USB port on the left-hand side, NOT the right-hand side, as the device requires USB3, and I had to run Protonect and NiViewer using the “cl” pipeline. The default pipeline setting can be changed via libfreenect2’s LIBFREENECT2_PIPELINE environment variable.

You can also pass in the pipeline for Protonect:

bin/Protonect cl

With that setting in place, you should see a viewer with 2 (NiViewer) or 4 (Protonect) panes, each capturing a different part of the raw Kinect stream:


From here you’re on your own, but I hope you found this at least a bit helpful!
