I’m gonna talk about all the faffery in a minute, but if you want the TL;DR, I invite you to take a look at feature video 0.0.10 – Brush Strokes.
Why “Brush Strokes”, Cousin?
One of the folks in my local gamedev community, the incredible artist rsvpasap, asked me a while back what kind of “painting” there was in Paint By Monsters, and at the time I didn’t have any kind of sensible answer for them.
Well, now I do, at least in the first-proof-of-concept sense. The Brush Strokes feature was directly inspired by that conversation, and I hope Angie would be proud to have sent me down this road.
On Assets, Asset Markets, and General Faffery
I’ve been mostly buying assets for PBM from itch.io artists, but there’s only so much content you can find there. I’ve looked at a bunch of new spots, particularly ArtStation and the Unity Asset Store, and in the latter I happened across the Gesture Recognizer asset.
This asset, on its face, is exactly what I wanted for Brush Strokes – a clean, simple asset focused specifically on recognizing a range of gestures, with good editor integration and the ability to create new gestures without a lot of faffing about. So, you know. Good job Raphael Marques on that front.
Unfortunately, Gesture Recognizer has a few problems out of the box that make it less than ideal to work with as of this writing. Maybe the creator will update it sometime, but in the meantime let’s talk about how I approached the issues that I encountered.
Fixing Gesture Recognizer
The first problem – and really, this is as much an indictment of Unity as a vetting agent as it is of the asset creator – is that unless you’re starting with the example level included with Gesture Recognizer, creating a new Gesture and trying to fill in the ID field causes a whole boatload of exceptions to show up in the Console Log. These will lead you back to the GesturePatternEditor.OnInspectorGUI method, and in particular to this line of code:
var closedLineProp = gestures.GetArrayElementAtIndex(currentGestureIndex).FindPropertyRelative("closedLine");
There are two exceptionally unfortunate things about this line of code. The first is that nowhere does the editor code check whether currentGestureIndex is a valid index into the gestures array. The second is that the line is only really needed for a small subset of gestures, because closedLine only matters when a gesture contains a closed loop of some kind.
My solution to this issue was relatively simple. I put an if statement around the entire editor block containing this piece of code:
if (gestures.arraySize > 0)
{
    // do stuff, including the array access
}
This fixed the editor errors, at least for the cases I care about.
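If you want to be a little more defensive about the index-validity problem specifically, you could also clamp the index inside that guard. Here’s a quick sketch of the shape of that fix – not the asset’s shipped code, just the same guard with a clamp added:

if (gestures.arraySize > 0)
{
    // Clamp so a stale selection can never index past the end of the array
    var safeIndex = Mathf.Clamp(currentGestureIndex, 0, gestures.arraySize - 1);
    var closedLineProp = gestures.GetArrayElementAtIndex(safeIndex).FindPropertyRelative("closedLine");
    // ...rest of the closedLine editor UI...
}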
I ran into some issues with the OnRecognize event that the asset uses, but I can’t 100% rule out PEBKAC for those, so I’ll just state for the record that Dynamic Events are tricksy, and you should look for them at the very top of the event list for the handler object.
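For anyone else wiring this up: to make your handler show up in that Dynamic section at all, it has to be a public method taking exactly the event’s parameter type. Something like this – the class and method names here are mine, not the asset’s:

using UnityEngine;

public class BrushStrokeHandler : MonoBehaviour
{
    // Must be public and take exactly the event's parameter type
    // to appear under "Dynamic" in the inspector's method dropdown
    public void OnStrokeRecognized(RecognitionResult result)
    {
        // gesture handling goes here
    }
}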
The Other Problem With OnRecognize
The really tricky bit with this event, however, is that its parameter, RecognitionResult, only includes the points from the recognized gesture. The asset seems to be focused on some kind of cleaned-up vector drawing use case, so I’m sure that works great when you’re doing the scaling and whatnot in another component (for the record, you probably need to inherit from DrawDetector to do that properly).
But if you’re more interested in the gestures themselves – placement, size, all that kind of stuff – and if you want to keep things nice and centralized to boot, you’re going to want the points as measured at the time of input.
Now, I’m not proud of my solution to this one, but I will say that it serves the purpose, and that’s enough for the kind of rapid feature iteration I’m doing on PBM. What I did was this: I changed the list of lines that defines the input being examined – a member variable in the object that invokes OnRecognize (that is to say, DrawDetector) – from private to internal.
// was private; internal makes it reachable from my own handler code
internal List<UILineRenderer> lines;
This lets me do the following in my event handler (note that internal only works because my handler code compiles into the same assembly as the asset – if you’re separating code with assembly definitions, you’d need a different approach):
// Grab the camera and the DrawDetector that raised the event
var cam = FindObjectOfType<Camera>();
var dd = transform.GetComponentInParent<DrawDetector>();

// The most recent line is the stroke that was just drawn
var lines = dd.data.lines;
var line = lines[lines.Count - 1];

// Convert the first and last raw input points from screen space to world space
var point1 = cam.ScreenToWorldPoint(line.points[0]);
var point2 = cam.ScreenToWorldPoint(line.points[line.points.Count - 1]);

// Direction (and length) of the stroke, plus its midpoint
Vector3 gestureVec = point2 - point1;
Vector3 gesturePos = (point1 + point2) / 2.0f;
This works for basically any straight-ish line, and gives me the two primary things I care about:
- The position of the gesture in world space
- The orientation of the gesture in world space
I expect to have to do more complex analysis for future specializations, but these two vectors allowed me to specify (see the sketch after this list):
- Where my brush stroke effects appear
- How big they are
- What their orientation is
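As a rough illustration of how those two vectors get used, here’s a minimal sketch – strokeEffectPrefab is a hypothetical GameObject reference of my own, not something the asset provides:

// Orient the effect so its local X axis runs along the stroke
var rotation = Quaternion.FromToRotation(Vector3.right, gestureVec.normalized);
var effect = Instantiate(strokeEffectPrefab, gesturePos, rotation);
// Stretch it along that axis to match the stroke's length
effect.transform.localScale = new Vector3(gestureVec.magnitude, 1f, 1f);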
Of course, then I had to figure out how particle system shapes work and how they interact with the Position, Rotation, and Scale of their host GameObject…but more on that next time.