Hobbyism and a new game

I’ve backed off from taking PMG too seriously these days, as I don’t have time for all the things I want to do, and taking things less seriously is one way of coping. My goal is to keep creating games under the Perfect Minute banner, but without the pressure to make money. I think that’s doable.

With that in mind, I’ve started fiddling with an idea I’ve had for a pen and paper roleplaying game. This isn’t related to the Card RPG (at least not yet). Where that project is about finding a genuinely interesting and novel (to me) diceless and possibly GMless system that can be used for quick RPG-lite style play, this one is much more rooted in older games I’ve played and loved for one reason or another.

The new thing is more setting-focused, for one thing. One of my first experiences at the table, the best part of 30 years ago now, was a longish campaign in the Rifts setting. Rifts is a funny beast; its setting, in particular, is impossible to pigeonhole, combining as it does super-tech and mecha and Nazis and gods and magic and dragons and entirely too many other things. It famously discards any notion of balance between player characters in favour of OH MY GOD DID YOU SEE WHAT I CAN DO? It does this kind of stuff well, at least at the level of setting.

But. Rifts also has some really massive holes in it. Some are mechanical, and they’re simple enough to paper over for an individual game, maybe even a full campaign. If you’re willing to deal with the cut-and-paste books and some of the…difficult…writing, and and and. Or if you buy the Savage Rifts books.

And if you’re not, say, a member of a First Nation. Or a kid from Africa. Or basically any non-white person.

But I digress.

I’ve been thinking about what my take on a Kitchen Sink setting like Rifts would be for years. I’ve had individual notions about what a “real” mecha suit might look like, and I’m seeing it show up in media over time. I’ve played with different incarnations of mashup settings and mechanics.

But the seed of something good finally clicked for me recently when I posed myself the question What Do I Like About High Magic and High Tech?

I’m an avid reader of science fiction, and so I have a really strong notion of what I want the tech half of that to look like. At its core, the tech of the new thing is rooted in the works of folks like Greg Egan and Charlie Stross. They write almost unimaginable futures that challenge the concept of selfhood in the face of immortality. They frequently push the limits of your imagination as a reader; I can’t even imagine what it’s like to live inside their heads. How could I not use them as the starting point for my vision of Highest Possible Technology?

For magic, on the other hand, there was only ever one candidate for my core inspiration. See, shortly after that first game of Rifts, a bunch of us started fooling with White Wolf’s World of Darkness games. Our “main” DM started with Vampire, and then our “off” DM got into Werewolf, and then I bought Mage: the Ascension, and it changed how I think of magic and just about everything else.

Mage, particularly the Ascension incarnation, isn’t really a roleplaying game per se. It’s more of a whole-brain metaphysical workout regimen. The notion of magic it purveys is rooted in the concept that belief makes reality, which sounds like something Tony Robbins might say, but it’s a deeply powerful idea in roleplaying terms. The game hit me at the height of my adolescent powers while I was on a whiplash trajectory of life changes, and instead of calming me down it kicked me up about four notches. I can never be anything but grateful for its influence.

So that’s the seed: Taking my lead from Rifts’ gonzo, go-for-broke mashupisms, I’m going to try to design a game where magic that directly incarnates reality interacts with tech that pushes the limits of possibility. I’ve already started tweaking that mix, throwing in a few doses of my own creative energy and stuffing the whole works into a “bright forest” universe (like a Dark Forest, but less murderey). We’ll see where it takes me.

It’s called Demiurges, at least for now. I hear that Kult uses that word as a pretty key part of its setting, but, you know. It’s a word. I like the sound of it. It means what I need it to mean. So. Demiurges. Watch this space for more details.

If you’re interested in being more involved, you can sign up for the mailing list that I just created.

SJBOT & Patch Prototype 2.5

I had a good, though tiring, day at the BDC Entrepreneurship booth at the St. John’s Board of Trade Trade Show & Conference. I met a few new folks, like Ashley from Fundamental Inc, who was kind enough to talk about her work in renewable institution planning, and Armin from AS Works, who had a really cool drone on show.

Also saw a few familiar faces, including co-boothee Julie Lewis from SassyTuna Studio – it was nice to see another game dev!

I wasn’t sure what to expect from the experience, but I talked to a number of folks about what I’m doing, and had a few people who wanted to connect either on the recruitment side or on the contracting side of things, both of which are encouraging signs – getting new devs is a challenge, and finding work is always good!

I spent a lot of my day, however, just showing off Beat Farmer and watching people play it, which led to the refinement of yet another prototype, which I’m calling 2.5. This is the first version guided by a substantial amount of user feedback, and I feel really good about the result.

Check it out!

OSX & Kinect, 2017

So you have a MacBook (or something else that runs OSX) and you want to play with the Kinect sensor, but you’re having trouble because there are about 1 billion sets of wrong instructions on the internet for getting it connected. Let me save you a little grief.

Hardware

I have the Kinect “v2”, aka Kinect for Xbox One, aka Kinect for Windows, aka (in my case) Model 1520. The instructions below work for my version. The only serious difference if you have the older Kinect should be that you use a different version of libfreenect, but I haven’t tested that.

Software

You have more than one option as far as software goes. If you’re a commercial developer, you might consider trying out Zigfu’s ZDK, which has an OSX-ready image and integrates with several modern packages, including Unity3d, out of the box.

If you’re more of a hobbyist (as I am at the moment) and don’t have the $200 for a Zigfu license, the lovely folks behind the Structure Sensor have taken on maintenance of the OpenNI2 library, including a macOS build. Your first step should be to download the latest version of that library and unzip it somewhere.

Unfortunately, their package isn’t quite complete, and you’ll also need a driver to connect the Kinect (I know, it’s getting old to me too). This is where our ways may diverge, gentle reader, for in my case I discovered that I needed OpenKinect’s libfreenect2, whereas an older sensor would require libfreenect.

Assuming that you’re using the Xbox One sensor, you’ll want to read the README.md that comes with your copy of libfreenect2. It contains all the necessary instructions for getting the right tools + dependencies and building all the things.

There are two additional things that are currently left out of their readme file. The first is that when you want to use the OpenNI2 tools, you’ll need to copy the drivers from

libfreenect2/build/lib

into

{bin-folder}/OpenNI2/Drivers

for whatever you’re running. So to run NiViewer, which is in the Tools folder, you’d copy the drivers to

{openni-base-folder}/Tools/OpenNI2/Drivers

I expected the “make install-openni2” command from libfreenect2’s readme would take care of that stuff, but it does not.
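Put together, the copy step looks something like the sketch below. The paths here are hypothetical – substitute wherever you unzipped OpenNI2 and built libfreenect2, and note the driver filename is just a stand-in; the snippet fakes the folder layout with temp directories so you can see the shape of the operation:

```shell
# Stand-in layout -- replace these with your real OpenNI2 and libfreenect2 paths.
OPENNI_DIR=$(mktemp -d)/OpenNI2
FREENECT2_DIR=$(mktemp -d)/libfreenect2
mkdir -p "$FREENECT2_DIR/build/lib" "$OPENNI_DIR/Tools"
touch "$FREENECT2_DIR/build/lib/libfreenect2-openni2.0.dylib"   # stand-in driver file

# The actual fix: copy the built drivers next to the tool you want to run,
# so NiViewer (in Tools) finds them under Tools/OpenNI2/Drivers.
mkdir -p "$OPENNI_DIR/Tools/OpenNI2/Drivers"
cp "$FREENECT2_DIR"/build/lib/libfreenect2* "$OPENNI_DIR/Tools/OpenNI2/Drivers/"
ls "$OPENNI_DIR/Tools/OpenNI2/Drivers"
```

The same copy, aimed at a different destination folder, works for any other OpenNI2 tool you want to run.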

The second omission is the troubleshooting stuff on their wiki. In particular, for my specific MacBook, I had to plug the Kinect adapter into the USB port on the left-hand side, NOT the right-hand side, as the device requires USB3, and I had to run Protonect and NiViewer using the “cl” pipeline. The default pipeline setting can be changed by doing this:

export LIBFREENECT2_PIPELINE=cl

You can also pass in the pipeline for Protonect:

bin/Protonect cl

With that setting in place, you should see a window with 2 (NiViewer) or 4 (Protonect) panes, each capturing a different part of the raw Kinect stream:

From here you’re on your own, but I hope you found this at least a bit helpful!

Shadertoy

So you want to write code for a living, but you also have a wee bit of graphic artist in you? Maybe shaders are your lovely medium!

Here’s a small sample of the insanity on display at ShaderToy.

Garage – I haven’t parsed this completely, but I think that entire scene – including the cars – may be a single shader. Note that the car goes both up AND down the decks.

Tentacle Thing – More traditional, akin to the hair shaders supposedly used on the Monsters movies from Pixar, but still amazing to see it running in realtime in a browser.

Seascape – What can I say, except maybe why the hell aren’t there games that have this as part of their experience???

Flame – Simple, but also easy to mess with (try changing the numbers in the mainImage function and hit the Play button below the code window to see what I mean).
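If you’ve never poked at one of these before: a Shadertoy shader is just a single mainImage function that gets called per pixel, with iResolution and iTime provided as built-in inputs. Here’s a minimal flame-ish sketch of my own (not the linked shader) that you can paste into a new shader on the site and tinker with the same way:

```glsl
// Minimal Shadertoy-style fragment shader -- a toy of mine, not the linked Flame.
// Try changing the constants and hitting Play to see the effect.
void mainImage(out vec4 fragColor, in vec2 fragCoord)
{
    vec2 uv = fragCoord / iResolution.xy;          // normalize to 0..1 screen coords
    float flicker = 0.5 + 0.5 * sin(iTime * 6.0);  // cheap time-based pulse
    float glow = pow(1.0 - uv.y, 3.0) * flicker;   // brighter toward the bottom
    fragColor = vec4(glow, glow * 0.4, 0.05, 1.0); // orange-ish fire gradient
}
```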

It’s hard to imagine the world of the elder statesmen in this field; they were stuck decoding research papers from a variety of obscure journals. But it’s also nice to know that there is such an incredibly rich ocean of knowledge out there to scour while learning to do cool things.

Working!

New stuff happening. I’ll be giving a talk about programming and technical creation in Unity at NGX2016. I’m working on separating the Contension network code from the conflict code so I can start doing up some AI behaviours. I’ve got a bunch of candidate attributes to try out in-game. And I’ve got a rough idea of the narrative arc of the game.

Next up, I’ll be trying to figure out how to apply the Non Combat episode of Extra Credits to my presentation of the game.