Exploring Hardware Interfaces for Game Development

If you are a game developer, you know how painstaking it can be to tune the various aspects of your game. From gameplay to visuals, there are thousands of sliders and checkboxes to manage.

What would it be like if instead of sitting down at a mouse and keyboard, you could sit down with a custom hardware device designed for tuning your game? Instead of clicking through menus and dragging sliders, you would simply twist knobs and press buttons. How could this change our work style? How could it improve our craft? I set out to explore these topics by integrating the best piece of custom hardware I have into Unreal Engine 4.

Here you can see the results of my work as I play a game of space invaders and live-tune several aspects of the game.

If you are interested in how this was implemented, continue reading.

The Inspiration

One of my hobbies is producing music. I’ve used the Ableton Live Digital Audio Workstation (DAW) software for several years now. About 2 years ago I got an amazing deal on a used Push 2 controller from my local synth shop. The Push 2 is made by the same company (Ableton) that makes my DAW, and as a result, it has very tight integration with the DAW. It’s basically a MIDI controller with a 960×160 display, a mix of buttons (some with velocity sensitivity), and some rotary encoders.

The Ableton Push 2

The Push 2 is a beautiful piece of German engineering; it is a pleasure to use for music production, and it simply draws the eye and screams “play with me” when you see it sitting on a table. Beyond this, Ableton has published the full hardware specification for the device, making it an ideal controller to hack for various purposes.

The Dream

While I was chatting with some ex-LucasArts developers, they described how they had set up a MIDI keyboard to trigger character animations. They were then able to literally compose different animation patterns by pounding away on the keyboard. The end result was that they could create compelling animations in a more organic and creative way, potentially even creating things they never would have achieved programmatically.

This story was the impetus for this project. It got me thinking, what if I could use the Push 2 inside of Unreal Engine 4? Would tuning game parameters with a physical controller be more convenient, fun, or compelling? Is there a use-case for experimenting with live procedural game design by triggering various scenarios or code snippets from the controller?

How would this be an improvement over a computer keyboard, I wondered? Well, a keyboard does not have a screen, nor does it have encoders or a pitch-bend strip. So a final feasibility test would need to implement some of these unique hardware features in a compelling way.

Doing the Work

After reading through the hardware spec for the Push 2, it seemed pretty clear that I would need a USB library to facilitate rendering to the device’s screen as well as a MIDI library to handle input and commands.

Here is the final architecture that I landed on for my plugin.

My Push UE4 Plugin Architecture

Input Implementation

I saw that UE4 had a MIDI plugin called MIDIDevice that uses portmidi under the hood. Cool, this should save me some time! Unfortunately, it was not flexible enough for me. To read MIDI events from the plugin, you have to register a custom delegate. Because I wanted really tight UE4 input integration, I had to subclass IInputDevice, which is not a UObject and therefore cannot register its functions with that delegate.

Luckily, portmidi is so small and easy to use that I just went ahead and implemented it by hand in my custom input class. The Push 2 primarily sends Note On/Off and Control Change MIDI commands, as shown below.

The MIDI mapping for the Push 2
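To give a sense of the hand-rolled approach, here is a minimal sketch of how portmidi events might be drained from an IInputDevice subclass and forwarded to UE4’s message handler. The class and member names (FPush2InputDevice, Push2Stream) are hypothetical placeholders, not the plugin’s actual code, and error handling is omitted.

// Hypothetical sketch: polling portmidi inside an IInputDevice subclass.
void FPush2InputDevice::SendControllerEvents()
{
	PmEvent Events[64];

	// Read any MIDI events the Push 2 has queued since the last tick.
	const int NumRead = Pm_Read(Push2Stream, Events, 64);
	for (int i = 0; i < NumRead; ++i)
	{
		const PmMessage Msg = Events[i].message;
		const int Status = Pm_MessageStatus(Msg) & 0xF0; // strip the channel nibble
		const int Data1  = Pm_MessageData1(Msg);         // note / CC number
		const int Data2  = Pm_MessageData2(Msg);         // velocity / value

		const FName KeyName(*FString::Printf(TEXT("NN%d"), Data1));

		if (Status == 0x90 && Data2 > 0)                           // Note On
		{
			MessageHandler->OnControllerButtonPressed(KeyName, 0, false);
		}
		else if (Status == 0x80 || (Status == 0x90 && Data2 == 0)) // Note Off
		{
			MessageHandler->OnControllerButtonReleased(KeyName, 0, false);
		}
		// 0xB0 (Control Change) messages from the encoders are handled similarly.
	}
}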

The sheer number of buttons was a slight cause for concern. Take the amount of boilerplate code needed to bind a single button to the game engine and multiply it by all of the buttons on the device… Well, I did what any engineer would do in this circumstance: generate the C++ code with Python. 👍😅

# Note number ranges exposed by the Push 2's buttons and pads
ranges = [range(0, 11), [12], range(36, 100)]
for r in ranges:
    for i in r:
        print('const FKey FKeysPush2::NN%d("NN%d");' % (i, i))

Which generates something like:

...
const FKey FKeysPush2::NN10("NN10");
const FKey FKeysPush2::NN12("NN12");
...

Overall, I was able to generate about 350 lines of C++ with about 20 lines of Python, which in my opinion is a great time saver. It also made it easier to visually validate that the numeric ranges for the various input types were accurate, since they are clearly visible at the top of each for-loop.
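For completeness, those generated FKey constants also have to be registered with the engine before they show up in the input binding screen. Here is a minimal sketch of that registration; the display name and function name are illustrative, and presumably every generated key is registered in a similar (or likewise generated) way.

// Hypothetical sketch: registering one of the generated keys with UE4.
void FPush2InputDevice::RegisterKeys()
{
	EKeys::AddKey(FKeyDetails(
		FKeysPush2::NN36,
		NSLOCTEXT("Push2", "NN36", "Push 2 Pad 36"),
		FKeyDetails::GamepadKey));
	// ...repeated (or generated) for every note and control change number.
}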

The final implementation reads input from the Push 2 and binds all of its buttons to the UE4 input system. It also allows changing the RGB color of supported buttons. Here are the results as seen in the UE4 input binding screen; naturally, there are several pages of these, and you need to reference the Ableton docs to know where some of the buttons are located.

The UE4 input binding screen showing all of the Push 2 bindings.

UE4 Rendering Implementation

Some areas of the rendering implementation were challenging; others were extremely straightforward.

Custom Push Driver

The first step was to render raw pixels to the Push 2 display. This was actually very straightforward using libusb along with the Ableton hardware specification.

A few notes on the libusb implementation: any time you allocate a USB transfer, you need to deallocate it after the data has been transferred. That seems obvious, except that in order to deallocate, you have to register a callback on the transfer, and the kicker is that the callback will never be called unless you spin up a thread that endlessly pumps libusb’s event-handling loop.

This isn’t really spelled out in any of the examples, but you start to piece it together as you read through the docs. Follow that little tip if you want to avoid running out of memory.
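To make that concrete, here is a rough sketch of the pattern, assuming a libusb_context owned by the driver. The names are illustrative and error handling is omitted.

// Sketch of the libusb pattern described above (illustrative names).
#include <libusb.h>
#include <atomic>
#include <thread>

std::atomic<bool> bPumpEvents{true};

// Completion callback: the only safe place to release the transfer.
void LIBUSB_CALL OnTransferComplete(libusb_transfer* Transfer)
{
	libusb_free_transfer(Transfer);
}

// Without a thread pumping the event loop, completion callbacks never
// fire and completed transfers are never freed.
void StartEventThread(libusb_context* UsbContext)
{
	std::thread([UsbContext]()
	{
		while (bPumpEvents)
		{
			libusb_handle_events(UsbContext); // dispatches pending callbacks
		}
	}).detach();
}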

The only other significant piece of this code was the color space conversion from RGB 888 to RGB 565, in the data order that the Push 2 expects. Also relevant: this assumes a render target with the RTF RGBA8 format.

// Push 2 pixel format (16 bits per pixel)
// b4 b3 b2 b1 b0 g5 g4 g3 g2 g1 g0 r4 r3 r2 r1 r0

// Conversion from RGB 888 to RGB 565, storing MSB/LSB separately
Pixel(uint8 R, uint8 G, uint8 B)
{
	uint16 Red    = (R >> 3);         // 5 red bits   -> bits 0-4
	uint16 Green  = (G >> 2) << 5;    // 6 green bits -> bits 5-10
	uint16 Blue   = (B >> 3) << 11;   // 5 blue bits  -> bits 11-15
	uint16 RGB565 = (Red | Green | Blue);

	LSB = RGB565 & 0xFF; // uint8
	MSB = RGB565 >> 8;   // uint8
}

This custom Push driver is compiled as a static library that I link into the main UE4 plugin.

UE4 Components

This step is where most of my effort was spent. I had 2 goals:

  • Render a UMG widget on the Push 2
  • Render the output from a camera on the Push 2

After searching and reading several hundred lines of UE4 source, I found what looked like a promising entry point in FRenderTarget::ReadPixelsPtr, which gives you an array of pixels from a render target. Cool! From there, it was a matter of looking for usage examples across the engine, which led me to UWidgetComponent and Slate’s WidgetRenderer, both of which I had to customize to reroute their final output.

The two most common rendering scenarios.

The final implementation is conceptually pretty simple. It reads the pixels from any custom UUserWidget and sends them to the Push 2 display. Rendering a camera is similar: you configure a Scene Capture 2D with a custom render target and then create a UMG widget that displays said render target.

Render Target settings optimized for Push 2
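Conceptually, the copy path looks something like the sketch below. UPush2DisplayComponent, PushDriver, and SendDisplayFrame are hypothetical names standing in for my plugin component and the static driver library; the Pixel struct is the RGB 565 conversion shown earlier.

// Conceptual sketch of the widget/camera -> Push 2 copy path (illustrative names).
void UPush2DisplayComponent::SendFrameToPush(UTextureRenderTarget2D* RenderTarget)
{
	FTextureRenderTargetResource* Resource =
		RenderTarget->GameThread_GetRenderTargetResource();

	// Pull the 8-bit RGBA pixels back to the CPU.
	TArray<FColor> Pixels;
	Resource->ReadPixels(Pixels);

	// Convert each pixel to the Push 2's 16-bit format and hand it to the driver.
	TArray<uint8> FrameBuffer;
	FrameBuffer.Reserve(Pixels.Num() * 2);
	for (const FColor& Color : Pixels)
	{
		const Pixel Converted(Color.R, Color.G, Color.B); // RGB 888 -> RGB 565
		FrameBuffer.Add(Converted.LSB);
		FrameBuffer.Add(Converted.MSB);
	}

	PushDriver->SendDisplayFrame(FrameBuffer.GetData(), FrameBuffer.Num());
}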

One potential downside of this method is that Scene Capture components are rasterized, so you can’t do ray tracing. Not that anyone was asking for ray tracing on a 960×160 display, but it’s something I’ve been playing with since UE4 4.24.

The Last Mile

The last mile was binding inputs to debug commands. I spent considerable effort trying to do this right. I figured the tool isn’t as useful if it takes an hour of painstakingly writing binding scripts before you can live-tune anything. So I started writing a system that allowed selecting specific FProperty instances from a UClass and binding them to an encoder.
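As a rough illustration of the kind of enumeration that system performs, here is a hypothetical sketch; the filtering criteria and names are mine, not the shelved implementation itself.

// Hypothetical sketch: listing numeric properties on a class that could be
// mapped onto an encoder.
void ListTunableProperties(UClass* Class)
{
	for (TFieldIterator<FProperty> It(Class); It; ++It)
	{
		FProperty* Property = *It;

		// Only float/int properties make sense to bind to an encoder twist.
		if (Property->IsA<FFloatProperty>() || Property->IsA<FIntProperty>())
		{
			UE_LOG(LogTemp, Log, TEXT("Tunable: %s"), *Property->GetName());
		}
	}
}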

I mimicked Ableton’s interface for the Push, showing all spawned actors and allowing the user to add an actor to a list of watched actors.

While this works, it is a complicated system to expose to the user, and I eventually shelved the idea because I couldn’t make the decisions necessary to implement the UI. For instance, there are three contexts for every property that you tune:

  • Adjust all currently spawned actors that have this property
  • Adjust all future spawned actors that have this property (modify the Class Default Object (CDO))
  • Adjust only the actor that is selected

Depending on what was happening in the level, you could have dozens of actors getting spawned and destroyed in the few seconds that you were trying to add actors. I think there’s a good solution to be had here, but it will take some careful thought and design.

After this project had sat on the back burner for about 5 months, I decided to move past my brain-freeze with brute force so that I could finally come full circle. I hand-coded the debug parameters for Invaders, which took all of about 30 minutes, mostly thanks to SkookumScript, of course.

I assigned the buttons directly under the encoders to switch between banks, and each encoder then calls a function based on the selected bank. The function is passed a Boolean parameter specifying whether the encoder was twisted left or right (false and true, respectively). A typical function calls through to a couple lines of SkookumScript:

(Boolean increase?)
[
    !val : @sky_atmosphere_component.@rayleigh_exponential_distribution

    // Args
    // Value to modify
    // whether we are increasing or decreasing
    // increase increment
    // decrease increment
    // min value
    // max value
    val := modify_and_clamp(val, increase?, 0.05, -0.05, 0.1, 20)

    @sky_atmosphere_component.@rayleigh_exponential_distribution := val
    send_menu_debug_info("Atmosphere Rayleigh Exp", val>>)
]

Conclusion

After having gone full circle from idea to realization, I can say that the results have merit. Tuning the look-related aspects of a game by turning knobs and hitting buttons is incredibly compelling and fun, and it keeps you in a creative flow.

The Push 2 might not be the perfect hardware for game development, but it establishes a good baseline. I think encoders and a few buttons are the bare minimum for a candidate device. A screen isn’t strictly necessary, since it could be replaced with UI feedback in the game; however, it does allow you to do fancy things like having a persistent debug UI or defining a custom camera viewpoint for each tuning parameter.

I’d love to use a polished and well-integrated hardware device that is purpose-built for this use case. Who knows, maybe Ableton’s Push 3 could have Unreal Engine support out of the gate 🙂 Please contact me if you are a hardware manufacturer that wants to explore this potentially untapped market!
