
BEATBox
Role:
Lead Programmer
Software:
Unity, C#, Qualisys, Ableton, Chataigne
"In a galaxy where you can’t jam to a funky beat.... one alien named Caesura wants to take sound for himself! Arm yourself with one of the two Gloves of Justice - Bass or Treble - and take back sound once and for all!"
BEATBox is an immersive 3D motion-capture, rhythm-based gaming experience developed by a team of Arts and Entertainment Technologies students, including myself. Players wear motion-capture gloves to deflect musical, rhythm-timed attacks from the enemy, Caesura.
What I Built
Motion-Capture Control System via Qualisys
BEATBox's game input centered on motion-captured "gloves" made to resemble boxing gloves. The system includes a calibration mechanism that syncs gameplay targets to each player's height, arm reach, and stance, keeping gameplay fair and accessible.
MIDI-Based/Music Synced Attack Launching
The central feature of BEATBox is its music-synced attacks from the mischievous Caesura. These attacks are driven by a MIDI-based communication bridge between Ableton Live and Unity, built with the MidiJack Unity plugin.
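On the Unity side, a bridge like this can be sketched with MidiJack's note-on delegate; the class below is an illustrative minimal listener, not the shipped implementation:

```csharp
using UnityEngine;
using MidiJack;

// Listens for MIDI note-on events arriving from Ableton over a
// virtual MIDI port (e.g. loopMIDI) and logs them for gameplay use.
public class MidiAttackListener : MonoBehaviour
{
    void OnEnable()  { MidiMaster.noteOnDelegate += OnNoteOn; }
    void OnDisable() { MidiMaster.noteOnDelegate -= OnNoteOn; }

    // MidiJack reports the channel, the note number (0-127), and velocity.
    void OnNoteOn(MidiChannel channel, int note, float velocity)
    {
        Debug.Log($"Note {note} on {channel} (velocity {velocity})");
        // A gameplay hook would go here, e.g. spawning an attack for this note.
    }
}
```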
In-Depth Development Process
Challenge 1: Making Motion-Capture Gameplay Consistent Across Players
One of the first major hurdles in development was understanding how to position motion-captured gloves accurately within Unity space. After connecting Qualisys to Unity through QTM, our team experimented with glove positioning and projectile paths to determine how attacks should travel toward the player.
From this experimentation, we identified three key constraints:
- We needed a hitbox system to prevent players from simply holding their gloves where notes would land.
- Note launchers had to be positioned far enough from the player for attacks to arrive in time with the music.
- Once these positions were finalized, they had to remain reliable for every player regardless of height or arm length.
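The second constraint reduces to simple timing math: a note must leave its launcher early enough that its travel time lands it on the beat. A rough sketch of that relationship (the speeds and distances here are illustrative, not our actual tuning values):

```csharp
// Computes how far ahead of a target beat a projectile must launch
// so that it arrives at the player exactly on that beat.
public static class LaunchTiming
{
    public static float TravelTimeSec(float distanceMeters, float speedMetersPerSec)
        => distanceMeters / speedMetersPerSec;

    public static float BeatsOfLead(float travelTimeSec, float bpm)
        => travelTimeSec * (bpm / 60f);
}

// Example: a launcher 9 m away firing notes at 6 m/s in a 120 BPM song
// needs 1.5 s of travel time, so it must fire 3 beats early.
```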


To solve these challenges, I developed a calibration sequence that positioned hitboxes and launchers relative to a player’s punches in multiple directions. This ensured that each note would launch close enough to be reachable, while still requiring the player to punch outward with intention. The system helped create a more readable and accessible interaction space for a wide range of players.
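A simplified sketch of that idea: sample the glove's position during directional calibration punches, then place each hitbox relative to those samples. All names and offsets below are illustrative stand-ins, not the shipped implementation:

```csharp
using UnityEngine;

// During calibration the player punches in each attack direction;
// we record where their glove ends up and anchor gameplay to it.
public class PunchCalibrator : MonoBehaviour
{
    public Transform glove;                             // Streamed from the mocap rig
    private readonly Vector3[] reach = new Vector3[4];  // up / down / left / right

    // Called once per direction while the player holds a full punch.
    public void SampleDirection(int direction)
    {
        reach[direction] = glove.position;
    }

    // Hitboxes sit slightly inside full reach so the player must
    // punch outward with intention rather than hover on the target.
    public Vector3 HitboxPosition(int direction, Vector3 chest)
    {
        return Vector3.Lerp(chest, reach[direction], 0.85f);
    }
}
```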
Challenge 2: Syncing Enemy Attacks to Live MIDI Input
The next major hurdle was creating MIDI-synced attacks for the player to deflect. With the hitbox system in place, the first step was getting basic MIDI data from Ableton Live, where we would eventually run the song live, into Unity; we achieved this with the MidiJack Unity plugin and loopMIDI, a virtual MIDI port.
Experimentation showed that each note arrives in Unity with a unique numerical ID. That gave us the idea to have each note in a dedicated MIDI track represent a direction for the player to punch, much like stepping in a specific direction in Dance Dance Revolution. We then built a list of MIDI IDs for Unity to recognize and map to launchers, each firing when its note plays. With this system in place, alongside our cleaned-up motion-capture positioning system, the core mechanic of the game was complete.
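Conceptually, the mapping was a lookup from MIDI note number to launcher, along these lines (the note assignments and `AttackLauncher` API are hypothetical stand-ins):

```csharp
using System.Collections.Generic;
using UnityEngine;
using MidiJack;

// Hypothetical launcher component; Fire() would spawn a projectile.
public class AttackLauncher : MonoBehaviour
{
    public void Fire() { /* spawn a note projectile toward the player */ }
}

// Routes specific MIDI note numbers to directional launchers, firing
// a launcher whenever its assigned note plays in the Ableton set.
public class NoteToLauncherRouter : MonoBehaviour
{
    public AttackLauncher upLauncher, downLauncher, leftLauncher, rightLauncher;

    private Dictionary<int, AttackLauncher> routes;

    void Awake()
    {
        // Illustrative note assignments; ours lived on a dedicated MIDI track.
        routes = new Dictionary<int, AttackLauncher>
        {
            { 60, upLauncher },   { 62, downLauncher },
            { 64, leftLauncher }, { 65, rightLauncher },
        };
    }

    void OnEnable()  { MidiMaster.noteOnDelegate += OnNoteOn; }
    void OnDisable() { MidiMaster.noteOnDelegate -= OnNoteOn; }

    void OnNoteOn(MidiChannel channel, int note, float velocity)
    {
        if (routes.TryGetValue(note, out var launcher))
            launcher.Fire();
    }
}
```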


Key Takeaways
Calibration had to be treated as part of the core mechanic, because motion-capture gameplay only felt fair once I started designing around how different players move, reach, and react in space.
Connecting MIDI, OSC, lighting, and gameplay taught me how much rhythm-game satisfaction depends on clean timing and reliable sync across multiple systems.
This project showed me that unusual interfaces take a lot of iteration, because the technology means very little if the interaction does not feel natural to the player.