
BEATBox

Role:
Lead Programmer
Software:

Unity, C#, Qualisys, Ableton, Chataigne

"In a galaxy where you can’t jam to a funky beat… one alien named Caesura wants to take sound for himself! Arm yourself with one of the two Gloves of Justice - Bass or Treble - and take back sound once and for all!"

BEATBox is an immersive 3D motion-capture, rhythm-based gaming experience developed by a team of Arts and Entertainment Technologies students, including myself. Players use motion-capture gloves to deflect musical, rhythm-synced attacks from the enemy, Caesura.
Core Mechanic Contributions

Motion-Capture Control System via Qualisys

BEATBox's game input centers on motion-captured "gloves" made to resemble boxing gloves. The system includes a calibration mechanism that syncs gameplay targets to each player's height, arm reach, and other proportions, keeping gameplay fair and accessible.

MIDI-Based, Music-Synced Attack Launching

The central feature of BEATBox is the set of music-synced attacks launched by the mischievous Caesura. These attacks are synced through a MIDI-based communication bridge between Ableton Live Suite and Unity, built with the MidiJack Unity plugin.

In-Depth Development Process

Motion-Capture Control System via Qualisys

The first hurdle in developing this experience was understanding motion-capture positioning in Unity space. After connecting Qualisys to Unity via QTM (Qualisys Track Manager), our team experimented with glove positioning and with sending projectiles in the direction of the glove, toward the player. We had three key takeaways:

  • We’d need some sort of hitbox to prevent the player from simply keeping their gloves parked where the notes would land.

  • The note launchers would need to sit a certain distance from the player’s glove so that notes reach the player in time with the music.

  • Once the hitbox and launcher positions were finalized, they would have to work the same for every player, regardless of height, arm length, etc.
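The first takeaway, blocking glove-parking, could be sketched as a velocity-gated hitbox. This is an illustrative sketch, not the shipped code: the `GloveHitbox` name, the "Note" tag, and the speed threshold are all assumptions.

```csharp
using UnityEngine;

// Illustrative sketch: a note only counts as "hit" if the glove is
// moving fast enough when the note reaches it, so the player can't
// simply park a glove where the notes land. Assumes the glove has a
// trigger collider and notes are tagged "Note" with a Rigidbody.
public class GloveHitbox : MonoBehaviour
{
    public float minPunchSpeed = 1.5f;   // assumed threshold, meters/second

    Vector3 lastPosition;
    Vector3 velocity;

    void Start()
    {
        lastPosition = transform.position;
    }

    void Update()
    {
        // Estimate glove velocity from the mocap-driven transform.
        velocity = (transform.position - lastPosition) / Time.deltaTime;
        lastPosition = transform.position;
    }

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Note") && velocity.magnitude >= minPunchSpeed)
        {
            Destroy(other.gameObject);   // note successfully deflected
        }
    }
}
```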


In order to solve these positioning challenges, some sort of calibration system would have to run before gameplay. I worked on a calibration sequence that places the hitbox and launcher positions relative to the player’s punches in various directions. This ensures a note is launched close enough for the player to hit it, but far enough that the player must actually punch outward at the note, as judged by the hitbox. It also guaranteed that the relative hitbox and launcher placement stayed consistent for every player, regardless of height or arm length.
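A minimal sketch of such a calibration pass follows, assuming a hypothetical `PunchCalibrator` with illustrative names and offsets rather than the shipped implementation:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative calibration sketch, not the shipped code: during a short
// pre-game sequence the player punches toward each attack direction.
// The farthest glove position recorded per direction becomes that
// direction's hitbox anchor, and its launcher is placed a fixed
// distance beyond it, so spacing scales with the player's reach.
public class PunchCalibrator : MonoBehaviour
{
    public Transform glove;            // mocap-driven glove transform
    public Transform playerOrigin;     // e.g. the player's starting marker
    public float launcherOffset = 3f;  // assumed extra spawn distance

    readonly Dictionary<string, Vector3> punchExtents =
        new Dictionary<string, Vector3>();

    // Call every frame while the player punches toward `direction`.
    public void Sample(string direction)
    {
        Vector3 reach = glove.position - playerOrigin.position;
        if (!punchExtents.TryGetValue(direction, out Vector3 best) ||
            reach.magnitude > best.magnitude)
        {
            punchExtents[direction] = reach;   // keep the farthest punch
        }
    }

    // After calibration, anchor one direction's hitbox and launcher.
    public void Place(string direction, Transform hitbox, Transform launcher)
    {
        Vector3 reach = punchExtents[direction];
        hitbox.position = playerOrigin.position + reach;
        launcher.position = playerOrigin.position + reach
                            + reach.normalized * launcherOffset;
    }
}
```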

MIDI-Based, Music-Synced Attack Launching

The next main hurdle was creating a MIDI-synced attack for the player to defeat. With the hitbox-based control system in place, the first step involved getting basic MIDI data from Ableton, where we’d eventually launch the song live, into Unity. We achieved this with the MidiJack Unity plugin and a loopMIDI virtual port. From our experimentation, we found that each incoming note arrived in Unity with a unique numerical ID.

Since each note produced a unique numerical ID, we had the idea of letting each note in a separate MIDI track represent the direction we wanted the player to punch, much like stepping in a specific direction in Dance Dance Revolution. We then built a list of MIDI IDs for Unity to recognize, mapping each one to a launcher that fires when its note plays. With this system in place, along with a cleaned-up and finalized motion-capture positioning system, the basic mechanic of our game was complete.
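The DDR-style note-to-direction mapping might look like the hypothetical `NoteLauncherMap` component below; the note IDs and the `Fire` entry point are placeholders, not the real chart:

```csharp
using System.Collections.Generic;
using UnityEngine;
using MidiJack;

// Illustrative sketch of the mapping described above: each MIDI note
// ID stands for a punch direction, and the matching launcher fires
// when that note plays. IDs below are placeholders, not the real chart.
public class NoteLauncherMap : MonoBehaviour
{
    public GameObject leftLauncher, rightLauncher, upLauncher, downLauncher;

    Dictionary<int, GameObject> launcherByNote;

    void Awake()
    {
        launcherByNote = new Dictionary<int, GameObject>
        {
            { 60, leftLauncher },    // placeholder note IDs
            { 62, rightLauncher },
            { 64, upLauncher },
            { 65, downLauncher },
        };
    }

    void OnEnable()  { MidiMaster.noteOnDelegate += OnNoteOn; }
    void OnDisable() { MidiMaster.noteOnDelegate -= OnNoteOn; }

    void OnNoteOn(MidiChannel channel, int note, float velocity)
    {
        if (launcherByNote.TryGetValue(note, out GameObject launcher))
            launcher.SendMessage("Fire");   // assumed launcher entry point
    }
}
```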
