About The Project
One of the classes I attended while at CSUMB was an "Advanced Unity Topics" course, which was essentially a semester-long project
that everyone in the class would contribute toward. The idea, pitched by the professor, was to create a multiplayer sword-fighting
game using Axis Neuron Motion-Capture suits and Samsung GearVR Virtual Reality headsets. While more of a tech-demo than a fully
featured game, the project was successfully implemented and was a stunning centerpiece at our university's showcase event.
Unfortunately, I wasn't able to capture any good footage of it in action once everything was set up.
My role in the project was to modify the Axis Neuron suit's Unity API to work with the mobile Android platform we were using. The Android-powered
GearVR is great for staying mobile and untethered from a computer, but its processing power and hardware limitations were
definitely a large hurdle. I was also responsible for creating a procedural terrain generation system, acted as the go-between
for the artists and other programmers, and debugged and integrated the modules and systems from the other teams into the final project.
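
The terrain code itself isn't reproduced here, but the core idea can be sketched with a Perlin-noise heightmap, which is a common approach in Unity. The class, parameter names, and values below are illustrative stand-ins rather than the project's actual implementation.

using UnityEngine;

// A minimal sketch of Perlin-noise-based terrain generation in Unity.
// Parameter names and values are illustrative, not the project's actual settings.
[RequireComponent(typeof(Terrain))]
public class ProceduralTerrain : MonoBehaviour
{
    public int resolution = 129;      // heightmap resolution (must be 2^n + 1)
    public float noiseScale = 0.05f;  // horizontal frequency of the noise
    public float heightScale = 0.3f;  // vertical scale as a fraction of terrain height

    void Start()
    {
        TerrainData data = GetComponent<Terrain>().terrainData;
        data.heightmapResolution = resolution;

        float[,] heights = new float[resolution, resolution];
        for (int y = 0; y < resolution; y++)
        {
            for (int x = 0; x < resolution; x++)
            {
                // Sample 2D Perlin noise; results are already normalized to [0, 1].
                heights[y, x] = Mathf.PerlinNoise(x * noiseScale, y * noiseScale) * heightScale;
            }
        }

        // Apply the generated heightmap to the Terrain component on this GameObject.
        data.SetHeights(0, 0, heights);
    }
}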
Progress

At the end of the semester, we had created a surprisingly impressive demonstration of real-time motion tracking on the Android platform. Both players, dressed in motion-sensor-covered gear, could join the game and use their body and arm movements to naturally slash and poke at each other from fighting distance in-game, and from a safe distance in real life. The one problem we didn't get to solve was handling sword strikes: what should happen when the two virtual swords touch.
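
Detecting the moment the swords touch is simple enough with Unity's trigger colliders; the open question was what should happen next (deflection, recoil, feedback to the players). The snippet below illustrates only the detection side and was not part of the project; it assumes each sword has a trigger collider, a Rigidbody, and a "Sword" tag.

using UnityEngine;

// Illustrative only: trigger-based detection of sword-on-sword contact.
// Assumes each sword object has a Collider marked as a trigger, a Rigidbody,
// and the tag "Sword".
public class SwordContact : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Sword"))
        {
            // Detection is the easy half; deciding what the clash should do
            // (deflect, recoil, give feedback) was the open design question.
            Debug.Log("Swords clashed near " + other.ClosestPoint(transform.position));
        }
    }
}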
How It Works
The suits themselves transmitted their location and motion data to a virtual model that was hidden from the players' view. The "real" textured models would then mimic the animations of that virtual model and, when calibrated to the player's real-world location, would match the player's motions almost perfectly.
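
As a rough sketch of that mirroring idea (the bone pairing and calibration offset below are simplified stand-ins, not the actual Axis Neuron integration):

using UnityEngine;

// A rough sketch of the mirroring setup: each frame the visible, textured model
// copies the pose of a hidden, mocap-driven skeleton. The hierarchy matching and
// calibration offset are simplified stand-ins for what the project actually did.
public class ModelMirror : MonoBehaviour
{
    public Transform hiddenRoot;       // root of the mocap-driven skeleton (not rendered)
    public Transform visibleRoot;      // root of the textured player model
    public Vector3 calibrationOffset;  // aligns the rig with the player's real-world position

    void LateUpdate()
    {
        CopyPose(hiddenRoot, visibleRoot);
        visibleRoot.position = hiddenRoot.position + calibrationOffset;
    }

    // Recursively copy local rotations, assuming both skeletons share the same bone hierarchy.
    void CopyPose(Transform source, Transform target)
    {
        target.localRotation = source.localRotation;
        for (int i = 0; i < source.childCount && i < target.childCount; i++)
        {
            CopyPose(source.GetChild(i), target.GetChild(i));
        }
    }
}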
