NoisePong was the project Jim and I made for our "Play" course.
The goal of the course was to make a game or application with a physical interface. Some teams built projects around webcam tracking, others went multi-touch, and Jim and I used the ReacTIVision platform to create a tangible tabletop surface application.
We wanted to make some kind of musical instrument (in the spirit of the original Reactable) but also introduce a playful physics element to it. And so the concept of NoisePong was born.
The ReacTIVision platform uses tracking markers and below-the-surface projection to build a table like Microsoft Surface, but much cheaper. Their tracking software is open source and freely available, so we made use of it.
Unfortunately, we didn't succeed in building a ReacTIVision table, because the lighting conditions and camera were far from ideal. Instead we built the application against a Java-based ReacTIVision simulator, so it could still be used with the real thing.
The concept is that you place physical, tangible blocks on a surface, and under these blocks colorful squares pop up. Every square has its own function.
The BallCannon (blue square) shoots balls that are influenced by other blocks. It's this influencing of the balls that produces sound.
The Yellow blocks are "bouncers", they bounce balls off their sides and produce a tone with each bounce.
The Purple blocks are what we called "piano keys". They don't influence the path of the balls, they only emit a tone.
The Red and Green pulsating circles are our gravity fields. The Red circles attract, while the Green circles push away. Both modulate a violin/cello soundscape.
And last but not least, the Orange block and the spiral are a Trashcan and a Black Hole; both are used to destroy balls. The Orange block has no gravity field of its own, so it can be placed at the end of a ball loop without interfering with the loop path. The Black Hole is used for rapid destruction of balls.
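To illustrate how blocks "influence" the balls, the attract/repel behaviour of the gravity fields can be sketched as an inverse-square force applied to each ball every frame. This is a minimal plain-Java sketch with hypothetical class names (`Field`, `Ball`), not the actual project code:

```java
// A pulsating circle on the table: strength > 0 attracts (Red),
// strength < 0 pushes away (Green).
class Field {
    double x, y, strength;

    Field(double x, double y, double strength) {
        this.x = x; this.y = y; this.strength = strength;
    }
}

class Ball {
    double x, y, vx, vy;

    Ball(double x, double y, double vx, double vy) {
        this.x = x; this.y = y; this.vx = vx; this.vy = vy;
    }

    // One physics step: accumulate acceleration from every field,
    // then move the ball along its updated velocity.
    void step(Field[] fields, double dt) {
        for (Field f : fields) {
            double dx = f.x - x, dy = f.y - y;
            double d2 = dx * dx + dy * dy + 1e-6; // avoid division by zero
            double d = Math.sqrt(d2);
            double a = f.strength / d2;           // inverse-square falloff
            vx += a * dx / d * dt;                // accelerate toward (or
            vy += a * dy / d * dt;                // away from) the field
        }
        x += vx * dt;
        y += vy * dt;
    }
}
```

In the real application each such velocity change also drives the sound, since it is the balls' movement past the blocks that generates the OSC events.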
The ReacTIVision application (or in our case, the simulator) sends out OSC messages that are picked up by Processing. Processing does all the graphical rendering and in turn sends OSC messages to Pure Data. Pure Data converts these into MIDI signals for Ableton Live, which produces the sound.
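For context, an OSC message on the wire is just an address pattern, a type tag string, and big-endian arguments, each padded to four-byte boundaries. The sketch below shows that layout for a message carrying a single float; the address used in the test is made up, and in practice a library such as oscP5 handles this encoding for you:

```java
import java.io.ByteArrayOutputStream;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

class OscMessage {
    // OSC strings are NUL-terminated and padded to a multiple of 4 bytes.
    static byte[] pad(byte[] s) {
        byte[] out = new byte[(s.length / 4 + 1) * 4];
        System.arraycopy(s, 0, out, 0, s.length);
        return out;
    }

    // Encode an OSC message with one float argument: the address pattern,
    // the type tag string ",f", then a big-endian 32-bit float.
    static byte[] encode(String address, float value) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] addr = pad(address.getBytes(StandardCharsets.US_ASCII));
        out.write(addr, 0, addr.length);
        byte[] tags = pad(",f".getBytes(StandardCharsets.US_ASCII));
        out.write(tags, 0, tags.length);
        byte[] arg = ByteBuffer.allocate(4).putFloat(value).array(); // big-endian by default
        out.write(arg, 0, arg.length);
        return out.toByteArray();
    }
}
```

A packet like this would typically be sent over UDP to the port Pure Data listens on.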
I chose Ableton Live so we could switch between sounds instantly; its flexible MIDI mapping gave us fine control over the sound settings.
Jim and I each programmed about half of the application. I was also in charge of the OSC and MIDI routing and of the sound design itself.