OpenFrameworks Music/Interactive Visualizer

Intro:

This past Friday I was asked to fill in as a “VJ” ( video jockey ) at a local club. Since I have been dabbling quite a bit with openFrameworks recently, creating an interactive experience seemed like a natural fit, with some controls built in so that I could still claim to be a VJ and not just a programmer! The following motion-detection/sound-detection visualizer is what I came up with …

The Beginning

A while ago, while I was learning openFrameworks and building an interactive environment project in school, a friend asked me to create a visualizer to go along with a cello concert he was playing near Toronto. At the time I was experimenting with particle and vector fields, so those tools seemed a natural fit. I came up with the following: the piece tries to engage the audience by using their motion to push the falling particles ( which fall from the cellist playing at the top of the 8-foot stage/projection ).

Another Try

More recently, I was asked to create a music visualizer for a local club that would stretch across three rear-projected screens, as opposed to just the one front projection above. Noticing how slowly the Toronto gig ran, with maybe only 300 particles, I knew I had a lot of optimizing to do! I decided to stick with particle fields, but I also wanted to make the sound detection much clearer, to utilize some OpenGL effects for a more interesting and efficient look, and to build in controls so that someone could drive the visualizer during the performance, since programming a computer to control everything from the sound alone is a difficult and expensive task. I also feel that being part of the live performance in a more visceral sense allows for greater creativity.

The live performance and setup of the three screens at the club

Features

  • camera motion detection is transferred to a vector field, which in turn drives the particles’ motion
  • wandering particles ( the big glowing ones ) wander randomly
  • ribbons are affected by camera motion
  • some basic FFT ( Fast Fourier Transform ) analysis is done on the mic sound to get an overall feel for the volume and to detect beats, which affect the size of the ribbons and wandering particles ( see the first sketch after this list )
  • colour changes are triggered by the user and then blend smoothly between HSB hues, converting RGB to HSB and back again ( see the second sketch after this list )
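
For anyone curious about the FFT part: as mentioned in the comments below, an FFT class by Dominic Mazzoni filled an array of magnitudes each audio frame, and only the average amplitude across all bins was passed on. Here is a minimal sketch of that averaging idea; the names ( magnitudes, NUM_BINS, smoothedLevel ) are illustrative, not the original code.

    // Minimal sketch of the volume-from-FFT idea, not the original code.
    // Assumes an FFT ( the post used Dominic Mazzoni's class ) has already
    // filled `magnitudes` with NUM_BINS magnitude values for the latest
    // mic buffer; all names here are illustrative.
    #include <cstddef>

    static const std::size_t NUM_BINS = 512;
    float magnitudes[NUM_BINS];  // filled by the FFT each audio frame
    float smoothedLevel = 0.0f;  // running "volume" handed to the particles

    void updateLevel() {
        // Average the magnitude across all bins for an overall volume feel.
        float sum = 0.0f;
        for (std::size_t i = 0; i < NUM_BINS; ++i) {
            sum += magnitudes[i];
        }
        float level = sum / NUM_BINS;

        // Smooth between frames so ribbon/particle sizes don't flicker;
        // a sudden jump above the smoothed level reads as a "beat".
        smoothedLevel = 0.9f * smoothedLevel + 0.1f * level;
    }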
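
The hue blending works by converting both colours to HSB, interpolating the hue the short way around the wheel, and converting back. The sketch below is a stand-in that leans on ofColor's built-in HSB helpers ( hue is stored in 0–255 in openFrameworks ) rather than the hand-written RGB/HSB conversion used in the piece; lerpHue is an illustrative name.

    // Minimal sketch of blending between hues the short way around the
    // hue wheel; openFrameworks' ofColor stores hue in 0-255. Uses
    // ofColor's HSB helpers in place of a hand-written conversion.
    #include "ofMain.h"

    ofColor lerpHue(const ofColor& from, const ofColor& to, float pct) {
        float h1 = from.getHue();
        float h2 = to.getHue();

        // Wrap the difference into [-128, 128] to travel the short way.
        float diff = h2 - h1;
        if (diff > 128.0f)  diff -= 256.0f;
        if (diff < -128.0f) diff += 256.0f;

        float hue = fmodf(h1 + diff * pct + 256.0f, 256.0f);
        return ofColor::fromHsb(
            hue,
            ofLerp(from.getSaturation(), to.getSaturation(), pct),
            ofLerp(from.getBrightness(), to.getBrightness(), pct));
    }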

Control

  • ability to create or destroy the “wandering particles”
  • ability to make wandering particles spit out other “fireworks” particles
  • ability to change hues of background
  • ability to toggle on or off the ribbon field

Optimizations

  • Threading: This is huge, as most modern processors have more than one core, and using threads to split up the workload helped quite a bit. I used Rui Madeira’s thread classes to extend certain classes ( http://code.google.com/p/ruicode/downloads/list ). Basically, you put all your updating code into the updateThread() method and call updateOnce() during your main application’s update to keep everything relatively in sync with the main application’s set framerate ( a minimal sketch of this pattern appears after this list ). The classes I put in threads are as follows:
    • RibbonField ( the ribbons in the background )
    • ParticleField ( the “fireworks” particles that flow from the wandering particles )
    • WanderingParticleField
    • opticalFlow
  • Optimizing opticalFlow. This includes:
    • putting it into a thread
    • changing the input camera resolution to 160×120 as opposed to 320×240 or 640×480 ( this makes a huge difference, as there are fewer pixels to process )
    • blurring the ofxCvImages, which not only looks cool but also decreases the number of optical flow points while keeping everything relatively accurate ( see the camera sketch after this list )
  • Using direct OpenGL calls whenever possible, such as calling glBegin( GL_QUAD_STRIP ) to create trails, as opposed to drawing one quad at a time or writing my own function ( see the trail sketch below ).
  • Passing an image by reference to all particles that share a certain PNG ( this is done within the “Field” classes when they create the individual particles; see the shared-image sketch below ).
  • Setting a max on the number of particles on the screen so that when I am controlling it I do not add too many and slow things down to a crawl. In this case the max number was especially important on the fireworks particles as they took quite a bit of processing to look the way they do and often were created together in very large clumps.
  • FBOs ( Frame Buffer Objects ): using ofxFBOTexture I was able to create several FBO textures to lessen communication with the GPU. It also proved useful for tinting the background image, as one can call ofSetColor before drawing an FBO to tint it ( see the FBO sketch below ).
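
Below is a minimal sketch of the worker-thread pattern described in the Threading item. Since Rui Madeira's classes may be hard to find today, it uses openFrameworks' built-in ofThread instead; ThreadedField and updateParticles() are illustrative names, not the project's actual classes.

    // Minimal sketch of the worker-thread pattern the post describes,
    // using openFrameworks' built-in ofThread rather than Rui Madeira's
    // classes; ThreadedField and its members are illustrative names.
    #include "ofMain.h"

    class ThreadedField : public ofThread {
    public:
        void start() { startThread(); }
        void stop()  { stopThread(); waitForThread(); }

        // Heavy per-frame work ( particle integration, etc. ) runs here,
        // off the main/draw thread.
        void threadedFunction() override {
            while (isThreadRunning()) {
                lock();
                updateParticles();   // illustrative: the expensive update
                unlock();
                ofSleepMillis(16);   // roughly track a 60 fps app
            }
        }

        void draw() {
            lock();                  // don't draw while the worker writes
            // ... draw particles ...
            unlock();
        }

    private:
        void updateParticles() { /* ... */ }
    };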
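
Next, a sketch of the camera-side optical flow optimizations: grab small frames and blur them before computing flow. It uses ofVideoGrabber and ofxOpenCv ( setup() is the modern grabber call; older versions used initGrabber() ); the optical flow step itself is left out.

    // Sketch of the optical-flow input optimizations: grab small, then blur.
    #include "ofMain.h"
    #include "ofxOpenCv.h"

    ofVideoGrabber grabber;
    ofxCvColorImage colorImg;
    ofxCvGrayscaleImage grayImg;

    void setupCamera() {
        grabber.setup(160, 120);      // small frames: far fewer pixels to process
        colorImg.allocate(160, 120);
        grayImg.allocate(160, 120);
    }

    void updateCamera() {
        grabber.update();
        if (grabber.isFrameNew()) {
            colorImg.setFromPixels(grabber.getPixels());
            grayImg = colorImg;       // convert to grayscale
            grayImg.blurGaussian(7);  // fewer, steadier flow points
            // ... feed grayImg to the optical-flow step ( run in its own thread ) ...
        }
    }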
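
Here is a sketch of the GL_QUAD_STRIP trail idea: one begin/end pair for the whole trail instead of one quad per segment. drawTrail, points, and halfWidth are illustrative names.

    // Sketch of drawing a whole trail in one GL_QUAD_STRIP call instead of
    // one quad per segment; `points` and `halfWidth` are illustrative.
    #include "ofMain.h"

    void drawTrail(const std::vector<ofVec2f>& points, float halfWidth) {
        if (points.size() < 2) return;

        glBegin(GL_QUAD_STRIP);
        for (std::size_t i = 0; i < points.size(); ++i) {
            // Offset each point perpendicular to the trail direction.
            ofVec2f dir = (i + 1 < points.size())
                            ? points[i + 1] - points[i]
                            : points[i] - points[i - 1];
            dir.normalize();
            ofVec2f normal(-dir.y, dir.x);

            ofVec2f a = points[i] + normal * halfWidth;
            ofVec2f b = points[i] - normal * halfWidth;
            glVertex2f(a.x, a.y);
            glVertex2f(b.x, b.y);
        }
        glEnd();
    }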
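
A sketch of the shared-PNG idea follows: the “Field” owns the image and each particle stores only a pointer to it, so the file is loaded and uploaded to the GPU once. The class and file names here are illustrative.

    // Sketch of sharing one texture among many particles: the Field owns
    // the PNG, each particle keeps only a pointer; names are illustrative.
    #include "ofMain.h"

    class Particle {
    public:
        Particle(ofImage* sharedSprite) : sprite(sharedSprite) {}
        void draw(float x, float y, float size) {
            sprite->draw(x, y, size, size);  // all particles reuse one image
        }
    private:
        ofImage* sprite;                     // not a copy per particle
    };

    class ParticleField {
    public:
        void setup() {
            sprite.load("glow.png");         // loaded once, for the whole field
            for (int i = 0; i < 100; ++i) {
                particles.push_back(Particle(&sprite));
            }
        }
    private:
        ofImage sprite;
        std::vector<Particle> particles;
    };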
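
Finally, a sketch of the FBO tinting trick. It uses the core ofFbo class that later versions of openFrameworks ship with ( the post used the ofxFBOTexture addon, whose API differed slightly ); the idea is the same: cache the background in a texture, then call ofSetColor before drawing it.

    // Sketch of the FBO tinting trick with the core ofFbo ( the post used
    // the ofxFBOTexture addon, whose API differed slightly ).
    #include "ofMain.h"

    ofFbo background;

    void setupFbo() {
        background.allocate(2400, 600, GL_RGBA);
        background.begin();
        ofClear(0, 0, 0, 255);
        // ... draw the background scene once into the FBO ...
        background.end();
    }

    void drawTinted(const ofColor& tint) {
        ofSetColor(tint);    // tint the cached texture on the way out
        background.draw(0, 0);
        ofSetColor(255);     // reset so later drawing isn't tinted
    }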

Equipment

  • 3 rear-projection screens ( approximately 8 x 6 feet  )
  • 3 projectors at 800×600 resolution each
  • 1 Matrox Triple-Head-To-Go to bind all three projectors into one wide ( 2400×600 ) screen
  • 1 MacBook Pro ( 2.4 GHz ) to run everything on stage
  • 1 PS3 Eye camera
  • 1 more laptop, near the DJ station, to control the laptop on stage

Conclusion

This was definitely a great and enjoyable experience, getting everything working with so much equipment. I was a bit worried about getting three projectors to run this piece smoothly, but the optimizations above helped quite a bit! It is a simple example of a visualizer, but I believe that in its simplicity it worked well. The owner of the club liked it, and my friend and girlfriend liked playing with it during the show so much that I never actually controlled it all night, lol. Anyway, I am nowhere close to an expert in openFrameworks, OpenGL, or C++, but if you have any questions please feel free to throw them in the comments section and I’ll do my best to answer them.

Comments

  • Victor

    November 26, 2013 at 10:11 am

    Really nice project and examples

  • Cameron

    December 22, 2013 at 1:45 am

    Could you provide more details for:
“some basic FFT ( Fast Fourier Transform ) analysis is done on the mic sound to get an overall feel for the volume and to detect beats”

    Very well done visualization.

  • Anthony

    December 25, 2013 at 1:18 pm

    Thanks Cameron!

It has been a while since I have looked at this project, but I used an FFT class developed by Dominic Mazzoni for openFrameworks, and I was just passing the average amplitude across all detected frequencies to the particle engine to adjust size. “Detecting beats” is a bit of a misstatement, as many pieces ( such as the track by the featured cellist, Kirk Starkey ) have a beat that is well defined by a volume increase as well.
