ICM Final Concept

Inspired by the way people dance in bars along with music and beats, Yifan and I are considering whether we can control the music and beats with our body language.

And here is another example of dancing with visual changes and music beats.

So basically, our ICM final is:

  • Dance Movement Tracking & Visual Representation
  • Sound Control with Body Movement

That being said, we are going to use a Kinect to capture people’s motion.

We tried to get the raw depth data from the Kinect and draw an ellipse at the average position of all the pixels that are shown. If people move, that average position changes. We compute the distance between the previous ellipse and the current one, map that distance to a larger range, and use it to set the ellipse’s color. So when people move faster, the average position changes faster, the distance between the two ellipses grows faster, and accordingly the color of the ellipse changes faster. Ideally, if we use this change in distance to measure the intensity of people’s movement and feed it to a sketch as an input, we can control the visuals and music with our body movement.
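Our actual sketch runs in Processing with the Kinect library, but the core centroid-distance idea can be sketched in plain Python on synthetic depth frames. All the names and the threshold/scale values below are ours, chosen just for illustration:

```python
# Movement-intensity sketch: track the centroid of "near" depth pixels
# frame to frame, and use the centroid's displacement as a proxy for
# how fast the person is moving. A real sketch would read depth pixels
# from the Kinect instead of these hand-made frames.

def centroid(depth, threshold):
    """Average (x, y) of pixels with depth below `threshold` (smaller = nearer)."""
    xs, ys, n = 0.0, 0.0, 0
    for y, row in enumerate(depth):
        for x, d in enumerate(row):
            if d < threshold:
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None  # nobody in frame
    return (xs / n, ys / n)

def intensity(prev, curr, scale=10.0):
    """Map the centroid displacement to a larger number (e.g. a color value)."""
    if prev is None or curr is None:
        return 0.0
    dx, dy = curr[0] - prev[0], curr[1] - prev[1]
    return scale * (dx * dx + dy * dy) ** 0.5

# Two synthetic 4x4 "depth frames": 0 = near (person), 9 = far (background).
frame_a = [[9, 0, 9, 9],
           [9, 0, 9, 9],
           [9, 9, 9, 9],
           [9, 9, 9, 9]]
frame_b = [[9, 9, 9, 0],
           [9, 9, 9, 0],
           [9, 9, 9, 9],
           [9, 9, 9, 9]]

c_a = centroid(frame_a, threshold=5)   # (1.0, 0.5)
c_b = centroid(frame_b, threshold=5)   # (3.0, 0.5)
print(intensity(c_a, c_b))             # 20.0 — a bigger jump gives a bigger value
```

In the real sketch this intensity value is what we map to the ellipse’s color, and eventually to the visuals and the music.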

Here is a test video.

This is the code:

Challenges:

  1. How can we measure the intensity of people’s movement more accurately?
  2. The sketch part: how do we make the sketch change and move?
  3. The sound: how do we change the beats, or switch to a different song?

