Monday, November 14, 2011

Kinect Option 3: jit.freenect.grab

Using Max/MSP/Jitter and the jit.freenect.grab object we downloaded, images generated by the Kinect can be grabbed, controlled, and manipulated. jit.freenect.grab makes use of the OpenKinect project’s libfreenect library, which grabs frames from a Kinect device. It is developed by Jean-Marc Pelletier and Nenad Popov, with help from Andrew Roth.

What jit.freenect.grab can do:


  • Retrieve RGB images.
  • Retrieve depth maps.
  • Retrieve images from the infrared camera.
  • Retrieve accelerometer readings.
  • Control the Kinect’s tilt motor.
  • Use multiple Kinects simultaneously. (Depth sensors may interfere with each other.)
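The raw depth map the Kinect delivers is 11-bit (values 0-2047), so a common first step before displaying it in Jitter is remapping it to 8-bit grayscale. The sketch below illustrates that remapping in plain Python; it is our own illustration of the general idea, not code from jit.freenect.grab, and the treatment of 2047 as "no reading" is an assumption about the sensor's out-of-range value.

```python
# Illustrative only: scale raw 11-bit Kinect depth values (0-2047)
# to 8-bit grayscale, the kind of remapping you would do with a
# jit.op chain inside a Jitter patch before displaying the depth map.

def depth_to_gray(raw_depth):
    """Map a list of raw 11-bit depth readings to 0-255 grayscale.

    We assume 2047 is the sensor's 'no reading' value and map it
    to 0 (black); everything else is inverted so nearer = brighter.
    """
    gray = []
    for d in raw_depth:
        if d >= 2047:                              # no depth reading here
            gray.append(0)
        else:
            gray.append(255 - (d * 255) // 2047)   # nearer = brighter
    return gray

row = [300, 1000, 2047, 50]
print(depth_to_gray(row))
```

In an actual patch this whole step is a couple of Jitter objects; the point is only that the depth output needs rescaling before it looks like a sensible image.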


An example using jit.freenect.grab:

http://www.youtube.com/embed/wvJKaViF7p0


Kinect Option 2: TUIO - TuioKinect


TuioKinect tracks simple hand gestures using the Kinect controller and sends control data based on the TUIO protocol. This allows the rapid creation of hand-gesture-enabled applications in any platform or environment that supports TUIO. A gesture, in this context, is a tracked hand movement that is then mapped onto something like the movement of a cursor in an interface.
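TUIO reports cursor positions as normalized coordinates in the range 0.0-1.0, so the "hand movement to cursor" mapping described above boils down to scaling those values onto the client's own screen resolution. A minimal sketch, assuming a 1280x720 display (the function name and defaults are ours, not part of any TUIO library):

```python
# Sketch of the cursor mapping described above: TUIO delivers
# normalized (0.0-1.0) coordinates, and the receiving application
# scales them to its own resolution.

def tuio_to_screen(nx, ny, width=1280, height=720):
    """Convert a normalized TUIO cursor position to pixel coordinates."""
    x = round(nx * (width - 1))
    y = round(ny * (height - 1))
    return x, y

print(tuio_to_screen(0.5, 0.5))   # centre of a 1280x720 screen
```

Any TUIO client library does this conversion for you; it is shown here only to make the protocol's coordinate convention concrete.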



An example using TuioKinect

http://www.youtube.com/embed/vZSEEnMP6pg


Kinect Option 1: Synapse

Synapse is an application we were able to download from the Internet that works on Mac and Windows. It easily allows us to use the Kinect to control Max/MSP/Jitter and any other application that can receive OSC events. It sends joint positions and hit events via OSC, giving us the ability to track different gestures.


Synapse takes information from the camera and binds a skeleton to the body when the user stands in front of the Kinect in the "psi" pose. It then sends OSC messages out on ports 12345 and 12347, and receives OSC messages on port 12346.


The messages that Synapse sends are:


/tracking_skeleton - Sent with 1 when Synapse starts tracking a skeleton, and 0 when it loses it.

/<joint>_pos_world - The x, y, z position of the joint in world space, in millimeters.

/<joint>_pos_body - The x, y, z position of the joint relative to the torso, in millimeters.

/<joint>_pos_screen - The x, y, z position of the joint as projected onto the screen, in pixels.

/<joint> [up, down, left, right, forward, back] - Sent when Synapse detects a "hit" event, such as a punch forward, which would be "righthand forward".

/<joint>_requiredlength - The current tuning value of how far you must move in a straight line (in mm) before a hit event is generated.

/<joint>_pointhistorysize - The current tuning value of how many points are being tracked for hit event detection.
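To show what these messages look like on the wire, here is a minimal hand-rolled OSC decoder (with a matching encoder used only to build a test packet). In practice you would point an OSC library or Max's udpreceive at port 12345 rather than parse bytes yourself; this sketch just makes the packet layout concrete, and the example address assumes the right-hand joint.

```python
import struct

# Minimal sketch of the OSC wire format for a Synapse-style packet,
# e.g. "/righthand_pos_world" followed by three floats. Real projects
# would use an OSC library; this only illustrates the layout.

def _read_padded_string(data, offset):
    """Read a null-terminated OSC string padded to a 4-byte boundary."""
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    offset = end + 1
    offset += (-offset) % 4          # skip padding to the next multiple of 4
    return s, offset

def decode_osc(data):
    """Decode one OSC message containing only float32 arguments."""
    address, offset = _read_padded_string(data, 0)
    typetags, offset = _read_padded_string(data, offset)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "f":
            (value,) = struct.unpack_from(">f", data, offset)
            args.append(value)
            offset += 4
    return address, args

def encode_osc(address, floats):
    """Encode a float-only OSC message (mirror of decode_osc, for testing)."""
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)   # null-terminate + pad to 4
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(floats)).encode("ascii"))
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

packet = encode_osc("/righthand_pos_world", [120.0, -45.5, 900.0])
print(decode_osc(packet))
```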


The following is a list of the joints tracked on the skeleton that is mapped to the user's body:


  • righthand
  • lefthand
  • rightelbow
  • leftelbow
  • rightfoot
  • leftfoot
  • rightknee
  • leftknee
  • head
  • torso


The following is a list of valid messages to send to Synapse:

/<joint>_gettuninginfo - This causes the <joint>_requiredlength and <joint>_pointhistorysize messages to be sent from Synapse, so that you can see what the current values are.

/<joint>_requiredlength - Use this to change how far you must move this joint in a direction (in mm) to trigger a hit event. Defaults to 150.

/<joint>_pointhistorysize - Use this to change how many points are tracked for hit event detection. This is essentially a control for how fast you must move your hand to cause a hit event; lower means you must do it faster. Defaults to 5.

/<joint>_trackjointpos - This is the keepalive that causes joint positions to continue being sent out. Valid values to pass are: 1, to get _pos_body positions; 2, to get _pos_world positions; and 3, to get _pos_screen positions.

/depth_mode - This allows you to cut the background out of the depth image and see only the user. Valid values to pass are: 0 to see the whole depth buffer, 1 to see only the tracked user (or all person-shaped things if no one is tracked), 2 to see all person-shaped things even if a user is tracked.
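Since these control messages go to Synapse's listening port (12346), here is a sketch of sending the keepalive over UDP with only the standard library. The single-int encoder is hand-rolled for illustration, and the choice of right hand and world-space mode (value 2) is just an example; normally an OSC library or Max's udpsend would do this, and the keepalive has to be re-sent periodically or positions stop flowing.

```python
import socket
import struct

# Sketch of sending a Synapse control message over UDP. Synapse
# listens on port 12346; this packet is the keepalive asking for
# world-space positions (value 2) of the right hand.

def encode_osc_int(address, value):
    """Encode an OSC message with a single int32 argument."""
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)   # null-terminate + pad to 4
    return pad(address.encode("ascii")) + pad(b",i") + struct.pack(">i", value)

packet = encode_osc_int("/righthand_trackjointpos", 2)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 12346))   # Synapse's listening port
sock.close()
```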


An example of Synapse being used with Ableton Live to create music:

http://www.youtube.com/embed/1ge7RcIBWsc


Wednesday, November 9, 2011

First iteration of Flowchart



The Gestures



The following are possible gestures users might undertake when interacting with the project.

The format is: Gesture - Corresponding Response from Project

Point/Circle - Select/pick an option from a list of choices. Example: selecting a character to speak from a set of possibilities
Flip Through - Moving between content
Open Curtain - Further explains the current content the user is viewing
Wipe - Erases unrelated words/definitions during a game
Punch/Kick - Hits characters
Sparkling Hands - Shows tweets that were essential in the rise of the revolution
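Once a recognizer (Synapse hit events, TuioKinect cursors, etc.) reports a gesture by name, the table above becomes a simple dispatch table. The sketch below is our own illustration of that wiring; every name in it is invented here to mirror the table, not taken from any library.

```python
# Gesture -> response dispatch table mirroring the list above.
# All identifiers are ours, chosen for illustration only.

RESPONSES = {
    "point":           "select an option from a list of choices",
    "flip_through":    "move between content",
    "open_curtain":    "further explain the current content",
    "wipe":            "erase unrelated words/definitions during a game",
    "punch":           "hit a character",
    "kick":            "hit a character",
    "sparkling_hands": "show tweets essential to the rise of the revolution",
}

def respond(gesture):
    """Look up the project's response to a recognized gesture."""
    return RESPONSES.get(gesture, "ignore unrecognized gesture")

print(respond("wipe"))
```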

Sketches










The Revolution of Education: Project Concept

The people in Egypt can be divided into groups and subgroups as follows. The first group is the Educated. People within this category vary in social status, but with this variance comes a level of political awareness. The second category is Uneducated Egyptians. Very few of those who fall within this category are politically aware. Most of these individuals’ main concern is providing food and shelter for their families, so following the latest news concerning the politics of the nation is of least importance. Due to their lack of education, concepts such as “democracy” and “freedom of speech” are also not fully understood.

The objective, in this case, is to educate the Uneducated individuals so that they not only become aware but also better grasp notions such as “freedom” and “democracy”. We also aim to include background information on the ex-President, Hosni Mubarak, and his dictatorial regime. We hope to explain the way in which the Government was corrupt, how the police force neglected basic human rights, why a great percentage of the population lives below the poverty line, and where the nation’s money disappeared. We also intend to explain the transitional situation Egypt is currently in, the importance of elections, and why people should vote.

The end goal is to mold both the Educated but politically unaware and the Uneducated, politically unaware individuals so that they are no longer “followers”. We plan to push people to critically analyze situations and ideas, so that they no longer follow “leaders” they assume have knowledge.

We would like to note, however, that since this piece would be implemented in Toronto, Canada, the manner in which the interaction takes place, as well as the information displayed, would be tailored to the Canadian context.