FAAST Website and Download Link

Evan A. Suma

Dec 19, 2010, 3:08:18 PM
to openn...@googlegroups.com
Hello everyone,

We have officially launched the project website for the Flexible Action
and Articulated Skeleton Toolkit (FAAST), the software we have used to
drive the applications shown in several of the videos that I have
recently sent out to this email list.

FAAST is middleware to facilitate integration of full-body control with
games and VR applications using OpenNI-compliant depth sensors
(currently the PrimeSensor and the Microsoft Kinect). The toolkit
incorporates a custom VRPN server to stream the user's skeleton over a
network, allowing VR applications to read the skeletal joints as
trackers using any VRPN client. FAAST can also emulate keyboard input
triggered by body posture and specific gestures. This allows the user to add
custom body-based control mechanisms to existing off-the-shelf games
that do not provide official support for depth sensors.
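
To give a rough idea of how client code consumes the stream, here is a
minimal VRPN client sketch. The device name "Tracker0@localhost" and the
idea that each joint maps to a sensor index are assumptions about FAAST's
defaults, not confirmed values; check the website for the actual names.

    // Minimal VRPN client sketch. "Tracker0@localhost" and the per-joint
    // sensor numbering are assumptions about FAAST's defaults.
    #include <vrpn_Tracker.h>
    #include <cstdio>

    // Called once per joint update; t.sensor identifies the joint,
    // t.pos holds its position and t.quat its orientation.
    void VRPN_CALLBACK handleTracker(void* userData, const vrpn_TRACKERCB t)
    {
        printf("sensor %d: pos = (%f, %f, %f)\n",
               (int)t.sensor, t.pos[0], t.pos[1], t.pos[2]);
    }

    int main()
    {
        // Connect to the skeleton tracker exposed by the FAAST VRPN server.
        vrpn_Tracker_Remote tracker("Tracker0@localhost");
        tracker.register_change_handler(NULL, handleTracker);

        // Pump the VRPN connection; in a real application this would sit
        // in the render/update loop rather than a busy loop.
        while (true)
            tracker.mainloop();

        return 0;
    }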

FAAST is free to use and distribute for research and noncommercial
purposes (for commercial uses, please contact us). If you use FAAST to
support your research project, we request that any publications
resulting from the use of this software include a reference to the
toolkit (a tech report will be posted on the website within the next
week or so). Additionally, please send us an email about your project,
so we can compile a list of projects that use FAAST. This will help
us pursue funding to maintain the software and add new functionality.

The preliminary version of FAAST is currently available for Windows
only. We are preparing to release the code as an open-source project,
and we plan to develop a Linux port in the near future.

FAAST can be downloaded at:
http://people.ict.usc.edu/~suma/faast/

Best regards,
Evan

--
Evan A. Suma, Ph.D.
Postdoctoral Research Associate
Institute for Creative Technologies
University of Southern California
www.evansuma.com


Vanni Rizzo

Dec 19, 2010, 3:38:33 PM
to openn...@googlegroups.com
Thank you very much! It seems like a very useful project!
I'll try it tomorrow!

--
Vanni G. Rizzo
 → mobile: +39 329 3247242
 → home: +39 0110866799
 → web: www.rivict.com

Darien

Dec 19, 2010, 6:37:50 PM
to OpenNI
Great stuff.

Can't wait for mouse support!

-Darien

Zachary Lipps

Dec 20, 2010, 3:20:07 AM
to OpenNI
Here is an example of a game that incorporates FAAST to track the
user's position: http://www.youtube.com/watch?v=RCvNvo-6FlM

wayfarer_boy

Dec 20, 2010, 8:40:29 AM
to OpenNI
Congratulations, Evan. Great news about the source code (I'll
consider my email answered!)

On Dec 20, 8:20 am, Zachary Lipps <zlip...@gmail.com> wrote:
> Here is an example of a game that incorporates FAAST to track the
> user's position: http://www.youtube.com/watch?v=RCvNvo-6FlM

SyRenity

Dec 20, 2010, 2:54:21 PM
to OpenNI
Hi.

Looks very interesting, but can you clarify what exactly FAAST adds
on top of NITE?

I.e., why not work with NITE directly?

Thanks.

Evan A. Suma

Dec 20, 2010, 2:54:16 PM
to likeBVH, openn...@googlegroups.com
Hi Jim (cc'd to the OpenNI group since it may be of interest to them),

For the virtual human skinned mesh, I am only using joint positions
reported from OpenNI. I'm then calculating joint orientations on the
client-side based on the angles between the joint vectors.

Here's a snippet of sample code (this is all rendered in OpenSceneGraph):

osg::Vec3d torso(torsoTracker->x, -torsoTracker->z, torsoTracker->y);
osg::Vec3d head(headTracker->x, -headTracker->z, headTracker->y);
osg::Vec3d neck(neckTracker->x, -neckTracker->z, neckTracker->y);
osg::Vec3d leftShoulder(leftShoulderTracker->x, -leftShoulderTracker->z, leftShoulderTracker->y);
osg::Vec3d leftElbow(leftElbowTracker->x, -leftElbowTracker->z, leftElbowTracker->y);
osg::Vec3d leftHand(leftHandTracker->x, -leftHandTracker->z, leftHandTracker->y);
osg::Vec3d rightShoulder(rightShoulderTracker->x, -rightShoulderTracker->z, rightShoulderTracker->y);
osg::Vec3d rightElbow(rightElbowTracker->x, -rightElbowTracker->z, rightElbowTracker->y);
osg::Vec3d rightHand(rightHandTracker->x, -rightHandTracker->z, rightHandTracker->y);
osg::Vec3d leftHip(leftHipTracker->x, -leftHipTracker->z, leftHipTracker->y);
osg::Vec3d leftKnee(leftKneeTracker->x, -leftKneeTracker->z, leftKneeTracker->y);
osg::Vec3d leftFoot(leftFootTracker->x, -leftFootTracker->z, leftFootTracker->y);
osg::Vec3d rightHip(rightHipTracker->x, -rightHipTracker->z, rightHipTracker->y);
osg::Vec3d rightKnee(rightKneeTracker->x, -rightKneeTracker->z, rightKneeTracker->y);
osg::Vec3d rightFoot(rightFootTracker->x, -rightFootTracker->z, rightFootTracker->y);


// torso
osg::Vec3d shoulderVector = rightShoulder - leftShoulder;
shoulderVector[1] = 0.0;
shoulderVector.normalize();
osg::Quat torsoYawRotation;
{
    osg::Vec3d baseVector = osg::Vec3d(1.0, 0.0, 0.0);
    baseVector.normalize();
    osg::Vec3d axis = baseVector ^ shoulderVector;
    torsoYawRotation = osg::Quat(axis[0], axis[1], axis[2],
        1.0 + (baseVector * shoulderVector));
}

osg::Vec3d spinePitchVector = neck - torso;
spinePitchVector[0] = 0.0;
spinePitchVector.normalize();
osg::Quat torsoPitchRotation;
{
    osg::Vec3d baseVector = osg::Vec3d(0.0, -1.0, 0.0);
    baseVector.normalize();
    osg::Vec3d axis = baseVector ^ spinePitchVector;
    torsoPitchRotation = osg::Quat(axis[0], axis[1], axis[2],
        1.0 + (baseVector * spinePitchVector));
}

osg::Vec3d spineRollVector = neck - torso;
spineRollVector[2] = 0.0;
spineRollVector.normalize();
osg::Quat torsoRollRotation;
{
    osg::Vec3d baseVector = osg::Vec3d(0.0, -1.0, 0.0);
    baseVector.normalize();
    osg::Vec3d axis = baseVector ^ spineRollVector;
    torsoRollRotation = osg::Quat(axis[0], axis[1], axis[2],
        1.0 + (baseVector * spineRollVector));
}

controller->setJointOrientation(SkeletonController::TORSO,
    torsoYawRotation * torsoPitchRotation * torsoRollRotation);
controller->setJointPosition(SkeletonController::TORSO, (rightHip + leftHip) / -2.0);


// left upper arm
osg::Vec3d leftUpperArmVector = leftElbow - leftShoulder;
leftUpperArmVector.normalize();
osg::Quat leftUpperArmRotation;
{
    osg::Vec3d baseVector = osg::Vec3d(-1.0, 0.0, 0.0);
    baseVector.normalize();
    osg::Vec3d axis = baseVector ^ leftUpperArmVector;
    leftUpperArmRotation = osg::Quat(axis[0], axis[1], axis[2],
        1.0 + (baseVector * leftUpperArmVector));

    controller->setJointOrientation(SkeletonController::LEFT_SHOULDER, leftUpperArmRotation);
    controller->setJointPosition(SkeletonController::LEFT_SHOULDER, -leftShoulder);
}

// left lower arm
osg::Vec3d leftLowerArmVector = leftHand - leftElbow;
leftLowerArmVector.normalize();
osg::Quat leftLowerArmRotation;
{
    osg::Vec3d baseVector = osg::Vec3d(-1.0, 0.0, 0.0);
    baseVector.normalize();
    osg::Vec3d axis = baseVector ^ leftLowerArmVector;
    leftLowerArmRotation = osg::Quat(axis[0], axis[1], axis[2],
        1.0 + (baseVector * leftLowerArmVector));

    controller->setJointOrientation(SkeletonController::LEFT_ELBOW, leftLowerArmRotation);
    controller->setJointPosition(SkeletonController::LEFT_ELBOW, -leftElbow);

    controller->setJointOrientation(SkeletonController::LEFT_WRIST, leftLowerArmRotation);
    controller->setJointPosition(SkeletonController::LEFT_WRIST, -leftHand);
}

// right upper arm
osg::Vec3d rightUpperArmVector = rightElbow - rightShoulder;
rightUpperArmVector.normalize();
osg::Quat rightUpperArmRotation;
{
    osg::Vec3d baseVector = osg::Vec3d(1.0, 0.0, 0.0);
    baseVector.normalize();
    osg::Vec3d axis = baseVector ^ rightUpperArmVector;
    rightUpperArmRotation = osg::Quat(axis[0], axis[1], axis[2],
        1.0 + (baseVector * rightUpperArmVector));

    controller->setJointOrientation(SkeletonController::RIGHT_SHOULDER, rightUpperArmRotation);
    controller->setJointPosition(SkeletonController::RIGHT_SHOULDER, -rightShoulder);
}

// right lower arm
osg::Vec3d rightLowerArmVector = rightHand - rightElbow;
rightLowerArmVector.normalize();
osg::Quat rightLowerArmRotation;
{
    osg::Vec3d baseVector = osg::Vec3d(1.0, 0.0, 0.0);
    baseVector.normalize();
    osg::Vec3d axis = baseVector ^ rightLowerArmVector;
    rightLowerArmRotation = osg::Quat(axis[0], axis[1], axis[2],
        1.0 + (baseVector * rightLowerArmVector));

    controller->setJointOrientation(SkeletonController::RIGHT_ELBOW, rightLowerArmRotation);
    controller->setJointPosition(SkeletonController::RIGHT_ELBOW, -rightElbow);

    controller->setJointOrientation(SkeletonController::RIGHT_WRIST, rightLowerArmRotation);
    controller->setJointPosition(SkeletonController::RIGHT_WRIST, -rightHand);
}


// left upper leg
osg::Vec3d leftUpperLegVector = leftKnee - leftHip;
leftUpperLegVector.normalize();
osg::Quat leftUpperLegRotation;
{
    osg::Vec3d baseVector = osg::Vec3d(0.0, 1.0, 0.0);
    osg::Vec3d axis = baseVector ^ leftUpperLegVector;
    leftUpperLegRotation = osg::Quat(axis[0], axis[1], axis[2],
        1.0 + (baseVector * leftUpperLegVector));

    controller->setJointOrientation(SkeletonController::LEFT_HIP, leftUpperLegRotation);
    controller->setJointPosition(SkeletonController::LEFT_HIP, -leftHip);
}

// left lower leg
osg::Vec3d leftLowerLegVector = leftFoot - leftKnee;
leftLowerLegVector.normalize();
osg::Quat leftLowerLegRotation;
{
    osg::Vec3d baseVector = osg::Vec3d(0.0, 1.0, 0.0);
    baseVector.normalize();
    osg::Vec3d axis = baseVector ^ leftLowerLegVector;
    leftLowerLegRotation = osg::Quat(axis[0], axis[1], axis[2],
        1.0 + (baseVector * leftLowerLegVector));

    controller->setJointOrientation(SkeletonController::LEFT_KNEE, leftLowerLegRotation);
    controller->setJointPosition(SkeletonController::LEFT_KNEE, -leftKnee);
}

// right upper leg
osg::Vec3d rightUpperLegVector = rightKnee - rightHip;
rightUpperLegVector.normalize();
osg::Quat rightUpperLegRotation;
{
    osg::Vec3d baseVector = osg::Vec3d(0.0, 1.0, 0.0);
    osg::Vec3d axis = baseVector ^ rightUpperLegVector;
    rightUpperLegRotation = osg::Quat(axis[0], axis[1], axis[2],
        1.0 + (baseVector * rightUpperLegVector));

    controller->setJointOrientation(SkeletonController::RIGHT_HIP, rightUpperLegRotation);
    controller->setJointPosition(SkeletonController::RIGHT_HIP, -rightHip);
}

// right lower leg
osg::Vec3d rightLowerLegVector = rightFoot - rightKnee;
rightLowerLegVector.normalize();
osg::Quat rightLowerLegRotation;
{
    osg::Vec3d baseVector = osg::Vec3d(0.0, 1.0, 0.0);
    baseVector.normalize();
    osg::Vec3d axis = baseVector ^ rightLowerLegVector;
    rightLowerLegRotation = osg::Quat(axis[0], axis[1], axis[2],
        1.0 + (baseVector * rightLowerLegVector));

    controller->setJointOrientation(SkeletonController::RIGHT_KNEE, rightLowerLegRotation);
    controller->setJointPosition(SkeletonController::RIGHT_KNEE, -rightKnee);
}
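
A note on the repeated pattern above: using the cross product of the bone's
rest direction and its current direction as the quaternion's vector part, and
1 plus their dot product as the scalar part, is the standard shortest-arc
construction; once normalized, it rotates the rest direction onto the current
one. OSG exposes the same operation as osg::Quat::makeRotate(from, to), so a
small helper could factor out the repetition (boneRotation is just an
illustrative name, not part of FAAST or the SkeletonController API):

    // Shortest-arc rotation taking a bone's rest direction onto its current
    // direction. Equivalent to the (cross, 1 + dot) construction used above,
    // but returned as a normalized quaternion.
    osg::Quat boneRotation(const osg::Vec3d& restDir, const osg::Vec3d& boneVector)
    {
        osg::Quat q;
        q.makeRotate(restDir, boneVector);   // normalizes internally
        return q;
    }

    // Example: left upper arm, rest direction pointing down the -X axis.
    // osg::Quat leftUpperArmRotation =
    //     boneRotation(osg::Vec3d(-1.0, 0.0, 0.0), leftElbow - leftShoulder);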

Cheers,
Evan

--
Evan A. Suma, Ph.D.
Postdoctoral Research Associate
Institute for Creative Technologies
University of Southern California
www.evansuma.com


On 12/20/2010 6:00 AM, likeBVH wrote:
> Hi Evan, very interested to see whether you use joint positions or
> orientations in driving the skinned mesh.
>
> Thanks for sharing the code. JIM

Evan A. Suma

Dec 20, 2010, 3:11:07 PM
to openn...@googlegroups.com
It depends on what your goal is.

If you want to control off-the-shelf games using poses and gestures that
can be custom-configured on the fly, then FAAST provides that
functionality. This is of great interest to our collaborators in motor
rehabilitation, who want to provide the capability for a therapist to
custom-tailor a game to fit an individual patient's therapy needs -
obviously, these practitioners are not programmers! For this to be
useful in practice, though, we need to add a lot more high-level
gestures, and perhaps provide the capability for gesture training and
classification.

For accessing the skeleton joints, FAAST's VRPN server is more of a
convenient delivery system for the skeleton data from OpenNI/NITE than
added functionality. It allows you to have a client/server
architecture, running the server and sensor on one machine and the
client on another, or perhaps many clients, running on different
platforms (VRPN has bindings for Java, Python, etc.). In the VR
community, many of our applications and engines are already built on top
of VRPN for interfacing with motion tracking systems, so this seamlessly
fits into the existing pipeline for VR researchers.

We are currently working on adding new functionality beyond what NITE
can do, for example, real-time face/head tracking, but that's future work.

- Evan

likeBVH

Dec 21, 2010, 3:56:18 AM
to OpenNI
Hi Evan,
I am trying to capture BVH motion capture data directly from the NITE
skeletal data.
Two kinds of data are needed for BVH:
the position offsets of the joints with respect to the center of mass,
and the rotations along the 3 axes for each joint.

I am trying to figure out whether I can directly use the joint
orientations from the NITE skeletal data, or whether I need to convert
the direction vectors of two adjacent joints into the rotation data
needed to construct a BVH file.

Question
a) I noticed that both you and the MikuMikuDance demo are converting
direction vectors into rotation matrices to drive the avatar, whereas
the Sinbad example is using the joint orientation data directly.

I wonder under what situations it is better to use the first or
the second method. What are the advantages/disadvantages?

My experience: I could perform an arm-rolling action in front of the body
with the MikuMiku program but not with the Ogre Sinbad demo. I wonder if
there is a use of IK that makes it possible.

Does anyone have feedback?

Does anyone have a link to OpenGL code for converting direction vectors
into a rotation matrix for driving an avatar?
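
On that last question, one common approach (a sketch only, not FAAST code;
buildBoneMatrix is an illustrative name) is to compute the shortest-arc
rotation between a bone's rest direction and its measured direction vector,
then hand the resulting matrix to OpenGL:

    // Illustrative sketch: rotation matrix that takes a bone's rest
    // direction onto its current direction, usable with legacy OpenGL.
    #include <osg/Matrixd>
    #include <osg/Vec3d>

    osg::Matrixd buildBoneMatrix(const osg::Vec3d& restDir, const osg::Vec3d& currentDir)
    {
        // Shortest-arc rotation between the two direction vectors.
        return osg::Matrixd::rotate(restDir, currentDir);
    }

    // Usage with fixed-function OpenGL:
    //   osg::Matrixd m = buildBoneMatrix(osg::Vec3d(0.0, 1.0, 0.0), knee - hip);
    //   glMultMatrixd(m.ptr());

Note that a single direction vector leaves the twist about the bone's own
axis undetermined, which may be why arm rolling behaves differently across
the demos.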