Hand Tracking Orientation
I'm trying to get the rotation of some hand gestures. For example, I want to detect a Thumb Up or a Thumb Down. I'm trying to make sense of the Euler or quaternion angles, but the values seem to be the same whether the hand is upside down or not. Is there a way to report the orientation of a gesture when it is completely upside down?
var data = { descriptor: null }; // holds the most recently detected gesture

function response(type) {
    data.descriptor = type;
}
script.tracker.registerDescriptorStart("thumb", response);

var event = script.createEvent("UpdateEvent");
event.bind(function () {
    if (!script.tracker.isTracking()) return;
    switch (data.descriptor) {
        case "thumb": gestureThumb(); break;
    }
});

function gestureThumb() {
    var transform = script.tracker.getTransform();
    // Remap the quaternion z component from [1, -1] to [0, 360] degrees
    var rot = (1 - transform.getLocalRotation().z) * 180;
    print(transform.getLocalRotation().toEulerAngles());
}
The local orientation appears to change between the left and right hands. If I hold my thumb up on my left hand, I get an Euler Z of about 257; with my right hand up I get a Z of about 77. Turning my left hand down gives roughly 77, and my right hand down gives roughly 257. I've also tried reading world rotation coordinates instead of local ones.
Should I create a tracker for upper body? Would that help me determine if my left or right hand is up and I can rotate the coordinate system from there?
Do you think it has to do with what object your tracker object is parented to? That is, do you see the same behavior if you check world rotation instead of local rotation? Since an image will orient to the hand as it's tracked, you should definitely be able to retrieve its rotation, but the value you receive from getLocalRotation() will of course depend on how its parent object is behaving. It might be worth attaching an image to the tracked object and checking that image's world rotation...?
It would be nice if someone had an answer. I'm also curious how to do this :D
Cheers,
SirQu3ntin
Interesting point, Michael. It doesn't look like the Hand Tracker is parented to the Upper Body tracker, but I did manage to come up with this method, which I think works...
So for this example I create an Upper Body Tracker and pass its WristLeft and WristRight joints into the script. Then I take the Hand Tracker position, and since it only tracks one hand, I measure its distance to each wrist. Whichever wrist is closer lets me assume it's the left or the right hand.
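A minimal sketch of that wrist-distance check, written as a pure function so the geometry is easy to follow. The function and parameter names are mine, not from the post; in Lens Studio the three positions would come from each tracked object's getTransform().getWorldPosition():

```javascript
// Euclidean distance between two {x, y, z} world positions.
function dist(a, b) {
    var dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// Hypothetical helper: the tracked hand is assumed to belong to
// whichever wrist it is closer to in world space.
function detectHandSide(handPos, wristLeftPos, wristRightPos) {
    return dist(handPos, wristLeftPos) <= dist(handPos, wristRightPos)
        ? "left"
        : "right";
}
```

Once the side is known, the Euler Z readings from the original snippet can be interpreted per hand.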
I'm also noticing it's taking a bit more CPU power to calculate this, so I'm going to try and delete some of the Upper Body trackers to reduce the load.
For a feature request it would be nice to detect both hands and populate them as an array. This would more closely replicate the Upper Body Left/Right detections.
Not sure if this is the best way to do it, but I'm open to suggestions!
Oh, nice one, Brent. I finally understood the behavior you were explaining after trying some things in a project where I'm tracking an open hand. You're of course right that thumb up/down doesn't offer an easy way to get orientation. One simple approach that seems to work for me is to add two hand trackers: one set to track the hand center, and one set to track the thumb joint. That way you can simply compare world y positions to know whether the thumb is above or below the hand center, giving you thumbs up/down regardless of which hand (when combined with the gesture recognition, of course).
Not sure if the double object tracker would be considered good practice officially, or what performance implications might be, but it does seem to work. Obviously if your solution is doing it for you, this may not be so useful, but anyway:
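A sketch of that comparison, under the assumption described above (two trackers, one on the hand center and one on the thumb joint). The function name and the {x, y, z} position arguments are illustrative; in Lens Studio each position would come from the corresponding tracker's getTransform().getWorldPosition():

```javascript
// Hypothetical helper: with one tracker on the thumb joint and one on
// the hand center, the thumb sitting above the hand center in world
// space indicates a thumbs up, and below indicates a thumbs down.
function isThumbUp(thumbPos, handCenterPos) {
    return thumbPos.y > handCenterPos.y;
}
```

Because the check uses world y positions, it gives the same answer for either hand, which is what makes it work where the local rotation readings did not.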
Thanks Michael, that works much more consistently than my attempt. Appreciate the collaboration.