Kinect (Simple And Complex) Gesture Recognition Plugin

Official forum for open source FreePIE discussion and development.
LordAshes
One Eyed Hopeful
Posts: 24
Joined: Sat Jan 27, 2018 10:27 pm

Kinect (Simple And Complex) Gesture Recognition Plugin

Post by LordAshes »

I have been looking to see if there is any forum or repository dedicated to FreePIE plugins, but I have not found any (beyond the core plugins). As such, I am posting a link to my plugin on the forum for others to enjoy if they wish. If there is a more appropriate place to put this, please let me know.

KINECT (Simple And Complex) GESTURE RECOGNITION PLUGIN for FreePIE:

This plugin takes most of the hard work out of Kinect gesture recognition. It allows the user to define gestures in terms of steps, with relationship conditions for success and failure at each step. The gesture definitions can be loaded automatically on startup (using the plugin settings), loaded from a file using a method in the script, or created dynamically at runtime. Once the gestures are defined, the script only needs to subscribe to the update event and read (and probably act on) the recognized gestures when the update event fires.

GitHub Repository Link:

https://github.com/LordAshes/KinectGest ... in4FreePie

Features:

- Gestures can include references to the following Joints: AnkleLeft, AnkleRight, ElbowLeft, ElbowRight, FootLeft, FootRight, HandLeft, HandRight, HipCenter, HipLeft, HipRight, KneeLeft, KneeRight, ShoulderCenter, ShoulderLeft, ShoulderRight, Spine, WristLeft, and WristRight
- Gestures can use the following relationships: Above, Behind, Below, InfrontOf, LeftOf, RightOf, XChange, YChange, ZChange
- Gestures can compare joints to other joints or to predefined static reference points
- Can load gestures from XML file or create them dynamically during runtime
- Can load gestures from XML file automatically on startup (using plugin settings)
- Gesture recognition by player id allows script to determine which player generated the gesture
- Can choose to subscribe to recognized gestures only or to processing events which include partially completed gestures
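
As a rough illustration of the definition format described above, a two-step raise-hand gesture might look something like the sketch below (check the included Gesture.xml and the documentation for the exact element names and deviation units; the values here are only placeholders):

Code:

<?xml version="1.0" encoding="utf-8"?>
<ArrayOfGestureSequences xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <GestureSequences>
    <gesture>RaiseRightHand</gesture>
    <timeout>5000</timeout>
    <steps>
      <GestureSequenceStep>
        <SuccessConditions>
          <JointRelationship>
            <actor>HandRight</actor>
            <relation>Below</relation>
            <relative>ShoulderRight</relative>
            <deviation>0</deviation>
          </JointRelationship>
        </SuccessConditions>
      </GestureSequenceStep>
      <GestureSequenceStep>
        <SuccessConditions>
          <JointRelationship>
            <actor>HandRight</actor>
            <relation>Above</relation>
            <relative>ShoulderRight</relative>
            <deviation>0</deviation>
          </JointRelationship>
        </SuccessConditions>
      </GestureSequenceStep>
    </steps>
  </GestureSequences>
</ArrayOfGestureSequences>

Each GestureSequences entry is one gesture; each GestureSequenceStep must have all of its success conditions met (and none of its failure conditions met) before the plugin moves on to the next step.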
Jabberwock
Cross Eyed!
Posts: 197
Joined: Mon Mar 02, 2015 3:58 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by Jabberwock »

Sounds fantastic! I will test it as soon as I have some time...
Jabberwock
Cross Eyed!
Posts: 197
Joined: Mon Mar 02, 2015 3:58 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by Jabberwock »

Hmm, my Visual Studio installation seems to be shot... I am afraid this will put off the tests for longer.
Jabberwock
Cross Eyed!
Posts: 197
Joined: Mon Mar 02, 2015 3:58 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by Jabberwock »

Somehow I got it compiled and it seems to work. I was trying out the example scripts and I got up to the fifth step of the wave, so I would call it a success :) The other script works as well, just a minor issue: the default value for the config file seems to be 'Gexture.xml', while in the package it is Gesture.xml. But FreePIE nicely reports the problem, so it should be easy to figure out.

I will now try to do my own config (which, incidentally, I tried to do on my own just a week ago): this will be a set of punches/kicks in four directions for KickBeat.
LordAshes
One Eyed Hopeful
Posts: 24
Joined: Sat Jan 27, 2018 10:27 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by LordAshes »

Jabberwock wrote:Somehow I got it compiled and it seems to work. I was trying out the example scripts and I got up to the fifth step of the wave, so I would call it a success :) The other script works as well, just a minor issue: the default value for the config file seems to be 'Gexture.xml', while in the package it is Gesture.xml. But FreePIE nicely reports the problem, so it should be easy to figure out.
Thanks for noting the config file typo. I will correct that in the repository. The sample scripts are just quick scripts meant mainly to demonstrate the syntax. Afterwards I realized that the wave script might have been better written comparing hand to shoulder rather than hand to wrist. I guess it all depends on whether you intend a small hand wave or a bigger arm wave.
I did try it with Phase Shift (a Rock Band-like app for PC) and was able to hit notes based on arm gestures.
Jabberwock
Cross Eyed!
Posts: 197
Joined: Mon Mar 02, 2015 3:58 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by Jabberwock »

I am trying to do punches and it is a bit tricky... Mostly because for straight punches the orientation of joints basically does not change: the fist (hand) is in front of other joints already and does not move left or right. I used the XChange relation, but it needs a first step, so I had a problem with doing a series...

I wondered whether for such stretches a Near/Far relation (with radii defined by the deviation) might help, but I understand that getting the distances in space right might be difficult.
LordAshes
One Eyed Hopeful
Posts: 24
Joined: Sat Jan 27, 2018 10:27 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by LordAshes »

Jabberwock wrote:I am trying to do punches and it is a bit tricky... Mostly because for straight punches the orientation of joints basically does not change: the fist (hand) is in front of other joints already and does not move left or right. I used the XChange relation, but it needs a first step, so I had a problem with doing a series...

I wondered whether for such stretches a Near/Far relation (with radii defined by the deviation) might help, but I understand that getting the distances in space right might be difficult.
I am happy to investigate additional options, but I am not sure I follow your suggestion. Under the hood, the plugin works on a frame basis: when the Kinect sensor generates a new frame, the plugin recalculates all the relationships. However, the XChange, YChange and ZChange relationships are all in reference to the last completed step, because the XChange, YChange and ZChange between individual frames would likely be very small and thus not very accurate. For example, during a punch gesture where the hand shakes a bit, there could actually be a frame (or so) of movement in the opposite direction. Again, if you have some ideas around this, let me know and we can discuss it.

However, what I would try in the meantime is using static reference points. This should work if the person remains stationary and is just doing kicks or punches... but it will not work if the person is moving around. The basic idea is that you add a static reference point which you can then use when creating the initial step. For example, let's assume that the starting position of a person is 0,0,200 (I'm just making that up; you would need to see what values reflect your desired starting point). Then you can set up an initial step which compares the Hand (for example) via ZChange to the static reference point. By strategically placing the reference point somewhere halfway between a pulled punch and an extended punch, you should be able to use the ZChange relationship to capture the punch.

This solution may require some calibration at the beginning of your "session". This concept is discussed in Section 7.0 (Known Limitations) of the plugin documentation. Basically, because there is no support for setting reference points on-the-fly (or, more accurately, for setting static reference points based on a joint position), you may need to set a static reference point and then calibrate the person's position by having them punch and seeing if it trips... and, if not, getting them to move until they are in the correct position.

I will be adding support for setting reference points based on a joint's current position soon... but it may take some time depending on my availability.
LordAshes
One Eyed Hopeful
Posts: 24
Joined: Sat Jan 27, 2018 10:27 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by LordAshes »

Added GetPlayerInfo() and GetJointInfo() to the plugin. See added documentation (Section 8.0).

These methods allow obtaining information about the position of a player (i.e. the player's skeleton) and the position of each player's joints (i.e. the player's skeleton's joints).

This information can be used to calibrate static reference points using a player's starting location.

Re-pull the source code and re-compile to get this latest update.
Jabberwock
Cross Eyed!
Posts: 197
Joined: Mon Mar 02, 2015 3:58 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by Jabberwock »

I will check it out, although now my time is also somewhat limited...

As for my suggestion, I imagined referencing two spheres around a particular joint - if another joint is within the closer sphere, it would be Near the referenced joint, if it is outside the larger sphere, it is Far. I believe two spheres would need to be used as on the boundary there might be some jitter, as you mentioned. So if I draw my fist (hand) toward my shoulder, it would register as Near, if I stretch it far outside the second sphere, it would register as Far. That would allow gestures in which no lines relative to other joints are crossed, e.g. a hand wave where the hand does not cross the joint lines, similar to this one (OK, it might cross the shoulder line, but I think you'll get the idea):

Image
Jabberwock
Cross Eyed!
Posts: 197
Joined: Mon Mar 02, 2015 3:58 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by Jabberwock »

As I wrote, I have no time to test it, so I did, of course. I am not sure if I am using the methods correctly... I am using a test script:

Code:

def update():
	hand = KinectGestures.GetJointInfo(KinectJoint.HandLeft)
	
	if hand is not None:
		diagnostics.watch(hand.X)

if starting:
	KinectGestures.update += update
	KinectGestures.RecognitionStart()
Without the RecognitionStart reference no info is ever displayed. If RecognitionStart is used, the info seems to be updated only when the predefined gestures are being recognized (i.e. in the middle of a gesture).
LordAshes
One Eyed Hopeful
Posts: 24
Joined: Sat Jan 27, 2018 10:27 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by LordAshes »

Jabberwock wrote:Without the RecognitionStart reference no info is ever displayed. If RecognitionStart is used, the info seems to be updated only when the predefined gestures are being recognized (i.e. in the middle of a gesture).
Thanks for the comment. While the way you are using it is a possibility, it only updates, as you said, when a gesture is completed. This is because the Kinect Gesture Recognition plugin is designed around recognizing gestures, and thus the update event is only fired when a gesture is recognized. What you want to use is the processing event. This event fires more frequently because it fires whenever there is any processing message (such as partial gesture recognition, gesture timeouts, etc.).

However, I take full responsibility for the error because that is a key piece of information that I did not put in the documentation. Try switching to the processing event instead and I will update the documentation.
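
For example, a minimal sketch of that change to your test script (this assumes the processing event is exposed as "processing"; check the documentation for the exact event name):

Code:

def processing():
    # Same watch as before, but driven by the processing event, which fires on
    # any processing message rather than only on a completed gesture.
    hand = KinectGestures.GetJointInfo(KinectJoint.HandLeft)
    if hand is not None:
        diagnostics.watch(hand.X)

if starting:
    KinectGestures.processing += processing
    KinectGestures.RecognitionStart()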
LordAshes
One Eyed Hopeful
Posts: 24
Joined: Sat Jan 27, 2018 10:27 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by LordAshes »

Jabberwock wrote:As for my suggestion, I imagined referencing two spheres around a particular joint - if another joint is within the closer sphere, it would be Near the referenced joint, if it is outside the larger sphere, it is Far.
I think the same concept could be achieved with a distance relationship. A set distance away from the joint basically defines the sphere you were talking about. However, if I implement it as a distance instead of Near and Far, the plugin user will have more control. For example, the user would be able to configure 3 distances (e.g. near, mid, and far) to distinguish between a pulled punch (near, mid, near) and a full punch (near, mid, far, mid, near).

I will have a look into it to see how difficult it would be to add.
Jabberwock
Cross Eyed!
Posts: 197
Joined: Mon Mar 02, 2015 3:58 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by Jabberwock »

LordAshes wrote: Thanks for the comment. While the way you are using it is a possibility, it only updates, as you said, when a gesture is completed. This is because the Kinect Gesture Recognition plugin is designed around recognizing gestures, and thus the update event is only fired when a gesture is recognized. What you want to use is the processing event. This event fires more frequently because it fires whenever there is any processing message (such as partial gesture recognition, gesture timeouts, etc.).

However, I take full responsibility for the error because that is a key piece of information that I did not put in the documentation. Try switching to the processing event instead and I will update the documentation.
It makes sense, I had not thought of that.
Jabberwock
Cross Eyed!
Posts: 197
Joined: Mon Mar 02, 2015 3:58 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by Jabberwock »

LordAshes wrote: I think the same concept could be achieved with a distance relationship. A set distance away from the joint basically defines the sphere you were talking about. However, if I implement it as a distance instead of Near and Far, the plugin user will have more control. For example, the user would be able to configure 3 distances (e.g. near, mid, and far) to distinguish between a pulled punch (near, mid, near) and a full punch (near, mid, far, mid, near).

I will have a look into it to see how difficult it would be to add.
Yes, that would give even more options!
LordAshes
One Eyed Hopeful
Posts: 24
Joined: Sat Jan 27, 2018 10:27 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by LordAshes »

UPDATE:

Added additional processing event messages, which include "Player N Added" and "Player N Removed" (see the added documentation, Section A.0).
The sensitivity of this feature (as discussed in the documentation) can be set using the plugin settings.
LordAshes
One Eyed Hopeful
Posts: 24
Joined: Sat Jan 27, 2018 10:27 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by LordAshes »

UPDATE: Version 2 - Major Overhaul

Features:

- Optimized recognition processing to only evaluate relationships that are part of configured gestures.
- Added Distance relationship allowing conditions based on the distance between two objects (two joints or a joint and a static reference point). Both less than and more than are supported.
- Simplified obtaining plugin results by including the information directly in the event handler (see Section 4 of documentation for sample code).
- Separated plugin information into 4 different event handlers. Most users only need to use the one update handler but event handlers for processing, player adds/removes and framing are available.
- Added version method to determine which version of the plugin you are using

Section B of the documentation discusses the changes, and the other sections have been updated to reflect them.
Jabberwock
Cross Eyed!
Posts: 197
Joined: Mon Mar 02, 2015 3:58 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by Jabberwock »

Wow, that was quick!

I did a test run: FistBumps work OK, although sometimes the middle step is missed, especially when done quickly - but this might be due to the fact that my computer is not a speed demon... But with a little practice I can time the gestures just right. I also did my own punching configuration with great success... well, almost; see the question below.

Just to let you know, the two original scripts are not updated with the new procedure syntax, so they complain about lacking parameters. And shouldn't InfrontOf be InFrontOf?

Also, I have two questions: as you use the gestures for rhythm games, how do you handle holds? I suppose I might add a gesture that checks whether the conditions of the final step of the held gesture have failed (i.e. the final step is no longer maintained), but maybe there is a simpler way...

The second one: I cannot figure out the syntax of the config file. I have the file given below - for some reason it is seen as one gesture with four steps instead of two gestures with two steps...


Code:

<?xml version="1.0" encoding="utf-8"?>
<ArrayOfGestureSequences xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <GestureSequences>
    <gesture>PunchLeft</gesture>
    <timeout>5000</timeout>
    <steps>
      <GestureSequenceStep>
        <SuccessConditions>
          <JointRelationship>
            <actor>HandLeft</actor>
            <relation>Distance</relation>
            <relative>ShoulderLeft</relative>
            <deviation>-200</deviation>
          </JointRelationship>
        </SuccessConditions>
      </GestureSequenceStep>
      <GestureSequenceStep>
        <SuccessConditions>
          <JointRelationship>
            <actor>HandLeft</actor>
            <relation>Distance</relation>
            <relative>ShoulderLeft</relative>
            <deviation>300</deviation>
          </JointRelationship>
        </SuccessConditions>
      </GestureSequenceStep>
    </steps>
    <gesture>PunchRight</gesture>
    <timeout>5000</timeout>
    <steps>
      <GestureSequenceStep>
        <SuccessConditions>
          <JointRelationship>
            <actor>HandRight</actor>
            <relation>Distance</relation>
            <relative>ShoulderRight</relative>
            <deviation>-200</deviation>
          </JointRelationship>
        </SuccessConditions>
      </GestureSequenceStep>
      <GestureSequenceStep>
        <SuccessConditions>
          <JointRelationship>
            <actor>HandRight</actor>
            <relation>Distance</relation>
            <relative>ShoulderRight</relative>
            <deviation>300</deviation>
          </JointRelationship>
        </SuccessConditions>
      </GestureSequenceStep>
    </steps>
  </GestureSequences>
</ArrayOfGestureSequences>
LordAshes
One Eyed Hopeful
Posts: 24
Joined: Sat Jan 27, 2018 10:27 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by LordAshes »

Jabberwock wrote:Wow, that was quick!
Just to let you know, the two original scripts are not updated with the new procedure syntax, so they complain about lacking parameters. And shouldn't InfrontOf be InFrontOf?
Yes. I need to update those. I realized that late last night and did not have time to update or remove them.
Jabberwock wrote:Wow, that was quick!
Re: Rhythm Games
When I tried it with rhythm games, I didn't implement a hold. However, it should not be too hard to implement one. I would try one of the following methods:

A. Dual Gesture

Create one gesture for the key down condition (e.g. a punch gesture that activates the key) and then a second gesture for the key up condition (i.e. a retract punch gesture).

B. Single Gesture With Processing

Create a single gesture which contains both the set and reset conditions (e.g. a punch gesture with a retraction afterwards). Then subscribe to the processing event (instead of the regular update event) so that you can determine when the set condition has been met, at which point you press the key, and then release the key when the gesture completes. This is basically the same as the first solution except that it allows you to configure one gesture for the entire process as opposed to two gestures, but it requires a little bit of parsing of the processing messages.

With either of these solutions you should be able to hold a note by doing the first gesture but not the second (or the partial gesture in the second case) and then doing the second gesture (or completing the gesture in the second case) to release the note.
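
As a rough sketch of option B (the event name, handler signature and message wording here are assumptions on my part, so check the documentation and the actual processing messages in the console), assuming a three-step "Punch" gesture (start, extend, retract) has been loaded:

Code:

def processing(*args):
    # Assumption: the processing event passes the processing message text
    # (e.g. "Player 1 Has Completed Gesture Punch Step 2 Of 3") as its first argument.
    message = str(args[0]) if args else ""
    if "Gesture Punch Step 2 Of 3" in message:
        keyboard.setKeyDown(Key.A)  # punch extended: start holding the note
    elif "Completed Gesture Punch" in message and "Step" not in message:
        keyboard.setKeyUp(Key.A)  # retraction completed: release the note

if starting:
    KinectGestures.processing += processing
    KinectGestures.RecognitionStart()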

BTW, with regards to the Distance relationship, you can probably get better results by playing around with the distance numbers. The -20 value could be changed to something like -80. This would trip the fist bump earlier (i.e. when the fists are not yet completely touching) but it may help prevent some misses. When creating a gesture you can use GetJointInfo() to view the joint position values and adjust your values to ones that work most of the time. In a future release I will probably expose the relationships in addition to the joint positions so things like the distance can be viewed, to make creating the initial gesture configuration easier.

I have used the plugin to do a proof of concept for a rhythm game, but I have a feeling that the plugin won't cut it for any high-speed games, which may include rhythm games when played at a more advanced (i.e. faster) pace. I can't really do a proper test because I am not so good at rhythm games at the more advanced levels, and thus I am not sure if I am missing notes due to lag in the plugin or due to my own lack of skill.
Jabberwock
Cross Eyed!
Posts: 197
Joined: Mon Mar 02, 2015 3:58 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by Jabberwock »

Thanks for the explanations!

I have figured out my mistake: for some reason I thought GestureSequences comprised all gestures, even though the file structure indicated it does not.

However, I still see that the results are quite inconsistent... For example, the following config does not seem to work...

Code:

<?xml version="1.0" encoding="utf-8"?>
<ArrayOfGestureSequences xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <GestureSequences>
    <gesture>RaiseHand</gesture>
    <timeout>15000</timeout>
    <steps>
      <GestureSequenceStep>
        <SuccessConditions>
          <JointRelationship>
            <actor>HandLeft</actor>
            <relation>LeftOf</relation>
            <relative>ShoulderLeft</relative>
            <deviation>0</deviation>
          </JointRelationship>
          <JointRelationship>
            <actor>HandLeft</actor>
            <relation>Above</relation>
            <relative>ShoulderLeft</relative>
            <deviation>0</deviation>
          </JointRelationship>
        </SuccessConditions>
      </GestureSequenceStep>
    </steps>
  </GestureSequences>
</ArrayOfGestureSequences>
However, if I change Above to Below, it works without problems...?

I have looked at the code and found no reason for that, except maybe one thing: the Above and Below relations seem to be tracking the Z coordinate; however, for my Kinect (which is confirmed by GetJointInfo) the Z axis is the depth, not the height... Is it possible that the coordinates are switched?
LordAshes
One Eyed Hopeful
Posts: 24
Joined: Sat Jan 27, 2018 10:27 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by LordAshes »

I am not sure what you are trying to achieve so it is a little hard to help but let me try to explain...

A gesture (such as a punch) consists of one or more steps. Each step needs to have at least one success condition relationship (but can have more) and zero or more failure conditions. As an example, a punch gesture would likely have two or three steps (depending on whether you want to insist on the punch retraction). The first step would be one or more conditions for the punch's starting position (maybe left hand left of left shoulder and left hand at distance -40 from the left shoulder). The second step would have success conditions for the extended portion of the punch (maybe left hand left of left shoulder and left hand at distance 200 from the left shoulder). We could even add a failure condition (such as left hand right of the left shoulder). This would be sufficient for a punch gesture. If we wanted to insist on the punch retraction then we could add one more step which could be exactly the same as the starting step.
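
For example, the two-step version of that punch (with the optional failure condition) might be sketched as the GestureSequences entry below, inside the usual ArrayOfGestureSequences root; the deviation values are only illustrative:

Code:

  <GestureSequences>
    <gesture>PunchLeft</gesture>
    <timeout>5000</timeout>
    <steps>
      <GestureSequenceStep>
        <SuccessConditions>
          <JointRelationship>
            <actor>HandLeft</actor>
            <relation>LeftOf</relation>
            <relative>ShoulderLeft</relative>
            <deviation>0</deviation>
          </JointRelationship>
          <JointRelationship>
            <actor>HandLeft</actor>
            <relation>Distance</relation>
            <relative>ShoulderLeft</relative>
            <deviation>-40</deviation>
          </JointRelationship>
        </SuccessConditions>
      </GestureSequenceStep>
      <GestureSequenceStep>
        <SuccessConditions>
          <JointRelationship>
            <actor>HandLeft</actor>
            <relation>LeftOf</relation>
            <relative>ShoulderLeft</relative>
            <deviation>0</deviation>
          </JointRelationship>
          <JointRelationship>
            <actor>HandLeft</actor>
            <relation>Distance</relation>
            <relative>ShoulderLeft</relative>
            <deviation>200</deviation>
          </JointRelationship>
        </SuccessConditions>
        <FailureConditions>
          <JointRelationship>
            <actor>HandLeft</actor>
            <relation>RightOf</relation>
            <relative>ShoulderLeft</relative>
            <deviation>0</deviation>
          </JointRelationship>
        </FailureConditions>
      </GestureSequenceStep>
    </steps>
  </GestureSequences>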

It is important to understand the difference between a relationship that is not a success condition and one that is an actual failure condition. Consider the situation where we don't add the proposed failure condition in step 2. In that case, if the user punched diagonally (bringing the left hand right of the left shoulder) then step 2 would not succeed, but it would also not invalidate the gesture. Thus, once the hand is extended (but right of the left shoulder), the player could move it left and thereby meet the conditions of step 2. If, on the other hand, we add the failure condition, punching diagonally would trigger the failure condition and the user would have to go back and succeed at step one again (i.e. if the user just moved the hand left to correct the mistake, step 2 would not succeed).

In your example you have multiple success conditions in one step, so all of these conditions need to be true in order for the step to succeed. As such, in order for the step to succeed, the left hand must be both left of and above the left shoulder. If, on the other hand, each of the success conditions were in its own step, then the first success condition (in the first step) would need to be true (i.e. left hand left of left shoulder) in order to move on to the next step and evaluate that condition (i.e. left hand above left shoulder). With 2 conditions the two versions end up completing the gesture under the same conditions, but when more than 2 conditions are involved the results can differ. This is because when all the success conditions are part of one step they all need to occur at once, whereas when they are part of multiple steps, the conditions of previous steps do not need to remain true for future steps.

Typically you should try to define gestures with more than one step, otherwise you can get a looping condition. Normally, as you work through the steps of a gesture, the plugin tracks which step you are at. When the gesture is completed (or failed or timed out) this tracker is reset to the first step, requiring the whole gesture to be repeated in order to be completed (again). If, however, your gesture only has one step, then after completing it the tracker will get reset but the gesture will immediately succeed again, and it will keep doing so until the player moves in such a way as to no longer satisfy the gesture. This is typically undesirable because, for example, if the gesture completion is mapped to pressing a key on the keyboard then the key will get pressed many times.

Going back to the punch example, if the first step is a distance -40 relationship and the second step is a distance 200 relationship, then completing both steps will reset back to the first step, but now the first step's conditions will not be met, so no looping condition is set up. Even if we defined the punch with three steps (distance -40, distance 200, distance -40), thus requiring the punch to be pulled back, it would still work because at the end of the 3 steps the tracker would reset back to step 1. Step 1 would succeed immediately but step 2 would not, so no loop condition would occur.

BTW, if you are concerned that your computer's performance may be a factor, you can do a fairly easy test. Add to your current script a subscription to the frame event and have the handler print out some unique text, such as the time. When you run your script, you will be able to see how fast the frame events are tripping. Each frame event is a recalculation of all the actor-relative relationships, so if you see the frame-related text scroll by fairly fast in the console, your computer does not seem to have a performance issue. If the frame-related text shows up slowly, then your computer is likely a factor. Note that when doing this test, make sure you have loaded your desired gestures and turned recognition on. If you don't load any gestures, the frame event will get tripped faster since there are no relationships for the plugin to calculate.
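
A minimal sketch of that test (this assumes the frame event is exposed as "frame"; adjust the event name and handler signature to whatever the documentation specifies):

Code:

import time

def frame(*args):
    # Print a timestamp for every frame event; if these lines scroll by quickly
    # in the console, relationship processing is keeping up with the sensor.
    diagnostics.debug("frame at %f" % time.clock())

if starting:
    # Load the gestures you actually intend to use and start recognition first,
    # so the test reflects real relationship processing.
    KinectGestures.frame += frame
    KinectGestures.RecognitionStart()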
Jabberwock
Cross Eyed!
Posts: 197
Joined: Mon Mar 02, 2015 3:58 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by Jabberwock »

I have made the gesture in one step just so it is easier to check how it works. If I have the same config but with the Below condition, I can trigger it consistently and easily (and yes, it loops, but that is not a problem). If I have the config with the Above condition, I cannot trigger it at all, no matter what I do with my hand.

But there is something else: if I put GetJointInfo in the process event, the info is updated ONLY in the config with the Below condition, not with the Above condition. When I move my hand, the values change, but the Z value changes when I move my hand forward and back and not up and down.

When I put GetJointInfo in the frame event, it gets updated in both configs, with the Above and the Below condition. However, here is another thing I noticed - for the hand, the value reported by GetJointInfo is within the range of 0.7-1.2. The relativePos before the Above check seems to be normalized (i.e. multiplied by 1000), and then it is compared with the skeleton.Joints position - is that normalized as well? Because if it is not, the >= would never trip. But I might be completely wrong here; if I am, ignore me and forgive me, I am just trying to help...
LordAshes
One Eyed Hopeful
Posts: 24
Joined: Sat Jan 27, 2018 10:27 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by LordAshes »

Jabberwock wrote:If I put GetJointInfo in the process event, the info is updated ONLY in the config with the Below condition, not with the Above condition. When I move my hand, the values change, but the Z value changes when I move my hand forward and back and not up and down.

When I put GetJointInfo in the frame event, it gets updated in both configs, with the Above and the Below condition. However, here is another thing I noticed - for the hand, the value reported by GetJointInfo is within the range of 0.7-1.2. The relativePos before the Above check seems to be normalized (i.e. multiplied by 1000), and then it is compared with the skeleton.Joints position - is that normalized as well? Because if it is not, the >= would never trip. But I might be completely wrong here; if I am, ignore me and forgive me, I am just trying to help...
I will open GitHub issues for these items and look into them. Thanks for your testing assistance.
LordAshes
One Eyed Hopeful
Posts: 24
Joined: Sat Jan 27, 2018 10:27 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by LordAshes »

UPDATE: Relationship bug fixed.

The relationship bug mentioned previously was discovered to be the result of some non-normalized joint data. The code was reviewed to ensure all joint data is normalized, including the data returned by the GetJointInfo() method. A sample script "RelationTest.py" was added to the scripts folder which tests the LeftOf, RightOf, Above, Below, InfrontOf and Behind relationships using HandLeft as compared to ShoulderLeft, except for InfrontOf and Behind, which use HandLeft as compared to HandRight (since I am not sure how well the Kinect sensor would deal with detecting HandLeft behind ShoulderLeft).

As a bonus for troubleshooting, the GetRelationshipInfo(actor, relationship, relative) method was added. For any actor-relationship-relative combination used in the gesture configuration it will return 1 if the relationship is currently true, 0 if it is currently false, or the relationship value for the Distance, XChange, YChange and ZChange relationships. If the actor-relationship-relative combination does not exist, it returns float.NaN. This function can be used in the processing or frame event handler to check the status of the various relationships when a step has many success conditions but one of them is not being met (i.e. the step fails to pass).
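
For example, a rough troubleshooting sketch (the exact form of the relationship argument and the frame event name are assumptions, so check the documentation; the actor-relationship-relative combination must appear in a loaded gesture):

Code:

import math

def frame(*args):
    # Assumption: joints are passed via the KinectJoint enumeration and the relationship by name.
    dist = KinectGestures.GetRelationshipInfo(KinectJoint.HandLeft, "Distance", KinectJoint.ShoulderLeft)
    if not math.isnan(dist):
        # Distance/XChange/YChange/ZChange return the raw value;
        # other relationships return 1 (true) or 0 (false).
        diagnostics.watch(dist)

if starting:
    KinectGestures.frame += frame
    KinectGestures.RecognitionStart()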
LordAshes
One Eyed Hopeful
Posts: 24
Joined: Sat Jan 27, 2018 10:27 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by LordAshes »

Jabberwock,

I really appreciate your patience and your troubleshooting. I try to address issues as quickly as possible, but since it is only me working on this, it is more likely that bugs slip through. I am a little embarrassed by the last bug, though, because it was actually a large bug that slipped by only because I used a very small test script and happened to choose a condition that managed to pass even with the bug. Your guess about the normalization problem was correct, but the Y and Z coordinates were also swapped in the Version 2.0 code (thus even if the code had evaluated the relationship correctly it would have been wrong, because Above/Below would have been InfrontOf/Behind and vice versa). Both issues should be resolved now, and I included a test script that tests all of these relationships for HandLeft. They all passed for HandLeft, so I believe they should be working correctly now.
Jabberwock
Cross Eyed!
Posts: 197
Joined: Mon Mar 02, 2015 3:58 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by Jabberwock »

No problem, it took me only about two hours of wild arm flailing to figure it out, and I need the workout anyway (that is one of the points of my Kinect gaming) :) I am glad I could help! In the meantime I have noticed one more thing: is the Head joint omitted on purpose? Because I see it is included in the library...
LordAshes
One Eyed Hopeful
Posts: 24
Joined: Sat Jan 27, 2018 10:27 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by LordAshes »

Thanks for identifying another issue. I will open a GitHub issue and look into it. It is not intentional. I didn't manage to figure out how to expose the Microsoft Kinect Joint class directly, so I created a mimic enumeration which is visible to the plugin. Most likely I accidentally left the head out of the mimic enumeration.
Jabberwock
Cross Eyed!
Posts: 197
Joined: Mon Mar 02, 2015 3:58 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by Jabberwock »

If Head could be added, it would be great, as it would allow for greater flexibility in the shoulder area, i.e. for more gestures that would not trip the 'hands up' triggers.

Unfortunately, I am leaving for a few days, so I will not be able to develop the scripts for a while. I think I have noticed one more issue, i.e. Below HipRight tripping much higher than expected, but I could spend little time on it, so I am not really sure. After I am back I will try to finish one complete script for a game, so that I can really test it in live conditions. I have tested the performance with GetJointInfo in the frame event and most of the time it tracks the positions in real time, although there is a lot of variation in the readings (but I expected that, as I have noticed it before when working with my Kinect; unfortunately I am still using v1).
LordAshes
One Eyed Hopeful
Posts: 24
Joined: Sat Jan 27, 2018 10:27 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by LordAshes »

UPDATE: Head bug fixed.

As I suspected it was just missing from the plugin enumeration. This meant that while it did not show up in the joint list it was actually still usable if you knew its joint int value.
As of the latest update, the head has been added to the enumeration and thus you can now select it like any other joint.

I also used this opportunity to place the C# file in a sub-folder, provide a proper C# solution file and add assembly properties. This means that when users compile the code, unless they change the assembly properties, the version reported back by the plugin will be consistent among users.

P.S. The bug was actually fixed a few hours after being identified but for some reason I was having troubles posting to the forum.
Jabberwock
Cross Eyed!
Posts: 197
Joined: Mon Mar 02, 2015 3:58 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by Jabberwock »

I am back to testing...

I believe there must be another bug somewhere, but I cannot quite nail it. If you load the config I list below, you will see that the LowHand gesture fires much too high. It is not a problem with that specific gesture - if it is tested in isolation, it works as intended. In this configuration the gesture fires even if the hand's position is way above the hip, as shown by GetJointInfo.

Somehow the other gesture, the distance one, is breaking the recognition of the LowHand.

Code:

<?xml version="1.0" encoding="utf-8"?>
<ArrayOfGestureSequences xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <GestureSequences>
    <gesture>RightHand</gesture>
    <timeout>15000</timeout>
    <steps>
      <GestureSequenceStep>
        <SuccessConditions>
          <JointRelationship>
            <actor>HandRight</actor>
            <relation>Distance</relation>
            <relative>ShoulderRight</relative>
            <deviation>400</deviation>
          </JointRelationship>
        </SuccessConditions>
        <FailureConditions>
          <JointRelationship>
            <actor>HandRight</actor>
            <relation>Above</relation>
            <relative>ShoulderRight</relative>
            <deviation>0</deviation>
          </JointRelationship>
        </FailureConditions>
        <FailureConditions>
          <JointRelationship>
            <actor>HandRight</actor>
            <relation>Below</relation>
            <relative>HipRight</relative>
            <deviation>0</deviation>
          </JointRelationship>
        </FailureConditions>
      </GestureSequenceStep>
    </steps>
  </GestureSequences>
  <GestureSequences>
    <gesture>LowHand</gesture>
    <timeout>15000</timeout>
    <steps>
      <GestureSequenceStep>
        <SuccessConditions>
          <JointRelationship>
            <actor>HandRight</actor>
            <relation>Below</relation>
            <relative>HipRight</relative>
            <deviation>0</deviation>
          </JointRelationship>
        </SuccessConditions>
      </GestureSequenceStep>
    </steps>
  </GestureSequences>
</ArrayOfGestureSequences>
LordAshes
One Eyed Hopeful
Posts: 24
Joined: Sat Jan 27, 2018 10:27 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by LordAshes »

I tried your configuration and I am getting a consistent response from LowHand. I do agree that it seems to get detected around the thigh area as opposed to the hip, but I don't believe that has anything to do with the plugin code... that is where the Kinect is guessing the hip is. If you want the LowHand trigger around the actual hip level, you may need to put in a hip-level reference point and use that as the reference instead. For example, add a reference point called HipLevel with its height position set to wherever you want the hip level to be. Then just use RightHand Below HipLevel instead. I realize this is not ideal because if you have users of greatly different heights the preset hip level may not work for everyone. You could also create the HipLevel reference point by reading the Kinect hip position and then adding something to the height value.

Note: You can sometimes get some odd joint locations if the Kinect is not able to see the whole body in the sensor range. I have seen this with some of the Kinect sample programs which draw the skeleton. If the feet and lower legs are not in the sensor view sometimes it ends up drawing them at odd locations as if the player was squatting.

I did find, however, that there was a bug with failure conditions on the first step. Normally, when failure conditions are met, the progress gets reset to the first step. As such, there was some code that prevented the processing of failure conditions on the first step (since progress was already at the first step). However, this allowed the processing of success conditions on the first step even if failure conditions were met on that step. This is why, in your configuration, the RightHand gesture was tripping even when the hand was below the hip or above the shoulder.

This has been identified as a GitHub issue and resolved with a GitHub commit.
Jabberwock
Cross Eyed!
Posts: 197
Joined: Mon Mar 02, 2015 3:58 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by Jabberwock »

The problem is that for me LowHand is detected around the chest area... And it is not a matter of faulty joint placement, as that is quite consistent. I have simply put the GetJointInfo debug reports in the update procedure and that is what I am getting:

PROCESS: Player 18 Has Completed Gesture LowHand Step 1 Of 1
PROCESS: Player 18 Has Completed Gesture LowHand
UPDATE: Player 18 completed gesture LowHand
Hand position: 10.3343
Hip position: -176.882


When I track the hand/hip positions in the watch window in frame(), the positions are consistent and correct (well, the hip is around the waist level, but I can live with that).

When I switch the first gesture to HandLeft, the problem is gone:

PROCESS: Player 16 Has Completed Gesture LowHand Step 1 Of 1
PROCESS: Player 16 Has Completed Gesture LowHand
UPDATE: Player 16 completed gesture LowHand
Hand position: -495.971
Hip position: -482.675

The condition trips only when the hand position is actually below the hip level, as indicated by GetJointInfo.
Jabberwock
Cross Eyed!
Posts: 197
Joined: Mon Mar 02, 2015 3:58 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by Jabberwock »

I have raised the issue on GitHub with a possible solution.
LordAshes
One Eyed Hopeful
Posts: 24
Joined: Sat Jan 27, 2018 10:27 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by LordAshes »

I verified the suspected cause of the bug and you are correct. ActorPos was being modified and then re-used. I have modified the code to use a temporary variable for calculating the distance instead. I have not yet closed the GitHub issue because I have not done full testing to see if it fixes the issue but you can grab the latest copy if you want to try it yourself. Once again, thanks for helping troubleshoot.
Jabberwock
Cross Eyed!
Posts: 197
Joined: Mon Mar 02, 2015 3:58 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by Jabberwock »

I have tried the other change (i.e. moving the assignment to the third loop) and it seems to work, too. I will test your code later.

The plugin seems to work OK now, so I have tried actually playing with it. As I have some conceptual problems with KickBeat, I have made a simple profile for Melody's Escape. The good news is that I had 50 hits. The bad news is that I had 200 misses :D

I have made the holds by adding 'release' gestures that are the opposite of the final step of the 'press' gesture (i.e. success conditions become failure conditions and the other way round), i.e. your method A. Method B would be more problematic, as with a gesture with two or more success conditions (or with success and failure conditions) I could 'leave' the final step by failing any of them (e.g. for a hand that is Above Head and LeftOf LeftShoulder I could either move the hand below the head OR right of the shoulder) or by fulfilling any of the failure conditions.

The 'release' gestures become especially problematic when there are two (or more) failure conditions. For example, I have gestures for Left and Right which check the distance from the shoulder for success and Above Head/Below Hip for failure. I can then move the hand from that position upwards (for the Up gesture) or downwards (for the Down gesture). If I do the 'release' gesture only by checking the distance, it might not release at all if I move the straight arm from the left position to the upper or lower one. I also need two separate gestures checking for Above Head and Below Hip to release the button triggered in Left, as each failure condition of the original 'press' final step has to be checked separately.

I still think it might be more efficient to handle that on the plugin side (i.e. to have the update procedure announce entering the final step and leaving it), but I realize it might not be easy to add.
LordAshes
One Eyed Hopeful
Posts: 24
Joined: Sat Jan 27, 2018 10:27 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by LordAshes »

> I still think it might be more efficient to handle that on the plugin side (i.e. to have the update procedure announce entering the final step and leaving it), but I realize it might not be easy to add.

The problem with the idea of having the plugin handle it is that "leaving it" is very vague. Consider a very simple gesture like raising a hand (e.g. Hand Left Above Shoulder Left). What constitutes leaving the final step? Obviously if Hand Left drops below Shoulder Left then the final step has been left... but what if, for example, Hand Left is moved significantly left or right? I would say that is no longer raising the hand, but it is not an intuitive condition that could be derived from the raising-the-hand condition (i.e. such that the plugin could automatically generate some "leaving" conditions).

So, in short: "Yes", it would not be easy to add such functionality to the plugin.

> Alternative

You indicated that you are setting up, as the last step of the gesture, failure conditions which are the opposite of the previous step, in order to produce the "release gesture". I assume you are then using the processing event to capture the second-to-last step to determine when the gesture is done, at which point you hold, and when the gesture is reset you process the release.

A few things to note:

1. Using the above method, a gesture can be completed either by completing the last step's success conditions or by meeting any of the failure conditions. In either case the gesture gets reset.
2. If there is more than one condition for the last step of a gesture, using success conditions basically does an AND (i.e. all success conditions must pass in order for the gesture to complete).
3. If there is more than one condition for the last step of a gesture, using failure conditions basically does an OR (any one or more failure conditions need to be true in order for the gesture to reset).
4. You may be able to define more generic reset conditions instead of reversing the success conditions of the previous step. I assume that during the hold the joint is not expected to move significantly. As such, you may be able to set up generic failure conditions for the last step which check XChange, YChange and ZChange. If any of these change "significantly", then we can assume that the hold gesture is over and (meeting the failure conditions) we reset the gesture. In this case you want to use failure conditions so that a change in any direction will cause the reset.
Jabberwock
Cross Eyed!
Posts: 197
Joined: Mon Mar 02, 2015 3:58 pm

Re: Kinect (Simple And Complex) Gesture Recognition Plugin

Post by Jabberwock »

For 'leaving the gesture' I had something very specific in mind. For the last step (and the gesture itself) to be triggered it has to match all success conditions and not match any of the failure conditions. Thus leaving it would be failing any of the success conditions and/or matching any of the failure conditions.

I was not clear in my description of what I do now: for the last step of each gesture I have set up another gesture that is the reverse of that last step. The problem occurs when there are failure conditions.

Suppose the last step of the gesture is this:

Image

It has one success condition (distance of hand to shoulder) and two failure conditions (above head and below hip).

I reverse the success condition to get the most obvious release gesture (failures do not come into play as I want the release to occur wherever the distance is shortened):

Image

However, I can also make the following move (which would be more convenient if I want to do the Up gesture):

Image

Here the distance is not shortened, so I need to set up a second release gesture. The same goes for the other failure condition of the original last step:

Image

So I need three release gestures for a single press gesture. Handling the checking of the conditions on the plugin side has the advantage that the reverse conditions could be checked in one step, and ANY match (i.e. any failure of the success conditions and any success of the failure conditions) would release the gesture. Doing it on the gesture config side requires separate gestures, as the conditions might be disjointed.

Thank you for the suggestion with the Change condition, that might work!