
Kinect v2 with MS-SDK

Discussion in 'Assets and Asset Store' started by roumenf, Aug 1, 2014.

  1. iesswl


    Joined:
    Aug 29, 2019
    Posts:
    4
    Anyone? I need help.
     
  2. roumenf


    Joined:
    Dec 12, 2012
    Posts:
    635
    I'm not sure I understand your issue. Please e-mail me with more details about what exactly you did and what's not working. If possible, export the scenes and scripts you are using and send them over to me, so I can check what's wrong. Please also mention your invoice number in the e-mail.
     
  3. roumenf


    Joined:
    Dec 12, 2012
    Posts:
    635
    Look at the KinectGesturesDemo1-scene in KinectDemos/GesturesDemo-folder and its CubeGestureListener-component.
     
  4. enginarer


    Joined:
    Mar 7, 2017
    Posts:
    5
    Hi, I did a search but couldn't see if this has been asked before. I'm trying to update the mesh collider on my avatar while controlling it via Kinect, but haven't managed it so far. I tried mesh update scripts, but they don't work, I guess because the actual mesh itself isn't changing, so the reference stays the same. Is there a particular way of doing it within the existing scripts?
     
  5. roumenf


    Joined:
    Dec 12, 2012
    Posts:
    635
    It's not recommended to use mesh colliders at all. Manipulating them will slow down the overall performance of the scene. Instead, as you can see in the KinectAvatarsDemo1-scene, the avatars there have capsule colliders around the body and sphere colliders around the hands. Of course, to make collisions happen, the avatars (or at least one of the scene objects) need to have Rigidbody-components as well.
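    A minimal sketch of that collider setup, assuming the script is attached to the avatar object (the sizes below are placeholders, not values from the asset; the demo scenes configure these colliders in the Inspector instead):

    ```csharp
    using UnityEngine;

    // Illustrative sketch only: gives an avatar a capsule collider around the body,
    // sphere colliders for the hands, and a Rigidbody so collisions are reported.
    public class AvatarColliderSetup : MonoBehaviour
    {
        public Transform leftHand;   // assign the avatar's hand bones in the Inspector
        public Transform rightHand;

        void Start()
        {
            CapsuleCollider body = gameObject.AddComponent<CapsuleCollider>();
            body.height = 1.8f;      // placeholder body height in meters
            body.radius = 0.25f;

            if (leftHand != null)  leftHand.gameObject.AddComponent<SphereCollider>().radius = 0.1f;
            if (rightHand != null) rightHand.gameObject.AddComponent<SphereCollider>().radius = 0.1f;

            Rigidbody rb = gameObject.AddComponent<Rigidbody>();
            rb.isKinematic = true;   // the avatar is driven by the sensor, not by physics
        }
    }
    ```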
     
  6. enginarer


    Joined:
    Mar 7, 2017
    Posts:
    5
    Thanks for the answer! I'm actually trying to keep the framerate low on purpose for my project, so I thought it would make sense. But otherwise, you are correct.

    One more question: is there a way to implement a third-person grab-object feature for the avateering type of scenes? I tried the interaction & grab scripts, but I guess they work for a first-person point of view only. I couldn't get the hand coordinates to actually match the third-person avatar's hands, so that it grabs the rigidbodies it touches on the grab gesture. I am trying to implement overall object interaction for my scene (push, grab, hit, slap, etc.), as you may have guessed :) This is why I tried to get the full-body mesh collider to work as well.
     
  7. roumenf


    Joined:
    Dec 12, 2012
    Posts:
    635
    Regarding 3rd person interaction: As long as the 3rd person is tracked by the sensor, you should be able to detect his hand grips as well. The InteractionManager (and GrabDropScript, etc.) component has a setting called 'Player index' that can be used to specify the tracked user. In this case you may need more than one InteractionManager (and more than one of the other components) in the scene, to track the interactions of users 0, 1, etc. simultaneously.
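    As a hypothetical sketch of that multi-user setup (the `playerIndex` field name is an assumption based on the 'Player index' setting mentioned above; check the actual component for the exact member name):

    ```csharp
    using UnityEngine;

    // Hypothetical sketch: one InteractionManager per tracked user.
    public class TwoUserInteractionSetup : MonoBehaviour
    {
        void Start()
        {
            InteractionManager im0 = gameObject.AddComponent<InteractionManager>();
            im0.playerIndex = 0;  // interactions of the first tracked user

            InteractionManager im1 = gameObject.AddComponent<InteractionManager>();
            im1.playerIndex = 1;  // interactions of the second tracked user
        }
    }
    ```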
     
  8. Chuvi


    Joined:
    Jul 14, 2013
    Posts:
    14
    Hello. I have a problem with a virtual fitting room. When the model is dressed, the upper part (arms, torso) fits normally, but the legs behave strangely. Either they are too small, or they are turned backward, as if the person were standing with the stomach pushed forward.
     

    Attached Files:

    • 1.png (349.7 KB)
    • 2.png (348.5 KB)
    • 3.png (350.9 KB)
    • 4.png (347.7 KB)
  9. Chuvi


    Joined:
    Jul 14, 2013
    Posts:
    14
    I hope the coronavirus does not overtake the developers and they will be able to answer my question!
     
  10. roumenf


    Joined:
    Dec 12, 2012
    Posts:
    635
    Please use KinectFittingRoom2-scene to test and adjust your model(s). Place the model in the scene and make it similar to the ModelMF-object in the scene, and with the same components.

    Disable the 'Continuous scaling'-setting of the AvatarScaler-component. Then adjust the 'Body scale factor' (start with 1) and, if needed, the 'Leg scale factor' (start with 1 as well), in order to get the best match between the model and the user. Then you can set the same values for the same settings of the ModelSelector-component in KinectFittingRoom1.
     
  11. Tyndareus


    Joined:
    Aug 31, 2018
    Posts:
    37
    Having an issue with getting the colour and depth textures from an Orbbec Astra. I'm just calling manager.GetUsersClrTex2D and manager.GetUsersLblTex2D, but they seem to be returning grey. What am I missing? The camera works and the specific textures work, because the example scenes work. Even after attempting to set it up exactly the same (two cameras, one rendering post-processing), it returns grey.

    The alternates work even less: GetUsersClrTex gives a completely transparent texture and GetUsersLblTex just returns null.
    The options on the manager are set ('Compute User Map' is on 'Raw Depth' and 'Compute Color Map' is enabled).
    It seems to work if the manager is always spawned, which could be problematic, as we have multiple scenes and not all of them need the Kinect available.

    Update: the issue mentioned at first was because the camera was being used by another asset.
    However, the solution (having the KinectManager always spawned) doesn't seem to work between scenes: the texture for the image suddenly freezes and stops updating, while the rest of the asset is fine (gestures work, body detection, etc.)
     
    Last edited: Jun 10, 2020
  12. Chuvi


    Joined:
    Jul 14, 2013
    Posts:
    14
    Thanks for the help. Strange, KinectFittingRoom2 works just fine. Apparently there is some kind of problem in instantiating the model prefab and assigning the AvatarController and AvatarScaler components.
     
  13. roumenf


    Joined:
    Dec 12, 2012
    Posts:
    635
    For using the KinectManager across multiple scenes, look at 'Howto-Use-KinectManager-Across-Multiple-Scenes.pdf' in K2Examples/_Readme-folder, and the demo scenes in KinectDemos/MultiSceneDemo-folder.

    Set 'Compute User Map' to one of the texture options, if you want GetUsersLblTex() to return anything. 'Raw-depth only' means you will process the raw depth frame by yourself.
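    A short sketch of reading the label texture each frame, assuming 'Compute User Map' is set to one of the texture options (GetUsersLblTex() is the method discussed above; the renderer target is an assumption for illustration):

    ```csharp
    using UnityEngine;

    // Sketch: show the user label texture on a renderer in the scene.
    public class UserMapDisplay : MonoBehaviour
    {
        public Renderer targetRenderer;  // e.g. a quad facing the camera

        void Update()
        {
            KinectManager manager = KinectManager.Instance;
            if (manager == null) return;

            // Returns null when 'Compute User Map' is set to raw-depth only
            Texture tex = manager.GetUsersLblTex();
            if (tex != null)
                targetRenderer.material.mainTexture = tex;
        }
    }
    ```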
     
  14. ryantriyatna


    Joined:
    Sep 30, 2019
    Posts:
    3
    I need help. My app can't seem to detect clothes or pants/jeans in black color.
     
  15. roumenf


    Joined:
    Dec 12, 2012
    Posts:
    635
    Unfortunately this is a sensor-specific issue. But as far as I know from other customers, putting an extra light in front of and above the place where the user stands may minimize the "black clothes" issue.
     
  16. ryantriyatna


    Joined:
    Sep 30, 2019
    Posts:
    3
    Ahh, I see. Thanks for the reply :)
     
  17. Chuvi


    Joined:
    Jul 14, 2013
    Posts:
    14
    Nevertheless, the problem remains if the Kinect is raised to a height of 1.9 meters. The avatar becomes an arc, the stomach stretches forward and the legs become small. In general, I have had this problem for a long time and thought it was the influence of some external factors. But recently I conducted an experiment and set the Kinect at a height of 1.9 meters, and the avatar began to behave awfully. I returned the Kinect to a height of 1.6 meters and restarted the project, but the distortions remained anyway.

    At the same time, if I enable the SkeletonOverlayer script, the positions of the feet, elbows and knees show up as normal, but the model is distorted and the legs diverge to the sides. I tried to squeeze the legs together in the prefab and it became a little better, but it is far from a normal display.
    (attached: 2020-07-11_13-36-25.png, 2020-07-11_13-36-36.png)
     
    Last edited: Jul 11, 2020
  18. balzar29


    Joined:
    Jun 27, 2017
    Posts:
    7
    Hello,
    how can I draw the bone stickman created by the default scene of the Unity Kinect package over the color camera image?
    Any ideas?

    Thanks!
     
  19. roumenf


    Joined:
    Dec 12, 2012
    Posts:
    635
    Hm, this looks odd. Do you update the 'Sensor height' and 'Sensor angle'-settings of KinectManager-component in the scene, after you move the sensor?

    I'm having summer holidays now, but if the issue persists, please e-mail me with some more details (and screenshots) on how the sensor position and rotation affects the model. Please also check the behavior of ModelMF-model in KinectFittingRoom2, as well as U_Character-models and the Cubeman in KinectAvatarsDemo1-scene.
     
  20. roumenf


    Joined:
    Dec 12, 2012
    Posts:
    635
    Please look at the KinectOverlayDemo2-scene in KinectDemos/OverlayDemo-folder.
     
  21. bngames


    Joined:
    Jul 3, 2012
    Posts:
    67
    Hey

    1) How do you make the skin.jpg.bytes textures?

    2) How do I make more than one category? The documentation says "Raise Hand to Change Category", but I can only see a way to set one category, not multiple categories.
     
  22. roumenf


    Joined:
    Dec 12, 2012
    Posts:
    635
    To your questions:

    1) 'preview.jpg.bytes' is just a jpg-image renamed to .jpg.bytes. The last part is required by the Unity resource-subsystem, to indicate a binary resource.
    2) You need multiple model selectors (and the respective resource folders and models). Please look at this tip, if you need more information: https://rfilkov.com/2015/01/25/kinect-v2-tips-tricks-examples/#t12
     
  23. kosowski


    Joined:
    Jun 19, 2014
    Posts:
    15
    Hi! Is there any way to find out when the sensor data has been updated? If the app runs at a higher framerate than the sensor, there would be frames when the data is not updated, right?
    Thanks.
     
  24. rfilkov


    Joined:
    Sep 6, 2018
    Posts:
    87
    Yes, that's right. There are methods of the KinectManager-component in the scene (i.e. KinectManager.Instance) you could use to check if the frame is still the same or a new one. These are: GetColorFrameTime() for the color frames, GetDepthFrameTime() for the depth frames and GetBodyFrameTimestamp() for the body frames. If the values returned by these methods are different from the current ones, then the respective frames have been updated in the meantime.
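    The polling approach above can be sketched like this (assuming the frame-time methods return comparable numeric timestamps; the exact return types may differ):

    ```csharp
    using UnityEngine;

    // Sketch: detect fresh sensor frames by comparing the frame times each Update().
    public class SensorFrameWatcher : MonoBehaviour
    {
        private long lastDepthFrameTime;
        private long lastBodyFrameTime;

        void Update()
        {
            KinectManager manager = KinectManager.Instance;
            if (manager == null) return;

            long depthTime = manager.GetDepthFrameTime();
            if (depthTime != lastDepthFrameTime)
            {
                lastDepthFrameTime = depthTime;
                // a new depth frame has arrived since the last Unity frame
            }

            long bodyTime = manager.GetBodyFrameTimestamp();
            if (bodyTime != lastBodyFrameTime)
            {
                lastBodyFrameTime = bodyTime;
                // a new body frame has arrived - process the updated joint data here
            }
        }
    }
    ```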
     
  25. shiki_2022


    Joined:
    Aug 21, 2020
    Posts:
    11
    Is there any way to stop character's hand from clipping through body?
     
  26. HnS_Alexa


    Joined:
    Jan 23, 2019
    Posts:
    10
    Please, someone reply to the questions below...

    1. Do they provide any sample projects or .exe before purchasing the product?
    2. Do they provide any demo videos?
    3. Are the bones editable (like importing customized bones)?
    4. I would like to know whether removing part of the bones will affect stability.
     
  27. rfilkov


    Joined:
    Sep 6, 2018
    Posts:
    87
    Replied by e-mail.
     
  28. Realgar


    Joined:
    Sep 30, 2012
    Posts:
    52
    Hi everybody, I must have missed something. I installed the Kinect SDK v2 and made the examples work great in Unity 2019, but when I make a build in Windows, it does not light up the Kinect and doesn't track anything... Is there something to do with the build? It's certainly something silly... Thanks for your help!
     
  29. Realgar


    Joined:
    Sep 30, 2012
    Posts:
    52
    Well, I found the solution in the FAQ of the well-documented blog. I'm sorry. I just had to import the MS SDK packages before building anything... I'm a newbie with this library, but the asset is wonderful!!!
     
  30. chiyuwang2004


    Joined:
    Aug 6, 2020
    Posts:
    2
    Hi, I am new to Unity and also new to "Kinect v2 Examples with MS-SDK". I am having an issue: I made an extremely simple scene, with only a ground plane and a character controlled by the Kinect v2 Examples with MS-SDK. Everything works fine if I click the play button, but my small game does not seem to find the Kinect device if I build the whole game and run it as an exe file.

    I am not sure where to look for the cause, please help.

    Added:
    1. I read some comments above, then tried x86 for Architecture, Target Platform Windows, leaving everything else as default. That did not help.
    2. I tried to import the MS Kinect SDK again, which causes a compilation error.
     
    Last edited: Dec 16, 2020
  31. mindfulmx


    Joined:
    Mar 21, 2017
    Posts:
    42
    Hello, I would appreciate it if someone could share some indications or suggestions on how to capture the animation of someone playing the guitar, with the possibility of manipulating an avatar at the same time in Unity. Thanks in advance!
     
  32. rfilkov


    Joined:
    Sep 6, 2018
    Posts:
    87
    Please check in Unity build settings, if the 'Target platform' is set to 'Windows' and the 'Architecture' - to 'x86_64'. If this doesn't fix the issue, find the Player's log-file after you run (and stop) the exe, and look inside for errors and exceptions. Here is where to find the Unity log-files: Unity - Manual: Log Files (unity3d.com)

    If it's too difficult for you to find out what's wrong, feel free to e-mail me the log-file, so I can take a closer look.
     
  33. rfilkov


    Joined:
    Sep 6, 2018
    Posts:
    87
    There is currently no animation capturing scene in the K2-asset, but I plan to add one soon. If you have the K4A-asset, there is a Mocap-Animator scene in it, and it works with Kinect-v2 sensors as well.

    The avatars are controlled by the AvatarController-component. They are not related to animation capturing, so you can do both at the same time (as in the above mentioned scene). To see the AvatarController in action, please look at the demo scenes in 'K2Examples/KinectDemos/AvatarsDemo'-folder.
     
  34. mindfulmx


    Joined:
    Mar 21, 2017
    Posts:
    42
    Hello, and thanks for your answer!
    I have your asset "Kinect v2 Examples with MS-SDK" and a Kinect v2 sensor.
    I will try the demo scenes in the K2Examples/KinectDemos/AvatarsDemo-folder,
    and the animation capturing scene would be great!
    Alan
     
  35. MagaSganga


    Joined:
    Nov 5, 2020
    Posts:
    3
    Hi Roumen! Is this package written in C#? I'm developing a project in Unity with Kinect v2 and Visual Studio in C#, and I need to use an avatar. As I read, this asset is good for that. What I could not find is what language it is written in.
    Thanks!
     
  36. MagaSganga


    Joined:
    Nov 5, 2020
    Posts:
    3
    Hi Roumen! I already bought your asset, but while working on how to implement an avatar in my project, it seems that your website is not working. I've been trying to access it for 2 days now. It says it's taking too long...
     
  37. rfilkov


    Joined:
    Sep 6, 2018
    Posts:
    87
    Thank you! I'll contact the cloud admins, to check the connectivity issue. In the meanwhile, here is a link to the pdf-variant of the online documentation.
     
  38. MagaSganga


    Joined:
    Nov 5, 2020
    Posts:
    3
    Thank you. It doesn't seem to be working yet, but the pdf was very helpful.
     
  39. rfilkov


    Joined:
    Sep 6, 2018
    Posts:
    87
    It works, but (as far as I know) the firewall has limited the access to the server from some parts of the world, because of many DoS attacks coming from there. Please e-mail me your IP address, if you like, and then I can tell the admins to check again your specific case.
     
  40. mindfulmx


    Joined:
    Mar 21, 2017
    Posts:
    42
    Hello,
    Same problem here, I can't connect to your site (from Mexico).
     
  41. rfilkov


    Joined:
    Sep 6, 2018
    Posts:
    87
    Again, please e-mail me your IP address, so I can give it to the server admins to check. In the meanwhile, you can use the PDF-help instead.
     
  42. asdfauger


    Joined:
    Jan 4, 2021
    Posts:
    1
    Hi, I want to ask about using my own gestures.
    I have defined only two gestures, and I want to detect when the user performs neither of these two gestures, or does not make any movement at all.
    I tried to use "gesture == null" and also a plain "else", but neither worked.
    I need this check to immediately recognize whether the current operation is correct, in order to implement my follow-up functionality.
    How can I express it correctly?


    Code (CSharp):

    public bool GestureCompleted(long userId, int userIndex, string gesture, float confidence)
    {
        if (gesture == "try_Right")
        {
            Right = true;
            Left = false;
            uiText.text = "Right";
        }
        else if (gesture == "try_Left")
        {
            Left = true;
            Right = false;
            uiText.text = "Left";
        }
        else if (gesture == null)
        {
            Left = false;
            Right = false;
            uiText.text = "false";
        }
        return true;
    }
    Sorry for the newbie question.
     
  43. rfilkov


    Joined:
    Sep 6, 2018
    Posts:
    87
    OK, let me explain. The GestureCompleted()-method is called only when one of the currently detected gesture types is completed (finished) by the user. For instance: let's say the gesture detection system is instructed to detect the SwipeLeft and SwipeRight gestures (in the UserDetected()-method of the gesture listener). Then the user performs a SwipeLeft gesture. After the gesture gets detected, GestureCompleted() is called with the 'gesture'-parameter = SwipeLeft. The method is not called when no gesture (or an undefined gesture) is detected; that's why your else-part never executes.

    If you explicitly want to know when no gesture is detected (which would be in 99% of time), you can add one more method to the GestureListenerInterface definition /e.g. GestureNotDetected()/ and call it at the end of the KinectManager's Update()-method, in case GestureCompleted() was not called in the for-cycle before that.
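    If modifying the KinectManager is not an option, a simpler (if cruder) workaround is a timeout in the listener itself: treat "no completed gesture for a while" as "no gesture". This sketch reuses the GestureCompleted() signature from the code posted above; the real gesture listener interface has additional methods that are omitted here:

    ```csharp
    using UnityEngine;

    // Sketch: detect the "no gesture" state via a timeout, without touching KinectManager.
    public class GestureTimeoutListener : MonoBehaviour
    {
        public float noGestureTimeout = 2f;  // seconds without any completed gesture
        public bool Left, Right;
        private float lastGestureTime;

        public bool GestureCompleted(long userId, int userIndex, string gesture, float confidence)
        {
            lastGestureTime = Time.time;
            Right = (gesture == "try_Right");
            Left  = (gesture == "try_Left");
            return true;
        }

        void Update()
        {
            // no gesture completed recently => consider the current state "no gesture"
            if (Time.time - lastGestureTime > noGestureTimeout)
            {
                Left = false;
                Right = false;
            }
        }
    }
    ```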
     
  44. jdstudiokr


    Joined:
    Mar 30, 2021
    Posts:
    1
    Hello guys...

    I'm new to Kinect v2 and I'm struggling with this project.

    1. I need to know when the player's foot touches the floor, and its location. I tried to get them from colliders on the floor and the foot joint... but the collider keeps firing CollisionEnter events... I don't know how to get a single touch event, as in the real world.

    2. I need to detect which of the player's feet touches the ground... is that possible? I know the KinectManager distinguishes the player index, but I don't know how to get it...

    Thanks for reading, I really hope to receive an answer...
     
  45. rfilkov


    Joined:
    Sep 6, 2018
    Posts:
    87
    Hi, to your questions:
    1. If you correctly set the 'Sensor height' & 'Sensor angle'-settings of the KinectManager-component in the scene, then when the user's foot is on the ground (and it's visible to the camera and correctly detected), it should have a joint position with a Y-component around 0, because 'Sensor height' is the distance from the sensor to the ground.

    2. To get the foot's position, please look at the tip here: Kinect v2 Tips, Tricks and Examples | RF Solutions - Technology, Health and More (rfilkov.com) Instead of 'KinectInterop.JointType.HandRight' you can use 'KinectInterop.JointType.FootLeft' or 'KinectInterop.JointType.FootRight', and instead of 'manager.GetPrimaryUserID()' you can use 'manager.GetUserIdByIndex(userIndex)' to utilize the player indices.
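    Combining both answers, a detector might look like this (a sketch only: the joint-method usage follows the tip linked above, IsUserDetected()/IsJointTracked() are assumed helpers, and the ground threshold is a placeholder):

    ```csharp
    using UnityEngine;

    // Sketch: treat a foot joint with a Y-position near 0 as "touching the ground".
    public class FootOnGroundDetector : MonoBehaviour
    {
        public int userIndex = 0;             // which tracked player to check
        public float groundThreshold = 0.05f; // meters of tolerance above the ground

        void Update()
        {
            KinectManager manager = KinectManager.Instance;
            if (manager == null || !manager.IsUserDetected()) return;

            long userId = manager.GetUserIdByIndex(userIndex);

            int footLeft = (int)KinectInterop.JointType.FootLeft;
            if (manager.IsJointTracked(userId, footLeft))
            {
                Vector3 pos = manager.GetJointPosition(userId, footLeft);
                bool leftFootOnGround = pos.y < groundThreshold;
                // fire a one-shot event only when leftFootOnGround goes false -> true,
                // to avoid the repeated-collision problem described above
            }
        }
    }
    ```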
     
  46. pkgingo


    Joined:
    Apr 3, 2021
    Posts:
    1
    Hi, the later versions of Unity (I am using 2020.3.f1) seem to have done away with the Store target and replaced it with either a UWP target or an Xbox target. Following the directions here https://rfilkov.com/2015/01/25/kinect-v2-tips-tricks-examples/#t33 the sensor does not activate, even when requesting microphone and webcam permissions. What changes need to be made to use a Kinect v2 with these versions?
     
  47. rfilkov


    Joined:
    Sep 6, 2018
    Posts:
    87
    Sorry, I don't have any more stamina, time and desire to fight with UWP :(
     
  48. DoPie


    Joined:
    Jun 20, 2013
    Posts:
    64
    The RGBA view is not working in the Android build. I was using a Nuitrack activation key to enable the USB Orbbec driver.
     
  49. LuggiK


    Joined:
    Oct 23, 2018
    Posts:
    2
    Hi,
    I'm trying to get the Kinect asset working across multiple scenes. I have a simple script which lets me switch between scenes, but as soon as I switch, the player can't be controlled via the Kinect anymore. What would I need to do?
     
  50. rfilkov


    Joined:
    Sep 6, 2018
    Posts:
    87
    Please e-mail me a bit more detailed description of your issue.