
Kinect v2 with MS-SDK

Discussion in 'Assets and Asset Store' started by roumenf, Aug 1, 2014.

  1. jeffcrouse

    jeffcrouse

    Joined:
    Apr 30, 2010
    Posts:
    18
    Is anyone else having problems using Kinect Studio? I have recorded some footage of all of the data streams and am playing it on loop, but when I open the demo scenes in Unity, nothing happens. I have unplugged my Kinect 2 just to make sure there is no confusion which interface it should be using. Is there some setting I am missing? Thanks!

    Edit: I am using version 2.19.2 of "Kinect v2 with MS-SDK" package, with Kinect Studio version 2.0.1410.19000 with a Kinect v2 and Windows 10
     
    Last edited: Aug 29, 2019
  2. asd_seisuke

    asd_seisuke

    Joined:
    Mar 28, 2019
    Posts:
    3
    Hello again, I want to ask about the fitting room once more. When I build it for PC there are no errors, but when I run the build the 3D clothing is not positioned on my body; it appears on the left side of the screen instead. When I play it in Unity 2018.4.5f1 it works fine. What could be wrong?
     
    Last edited: Sep 5, 2019
  3. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Answered per e-mail, I think.
     
  4. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    You need to run the demo scene, press the Connect button in Kinect Studio and then play the recording. I hope you have not forgotten to "connect".
     
  5. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Please e-mail me and attach some screenshots or photos, if possible. Please also mention your invoice number in the e-mail.
     
  6. Ky0jr

    Ky0jr

    Joined:
    Mar 22, 2019
    Posts:
    1
    Hello, are the FaceShapeAnimations defined by the Kinect, or can I add my own FaceShapeAnimations?
     
  7. VironITKalatski

    VironITKalatski

    Joined:
    Aug 13, 2019
    Posts:
    2
    Good day everyone. I'd like to ask about the fitting-room demo, where we are not using avatars but instead fitting clothing on ourselves. The question is: I'd like to run the fitting room in 9:16 resolution, but when I turn off the UserBlender script, my 9:16 view becomes too compressed, just like this. I need some advice on how to use portrait mode with the UserBlender script turned off. Please help. And another question: why are there a lot of "broken pixels" on the clothes when the UserBlender script is enabled, while without UserBlender it's totally fine?

    upload_2019-9-17_19-23-53.png upload_2019-9-17_19-25-16.png
     
  8. VironITKalatski

    VironITKalatski

    Joined:
    Aug 13, 2019
    Posts:
    2
    I've found the solution: changing the canvas parameters in BackgroundImage1 to world scale and setting my own necessary sizes. Thank you for this asset!
     
    roumenf likes this.
  9. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    The FaceShapeAnimation values come from the face-tracking subsystem of Kinect SDK.
     
  10. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    You could enable the PortraitBackground-component of the background and foreground images. See this tip: https://rfilkov.com/2015/01/25/kinect-v2-tips-tricks-examples/#t19 But as I understand, you managed to solve the issue by yourself. Congrats and thank you, too!
     
  11. ZeFirestarter

    ZeFirestarter

    Joined:
    Oct 24, 2014
    Posts:
    17
    Hi all.
    I have run into a problem and I can't figure out how to solve this.
    I need to generate a Texture based on the depth sensor data and in grayscale (black -> white).

    Luckily there is an example that does this.
    I just had to change the color in the DepthShader from yellow to gray.

    The problem is the following.
    I need a fixed distance for the 0% (white) and 100% (black) values.
    Let's say I want to put the minimum value at 500 (a ushort depth value, = 0.5 meters) and the maximum value at 2000 (= 2 meters).
    500 is then maximum white and 2000 is maximum black.

    The problem with the demo is that the max and min values are relative to where the player is.
    As an example

    DefaultValues.jpg
    The furthest back is my face and is the darkest, but when I put my hand even further back my face turns whiter because my hand is the new "maximum" black.

    HandBehindCut.jpg

    So it should not be relative to the player, but constant max/min values.
    Also I need the information all the time, not only when a Player is recognized, but that is secondary.

    I tried using the RawUserDepth in the ComputeUserMap selection, but it is not giving me anything.

    Sorry, but I am a total noob with everything Kinect.
    I really like the Plug-in, thank you for all the hard work.

    Just in case my KinectManager options:
    KinectManager.jpg
     
    Last edited: Sep 26, 2019
  12. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi,

    If you don't need the player, but the whole environment, then (in the DepthShader again) comment out the 'if (playerIndex != 255)'-part and uncomment its else-part (without the else-statement itself, of course). And finally, instead of dividing by 5000, divide by 2000 (your maximum). I hope this will do what you need.
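    For reference, the fixed-range mapping discussed here can be sketched outside the shader as well. This is only an illustrative C# helper (the names `minDepth`/`maxDepth` are assumptions, not part of the asset); the actual change belongs in DepthShader as described above:

    ```csharp
    // Illustrative sketch only: maps a raw ushort depth value (millimeters)
    // to a grayscale intensity using a fixed range, instead of the per-frame
    // min/max the demo uses. 'minDepth' and 'maxDepth' are hypothetical names.
    public static class DepthGray
    {
        public static float ToGray(ushort depthMm, ushort minDepth = 500, ushort maxDepth = 2000)
        {
            if (depthMm == 0) return 0f;                      // no data -> black
            float t = (depthMm - minDepth) / (float)(maxDepth - minDepth);
            t = t < 0f ? 0f : (t > 1f ? 1f : t);              // clamp to [0, 1]
            return 1f - t;                                    // near = white, far = black
        }
    }
    ```

    With this mapping, 500 mm always yields pure white and 2000 mm pure black, independent of where the player stands.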
     
    ZeFirestarter likes this.
  13. ZeFirestarter

    ZeFirestarter

    Joined:
    Oct 24, 2014
    Posts:
    17
    Perfect it helped a lot. Thank you very much. Really good work with the plug-in :)
     
  14. vzheng

    vzheng

    Joined:
    Dec 4, 2012
    Posts:
    45
    Hi, I want to show the depth image with a border like this. Is there an easy way? Many thanks.

    upload_2019-10-5_0-28-23.png
     
  15. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi, there is similar functionality in the BackgroundRemovalManager-component (see the KinectBackgroundRemoval1-demo scene in the KinectDemos/BackgroundRemovalDemo-folder), when 'Dilate iterations' is greater than 'Erode iterations'. The difference between them determines the contour width, and 'Body contour color' sets the color of the contour. I'm not sure though if it fits your use case. Please check it out.
     
  16. danidiazr

    danidiazr

    Joined:
    Dec 27, 2016
    Posts:
    24
    Hey!

    It's my first time using this asset. I just imported it and I got some compile errors.

    What can I do? I'm using Unity 2019.2 and I have all the drivers for Kinect v2 installed.

    Thanks!
     

    Attached Files:

  17. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi, I just tried to import the K2-asset into new Unity 2019.2.0f1 project and there were no compilation errors. As far as I see, the errors say there are duplications of Kinect SDK scripts in your project. In this regard, please create a NEW, EMPTY Unity project and import the K2-asset in there, then try it again. You don't need to import anything else.
     
  18. RasKrishna

    RasKrishna

    Joined:
    Oct 9, 2019
    Posts:
    2
    Help! My project works, but when I build it the Kinect doesn't turn on.
    This is my second time posting and getting no reply.
    What's going on? I'm just trying to figure out why my project works in the editor but not in the build.
     
    Last edited: Oct 10, 2019
  19. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi, sorry if I missed your previous post!
    If your project works in the Editor but not in the build, this may be because the folder K2Examples/Resources is missing in your project, or if you build for Architecture 'x86' (i.e. 32 bit) instead of 'x86_64' (64 bit).
     
  20. RasKrishna

    RasKrishna

    Joined:
    Oct 9, 2019
    Posts:
    2
    Thanks, Roumenf. I switched to x86_64 and it worked.
    Thanks for the help and for a great and fun tool.
     
    roumenf likes this.
  21. lisasims

    lisasims

    Joined:
    May 28, 2013
    Posts:
    21
    Hello,

    I'm currently having a weird problem... I'm using Kinect v2 and everything is working fine but then when I quit unity and restart, Kinect fails to start. The scene is playing but Kinect doesn't turn on with error message :

    Failed trying Kinect2Interface (p0,r:False)
    UnityEngine.Debug:Log(Object)

    System.DllNotFoundException: KinectUnityAddin

    Failed trying OrbbecAstraInterface (p1,r:False)
    UnityEngine.Debug:Log(Object)

    And it just goes straight to opening a dummy (meaning the white light on the Kinect never turns on). This keeps occurring, so I have to start a new project, re-import the package I exported when the project was working, and restart, which means all my edits are lost and I keep starting over, not to mention the time it takes to export and import packages...

    Please help!!!! My project is due in 2 days!
     
    Last edited: Oct 22, 2019
  22. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hm, this sounds really odd. What happens between Unity editor restarts? Are the native libraries still there? I have never heard of any similar issue before.

    For a workaround, you can try to apply this alternative approach: https://rfilkov.com/2015/01/25/kinect-v2-tips-tricks-examples/#t32 After this, all native libraries will be embedded in the project.
     
  23. lisasims

    lisasims

    Joined:
    May 28, 2013
    Posts:
    21
    Hi thanks for the response! Ok, I'll try it and see what happens. Thanks again~
     
  24. lisasims

    lisasims

    Joined:
    May 28, 2013
    Posts:
    21
    OMG it works!!! Thank you so much!!!! Weird that I'm the only one who experienced this. Hmm, I'm still learning, so maybe I did something wrong. But if the same problem arises, I guess I'll just repeat the process of deleting the DLLs and whatnot.
     
  25. technorocker

    technorocker

    Joined:
    Jun 7, 2018
    Posts:
    10
    First off, hello and thank you for this asset. I recently started playing with the Kinect v2 and am able to move my avatar around the map by enabling 'external root motion' (took a bit to figure that out). Now I have a data glove that I made with an Arduino. The glove works with its own model to open and close fingers, and also applies the glove rotations in a prefab, so all the information about the glove is accessible.

    How do I feed my glove orientation data into the AvatarController script, so that my glove coordinates are used by the avatar's hand? I do see the 'External hand rotations' option in the AvatarController script in the inspector, but I don't know where to "insert" the hand data.

    Any help is most appreciated
     
  26. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi, if you enable the 'External hand rotations'-setting, the avatar's wrists, hands and fingers will not be controlled any more by the AvatarController (i.e. by the Kinect body tracking). This means you can control them with your own script. In your specific case this may be the same script that controls the prefab you mentioned above. There is no need to fuse anything into the avatar controller.
     
  27. technorocker

    technorocker

    Joined:
    Jun 7, 2018
    Posts:
    10
    Hello and thank you for the quick response. I guess my better question is how to make my glove coordinates feed into the avatar model.

    Right now my glove has a hand model it feeds data to. I have that model sitting at my avatar's hand position. I'd like to get rid of my hand model and feed the rotations to the avatar's hand instead. I guess I should just feed the rotation data right to the avatar's child hand object itself?
     
  28. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    I think you should only apply the rotations detected by the glove to the avatar's finger joints. If you need an example how to do that, open AvatarController.cs, find TransformSpecialBoneFingers()-method and look at the code near the end of the method. Unfortunately, I can't help you much more than this. You need to look at the glove's documentation and/or the provided sample scripts, and then experiment a bit.
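    A minimal sketch of that idea (every name except AvatarController is an assumption for illustration; the real joint lookup lives in TransformSpecialBoneFingers(), as mentioned above):

    ```csharp
    using UnityEngine;

    // Illustrative sketch: apply externally measured finger rotations to an
    // avatar's finger bones, after enabling 'External hand rotations' on the
    // AvatarController. 'fingerBones' and 'gloveRotations' are hypothetical;
    // in practice you would map each glove sensor to the matching bone Transform.
    public class GloveToAvatar : MonoBehaviour
    {
        public Transform[] fingerBones;       // avatar finger joints, assigned in the Inspector
        public Quaternion[] gloveRotations;   // filled each frame by your glove/Arduino reader

        void LateUpdate()   // run after AvatarController has posed the body
        {
            for (int i = 0; i < fingerBones.Length && i < gloveRotations.Length; i++)
            {
                fingerBones[i].localRotation = gloveRotations[i];
            }
        }
    }
    ```

    Using LateUpdate() helps ensure the glove rotations are applied after the Kinect body tracking has posed the rest of the skeleton for the frame.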
     
  29. technorocker

    technorocker

    Joined:
    Jun 7, 2018
    Posts:
    10
    Yes, I got the integration figured out. I made the data glove myself, so there is no documentation for it. I was getting raw Euler angles sent from the Arduino into Unity, and now have it working. Thank you for your response.

    On a side note, as I was looking over the code and features, I saw a mention of Leap Motion, which I also own. Have you played around with one for hand tracking at all? I am building an AR headset with the Leap Motion mounted on it, and haven't attempted to fuse its data with the Kinect. I'm not sure if I'll even need it, now that I have glove data for orientation plus tracking from the Kinect. My initial problem, before I got the Kinect, was finding a solution to track the hands when they left the Leap's field of view. If it's a somewhat simple fusion, I might try to incorporate it on top of the glove; the glove would basically be for grabbing and holding objects outside the Leap's view, when it stops tracking the hands.

    Anyways thank you for your help and great asset
     
  30. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Yes, I have added LeapMotion hand tracking to the Kinect body tracking for this asset, some years ago: https://rfilkov.com/2015/12/26/kinect-v2-mocap-animator/

    Once you have the correct hand & finger data from the glove(s), I don't think you would need LeapMotion hand tracking any more. It's OK, but your hands need to be in LM field of view. And this is not always possible, as you mentioned.
     
  31. pinklover91

    pinklover91

    Joined:
    Apr 10, 2014
    Posts:
    3
    Hello, thank you for providing a great asset with the best value. ^^

    I'm currently working on a project where I'm using InteractionManager to control the position of the cursor.
    Is there any way to control the speed of the cursor? I'd like to have to move my hand farther to reach the top left of the screen, because currently it's so sensitive that a small hand movement moves the cursor a lot.
    Thank you :)
     
  32. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi, as far as I see, the cursor sensitivity is determined by two methods of KinectManager.cs - GetLeftHandInteractionBox() & GetRightHandInteractionBox(). You could extend these boxes in X or Y direction a bit, to adjust how far the user should reach, to get to the screen edges.
     
  33. VTOLEE

    VTOLEE

    Joined:
    Oct 13, 2014
    Posts:
    4
    Hi, can I use the Nuitrack interface with a Kinect v2 device? Currently it always just enables the Kinect2 interface.
     
  34. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    On Windows with Kinect-v2 you should use Kinect2Interface. It is also better than the NuitrackInterface. On the other platforms NuitrackInterface is usually the only option.
     
  35. gra1n

    gra1n

    Joined:
    May 19, 2016
    Posts:
    1
    Hi Rumen, thanks so much for this asset. I'm a master's student from London creating a VR project using a Kinect v2.

    I am looking at your face example which takes the colour texture from the RGB and applies it to a generated face mesh. I'd like to create the same effect but instead take the texture for the entire figure and project this onto a simple Avatar once, so that the user has a personalised avatar. Is this possible to do, and if so could you point me in the right direction?

    I have some programming experience but have only been using unity for a couple of months.

    Thanks again.
     
  36. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi, please look at the KinectUserVisualizer-demo scene in KinectDemos/VisualizerDemo-folder. I suppose this is what you are looking for.
     
  37. LucyBW

    LucyBW

    Joined:
    Jan 11, 2016
    Posts:
    5
    Hi Roumen
    I've been using this asset with several past versions of Unity, Unity 2017 and 2018.
    And VERY much appreciate what you provide with it! Thank you so much.

    I am trying to use it now with 2019.2.5 - using the builtin renderer (not HDRP or URP).
    The background removal demo (#1) works in the editor, but when I build and run the project, it displays a message saying "Background removal not supported".

    Also, Avatar Demo #1 runs fine in the editor.
    But when I build and run, the scene loads and looks fine (with the two avatars), but the Kinect lights don't go on. It seems Kinect is not getting initialized. Is there anything I need to do to set up the project properly for the build? Do I have to include the Kinect libraries somewhere?
     
  38. LucyBW

    LucyBW

    Joined:
    Jan 11, 2016
    Posts:
    5
    Do you have any plans to make this tech compatible with HDRP or URP?
    In particular, it would be SO awesome if you could provide a shader graph that does the background removal!
     
  39. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Please make sure that:
    1. The Resources-folder is included in your project, and 'KinectV2UnityAddin.x64.zip.bytes'-file is in this folder.
    2. The Architecture in 'Build settings' is set to 'x86_64'.

    Regarding HDRP & URP: I'm not an expert in the scriptable render pipelines yet, but plan to learn more this year.
     
    LucyBW likes this.
  40. LucyBW

    LucyBW

    Joined:
    Jan 11, 2016
    Posts:
    5
    Thank you so much for the quick response! The build problem was the architecture (set to x86 rather than x86_64) :-(
    Everything works great now!
     
    roumenf likes this.
  41. LucyBW

    LucyBW

    Joined:
    Jan 11, 2016
    Posts:
    5
    Very interested to see how/what you explore with the scriptable render pipelines!
    Achieving Background Removal using a shader graph would be most welcome :)
     
    roumenf likes this.
  42. spelafort

    spelafort

    Joined:
    May 8, 2017
    Posts:
    37
    Hi, I've been using this package for years and wanted first to thank you for it. It has helped me immensely.

    I have a question. I'm trying to use the kinect v2 for something very simple. All I need to do is track the user's z-depth and map this onto camera position so that you can manually 'zoom' by walking.

    I'm running into a problem, however. When the KinectManager finds the user, the avatar's z-position in the world at first seems to jump in a somewhat random way. After that it stays fairly consistent, but this initial jump makes it quite difficult to have an accurate min/max range to map onto the camera. I've tried:

    - re-parenting the avatars (did nothing)
    - clamping everything to the values I need; this works, but it often results in not being able to fully zoom in/out
    - using a rigidbody with all constraints checked until after the avatar is found (did nothing; the avatar doesn't seem to be affected by it)
    - manually calibrating the min/max each time in my script; this works, but it's not tenable to ask this of a user each time a new avatar is found

    any ideas of how to work around this?
     
  43. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi, why don't you use the Z-part of 'Vector3 userPos = KinectManager.Instance.GetUserPosition(userId);' directly? If needed, you could smooth it then with Mathf.Lerp(). I'm not sure though how this Z-position would determine the min/max range you mention. Please provide some more details on this issue.
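    A sketch of that suggestion (the distance range, FOV range and lerp factor are assumed tuning values; `GetUserPosition()` is the asset API mentioned above, so verify against your version of KinectManager.cs):

    ```csharp
    using UnityEngine;

    // Illustrative sketch: drive a camera "zoom" from the user's smoothed
    // Z-distance, as suggested above. 'minZ'/'maxZ' and the FOV limits are
    // hypothetical tuning values; adjust for your room and camera.
    public class WalkZoom : MonoBehaviour
    {
        public Camera cam;
        public float minZ = 1.0f, maxZ = 4.0f;      // expected user distance range, meters
        public float minFov = 30f, maxFov = 60f;
        float smoothedZ;

        void Update()
        {
            KinectManager km = KinectManager.Instance;
            if (km == null || !km.IsUserDetected()) return;

            long userId = km.GetPrimaryUserID();
            float z = km.GetUserPosition(userId).z;
            smoothedZ = Mathf.Lerp(smoothedZ, z, 5f * Time.deltaTime);  // damps the initial jump

            float t = Mathf.InverseLerp(minZ, maxZ, smoothedZ);
            cam.fieldOfView = Mathf.Lerp(minFov, maxFov, t);
        }
    }
    ```

    The Lerp-based smoothing also softens the initial jump the question describes, since the camera eases toward the detected distance instead of snapping to it.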
     
  44. spelafort

    spelafort

    Joined:
    May 8, 2017
    Posts:
    37
    Thank you! Somehow it hadn't occurred to me that that value would be different. Working fine now.
     
    roumenf likes this.
  45. adelgeit

    adelgeit

    Joined:
    Jun 27, 2019
    Posts:
    2
    Hello, I've got a problem with switching between scenes. I have one scene with the menu settings and the actual game with Kinect in the other. The first time I switch from the menu to the game it's fine, but after repeating the operation a second time, Kinect doesn't detect the user anymore.
    I will take a look at using Kinect across multiple scenes, but my case seems to be a bit different, since I have only one scene with a KinectManager.
     
  46. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Please look at 'Howto-Use-KinectManager-Across-Multiple-Scenes.pdf' in the _Readme-folder of the K2-asset. This is demonstrated by the demo scenes in the KinectDemos/MultiSceneDemo-folder, as well. If you try these scenes, don't forget to add them (in the same order) to the 'Scenes in build'-list in the project's Build settings.
     
  47. adelgeit

    adelgeit

    Joined:
    Jun 27, 2019
    Posts:
    2
    Fixed it. Thank you!
     
  48. iesswl

    iesswl

    Joined:
    Aug 29, 2019
    Posts:
    4
    I need help. I am doing a perspective projection with Kinect v2. I use this script (Headtrack.cs) and it works. I have multiple scenes, and I want users to swipe left while the tracking is still tracking them. I followed the multiple-scenes examples: I selected WhenUserIsDetected and selected Swipe Left. However, this doesn't work properly; it kind of overlaps with the tracking function. As implemented, I need to swipe first, and only then does it track and change the scene at the same time. I tried to fix it by creating a new game object and placing the scene-change script there, but it yielded the same result. I even modified the scripts and merged them into one, with the same result. Any help will be appreciated.
     

    Attached Files:

  49. BennyTan

    BennyTan

    Joined:
    Jan 23, 2014
    Posts:
    141
    Hi, could I check: if I want to use the gesture detection script, e.g. KinectGestures.Gestures.SwipeLeft,
    in a multiplayer game, how do I check which player performed the swipe (player 1 or 2)?
     
  50. iesswl

    iesswl

    Joined:
    Aug 29, 2019
    Posts:
    4
    By tracking their Body ID.
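    For instance, the gesture callbacks receive both the user ID and the user index, so a listener can filter per player. A minimal sketch, assuming the K2-asset's GestureListenerInterface signatures (verify against KinectGestures.cs in your version):

    ```csharp
    using UnityEngine;

    // Illustrative sketch: react to SwipeLeft only for a specific player,
    // using the userIndex passed to the gesture callbacks. The interface
    // signatures below are assumed from the K2-asset; check your copy.
    public class PlayerSwipeListener : MonoBehaviour, KinectGestures.GestureListenerInterface
    {
        public int playerIndex = 0;   // 0 = player 1, 1 = player 2

        public void UserDetected(long userId, int userIndex)
        {
            if (userIndex == playerIndex)
                KinectManager.Instance.DetectGesture(userId, KinectGestures.Gestures.SwipeLeft);
        }

        public void UserLost(long userId, int userIndex) { }

        public void GestureInProgress(long userId, int userIndex,
            KinectGestures.Gestures gesture, float progress,
            KinectInterop.JointType joint, Vector3 screenPos) { }

        public bool GestureCompleted(long userId, int userIndex,
            KinectGestures.Gestures gesture, KinectInterop.JointType joint, Vector3 screenPos)
        {
            if (userIndex == playerIndex && gesture == KinectGestures.Gestures.SwipeLeft)
                Debug.Log("Player " + (playerIndex + 1) + " swiped left");
            return true;   // reset the gesture so it can be detected again
        }

        public bool GestureCancelled(long userId, int userIndex,
            KinectGestures.Gestures gesture, KinectInterop.JointType joint)
        {
            return true;
        }
    }
    ```

    Attaching one such listener per player index keeps the two players' gestures cleanly separated.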