
Kinect v2 with MS-SDK

Discussion in 'Assets and Asset Store' started by roumenf, Aug 1, 2014.

  1. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi there, if you have the K2-package, you can use the KinectRecorderDemo-scene to record your movements. Then copy the saved file and replay it in the K2-animator, just by enabling the KinectRecorderPlayer-component of the KinectController-game object in the scene (and optionally changing the file-path setting). If you insist on XML files, you would need to create your own XML-player script (similar to the player in KinectRecorderPlayer.cs), which converts the XML frame data to text lines like those written by the recorder and feeds them to the respective KM function. Please e-mail me, if you need the detailed format of the text lines. Don't forget to mention your invoice number, too.
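
    For reference, here is a minimal sketch of switching the component to playback from a script. The 'filePath'-field and 'StartPlaying()'-method names are assumptions here - check the KinectRecorderPlayer.cs source for the actual member names in your version:

        using UnityEngine;

        public class PlayRecordingOnStart : MonoBehaviour
        {
            void Start()
            {
                // a sketch, not the definitive API - verify the member names in KinectRecorderPlayer.cs
                KinectRecorderPlayer player = FindObjectOfType<KinectRecorderPlayer>();
                if (player != null)
                {
                    player.enabled = true;                 // enable the (normally disabled) component
                    player.filePath = "BodyRecording.txt"; // assumed field: the file-path setting
                    player.StartPlaying();                 // assumed method: start the playback
                }
            }
        }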
     
  2. vivalavida

    vivalavida

    Joined:
    Feb 26, 2014
    Posts:
    85
    Hi @DarknessPhoenix,
    I've put up my scene and my scripts here: https://github.com/joshferns/KinectHolographic
    You will first have to import the Kinect v2 with MS-SDK asset into your project before you copy over the files from the repo.

    There are just two main scripts that you will need, ProjectionMatrix.cs and Headtrack.cs; there is also a demo scene for reference.
     
    DarknessPhoenix and roumenf like this.
  3. DarknessPhoenix

    DarknessPhoenix

    Joined:
    Aug 17, 2016
    Posts:
    3
    @vivalavida

    Thank you very much!! I tested it and it works! :))
     
    vivalavida likes this.
  4. Yang_Sir

    Yang_Sir

    Joined:
    Sep 19, 2015
    Posts:
    6
    Hi, when I run the AvatarsDemo in my Unity, the result is "OpenDefaultSensor failed". What is wrong?
     
  5. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    You probably forgot to install Kinect SDK 2.0. See 'How to run...'- and 'Downloads'-sections here: https://rfilkov.com/2014/08/01/kinect-v2-with-ms-sdk/
     
  6. kenny-lin

    kenny-lin

    Joined:
    Jul 9, 2013
    Posts:
    1
    Does the facial tracking of this package work well in low light, and with people with dark skin, glasses, shades, or a beard?
     
  7. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    This package uses the standard Kinect-v2 face tracking. It worked pretty well when I tested it a long time ago (better than OpenCV face tracking, for instance). Anyway, you can use the free 'Face Basics' and 'HD Face Basics' samples that come with the Kinect SDK 2.0, to test how the face tracking works under the conditions you need.
     
  8. cel

    cel

    Joined:
    Feb 15, 2011
    Posts:
    46
    A quick question: does it work on the Xbox One as a UWP app?
     
  9. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
  10. cel

    cel

    Joined:
    Feb 15, 2011
    Posts:
    46
    Does that mean not yet, but it will soon?
     
  11. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    It will not be soon, unless I missed something in the meantime. Microsoft should first provide a replacement for their Kinect UWP libraries for Windows 10. This could take days, months or years, or never happen.
     
  12. agramma

    agramma

    Joined:
    Jun 20, 2011
    Posts:
    13
    Hello!

    I am trying to import the Kinect recording demo scene into an existing project, but when I try to play the project after the import is done, I get the error:

    Assets/Standard Assets/Microsoft/Kinect/VisualGestureBuilder/VisualGestureBuilderDatabase.cs(9,88): error CS0246: The type or namespace name `Helper' could not be found. Are you missing a using directive or an assembly reference?

    Do I need to add a reference or something?

    Any help would be much appreciated :)
     
  13. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    In order to use the K2-asset functionality in your Unity project, apart from the needed scene(s) and their specific scripts, materials and other assets, you need to copy the KinectScripts-, 'Standard Assets'- and Resources-folders to your project. More information is available here: https://rfilkov.com/2015/01/25/kinect-v2-tips-tricks-examples/#t2
     
  14. this-play

    this-play

    Joined:
    Sep 8, 2016
    Posts:
    5
    Hi,

    I wrote a script in Visual Studio that does some calculations with Kinect joints.
    But I'm quite the C# noob, so I don't know where to tap into your code.

    My script initializes the Kinect on its own; I guess initializing it twice is a bad idea.
    And I use Kinect.dll, which I can't find in your assets.

    I would appreciate some hints on how to get access to [JointType.Head], for example.

    Thanks :)
     
  15. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi there,

    Unity works with objects and components, and this is the way the K2-asset works, too. Most of the Kinect-related components reside on the KinectController-object. The most important and the only always-needed component is the KinectManager. It opens and closes the sensor and does most of the sensor data processing. It also has a public API that can be invoked from other scripts like this: 'KinectManager.Instance.FunctionToCall();'.

    In this regard, here is a tip on how to get access to a joint position: https://rfilkov.com/2015/01/25/kinect-v2-tips-tricks-examples/#t7 And here is what you need to copy to your project, in order to get the same functionality: https://rfilkov.com/2015/01/25/kinect-v2-tips-tricks-examples/#t2 Please check the other tips on that page, too.

    Also, please mind that some Kinect-related components may reside on other objects. For instance, the AvatarController-component resides on the avatar model's object in the scene (because it controls its transform and its joints' transforms), and not on the KinectController.
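
    To illustrate the tip above, here is a minimal sketch of reading the head-joint position via the KinectManager API. The method names follow the tips page linked above, but please verify them against your K2-asset version:

        using UnityEngine;

        public class HeadJointReader : MonoBehaviour
        {
            void Update()
            {
                KinectManager manager = KinectManager.Instance;
                if (manager == null || !manager.IsInitialized() || !manager.IsUserDetected())
                    return;

                long userId = manager.GetPrimaryUserID();
                int joint = (int)KinectInterop.JointType.Head;

                if (manager.IsJointTracked(userId, joint))
                {
                    // joint position in Kinect coordinates (meters)
                    Vector3 headPos = manager.GetJointPosition(userId, joint);
                    Debug.Log("Head position: " + headPos);
                }
            }
        }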
     
  16. jpine

    jpine

    Joined:
    Feb 8, 2015
    Posts:
    26
    Please forgive the noob question. I want to create a live training application that is accessed over the internet. The end user (the trainee) will wear a Vive. The person at the other end of the connection will sit/stand in front of a Kinect and be represented as an avatar to the trainee. Is your package a possible solution to my needs? Having the avatar mimic facial expressions would be a huge plus. At the very least, the avatar needs to have lip movement. The lip movement need not be phonologically correct.
     
  17. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    This is not a noob, but rather a tough question :) Theoretically this should be possible, although 1. currently the Kinect-data-server does not transfer any face-tracking data (because Kinect face tracking is meaningless when the user wears an HMD), and 2. transferring data over the Internet will be slow and may cause many packets to be dropped, which could worsen the experience at the Vive end. Look at the 1st video here to see how the avatar in VR may look (the data transfer is over LAN there): https://rfilkov.com/2015/08/20/made-with-kinect-v2-asset/

    Regarding the face tracking, you can look at the 4th face-tracking demo scene in the K2-asset, to check how it may look. This would require some additional coding on the server and client side though, in order to transfer the needed FT data, but I'll not go into details right now. You can ask me further about it, if you decide to use these packages.
     
  18. helios

    helios

    Joined:
    Oct 5, 2009
    Posts:
    308
    Hello, I'm trying to do the same thing (except using the same CSV that the recorder exports). I'm wondering if it's possible to play back the recorded data on an AvatarController that is *not* player index 0? Thanks!
     
  19. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    The recorded data should contain all tracked bodies, and the player should play them back with respect to their current player indices. It is up to you to change the player index of the avatar in the scene, or to add another avatar to the scene. I hope I understood your problem correctly.
     
  20. helios

    helios

    Joined:
    Oct 5, 2009
    Posts:
    308
    Hi, thanks for the reply. This is what I'm doing; however, it seems to always play back on player index 0. I can verify that player index 2 is the one being recorded, but it doesn't play back on the corresponding index - always 0. Is there something I'm missing, perhaps? Thanks again.
     
    Last edited: Nov 10, 2016
  21. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Please e-mail me the recorded file and tell me what to expect, so I could try to reproduce the issue you're experiencing. Also, I just noticed this detail: Are you playing the file in KinectRecorderDemo-scene or in the K2-animator scene?
     
  22. Pointcloud

    Pointcloud

    Joined:
    Nov 24, 2014
    Posts:
    37
    I was wondering if there is a more immediate solution for grabbing users through frame differencing with blob tracking, similar to this: tsps.cc

    The issue is that with the background removal example, it sometimes takes a while for the user to be recognized and grabbed, if it happens at all. My preference would be a more immediate blob-tracking solution that compares the current camera frame against a reference frame with nobody in it, so that the blob tracking grabs someone's outline immediately.

    Let me know if this is possible, Thanks.
     
  23. Pathawut_P

    Pathawut_P

    Joined:
    Oct 16, 2015
    Posts:
    3
    Hello,
    I have a problem with the background removal demo on Unity 5.4.2f1.
    It works fine on Unity 5.3.0, but it doesn't show anything on Unity 5.4.2.
    Could you tell me how to fix it?

    Thank you.
     
  24. agramma

    agramma

    Joined:
    Jun 20, 2011
    Posts:
    13
    Hello!

    the recording demo is working great. I would just like to ask if there is an easy way, when you save the joint positions to bodyrecording.txt, to also save the image of each frame. Could you give me some info on how to achieve this?

    Any help is highly appreciated!
     
  25. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    I just checked it with Unity 5.4.3f1 and all background removal demos work as expected here. Please check again with the latest versions - v2.11.1 of the K2-asset & v5.4.2 of the Unity Editor.
     
  26. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi, this may be possible, but it would need additional code in the KinectRecorderPlayer.cs- & KinectManager.cs-scripts. My intention with the body recordings was to avoid saving huge amounts of data, and to save the body motion data only instead. If you need the depth, body-index or color camera data as well, I would recommend using the Kinect Studio 2.0 (part of the Kinect SDK 2.0) recordings. Do mind that, depending on the selected streams, it may produce very large files.
     
  27. Pathawut_P

    Pathawut_P

    Joined:
    Oct 16, 2015
    Posts:
    3
    Thank you for the response.
    Let me check on 5.4.3f1, because we have the problem on 5.4.2f1.
     
  28. this-play

    this-play

    Joined:
    Sep 8, 2016
    Posts:
    5
    Hi,

    I want to calculate the user's height, according to this tutorial.
    They determine whether a joint actually gets tracked.
    Otherwise it just gets interpolated, I guess?
    I see that there is already a condition within your code:

    if(manager.IsJointTracked(userId, (int)joint))

    So does this refer to the SDK's "isTracked" property?
    How can I check which of my joints are tracked?

    Thanks
     
  29. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    manager.IsJointTracked() will do the job. But let me explain a bit. As you noticed, there are 3 possible tracking states for the joint - not-tracked, tracked and inferred. There is also a setting of the KinectManager (component of the KinectController-game object), called 'Ignore inferred joints'. If this setting is enabled, IsJointTracked() will return true when the joint is tracked, and false when the joint is not-tracked or inferred. If the setting is disabled, IsJointTracked() will return true when the joint is tracked or inferred, and false when the joint is not-tracked. That said, the choice is yours. ;)
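
    To check all joints at once, here is a minimal sketch. The GetJointCount()-method is an assumption here - if your version lacks it, iterate over the KinectInterop.JointType values instead:

        using UnityEngine;

        public class JointTrackingLogger : MonoBehaviour
        {
            void Update()
            {
                KinectManager manager = KinectManager.Instance;
                if (manager == null || !manager.IsUserDetected())
                    return;

                long userId = manager.GetPrimaryUserID();

                // log the tracking state of each joint, as reported by IsJointTracked()
                for (int j = 0; j < manager.GetJointCount(); j++)
                {
                    Debug.Log((KinectInterop.JointType)j + " tracked: " +
                              manager.IsJointTracked(userId, j));
                }
            }
        }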
     
  30. Pointcloud

    Pointcloud

    Joined:
    Nov 24, 2014
    Posts:
    37
    Hi, I'm playing around with trying to align the Kinect UserMesh example with the user's head, using an HMD like the Oculus or the Vive. I haven't tried the Vive yet, but have used the Oculus. I feel like I'm getting close; I just need there to be a 1:1 correspondence between the location of the user's head from the Kinect and the location of the HMD. I think this might be easier to do with the Vive; however, I noticed there is also camera smoothing going on in the UserMesh script that will displace the location of the mesh on occasion. Is there a way to disable this smoothing feature, so the UserMesh is always firmly grounded to its location in the world and I can try to align an Oculus or Vive more accurately? Thanks.
     
  31. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi, sorry for the delayed response. To disable the smoothing do as follows:
    1. Open UserMeshVisualiser.cs (it is a component of the UserMesh-object in the scene), and replace this line: 'transform.position = Vector3.Lerp(transform.position, newUserPos, smoothFactor * Time.deltaTime);' with this one: 'transform.position = smoothFactor != 0f ? Vector3.Lerp(transform.position, newUserPos, smoothFactor * Time.deltaTime) : newUserPos;'
    2. Set the 'Smooth factor'-parameter of the component to 0. Then the smoothing will be ignored.
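
    For clarity, here is the replacement line from step 1, formatted as code (this is the exact change described above, nothing more):

        // in UserMeshVisualiser.cs - bypass the Lerp-smoothing when 'Smooth factor' is set to 0
        transform.position = smoothFactor != 0f
            ? Vector3.Lerp(transform.position, newUserPos, smoothFactor * Time.deltaTime)
            : newUserPos;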
     
  32. ferenc

    ferenc

    Joined:
    Nov 2, 2015
    Posts:
    2
    Hi, I really like this stuff! Great work! ;-) I would like to ask for some help.

    I have a main scene (A) where I don't want to put the Kinect-related things. Scene A is in don't-destroy mode, so it's active all the time. There is another scene (B) where the KinectManager is located. I have to load and unload scene (B) many times during the gameplay. After loading (B), the Kinect v2 lights up and most of the time works as it should, but sometimes it doesn't turn on. I can't figure out why this is happening. I've removed the don't-destroy-related define from the KinectManager. Thanks!
     
  33. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    There are two options:
    1. Single KinectManager: Add the KinectManager-component to a (don't destroy) game object in scene A, and delete it from all other scenes. In this case, all scenes will use the KM-instance from scene A. This is demonstrated in the MultiSceneDemo and described in 'Howto-Use-KinectManager-Across-Multiple-Scenes.pdf' in the _Readme-folder.
    2. KinectManager per scene: Uncomment '#define USE_SINGLE_KM_IN_MULTIPLE_SCENES' at the beginning of KinectManager.cs, as shown below. This will instruct it not to make its game object a don't-destroy one. This way the KinectManager may be used multiple times, in the Kinect-related scenes only. This will also turn the sensor on and off, depending on whether a KinectManager is present in the respective scene or not.
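
    For option 2, the change is just this one directive at the top of KinectScripts/KinectManager.cs (the define name is taken from the description above; you only remove the comment markers in front of it):

        // uncommented, to keep one KinectManager per scene instead of a don't-destroy singleton
        #define USE_SINGLE_KM_IN_MULTIPLE_SCENES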
     
  34. KeithT

    KeithT

    Joined:
    Nov 23, 2011
    Posts:
    83
    Just bought this asset and looking through the demos. The main view in FittingRoom1 is upside down whereas in other demos it's the correct way up. Can't see an obvious reason. Any ideas?
     
    roumenf likes this.
  35. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Thank you for the feedback! The issue has already been reported to me, and I'll submit a fixed release to the Unity Asset Store next week. As a workaround in the meanwhile, you could either disable the UserBodyBlender-component of the MainCamera-object in the scene (this will turn off the user body blending), or leave the body blender on and set 'Anti aliasing' to Disabled in the Quality Settings (Edit / Project Settings / Quality). Sorry for the bug anyway! The scene worked fine during my tests, but later I saw that the anti-aliasing in my settings was disabled.
     
  36. alex_evp

    alex_evp

    Joined:
    Jan 29, 2017
    Posts:
    7
    Hi Roumenf,

    Great plugin, thanks for your great work!! You rock :)

    Only one issue so far. I am using it with some Fuse characters, and it's a really nice and easy-to-use system you have created. The only problem is that when the character closes the hands, the thumb rotations look correct locally to the thumb itself, but the overall angle of the thumb is incorrect. It looks like the same rotation is being applied to all the fingers and the thumbs; is that the case?
    If so, can you consider adding a line or two to just angle the thumb so it makes a proper fist? Also, after using some other Kinect-v2 tools, I see that sensing of a finger point is possible; is this something you might consider adding to this toolset? I'm still fairly new to coding and had a look at the code; I found the area relating to this, but didn't want to break things.. :) If not, any tips on how to fix this would be great!

    cheers!

    Alex
     
  37. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Oh, I think I just answered the same question on my blog: https://rfilkov.com/2015/01/25/kinect-v2-tips-tricks-examples/comment-page-3/#comment-3628
     
  38. KeithT

    KeithT

    Joined:
    Nov 23, 2011
    Posts:
    83
    Is there an API/SDK reference anywhere, where you can get an overview of all the classes, methods, etc.? I have looked at the examples and so on, but not found one.
     
  39. KeithT

    KeithT

    Joined:
    Nov 23, 2011
    Posts:
    83
    I'm trying to get an image of the user with the background removed, so I have added the photo-shooter from the fitting room into the BackgroundRemoval1 demo. However, the resulting image contains only the background, and if I disable the output of just the background and leave the foreground, I get nothing. How can I get access to the image (and depth) data of just the user? Thanks in advance for any suggestions.
     
  40. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Look here: http://ratemt.com/k2-doxyhelp/annotated.html The API reference is still not final. The idea is for it to become part of the upcoming online documentation.
     
  41. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    As far as I remember, the foreground image in BR-demo1 is drawn at the top-level GUI (i.e. not rendered by the main camera). I would suggest you use BR-demo2 instead: remove the cubes and the halo, add the photo-shooter, and in its settings use BackgroundCamera1 and BackgroundCamera2 as 'Background camera' & 'Foreground camera'.
     
  42. KeithT

    KeithT

    Joined:
    Nov 23, 2011
    Posts:
    83
    Thanks for the suggestion; it worked as an approach. What I am really trying to do, though, is change pixels in the texture representing the user, before it is rendered. I have gone through several of the examples, including BR-demo2, trying to figure out how to access this texture. I found a load of commented-out code for polling the foreground frame (KinectInterop.PollForegroundFrame), which is referenced in BackgroundRemovalManager, but it always returns false. As far as I can see, there is an assignment of backManager.GetForegroundTex() to guiTexture.texture in the Update() of ForegroundToImage, but this only fires once at startup, so it can't be used to make ongoing changes to the texture.

    Can you provide some clues how this works, please? Any suggestions on how to access and change this texture would be very gratefully received. Thanks in advance.
     
  43. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Sorry for the delayed response. I don't provide support at weekends.

    To your question: You can always get the foreground texture by calling the GetForegroundTex()-method of the BR-manager, or get the alpha texture by calling the GetAlphaBodyTex()-method. Then you can copy and manipulate these textures the way you need, with your own shaders or post-processing algorithms.

    Regarding PollForegroundFrame(): This function is used by the background-removal procedure for Kinect-v1. I used it at the beginning for the K2 background removal too, but then moved the texture processing to shaders. That's why its code is commented out. The main method that deals with background removal is KinectInterop.UpdateBackgroundRemoval().
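
    Here is a minimal sketch of grabbing both textures each frame for further processing. The two getter methods are the ones named above; locating the manager via FindObjectOfType is just one way to obtain it:

        using UnityEngine;

        public class ForegroundTexGrabber : MonoBehaviour
        {
            private BackgroundRemovalManager backManager;

            void Start()
            {
                // locate the BR-manager in the scene (it normally sits on the KinectController)
                backManager = FindObjectOfType<BackgroundRemovalManager>();
            }

            void Update()
            {
                if (backManager == null)
                    return;

                Texture foregroundTex = backManager.GetForegroundTex(); // user over background
                Texture alphaBodyTex = backManager.GetAlphaBodyTex();   // body silhouette mask

                // copy and post-process here, e.g.:
                // Graphics.Blit(foregroundTex, myRenderTexture, myMaterial);
            }
        }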
     
  44. KeithT

    KeithT

    Joined:
    Nov 23, 2011
    Posts:
    83
    Thanks for responding, I've got access to the textures through KinectInterop.UpdateBackgroundRemoval() as suggested.

    One thing that I've not been able to work out is how to change the color of the alpha texture. The body silhouette is always white, and I am trying to make it dark grey. As far as I can see, this is controlled in the Color2DepthShader, but I've tried modifying the color values in it to no avail. Any info on where this is set and how to change it would be very gratefully received.
     
  45. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Thank you for the positive review! The alpha texture is rendered by the Color2BodyShader.
     
  46. KeithT

    KeithT

    Joined:
    Nov 23, 2011
    Posts:
    83
    Thanks, it does indeed help if you update the right shader :)

    I was looking at trying a couple of ideas with the infrared image, but it does not seem to be used anywhere, although there seem to be calls that reference it. Is there a reason for this, or is it just something you have not got to yet?
     
  47. agramma

    agramma

    Joined:
    Jun 20, 2011
    Posts:
    13
    hello,

    great package! I have a problem though with the recording and replaying of animations. When a user walks and the game is recording him, the animation on the avatar is a little bit different. It's like the avatar is sliding on ice. Is there a parameter I can change, to make the walking motion more natural?

    thanks in advance!
     
    Last edited: Feb 14, 2017
  48. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    You can enable the 'Compute infrared map'-setting of the KinectManager, and then get the raw infrared image by calling KinectManager.Instance.GetRawInfraredMap(). Unfortunately, I have not converted the IR data to a texture internally, but I suppose it should be similar to the creation of the body texture.
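
    Here is a minimal sketch of such a conversion, assuming GetRawInfraredMap() returns the raw 16-bit IR values as ushort[], and that the IR frame has the same resolution as the depth frame (please verify both assumptions against your version):

        using UnityEngine;

        public class InfraredTexture : MonoBehaviour
        {
            private Texture2D irTex;

            void Update()
            {
                KinectManager manager = KinectManager.Instance;
                if (manager == null || !manager.IsInitialized())
                    return;

                ushort[] irMap = manager.GetRawInfraredMap();  // assumed: raw 16-bit IR intensities
                if (irMap == null)
                    return;

                int width = manager.GetDepthImageWidth();      // assumed: IR & depth share resolution
                int height = manager.GetDepthImageHeight();

                if (irTex == null)
                    irTex = new Texture2D(width, height, TextureFormat.RGBA32, false);

                Color32[] pixels = new Color32[irMap.Length];
                for (int i = 0; i < irMap.Length; i++)
                {
                    byte v = (byte)(irMap[i] >> 8);  // scale 16-bit intensity down to 8 bits
                    pixels[i] = new Color32(v, v, v, 255);
                }

                irTex.SetPixels32(pixels);
                irTex.Apply();
            }
        }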
     
    s_unity likes this.
  49. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Which scene do you mean?
     
  50. agramma

    agramma

    Joined:
    Jun 20, 2011
    Posts:
    13
    I have created a project of my own. I used the Kinect recorder, the manager and the other Kinect scripts. I didn't change the KinectManager- and KinectInterop-scripts at all.