Kinect v2 with MS-SDK

Discussion in 'Assets and Asset Store' started by roumenf, Aug 1, 2014.

  1. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi Pat,

    There are two fitting-room demo scenes in the K2-asset. The 2nd demo should fit your case better, I think. That scene was created by a customer, based on the first one and extended with some user interface and interaction.


    Regarding sensors: definitely use Kinect-v2, if you have one. I think it is much better than the Orbbec Astra, in terms of quality, SDK and performance.
     
  2. Retrokat

    Retrokat

    Joined:
    Apr 23, 2015
    Posts:
    12
    We are setting up an interactive installation and need the Kinect interaction to blend with the animations when they transition.

    I followed the section of your Tips & Tricks about having animations play and be overridden by the Kinect movement, and that works perfectly. But when the Kinect tracks a person, the bones just snap to the Kinect position instead of blending from their current position, and when it loses tracking they snap back to the start of the animation. Is there any way to make the transition smooth?
     
  3. AlbyDj90

    AlbyDj90

    Joined:
    Feb 7, 2015
    Posts:
    35
    Hi everyone.
    I have a question. I know how to show the depth-map texture on a canvas, but the default color of the RawImage is a gradient from black (far) to yellow (near). I want to choose the color of the DepthTexture or, at least, change the gradient from yellow to another color (so if I can change the black-to-yellow gradient to black-to-something-else, that would be great). Is there a way? I obviously tried to change the color of the image directly, but starting from yellow makes some colors impossible to obtain...
     
  4. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    First off, don't set the 'Pos relative to camera'-setting of the respective AvatarController-component. This should make the avatar start and finish at its initial position in the scene.

    FYI: Moving to the start and initial positions is done by the SuccessfulCalibration() and ResetToInitialPosition()-methods of the AvatarController-script. Feel free to modify the code of these methods according to your needs.
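    For illustration, here is a minimal sketch of the kind of modification meant above: instead of jumping to the target position, the avatar blends there over a short time. The coroutine and the 'blendDuration' field are hypothetical additions, not part of the stock AvatarController; only the method names above come from the asset.

    Code (CSharp):
    // Hypothetical helper for AvatarController.cs (assumes 'using System.Collections;').
    // Instead of setting transform.position directly in SuccessfulCalibration() /
    // ResetToInitialPosition(), start this coroutine to avoid the visible snap:
    //     StartCoroutine(BlendToPosition(targetPos));
    public float blendDuration = 0.5f;  // seconds to blend the avatar root to the target

    private IEnumerator BlendToPosition(Vector3 targetPos)
    {
        Vector3 startPos = transform.position;
        float t = 0f;

        while (t < 1f)
        {
            t += Time.deltaTime / blendDuration;
            transform.position = Vector3.Lerp(startPos, targetPos, Mathf.Clamp01(t));
            yield return null;  // wait one frame
        }
    }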
     
  5. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    The depth-map texture is created by Resources/DepthShader.shader. Open the shader and change the colors, according to your needs. You can find them between lines 66 and 83.
     
    AlbyDj90 likes this.
  6. AlbyDj90

    AlbyDj90

    Joined:
    Feb 7, 2015
    Posts:
    35
    Thanks!!! This solved my problem!! :D
     
  7. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    @roumenf re: jittering of the avatar:
    I was recently redoing a scene of ours where several scaling factors are exposed for the user to customize,
    but I started from your fitting-room demo, where the avatars are pretty much stable and jitter-free.

    It looks like the avatar is not completely stable when the 'body scaling' factor != 0.
    I set it to 0 (and removed it from customization), and the models are now pretty much 1:1 with the body-tracking SDK sample - so I think this might be it.
    - I'm not _entirely_ sure it was body scaling at this very moment, but it was one of the 4 (?) exposed scales. Since I'm not at the PC, let me know if you can/can't replicate it and I'll look up the exact parameter.
     
  8. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Please tell me what exactly to do (and in which scene), to reproduce the issue.
     
  9. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    OK, so I had to remove 'bodyWidthFactor' from customization (i.e. leave it at the default 0), which finally helped with stability.

    The weird thing is I can't replicate the issue in any of the demo scenes (KinectFittingRoom1 or 2) - I hadn't tried that before, either - I just noted the default parameter values at runtime and made sure they are the same in our scenes (arm scaling works though).

    So this 'bug report' is more or less worthless right now - I will let you know if I find the combination which caused it. (If you remember the videos I sent you some time ago - removing the body-width scaling resolved that.)
     
    roumenf likes this.
  10. sam_dijtb

    sam_dijtb

    Joined:
    Jan 23, 2018
    Posts:
    1
    @roumenf Thank you for creating such a beautiful asset.
    I don't have much experience, so my question might look trivial.
    I'm trying to make an interactive installation, reusing the Visualizer demo from the asset.
    The problem is that when I try to emit particles from the mesh created by the Kinect (I set the emitter to 'rate over distance'), particles are emitted only when the whole "Mesh" object is moving. If, for example, I move only one hand, no particles are emitted.
    Can you please give me some advice on how to get the same behavior as here: https://www.instagram.com/p/Bw1lZhSFGXL/ ?
     
  11. Future-Action

    Future-Action

    Joined:
    Feb 23, 2017
    Posts:
    3
    Hello roumenf,

    The samples you listed all work, but I still have the problem in Unity which he mentioned above.
    I use Unity 2019.1.3 and the NVidia driver is up to date.
    I have no idea how to make the face-tracking sample work.
     
  12. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Oh, unfortunately I'm not much of a creative designer and cannot set up this kind of beauty by myself. Please send me a sample project via WeTransfer or GoogleDrive, plus some instructions on what doesn't seem right, and I'll try to find out what may have caused the issue.
     
  13. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Have you tried to run the 'Face Basics D2D'-sample of 'SDK Browser v2.0'?
     
  14. Future-Action

    Future-Action

    Joined:
    Feb 23, 2017
    Posts:
    3
    Sure - as I mentioned, the official samples you listed all work.
    But I cannot make your face-tracking samples work.
     
  15. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hm, that's odd. Please check if the 'HD Face Basics-WPF'-sample works as well. Then look for errors in the Unity console after you run the face-tracking demo scenes. In this regard, may I ask you to send me the Editor log-file after you run the scene? Here is where to find it: https://docs.unity3d.com/Manual/LogFiles.html
     
  16. qaisbayabani

    qaisbayabani

    Joined:
    Jun 20, 2018
    Posts:
    28
  17. qaisbayabani

    qaisbayabani

    Joined:
    Jun 20, 2018
    Posts:
    28
    I made it work and sent a sample to the Unity Asset Store, but they didn't publish it, replying that the title pics of the asset were not attractive, and so on. If you need it, I can hand it over, but what I get in return is up to you to decide - as you know, no one will tell you except me. It works as in this movie: open your mouth to fire, close or twist the left/right eye to turn left/right, and smile to go forward.
     
  18. mirror190601

    mirror190601

    Joined:
    Jun 8, 2019
    Posts:
    1
    Hi @roumenf

    I just added your Kinect v2 package to my Unity 2018.4.1f1 project.
    I'm getting the following errors:

    1) Assets\K2Examples\KinectScripts\Interfaces\Kinect2Interface.cs(1116,49): error CS1503: Argument 1: cannot convert from 'Windows.Kinect.KinectSensor

    2) Assets\K2Examples\KinectScripts\Samples\GetFaceSmileStatus.cs(47,49): error CS0266: Cannot implicitly convert type 'Windows.Kinect.DetectionResult

    3) Assets\K2Examples\KinectScripts\VisualGestureManager.cs(643,60): error CS1061: 'VisualGestureBuilderFrameSource' does not contain a definition for 'OpenReader' and no accessible extension method 'OpenReader' accepting a first argument of type 'VisualGestureBuilderFrameSource' could be found

    4) Assets\K2Examples\KinectScripts\VisualGestureManager.cs(649,10): error CS1674: 'VisualGestureBuilderDatabase': type used in a using statement must be implicitly convertible to 'System.IDisposable'

    5) Assets\K2Examples\KinectScripts\VisualGestureManager.cs(660,41): error CS1061: 'VisualGestureBuilderDatabase' does not contain a definition for 'AvailableGestures' and no accessible extension method 'AvailableGestures' accepting a first argument of type 'VisualGestureBuilderDatabase' could be found (are you missing a using directive or an assembly reference?)

    6) Assets\K2Examples\Standard Assets\KinectVisualGestureBuilderSpecialCases.cs(47,41): error CS1729: 'VisualGestureBuilderFrameSource' does not contain a constructor that takes 1 arguments


    Any idea?
    There are more errors, but I think most of them can be solved with one simple fix.
    I am stuck. Please reply ASAP at orn6301@naver.com

    Thanks.
     
  19. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi, as I said in the response to your e-mail, please make sure you follow the instructions of this tip, when you copy the K2-package into your project: https://rfilkov.com/2015/01/25/kinect-v2-tips-tricks-examples/#t2 Please also stick to one channel of communication, just to spare me double work.
     
  20. Tyndareus

    Tyndareus

    Joined:
    Aug 31, 2018
    Posts:
    37
    I am trying to place a sensor at a height and rotated; it will probably be at room height off the floor, rotated a full 90 degrees toward the wall - not sure if it would be -90 or 90.

    My idea is to create an interactive wall, i.e. hand processing within the sensor's field of view. I was hoping that body tracking would let me fake a wall in Unity, and I'd just use a collider to meet the requirements, but I think there is a minimum set of joints that must be tracked?
    There is a possibility that, once set up, the sensor would see enough of the body for tracking to run, but I wouldn't want to risk that.
    Is there anything special to be done to push the sensor to confirm body tracking with fewer joints?

    The alternative is to stick with OpenCV and an IR camera for the wall, but the sensor is more lightweight and easier to set up, since the OpenCV approach requires about 4 different types of user-provided calibration, which is undesirable.
     
  21. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi, I'm not sure the body tracking will still work after you rotate the sensor 90 degrees down. Please check this first.
     
  22. Tyndareus

    Tyndareus

    Joined:
    Aug 31, 2018
    Posts:
    37
    Do you know the maximum angle that body tracking would work?
     
  23. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    I don't know, but I suppose no more than 30-40 degrees. Just experiment a bit and you will find out.
     
  24. jackvob1

    jackvob1

    Joined:
    Mar 2, 2018
    Posts:
    38
    I am trying to create a sample for a project, something like a Snapchat filter. Does anyone have an example of it, or know how to make one?
     
  25. underkitten

    underkitten

    Joined:
    Nov 1, 2018
    Posts:
    30
    Is there any way to track the body (or any other object) without recognizing it first? Something like Blob tracking.
    Or do I have to use OpenCV for that?
    Just need to track users from behind.
     
  26. adliawans

    adliawans

    Joined:
    Jul 18, 2019
    Posts:
    1
    I want to ask about the 3d-model fitting room. When I use my own 3d model, I notice it sits a little bit to the right, so the left side of the 3d model does not fit my hand. Is there a way to move it a little bit to the left with a script, or do I have to re-create the 3d model?
     
  27. Tyndareus

    Tyndareus

    Joined:
    Aug 31, 2018
    Posts:
    37
    Is there any way of accessing the floor clip plane? According to the Kinect SDK it is provided through the body frame, but it seems the asset doesn't expose it.
     
  28. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    You can track any object as a blob in the raw depth image (get it with 'KinectManager.Instance.GetRawDepthMap()'). It's an array of shorts, 512x424 in size, containing 16-bit values of the distance to the object at each point (in mm), or 0 if no object is detected there. You can analyze it with your own script, or use some OpenCV algorithm if you like. There is currently no such example in the K2-asset, but I plan to add one soon.
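    For illustration, here is a minimal blob-centroid sketch over that raw depth map - an assumption-laden example, not code from the K2-asset. It only assumes GetRawDepthMap() returns the 512x424 array of 16-bit millimeter values (as ushort[]) described above.

    Code (CSharp):
    using UnityEngine;

    // Minimal sketch: find the centroid of all depth pixels within a given
    // range - a crude 'blob' position, enough to track a user from behind.
    public class DepthBlobTracker : MonoBehaviour
    {
        const int Width = 512, Height = 424;            // Kinect-v2 depth resolution
        public ushort minDepth = 500, maxDepth = 3000;  // depth range of interest, in mm

        void Update()
        {
            KinectManager manager = KinectManager.Instance;
            if (manager == null || !manager.IsInitialized())
                return;

            ushort[] depthMap = manager.GetRawDepthMap();
            long sumX = 0, sumY = 0, count = 0;

            for (int y = 0; y < Height; y++)
            {
                for (int x = 0; x < Width; x++)
                {
                    ushort depth = depthMap[y * Width + x];
                    if (depth >= minDepth && depth <= maxDepth)
                    {
                        sumX += x; sumY += y; count++;
                    }
                }
            }

            if (count > 0)  // centroid in depth-image coordinates
                Debug.Log("Blob centroid: " + ((float)sumX / count) + ", " + ((float)sumY / count));
        }
    }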
     
    underkitten likes this.
  29. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Please e-mail me, so I can send you an updated version of the ModelSelector-script, containing 'Horizontal offset'-setting. Don't forget to mention your invoice number in the e-mail, too.
     
  30. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    There are 'Sensor height' and 'Sensor angle'-settings of the KinectManager, which you can use to set or read the sensor's height above the ground and its tilt angle. To update this information from the sensor, set 'Auto height angle' to something other than 'Dont use'. Please mind that, in order to use the auto-detection of height and angle, there should be a person in front of the sensor, and he/she should be fully visible, including the feet.
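    As a small illustration, these inspector settings can also be read from a script at runtime. A sketch, assuming the KinectManager exposes them as public 'sensorHeight' and 'sensorAngle' fields (verify the exact names in KinectManager.cs of your K2-asset version):

    Code (CSharp):
    // Log the current sensor height (m) and tilt angle (degrees), e.g. after
    // auto-detection has updated them. Field names are assumptions - check
    // them against KinectManager.cs.
    KinectManager manager = KinectManager.Instance;
    if (manager != null && manager.IsInitialized())
    {
        Debug.Log("Sensor height: " + manager.sensorHeight + " m, angle: " + manager.sensorAngle + " deg");
    }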
     
  31. ATMEthan

    ATMEthan

    Joined:
    Dec 27, 2012
    Posts:
    54
    Do you plan on implementing the Azure Kinect into this asset? Have you fiddled with the azure kinect c# sdk? (https://github.com/Microsoft/Azure-Kinect-Sensor-SDK/issues/136)
     
  32. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
  33. ATMEthan

    ATMEthan

    Joined:
    Dec 27, 2012
    Posts:
    54
    Well that's excellent news, thank you. Not to rush you with this question but I must ask - do you have a target release date planned?
     
  34. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    I already sent it for review. When and if the Asset Store team will publish it is a good question.
     
    ATMEthan likes this.
  35. asd_seisuke

    asd_seisuke

    Joined:
    Mar 28, 2019
    Posts:
    3
    Hello, I have a question about the fitting room: is it possible for the 3d model to automatically fit the person's body (size)?
     
  36. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    This should happen automatically when a new user in T-pose gets detected.
     
  37. Navtek

    Navtek

    Joined:
    May 10, 2016
    Posts:
    1
    Hi guys,
    I am facing a small problem while mapping a depth point to the corresponding color point.
    I use
    Code (CSharp):
    kinectManager.MapDepthPointToColorCoords(Vector2 posPoint, ushort depthValue)
    (where posPoint is a point on the 512x424 depth image)
    to get the x,y position on the RGB texture.

    But the returned Vector2 has an offset and is a little off from the expected point on the 1920x1080 color texture.

    Am I missing something, or is there a proper way to do this?
    Any help on this is appreciated.

    Thanks.
     
  38. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi, KinectManager.MapDepthPointToColorCoords() calls the CoordinateMapper.MapDepthPointsToColorSpace()-method of Kinect SDK 2.0, i.e. the returned coordinates come directly from the Kinect SDK. The mapping between the depth and color spaces could also be done manually, if you know the color & depth camera intrinsics, as well as the extrinsics (the transformation between the depth and color spaces).
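    For reference, a hedged usage sketch of the call discussed above. GetDepthForPixel() is an assumption here (the K2-asset has a helper along those lines, but verify the name and signature in KinectManager.cs):

    Code (CSharp):
    // Map a point on the 512x424 depth image to 1920x1080 color-camera pixels,
    // then normalize to UVs for sampling the color texture.
    Vector2 depthPos = new Vector2(256f, 212f);  // example point on the depth image
    ushort depthValue = KinectManager.Instance.GetDepthForPixel((int)depthPos.x, (int)depthPos.y);  // assumed helper
    Vector2 colorPos = KinectManager.Instance.MapDepthPointToColorCoords(depthPos, depthValue);
    Vector2 colorUV = new Vector2(colorPos.x / 1920f, colorPos.y / 1080f);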
     
  39. kosowski

    kosowski

    Joined:
    Jun 19, 2014
    Posts:
    15
    Hi, what's the best way to compensate for the rotation of users located off the Kinect's center? In my use case, users should see their avatar on a display right in front of them, although the Kinect is located 1.5 meters to the side of the display.

    I'm using the muscle-limits option, and adding an external rotation sometimes interferes with it. Thanks!
     
  40. kosowski

    kosowski

    Joined:
    Jun 19, 2014
    Posts:
    15
    Another question: is there an option to make the user's movement relative to the avatar's initial position? I want the avatar to start in a certain position and follow the user's horizontal movement to a certain extent, to make sure it always stays in a predetermined area. Thanks!
     
  41. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hm, in your case it looks more like a position issue than a rotation one. I think you should adjust the sensor-to-world matrix to match your specific setup. Use the SetKinectToWorldMatrix()-method of KinectManager.Instance to set the positional offsets in X & Y. See the code of UpdateKinectToWorldMatrix() as well, if you need an example.
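    A minimal sketch of that adjustment, under stated assumptions: the method name comes from the post above, but its exact signature may differ in your K2-asset version, so treat the arguments as placeholders and check UpdateKinectToWorldMatrix() in KinectManager.cs:

    Code (CSharp):
    // Shift the sensor-to-world transform so a sensor mounted 1.5 m to the
    // side of the display still places avatars in front of the users.
    // The argument list is an assumption - verify against KinectManager.cs.
    KinectManager manager = KinectManager.Instance;
    if (manager != null && manager.IsInitialized())
    {
        Vector3 sensorPos = new Vector3(1.5f, 1.0f, 0f);   // X: 1.5 m aside, Y: sensor height
        Quaternion sensorRot = Quaternion.identity;        // no extra tilt in this example
        manager.SetKinectToWorldMatrix(sensorPos, sensorRot);
    }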
     
  42. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    If you don't set the 'Pos relative to camera'-setting of the AvatarController-component, the avatar should move relative to its initial position in the scene.
     
  43. kosowski

    kosowski

    Joined:
    Jun 19, 2014
    Posts:
    15
    Thanks, although I think I didn't make myself clear: the problem is the rotation. While the user looks straight at the display, the avatar, instead of looking straight ahead, is rotated by the same amount as the user is rotated relative to the Kinect.

    Think of the setup as two separate fitting rooms, 2 meters apart, with the Kinect centered in the middle to detect both users - and both of them seeing their avatars rotated.
     
  44. kosowski

    kosowski

    Joined:
    Jun 19, 2014
    Posts:
    15
    Thanks. I'm not setting the 'Pos relative to camera' option, but when a new user is detected the avatar changes its horizontal position instantly.

    After more testing: this only happens if the 'Apply muscle limits' option is checked, so I guess CheckMuscleLimits() is the culprit.
     
    Last edited: Aug 8, 2019
  45. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Please open ModelHatController.cs (in the KinectDemos/FaceTrackingDemo/Scripts-folder) and look for 'rotational fix, provided by Richard Borys'. Maybe you need something like this for 'newRotation' in the TransformBone()-method of AvatarController.cs.
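    To illustrate the idea (this is not the actual fix from ModelHatController.cs): compensate the avatar's yaw by the user's angular offset from the sensor axis. GetUserPosition() is assumed to return the user's position in sensor space, and 'playerId' stands for the tracked user's ID; verify both against your K2-asset version.

    Code (CSharp):
    // Sketch for TransformBone() in AvatarController.cs: rotate 'newRotation'
    // back by the user's yaw offset from the sensor's forward axis, so users
    // standing off-center still see their avatar facing straight ahead.
    Vector3 userPos = KinectManager.Instance.GetUserPosition(playerId);   // assumed API & user-ID variable
    float yawOffset = Mathf.Atan2(userPos.x, userPos.z) * Mathf.Rad2Deg;  // angle off the sensor axis
    newRotation = Quaternion.Euler(0f, -yawOffset, 0f) * newRotation;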
     
  46. asd_seisuke

    asd_seisuke

    Joined:
    Mar 28, 2019
    Posts:
    3
    Hello, I want to ask about the 3d model. Before the asset update, when I was using the fitting-room demo, my hand could go in front of the 3d model. After the update I can't put my hand in front of the 3d model - it is always behind it. Is there any way to get that behavior back?
     
  47. enginarer

    enginarer

    Joined:
    Mar 7, 2017
    Posts:
    5
    Hi,

    Is there a way to keep an avatar in the scene (directly visible) with an idle animation and, when a user is detected, start driving that model directly? And maybe (hopefully) blend from the idle animation into the actual skeleton tracking.

    Thanks in advance!
     
  48. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Do you mean after the update to v2.19.2? I tried to reproduce your issue here, but couldn't. May I ask you to e-mail me more detailed instructions on how to reproduce it, or a zipped representative project (via WeTransfer) where I can see the issue? Please also mention your invoice number in the e-mail.
     
  49. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Please try the PlayerDetectorController-component in the KinectDemos/RecorderDemo/Scripts-folder. It works together with the KinectRecorderPlayer-component to replay a previously saved recording while no user is detected.
     
  50. Arcanebits

    Arcanebits

    Joined:
    Dec 18, 2013
    Posts:
    108
    Hi guys, I'm dealing with this issue - has anyone else had the same/similar experience?

    I'm on: Unity 2018.4.5f1 / Windows 10 Pro / every SDK at the latest version.

    I managed to use the "Visual Gesture Builder EXPERIMENTAL" to create a "hello wave" gesture with two hands, ending up with a file called hello.gbd.

    I replaced Seated.gbd with my own file in the Visual Gesture Manager, and added a little code to load another scene when the confidence in the gesture is above 90%... works like a charm... inside Unity.

    Then I compiled, and on startup the debug text says "Visual Gesture Tracking could not be initialized"...

    (Insert 10 lines of unnecessary troubleshooting story here)

    I ended up finding that the issue was my hello.gbd: I put the Seated.gbd back, and it works both inside Unity and in the compiled program.

    Has anyone had this same issue? Is the EXPERIMENTAL builder the culprit?

    Aldo