
Azure Kinect Examples for Unity

Discussion in 'Assets and Asset Store' started by roumenf, Jul 24, 2019.

  1. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    The goal of this thread is to provide basic support to the users of 'Azure Kinect Examples for Unity'-package.
     
  2. quinkennedy

    quinkennedy

    Joined:
    Apr 3, 2015
    Posts:
    2
    Hi, I just purchased the Azure Kinect package, and when I try to run anything with body tracking, it reports the following errors:

    Code (csharp):
    Can't create body tracker for Kinect4AzureInterface0!
    UnityEngine.Debug:LogError(Object)
    com.rfilkov.kinect.DepthSensorBase:InitBodyTracking(FrameSource, SensorData, Calibration, Boolean) (at Assets/AzureKinectExamples/KinectScripts/Interfaces/DepthSensorBase.cs:1588)
    com.rfilkov.kinect.Kinect4AzureInterface:OpenSensor(FrameSource, Boolean, Boolean) (at Assets/AzureKinectExamples/KinectScripts/Interfaces/Kinect4AzureInterface.cs:293)
    com.rfilkov.kinect.KinectManager:StartDepthSensors() (at Assets/AzureKinectExamples/KinectScripts/KinectManager.cs:1948)
    com.rfilkov.kinect.KinectManager:Awake() (at Assets/AzureKinectExamples/KinectScripts/KinectManager.cs:1887)
    Code (csharp):
    AzureKinectException: result = K4A_RESULT_FAILED
    Microsoft.Azure.Kinect.Sensor.AzureKinectException.ThrowIfNotSuccess[T] (T result) (at <e3ec297ecc8543d68cc5f17025e1e2d3>:0)
    Microsoft.Azure.Kinect.Sensor.BodyTracking..ctor (Microsoft.Azure.Kinect.Sensor.Calibration calib, Microsoft.Azure.Kinect.Sensor.k4abt_sensor_orientation_t sensorOrient) (at <e3ec297ecc8543d68cc5f17025e1e2d3>:0)
    com.rfilkov.kinect.DepthSensorBase.InitBodyTracking (com.rfilkov.kinect.KinectInterop+FrameSource dwFlags, com.rfilkov.kinect.KinectInterop+SensorData sensorData, Microsoft.Azure.Kinect.Sensor.Calibration calibration, System.Boolean bCreateTracker) (at Assets/AzureKinectExamples/KinectScripts/Interfaces/DepthSensorBase.cs:1568)
    UnityEngine.Debug:LogException(Exception)
    com.rfilkov.kinect.DepthSensorBase:InitBodyTracking(FrameSource, SensorData, Calibration, Boolean) (at Assets/AzureKinectExamples/KinectScripts/Interfaces/DepthSensorBase.cs:1589)
    com.rfilkov.kinect.Kinect4AzureInterface:OpenSensor(FrameSource, Boolean, Boolean) (at Assets/AzureKinectExamples/KinectScripts/Interfaces/Kinect4AzureInterface.cs:293)
    com.rfilkov.kinect.KinectManager:StartDepthSensors() (at Assets/AzureKinectExamples/KinectScripts/KinectManager.cs:1948)
    com.rfilkov.kinect.KinectManager:Awake() (at Assets/AzureKinectExamples/KinectScripts/KinectManager.cs:1887)

    I notice in DepthSensorBase.cs at line 1568 you call
    new BodyTracker...
    but in the Microsoft quickstart they use
    k4abt_tracker_create(&sensor_calibration, tracker_config, &tracker);
    and never directly instantiate the Body Tracker object.

    I'm on Windows 10, Unity 2019.2.0f1, Kinect SDK 1.2 (not in the default location), Body Tracking SDK 0.9.3 (not in the default location), and I have tested and confirmed the sensor works with Azure Kinect Viewer 1.2.0 and the Azure Kinect Body Tracking Viewer.
     
  3. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Please make sure you have installed the Azure Kinect Body Tracking SDK to its default location, i.e. into the 'C:\Program Files\Azure Kinect Body Tracking SDK'-folder. The K4A-asset expects to find it there. Then check if the 'Azure Kinect Body Tracking Viewer' works as expected. If the issue persists, please e-mail me, so we can look at the issue more deeply.
     
    sonofbryce likes this.
  4. copycat-asl

    copycat-asl

    Joined:
    Oct 14, 2019
    Posts:
    1
    Hi, can you add support for Linux?
    I've just purchased the package. I have the Azure Kinect SDK and Body Tracking SDK installed, and I verified that they work via the viewer applications. However, I'm still getting the errors below.
    I'm using Ubuntu 18.04. I've tried putting libk4a.so and the other related .so files in the Assets folder, but I'm still getting the same errors.
    Code (CSharp):
    AzureKinectOpenDeviceException: result = K4A_RESULT_FAILED
    Microsoft.Azure.Kinect.Sensor.AzureKinectOpenDeviceException.ThrowIfNotSuccess[T] (System.Func`1[TResult] function) (at <e3ec297ecc8543d68cc5f17025e1e2d3>:0)
    Microsoft.Azure.Kinect.Sensor.Device.Open (System.Int32 index) (at <e3ec297ecc8543d68cc5f17025e1e2d3>:0)
    com.rfilkov.kinect.Kinect4AzureInterface.OpenSensor (com.rfilkov.kinect.KinectInterop+FrameSource dwFlags, System.Boolean bSyncDepthAndColor, System.Boolean bSyncBodyAndDepth) (at Assets/AzureKinectExamples/KinectScripts/Interfaces/Kinect4AzureInterface.cs:146)
    com.rfilkov.kinect.KinectManager.StartDepthSensors () (at Assets/AzureKinectExamples/KinectScripts/KinectManager.cs:2001)
    UnityEngine.Debug:LogException(Exception)
    com.rfilkov.kinect.KinectManager:StartDepthSensors() (at Assets/AzureKinectExamples/KinectScripts/KinectManager.cs:2025)
    com.rfilkov.kinect.KinectManager:Awake() (at Assets/AzureKinectExamples/KinectScripts/KinectManager.cs:1940)
     
  5. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi, unfortunately I don't have a Linux machine to do the needed tests on. But if you'd like, we could work on this issue together. If you agree, please contact me by e-mail and mention your invoice number.
     
  6. JulioWabisabi

    JulioWabisabi

    Joined:
    Sep 14, 2018
    Posts:
    5
    Hi,
    First of all, I have to say: awesome work! It has saved us a lot of time in development.

    We have been using the KinectV2 examples with good results so far, but now that we are migrating to the Azure Kinect, we found out that this new asset lacks finger tracking support in the AvatarController.

    We did a quick dive into the code and noticed that all the finger tracking bits are commented out. We wanted to ask whether the feature is going to be enabled soon, or whether you could help us get it working before we try to patch it up ourselves, to save some time if possible ;)

    btw we are using version 1.6

    Regards
     
  7. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi, the finger support should come in the next release. If you need it sooner, please contact me by e-mail and mention your invoice number. A big part of the previous finger handling code was stripped out of the current AvatarController-component, so I'm not quite sure what exactly you are going to uncomment or patch.
     
  8. JulioWabisabi

    JulioWabisabi

    Joined:
    Sep 14, 2018
    Posts:
    5
    I've sent you an email, Thanks!
     
  9. specsdev

    specsdev

    Joined:
    Apr 28, 2014
    Posts:
    2
    Hi, I have purchased this asset principally as learning material, but the quality and amount of the examples have gone beyond my expectations. It is saving me a lot of time in figuring out how the Kinect works. Thanks a lot and congrats!

    I would like to compile Microsoft.Azure.Kinect.Sensor.dll myself in Visual Studio, which is pretty straightforward for the sensor itself, but when I try to add the body tracking dependencies I get a dependency error, because the package does not target .NET 4.6.1. Could you please help me with this issue?
    Thanks again.
     
  10. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Thank you, too! I'm not quite sure what exactly you are trying to do. Please e-mail me, mention your invoice number, and provide more details about what you're doing and what exactly the issue is that blocks you.
     
  11. adamiq

    adamiq

    Joined:
    Oct 10, 2019
    Posts:
    1
    Have you found that there are any settings for the Kinect controller that result in better joint tracking than others?
     
  12. sblake8140

    sblake8140

    Joined:
    May 22, 2017
    Posts:
    3
    Same issue:
    Azure Kinect Body Tracking SDK is in 'C:\Program Files\Azure Kinect Body Tracking SDK'-folder.

    Using Unity 2019.1.10f1. I downloaded the latest Azure Kinect SDKs, and the Azure Kinect Body Tracking Viewer works as expected.

    DllNotFoundException: Assets/AzureKinectExamples/SDK/Kinect4AzureSDK/Plugins/k4abt.dll
    Microsoft.Azure.Kinect.Sensor.BodyTracking..ctor (Microsoft.Azure.Kinect.Sensor.Calibration calib, Microsoft.Azure.Kinect.Sensor.k4abt_sensor_orientation_t sensorOrient, System.Boolean cpuOnlyTracker) (at <eb40c755051f403f8db8281deea67cb5>:0)
    com.rfilkov.kinect.DepthSensorBase.InitBodyTracking (com.rfilkov.kinect.KinectInterop+FrameSource dwFlags, com.rfilkov.kinect.KinectInterop+SensorData sensorData, Microsoft.Azure.Kinect.Sensor.Calibration calibration, System.Boolean bCreateTracker) (at Assets/AzureKinectExamples/KinectScripts/Interfaces/DepthSensorBase.cs:1736)
    UnityEngine.Debug:LogException(Exception)
    com.rfilkov.kinect.DepthSensorBase:InitBodyTracking(FrameSource, SensorData, Calibration, Boolean) (at Assets/AzureKinectExamples/KinectScripts/Interfaces/DepthSensorBase.cs:1757)
    com.rfilkov.kinect.Kinect4AzureInterface:OpenSensor(FrameSource, Boolean, Boolean) (at Assets/AzureKinectExamples/KinectScripts/Interfaces/Kinect4AzureInterface.cs:296)
    com.rfilkov.kinect.KinectManager:StartDepthSensors() (at Assets/AzureKinectExamples/KinectScripts/KinectManager.cs:2000)
    com.rfilkov.kinect.KinectManager:Awake() (at Assets/AzureKinectExamples/KinectScripts/KinectManager.cs:1939)
     
  13. sblake8140

    sblake8140

    Joined:
    May 22, 2017
    Posts:
    3
    It works fine when I do a build, but not when playing in the editor.
     
  14. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Thank you for the feedback! This is an odd issue and I would need to look into the details. Please e-mail me and send me a screenshot of your Unity project's folder. This is the parent folder of the Assets-folder in your project.
     
  15. sblake8140

    sblake8140

    Joined:
    May 22, 2017
    Posts:
    3
    Not sure how to find your email. Here is a screenshot. ProjectDirectory.png
     
  16. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    You can find my e-mail here: https://rfilkov.com/about/
    The folder contents above look OK. Now, please e-mail me and mention your invoice number, so we can continue investigating the issue. Please also check whether 'k4abt.dll' exists in the SDK/Kinect4AzureSDK/Plugins-folder and is enabled for the Editor & Standalone platforms, as well as for Windows 'x86_64'.
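    If it helps, those import settings can also be checked from code. Here is a small hypothetical editor script (not part of the asset; the class and menu names are made up, and the DLL path is the one from the DllNotFoundException above):

    ```csharp
    using UnityEditor;
    using UnityEngine;

    public static class K4abtPluginCheck
    {
        // Hypothetical helper: logs whether k4abt.dll is present and
        // enabled for the Editor and 64-bit Windows standalone builds.
        [MenuItem("Tools/Check k4abt.dll Import Settings")]
        public static void Check()
        {
            const string path = "Assets/AzureKinectExamples/SDK/Kinect4AzureSDK/Plugins/k4abt.dll";
            var importer = AssetImporter.GetAtPath(path) as PluginImporter;

            if (importer == null)
            {
                Debug.LogError("k4abt.dll not found at: " + path);
                return;
            }

            Debug.Log("Editor enabled: " + importer.GetCompatibleWithEditor());
            Debug.Log("Windows x86_64 enabled: " +
                importer.GetCompatibleWithPlatform(BuildTarget.StandaloneWindows64));
        }
    }
    ```

    Run it from the Tools menu in the editor; both values should log as True for the setup described above.
    
    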
     
  17. KurtLorey

    KurtLorey

    Joined:
    Mar 17, 2015
    Posts:
    4

    I had the exact same issue and updated to 0.9.4 and that was the solution.
     
  18. Balours

    Balours

    Joined:
    Nov 27, 2013
    Posts:
    52
    With the last Windows update, my app crashes when I start the scene. Did you encounter the same issue?
     
  19. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Thank you for providing the solution!
     
  20. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Are you the same person (or from the same company) that e-mailed me yesterday?
    I have not encountered the issue yet, but my Windows version is still 1809. Does anybody else experience the same issue?
     
  21. gino_pisello

    gino_pisello

    Joined:
    Jun 24, 2013
    Posts:
    31
    Hi Rumen,
    We have a project made using K2 asset.
    Now we want to use the same project with the Azure.
    What is the best way to import the K4A asset into a project that already has the K2 asset?
    thanks
     
  22. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi, I would recommend doing the following:
    1. Zip the existing project, just in case. Make screenshots of KinectManager's settings in each scene, where it is used.
    2. Open the project in Unity (at least 2019.1.0f2, with '.NET 4.x' selected in Player settings).
    3. Delete the K2Examples-folder.
    4. Import the K4A-asset. It will create AzureKinectExamples-folder.
    5. Copy the KinectController-game object from a K4A demo scene (AzureKinectExamples/KinectDemos-folder) to the clipboard.
    6. Go through each scene that uses KinectManager in your project, and:
    7. Delete and replace KinectController-game object in the scene with the copied one.
    8. Adjust the KM-settings to match the KM screenshot from #1, as much as possible.
    9. Check, if the other Kinect-related components in the scene look right.
    10. Test the scene, to check if it still works as expected with Azure Kinect.

    If you want to keep it working with Kinect-v2, please look at this tip on how to enable the Kinect-v2 interface.
     
    gino_pisello likes this.
  23. arocter

    arocter

    Joined:
    Aug 8, 2017
    Posts:
    1
    Hello Rumen,

    First of all, thank you for sharing this awesome asset. I'm an interactive developer now working on a Kinect base project with your asset.

    The problem I'm having is that sometimes two users share the same body info (skeleton) when they are detected. I have two U_Character avatars from the sample scene, with the player indices set to 0 and 1. I encounter this problem about 50% of the time whenever a new user is detected, but I can't find what causes it.

    I have attached a video recording showing this issue. Please let me know if you have any idea what's causing it.

    Thank you.

    Have a nice day.
    Hao

     
  24. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Thank you for the feedback! You found a bug. To fix it, please open KinectScripts/Interfaces/Kinect2Interface.cs, then find and replace this line: 'KinectInterop.BodyData bodyData = alTrackedBodies;' with 'KinectInterop.BodyData bodyData = alTrackedBodies[(int)trackedBodiesCount];'. Then try again.
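    To illustrate the effect of the missing index, here is a minimal stand-alone sketch (hypothetical names only, not the asset's actual code): reading the list without a per-body index hands every player the same body data, while indexing by the running counter gives each player their own entry.

    ```csharp
    using System;
    using System.Collections.Generic;

    // Hypothetical stand-in for KinectInterop.BodyData
    struct BodyData { public int userId; }

    class SharedBodyBugSketch
    {
        static void Main()
        {
            var alTrackedBodies = new List<BodyData>
            {
                new BodyData { userId = 11 },
                new BodyData { userId = 22 }
            };

            // Buggy pattern: always reading the same element, so both
            // detected users end up driven by the same skeleton.
            BodyData first = alTrackedBodies[0];
            BodyData second = alTrackedBodies[0];
            Console.WriteLine(first.userId + " " + second.userId);  // same user twice

            // Fixed pattern: index by the running body counter,
            // as in the replacement line above.
            for (uint trackedBodiesCount = 0; trackedBodiesCount < alTrackedBodies.Count; trackedBodiesCount++)
            {
                BodyData bodyData = alTrackedBodies[(int)trackedBodiesCount];
                Console.WriteLine(bodyData.userId);  // distinct user per iteration
            }
        }
    }
    ```
    
    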
     
  25. MartinProchazka

    MartinProchazka

    Joined:
    Apr 24, 2015
    Posts:
    8
    Hello to all... Is it possible to record and play back data in a point cloud format (xyzrgb) with the scripts or components in this asset, please?
     
  26. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Currently, there is no demo scene for saving or replaying recorded point cloud or mesh data. The SceneMeshDemo and UserMeshDemo-scenes generate the point clouds (or meshes) in real time, using shaders for performance reasons. But you are free to save the mesh data, if you have a preferred format. To do that, you would need to replace the SceneMeshRendererGpu-component with SceneMeshRenderer (or UserMeshRendererGpu with UserMeshRenderer). These generate the meshes on the CPU only, so you would have access to the mesh data.
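    As a rough sketch of such an exporter (a hypothetical helper, not part of the asset; it assumes you have already grabbed the vertices and per-vertex colors from the CPU-generated Mesh), each frame could be dumped as plain-text xyzrgb, one 'x y z r g b' line per point:

    ```csharp
    using System.Globalization;
    using System.IO;
    using System.Text;
    using UnityEngine;

    public static class PointCloudExporter
    {
        // Hypothetical helper: writes one "x y z r g b" line per point,
        // the plain-text xyzrgb convention (colors as 0..255 integers).
        public static void SaveXyzRgb(string path, Vector3[] vertices, Color32[] colors)
        {
            var sb = new StringBuilder();
            for (int i = 0; i < vertices.Length; i++)
            {
                Vector3 v = vertices[i];
                Color32 c = colors[i];
                // InvariantCulture keeps the decimal separator a '.' on all systems.
                sb.AppendLine(string.Format(CultureInfo.InvariantCulture,
                    "{0} {1} {2} {3} {4} {5}", v.x, v.y, v.z, c.r, c.g, c.b));
            }
            File.WriteAllText(path, sb.ToString());
        }
    }

    // Usage (assuming the CPU-generated scene mesh lives in a MeshFilter):
    //   Mesh mesh = GetComponent<MeshFilter>().mesh;
    //   PointCloudExporter.SaveXyzRgb("frame0001.xyzrgb", mesh.vertices, mesh.colors32);
    ```
    
    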
     
    MartinProchazka likes this.
  27. borderlineinteractive

    borderlineinteractive

    Joined:
    Sep 20, 2015
    Posts:
    14
    Hi Rumen,

    Thanks for generating this very useful package. We are trying to use your scripts to track a user that is always facing away from the Kinect v2 sensor. However, the system appears to always expect a forward facing avatar and therefore the tracking of the skeleton is not stable. Is there any setting that would change this behaviour? Optimally, an optional setting would set the orientation of the avatar backwards and thereby help the system to improve tracking of users that always face away from the sensor. Is the expectation of a forward facing avatar part of your code, or is this something that is hardcoded in the Kinect 2 SDK?

    Best wishes,

    Leif
     
  28. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi Leif, the body tracking subsystem of Kinect SDK 2.0 was trained in a way to expect mainly front facing users. The Kinect-v2 sensor was created for Xbox One. That's why in the K2-asset there was a setting of KinectManager called 'Allow turn arounds'. It relied on detecting user's face, as far as I remember, to determine whether the user is front or back facing, and to turn the avatars accordingly. In the K4A-asset the face tracking is missing (yet). That's why this K2-specific setting is also missing. Moreover, the body tracking of Azure Kinect now tracks both front and back facing users. Do you use the K4A-asset (discussed in this forum) or the K2-asset (it has its own forum)? If you use the K4A-asset, please e-mail me, mention your invoice number, and I'll see what I can do about it.
     
  29. dawnr23

    dawnr23

    Joined:
    Dec 5, 2016
    Posts:
    5
    I just got another question. I used the background removal 5 scene. The scene worked fine, but the floor is not removed and shows up on the screen along with the person. Is there a workaround for this problem? If not, is there a way to adjust the angle of the camera without moving the Kinect?
     
  30. lomaikabini

    lomaikabini

    Joined:
    Feb 28, 2014
    Posts:
    16
    Hi All,

    I'm not sure why, but the image quality looks way lower in Unity compared with the image in the Azure Kinect Viewer, with the same quality settings. Do you have any guess what the problem might be?

    Thanks in advance
     
  31. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    There is a setting of the BackgroundRemovalManager-component in the scene, called 'Offset to floor'. You can play with it a bit, to adjust the floor visibility in the scene. Please also stick to one channel of communication, to save me some work answering the same question by e-mail, on the forum, etc.
     
  32. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    What image do you mean - the color camera image, the depth image, the IR image, or anything else? Would you please post some screenshots showing images that look "way lower in Unity compared with the Viewer".
     
  33. ChristophFandrich

    ChristophFandrich

    Joined:
    Apr 24, 2019
    Posts:
    3
    Hey there, is there any solution for Ubuntu?
    Greetings from Germany
     
  34. Anipen

    Anipen

    Joined:
    Oct 15, 2014
    Posts:
    8
    Hello,

    I want to use body tracking vertically (in portrait orientation) with the Azure Kinect.

    I found k4abt_sensor_orientation_t in body tracking sdk 0.9.4.
    https://microsoft.github.io/Azure-K...enums_ga8e5fb3391addee8fadaa809c601e223e.html

    And I found where to set orientation in Azure kinect unity DepthSensorBase.cs.

    Code (CSharp):
    bodyTracker = new BodyTracking(calibration, k4abt_sensor_orientation_t.K4ABT_SENSOR_ORIENTATION_DEFAULT, false);
    However, values other than K4ABT_SENSOR_ORIENTATION_DEFAULT do not seem to work.

    Is it not implemented right now?

    The version I am using is:
    Azure Kinect SDK 1.3
    Azure Kinect Body tracking SDK 0.9.4
    Azure Kinect Examples for Unity v1.7.1
     
  35. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Unfortunately I'm not a Linux expert, don't have Ubuntu installed here, and have no way to test. Otherwise, with properly placed native libraries, it should work the same way as on Windows, in my opinion.
     
  36. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    I have the same versions of the installed software as you. When I change K4ABT_SENSOR_ORIENTATION_DEFAULT to other values, I see a difference in the body tracking. Why do you think it doesn't work?
     
  37. Anipen

    Anipen

    Joined:
    Oct 15, 2014
    Posts:
    8
    I expected the body tracking joints to rotate 90 degrees clockwise after placing the Azure Kinect rotated 90 degrees clockwise and using the K4ABT_SENSOR_ORIENTATION_CLOCKWISE90 option.

    But setting k4abt_sensor_orientation_t did not change the joint rotations.

    I seem to have misunderstood.
     
  38. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    In my test, the avatar's body rotated 90 degrees counterclockwise when I used K4ABT_SENSOR_ORIENTATION_CLOCKWISE90 and turned the sensor 90 degrees clockwise.
     
  39. lomaikabini

    lomaikabini

    Joined:
    Feb 28, 2014
    Posts:
    16
    Is there a solution for removing the background using the user texture? Whenever I set the Kinect into portrait mode (rotate the device 90 degrees), the background removal scenes don't cut out much of the floor and ceiling in this position. But I noticed that the user texture looks good in both positions, landscape and portrait.
     
  40. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    At the moment, because of the not-so-good quality of the user silhouette, there is no such solution. But I'll consider again adding this option to the background removal manager, for the next update.
     
  41. lomaikabini

    lomaikabini

    Joined:
    Feb 28, 2014
    Posts:
    16
    Thanks a lot! Btw, is there an option to use the Azure Kinect with your plugin in portrait mode for body tracking? I saw that the latest SDK supports this option: https://microsoft.github.io/Azure-K...enums_ga8e5fb3391addee8fadaa809c601e223e.html
     
  42. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Open DepthSensorBase.cs in KinectScripts/Interfaces-folder, look for 'bodyTracker = new BodyTracking(' and change the 2nd parameter from 'K4ABT_SENSOR_ORIENTATION_DEFAULT' to your preferred sensor orientation.
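    For example, for a sensor physically rotated 90 degrees clockwise (the portrait-mode case above), the changed line would look roughly like this (same constructor as in the snippet quoted earlier in the thread; the enum value comes from the Body Tracking SDK):

    ```csharp
    // In DepthSensorBase.cs - the 2nd parameter changed from the default
    // orientation to the 90-degrees-clockwise (portrait) one:
    bodyTracker = new BodyTracking(calibration,
        k4abt_sensor_orientation_t.K4ABT_SENSOR_ORIENTATION_CLOCKWISE90, false);
    ```
    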
     
    lomaikabini likes this.
  43. gino_pisello

    gino_pisello

    Joined:
    Jun 24, 2013
    Posts:
    31
    We encountered the same issue using the 0.9.5 Azure Kinect Body Tracking SDK. It doesn't happen if you use the 0.9.4 version.
     
  44. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    v1.7.x of the K4A-asset works with Body Tracking SDK v0.9.4. The next release (v1.8) will work with BT SDK v0.9.5. There are usually breaking changes between the BT SDK releases (e.g. changes in the API calls that can cause crashes), that's why it's not possible to freely replace the SDK with newer releases.
     
  45. sankeerth

    sankeerth

    Joined:
    Feb 19, 2014
    Posts:
    1
    Hi, I get an error message saying "no suitable depth sensor found." I have the Kinect one (v2) and it works fine with other apps. Please help!
     
  46. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
  47. derek9975

    derek9975

    Joined:
    Mar 17, 2014
    Posts:
    9
  48. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi, here is how you could do it in the K4A-asset:
    Code (CSharp):

                ulong frameTime = 0, lastUsedFrameTime = 0;

                // in Start()
                KinectInterop.SensorData sensorData = KinectManager.Instance.GetSensorData(0);
                sensorData.sensorInterface.EnableColorCameraDepthFrame(sensorData, true);

                // in Update()
                ushort[] transformedDepthFrame = sensorData.sensorInterface.GetColorCameraDepthFrame(ref frameTime);
                if(transformedDepthFrame != null && frameTime != lastUsedFrameTime)
                {
                    lastUsedFrameTime = frameTime;
                    // do something with the transformed depth frame
                }
     
    derek9975 likes this.
  49. derek9975

    derek9975

    Joined:
    Mar 17, 2014
    Posts:
    9
    Thanks for this response. I've been messing with it and I'm having a few issues.

    The biggest is that the return type of the function is ushort[], and a ushort is an unsigned 16-bit (2-byte) type. The original Azure function outputs the converted image into a k4a_image_t, which exposes its data as a uint8_t*, i.e. a byte array. I'm not sure if this is the source of the problem, or if I'm doing something terribly wrong with the data, but I am unable to get a usable Unity texture from the data as is (I've been trying to load the array into an ARGB4444-format texture).

    I noticed the function directly below it, GetColorCameraDepthFrameBytes, and hoped it returned the appropriate byte array, but it does not. It returns a byte array that is 4147200 bytes long, which is double the size of a 1920x1080 texture. So I assume the ushort[] is simply being copied into a byte array.

    Is there anything I can do as is? Am I using this wrong or is there a need for a fix?
     
  50. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Well, k4a_transformation_depth_image_to_color_camera() doesn't return a texture. It returns the same depth frame (an array of ushorts representing the depth of each point, in mm) as you get here. Image is just a data wrapper in the K4A SDK. If you need a texture, you would need to convert this depth array to a texture yourself. For instance, the background-removal demo scenes convert it to alpha mask textures, VfxPointCloudDemo converts it to a vertex texture, KinectScripts/SceneMeshRenderer.cs uses the functions above to build the vertices of the mesh, etc.
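    As a hedged sketch of such a conversion (a hypothetical helper, not part of the asset; the 5000 mm clamp and the grayscale mapping are arbitrary choices for visualization), the transformed depth frame from the earlier snippet could be turned into a simple grayscale texture like this:

    ```csharp
    using UnityEngine;

    public static class DepthToTexture
    {
        // Hypothetical sketch: converts a transformed depth frame (ushort
        // millimeters per pixel, in color camera resolution) into a
        // single-channel grayscale Texture2D.
        public static Texture2D DepthToGrayscale(ushort[] depthMm, int width, int height)
        {
            var tex = new Texture2D(width, height, TextureFormat.R8, false);
            var pixels = new byte[depthMm.Length];

            for (int i = 0; i < depthMm.Length; i++)
            {
                // Map 0..5000 mm to 0..255; invalid depth (0) stays black.
                int d = Mathf.Min(depthMm[i], 5000);
                pixels[i] = (byte)(d * 255 / 5000);
            }

            // Note: Unity texture data is bottom-up, so the result may
            // appear vertically flipped relative to the camera image.
            tex.LoadRawTextureData(pixels);
            tex.Apply();
            return tex;
        }
    }
    ```

    For the 1920x1080 color camera mode, depthMm.Length would be 1920 * 1080, which also explains why the raw byte variant of the frame is twice that size (2 bytes per ushort).
    
    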
     
    derek9975 likes this.