Azure Kinect Examples for Unity

Discussion in 'Assets and Asset Store' started by roumenf, Jul 24, 2019.

  1. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
The KinectUserBodyMerger-class is in 'KinectUserBodyMerger.cs', in the AzureKinectExamples/KinectScripts-folder.
     
  2. ruvidan2001

    ruvidan2001

    Joined:
    Apr 5, 2020
    Posts:
    6
    Whoop, that was it! Thank you!
     
  3. GZMRD17

    GZMRD17

    Joined:
    Jan 25, 2020
    Posts:
    33
    Thank you roumenf. Let me study that code.
     
  4. Dirrogate

    Dirrogate

    Joined:
    Feb 18, 2014
    Posts:
    157
    @roumenf
Hi, I just bought the asset and am looking at introducing the Azure Kinect into Virtual Production (experimental filmmaking).
Can I ask if you could update the asset to HDRP?
As it stands, I get the magenta/purple material error even after trying to upgrade the materials to HDRP.

This makes it useless (at the moment) for trying to learn from the point cloud example, etc.

I'm a newbie (non-coder) filmmaker looking at using the Azure Kinect for an interesting experiment:
to see if I can get some "rounded" depth on green-screen people filmed live, while pushing a grayscale depth map from the Azure Kinect into an HDRP Shader Graph "heightmap".

    Hoping you could help.
     
  5. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Ah, sorry, I saw your review before this comment here. Thank you for the feedback!

    Anyway, please e-mail me with some more details of what you want to do, so I could understand it better and help you, if I can. Regarding HDRP, there is a special demo scene - VfxPointCloudDemo in KinectDemos/PointCloudDemo-folder. Here is some help in this regard: https://ratemt.com/k4adocs/VFXPoint-CloudDemo.html

    You can also filter the people there, by setting the 'Point cloud player list' of Kinect4AzureInterface-component in the scene. There was a comment about this, above in this forum.

Regarding the pink materials (sorry again, I'm not a designer), I see now that many materials in the asset use legacy shaders. Maybe this is the reason for their incompatibility with HDRP. I'll take a look at this issue.
     
    Dirrogate likes this.
  6. Dirrogate

    Dirrogate

    Joined:
    Feb 18, 2014
    Posts:
    157
    Thanks so much, RoumenF. I'll certainly send in an email today.

Yes, this is due to the legacy shaders. According to Unity, the built-in render pipeline will become obsolete, and only URP and HDRP will be used going forward.
It's safe to assume people will be using the Azure Kinect more with the higher-end HDRP, so it would help if the materials were migrated.

    Kind Regards.
     
  7. Dirrogate

    Dirrogate

    Joined:
    Feb 18, 2014
    Posts:
    157
    Hi @roumenf , just checking if the mail reached. Do check spam folders. Kind Regards
     
  8. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Yes, I got your e-mail. Thank you again for the feedback!
     
  9. nounte

    nounte

    Joined:
    Dec 20, 2012
    Posts:
    2
    Hello, I have several missing scripts in the VFXPointCloudDemo scene.

    I have re-imported 1.11 to try to ensure no changes to the package since the previous import. This is the first example scene I've noticed the problem in. The missing scripts are in MainCamera and Directional Light. Regards.
     
  10. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
Hi, the VfxPointCloudDemo-scene was specially designed as a demo scene for the High Definition Render Pipeline in Unity. Please see this tip: https://ratemt.com/k4adocs/VFXPoint-CloudDemo.html The "missing" scripts are missing only when the built-in render pipeline is used, because they are HDRP-specific components.
     
  11. nounte

    nounte

    Joined:
    Dec 20, 2012
    Posts:
    2
  12. UDN_31531f55-d82f-47e5-aeb6-20f4aaa3f82b

    UDN_31531f55-d82f-47e5-aeb6-20f4aaa3f82b

    Joined:
    Aug 5, 2016
    Posts:
    11
Hi there, is there a way to generate a VFX point cloud using the multi-cam config? The assigned point cloud vertex and colour textures disappear in play mode. I am able to do this using 2 Azures with the multi-cam bool off, but doing so means my user positions and point clouds aren't seamless across both devices. Thanks!
     
    Last edited: May 8, 2020
  13. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
Hi, in this case I think you could manually set up the sensor interfaces in the scene. Just see the 'multicam_config.json'-file in the root folder of your Unity project, and set the transform positions and rotations of the Kinect4AzureInterface-components in the scene accordingly. Then set the 'Point cloud vertex texture' & 'Point cloud color texture' of the Kinect4AzureInterface-components to point to different sets of textures, so the two sensors don't overwrite each other's generated images. Also, don't forget to disable the 'Use multi cam config'-setting of the KinectManager in this case.
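A minimal sketch of such a setup, assuming two Kinect4AzureInterface-components already added to the scene (the pose values below are placeholders; copy the calibrated values from your multicam_config.json):

Code (CSharp):
using UnityEngine;
using com.rfilkov.kinect;

// Hypothetical sketch, not part of the asset: positioning two manually
// added sensor interfaces. FindObjectsOfType returns them in arbitrary
// order, so in practice identify each sensor by its device index.
public class ManualMultiCamSetup : MonoBehaviour
{
    void Awake()
    {
        Kinect4AzureInterface[] sensors = FindObjectsOfType<Kinect4AzureInterface>();

        // Placeholder poses - use the values from 'multicam_config.json'.
        sensors[0].transform.SetPositionAndRotation(
            new Vector3(0f, 1.0f, 0f), Quaternion.Euler(0f, 0f, 0f));
        sensors[1].transform.SetPositionAndRotation(
            new Vector3(1.2f, 1.0f, 0.4f), Quaternion.Euler(0f, -30f, 0f));

        // Then assign a separate pair of 'Point cloud vertex texture' &
        // 'Point cloud color texture' to each component in the Inspector,
        // and disable 'Use multi cam config' on the KinectManager.
    }
}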
     
  14. Deleted User

    Deleted User

    Guest

I publish using the IL2CPP scripting backend, and the following error is reported at runtime. How should I fix this? Thank you all.

The error is as follows:

    NotSupportedException: To marshal a managed method, please add an attribute named 'MonoPInvokeCallback' to the method definition. The method we're attempting to marshal is: Microsoft.Azure.Kinect.Sensor.Logger::OnDebugMessage
    at Microsoft.Azure.Kinect.Sensor.NativeMethods.k4a_set_debug_message_handler (Microsoft.Azure.Kinect.Sensor.NativeMethods+k4a_logging_message_cb_t message_cb, System.IntPtr message_cb_context, Microsoft.Azure.Kinect.Sensor.LogLevel min_level) [0x00000] in <00000000000000000000000000000000>:0
    at Microsoft.Azure.Kinect.Sensor.Logger.Initialize () [0x00000] in <00000000000000000000000000000000>:0
    at Microsoft.Azure.Kinect.Sensor.Logger.add_LogMessage (System.Action`1[T] value) [0x00000] in <00000000000000000000000000000000>:0
    at Microsoft.Azure.Kinect.Sensor.LoggingTracer..ctor () [0x00000] in <00000000000000000000000000000000>:0
    at Microsoft.Azure.Kinect.Sensor.AzureKinectOpenDeviceException.ThrowIfNotSuccess[T] (System.Func`1[TResult] function) [0x00000] in <00000000000000000000000000000000>:0
    at Microsoft.Azure.Kinect.Sensor.Device.Open (System.Int32 index) [0x00000] in <00000000000000000000000000000000>:0
    at com.rfilkov.kinect.Kinect4AzureInterface.OpenSensor (com.rfilkov.kinect.KinectInterop+FrameSource dwFlags, System.Boolean bSyncDepthAndColor, System.Boolean bSyncBodyAndDepth) [0x00000] in <00000000000000000000000000000000>:0
    at com.rfilkov.kinect.KinectManager.StartDepthSensors () [0x00000] in <00000000000000000000000000000000>:0
    at com.rfilkov.kinect.KinectManager.Awake () [0x00000] in <00000000000000000000000000000000>:0
    UnityEngine.DebugLogHandler:LogException(Exception, Object)
    UnityEngine.Logger:LogException(Exception, Object)
    UnityEngine.Debug:LogException(Exception)
    com.rfilkov.kinect.KinectManager:StartDepthSensors()
    com.rfilkov.kinect.KinectManager:Awake()
     
  15. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Thank you for the feedback! Please use the Mono scripting backend instead of IL2CPP, as a workaround for now. I added this issue to my todo-list, but it may take a while until I research and (if possible) fix it.
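By the way, if you automate your builds, the same setting can be switched with the standard UnityEditor API; a minimal editor-only sketch:

Code (CSharp):
using UnityEditor;

// Editor-only sketch: switch the Standalone player to the Mono backend,
// same as Player Settings > Other Settings > Scripting Backend > Mono.
public static class BackendSwitcher
{
    [MenuItem("Tools/Use Mono Scripting Backend")]
    static void UseMono()
    {
        PlayerSettings.SetScriptingBackend(BuildTargetGroup.Standalone,
            ScriptingImplementation.Mono2x);
    }
}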
     
  16. cecarlsen

    cecarlsen

    Joined:
    Jun 30, 2006
    Posts:
    803
Hi @roumenf I just can't get the joint rotations right. When I plot the raw joints using gizmos, they seem to be rotated wrong to start with, unless I flip x and z on the rotations. And when I do flip them, they are still slightly off (by about 5 degrees, continuously).

    Try to put the attached script in a scene that has the KinectManager in it and move around a bit.

    What am I missing?

    Code (CSharp):
using UnityEngine;
using com.rfilkov.kinect;

public class KinectAzureExamplesRotationsTest : MonoBehaviour
{
    bool[] _isTracked;
    Vector4[] _positions;
    Quaternion[] _rotations;


    void LateUpdate()
    {
        KinectManager kinectManager = KinectManager.Instance;
        if( !kinectManager || !kinectManager.IsInitialized() ) return;

        int maxTrackedUsers = kinectManager.maxTrackedUsers;
        if( maxTrackedUsers == 0 ) maxTrackedUsers = 8;

        // Adapt.
        int jointCount = kinectManager.GetJointCount();
        int jointCountTotal = jointCount * maxTrackedUsers;
        if( _isTracked == null || _isTracked.Length != jointCountTotal ) _isTracked = new bool[ jointCountTotal ];
        if( _positions == null || _positions.Length != jointCountTotal ) _positions = new Vector4[ jointCountTotal ];
        if( _rotations == null || _rotations.Length != jointCountTotal ) _rotations = new Quaternion[ jointCountTotal ];

        // Update.
        for( int b = 0; b < maxTrackedUsers; b++ ) {
            if( !kinectManager.IsUserDetected( b ) ) {
                for( int j = 0; j < jointCount; j++ ) {
                    int bj = b * jointCount + j;
                    _isTracked[ bj ] = false;
                }
                continue;
            }

            ulong userId = kinectManager.GetUserIdByIndex( b );

            for( int j = 0; j < jointCount; j++ ) {
                int bj = b * jointCount + j;
                if( !kinectManager.IsJointTracked( userId, j ) ) {
                    _isTracked[ bj ] = false;
                    continue;
                }

                Vector3 rawJointPos = kinectManager.GetJointKinectPosition( userId, j, true );
                Quaternion rawJointRot = kinectManager.GetJointOrientation( userId, j, false );

                // If I flip x and z, then the rotations follow the movement, but not perfectly.
                //rawJointRot.x *= -1;
                //rawJointRot.z *= -1;

                _positions[ bj ] = rawJointPos;
                _rotations[ bj ] = rawJointRot;
                _isTracked[ bj ] = true;
            }
        }
    }


    void OnDrawGizmos()
    {
        if( _positions == null ) return;

        for( int bj = 0; bj < _positions.Length; bj++ ) {
            if( !_isTracked[ bj ] ) continue;
            Gizmos.matrix = Matrix4x4.TRS( _positions[ bj ], _rotations[ bj ], Vector3.one * 0.1f );
            Gizmos.color = Color.white;
            Gizmos.DrawWireCube( Vector3.zero, Vector3.one * 0.5f );
            Gizmos.color = Color.red;
            Gizmos.DrawLine( Vector3.zero, Vector3.right );
            Gizmos.color = Color.green;
            Gizmos.DrawLine( Vector3.zero, Vector3.up );
            Gizmos.color = Color.blue;
            Gizmos.DrawLine( Vector3.zero, Vector3.forward );
        }
    }
}
     
    Last edited: Jun 7, 2020
  17. cecarlsen

    cecarlsen

    Joined:
    Jun 30, 2006
    Posts:
    803
@roumenf ... another issue. When I set the KinectManager GetPoseFrames mode to "Update Transform", it works fine with a connected sensor. But when I switch the Kinect4AzureInterface DeviceStreamingMode from "Connected Sensor" to "Play Recording", the transform is rotated by 6 degrees on the X axis, no matter what recorded file I play back.

I have checked the recordings in k4aviewer.exe and they do include IMU data.
     
  18. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    I think it should be: '_rotations[ bj ] = Quaternion.Euler(0, 180, 0) * rawJointRot;', because you get the mirrored rotation.
     
    cecarlsen likes this.
  19. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
Currently the Kinect4AzureInterface skips the IMU data in the recording-file; otherwise the performance suffers. Please set the sensor position & rotation manually, in the transform component of the sensor interface object. Here is a tip on how to do this: https://rfilkov.com/2019/08/26/azure-kinect-tips-tricks/#t9
     
  20. Recky256

    Recky256

    Joined:
    Dec 1, 2014
    Posts:
    11
    Hi! I love this amazing asset. I have a question.
To prevent privacy/security incidents, can I stop generating the .bin files (color & depth images)?
     
  21. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
Yes, sure. Just open KinectInterop.cs, look for 'SAVE_SPACE_TABLES_TO_DISK' and set it to 'false'. But in this case, prepare for longer waiting times at scene start.
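For reference, the flag looks roughly like this (the exact declaration may differ between asset versions):

Code (CSharp):
// In AzureKinectExamples/KinectScripts/KinectInterop.cs
// (exact declaration may differ between versions):
public static bool SAVE_SPACE_TABLES_TO_DISK = false;  // default: true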
     
    Recky256 likes this.
  22. UDN_31531f55-d82f-47e5-aeb6-20f4aaa3f82b

    UDN_31531f55-d82f-47e5-aeb6-20f4aaa3f82b

    Joined:
    Aug 5, 2016
    Posts:
    11
    Thanks @roumenf. Does this mean that a seamless VFX point cloud between two Kinects is not possible at the moment? For example, User0 remains User0 in both cameras instead of 2 separate users.
     
  23. cecarlsen

    cecarlsen

    Joined:
    Jun 30, 2006
    Posts:
    803
Thanks, I see. Though I wonder why reading a rotation from a file could cause performance issues.
     
  24. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    I think it should be possible. The body-merger should merge the bodies detected by multiple sensors, but the sensors should be properly calibrated.
     
  25. DFT-Games

    DFT-Games

    Joined:
    Jun 24, 2010
    Posts:
    443
Hi, it appears that the last update also did not introduce URP compatibility. Now that 2019 LTS is out, URP is the definitive standard: any chance you can make URP compatibility a priority?
     
  26. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
Yes, URP compatibility is on my todo-list as well, but HDRP had priority because of the many customer requests. There is just too much work to do it all at once. That's why I do it step by step, with some pauses to minimize the stress.
     
  27. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    By the way, my recent tests have shown that v1.12 of the K4A-asset is URP compatible. Almost all scenes work with URP, just like with HDRP.
     
  28. joycon3353

    joycon3353

    Joined:
    Mar 15, 2019
    Posts:
    2
Hi.
I was using the UserMeshVisualizer in the Kinect v2 Example before.
Is there a way to use the UserMeshVisualizer inside the Azure Kinect Example?
What I want is for the person's shape to be displayed in the center, without moving around.
     
  29. UDN_31531f55-d82f-47e5-aeb6-20f4aaa3f82b

    UDN_31531f55-d82f-47e5-aeb6-20f4aaa3f82b

    Joined:
    Aug 5, 2016
    Posts:
    11
Thanks @roumenf. Managed to get seamless VFX user detection across both Kinects.

Bumped into another issue - somehow the user fails to reappear in the Kinect point cloud textures if they happen to be out of view of either Kinect. I am using separate position and colour textures for each Kinect to generate particles for each user via the VFX graph. Example scenario:

1. Start up the application. The VFX of User 0 is displayed through both Kinects.
2. If User 0 goes out of sight of either Kinect 1 or 2, its VFX will not return again, even though User 0 has not been removed (still in view of one of the Kinects).

I find that this issue does not occur when I use '-1' in the point cloud player lists of the Kinects, only when I use user indices. Hope you can help!
     
  30. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    The user mesh visualizer scene in the K4A-asset is called 'UserMeshDemo' and is in the 'AzureKinectExamples/KinectDemos/PointCloudDemo'-folder.
     
  31. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Thank you for the feedback! I would need to research this issue more deeply.
     
  32. GZMRD17

    GZMRD17

    Joined:
    Jan 25, 2020
    Posts:
    33
Hi roumenf. Regarding the delay between a subordinate camera and the master Kinect: are there any problems with having zero delay when using the multi-camera avatar demos in Unity?
     
  33. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
Hi. I'm not sure I understand your question. Could you please provide some more details?
     
  34. GZMRD17

    GZMRD17

    Joined:
    Jan 25, 2020
    Posts:
    33
    Hi roumenf. In the link below, they discuss this:

    "In your software, use depth_delay_off_color_usec or subordinate_delay_off_master_usec to make sure that each IR laser fires in its own 160μs window or has a different field of view."

Do I need to worry about this delay when using multiple Kinects with Unity and your demos, or can I set delay_off_master = 0? https://docs.microsoft.com/en-us/azure/kinect-dk/multi-camera-sync
     
  35. Lukiioii

    Lukiioii

    Joined:
    Mar 12, 2020
    Posts:
    4
Hi roumenf, many thanks for providing this asset. It's very helpful.

I noticed that the characters still tremble quite a bit, even though I set the 'smooth factor'. I'm wondering if it's possible to apply a Gaussian filter to the skeleton results to make them better? Currently the smoothing is based on the Slerp() method, which refers to two neighboring frames only, and I believe Gaussian filtering could smooth better by considering more of the previous frames.

However, quaternions are not linear, so it may not be correct to compute Gaussian/average results on them directly. Would it be possible to apply Gaussian filtering to the raw skeleton position data and then convert to quaternions for the rotations?

I'm trying to do that, but it looks like your code uses the quaternions directly from k4abt_skeleton_t?

I'm looking forward to your reply and your advice.
     
  36. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
Hi GZMRD17, it's up to you and your own experience whether to delay the subordinate device(s) or not. You can do it by setting the 'Sub device delay usec'-setting of the respective Kinect4AzureInterface-component. The 160 µs delay between the cameras is recommended, to prevent interference between the lasers. My personal experience with 2 cameras is that setting a delay of 0 between them was also OK.
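For example, if you do want to separate the laser windows, the delays could look like this (illustrative values only, per the 160 µs windows in the Microsoft multi-camera docs):

Code (CSharp):
// Illustrative values only, set per Kinect4AzureInterface-component:
//   master:         (no delay setting)
//   subordinate #1: Sub device delay usec = 160
//   subordinate #2: Sub device delay usec = 320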
     
  37. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
Hi, FYI: if you need smoother movements, you should decrease the 'Smooth factor' value, not increase it.

    Yes, for Azure Kinect I'm using the joint rotations, as reported by the Body Tracking SDK. Of course, it is possible to estimate them from the detected joint positions (I've done this before), but this would require extra coding.

To average more than two quaternions, you can use the SumUpQuaternions() & AverageQuaternions() methods in 'KinectInterop.cs'.
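The general idea behind averaging a window of rotations looks roughly like this (a hypothetical sketch, not the asset's actual code; check KinectInterop.cs for the real signatures):

Code (CSharp):
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch of component-wise quaternion averaging, the same
// idea behind SumUpQuaternions()/AverageQuaternions() in KinectInterop.cs.
// Works well for rotations that are close to each other, like a short
// window of frames for the same joint.
public static class QuaternionSmoothing
{
    public static Quaternion Average(IList<Quaternion> rotations)
    {
        Vector4 sum = Vector4.zero;
        Quaternion first = rotations[0];

        foreach (Quaternion q in rotations)
        {
            // q and -q represent the same rotation; flip quaternions on
            // the opposite hemisphere so they don't cancel each other out.
            float dot = q.x * first.x + q.y * first.y + q.z * first.z + q.w * first.w;
            float sign = dot < 0f ? -1f : 1f;
            sum += new Vector4(q.x, q.y, q.z, q.w) * sign;
        }

        sum /= rotations.Count;
        sum.Normalize();
        return new Quaternion(sum.x, sum.y, sum.z, sum.w);
    }
}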
     
  38. GZMRD17

    GZMRD17

    Joined:
    Jan 25, 2020
    Posts:
    33
Hi roumenf, I recorded a file outside Unity, using only the Kinect body tracker software from Microsoft (with some modifications) and one Kinect. Very simple: I obtain three positions and three rotations, in meters and degrees. In the end, I generate a "Bodyrecording.txt" text file in the same format as the one recorded by the player, with the same columns as defined here: https://rfilkov.com/2019/08/26/azure-kinect-tips-tricks/#t15

I am using the recorder/player demo to view my file, but I only see the time change and no animation. However, when I use the recorder to generate a Bodyrecording.txt file and play that file back, the animation comes out perfect. I am not sure if you are familiar with this problem, but I have tried changing camera locations, mirroring, and several other settings, and I still cannot visualize my data. Would you have any suggestions?
     
  39. Tyndareus

    Tyndareus

    Joined:
    Aug 31, 2018
    Posts:
    30
Started to use this and wanted to get a recording set up so I can test more easily. I attempted it with the body recording that's provided with the sample scene, but that didn't work. So I switched, got a quick recording using the k4arecorder from the Azure DK, and set that as the recorded file, but that also didn't work.

    Both methods give this error:
    Code (CSharp):
AzureKinectException: result = K4A_RESULT_FAILED
Microsoft.Azure.Kinect.Sensor.AzureKinectException.ThrowIfNotSuccess[T] (T result) (at <90a50830ba314d3797700c14e42dbc54>:0)
Microsoft.Azure.Kinect.Sensor.Playback.OpenPlaybackFile (System.String filePath) (at <90a50830ba314d3797700c14e42dbc54>:0)
Microsoft.Azure.Kinect.Sensor.Playback..ctor (System.String filePath) (at <90a50830ba314d3797700c14e42dbc54>:0)
com.rfilkov.kinect.Kinect4AzureInterface.OpenSensor (com.rfilkov.kinect.KinectManager kinectManager, com.rfilkov.kinect.KinectInterop+FrameSource dwFlags, System.Boolean bSyncDepthAndColor, System.Boolean bSyncBodyAndDepth) (at Assets/ThirdPartyAssets/Tools/AzureKinect/KinectScripts/Interfaces/Kinect4AzureInterface.cs:243)
com.rfilkov.kinect.KinectManager.StartDepthSensors () (at Assets/ThirdPartyAssets/Tools/AzureKinect/KinectScripts/KinectManager.cs:2807)
UnityEngine.Debug:LogException(Exception)
com.rfilkov.kinect.KinectManager:StartDepthSensors() (at Assets/ThirdPartyAssets/Tools/AzureKinect/KinectScripts/KinectManager.cs:2831)
com.rfilkov.kinect.KinectManager:Awake() (at Assets/ThirdPartyAssets/Tools/AzureKinect/KinectScripts/KinectManager.cs:2608)
Edit: I updated the firmware and it seems to accept the new recording. I will keep an eye on this though; if there is something more specific I can do to avoid this in the future, let me know.
     
    Last edited: Jul 14, 2020
    roumenf likes this.
  40. GZMRD17

    GZMRD17

    Joined:
    Jan 25, 2020
    Posts:
    33
I was able to determine what the problem was. Unity is very particular about the data formatting. It is working now, just FYI.
     
    roumenf likes this.
  41. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    I'm actually surprised the body recording in BodyDataRecorderDemo did not work. Are there any errors in the console?

By the way, please always make sure there aren't any non-English characters in the file path or in the Unity project path. They may be causing trouble.
     
  42. Lukiioii

    Lukiioii

    Joined:
    Mar 12, 2020
    Posts:
    4
Hi @roumenf, many thanks for your reply. The movement is much smoother now :)

And I noticed there is a "mesh clipping" issue on my model: my character's hand can sink into his belly when I touch my own belly. I tried to add "Rigidbody" and "Collider" components, but it looks like the problem is still not solved. I'm new to Unity; could you please suggest some possible solutions?
     
  43. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
A body part going through other body parts has been a tough problem for a very long time, and unfortunately it doesn't have an easy solution. The easiest one would be for the model to replicate the user's proportions as closely as possible.

One other solution would be to have trigger colliders around the model's body, and when a collision is triggered, to revert the last body-joint update. But this would require more coding and careful tuning as well, because all body joints are updated at once, and if one of them triggers a collision, all joint updates may be blocked. So they should be tried one by one, to see which one causes the collision, and then only that joint's update should be blocked.
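Here is a hypothetical sketch of that idea for a single body part (the component and its logic are illustrative, not part of the asset; note that a trigger only fires if at least one of the colliding objects has a Rigidbody):

Code (CSharp):
using UnityEngine;

// Hypothetical sketch: attach to a body part with a trigger collider.
// While the part overlaps another collider, it is held at its last
// non-colliding pose instead of following the joint update.
public class JointCollisionGuard : MonoBehaviour
{
    private Vector3 lastSafePosition;
    private Quaternion lastSafeRotation;
    private bool colliding;

    void LateUpdate()
    {
        if (colliding)
        {
            // Revert this body part to its last non-colliding pose.
            transform.SetPositionAndRotation(lastSafePosition, lastSafeRotation);
        }
        else
        {
            // Remember the current pose as the last known safe one.
            lastSafePosition = transform.position;
            lastSafeRotation = transform.rotation;
        }
    }

    void OnTriggerEnter(Collider other) { colliding = true; }
    void OnTriggerExit(Collider other) { colliding = false; }
}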
     
  44. UDN_31531f55-d82f-47e5-aeb6-20f4aaa3f82b

    UDN_31531f55-d82f-47e5-aeb6-20f4aaa3f82b

    Joined:
    Aug 5, 2016
    Posts:
    11
Was just able to do more testing on this issue. It seems the user fails to reappear in the point cloud vertex textures in a couple of scenarios (manual multiple-Kinect setup, 'Use multi cam config' unticked):

1. The user fails to reappear in the texture of the Kinect they went out of view of, even when the user has not been removed.
2. When the user is removed and added again under the same index but a different ID.

The point cloud colour textures don't seem to be affected.
     
    Last edited: Jul 27, 2020
  45. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hm, this looks like a tough issue. Please e-mail me, so I can request some more details and (if possible) some recordings from you, so I could reproduce the issue here.
     
  46. UDN_31531f55-d82f-47e5-aeb6-20f4aaa3f82b

    UDN_31531f55-d82f-47e5-aeb6-20f4aaa3f82b

    Joined:
    Aug 5, 2016
    Posts:
    11
    Thanks @roumenf , I have sent an email with a fuller explanation and video demonstrations attached.
     
  47. Yunitea

    Yunitea

    Joined:
    Jul 7, 2014
    Posts:
    4
roumenf, I'm really grateful you are offering this fantastic asset. It allows for such a quick and painless entry into the world of Kinect development! Of course, there are always some special scenarios.

I'd be interested to know how you would send the joint data of an Azure Kinect that's plugged into a PC running a Unity server app (Windows 10, Unity 2019.3.13f) to a HoloLens 2 client app. Both are connected via WiFi. The joint data would then of course also need to be applied to an avatar on the HoloLens.

Does the Kinect SDK support the ARM64 architecture of the HoloLens 2? Is there anything from your Network Demo examples I can employ for this use case? What do you regard as the shortest path to getting a simple experimental demo up and running?

Any guidance is much appreciated! :) :D
     
    Last edited: Jul 29, 2020
  48. Tyndareus

    Tyndareus

    Joined:
    Aug 31, 2018
    Posts:
    30
I'm getting the following error when trying to run the Kinect in a build:

    Code (CSharp):
NotSupportedException: To marshal a managed method, please add an attribute named 'MonoPInvokeCallback' to the method definition. The method we're attempting to marshal is: Microsoft.Azure.Kinect.Sensor.Logger::OnDebugMessage
 at Microsoft.Azure.Kinect.Sensor.NativeMethods.k4a_set_debug_message_handler (Microsoft.Azure.Kinect.Sensor.NativeMethods+k4a_logging_message_cb_t message_cb, System.IntPtr message_cb_context, Microsoft.Azure.Kinect.Sensor.LogLevel min_level) [0x00000] in <00000000000000000000000000000000>:0
 at Microsoft.Azure.Kinect.Sensor.Logger.Initialize () [0x00000] in <00000000000000000000000000000000>:0
 at Microsoft.Azure.Kinect.Sensor.Logger.add_LogMessage (System.Action`1[T] value) [0x00000] in <00000000000000000000000000000000>:0
 at Microsoft.Azure.Kinect.Sensor.AzureKinectOpenDeviceException.ThrowIfNotSuccess[T] (System.Func`1[TResult] function) [0x00000] in <00000000000000000000000000000000>:0
 at Microsoft.Azure.Kinect.Sensor.Device.Open (System.Int32 index) [0x00000] in <00000000000000000000000000000000>:0
 at com.rfilkov.kinect.Kinect4AzureInterface.OpenSensor (com.rfilkov.kinect.KinectManager kinectManager, com.rfilkov.kinect.KinectInterop+FrameSource dwFlags, System.Boolean bSyncDepthAndColor, System.Boolean bSyncBodyAndDepth) [0x00000] in <00000000000000000000000000000000>:0
 at com.rfilkov.kinect.KinectManager.StartDepthSensors () [0x00000] in <00000000000000000000000000000000>:0
UnityEngine.Logger:LogException(Exception, Object)
UnityEngine.Debug:LogException(Exception)
com.rfilkov.kinect.KinectManager:StartDepthSensors()
Any help with this would be great.

Edit: Just saw another user posted about this.
To fix it, you need the original C# wrapper source: add the MonoPInvokeCallback attribute in Logger.cs, then rebuild the binaries.
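The change is roughly the following (a sketch; the parameter list of OnDebugMessage is illustrative, and the delegate type comes from the stack trace above):

Code (CSharp):
// Hypothetical sketch of the fix in the SDK wrapper's Logger.cs. In Unity,
// MonoPInvokeCallbackAttribute lives in the AOT namespace; when building
// the wrapper outside Unity, you may need to declare the attribute yourself.
using AOT;

partial class Logger
{
    [MonoPInvokeCallback(typeof(NativeMethods.k4a_logging_message_cb_t))]
    private static void OnDebugMessage(/* native callback parameters */)
    {
        // ... original method body unchanged ...
    }
}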
     
    Last edited: Aug 11, 2020
  49. bngames

    bngames

    Joined:
    Jul 3, 2012
    Posts:
    67
Hi,

The HeightEstimator scene gives good results, but when I add HeightEstimator.cs and BodySlicer.cs to my own scene, it gives the wrong height info; it should be double the amount.
     
  50. robotmechanic

    robotmechanic

    Joined:
    Dec 29, 2009
    Posts:
    106
    @roumenf
Hello. Your asset works great on its own; however, when I try to import it into another working project for my client, I get the following error:

    Library\PackageCache\com.unity.collections@0.7.0-preview.2\Unity.Collections\UnsafeUtilityEx.cs(10,24): error CS0234: The type or namespace name 'Unsafe' does not exist in the namespace 'System.Runtime.CompilerServices'

    Now I understand your project doesn't include this dependency (com.unity.collections). However, the working project without your asset works fine. The trouble only arises in combination.

Do you have any clue what could be causing this, and perhaps any tips on how to troubleshoot it?

    Thank you.