
Bug Help adapting old Device-Based Code to Action-Based?

Discussion in 'Scripting' started by lex_henrion, Apr 26, 2023.

  1. lex_henrion

    Joined:
    Mar 16, 2023
    Posts:
    2
    Hi all, I'm a newbie at Unity and C#, and I'm trying to implement a 3D drawing feature in my app. I found a promising start for how to do this, but it uses device-based XR controllers rather than action-based (which I need). I have already gotten the controllers working with the XR Interaction Toolkit, but I have failed at adapting this code to be compatible with them. Could anyone give me some pointers on how to go about adapting the following as a complete beginner? I have tried a few things on my own, but none of them seem to interact with my action-based controllers at all. :(

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;

    public class Brush : MonoBehaviour
    {
        // Prefab to instantiate when we draw a new brush stroke
        [SerializeField] private GameObject _brushStrokePrefab = null;

        // Which hand should this brush instance track?
        private enum Hand { LeftHand, RightHand };
        [SerializeField] private Hand _hand = Hand.RightHand;

        // Used to keep track of the current brush tip position and the actively drawing brush stroke
        private Vector3 _handPosition;
        private Quaternion _handRotation;
        private BrushStroke _activeBrushStroke;

        public InputDeviceCharacteristics controllerCharacteristics;

        private InputDevice targetDevice;

        void Start()
        {
            // Look up the controller that matches the configured hand
            controllerCharacteristics = (_hand == Hand.LeftHand
                ? InputDeviceCharacteristics.Left
                : InputDeviceCharacteristics.Right) | InputDeviceCharacteristics.Controller;
            var devices = new List<InputDevice>();
            InputDevices.GetDevicesWithCharacteristics(controllerCharacteristics, devices);
            if (devices.Count > 0)
            {
                targetDevice = devices[0];
            }
        }

        private void Update()
        {
            // Start by figuring out which hand we're tracking
            XRNode node = _hand == Hand.LeftHand ? XRNode.LeftHand : XRNode.RightHand;

            // Get the position & rotation of the hand
            bool handIsTracking = UpdatePose(node, ref _handPosition, ref _handRotation);

            // Figure out if the trigger is pressed or not
            bool triggerPressed = targetDevice.TryGetFeatureValue(CommonUsages.trigger, out float triggerValue) && triggerValue > 0.1f;

            // If we lose tracking, stop drawing
            if (!handIsTracking)
                triggerPressed = false;

            // If the trigger is pressed and we haven't created a new brush stroke to draw, create one!
            if (triggerPressed && _activeBrushStroke == null)
            {
                // Instantiate a copy of the Brush Stroke prefab.
                GameObject brushStrokeGameObject = Instantiate(_brushStrokePrefab);

                // Grab the BrushStroke component from it
                _activeBrushStroke = brushStrokeGameObject.GetComponent<BrushStroke>();

                // Tell the BrushStroke to begin drawing at the current brush position
                _activeBrushStroke.BeginBrushStrokeWithBrushTipPoint(_handPosition, _handRotation);
            }

            // If the trigger is pressed, and we have a brush stroke, move the brush stroke to the new brush tip position
            if (triggerPressed)
                _activeBrushStroke.MoveBrushTipToPoint(_handPosition, _handRotation);

            // If the trigger is no longer pressed, and we still have an active brush stroke, mark it as finished and clear it.
            if (!triggerPressed && _activeBrushStroke != null)
            {
                _activeBrushStroke.EndBrushStrokeWithBrushTipPoint(_handPosition, _handRotation);
                _activeBrushStroke = null;
            }
        }

        //// Utility

        // Given an XRNode, get the current position & rotation. If it's not tracking, don't modify the position & rotation.
        private static bool UpdatePose(XRNode node, ref Vector3 position, ref Quaternion rotation)
        {
            List<XRNodeState> nodeStates = new List<XRNodeState>();
            InputTracking.GetNodeStates(nodeStates);

            foreach (XRNodeState nodeState in nodeStates)
            {
                // Only use the state for the node we were asked about
                if (nodeState.nodeType == node)
                {
                    Vector3 nodePosition;
                    Quaternion nodeRotation;
                    bool gotPosition = nodeState.TryGetPosition(out nodePosition);
                    bool gotRotation = nodeState.TryGetRotation(out nodeRotation);

                    if (gotPosition)
                        position = nodePosition;
                    if (gotRotation)
                        rotation = nodeRotation;

                    return gotPosition;
                }
            }
            return false;
        }
    }
    Last edited: Apr 27, 2023
  2. MaskedMouse

    Joined:
    Jul 8, 2014
    Posts:
    1,064
    Kurt-Dekker likes this.
  3. lex_henrion

    Joined:
    Mar 16, 2023
    Posts:
    2
    Hi MaskedMouse! Thanks for the reply! I have already built a Unity project and am using the XR Interaction Toolkit for all my controls. However, this code was originally written for device-based controller input, and I'm struggling to figure out how to convert it to action-based. Do you have any suggestions on how to implement 3D drawing in Unity? I wasn't able to find good, up-to-date tutorials, so I was trying to simply adapt old code.
     
  4. MaskedMouse

    Joined:
    Jul 8, 2014
    Posts:
    1,064
    The XR Interaction Toolkit example project should have everything you need for action-based interaction handling.
    You could just have an InputActionProperty field, bind it to the trigger of the right-hand controller, and subscribe to its events with the actions that need to be executed.
    But get something simple working on PC first, so you know for sure that what you're doing works, i.e. key down, key up, key held. Just a simple Debug.Log("Button Down"); for spacebar and such.
    Get an understanding of how those action-based inputs work.
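    For example, something like this (an untested sketch; the class and field names are placeholders, and you'd bind the action in the inspector, e.g. to <XRController>{RightHand}/trigger or the XRI default right-hand activate action):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.InputSystem;

    public class TriggerInputTest : MonoBehaviour
    {
        // Bound in the inspector to the right-hand trigger action.
        [SerializeField] private InputActionProperty _triggerAction;

        private void OnEnable()
        {
            _triggerAction.action.Enable();
            _triggerAction.action.performed += OnTrigger;
            _triggerAction.action.canceled += OnTrigger;
        }

        private void OnDisable()
        {
            _triggerAction.action.performed -= OnTrigger;
            _triggerAction.action.canceled -= OnTrigger;
        }

        private void OnTrigger(InputAction.CallbackContext ctx)
        {
            // For a float (Value) action this is the current trigger amount.
            Debug.Log("Trigger value: " + ctx.ReadValue<float>());
        }

        private void Update()
        {
            // PC sanity check with the new Input System: spacebar down.
            if (Keyboard.current != null && Keyboard.current.spaceKey.wasPressedThisFrame)
                Debug.Log("Button Down");
        }
    }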

    "3D drawing" is too generic. It depends on what you mean by that. That could be something like google's Tilt Brush or be something like a 3D canvas model and paint on the canvas with brushes. Or maybe some whole other kind of way of painting.
    I wouldn't know any tutorials on those. But googling it will probably take you somewhere.

    Even when they're older tutorials, if you understand how something works you can try recreating it from that.
    For something like Tilt Brush, my first hunch would be to use the line renderer: while the controller trigger is held down, every time the controller moves a certain distance, add another point for the line renderer to draw through.
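    A rough sketch of that idea (untested; the names are placeholders, and you'd feed it the brush tip position and trigger state from whatever input you're reading):

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;

    public class LineBrush : MonoBehaviour
    {
        [SerializeField] private LineRenderer _linePrefab;
        [SerializeField] private float _minPointDistance = 0.005f;

        private LineRenderer _activeLine;
        private readonly List<Vector3> _points = new List<Vector3>();

        // Call once per frame with the brush tip position and trigger state.
        public void UpdateBrush(Vector3 tipPosition, bool triggerPressed)
        {
            // Start a new stroke on trigger press.
            if (triggerPressed && _activeLine == null)
            {
                _activeLine = Instantiate(_linePrefab);
                _points.Clear();
            }

            // Add a point only once the tip has moved far enough.
            if (triggerPressed && _activeLine != null &&
                (_points.Count == 0 ||
                 Vector3.Distance(_points[_points.Count - 1], tipPosition) > _minPointDistance))
            {
                _points.Add(tipPosition);
                _activeLine.positionCount = _points.Count;
                _activeLine.SetPositions(_points.ToArray());
            }

            // Release the stroke on trigger release; it stays in the scene.
            if (!triggerPressed)
                _activeLine = null;
        }
    }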

    Something like a canvas painter would involve a texture and a shader. There are probably enough tutorials on that.
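    For instance, the core of it might look like this (untested sketch; it assumes the canvas has a MeshCollider and a Texture2D main texture with Read/Write enabled):

    Code (CSharp):
    using UnityEngine;

    public class CanvasPainter : MonoBehaviour
    {
        // Paint a single pixel wherever a ray from the brush hits the canvas.
        public void PaintAt(Ray brushRay, Color color)
        {
            if (!Physics.Raycast(brushRay, out RaycastHit hit))
                return;

            // textureCoord is only valid when hitting a MeshCollider.
            if (!(hit.collider is MeshCollider))
                return;

            var canvasRenderer = hit.collider.GetComponent<Renderer>();
            if (canvasRenderer == null)
                return;

            var texture = canvasRenderer.material.mainTexture as Texture2D;
            if (texture == null)
                return;

            Vector2 uv = hit.textureCoord;
            texture.SetPixel((int)(uv.x * texture.width), (int)(uv.y * texture.height), color);
            texture.Apply(); // expensive per pixel; a real brush would batch this
        }
    }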
     
  5. Kurt-Dekker

    Joined:
    Mar 16, 2013
    Posts:
    36,954
    This problem is generally solved by refactoring the original code to work in stages:

    - gather input
    - process input

    Then when you swap device-based input for action-based input, only the first stage changes.
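    For example (a minimal sketch; the type names here are made up):

    Code (CSharp):
    using UnityEngine;

    // Stage 1: everything the brush needs, gathered into one plain struct.
    public struct BrushInput
    {
        public bool IsTracking;
        public bool TriggerPressed;
        public Vector3 Position;
        public Quaternion Rotation;
    }

    // Swap this implementation when moving from device-based to action-based input.
    public abstract class BrushInputSource : MonoBehaviour
    {
        public abstract BrushInput ReadInput();
    }

    // Stage 2: the drawing logic only ever sees BrushInput, never a device.
    public class BrushLogic : MonoBehaviour
    {
        [SerializeField] private BrushInputSource _inputSource;

        private void Update()
        {
            BrushInput input = _inputSource.ReadInput();
            if (input.TriggerPressed && input.IsTracking)
            {
                // begin or continue the stroke at input.Position / input.Rotation
            }
        }
    }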

    This is a massive open-ended topic. You would need to decide what the drawing medium even is. Meshes? Sprite cards in space? Spheres? Something else?

    To get started and to understand your problem space try this:

    - make a VR app
    - hook up a 3D object "paintbrush" to your hand controller
    - when the trigger is down, continuously Instantiate<T>() spheres where the tip of the wand is

    That's a 3D drawing program. That's it.
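    Sketched out (untested; every name here is a placeholder):

    Code (CSharp):
    using UnityEngine;

    public class SphereBrush : MonoBehaviour
    {
        [SerializeField] private Transform _wandTip;       // child transform at the brush tip
        [SerializeField] private GameObject _spherePrefab; // the "paint"
        [SerializeField] private float _spacing = 0.01f;   // metres between blobs

        private Vector3 _lastPoint;

        // Call every frame with the trigger state from your input action.
        public void UpdateBrush(bool triggerPressed)
        {
            if (!triggerPressed)
                return;

            // Drop a new sphere only after the tip has moved a little.
            if (Vector3.Distance(_lastPoint, _wandTip.position) < _spacing)
                return;

            Instantiate(_spherePrefab, _wandTip.position, _wandTip.rotation);
            _lastPoint = _wandTip.position;
        }
    }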

    With proper use of all the millions of VR tutorials out there and a basic understanding of Unity, that should take you less than 15 minutes.

    Everything else is just features, such as saving, editing, erasing, coloring, changing the brush, etc.