
Trying to change object color with Touch Controller

Discussion in 'VR' started by cgbenner, Jul 19, 2019.

  1. cgbenner
    Joined: Jun 26, 2019
    Posts: 13

    Hello.

    First, I'm a CAD guy, not a developer. I've been tasked with VR development... I know, right?

    I'm using VRTK 3 to build very simple interactive scenes of our machinery. I've gotten pretty good at reverse engineering the locomotion and physics interactions like simulating a valve handle turning.

    Where I am stuck is with a specific interaction. I borrowed a script that changes the color of an object with a mouse click. The script works great when NOT in VR, using the mouse. I've been trying to adapt it for the VR environment using the Oculus Touch controllers... any button will do. But I just can't get anything to happen.

    Here is the code I'm using, and the only thing I've changed is the "Input" section. Any help would be appreciated. What I want is for the user to walk or teleport over to the item and use the "hands" to touch it and have the color change.

    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;

    //Make sure to change the class name (CCSphere) to whatever you called your script.
    public class ColorChange : MonoBehaviour
    {
        public Material[] materials; //Allows input of material colors in a set-sized array
        public Renderer Rend; //What are we rendering? Input object (Sphere, Cylinder, ...) to render.
        private int index = 1; //Initialize at 1, otherwise you have to press the ball twice to change colors at first.

        // Use this for initialization
        void Start()
        {
            Rend = GetComponent<Renderer>(); //Gives functionality for the renderer
            Rend.enabled = true; //Makes the rendered 3D object visible if enabled
        }

        void OnMouseDown()
        {
            if (materials.Length == 0) //If there are no materials nothing happens.
                return;

            if (OVRInput.GetDown(OVRInput.Button.One))
            {
                index += 1; //When the button is pressed we increment to the next index location
                if (index == materials.Length + 1) //When it reaches the end of the materials it starts over.
                    index = 1;
                print(index); //Used for debugging
                Rend.sharedMaterial = materials[index - 1]; //This sets the material at the current index
            }
        }
    }
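
    For context: OnMouseDown() only fires when Unity registers a mouse click (or tap) on the object's collider, so in VR the OVRInput check inside it is never reached. A minimal sketch of the usual workaround, polling the button in Update() instead, assuming the Oculus Integration package (OVRInput needs an OVRManager/OVRCameraRig in the scene) and a made-up class name:

    Code (CSharp):
    using UnityEngine;

    // Sketch only: polls the Touch controller every frame instead of relying on
    // OnMouseDown(). Note this reacts to any press of the A button, regardless of
    // whether the hand is touching the object; tying the change to an actual touch
    // is what the VRTK event approach later in this thread does.
    public class ColorChangePolled : MonoBehaviour
    {
        public Material[] materials;
        private Renderer rend;
        private int index = 1;

        void Start()
        {
            rend = GetComponent<Renderer>();
        }

        void Update()
        {
            if (materials.Length == 0)
                return;

            // Button.One is the A button on the right Touch controller.
            if (OVRInput.GetDown(OVRInput.Button.One))
            {
                index += 1;
                if (index == materials.Length + 1)
                    index = 1;
                rend.sharedMaterial = materials[index - 1];
            }
        }
    }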
     
  2. darkesco
    Joined: Mar 19, 2015
    Posts: 61

    Hi cgbenner,

    Try:

    if (Input.GetButtonUp("Fire1"))

    instead of:

    if (OVRInput.GetDown(OVRInput.Button.One))
     
  3. cgbenner
    Joined: Jun 26, 2019
    Posts: 13

    @darkesco Thanks for the tip, but this didn't make any difference. I'll just have to keep trying.
     
  4. cgbenner
    Joined: Jun 26, 2019
    Posts: 13

    OK, so I'm taking a new approach. I can change the material on a "Touch" or "Near Touch" by changing it in the Mesh Renderer with Unity Events... but only once. Is there any way to change it to a different material with each new touch?

    What I am trying to simulate is a user touching a touch screen on a machine control panel. Each touch would take the user to a new screen. The screens can easily be made and assigned to an interactable as materials. But how to cycle through them with each new touch is the challenge.

    Ideas?
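
    One way to get a different material on every touch, rather than a one-shot swap, is to point that same Unity Event at a single public method that steps through an array, instead of assigning a material to the Mesh Renderer directly. A minimal sketch (the class and method names here are made up):

    Code (CSharp):
    using UnityEngine;

    // Sketch only: each call to NextMaterial() shows the next "screen" material,
    // wrapping around after the last one.
    public class MaterialCycler : MonoBehaviour
    {
        public Material[] materials; // assign the screen materials in the Inspector
        private Renderer rend;
        private int index = 0;

        void Start()
        {
            rend = GetComponent<Renderer>();
        }

        // Wire this method to the touch Unity Event in the Inspector.
        public void NextMaterial()
        {
            if (materials.Length == 0)
                return;

            rend.sharedMaterial = materials[index];
            index = (index + 1) % materials.Length; // wrap around after the last screen
        }
    }

    In the event slot, drag in the panel object and pick MaterialCycler > NextMaterial from the function dropdown, the same way the Mesh Renderer was hooked up before.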
     
  5. cgbenner
    Joined: Jun 26, 2019
    Posts: 13

    Finally put together enough pieces of borrowed code, worked through 6 days of debugging, countless pages visited and videos watched... errors on top of errors. And now it works. I'm no programmer at all, so don't ask me how or why it works, but as of the last check, it worked. Now I can touch the ball in VR with the Oculus Touch controller and have it cycle through a list of materials. Here is the code for anyone who wants to use it. I'm sure it's ugly, so feel free to clean it up and make something more elegant.

    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using VRTK;

    public class TestBall : MonoBehaviour
    {
        public Material[] materials; // allows input of material colors in a set-sized array
        public Renderer rend; // what are we rendering? the cube

        private int index = 1; // initialize at 1, otherwise you have to press the cube twice to change color

        // Start is called before the first frame update
        void Start()
        {
            rend = GetComponent<Renderer>(); // gives functionality for the renderer
            rend.enabled = true; // makes the rendered 3D object visible if enabled

            // make sure the object has the VRTK script attached...
            if (GetComponent<VRTK_InteractableObject>() == null)
            {
                Debug.LogError(message: "TestBall is required to be attached to an Object that has the VRTK_InteractableObject script attached to it");
                return;
            }

            // subscribe to the event. NOTE: "ObjectTouched" is the method to invoke when this object is touched.
            GetComponent<VRTK_InteractableObject>().InteractableObjectTouched += new InteractableObjectEventHandler(ObjectTouched);
        }

        // this object has been touched, so do whatever is in the code below
        void ObjectTouched(object sender, InteractableObjectEventArgs e)
        {
            if (materials.Length == 0) // if there are no materials nothing happens
                return;

            index += 1; // each touch increments to the next index location
            if (index == materials.Length + 1) // when it reaches the end of the materials it starts over
                index = 1;

            print(index); // used for debugging
            rend.sharedMaterial = materials[index - 1]; // this sets the material at the current index
        }
    }
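
    In case it helps anyone applying this: the script goes on the same GameObject as the VRTK_InteractableObject component (it logs an error and bails out in Start() if that component is missing), the materials to cycle through are assigned to the public materials array in the Inspector, and the material advances one step each time a controller touches the object.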
     
    Li-Ko likes this.
  6. Li-Ko
    Joined: Jun 4, 2019
    Posts: 1

    Hi @cgbenner, thank you so much for sharing! I am just an architect, learning Unity and VR from zero. Could you please explain in more detail how you applied this code? Is it attached to the Interactable Object, or how does it work?
    I would like to create the following interaction: I open the closet and see the area where I can drop my jacket as a transparent yellow hatch or outline (similar to a Snap Drop Zone). Once I touch this area, or touch and click with the controller, the actual jacket with all its original textures and rendered meshes appears there. And the reverse as well, so I am essentially simulating taking the jacket off and putting it on, with a predefined area to place it in.

    So basically I need a jacket mesh and two materials: the transparent yellow one and the original material. All I need is to switch between them with the controller. I would appreciate your advice and more details to make it work! Thank you so much.
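
    For the jacket case, the same touched event can drive a simple two-material toggle instead of cycling through an array. A minimal sketch, assuming the same VRTK 3 setup as the script above (the jacket mesh needs a collider and the VRTK_InteractableObject component; the class and field names here are made up):

    Code (CSharp):
    using UnityEngine;
    using VRTK;

    // Sketch only: toggles one object between a transparent placeholder material
    // and its original material each time a controller touches it.
    public class JacketToggle : MonoBehaviour
    {
        public Material placeholderMaterial; // e.g. the transparent yellow hatch
        public Material originalMaterial;    // the jacket's real textured material
        private Renderer rend;
        private bool showingOriginal = false;

        void Start()
        {
            rend = GetComponent<Renderer>();
            rend.sharedMaterial = placeholderMaterial; // start in the "empty closet" state

            VRTK_InteractableObject interactable = GetComponent<VRTK_InteractableObject>();
            if (interactable == null)
            {
                Debug.LogError("JacketToggle needs a VRTK_InteractableObject on the same GameObject");
                return;
            }

            // Swap materials every time the controller touches the jacket mesh.
            interactable.InteractableObjectTouched += new InteractableObjectEventHandler(OnTouched);
        }

        void OnTouched(object sender, InteractableObjectEventArgs e)
        {
            showingOriginal = !showingOriginal;
            rend.sharedMaterial = showingOriginal ? originalMaterial : placeholderMaterial;
        }
    }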