Question How to simulate real user input in tests?

Discussion in 'Testing & Automation' started by Tkz00, Nov 7, 2023.

  1. Tkz00


    Oct 19, 2018
    Hi! I've recently started adding tests to a project, and I've realized I need to fully simulate a user's touch (it's a mobile game). Previously I was just invoking the click event on the elements I needed, but that means a test can pass even when a real user couldn't perform the action. For example, if I test a touch on a button and that button is off screen, or behind another UI element, it's unreachable for a real user, yet the test would still pass since I'm just invoking an event on an object, not simulating a real touch on the screen.

    I've tried with this script:

    Code (CSharp):
        // Usings added for completeness; the test method is renamed because a
        // C# method can't share the name of its enclosing class.
        using System.Collections;
        using NUnit.Framework;
        using UnityEngine;
        using UnityEngine.InputSystem;
        using UnityEngine.SceneManagement;
        using UnityEngine.TestTools;

        public class TestClick : InputTestFixture
        {
            Mouse mouse;

            [SetUp]
            public override void Setup()
            {
                base.Setup();
                SceneManager.LoadScene("Scenes/Testing");
                mouse = InputSystem.AddDevice<Mouse>();
            }

            [UnityTest]
            public IEnumerator TestClickOnButton()
            {
                GameObject gameObjectToClick = GameObject.Find("Button");
                yield return ClickUI(gameObjectToClick);
                yield return new WaitForSeconds(2f);
            }

            public IEnumerator ClickUI(GameObject uiElement)
            {
                Set(mouse.position, new Vector2(300, 300));
                yield return new WaitForSeconds(0.1f);
                Click(mouse.leftButton);
                yield return new WaitForSeconds(0.1f);
            }
        }
    I also have a game object in my scene that prints the mouse position whenever there is a click anywhere in the scene (not just on that object). It works fine when I play the scene normally, but when running the test nothing is printed. I don't know what is wrong. I've seen things like this:

    Code (CSharp):
        InputSystem.QueueStateEvent(mouse,
            new MouseState
            {
                position = currentPosition + delta,
                delta = delta,
            });
    but I couldn't make it work either. If any of you have fully simulated clicks/touches, I could really use the help.
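For what it's worth, one common gotcha with this approach (a hedged sketch, assuming the new Input System package; `MoveMouse` and its parameters are made up for illustration): queued state events are not processed until the input system updates, so inside a test you may need to pump the update manually or yield a frame before the new state is visible.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.LowLevel;

public static class MouseMoveExample
{
    public static void MoveMouse(Mouse mouse, Vector2 currentPosition, Vector2 delta)
    {
        InputSystem.QueueStateEvent(mouse, new MouseState
        {
            position = currentPosition + delta,
            delta = delta,
        });

        // Without this (or yielding a frame so the player loop runs),
        // mouse.position still reports the old value.
        InputSystem.Update();
    }
}
```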

    Thanks in advance!
  2. CodeSmile


    Apr 10, 2014
    Don't go down that road!

    You do NOT have to test touch input hitting a GUI control. In fact, most test-driven developers will tell you not to test the GUI.

    The first issue with this is: WHERE do you make that touch happen? It can't be a fixed screen position in pixels; otherwise you'd have a fixed GUI where every control is always in the same location no matter the device. In that case, the issue you described could never occur, because you would design your controls to sit in the safe area visible on all devices.

    Then, if you use a percentage of the screen width and height as the touch position, how do you determine that? You'd have to read the GUI control's current position and use that, which defeats the purpose: the touch will always hit the button.

    The plain stupid test you can make is this: take any control that you know should be on screen, then test its bounding box against the screen size. Is it fully contained? Test passes, no input needed. You can keep invoking events in your unit tests; you just need to add an extra test for each control's bounding box.
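The bounding-box test described above could be sketched roughly like this (my own assumption-laden version, for a uGUI RectTransform on a Screen Space - Overlay canvas, where world corners are already in screen pixels; a camera-space canvas would need each corner run through Camera.WorldToScreenPoint first):

```csharp
using NUnit.Framework;
using UnityEngine;

public static class UiBoundsAssert
{
    public static void AssertFullyOnScreen(RectTransform rect)
    {
        // Corners come back in order: bottom-left, top-left, top-right, bottom-right.
        var corners = new Vector3[4];
        rect.GetWorldCorners(corners);

        var screen = new Rect(0, 0, Screen.width, Screen.height);
        foreach (var corner in corners)
            Assert.IsTrue(screen.Contains(corner),
                $"{rect.name} corner {corner} is outside the screen {screen.size}");
    }
}
```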

    Still, that's not a very useful test. You can entirely avoid buttons ending up outside the screen if the GUI correctly relies on canvas scaling and layout, or, even better, is built with UI Toolkit. You specify that a control is, for example, anchored at the top-right corner, with padding, margin, flex width, alignment, and so on.

    Then, while designing the GUI, you test it on the smallest and largest supported screens and the most extreme supported aspect ratios (e.g. 5:4 and 32:9). If it's on screen in those four cases, it will NEVER be off screen. Unless, of course, you animate it off screen. But again, that's something a one-time manual test can confirm, and after that you can leave it alone.
    Last edited: Nov 8, 2023
  3. Tkz00


    Oct 19, 2018
    I understand what you are saying, and I've had the same thought process, but the bottom line is that I need real integration tests, and that's impossible without simulating a user's input.

    To get the GUI's position at any resolution, you just convert its world position to a screen position. This isn't only needed to test whether the GUI is outside the screen (your argument there is correct), but also whether another GUI element sits in front of the one I'm trying to test, obstructing it from being touched by the user.

    So, while I understand your arguments, I'm still trying to implement real integration tests. I've tried the Touch API, as well as creating a Mouse device, setting its position, and calling Click(mouse.leftButton), all inside a test that inherits from InputTestFixture. If anyone has been able to make this work, please let me know.
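The obstruction case mentioned above can actually be checked without simulating input at all: ask the EventSystem what a pointer at the target's screen position would hit. A hedged sketch (the class and method names are mine, and it assumes an active EventSystem with a GraphicRaycaster on the canvas):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

public static class UiObstructionCheck
{
    public static bool IsTopmostAt(GameObject target, Vector2 screenPosition)
    {
        var pointer = new PointerEventData(EventSystem.current)
        {
            position = screenPosition,
        };

        var results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(pointer, results);

        // RaycastAll sorts results front-to-back, so index 0 is what a
        // real touch at this position would hit first.
        return results.Count > 0 &&
               (results[0].gameObject == target ||
                results[0].gameObject.transform.IsChildOf(target.transform));
    }
}
```

If this returns false for the button's center point, some other element is in front of it, which is exactly the failure a simulated touch was meant to catch.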
  4. rdjadu


    May 9, 2022
    I have forgotten most of the details here, but my suspicion would be that the mouse position is to blame. You're probably setting the position in game-view coordinates, whereas mouse coordinates are in display coordinates.

    The UI tests in the input system package basically do what you're doing here, and they translate coordinates from game screen space into display screen space.
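One way to sidestep part of the coordinate problem is to derive the click position from the UI element itself instead of hard-coding pixels like (300, 300). A sketch under my own assumptions (helper name is invented; pass null for a Screen Space - Overlay canvas, or the canvas camera otherwise):

```csharp
using UnityEngine;

public static class UiScreenPoint
{
    public static Vector2 CenterOf(RectTransform rect, Camera uiCamera = null)
    {
        // World-space center of the rect, from its four world corners...
        var corners = new Vector3[4];
        rect.GetWorldCorners(corners);
        Vector3 center = (corners[0] + corners[2]) / 2f;

        // ...converted to screen pixels in the element's own coordinate space.
        return RectTransformUtility.WorldToScreenPoint(uiCamera, center);
    }
}
```

This keeps the test resolution-independent, though any game-view-to-display translation the fixture needs would still apply on top of it.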

    IMO, going this route for full-scale integration testing is perfectly fine and in fact quite useful. These tests do tend to be of the brittle and expensive variety, though. Still, if they are kept small in number and deployed where you get a lot of bang for the buck, they can be a pretty useful tool.

    PS: if you want to test touch, there are also some helpers in
    to create touches. They also create a touchscreen on the fly, if necessary.
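As a rough illustration of those touch helpers (a sketch, not a verified test: it assumes the InputTestFixture base class's BeginTouch/EndTouch helpers, which add a Touchscreen device if none is present):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.TestTools;

public class TouchTapTest : InputTestFixture
{
    [UnityTest]
    public IEnumerator TapAtScreenCenter()
    {
        var position = new Vector2(Screen.width / 2f, Screen.height / 2f);

        BeginTouch(1, position);   // touch id 1 goes down
        yield return null;         // let a frame process the press
        EndTouch(1, position);     // touch id 1 is released
        yield return null;

        // ...then assert on whatever the tap should have triggered.
    }
}
```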