Question Where do we set up layers for what touch inputs interact with in WebGL? (performance issue)

Discussion in 'Input System' started by IllTemperedTunas, Oct 21, 2022.

  1. IllTemperedTunas

    Joined:
    Aug 31, 2012
    Posts:
    635
    I've been really excited by how performant and stable web builds have been lately, but there's one persistent issue that's making the web build nearly unplayable: any time I touch the screen or release it, there is a single frame of lag, and the more fingers pressing down or releasing simultaneously, the bigger the lag.

    I recently tried to track down the problem and found that this lag only occurs if the touches on the screen intersect a piece of geometry in the scene that has collision on it; a raycast seems to be generated for every touch and is interacting with the game scene. If I remove that collision so the raycasts intersect nothing, there is no performance hit.

    The problem is I need objects in my game for it to be a game, and I can't figure out how to get touches to interact only with the UI layer.

    I can't figure out how to set up the layers that generic touch input intersects with in Unity. I thought there might be a setting similar to the camera's "culling mask", but I can't find one anywhere... I played with Camera.eventMask a bit to no effect.

    Does anyone know where we can adjust the default touch raycast layers for WebGL?

    (This issue only occurs with touch devices in WebGL builds; every other build is fine. It's been hard to debug on my end as I'm not familiar with debugging on an external device.)
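For context, this is roughly what I've been poking at. Both knobs are real Unity APIs, but the layer name "UI" is just from my project, and as I said, Camera.eventMask didn't change anything for me:

```csharp
// Sketch: limit what pointer/touch raycasts can hit, by layer.
using UnityEngine;
using UnityEngine.EventSystems;

public class TouchLayerSetup : MonoBehaviour
{
    void Awake()
    {
        var cam = Camera.main;

        // PhysicsRaycaster.eventMask controls which layers the EventSystem's
        // pointer/touch raycasts are allowed to hit.
        var raycaster = cam.GetComponent<PhysicsRaycaster>();
        if (raycaster != null)
            raycaster.eventMask = LayerMask.GetMask("UI");

        // Camera.eventMask gates the legacy OnMouseDown/OnMouseUp events;
        // this is the one I tried with no effect.
        cam.eventMask = LayerMask.GetMask("UI");
    }
}
```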
     
  2. sunsi12138

    Joined:
    Apr 14, 2020
    Posts:
    23
    I don't think the touch input system is a good way to handle interaction with game objects; why not just use
    IDragHandler?
    The UI should be fine with only the EventSystem.
    Maybe these functions called by input events are being called too often; I'm not sure.
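Something like this (just a sketch; DraggableUI is a made-up name, and it assumes a Screen Space - Overlay canvas):

```csharp
// Sketch: handle touches through the EventSystem's interface callbacks
// instead of raycasting the whole physics scene.
using UnityEngine;
using UnityEngine.EventSystems;

public class DraggableUI : MonoBehaviour, IPointerDownHandler, IDragHandler
{
    public void OnPointerDown(PointerEventData eventData)
    {
        Debug.Log("Pointer down at " + eventData.position);
    }

    public void OnDrag(PointerEventData eventData)
    {
        // For a Screen Space - Overlay canvas, the screen position can be
        // assigned directly to the element's transform.
        transform.position = eventData.position;
    }
}
```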
     
    IllTemperedTunas likes this.
  3. IllTemperedTunas

    Joined:
    Aug 31, 2012
    Posts:
    635
    Think I've tracked down the issue. It has to do with Amplify Shader and the Universal Render Pipeline (no idea why this only happens on input down, which made it a pain to track down). Amplify Shader forces shader model 4.5, but the only way to get things to render in web builds is to set the shader target to 3.0.
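For anyone who finds this later, the change boils down to lowering the compile target in the shader. This fragment is just an illustration of where the pragma lives, not the actual Amplify output:

```
Shader "Example/WebGLSafe"
{
    SubShader
    {
        Pass
        {
            HLSLPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            // Amplify defaults to a higher target (4.5); for WebGL builds
            // this has to be lowered:
            #pragma target 3.0
            // ... vertex/fragment code ...
            ENDHLSL
        }
    }
}
```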

    EDIT: This was the issue. Looks like I'm going to be migrating my project to the built-in Shader Graph.

    NO FRIGGIN' IDEA WHY THIS SHADER PERFORMANCE ISSUE ONLY HAPPENS ON INPUT DOWN AND UP. It totally threw me off; who suspects a shader issue on button down?

    I have to think there's some odd selection stuff happening behind the scenes that is hard to disable.

    Most elusive damned bug I've ever encountered; I deconstructed and rebuilt huge sections of my project trying to hunt this down.
     
    Last edited: Oct 25, 2022