Mapping GUI button to axis?

Discussion in 'Immediate Mode GUI (IMGUI)' started by henk, Nov 12, 2007.

  1. henk

    Joined:
    Nov 7, 2007
    Posts:
    21
    Because a new game of ours is going to be played on a touch screen, I need to control my FPC through screen / GUI buttons instead of keys / mouse. Now I was wondering if it is possible to map a screen / GUI button to an axis?
     
  2. Eric5h5

    Volunteer Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,401
    Not directly; you can't make an input event fire as the result of a GUI button. But you can just use the same input routines and pass values based on what's being "clicked" on.
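
    A minimal sketch of this approach, assuming Unity's IMGUI (the class and field names here are made up, not part of any standard script): GUI.RepeatButton returns true every frame while it is held down, so a held on-screen button can stand in for an axis value of 1 or -1.

    ```csharp
    using UnityEngine;

    public class TouchButtons : MonoBehaviour
    {
        // Hypothetical replacement for Input.GetAxis("Vertical"):
        // movement code reads this field instead of the real axis.
        public static float verticalAxis = 0f;

        void OnGUI()
        {
            // RepeatButton fires every frame while the button is held
            verticalAxis = 0f;
            if (GUI.RepeatButton(new Rect(10, 10, 80, 40), "Forward"))
                verticalAxis = 1f;
            if (GUI.RepeatButton(new Rect(10, 60, 80, 40), "Back"))
                verticalAxis = -1f;
        }
    }
    ```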

    --Eric
     
  3. henk

    Joined:
    Nov 7, 2007
    Posts:
    21
    Thanks Eric,

    but I can't make use of the GetAxis function to detect x,y,z values, can I?
    So in the case of an FPC that is controlled by GUI buttons, I have to replace Input.GetAxis with the value 1 or -1, depending on which button is pressed? And how do I detect which GUI button is pressed? I can't use Input.GetButton, I suppose?

    Henk
     
  4. Eric5h5

    Volunteer Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,401
    Exactly.

    That depends on how the touchscreen works: does it emulate a mouse? If so, you could just use OnMouseDown events, as if the user were clicking on GUI buttons.
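
    If the touchscreen does emulate a mouse, a sketch of this could be as simple as a collider with an OnMouseDown handler (the class name is illustrative):

    ```csharp
    using UnityEngine;

    // Attach to any object with a collider: if the touchscreen emulates
    // a mouse, OnMouseDown fires when the screen is touched over it.
    public class TouchTarget : MonoBehaviour
    {
        void OnMouseDown()
        {
            Debug.Log("Touched: " + name);
        }
    }
    ```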

    --Eric
     
  5. henk

    Joined:
    Nov 7, 2007
    Posts:
    21
    Our touchscreen game is making good progress. However, I have run into an issue that I want to share.

    We use the standard MouseLook.cs script, adjusted a little: MouseLook now acts on dragging the mouse instead of moving it (there is no MouseMove event on touch screens, just a MouseDrag event).

    On touchscreens (Win XP) this was supposed to work OK. But it turns out that even touching the screen only briefly makes the camera flip back and forth between very different positions. How can this be? Normal dragging in the Windows environment, like dragging windows, works fine.

    To understand better what was going on, I tried using a Wacom tablet in 'pen' mode to mimic the behavior of a touchscreen. That way everything worked perfectly.

    So now I'm wondering where to look for a solution. MouseLook uses axes (Mouse X and Mouse Y); do they not behave properly with a touchscreen? Would it be a good idea to use mousePosition in the MouseLook script instead? Or something else?

    Any suggestions are welcome.
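
    For reference, a drag-based look could also be sketched without the Mouse X / Mouse Y axes at all, by tracking the change in Input.mousePosition between frames (the sensitivity value here is just an illustrative guess):

    ```csharp
    using UnityEngine;

    // Sketch of a drag-based look, assuming mouse emulation: rotate only
    // while the button/finger is down, using the mousePosition delta
    // instead of the "Mouse X"/"Mouse Y" axes.
    public class DragLook : MonoBehaviour
    {
        public float sensitivity = 0.2f;  // illustrative value
        private Vector3 lastPos;

        void Update()
        {
            if (Input.GetMouseButtonDown(0))
            {
                lastPos = Input.mousePosition;   // anchor the drag here
            }
            else if (Input.GetMouseButton(0))
            {
                Vector3 delta = Input.mousePosition - lastPos;
                lastPos = Input.mousePosition;
                // yaw from horizontal drag, pitch from vertical drag
                transform.Rotate(-delta.y * sensitivity, delta.x * sensitivity, 0);
            }
        }
    }
    ```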
     
  6. jeffcraighead

    Joined:
    Nov 15, 2006
    Posts:
    740
    I would first print out the values being received from the touch screen. It is possible that the touch screen driver is misbehaving.
     
  7. shaun

    Joined:
    Mar 23, 2007
    Posts:
    728
    I think Jeff may be correct. If you are using delta values, it's possible they are misbehaving. Also, are you clamping the values (Mathf.Clamp()) returned from the drag operation? That might help reduce the problem.
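
    A sketch of the clamping idea: cap each per-frame delta so a single bad sample from the driver can't throw the camera ("maxDelta" is an illustrative limit, not part of the original script).

    ```csharp
    // inside the look script's Update():
    float maxDelta = 2f;  // assumed limit, tune for your hardware
    float dx = Mathf.Clamp(Input.GetAxis("Mouse X"), -maxDelta, maxDelta);
    float dy = Mathf.Clamp(Input.GetAxis("Mouse Y"), -maxDelta, maxDelta);
    // feed dx/dy into the rotation code instead of the raw axis values
    ```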

    Cheers
    Shaun
     
  8. henk

    Joined:
    Nov 7, 2007
    Posts:
    21
    Thanks Jeff, Shaun,

    I've done some testing, and it turns out the touchscreen driver returns correct mouse position values, but the Mouse X and Mouse Y axis values look wrong. Mouse X jumps to values like 16 or 24, while Mouse Y tops out at about 6 or 7.

    Here is a short list for comparison (first column is the mouse position x,y; then the Mouse X and Mouse Y axis values):
    touch screen:
    260,313 0,0
    257,311 -11,-8
    257,311 -11,-8
    252,310 -14,-5
    252,310 -14,-5
    252,310 0,0
    252,310 0,0
    248,309 -13,-6
    248,309 -13,-6
    244,308 -12,-5
    244,308 -12,-5
    244,308 0,0
    244,308 0,0
    etc (axis values are rounded)

    mouse:
    266,307 -0.1, 0
    263,307 -0.25, 0
    263,307 -0.25, 0
    261,307 -0.15, 0
    261,307 -0.15, 0
    260,307 -0.1, -0.05
    260,307 -0.1, -0.05
    259,307 -0.1, 0
    259,307 -0.1, 0
    259,307 0, 0

    The game is running at a more or less constant speed of 30 fps.

    Comparing the lists, it turns out the mouse position deltas are somewhat bigger in the case of the touchscreen. Probably because the resolution of the touchscreen is not as high as that of the mouse (resolution is restricted by the number of infrared beams along the edge of the screen). I'll contact the manufacturer to check this out.

    Now I'll test whether the axes become usable if I divide the Mouse X and Y values by about 100.
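
    A sketch of that scaling experiment (the factor 100 comes from comparing the two logs above and is just a starting guess):

    ```csharp
    // scale the oversized touchscreen axis values down to roughly
    // the range a mouse would produce
    float lookX = Input.GetAxis("Mouse X") / 100f;
    float lookY = Input.GetAxis("Mouse Y") / 100f;
    ```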
     
  9. shaun

    shaun

    Joined:
    Mar 23, 2007
    Posts:
    728
    I don't know if this is a ridiculous solution or not, as I don't know the content of your game. Could you not use an invisible plane that sits in front of the camera, raycast to find the touched position, and then move the character towards that vector? The rate of update could determine how quickly the character responds to new coordinates. Since the user is controlling on a 2D plane (the touchscreen), the X,Z coordinates of the in-game plane must have some relevance and should provide meaningful feedback. I feel I've explained that badly, but hopefully it makes some sense. (I'm imagining something like moving a guy around a room while he follows your finger on the touchscreen, with some latency between his movements and your finger.)
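
    A sketch of this idea, assuming a large invisible collider placed in front of the camera ("speed" and the plane setup are assumptions, not part of any standard script):

    ```csharp
    using UnityEngine;

    // Raycast from the touch point onto an invisible collider and steer
    // the character toward the hit point. The gap between finger and
    // character gives the latency described above.
    public class FollowTouch : MonoBehaviour
    {
        public float speed = 2f;  // illustrative movement rate

        void Update()
        {
            if (Input.GetMouseButton(0))
            {
                Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
                RaycastHit hit;
                if (Physics.Raycast(ray, out hit))
                {
                    // move on the X,Z plane toward the touched point
                    Vector3 target = new Vector3(hit.point.x,
                                                 transform.position.y,
                                                 hit.point.z);
                    transform.position = Vector3.MoveTowards(
                        transform.position, target, speed * Time.deltaTime);
                }
            }
        }
    }
    ```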

    Cheers
    Shaun
     
  10. henk

    Joined:
    Nov 7, 2007
    Posts:
    21
    Interesting idea, Shaun.

    Actually, I had been thinking about a solution like this before, but this game has a first-person perspective and I want to control movement and camera direction separately. I used mouse dragging for camera movement and mouse clicking for 'firing', so I needed some GUI buttons for character movement.

    But now it turns out that using axes for input on a touchscreen is problematic. Not only because of the relatively low resolution of an (infrared) touchscreen (see my previous message), but also because a single click on the screen produces high X and Y axis values whenever the touch position differs from the previous click. That shouldn't happen: axis values should change when the mouse moves or drags, not when you simply click in a different location. I don't understand why this happens, so I have decided that the axes are no good for me.

    Now I'm looking in a different direction. I have created an invisible GUI texture spanning the whole screen, so I can easily use mouse dragging, clicking, and double clicking. Dragging is used for character movement: up and down the screen represents forward and backward movement, left and right speaks for itself. Clicking/pressing the screen without dragging rotates the camera, depending on the mouse position. And double clicking means 'firing'. So far the whole system seems to work OK.
    Does this sound like a good solution to the experts?
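
    For readers following along, the scheme above could be sketched like this. This is not henk's actual code: the thresholds, method names, and structure are all assumptions for illustration.

    ```csharp
    using UnityEngine;

    // One full-screen handler that distinguishes drag (move),
    // single click (rotate toward the touch) and double click (fire).
    public class TouchController : MonoBehaviour
    {
        public float dragThreshold = 5f;      // pixels before a press counts as a drag
        public float doubleClickTime = 0.3f;  // max seconds between taps

        private Vector3 pressPos;
        private float lastClickTime = -1f;

        void Update()
        {
            if (Input.GetMouseButtonDown(0))
            {
                pressPos = Input.mousePosition;
            }
            else if (Input.GetMouseButton(0))
            {
                Vector3 delta = Input.mousePosition - pressPos;
                if (delta.magnitude > dragThreshold)
                    Move(delta);  // up/down = forward/back, left/right = strafe
            }
            else if (Input.GetMouseButtonUp(0))
            {
                // a release without a drag is a click
                if ((Input.mousePosition - pressPos).magnitude <= dragThreshold)
                {
                    if (Time.time - lastClickTime < doubleClickTime)
                        Fire();                              // double click
                    else
                        RotateToward(Input.mousePosition);   // single click
                    lastClickTime = Time.time;
                }
            }
        }

        void Move(Vector3 delta)           { /* character movement */ }
        void RotateToward(Vector3 screen)  { /* camera rotation */ }
        void Fire()                        { /* shoot */ }
    }
    ```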