When I used NGUI, I used raycasts a lot to detect which GUI elements are "under" my mouse. How can I do that with the new GUI system, since there is no camera I can shoot my raycast at? Thanks for your help!
Right now I'm using a World Space Canvas and have attached colliders to the GUI objects (a single button, for now) that I want to hit with a raycast. It works. But now I can't figure out how to use the event system to broadcast selection events etc. to the UI system. It keeps freaking out over the wrong type of object I'm passing to it.
For 3D colliders, attach a PhysicsRaycaster to your camera. This will then allow objects (cubes, spheres, etc.) to receive events such as OnDrag, OnPointerDown, and all the rest.
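As a minimal sketch of that setup (assuming a PhysicsRaycaster has already been added to the camera and the object has a 3D collider; the class name here is mine), an object can receive those events by implementing the event interfaces:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Hypothetical example: with a PhysicsRaycaster on the camera and a collider
// on this GameObject, the EventSystem delivers pointer events to these handlers.
public class ClickableCube : MonoBehaviour, IPointerDownHandler, IPointerEnterHandler
{
    public void OnPointerDown(PointerEventData eventData)
    {
        Debug.Log("Pressed: " + gameObject.name);
    }

    public void OnPointerEnter(PointerEventData eventData)
    {
        Debug.Log("Hovering: " + gameObject.name);
    }
}
```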
I do not want 3D colliders. I just want to detect whether there is, for example, a button under my mouse cursor. I added 2D BoxColliders to my GUI items and a Physics2DRaycaster to my camera, but my raycast hits stay empty.
I've only been poking at the 4.6 beta for a few minutes, but for that scenario it might be worth adding an EventTrigger component to the UI button, clicking its 'Add New' button, and choosing 'Pointer Enter'. Then click the plus to decide what you want to communicate with when that event is triggered. I can't say I've actually done anything with the new event components yet, but that's my initial hunch. Edit: tried it quickly and it worked as I expected, yay.
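The same inspector steps can be sketched in code. This is an assumption-laden example, not from the thread: the list property on EventTrigger has been named `delegates` in some 4.6 builds and `triggers` in later versions, so it may need adjusting for your Unity version.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Hypothetical sketch: wiring up the same 'Pointer Enter' entry from code
// instead of through the inspector.
public class HoverLogger : MonoBehaviour
{
    void Start()
    {
        var trigger = gameObject.AddComponent<EventTrigger>();

        var entry = new EventTrigger.Entry();
        entry.eventID = EventTriggerType.PointerEnter;
        entry.callback.AddListener((data) => Debug.Log("Pointer entered " + name));

        trigger.triggers.Add(entry); // 'delegates' in some early 4.6 builds
    }
}
```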
You want to call IsPointerOverEventSystemObject on the EventSystem. It will return true if the cursor is over an event system object.
I would still like to know how to do this with a raycast. I am using a different input method that doesn't use a mouse, but I have screen coordinates and I want to be able to trigger "highlight" or "pressed" events on buttons hit by a raycast from those screen coordinates.
Hi, To do this you want to write a custom InputModule for the UI. It's the designed way to send events to objects within the UI system. If you look here you will find our implementations for the pointer input modules: https://gist.github.com/stramit/c98b992c43f7313084ac https://gist.github.com/stramit/ce455682b7944bdff0e7
Thanks so much for the reply. It's unclear to me when the functions in the pointer module get called (e.g. GetMousePointerEventData(), GetTouchPointerEventData(Touch input, out bool pressed, out bool released)). Presumably, to write a custom InputModule, I'd need to have these functions registered/called somehow.
Well, you don't need to call them yourself; you just need to create valid event data (they are mostly helper functions). So, for example, you need to populate one with the position and the delta since last frame. We do it like this:

Code (csharp):

protected virtual PointerEventData GetMousePointerEventData()
{
    PointerEventData pointerData;
    var created = GetPointerData(kMouseId, out pointerData, true);

    pointerData.Reset();

    if (created)
        pointerData.position = Input.mousePosition;

    Vector2 pos = Input.mousePosition;
    pointerData.delta = pos - pointerData.position;
    pointerData.position = pos;

    eventSystem.RaycastAll(pointerData, m_RaycastResultCache);
    var raycast = FindFirstRaycast(m_RaycastResultCache);
    pointerData.pointerCurrentRaycast = raycast;
    m_RaycastResultCache.Clear();

    return pointerData;
}

So we just create a PointerEventData, populate it with position and delta, then use it for a raycast. What you want to do is the same, but instead of using the mouse, use your screen point. You may not even need/support delta, but you could if you wanted.
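Following that pattern, here is a minimal sketch of the same idea driven by an arbitrary screen point instead of the mouse. This is my own illustrative variant (the method name is hypothetical), assuming it lives in a PointerInputModule subclass so the protected helpers and `m_RaycastResultCache` are available:

```csharp
// Sketch only: a helper on a custom PointerInputModule subclass that
// raycasts the UI from any screen position, not just Input.mousePosition.
protected PointerEventData GetScreenPointEventData(Vector2 screenPos)
{
    PointerEventData pointerData;
    GetPointerData(kMouseId, out pointerData, true);
    pointerData.Reset();

    // Delta may not matter for a one-off query, but it mirrors the mouse version.
    pointerData.delta = screenPos - pointerData.position;
    pointerData.position = screenPos;

    eventSystem.RaycastAll(pointerData, m_RaycastResultCache);
    pointerData.pointerCurrentRaycast = FindFirstRaycast(m_RaycastResultCache);
    m_RaycastResultCache.Clear();

    return pointerData;
}
```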
I'm still pretty confused about how to do what the original poster requested. With NGUI, I used RaycastAll to get an array of hits under my mouse. With the new UI, I cannot seem to do this. I've looked over this thread and many more, but I'm just not getting it. In NGUI, this is what I use to get all game objects under the mouse:

Code (CSharp):

int layer = 1 << 15;
Ray myray = UICamera.currentCamera.ScreenPointToRay(Input.mousePosition);
RaycastHit[] hits = Physics.RaycastAll(myray, 1000.0f, layer);

Tim, I see the links to the two scripts and the sample above, but I'm not sure how to actually implement these. Are the two scripts functionality that is coming in a future version of 4.6?
It is already in 4.6. It's the source code of the TouchInputModule and StandaloneInputModule that are found on the "EventSystem" game object that gets created when you first create the uGUI in your scene. Those classes coupled with EventSystem script is responsible for the event handling of the uGUI.
Okay, thanks! I almost have everything set up, but I'm getting an error. Here is my code. When I call RaycastMouse from another class, the debugger throws a System.NullReferenceException when it hits the line with GetMousePointerEventData. (I've not added a return value yet.)

Code (CSharp):

internal class MyPointerInputModule : PointerInputModule
{
    public override void Process() { }

    public void RaycastMouse()
    {
        PointerEventData ped = GetMousePointerEventData(); // I get a System.NullReferenceException here
        eventSystem.RaycastAll(ped, m_RaycastResultCache);
        List<RaycastResult> results = m_RaycastResultCache;
    }
}
By the way, I created a very simple C# script that tests for mouse-over using the event system. It works great on my Mac, but when I export to an iOS device it doesn't; seems like a bug somewhere in the software. Here is the script:

Code (CSharp):

using UnityEngine;
using System.Collections;
using UnityEngine.EventSystems;

public class touchTest : MonoBehaviour
{
    public int overUI = 0;

    public int ReturnUIState()
    {
        return overUI;
    }

    void Update()
    {
        if (EventSystemManager.currentSystem.IsPointerOverEventSystemObject())
        {
            // we're over a UI element... return 1
            overUI = 1;
        }
        else
        {
            overUI = 0;
        }
    }
}
Ah! That's because the function takes an id for which pointer: 1 is the first finger, 2 is the second finger, etc. By default, if you don't specify an argument, it uses the mouse left-click id (-1).
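Applied to the script above, that fix might look like the sketch below: checking each active touch by passing its fingerId rather than relying on the default mouse id. This is my own illustration, using the EventSystemManager/IsPointerOverEventSystemObject names from the 4.6 beta as quoted in this thread (later builds renamed them).

```csharp
// Sketch: test each active touch against the UI by passing its fingerId,
// instead of the default mouse pointer id (-1) used by the parameterless call.
void Update()
{
    foreach (Touch touch in Input.touches)
    {
        if (EventSystemManager.currentSystem.IsPointerOverEventSystemObject(touch.fingerId))
        {
            Debug.Log("Touch " + touch.fingerId + " is over a UI element");
        }
    }
}
```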
That code appears to be obsolete, or am I doing something wrong? http://forum.unity3d.com/threads/standalone-input-module-sources.273066/#post-1839473
Hi, I need to do the same: a simple raycast to find the RectTransform under a given screen position. Does anyone have simple code to do this? I am horrified to see that such a simple thing is so complicated to do. I have more and more questions about the new GUI... I hope the documentation will be more explicit in the official release, with examples...
Hi, I finally succeeded with the information provided by Tim C in this post. But it might be nice to have a simpler method to obtain a PointerEventData. Here is my script:

Code (CSharp):

using UnityEngine;
using UnityEngine.EventSystems;
using System.Collections;
using System.Collections.Generic;

public class Mypointer : PointerInputModule
{
    public override void Process() { }

    public List<RaycastResult> RaycastMouse()
    {
        PointerEventData pointerData = null;
        if (!m_PointerData.TryGetValue(-1, out pointerData))
        {
            pointerData = new PointerEventData(eventSystem)
            {
                pointerId = -1,
            };
            m_PointerData.Add(-1, pointerData);
        }

        pointerData.position = Input.mousePosition;
        eventSystem.RaycastAll(pointerData, m_RaycastResultCache);
        List<RaycastResult> results = m_RaycastResultCache;
        Debug.Log(results.Count);
        return results;
    }
}
I am trying to do this same thing. I want to hit a new UI button with a raycast. I am using JavaScript (UnityScript), and I have no idea what you guys are talking about. I would request that Unity add this to the UI. Besides that, I would also love some explanation of this workaround in JavaScript, with just a little more breakdown. I'm not understanding how to really get started with this problem, as I have only worked in JavaScript. Any help would be greatly appreciated.
Just wanted to share what worked for me. In my case I had an item and an inventory, and I needed to cast a ray from the item to the slots in the inventory to determine whether it fits. Here's the question with the answer: http://forum.unity3d.com/threads/raycast-towards-ui-elements.284264/ - I just needed eventSystem.RaycastAll(pointer, results), where pointer is a PointerEventData with its position set to where you want to raycast from.
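A minimal sketch of that approach, wrapped in a helper for convenience (the class and method names here are mine, not from the linked answer):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

// Sketch of the approach described above: raycast the UI from an arbitrary
// screen position using EventSystem.RaycastAll.
public static class UIRaycastHelper // hypothetical helper name
{
    public static List<RaycastResult> RaycastAt(EventSystem eventSystem, Vector2 screenPosition)
    {
        var pointer = new PointerEventData(eventSystem) { position = screenPosition };
        var results = new List<RaycastResult>();
        eventSystem.RaycastAll(pointer, results);
        return results; // sorted results; check gameObject on each hit
    }
}
```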
You need to read the post thoroughly before wasting time writing an insultingly basic answer that you assume is what everyone is looking for. We are asking if there is a way to use raycasts to select UI elements, not 3D objects in the scene. One reason we may want this is to support functionality such as multi-touch, which UnityEngine.EventSystems does not support, to my knowledge.
@Galactic_Muffin Hi, IMHO, I don't think raycasting to select UI elements has much to do with having or not having multi-touch. You can also get both: you can iterate through the list of UI images or buttons you get from eventSystem.RaycastAll and pick the suitable object from there, if it matches your item. For touch, you can also iterate through Input.touches and handle any suitable touch you find.
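Combining both suggestions, a sketch of per-touch UI raycasting might look like this. This is my own illustration; it assumes the released 4.6 API where the singleton is `EventSystem.current` (the beta builds quoted earlier in the thread used `EventSystemManager.currentSystem` instead).

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

// Sketch: iterate Input.touches and raycast the UI at each touch position,
// so each finger can independently find the UI elements under it.
public class MultiTouchUIRaycaster : MonoBehaviour
{
    void Update()
    {
        foreach (Touch touch in Input.touches)
        {
            var pointer = new PointerEventData(EventSystem.current) { position = touch.position };
            var results = new List<RaycastResult>();
            EventSystem.current.RaycastAll(pointer, results);

            foreach (var result in results)
                Debug.Log("Touch " + touch.fingerId + " hit " + result.gameObject.name);
        }
    }
}
```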
Sorry to resurrect an old thread; I'm just wondering if any simplification of the above is now available. In reply to @eses: raycasting to select UI elements does not have anything to do with multi-touch per se. The reason people bump into this when they are dealing with multi-touch is that the first step towards correct multi-touch support involves disabling mouse pointer emulation: Input.simulateMouseWithTouches = false. What happens then, in my experience, is that a Button won't even respond to a tap. So developers are unwittingly exposed to a mix of low-level and/or sophisticated APIs. Compared to what? Compared to, say, associating a handler with a Button, which with a mouse (or mouse emulation, a.k.a. single touch) does not even require programming. It would be nice if simple things (such as rect transforms being tapped/clicked and generating related events) were handled consistently, with or without mouse emulation.