Resolved ARKit support for iOS via Unity-ARKit-Plugin

Discussion in 'AR' started by jimmya, Jun 5, 2017.

Thread Status:
Not open for further replies.
  1. gamescorpion

    gamescorpion

    Joined:
    Feb 15, 2014
    Posts:
    132
    Same issue here. jimmya, or anyone on the Unity team, do you have an answer on how to reset the camera in code? (I'm searching the code myself for now, but it would be nice if there were a simple ResetARCamera() to reset the camera back to its original position from when the game first started up.)
     
  2. gamescorpion

    gamescorpion

    Joined:
    Feb 15, 2014
    Posts:
    132
    @km30: I found this link (https://stackoverflow.com/questions/46965201/how-to-completely-reset-an-arkit-scene-in-unity) and am testing it out now to see if it actually works:

    Code (CSharp):
    public void ResetScene()
    {
        ARKitWorldTrackingSessionConfiguration sessionConfig = new ARKitWorldTrackingSessionConfiguration(UnityARAlignment.UnityARAlignmentGravity, UnityARPlaneDetection.Horizontal);
        UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfigAndOptions(sessionConfig, UnityARSessionRunOption.ARSessionRunOptionRemoveExistingAnchors | UnityARSessionRunOption.ARSessionRunOptionResetTracking);
    }
     
  3. shawnblais

    shawnblais

    Joined:
    Oct 11, 2012
    Posts:
    324
    That class has evolved into this in ARInterface:
    https://github.com/Unity-Technologi...ARInterface/Scripts/ARPointCloudVisualizer.cs
     
    sama-van likes this.
  4. PixHammer_

    PixHammer_

    Joined:
    Nov 20, 2017
    Posts:
    16
    Hi, I'm currently developing a game using face tracking and am having an issue where my 'ARFaceAnchorRemovedEvent' isn't firing. Nothing seems to be wrong; both the added and updated events are working, just not the removed one. Any assistance?

    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class ARManager : MonoBehaviour {

        private UnityARSessionNativeInterface Session;
        public bool isAvailable = false;
        public bool isCameraAuthorized = false;

        [Header("Face Variables")]
        public bool faceIsPresent = false;
        public Dictionary<string, float> currentBlendShapes;

        // Use this for initialization
        void Start () {
            // mark this object for persistence
            DontDestroyOnLoad (gameObject);
        }

        void OnEnable() {
            currentBlendShapes = new Dictionary<string, float> ();

            ARKitFaceTrackingConfiguration Config = new ARKitFaceTrackingConfiguration ();
            Config.alignment = UnityARAlignment.UnityARAlignmentGravity;
            Config.enableLightEstimation = true;

            if (Session != null) {
                //set up AR session
                if (!Application.isEditor && Config.IsSupported) {
                    Session.RunWithConfigAndOptions (Config, UnityARSessionRunOption.ARSessionRunOptionResetTracking);
                    Debug.Log ("AR IS RESTARTED");
                    isAvailable = true;
                } else {
                    Debug.Log ("AR RESTART NOT AVAILABLE");
                    isAvailable = false;
                }
                UnityARSessionNativeInterface.ARSessionTrackingChangedEvent += onTrackingChange;
            } else {
                //set up AR session
                Session = UnityARSessionNativeInterface.GetARSessionNativeInterface ();
                if (!Application.isEditor && Config.IsSupported) {
                    Session.RunWithConfig (Config);
                    UnityARSessionNativeInterface.ARFaceAnchorAddedEvent += FaceAdded;
                    UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
                    UnityARSessionNativeInterface.ARFaceAnchorRemovedEvent += FaceRemoved;
                    Debug.Log ("AR IS RUNNING");
                    isAvailable = true;
                } else {
                    Debug.Log ("AR IS NOT AVAILABLE");
                    isAvailable = false;
                }
            }
        }

        void onTrackingChange (UnityARCamera camera) {
            Debug.Log ("TRACKING STATE: " + camera.trackingState);
            Debug.Log ("TRACKING REASON: " + camera.trackingReason);
        }

        void OnDisable () {
            if (Session != null) {
                Debug.Log ("AR IS PAUSED");
                Session.Pause();
            }
        }

        public void FaceAdded (ARFaceAnchor anchorData) {
            currentBlendShapes = anchorData.blendShapes;
            Debug.Log("Face Added");
        }

        public void FaceUpdated (ARFaceAnchor anchorData) {
            currentBlendShapes = anchorData.blendShapes;
        }

        public void FaceRemoved (ARFaceAnchor anchorData) {
            Debug.Log("Face Removed");
        }
    }
    With this, only "Face Added" shows up in my console and I'm getting all my blend shape data, but "Face Removed" never appears. All help would be much appreciated :)
     
    Last edited: Dec 11, 2017
    ina likes this.
  5. gamescorpion

    gamescorpion

    Joined:
    Feb 15, 2014
    Posts:
    132
    I answered my own question. I confirmed that this works: you can reset the camera position with this code. However, if tracking was not established correctly to begin with (e.g. the camera did not pick up tracking correctly when starting, or it can't find a plane to track because the lens is covered or something similar), then the app will crash when you call this code. Otherwise it works perfectly.
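    For reference, here's a minimal sketch of one way to guard against that crash: cache the tracking state from ARSessionTrackingChangedEvent and only allow the reset once tracking has been established. The class and field names are illustrative, not from the plugin.

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class GuardedARReset : MonoBehaviour
    {
        // Updated by the tracking-changed callback below.
        private ARTrackingState lastTrackingState = ARTrackingState.ARTrackingStateNotAvailable;

        void OnEnable()
        {
            UnityARSessionNativeInterface.ARSessionTrackingChangedEvent += OnTrackingChanged;
        }

        void OnDisable()
        {
            UnityARSessionNativeInterface.ARSessionTrackingChangedEvent -= OnTrackingChanged;
        }

        void OnTrackingChanged(UnityARCamera camera)
        {
            lastTrackingState = camera.trackingState;
        }

        public void ResetScene()
        {
            // Skip the reset while tracking was never established; calling
            // RunWithConfigAndOptions in that state reportedly crashes the app.
            if (lastTrackingState != ARTrackingState.ARTrackingStateNormal)
                return;

            var sessionConfig = new ARKitWorldTrackingSessionConfiguration(
                UnityARAlignment.UnityARAlignmentGravity,
                UnityARPlaneDetection.Horizontal);

            UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfigAndOptions(
                sessionConfig,
                UnityARSessionRunOption.ARSessionRunOptionRemoveExistingAnchors |
                UnityARSessionRunOption.ARSessionRunOptionResetTracking);
        }
    }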
     
    Terry_Stark and SourceKraut like this.
  6. supranshmurtyofficial

    supranshmurtyofficial

    Joined:
    Dec 12, 2017
    Posts:
    2
    I have a procedurally generated mesh that draws a graph based on a function I pass in. I want to render that mesh in AR, but whenever I do, it gets rendered in mid-air and keeps drifting away from me instead of being placed on a plane. I've tried all the methods listed here, but it still doesn't work. Can someone help?
     
  7. Parallax-hk

    Parallax-hk

    Joined:
    Mar 28, 2017
    Posts:
    4
  8. Nitin_CMLABS

    Nitin_CMLABS

    Joined:
    Jul 10, 2015
    Posts:
    15
    Hello, I have tested face tracking on an iPhone X and it works fine, but I also need to implement face tracking on the iPhone 6s, 7, 7 Plus, 8, and 8 Plus.
    Could you please suggest an alternative way to implement face tracking for all iPhones, as I need to augment 3D models on the face and move them at runtime?
    Looking forward to your reply.
     
  9. yuliwei

    yuliwei

    Joined:
    Aug 19, 2017
    Posts:
    22
    I had the same issue. Have you figured out how to solve it?
     
  10. yuliwei

    yuliwei

    Joined:
    Aug 19, 2017
    Posts:
    22
    Any luck figuring out a solution for iPhone X? I don't want to import the plugin update because it would mess up the project.
     
  11. yuliwei

    yuliwei

    Joined:
    Aug 19, 2017
    Posts:
    22
    Hi jimmya, but it WILL mess up the project. Could you please tell me how to fix the iPhone X issue without messing up the project settings?
     
  12. shawnblais

    shawnblais

    Joined:
    Oct 11, 2012
    Posts:
    324
    You don't need to import the whole repo, just grab the ARKit folder from /Plugins.
     
  13. shawnblais

    shawnblais

    Joined:
    Oct 11, 2012
    Posts:
    324
    Last edited: Dec 13, 2017
    jimmya and gamescorpion like this.
  14. stuartlangfield

    stuartlangfield

    Joined:
    Jul 21, 2017
    Posts:
    13
  15. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Try putting the mesh under a root transform gameobject, and place that gameobject on a plane.
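    For illustration, a minimal sketch of that approach using the plugin's hit-test API; "meshRoot" is a hypothetical root transform that parents the procedural mesh.

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class PlaceMeshRootOnPlane : MonoBehaviour
    {
        public Transform meshRoot; // hypothetical root holding the generated mesh

        void Update()
        {
            if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
                return;

            // Hit-test against detected planes at the touch position.
            Vector3 screenPosition = Camera.main.ScreenToViewportPoint(Input.GetTouch(0).position);
            ARPoint point = new ARPoint { x = screenPosition.x, y = screenPosition.y };

            List<ARHitTestResult> results = UnityARSessionNativeInterface
                .GetARSessionNativeInterface()
                .HitTest(point, ARHitTestResultType.ARHitTestResultTypeExistingPlaneUsingExtent);

            if (results.Count > 0)
            {
                // Move the whole root onto the plane so the mesh stays anchored there.
                meshRoot.position = UnityARMatrixOps.GetPosition(results[0].worldTransform);
                meshRoot.rotation = UnityARMatrixOps.GetRotation(results[0].worldTransform);
            }
        }
    }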
     
  16. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    You could look into the OpenCV or Vision frameworks, but you will not get the same fidelity and quality, as they don't use the depth sensor the iPhone X has on its front camera.
     
  17. supranshmurtyofficial

    supranshmurtyofficial

    Joined:
    Dec 12, 2017
    Posts:
    2
    I have the mesh under a game object; that game object contains the mesh and a shadow plane, and it is itself inside another game object. I feel like the issue was with my iPhone 8; apparently these things don't work well on an iPhone 8.
     
  18. shawnblais

    shawnblais

    Joined:
    Oct 11, 2012
    Posts:
    324
    On the note of quality, I can't seem to figure out how to get the ARTrackingQuality values. I can get ARTrackingState and ARTrackingStateReason from the UnityARCamera, but quality is not included... any ideas?
     
  19. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    TrackingQuality was the old name for TrackingState; it's vestigial.
     
    shawnblais likes this.
  20. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    That should not be the case.
     
  21. yester30

    yester30

    Joined:
    Mar 4, 2015
    Posts:
    4
    Is anyone else getting an iOS device runtime crash/exception in the light estimation code when Enable Light Estimation is OFF?
    It seems to happen every time the tracking quality drops to 1 and then goes back to 2 again...
    The only way we've found to avoid the crash is to keep light estimation ON, which we sometimes don't want it to be.
    We have had this issue for several weeks now.
     
  22. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    If you're running it from Xcode when the crash happens, could you record the log output or call stack of the crash?
     
  23. SweatyChair

    SweatyChair

    Joined:
    Feb 15, 2016
    Posts:
    140
    ARKit occasionally crashes on iPhone 7 Plus and X at the moment surfaces are found:

    Message
    Code (csharp):
    Native Crash - _platform_memmove (Unknown File)
    Stack trace:
    Code (csharp):
    Thread 0 (crashed)
    0   libsystem_platform.dylib            _platform_memmove
    1   crashapp                            MarshalDirectionalLightEstimate_MarshalCoefficients_m987551255
    2   crashapp                            UnityMarshalLightData_op_Implicit_m3245280021
    3   crashapp                            UnityARSessionNativeInterface__frame_update_m1325618420
    4   crashapp                            ReversePInvokeWrapper_UnityARSessionNativeInterface__frame_update_m1325618420
    5   crashapp                            __41-[UnityARSession session:didUpdateFrame:]_block_invoke
    6   libdispatch.dylib                   _dispatch_call_block_and_release
    7   libdispatch.dylib                   _dispatch_client_callout
    8   libdispatch.dylib                   _dispatch_main_queue_callback_4CF$VARIANT$armv81
    9   CoreFoundation                      __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__
    10  CoreFoundation                      __CFRunLoopRun
    11  CoreFoundation                      CFRunLoopRunSpecific
    12  GraphicsServices                    GSEventRunModal
    13  UIKit                               UIApplicationMain
    14  crashapp                            main
    15  libdyld.dylib                       start
    Only the log from Unity Performance is obtained...

    Any idea?
     
    Last edited: Dec 15, 2017
  24. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Hmm, it looks like it's trying to send a directional light estimate. Are you trying to run face tracking?
     
  25. SweatyChair

    SweatyChair

    Joined:
    Feb 15, 2016
    Posts:
    140
    Thanks for the quick reply. We're not using face tracking, but we do check the light level and hint to players to find better lighting, like this:


    Code (csharp):
    UnityARSessionNativeInterface.ARFrameUpdatedEvent += LightCheck;

    ...

    private static float _nextLightCheckTime = 0;

    private static void LightCheck(UnityARCamera arCamera)
    {
        if (Time.unscaledTime > _nextLightCheckTime) {
            try {
                float arCameraAmbientIntensity = arCamera.lightData.arLightEstimate.ambientIntensity;
                if (state != ARState.NotEnoughLight && arCameraAmbientIntensity <= AMBIENT_INTENSITY_THRESHOLD)
                    SetState(ARState.NotEnoughLight);
                else if (state == ARState.NotEnoughLight && arCameraAmbientIntensity > AMBIENT_INTENSITY_THRESHOLD)
                    SetState(ARState.LookingForPlane);
            } catch (System.Exception e) {
                Debug.LogErrorFormat("ARKitManager:LightCheck - Error:{0}", e);
            }
            _nextLightCheckTime = Time.unscaledTime + 1; // Check only every 1 second
        }
    }
    Will this be a problem?
     
    Last edited: Dec 15, 2017
  26. Brent-Poynton

    Brent-Poynton

    Joined:
    Nov 12, 2015
    Posts:
    1
    I'd like to add post processing to the UnityARCameraManager script; yet if I add the handler 'void OnRenderImage(RenderTexture source, RenderTexture destination)' it fails on device with:

    [AGXA10FamilyCommandBuffer renderCommandEncoderWithDescriptor:], line 114: error 'A command encoder is already encoding to this command buffer'

    Any ideas why this is the case when the attached CommandBuffer only inserts commands on the CameraEvent.BeforeForwardOpaque event?
     
  27. lesliekhoohh

    lesliekhoohh

    Joined:
    Oct 4, 2017
    Posts:
    3
    I am trying to get the ARTrackingState and ARTrackingStateReason and print a message (according to the state) on a UI Text. However, I have not been able to get that info. I went through this forum and elsewhere but had no luck. Please point me to places where I can learn how to get this info via scripting so I can use it in my Unity project. Thank you.
     
    Last edited: Dec 16, 2017
  28. SweatyChair

    SweatyChair

    Joined:
    Feb 15, 2016
    Posts:
    140
    Yeah, it seems Marshal.Copy in ARLightEstimate.cs causes the problem:
    Code (csharp):
    Marshal.Copy (ptrFloats, workCoefficients, 0, numFloats);
    In Xcode this line shows "Thread 1: EXC_BAD_ACCESS (code=1, address=0x1)":
    Code (c++):
    Marshal_Copy_m2353359830(NULL /*static, unused*/, L_1, L_2, 0, L_3, /*hidden argument*/NULL);
     
    Last edited: Dec 18, 2017
  29. inspection_artc

    inspection_artc

    Joined:
    Jun 27, 2017
    Posts:
    2
    Hi,

    I am trying to build ARKit Remote. I followed all the steps mentioned above, and the button labeled "Start Remote ARKit Session" appeared in the editor window. However, when I press the button, nothing happens. The editor window stays green and no video data is transmitted. On the iPad side, the screen keeps showing "waiting for editor connection", even though the Unity debug log already shows that the device is connected.
    Could you please help me figure out what the issue is?

    Thank you very much
     
  30. SourceKraut

    SourceKraut

    Joined:
    Jun 27, 2015
    Posts:
    12
    I think you can only get the state, not the reason:


    Code (CSharp):
    using UnityEngine.XR.iOS;

    void OnEnable() {
        UnityARSessionNativeInterface.ARSessionTrackingChangedEvent += UnityARSessionNativeInterface_ARSessionTrackingChangedEvent;
    }

    void OnDisable() {
        UnityARSessionNativeInterface.ARSessionTrackingChangedEvent -= UnityARSessionNativeInterface_ARSessionTrackingChangedEvent;
    }

    void UnityARSessionNativeInterface_ARSessionTrackingChangedEvent (UnityARCamera camera)
    {
        Debug.Log("TrackingChangedEvent ");
        Debug.Log("Trackingstate : " + camera.trackingState.ToString ());
    }
     
    lesliekhoohh likes this.
  31. HAKOPTAK

    HAKOPTAK

    Joined:
    Nov 22, 2017
    Posts:
    2
    Same here. Here's part of the device crash report:

    Date/Time: 2017-12-18 14:42:00.7914 +0100
    Launch Time: 2017-12-18 14:41:57.0305 +0100
    OS Version: iPhone OS 11.2 (15C114)
    Baseband Version: n/a
    Report Version: 104

    Exception Type: EXC_BAD_ACCESS (SIGSEGV)
    Exception Subtype: KERN_INVALID_ADDRESS at 0x000000003ca87d40
    VM Region Info: 0x3ca87d40 is not in any region. Bytes before following region: 3346318016
    REGION TYPE START - END [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
    UNUSED SPACE AT START
    --->
    __TEXT 00000001041d4000-0000000105c14000 [ 26.2M] r-x/r-x SM=COW ...p/limbojimbo]

    Termination Signal: Segmentation fault: 11
    Termination Reason: Namespace SIGNAL, Code 0xb
    Terminating Process: exc handler [0]
    Triggered by Thread: 0


    And when I debug in Xcode the crash happens at:

    Code (c++):
    // System.Void System.Runtime.InteropServices.Marshal::copy_from_unmanaged(System.IntPtr,System.Int32,System.Array,System.Int32)
    extern "C" void Marshal_copy_from_unmanaged_m98320635 (RuntimeObject * __this /* static, unused */, intptr_t ___source0, int32_t ___startIndex1, RuntimeArray * ___destination2, int32_t ___length3, const RuntimeMethod* method)
    {
        typedef void (*Marshal_copy_from_unmanaged_m98320635_ftn) (intptr_t, int32_t, RuntimeArray *, int32_t);
        using namespace il2cpp::icalls;
        ((Marshal_copy_from_unmanaged_m98320635_ftn)mscorlib::System::Runtime::InteropServices::Marshal::copy_from_unmanaged) (___source0, ___startIndex1, ___destination2, ___length3);
    }
     
  32. HAKOPTAK

    HAKOPTAK

    Joined:
    Nov 22, 2017
    Posts:
    2
    And maybe this screen dump gives some extra insight...
     

    Attached Files:

  33. shawnblais

    shawnblais

    Joined:
    Oct 11, 2012
    Posts:
    324
    This seems to work for me, although quality always seems to be Limited, and Reason just switches from Initializing to None. So it's not all that useful.
    Code (CSharp):
    void Start () {
        UnityARSessionNativeInterface.ARSessionTrackingChangedEvent += OnSessionTrackingChanged;
    }

    void OnSessionTrackingChanged(UnityARCamera camera) {
        state = camera.trackingState;
        stateReason = camera.trackingReason;
    }
    Edit - Once in a while I get "normal" for tracking, so it seems to be working; I guess my house is just dark.
     
    lesliekhoohh likes this.
  34. SweatyChair

    SweatyChair

    Joined:
    Feb 15, 2016
    Posts:
    140
    That's the same as mine; I just use an older version (from October) without the added face tracking stuff. The newest version is buggy!
     
  35. shawnblais

    shawnblais

    Joined:
    Oct 11, 2012
    Posts:
    324
    Curious if anyone has a build process to inject the 'arkit' required capability?

    Currently I'm entering it manually in Xcode.
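    For what it's worth, a minimal (untested) sketch of an Editor post-build script that could inject the capability automatically, assuming Unity's UnityEditor.iOS.Xcode plist API is available in your install:

    Code (CSharp):
    // Place under an Editor folder; appends "arkit" to UIRequiredDeviceCapabilities
    // in the generated Info.plist after an iOS build.
    #if UNITY_IOS
    using System.IO;
    using UnityEditor;
    using UnityEditor.Callbacks;
    using UnityEditor.iOS.Xcode;

    public static class AddARKitCapability
    {
        [PostProcessBuild]
        public static void OnPostProcessBuild(BuildTarget target, string buildPath)
        {
            if (target != BuildTarget.iOS)
                return;

            string plistPath = Path.Combine(buildPath, "Info.plist");
            var plist = new PlistDocument();
            plist.ReadFromFile(plistPath);

            // Reuse the existing capabilities array if present, otherwise create it.
            var capabilities = plist.root["UIRequiredDeviceCapabilities"] as PlistElementArray
                               ?? plist.root.CreateArray("UIRequiredDeviceCapabilities");
            capabilities.AddString("arkit");

            plist.WriteToFile(plistPath);
        }
    }
    #endif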
     
  36. Parallax-hk

    Parallax-hk

    Joined:
    Mar 28, 2017
    Posts:
    4
    Hi, I am doing some plane tracking, but after 5 minutes the phone gets hot and uses a lot of battery.
    I use config.planeDetection = UnityARPlaneDetection.None to pause plane detection, but the CPU usage in Xcode is still 83%. What else can I do to decrease battery usage and device temperature?
    Thank you very much
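    In case it helps, a rough sketch of two heavier-handed measures: capping the render frame rate and pausing the session entirely when AR isn't needed. Whether your app can tolerate either is an assumption; this isn't a fix for the 83% figure itself.

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class ARPowerSaver : MonoBehaviour
    {
        void Start()
        {
            // Rendering at 30 fps instead of 60 noticeably reduces CPU/GPU load and heat.
            Application.targetFrameRate = 30;
        }

        public void SuspendAR()
        {
            // Much heavier than turning off plane detection: stops tracking and the
            // camera feed until the session is run again with a configuration.
            UnityARSessionNativeInterface.GetARSessionNativeInterface().Pause();
        }
    }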
     
    Last edited: Dec 20, 2017
  37. mikeyt0078

    mikeyt0078

    Joined:
    Nov 20, 2017
    Posts:
    8
    Hi All,
    Wondering if anyone has a solution for moving and placing multiple objects.

    Currently, using the HitCube and its children, everything gets placed at the same x, y, z, which is to be expected. I have tried deactivating the game objects upon placing them and this works, but I want to be able to come back to an object and reposition it if necessary, much like the IKEA app functionality that lets you place multiple objects and then move them around the room.

    I've been trying to do this with a combination of approaches but haven't found anything that works well.

    Does anyone have any clues how to select just one object on the plane rather than all of them? The hit test script (ARKit) and PlaceOnPlane (ARInterface) code it so that everything moves apart from the planes.

    Thanks.
     
  38. gamescorpion

    gamescorpion

    Joined:
    Feb 15, 2014
    Posts:
    132
    Hey all, not sure if any of you are having issues with this, but one of my users on a 2017 iPad is not able to get the ARKit functionality to work. I tested it on an iPhone SE and it works perfectly fine. Is there an issue with iPads and Unity ARKit?
     
  39. DanTaylor

    DanTaylor

    Joined:
    Jul 6, 2013
    Posts:
    119
    A while ago I posted about sloooow frame rates with the AR camera feed. Well, I finally fixed it! I had a number of cameras that were using the Unity AR Video component. Even though only one was ever active at a time, they seemed to interfere with one another. So I removed all the offending components and added a separate camera that just showed the background, no objects. It worked like a charm with little to no frame rate hit. It required a little rejigging of some of the presentation functionality, but not too much. Sharing my solution in case anyone else has issues with a choppy video feed.

    TLDR - Don't have more than one camera with the Unity AR Video component, even if only one is ever active at once. If you need multiple cameras, use a separate camera on a lower layer for the background.
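    For anyone wanting to replicate that setup, a minimal sketch that creates a dedicated background camera carrying the plugin's UnityARVideo component; assigning its public m_ClearMaterial (normally the YUVMaterial) is an assumption about your setup. Every other camera should then use Clear Flags = Depth Only so it doesn't wipe out the feed.

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class ARBackgroundCameraSetup : MonoBehaviour
    {
        public Material clearMaterial; // the YUVMaterial normally assigned to UnityARVideo

        void Start()
        {
            var go = new GameObject("AR Background Camera");
            var cam = go.AddComponent<Camera>();
            cam.depth = -10;                              // render before all other cameras
            cam.clearFlags = CameraClearFlags.SolidColor;
            cam.cullingMask = 0;                          // draw nothing except the video blit

            var video = go.AddComponent<UnityARVideo>();
            video.m_ClearMaterial = clearMaterial;
        }
    }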
     
    SweatyChair likes this.
  40. gentlemanman

    gentlemanman

    Joined:
    Oct 22, 2017
    Posts:
    1
    In ARKit for Unity, how can I get every frame of the real scene? I need the RGB frame for further processing.
     
  41. prabhusam

    prabhusam

    Joined:
    Aug 3, 2017
    Posts:
    6
    Unity 2017.3 is available
     
  42. lesliekhoohh

    lesliekhoohh

    Joined:
    Oct 4, 2017
    Posts:
    3
    Hi Burglecut, Thank you. Will check it out.
     
  43. lesliekhoohh

    lesliekhoohh

    Joined:
    Oct 4, 2017
    Posts:
    3
    Hi Shawnblais, Thank you. I will amend my scripts accordingly.
     
  44. shawnblais

    shawnblais

    Joined:
    Oct 11, 2012
    Posts:
    324
    Put your items on a layer and do a standard raycast from the camera. Once you hit an object you're interested in, select it and move it in its parent's local space, using the plane as bounds.
     
  45. sam598

    sam598

    Joined:
    Sep 21, 2014
    Posts:
    60
    I don't know if anyone else has made a cross platform multiplayer AR game yet, but I thought I would share it here.

    Multiplayer Snowball Fights for ARKit, ARCore, and Tango:
     
    jimmya likes this.
  46. Kowciany

    Kowciany

    Joined:
    Feb 24, 2015
    Posts:
    6
    Hi again! Has anyone else noticed that when you change the orientation of the device there is a glitch with a black stripe for one frame? (After the orientation change everything is OK.) It shows up randomly... Any clue? I've spent all day trying to figure it out, with no luck. The only thing that helps is disabling AR video and setting the tracking camera to a solid color, but then it's not full AR. Additional info: no difference whether we check/uncheck "animate on orientation"; Unity 2017.1 / 2 / 3 checked, latest ARKit plugin checked, nothing works.

    Screenshot from recorded video.

    IMG_0165.png
     
    Last edited: Dec 20, 2017
  47. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    shawnblais likes this.
  48. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Did you check whether they have an iPad with an A9 chip or better, and whether they have the latest iOS 11 installed?
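    For a quick runtime check, a small sketch using the configuration's IsSupported flag (ARKit needs an A9 chip or newer and iOS 11):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class ARKitSupportCheck : MonoBehaviour
    {
        void Start()
        {
            // Logs whether ARKit world tracking is available on this device.
            var config = new ARKitWorldTrackingSessionConfiguration();
            Debug.Log("ARKit world tracking supported: " + config.IsSupported);
        }
    }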
     
  49. mikeyt0078

    mikeyt0078

    Joined:
    Nov 20, 2017
    Posts:
    8
    Thanks Shawn. I've tried that, but when the ray returns a hit on an object and I then select where to place it, it still moves all the objects at the same time. My setup is as follows:

    1) Raycaster on the main camera (child of AR Root)
    2) Objects on layer "x" with the PlaceOnPlane script assigned to the respective objects (also tried as children of a game object)

    Is that the right setup? Any ideas on what to tweak?
     
  50. shawnblais

    shawnblais

    Joined:
    Oct 11, 2012
    Posts:
    324
    I haven't used that script, but looking at it now, if you have PlaceOnPlane on a bunch of objects, then they're all going to respond to a MouseDown at the same time, so it makes sense that they would all move.

    I wouldn't even bother using that script; just pull the functionality out into some other control script. Maybe something like this?
    Code (CSharp):
    if (Input.GetMouseButtonDown(0)) {
        var planeHit = GetRayHit("ArGameObject");
        if (planeHit.transform) {
            selected.transform.position = planeHit.point;
        }
    }

    RaycastHit GetRayHit(string layer) {
        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        int layerMask = 1 << LayerMask.NameToLayer(layer);
        RaycastHit rayHit;
        Physics.Raycast(ray, out rayHit, float.MaxValue, layerMask);
        return rayHit;
    }
     
    Last edited: Dec 20, 2017