
VRTK or Native Unity VR for a project starting development in 2019

Discussion in 'VR' started by ROBYER1, Jan 29, 2019.

  1. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    347
    Hi all,

    I've been experimenting with VRTK and Unity Native VR support through their LWRP VR Pipeline project template for a new VR project starting soon - seems that having both in the project doesn't seem to cause any clash, however...

    - VRTK has a tonne of stuff already made in it like interactable objects (grab/throw), raycasting, interaction with UI with a pointer.

    - Unity Native VR may be better supported by Unity over time, however I'm either going to be making most interactable elements and UI elements from scratch or finding a way to bridge Unity's XR Rig with the VRTK scripts somehow

    What have other devs found the best starting point around Unity 2018.3~ releases for VR development?
     
  2. JoeStrout

    JoeStrout

    Joined:
    Jan 14, 2011
    Posts:
    8,192
    I tried VRTK, but it didn't meet my needs, so I went with native stuff... and haven't regretted it. Yes, you have to write (or ask someone to share) a bit of code for UI interaction, and your own grab/throw/locomotion code (which is likely to be somewhat unique to your game anyway). But these aren't hard, and they're a very minor part of the total development time.
     
    plmx likes this.
  3. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    347
    Probably makes sense for me to get a head start and follow some tutorials on basic VR mechanics I can build myself on the native stuff. Heck, I might even be able to butcher in some of the VRTK scripts with a bit of investigation.

    It's a relief that the native stuff is good enough on its own, as I'd rather not be too dependent on other SDKs for a project I will be working on. Thanks Joe!
     
    JoeStrout likes this.
  4. sinzer0

    sinzer0

    Joined:
    Aug 29, 2013
    Posts:
    128
    Every time I look into native XR all I see is some basic input mappings and device tracking. Doesn't seem to be much info about native XR on google either when you want to solve a problem.

    SteamVR etc do a ton of heavy lifting. Sadly VRTK is moving to new version and looks like they have a long ways to go.

    Guess it depends on your platform goals though.
     
    ROBYER1 likes this.
  5. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    347
    Going to go ahead with VRTK I think. I tried MRTK but it didn't seem too user-friendly, and the presets were mostly useless for a general artist like me with only a bit of coding knowledge.
     
  6. OleJuergensen

    OleJuergensen

    Joined:
    Mar 21, 2018
    Posts:
    11
    Personally, I would never use VRTK again. But then again, I am a software engineer.
    However, I think what VRTK gives you in dev speed at the beginning of your project, it costs you at the end. The complexity is really just moved from code into dozens of settings and a bunch of bloated components. And it still limits you in what you can do. Once you need something very specific (and you probably will), you are blocked by it. But again, this might just be my personal preference.

    Have you considered SteamVR? The 2.0 version is not half as bad as the one before.

    Anyway, best of luck with your project!
     
    ROBYER1 and sinzer0 like this.
  7. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    347
    Opting to use SteamVR 2.2 for a demo of a scene I am building now, as VRTK is undergoing heavy development for the v4 version funded by Oculus, and documentation for that is pretty light at the moment.

    Further down the line I will have to decide on the SDK for the chosen platform of the app and I wish that there was something as comprehensive as SteamVR for cross-platform (I can only dream!).

    If not, then the Unity native XR Rig seems to work with every headset I have tried so far, so like JoeStrout mentioned I could just work on input mapping for that and basic locomotion (which I'm writing myself anyway) and go from there.
     
  8. BernieRoehl

    BernieRoehl

    Joined:
    Jun 24, 2010
    Posts:
    66
    I've been researching this very topic quite a bit recently. :)

    It really depends on what functionality you need. Do you want one package that does everything, or do you want to assemble your own custom solution out of separate pieces?

    If you're looking for an all-in-one solution, there are only two choices that I've found.

    VRTK is a "swiss army knife" that does everything. I've used VRTK version 3 on a project, and actually wrote a book chapter on creating multi-user social environments built with VRTK and PUN.

    However, I agree with @OleJuergensen that VRTK 3 is fine if you do things their way, but if you ever need to do things differently then it will fight you every step of the way. I ended up spending several days ripping VRTK out of my project and writing my own. Much happier with the result.

    You should also know (and I'm sure you do already) that VRTK 3 was abandoned by the developer, and since it doesn't support the current version of SteamVR it's really a dead end. Oculus has paid that developer to spend six months developing VRTK 4, but it's still missing a lot of features and has very limited documentation. There's also the question of what happens when the money runs out.

    The SteamVR Interaction Toolkit is awesome. It's well-designed, and appears to be inspired by NewtonVR. I've used it on one small project, and was very happy. It's also being supported by Valve, so it won't be going away anytime soon. However, the downside is that it's tightly integrated with SteamVR 2. If you're developing for another platform (e.g. Quest) it's simply not an option. The amount of work you'd have to do to generalize it is just not worth it.

    The trouble with these all-in-one solutions is that they're hard to customize for your game or port to new platforms. So, if you're looking to assemble something yourself, what parts do you need? Here's a list...

    Input Abstraction. You want to be able to move your code from platform to platform without having to completely rewrite everything. That's the idea behind the Unity XR input system, which is pretty good. However, it doesn't seem to play nice with SteamVR. If you make even one call to SteamVR 2, the Unity Input system stops recognizing your device inputs. Not sure if that's Unity's fault or Valve's, but it's definitely annoying.

    Object Manipulation. The code for grabbing and throwing virtual objects is just complicated enough that you don't want to have to implement it yourself if there's a good off-the-shelf option. There are several methods of picking up an object (re-parenting it to your hand, or using a physics joint, or applying velocities to have the object follow your hand). Ideally, you want a library that lets you choose between them, possibly even on a per-object basis.
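    As a rough illustration of the velocity-based style described above (this is a hypothetical sketch, not code from any of the toolkits mentioned; the class and field names are made up):

```csharp
using UnityEngine;

// Hypothetical sketch of the "apply velocities" grab style: instead of
// re-parenting the object to the hand, its Rigidbody is driven toward the
// hand every physics step, so it still collides naturally with the world.
public class VelocityFollowGrab : MonoBehaviour
{
    public Transform hand;        // controller transform to follow while held
    public float maxSpeed = 20f;  // clamp to avoid physics explosions

    Rigidbody rb;

    void Awake()
    {
        rb = GetComponent<Rigidbody>();
        rb.useGravity = false;    // gravity off while the object is held
    }

    void FixedUpdate()
    {
        // Velocity that would close the positional gap in one physics step.
        Vector3 toHand = (hand.position - rb.position) / Time.fixedDeltaTime;
        rb.velocity = Vector3.ClampMagnitude(toHand, maxSpeed);

        // Angular velocity that rotates the object toward the hand's orientation.
        Quaternion delta = hand.rotation * Quaternion.Inverse(rb.rotation);
        delta.ToAngleAxis(out float angle, out Vector3 axis);
        if (angle > 180f) angle -= 360f;
        rb.angularVelocity = axis * (angle * Mathf.Deg2Rad / Time.fixedDeltaTime);
    }
}
```

    On release, you'd re-enable gravity and leave the last velocity in place, which gives you throwing essentially for free.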

    There are several candidates in the object manipulation category, and they all work pretty much the same way -- there's a component you add to each of your controllers, and a component that you add to each object you want to interact with. Works well.

    However, almost all of the toolkits that provide this functionality have been abandoned and have no support for SteamVR 2. In this category are Newton VR, ViveGrip and EasyGrab. The only one that seems to be actively supported is VR Interaction. VR Interaction has its own input abstraction component called VR Input, which has support for Oculus, Steam VR 1 and partial support for Steam VR 2. I'm experimenting with it, and so far it looks okay. The developer is very responsive, which is nice.

    Some also offer haptic feedback, and some can automatically generate sounds when objects collide. All very useful.

    Higher-Level Interactables. Things like doors, drawers, knobs, levers, sliders, buttons and so on. A package that provides these can be a real time-saver. All the packages I've mentioned above have support for these, except (unfortunately) VR Interaction, which is the only "lightweight" package that's actively supported.

    User Interface. Interacting with traditional Unity interface elements, usually with some kind of virtual laser pointer. I haven't researched the options in depth yet, but I've tried a number of solutions and it doesn't seem like a complex problem. There's one called VR UIKit that uses the VRInput abstraction layer from VR Interaction, but I haven't tried it yet.

    Locomotion. This tends to be so game-specific that you're better off rolling your own. Obviously there's simple controller-based movement that you can implement yourself in an hour or two. There are also more interesting things like arm-swinging and walking-in-place. It's really up to you, and there are several packages in the Unity asset store to use as a jumping-off point. Some packages also offer motion sickness mitigation techniques, such as vignetting.

    Teleportation. If your game uses it, it's a pain to implement yourself. There are several packages in the asset store, including an Arc Teleporter that I've had good success with. It also uses the VR Input abstraction layer from VR Interaction.

    Miscellaneous. In this category are things like fading the scene to black and back again during level transitions, or detecting when the user has put on or removed their headset. You can either roll your own or find small packages that do the job. For screen fading, there's one called simply Screen Fade that's cheap, works well and is completely cross-platform.

    Anyway, good luck. A lot of us are all looking for similar tools, and fortunately things are maturing quickly.
     
    Last edited: Jun 15, 2019
  9. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    347
    Thank you so much for putting in the effort to write this up. It mirrors a lot of the things I have been finding myself, and the object manipulation recommendations are exactly what I needed.

    In the interests of my project I am using VRTK v4, as it hasn't been interfering with anything else I have been coding for a prototype application. However, later down the line I am going to need to think about either sticking to a platform SDK, or ripping out VRTK v4 and going fully base Unity with minimal features that will interfere with current/future platform features.

    Just wishing the decision of what to use was easier, or that there was a more long-term solution in case VRTK v4 funding ceases and support is dropped again like v3 :O
     
  10. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    347

    Fired up Unreal Engine recently to check out their VR setup. They have preset teleportation and locomotion setups that work out of the box with any headset. Very impressed, and a bit disappointed that Unity doesn't have a simple VR setup for new projects like it!
     
    Shizola likes this.
  11. Innovine

    Innovine

    Joined:
    Aug 6, 2017
    Posts:
    278
    Even a simple blank demo scene, just with locomotion and VR hands, using XR, would be a very welcome thing.
     
  12. appymedia

    appymedia

    Joined:
    May 22, 2013
    Posts:
    74
    Hey guys,

    Have a similar query, you can read a bit of some of my fumblings here...

    https://forum.unity.com/threads/unity-native-vr-support-best-practise-project-setup.722564/

    After reading the options you've laid out, @BernieRoehl, I'm a bit sad. I had hoped to use VRTK as a good reference, but I did notice the v3/Steam and beta v4 situation, which has made me wonder if it's the best way to go.

    And reading through the rest, it seems like there will be a fair bit of fumbling for me as a newbie to get up and running. I wasn't expecting such a challenge to have some basic mechanics to learn from and adapt using the built-in stuff; maybe my expectations are wrong?

    I went through the new 'Unity Learn' trial and was excited to see a whole XR section, and thought it would take me through and get me up to speed. If you go into it you'll find a best practices section, which is good, and then a bunch of stuff around VRTK (v4 beta), but nothing on their own native Unity XR and how to use it?!

    What's not helping is that, like @ROBYER1, I've fired up Unreal and within 5 minutes I had a VR template up and running with nice mechanics working based off their built-in VR support, 0 issues.

    Maybe Unity will get things a little more mature this year, I really would like to use it but I don't have unlimited time and things just work out the box with Unreal. I'm not saying that Unreal will be easier / better for my projects and as it stands I know more about Unity by far so suspect if I run into issues I will be able to solve them more easily in Unity but who knows.

    Choices......choices....
     
    Last edited: Aug 8, 2019
  13. appymedia

    appymedia

    Joined:
    May 22, 2013
    Posts:
    74
    ROBYER1 likes this.
  14. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    347
    @Matt_D_work do your team have any thoughts on this? Getting some useful user feedback on VR templates here!
     
  15. appymedia

    appymedia

    Joined:
    May 22, 2013
    Posts:
    74
    https://forum.unity.com/threads/unity-native-vr-support-best-practise-project-setup.722564/

    Some more feedback for you @Matt_D_work

    Right now as a lone dev with some experience with Unity the learning process has gone like this for me...

    1. Look at the official manual and note that Unity has built-in XR support, so no need for 3rd party bits from Oculus or Valve. Great, now I have my HMD tracking by ticking a single box, and that is awesome to be honest.

    2. I want to get my controllers involved, so I read the XR input section and find the input mappings. After some googling around I stumble on someone showing how the mappings work, as it's not that obvious (remember, I have some experience, not lots). Now I can get a button working, awesome.

    3. Now, how can I track my controllers? More googling, as I can't see anything in the manual. I find older articles and videos showing something called the 'Tracked Pose Driver'. I download an example project and all refs to Tracked Pose Driver can't be resolved. More googling, and I find out that for some reason it's in a package called 'XR Legacy Input Helpers'. That can't be right as it's 'legacy'; what's the current supported way?

    Come into the forum, read around for an hour or two, and start to realise it is the current way to track devices, but for some reason it's called legacy. Ahhhh, that must be because of the new input system I keep hearing about. Great, I'll use that, except it's not production ready, nor are there helpful docs of any kind that make sense, to me anyway.

    4. Take a step back and think: ok, for the moment maybe I should go with the official Oculus integration to make life a bit smoother whilst I'm learning; they should have good docs etc. I download and import the asset/integration, open the locomotion example, and it's not working. I have no hands/controllers in VR. More googling, and I find out I need to enter Avatar IDs, which isn't mentioned anywhere I could find.

    So now I have hands, great. I start trying to teleport and instantly fall through the floor. More googling, and I discover a snippet of advice about syncing the fixed timestep with your HMD refresh rate. I do this and now I'm not falling through the floor. Result! Except now the teleporting randomly doesn't work sometimes, just gets stuck, and there's a weird delay before it. More messing with the locomotion component and I think I've partially fixed things, but it's still not perfect.
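    For reference, the timestep advice mentioned above usually boils down to a one-liner. A sketch (the 72 Hz fallback is an assumption, not something from the docs):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sync the physics timestep to the HMD refresh rate so physics steps line
// up with rendered frames; mismatched rates are a commonly reported cause
// of falling through the floor after teleporting.
public class SyncPhysicsToHMD : MonoBehaviour
{
    void Start()
    {
        float refresh = XRDevice.refreshRate; // reports 0 if no HMD is present
        if (refresh <= 0f) refresh = 72f;     // assumed fallback (e.g. Quest)
        Time.fixedDeltaTime = 1f / refresh;
    }
}
```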

    And that's the story so far. I feel like I'm fighting to get to the start line. I've been a dev in one way or another for years, have C# experience (albeit a bit old) and am not new to the Unity editor and so on (but no expert either; 3D is new for me, 2D work previously), so I totally understand problem solving is part of things, but I wasn't expecting this to be honest.

    I also read that the way SDKs power the built-in XR support is now changing and I need to set things up in a new way, or at least I THINK that's what it was saying (https://forum.unity.com/threads/xr-plugins-and-subsystems.693463/). It just felt like another confusing part of things.

    In between I've been trying out UE4, which I had 0 experience with, and have had a much better experience. I fire up the included VR template and I've got height control, locomotion, basic interaction etc. in Blueprints, nicely commented etc. I read this works out of the box for all major HMDs with no effort. The UE4 VR editor works fine as well; I did try Unity's VR editor but it failed to even start, and reading around there are lots of issues etc. I know Unity's VR editor is experimental, but UE4's is marked the same yet works out of the box. The manual looks well written for VR in UE4. Also, forum threads like this...

    https://forums.unrealengine.com/dev...-ar-development/15238-getting-started-with-vr

    I've also been shown a free educators guide for UE4 VR containing best practices and setup that is exceptionally well written. And, not directly something Epic have done, but there's an up-to-date book on VR in UE4 as well. I couldn't find anything of the same quality for Unity (and of the things you do find, some want you to use the 3rd party SDKs; not a lot, if anything, for the built-in stuff).

    I'm not trying to point out how wonderful things are with UE4, and I'm sure there are lots of trade-offs and its fair share of problems, but these are things that, if Unity had them, would really help us get going. For me and my VR projects, at the moment I've paused with Unity and am seeing how I get on with UE4. Might be a terrible mistake, but so far that's not been the case.

    ** EDIT ** I also tried to take the survey here https://forum.unity.com/threads/unity-product-survey-ar-vr.701033/ as you're asking for XR feedback specifically etc., but it doesn't work and tells me it's 'invalid'.

    ** MORE EDITS ** I created a VR template from scratch in Unity to see how that went. I learned a lot by clearing my mind of everything I was expecting, and enjoyed things a lot more by using the built-in Unity XR features. UE4 progress is slow, and now I'm over the initial bump with Unity I have to say I'm progressing nicely and enjoying things.
     
    Last edited: Aug 12, 2019
  16. Innovine

    Innovine

    Joined:
    Aug 6, 2017
    Posts:
    278
    Fully agree with the above. I also had a pretty tough time trying to get started with VR using XR. It's a nightmare trying to deal with sitting/roomscale recentering of cameras, fighting with cameras which transform themselves, and so on. The actual API isn't so bad, and once it's up and running it's actually quite easy, but there are some horrible time wastes and trash to wade through just to get there. I thought the point of a game engine was to support us in developing? This is one area where Unity clearly fails in that ambition.
     
    appymedia likes this.
  17. appymedia

    appymedia

    Joined:
    May 22, 2013
    Posts:
    74
    @Innovine Good to know I'm not alone in the struggle but not good to go through it all.

    I suppose what really dented my confidence with Unity is that I have an expectation of things like the built-in VR support and Oculus Integration (for example) to work out of the box. When you immediately run into issues in an example scene (in the Oculus Integration, made to demo a concept), the first thing that starts running through my mind is: if this basic scene and example has issues in it, how many other problems will I have? I had the same thoughts with the built-in Unity stuff; if there doesn't seem to be a coherent approach, what state is everything in?

    To counter that, I see many awesome things being made with VR and Unity, but is it tons of pain first as you stumble through what could be a much smoother process? And I thought exactly what you put about the engine helping support us. It's doing some amazing things, but it's almost like nobody's given any thought to the process as a whole.

    I don't want to dismiss Unity as an option, and I invite any of the Unity team to make contact so I can explain things to help improve the on-boarding process, at least from a newcomer's perspective. I'll keep on tinkering with Unity in any free time, but for now I'm more committed to trying out UE4, and in the spirit of being fair I will report back on that as well; nothing says something can't be learned and applied across engines.
     
  18. Shizola

    Shizola

    Joined:
    Jun 29, 2014
    Posts:
    190
    I can only agree with what you guys are saying; Unity seems strangely inactive / not bothered with VR. EditorXR seems dead. LWRP was recently made "production ready", but I swear it wasn't working in VR properly at the time? I'd appreciate the stuff Unreal has, but seeing as Unity has semi-abandoned the 3rd person character controller thing they were working on, it doesn't give me a lot of hope.

    I am though hopeful for XRTK (cross platform fork of MRTK). I'm in the discord and they're working super hard on it every day.

    Virtuoso VSDK (based on VRTK 3) also seems interesting, but it's just been released and I haven't tried it yet.
     
  19. Innovine

    Innovine

    Joined:
    Aug 6, 2017
    Posts:
    278
    I think there's a whole bunch of Unity features that got pushed out half-baked and then mostly abandoned in the last two years. I'm wondering if they're now driven by interesting-sounding ideas on paper, rather than serving us a quality platform to work with. Where's all the ECS stuff? Where's the SRP for production? Where's camera stacking? Nested prefabs are a bit half-baked. Multiplayer? Gonna rewrite the input system again?? And XR, of course.
     
  20. appymedia

    appymedia

    Joined:
    May 22, 2013
    Posts:
    74
    Yes, LWRP completely fails out of the box with VR, to the point that the editor itself renders in stereo on your desktop by the look of it. I've seen a few people report the same. I mean, since one of LWRP's main use cases is VR (as I understand it), did no-one test it? The fix is to change the stereo rendering mode to single pass. Here's someone with the issue...

    https://forum.unity.com/threads/lwr...-and-multi-pass-stereo-rendering-mode.690298/

    All I've learned so far is how to fix issues surrounding Unity and the Oculus Integration, some of it useful but most just silly things I didn't want or need to know right now. I had hoped to be making basic stuff in VR and growing those skills.
     
  21. sinzer0

    sinzer0

    Joined:
    Aug 29, 2013
    Posts:
    128
    The CEO of Unity was pushing VR as the next platform ripe for indies, and then it seems they went radio silent in the past year. I wonder how much of it is driven by their internal analytics, since Unity is probably in the best position to understand them, other than Steam.

    Curious to see his next keynote if he pushes VR as hard as he did in the past.
     
  22. appymedia

    appymedia

    Joined:
    May 22, 2013
    Posts:
    74
    I'd be interested to see what he says as well, @sinzer0. I'm also interested to hear any thoughts from Unity staff in here. @Matt_D_work, have you any thoughts, or if it's not your specific area, can you ask someone else to help us understand things?
     
  23. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    347
    This situation is forcing me to make use of my intuition and avoid being one of those developers who wants an SDK or kit built by someone else that is already set up and working (a very common thing we expect to have these days).

    Instead, I am using the Tracked Pose Driver, which simply works out of the box when you add the XR Legacy Input Helpers package and whack it on some GameObjects. There are also some simple/native scripts and packs suggested by Bernie earlier in this discussion (scroll up) which I have managed to implement to get locomotion and teleporting that I fully understand, and now things are falling into place.
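    For anyone following along, the same Tracked Pose Driver setup can also be done from code instead of the Inspector. A sketch (assumes the XR Legacy Input Helpers package is installed; the class and field names here are made up):

```csharp
using UnityEngine;
using UnityEngine.SpatialTracking; // TrackedPoseDriver, from XR Legacy Input Helpers

// Adds a TrackedPoseDriver to the head and each controller: the same
// setup you would otherwise do by hand in the Inspector.
public class RigSetup : MonoBehaviour
{
    public GameObject head, leftHand, rightHand;

    void Awake()
    {
        Add(head,      TrackedPoseDriver.DeviceType.GenericXRDevice,
                       TrackedPoseDriver.TrackedPose.Center);
        Add(leftHand,  TrackedPoseDriver.DeviceType.GenericXRController,
                       TrackedPoseDriver.TrackedPose.LeftPose);
        Add(rightHand, TrackedPoseDriver.DeviceType.GenericXRController,
                       TrackedPoseDriver.TrackedPose.RightPose);
    }

    void Add(GameObject go, TrackedPoseDriver.DeviceType type,
             TrackedPoseDriver.TrackedPose pose)
    {
        var driver = go.AddComponent<TrackedPoseDriver>();
        driver.SetPoseSource(type, pose);
    }
}
```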

    Sorry to hear you are having issues with the Oculus SDK though - you should definitely report those bugs through the bug reporting option in the Unity editor.
     
  24. appymedia

    appymedia

    Joined:
    May 22, 2013
    Posts:
    74
    Yes, I will report them when I get a moment; I was kind of hoping for some input from Unity staff in here....

    I actually ended up setting up a template which uses 2019.2, LWRP, the Tracked Pose Driver and also the new XR input system in a small way, after getting fed up with things and rereading the Unity manual. I missed a key page the first time; it's not very logically laid out, to be honest, if you are homing in on Oculus controllers like I was. Anyway, have a look...

    https://forum.unity.com/threads/community-vr-template.725498/
     
    Shizola likes this.
  25. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    70
    In short, we are looking into the template / onboarding side of things. And yes, in hindsight, 'legacy' wasn't exactly a great prefix.

    I'll pass on that the survey is broken.
     
  26. appymedia

    appymedia

    Joined:
    May 22, 2013
    Posts:
    74
    Thanks @Matt_D_work
     
  27. joepl

    joepl

    Unity Technologies

    Joined:
    Jul 6, 2017
    Posts:
    4
    I will mention that Unity has a new-ish API for input as of 19.1. The API is accessible via UnityEngine.XR.InputDevices and provides for low-level polling of input in a cross-platform way. We've updated the docs to outline how to use it:
    https://docs.unity3d.com/Manual/xr_input.html

    The TrackedPoseDriver (which does live in the Legacy Input Helpers package) will provide tracking data, but you can also use the API listed above as well if you just want to poll for pose changes on a device. There is also work being done to provide for higher-level frameworks and sample code to make life easier for developers starting to work with VR in Unity.
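    A minimal sketch of polling that API, based on the manual page linked above (the component name here is made up):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Polls the right-hand controller each frame via UnityEngine.XR.InputDevices:
// reads its pose into this transform and logs trigger presses.
public class RightHandPoller : MonoBehaviour
{
    void Update()
    {
        InputDevice hand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (!hand.isValid) return; // no controller connected/tracked yet

        if (hand.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 pos))
            transform.localPosition = pos;
        if (hand.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rot))
            transform.localRotation = rot;

        if (hand.TryGetFeatureValue(CommonUsages.triggerButton, out bool pressed) && pressed)
            Debug.Log("Trigger pressed");
    }
}
```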
     
    mattbenic likes this.
  28. appymedia

    appymedia

    Joined:
    May 22, 2013
    Posts:
    74
    @joepl Yes, I found that page after skimming past it before, and it's helped a lot. Got input and tracking working fine with my Rift S controllers + HMD using the new XR input system. And great to hear things are being worked on to improve the new developer experience, appreciated.
     
    ROBYER1 likes this.
  29. darryl_wright

    darryl_wright

    Joined:
    Jul 8, 2019
    Posts:
    18
    I'm running into the same frustrations.

    When I try to use the LWRP with the Oculus SDK, I cannot see the hands at all. They're not even pink; they simply do not show (though they are tracking: buttons work, and they show interacting with the guardian boundary).

    It's also very frustrating that something as simple as custom hand poses has to be so difficult. SteamVR does this fairly well, but with Oculus Quest, I cannot use Steam VR. So I've tried to use the Oculus method of running the scene and posing the hand, but how can you do that when you are developing for the Quest?

    So many frustrations, and now that Unreal has gone to Vulkan for Quest development, which squeezes more performance out of that headset, it's becoming more and more likely we will be switching to Unreal. I just hope it's not a "grass is greener" type situation, as I'm sure Unreal has its issues as well.

    In the end, all I want to do is get started and create good experiences but the path to get started is so murky. I thought we should be past this by now.
     
  30. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    347
    LWRP is going to make all unsupported materials go pink. I strongly advise you to read the migration notes for importing a project to LWRP; there is an option under Edit/Render Pipelines or similar that allows you to convert all unsupported materials to LWRP.

    As for performance squeezing, you can add a script with this in Start() to any scene with the Oculus OVR rigs to use Fixed Foveated Rendering (this example is the top level one), which squeezes even more performance out of your scene. I am currently figuring out how to access it without using the Oculus OVR rig stuff specifically, as I would rather use native Unity VR support (as this thread suggests). If you have any more questions or need a chat or advice, just post back here or inbox me. I'm happy to help!

    Code (CSharp):
    OVRPlugin.fixedFoveatedRenderingLevel = OVRPlugin.FixedFoveatedRenderingLevel.HighTop;
     
  31. darryl_wright

    darryl_wright

    Joined:
    Jul 8, 2019
    Posts:
    18
    Thank you for replying!

    The hands aren't pink; they simply are not there at all. I'm using 2019.2.0f1. I thought this was an issue with the LWRP, so I tried going "back to basics": I created a new, standard pipeline (i.e., "3D") project in 2019.2.0f1, loaded the Oculus integration pack, and then loaded their example localavatar scene. The thing is that not only are my hands missing from that scene, but the little "avatar guy" isn't showing either (again, hands seem to track, etc., just not show up). So this might not even be an LWRP problem. I'm going to load up the same example scene in 2019.1.13f1 and see if I have the same issue.

    Also, the Oculus integration version is 1.39.
     
    ROBYER1 likes this.
  32. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    347
    Just realised you said the hands do not show; this could definitely be an incompatibility between the SDK and LWRP. Are the hands essential to what you are making? I'm sure you could salvage the models. Seeing as VRTK v4 doesn't support LWRP either (URP, I mean, as it is now being renamed), I wouldn't expect many SDKs to play well with it until it is more established or replaces the built-in renderer. Unless Unity creates a way to split up packages to support both URP and the built-in 3D renderer.
     
  33. Innovine

    Innovine

    Joined:
    Aug 6, 2017
    Posts:
    278
    That may be a problem with the Oculus integration pack?
    I am just using native XR and have 3 tracked devices: the headset and two controllers. Works fine for me. All the additional layers on top, the SteamVR stuff, or god forbid VRTK, just confused the hell out of me. The native XR was much, much more straightforward. I am a roll-my-own guy, so I understand this is not for everyone... but it surely was easier for me to make everything I needed than wrestle with the extra layers of junk. My entire VR-related code (supporting Vive only) is about 5 lines long.
     
  34. appymedia

    appymedia

    Joined:
    May 22, 2013
    Posts:
    74
    For the Oculus Integration, try going into the Oculus menu and adding in Avatar IDs (I was using 1.39); I used 999999 and my hands magically appeared. Can't remember what rendering pipeline I was using for that, but it did solve the issue.

    And like you, I had a bumpy start to VR and looked at UE and saw the nice things like a VR template and Vulkan rendering etc., but stayed working through Unity. For me, I knew if a problem happened with Unity I could fight my way out of it; right now I don't feel the same with UE (I don't have much experience with it), and as a solo dev I had to ask myself what it would give me apart from quite a large learning curve. Maybe in the future, as my skills develop, it will make more sense as a tool to have for certain projects, but I'm not there yet.

    Also, from chats on the forums, the Unity guys have said they are looking into the on-boarding experience and helping make it better etc., so fingers crossed things will be getting a little easier for us as new developers.
     
  35. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    347
    A wise choice, I believe, if you are looking to support multiple platforms or future hardware: your code will be much cleaner to adapt, and you won't battle incompatible SDKs or confusing code written by others.

    @darryl_wright there are some great tips earlier in this thread, especially from Bernie, who points out some cross-platform and native examples for interactions/locomotion. If your application definitely needs the Oculus avatar hands and will only ever release on Oculus headsets or platforms they support, then stick with Oculus Integration. In my experience so far, the SDK has been rather buggy and not well documented.
     
    appymedia likes this.
  36. darryl_wright

    darryl_wright

    Joined:
    Jul 8, 2019
    Posts:
    18
    So I loaded 2019.1.13f1 like I mentioned, and I still cannot see any LocalAvatar parts (hands, the little avatar person on the table, etc.).

    Has anyone else run into this? I'm wondering if something in my Oculus Integration download has become corrupted. Thanks again, all, for replying (I didn't mean to hijack this thread; I'm just frustrated by these seemingly simple issues right now).
     
  37. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    347
    Unfortunately I'm not too sure how to help with this. I would advise starting a new thread, as this one is for people looking to avoid platform SDKs and use native Unity only. I appreciate your question though, and I will reply on your thread if I see anything that helps :)
     
  38. appymedia

    appymedia

    Joined:
    May 22, 2013
    Posts:
    74
    Did you try adding the Avatar IDs I suggested? Still no good?
     
  39. Shizola

    Shizola

    Joined:
    Jun 29, 2014
    Posts:
    190
    I would stick with the default pipeline for now, unless you really need LWRP. It could be a long time before Oculus officially supports it.
     
    appymedia and ROBYER1 like this.
  40. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    347
    Did you find any interaction toolkits like EasyGrab/NewtonVR that were adaptable to the Unity Input System and the built-in tracked pose drivers? I'm currently looking into rewriting one or the other myself, but I wouldn't want to duplicate the work if someone else has already done it or there is a better alternative.
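    For anyone attempting that sort of rewrite, the cross-platform replacement for SteamVR/Oculus button calls would presumably be Unity's `InputDevices`/`CommonUsages` API (available from 2019.1). A hedged sketch, with a hypothetical class name and the actual grab logic left as a comment:

    ```csharp
    using UnityEngine;
    using UnityEngine.XR;

    // Hypothetical grip-button poll using Unity's device-agnostic XR input,
    // the kind of call a toolkit port would swap in for SDK-specific APIs.
    public class GripButtonCheck : MonoBehaviour
    {
        void Update()
        {
            // Fetch whichever controller is currently mapped to the right hand.
            InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);

            if (device.isValid &&
                device.TryGetFeatureValue(CommonUsages.gripButton, out bool gripped) &&
                gripped)
            {
                // Begin/continue the grab here instead of calling into
                // SteamVR or Oculus Integration input wrappers.
                Debug.Log("Grip pressed");
            }
        }
    }
    ```

    The same pattern covers triggers and thumbsticks via the other `CommonUsages` entries, which is what makes it a plausible drop-in for the SDK-specific input code in those toolkits.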
     
  41. darryl_wright

    darryl_wright

    Joined:
    Jul 8, 2019
    Posts:
    18
    Sorry, I'm just getting back to this. Yes, I finally did enter an App ID and that fixed it. So ridiculous... Thanks!
     
    appymedia likes this.
  42. darryl_wright

    darryl_wright

    Joined:
    Jul 8, 2019
    Posts:
    18
    No, I haven't, though I've been working on other things this week. I'm getting back to it today and looking at the Oculus way of grabbing, although, as you mentioned earlier, that's not really conducive to cross-platform work (this project is Quest-only, but I still don't like being locked into a platform if I can help it).
     
  43. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    347
    Glad you sorted your App ID issue. My post before this one was in response to @BernieRoehl, as he mentioned some cross-platform interaction frameworks I'm looking into trying, now that I've already cracked lots of native features myself.
     
  44. appymedia

    appymedia

    Joined:
    May 22, 2013
    Posts:
    74
    @darryl_wright No problem, had me stumped for a bit as well :)

    @ROBYER1 Interested to see what @BernieRoehl has to say as well. Progressing onwards, and always keen to hear about better/more elegant solutions.
     
  45. Shizola

    Shizola

    Joined:
    Jun 29, 2014
    Posts:
    190
    Might as well post this here: I'd really love it if someone released a physics-driven solution such as this:
     
    ROBYER1 likes this.
  46. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    347
    EasyGrab seems to be horrendous to update to use another input system; hopefully NewtonVR is better. I'm having a chop at implementing the Unity Input System in them and ripping out all the SteamVR/Oculus code.