
[RELEASED] VR Interaction Framework

Discussion in 'Assets and Asset Store' started by BeardedNinjaGames, Jan 28, 2020.

  1. Starmind01

    Starmind01

    Joined:
    May 23, 2019
    Posts:
    78
    Curious, what would it take to make things like guns snap to a holster if you accidentally let go of the gun? Sorta like the lightsaber in Vader Immortal.

    Also, would you be willing to make a zero-g controller at customer request, so everything works without changing asset scripts? Sorry for asking the same question over and over, but I really want a zero-g controller.
     
  2. bander20

    bander20

    Joined:
    Jan 10, 2020
    Posts:
    1
    All it would take is registering the drop event to the holster.

    For zero G, take the concept of the rocket gloves, lower the gravity on the player, and you've got a good starting point.
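    A rough sketch of that starting point, assuming a Rigidbody-based player (the component and field names below are made up for illustration, not part of the asset):

    ```csharp
    using UnityEngine;

    // Hypothetical zero-G helper: disables gravity on the player's Rigidbody
    // and applies thrust in the direction a controller is pointing,
    // similar in spirit to the rocket gloves in the demo.
    public class ZeroGMover : MonoBehaviour
    {
        public Rigidbody PlayerBody;     // the player's physics body
        public Transform ThrustHand;     // controller transform used for thrust
        public float ThrustForce = 2f;
        public float Drag = 0.1f;        // small drag so the player can slow down

        void Start()
        {
            PlayerBody.useGravity = false; // the "lower gravity" idea, taken to zero
            PlayerBody.drag = Drag;
        }

        void FixedUpdate()
        {
            // Swap this for whichever input query your setup provides.
            if (Input.GetButton("Fire1"))
            {
                PlayerBody.AddForce(ThrustHand.forward * ThrustForce, ForceMode.Acceleration);
            }
        }
    }
    ```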
     
  3. MoMonay

    MoMonay

    Joined:
    May 9, 2018
    Posts:
    19
    Hey man great asset I love it.

    There's currently a limitation when using OnNoLongerClosestGrabbable() for highlighting: you can have two hands within grabbable range of an item, but when you remove one hand the object unhighlights even though the other hand is still in range.
     
  4. BeardedNinjaGames

    BeardedNinjaGames

    Joined:
    Jan 26, 2020
    Posts:
    176
    There is a component included that will do exactly that :) "ReturnToSnapZone" lets you set a SnapZone to return to, the Speed, and the ReturnDelay. The return delay lets you specify how long to wait before trying to return to the snap zone. That way you can intentionally throw an object and have a fixed time before it starts making its way back. Also, as Bander20 mentioned, you can script this yourself by using the OnDetach event on the SnapZone.
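    If you do go the scripted route, a minimal version of that return behavior might look like this (the component name and event wiring here are illustrative only - the bundled ReturnToSnapZone already does this for you):

    ```csharp
    using System.Collections;
    using UnityEngine;

    // Hypothetical sketch: move a dropped item back to a holster transform
    // after a delay. Wire BeginReturn() to the grabbable's drop / detach
    // event in the inspector. Names here are made up, not part of VRIF.
    public class HolsterReturn : MonoBehaviour
    {
        public Transform HolsterPoint;
        public float ReturnDelay = 2f;  // time before the item starts flying back
        public float ReturnSpeed = 5f;

        public void BeginReturn()
        {
            StopAllCoroutines();
            StartCoroutine(ReturnRoutine());
        }

        IEnumerator ReturnRoutine()
        {
            yield return new WaitForSeconds(ReturnDelay);
            while (Vector3.Distance(transform.position, HolsterPoint.position) > 0.01f)
            {
                transform.position = Vector3.MoveTowards(
                    transform.position, HolsterPoint.position, ReturnSpeed * Time.deltaTime);
                yield return null;
            }
        }
    }
    ```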

    Ah, good catch! I will make note of that for next update. If you need something like this in the meantime, you can use the RingHelper which does handle multiple hands since it doesn't use OnNoLongerClosestGrabbable().
     
    MoMonay likes this.
  5. BeardedNinjaGames

    BeardedNinjaGames

    Joined:
    Jan 26, 2020
    Posts:
    176
    Hi All,

    Version 1.4 has been submitted for approval! Be sure to back up your projects :) This has been submitted using Unity 2019.4 LTS which means it may be time to upgrade your Unity installation. If you have any issues upgrading be sure to swing by the Discord for support.

    Here are the changes :

    1. Reworked how GrabPoints are positioned - this means your grab points probably need to be adjusted after this update! Luckily, there is now a "handy" tool to make positioning them easier: you can position the grab point in real time in the editor and see how the hand pose will look. You can also specify whether the grab point is valid for left or right hands, which allows more freedom in where you position each hand.
    2. You can now blend the hand Index and Thumb fingers on grab points. For example, your trigger finger will now move with the pistol's trigger, and the bow will allow you to point your finger or grip it completely. This is done by using the blend properties on the grab point.
    3. Revamped Levers and Buttons. These are much better now. I've included a few examples to get you going. These should be much easier to configure / setup from scratch, and there is even moving platform support. Huzzah!

    4. Full XRInput integration - XRInput is now the default Input Provider. There are additional input mappings added, as well as some logic that adds support for the Valve Index and Vive Cosmos controller. That means once the OpenVR plugin is released (it's in preview), the inputs should come in properly to the InputBridge.
    5. New Player Controller - No longer using OVRPlayerController to get around, which means there is more control over how our player moves, as well as removing a dependency on the Oculus Integration Kit.
    6. Added Jump and Sprint abilities that can be bound to controller inputs or called from script
    7. Basic Moving Platform Support - There are 2 moving platforms included in the demo as an example (see if you can find the second one :p ). This works by raycasting down from the player and parenting the player to the platform if it has a "MovingPlatform" component attached.
    8. Included a basic Waypoint / Movement system so you can move objects / platforms around using physics.
    9. New HD bow model and custom properties. The bow can now be configured to lock the arrow axis (so you can't shoot the arrow backwards, for example). The arrow knock can also be specified for left-handed users.

    10. Better collision handling on Grabbables. You can now customize how the Grabbable object behaves when it is colliding with an object. For example, you may want a rifle to be locked on the Z axis, but have a ball completely free. These values also control how "rigid" or springy an object feels when it's hitting something.

    11. Controller Angular Velocity is now being tracked per frame. This can be used for throwing instead of OVRInput.GetLocalControllerAngularVelocity. I've found this to be much better for AngularVelocity, and it doesn't require the specific setup / rotation math that OVR uses.
    12. Added additional UnityEvents to Damageable class.
    13. Added recoil functionality back to weapons
    14. Weapons can now specify how to behave during slow-motion (raycast vs. projectile, for example)
    15. Added "ForceNonKinematicOnDrop" property to Grabbables. Use this if you want to place an object somewhere and have it be kinematic, but then force it to non-kinematic as soon as a player picks it up.
    16. Added a CharacterControllerYOffset property - right now this is mostly used to make slight adjustments to the player, but you can use it to raise the player up if you are in a sitting position. Useful for testing and you could use this to make your own sitting / standing functionality.
    17. Added a new "GrabbableChild" component. Add this to a child collider that you want to be Grabbable. This is different than the fix posted earlier and also supports remote grabbing.
    18. Added a Marker / Pen example. This uses a LineRenderer to draw, so I would actually recommend using a dedicated asset to do the actual drawing (like this asset, for example). The included example demonstrates how to make the pen stroke width based on distance from tip (simulating pressure), as well as setting physics materials to zero friction to ensure a smooth drawing experience.
    More info later on what's to come!
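    As a rough illustration of the moving platform support described in item 7, the raycast-and-parent approach could be sketched like this (a simplified example, not the shipped code):

    ```csharp
    using UnityEngine;

    // Sketch of the described approach: raycast down from the player each
    // frame and parent the player to anything carrying a MovingPlatform
    // component (which is assumed to exist in the asset).
    public class PlatformRider : MonoBehaviour
    {
        public float CheckDistance = 2f;

        void Update()
        {
            if (Physics.Raycast(transform.position, Vector3.down, out RaycastHit hit, CheckDistance)
                && hit.collider.GetComponent<MovingPlatform>() != null)
            {
                transform.parent = hit.collider.transform; // ride the platform
            }
            else
            {
                transform.parent = null; // stepped off
            }
        }
    }
    ```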
     
  6. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,869
  7. olidadda

    olidadda

    Joined:
    Jul 27, 2019
    Posts:
    37
    Looks Awesome!
     
  8. Starmind01

    Starmind01

    Joined:
    May 23, 2019
    Posts:
    78
    Thank you! I will give it a try!
     
  9. StevenPicard

    StevenPicard

    Joined:
    Mar 7, 2016
    Posts:
    859
    This looks amazing. I am ready to pull the trigger on this one and buy it. My only concern is that I will be developing using Windows WMR initially, with the plan to eventually get a Quest. I am hoping that getting this working with WMR won't be that difficult, since that would be my only option to begin with. Btw, how easy is it to switch bow models? I have some that are appropriate for my time period. Oh, one other thing: any plans for integrating with Eliot AI Pro? The benefit there is I can use Aron's A* pathfinding, since my levels are dynamically generated.
     
  10. biggoron88

    biggoron88

    Joined:
    Nov 7, 2017
    Posts:
    9
    Hi, thank you for this asset.
    I have found an error in InputBridge.cs
    YButtonUp = OVRInput.GetUp(OVRInput.Button.Three); --> Should be
    YButtonUp = OVRInput.GetUp(OVRInput.Button.Four);

    Thanks
     
  11. BeardedNinjaGames

    BeardedNinjaGames

    Joined:
    Jan 26, 2020
    Posts:
    176
    Hi Steven,

    Regarding WMR, I think your best bet would be to hop over to the Discord and ask the community if anyone has tried your specific device model. Chances are somebody has and can provide feedback. My inclination is that everything should work just fine, but without a device in hand I'm just not 100% comfortable saying it's supported just yet. You can also try the PC demo posted on the itch.io page.

    As for swapping out the bow model, I'd say that should be quite easy, though it depends on how your bow string is set up. Generally speaking, you can disable the model on the included prefab and swap in your own. You may need to position it so that the knock lines up properly. And Eliot AI Pro, I actually had never heard of that one. If you're familiar with C# / scripting it should be pretty easy to get the damage systems working together - otherwise I'm happy to help with that on Discord as well :)

    Ah, good catch! And FWIW, XRInput will be the standard going forward - I may end up removing OVRInput entirely, since XRInput can replace it. So if you're on an older version you may want to go ahead and switch to XRInput on the InputBridge.
     
    Mark_01 and StevenPicard like this.
  12. aheydeck

    aheydeck

    Joined:
    Dec 17, 2014
    Posts:
    8
    This might not be the right thread for this question, but I'll give it a shot.
    How can I use my Quest controllers when testing the BNG framework in the game/scene window? Or is the only option to build an APK and deploy to the device?
     
  13. BeardedNinjaGames

    BeardedNinjaGames

    Joined:
    Jan 26, 2020
    Posts:
    176
    You can use Oculus Link or something like Virtual Desktop or Riftcat. I think Oculus Link would be the best option right now if your video card supports it.
     
  14. bartek0403

    bartek0403

    Joined:
    Mar 20, 2018
    Posts:
    7
    Hi, I suppose there's a bug in InputBridge.cs:434 - LeftTriggerUp and RightTriggerUp return true all the time when the button is not pressed, instead of indicating a change from state 1 to 0.
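    For reference, a sketch of what frame-to-frame edge detection for a trigger "Up" event should look like (this is not the actual InputBridge code, just the general idea):

    ```csharp
    // "Up" should be true only on the single frame where the trigger
    // transitions from held to released, which requires remembering the
    // previous frame's value.
    public class TriggerEdgeDetector
    {
        float prevLeftTrigger;

        public bool LeftTriggerUpThisFrame(float leftTrigger, float downThreshold = 0.2f)
        {
            bool wasDown = prevLeftTrigger >= downThreshold;
            bool isDown = leftTrigger >= downThreshold;
            prevLeftTrigger = leftTrigger;
            // The buggy behavior being reported is effectively "return !isDown",
            // which is true on every idle frame; the correct check also requires
            // the previous frame to have been "down".
            return wasDown && !isDown;
        }
    }
    ```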
     
  15. emrys90

    emrys90

    Joined:
    Oct 14, 2013
    Posts:
    755
    Hi, I'm considering this asset, and I have a couple questions.
    1) Do you have a way to simulate headset/controllers for when a VR headset is not available during development?
    2) Do you have any systems to prevent players from walking through walls in roomscale?
     
    ImpossibleRobert likes this.
  16. BeardedNinjaGames

    BeardedNinjaGames

    Joined:
    Jan 26, 2020
    Posts:
    176
    I'll look into that, thanks! I think you're right in that it should only be called if the trigger was released this frame but held last frame.

    Hi there! For #1, there isn't really an emulator-type setup yet, but it's on the list. You can use WASD / Q,E to move / rotate, but the controllers / grabbers aren't hooked up to the mouse or anything like that yet. For #2, there is a "HeadCollision" script that will dim the screen if the player tries to put their head through a wall / object. It's enabled by default in the demo so you can try it out there.
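    For anyone curious, the head-collision dimming idea could be sketched roughly like this (illustrative only - the shipped HeadCollision script may be implemented differently):

    ```csharp
    using UnityEngine;

    // Hypothetical sketch: a trigger collider on the camera/head, plus a
    // full-screen black overlay whose alpha rises while the head overlaps
    // level geometry. Names here are made up, not from the asset.
    public class HeadFade : MonoBehaviour
    {
        public CanvasGroup FadeOverlay;   // black image covering the view
        public float FadeSpeed = 4f;
        int overlapCount;

        void OnTriggerEnter(Collider other) { overlapCount++; }
        void OnTriggerExit(Collider other)  { overlapCount--; }

        void Update()
        {
            // Fade toward black while inside geometry, back to clear otherwise.
            float target = overlapCount > 0 ? 1f : 0f;
            FadeOverlay.alpha = Mathf.MoveTowards(FadeOverlay.alpha, target, FadeSpeed * Time.deltaTime);
        }
    }
    ```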
     
  17. emrys90

    emrys90

    Joined:
    Oct 14, 2013
    Posts:
    755
    Any plans to support a gamepad for the simulation instead of mouse/keyboard? Or how much of your code would need to be modified for me to add it in?

    Any plans for other ways to restrict movement? IE teleporting them back to a valid location if they go through a wall.
     
  18. lcaExtende

    lcaExtende

    Joined:
    Sep 2, 2019
    Posts:
    25
    Hi,

    What's the best way to change the player height, please?
     
  19. amit-chai

    amit-chai

    Joined:
    Jul 2, 2012
    Posts:
    80
    Hi! I am trying to use the "RifleWithGrip Variant" with the Final IK PlayerFinalIK. Holding with one hand is OK, but when trying to hold the grip with the second hand, Unity freezes and I get an error:

    : The variable HandsGraphics of Grabber has not been assigned.
    You probably need to assign the HandsGraphics variable of the Grabber script in the inspector.
    UnityEngine.Transform.set_localRotation (UnityEngine.Quaternion value) (at <a979f18d51af41179d12b797e8c5be14>:0)
    UnityEngine.Transform.set_localEulerAngles (UnityEngine.Vector3 value) (at <a979f18d51af41179d12b797e8c5be14>:0)
    BNG.Grabbable.checkParentHands () (at Assets/BNG Framework/Scripts/Core/Grabbable.cs:772)
    BNG.Grabbable.GrabItem (BNG.Grabber grabbedBy) (at Assets/BNG Framework/Scripts/Core/Grabbable.cs:980)
    BNG.Grabber.GrabGrabbable (BNG.Grabbable item) (at Assets/BNG Framework/Scripts/Core/Grabber.cs:529)
    BNG.Grabber.TryGrab () (at Assets/BNG Framework/Scripts/Core/Grabber.cs:488)
    BNG.Grabber.Update () (at Assets/BNG Framework/Scripts/Core/Grabber.cs:220)

    Can you help?
     
  20. amit-chai

    amit-chai

    Joined:
    Jul 2, 2012
    Posts:
    80
    OK, so this issue was solved by copying the model's prefabs and reassigning them.
     
  21. amit-chai

    amit-chai

    Joined:
    Jul 2, 2012
    Posts:
    80
    Now another issue: the fingers of the PlayerFinalIK are not animated at all. Please advise.
     
  22. BeardedNinjaGames

    BeardedNinjaGames

    Joined:
    Jan 26, 2020
    Posts:
    176
    That's correct - FinalIK fingers are not animated on the model RootMotion provides. You would need to animate them yourself or use a model that has humanoid animations baked in. That's a bit out of scope for VRIF. One option you could try is to keep your hand model and humanoid model separate, similar to how the VRIF demo scene has hands set up with the IK body. There isn't a universal way to animate hands at the moment, so how you set this up generally depends on the model / rig you are using.

    Hi there! You can use the variable "Character Controller Y Offset" on BNGPlayerController to adjust the player's offset from the ground. This could be useful for making the player taller / shorter, or potentially for creating a sitting or crouching feature as well.

    The SmoothLocomotion script uses Unity's Input system, so it shouldn't be too hard to modify it to use your gamepad if you want to. I do not currently have any plans for gamepad support, however.

    As for restricting movement, you can also use the "TeleportOnEnter" script on a trigger collider to teleport a player back to a defined location. That way, if a player physically crosses into an area you don't want them in, you can send them back to a teleport point or apply some other behaviour. In the future there will be a physical hands option as well, which could discourage players from entering unwanted areas.
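    A minimal sketch of the teleport-back idea (illustrative only - the bundled TeleportOnEnter may differ):

    ```csharp
    using UnityEngine;

    // Sketch: attach to a trigger volume covering the forbidden area and
    // send the player back to a safe point on entry. Names are made up.
    public class TeleportBack : MonoBehaviour
    {
        public Transform SafePoint;

        void OnTriggerEnter(Collider other)
        {
            if (other.CompareTag("Player"))
            {
                // A CharacterController caches its position, so disable it
                // briefly before moving the transform or the move may be lost.
                var cc = other.GetComponent<CharacterController>();
                if (cc != null) cc.enabled = false;
                other.transform.position = SafePoint.position;
                if (cc != null) cc.enabled = true;
            }
        }
    }
    ```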
     
  23. amit-chai

    amit-chai

    Joined:
    Jul 2, 2012
    Posts:
    80
    Thanks for clarifying
     
  24. ccoutinho

    ccoutinho

    Joined:
    Dec 24, 2016
    Posts:
    20
    Hi
    I would like to use weapon models from external gun packs (handguns, rifles, and shotguns) with VR Interaction Framework v1.4. Is it possible to substitute the weapon models provided within VR Interaction Framework with different ones and have the guns function exactly the same? I believe there are certain components, like those for the slide, trigger, magazine, etc., that could be added directly onto a new weapon model to get it working.
    The magazines would also need to be changed to match the new weapons, so they would probably need some setup too.
    If so, could you advise on the steps to achieve this?

    Thanks,
    Chris
     
    Last edited: Aug 9, 2020
  25. seldemirov

    seldemirov

    Joined:
    Nov 6, 2018
    Posts:
    48
    Hi! For my project, I need to make a simple VR project, and make it fast.
    Tell me, can your asset be integrated "out of the box" with the Oculus Rift S? I need to implement movement from the stick and call an object's method by pointing the joystick ray at it.
    Can you tell me how I can do this quickly?

    Unity 2019.4.7 HDRP
     
  26. bartek0403

    bartek0403

    Joined:
    Mar 20, 2018
    Posts:
    7
    Hi,
    Our team had a grab point problem on Unity 2019.4. When a prefab is opened in prefab view with auto-save ON, toggling on the hand preview causes a reimport loop. Commenting out GrabPointEditor.cs:149 and GrabPoint.cs:81 fixes the problem.
     
  27. uz986

    uz986

    Joined:
    Oct 29, 2017
    Posts:
    12
    Hi there,

    First thing I want to mention: this has basically become my default asset for developing any VR headset app - I am using it in all of my VR projects. So I want to say it is really great work.

    Now I am working on a project where I will be using hand tracking as well as controllers. I am able to do everything I want with hand tracking except using the UI canvas - the pointer shakes so much that it is impossible to use.

    I know you are already working on so much, so it might not be possible to address this at the moment, but I hope you can at least guide me if there is something I can do to make it work better with hand tracking.

    Let me know if you have any ideas for me. Thanks again for giving us the best asset for VR :)
     
  28. LIVENDA_LABS

    LIVENDA_LABS

    Joined:
    Sep 23, 2013
    Posts:
    377
    How do we integrate this with the latest OpenVR?
     
    Akshara likes this.
  29. stain2319

    stain2319

    Joined:
    Mar 2, 2020
    Posts:
    417
    I would like to suggest a feature: a "VR simulator" similar to what VRTK has/had. I would love to be able to do quick tests of concepts without putting the headset on every time...
     
  30. AshyB

    AshyB

    Joined:
    Aug 9, 2012
    Posts:
    191
    Just bought this, pretty cool. I had been working on my own for quite some time. This worked out of the box, but I did have to install the Oculus Integration, which I thought would no longer be required since Unity has its own XR toolkit...

    Few things I noticed on first play through;
    1. If you hold something in your right hand and then try to grab it with your left hand, it bugs out. It's most noticeable with the sword: both hands will hold it, one facing forward on the Z axis, the other inverted. Then when you release one hand the sword no longer knows which hand it belongs to and gets stuck until you grab it again. I had the same problem in my own system.
    2. Hands are not physical objects, i.e. they go through tables and walls, but the object you are holding does not. This is the standard way most VR games and systems work, which is fine and all, but it would be nice if the hands did not (like in HL:Alyx). This is an issue I also had when building my own system, so I ended up using multiple layers: one for the hand model, one for the player controller, and one for the grabbables. They can also pass through each other, which is either a good or a bad thing depending on what you're doing.
    3. The slots on the body and shoulders for storing items and the ammo dispenser work really well and feel quite natural, same with the backpack - really cool to store stuff and then put the backpack in your shoulder slot.
    4. The bow and arrow work really well and felt natural.
    5. The web-slinger is a really neat idea and worked well.
    6. Climbables worked really well and felt fairly natural.
    7. Two-handed rifles need some work, aiming with them did not feel that nice or natural.
    8. Sliding the pistol clip in surprised me, I did not expect it to work so well.
    9. Teleport ray and pivot felt really good.
    10. The hand and arm IK worked really well.
    11. The hand fingers have an unusual design where if you press grip, the grip fingers close, which is good, but the index finger (and thumb, I think) does not close when you pull the trigger unless you are also pressing grip. This leads to your brain thinking, "I'm squeezing my index finger but it's not moving".
    12. If you hold grip with an empty hand and then move close to the handle on the drawer, you automatically grab it. It might be better to check if you're in range of the handle first, and then grab the handle if grip is pressed. Otherwise you might, for example, try to punch something next to the drawer and end up grabbing the drawer instead.
    13. The remote grab feels nice, but I'm forever grabbing things through walls. I'll check to see if there is a "line of sight" option; otherwise this could possibly be better as a ray/sphere cast instead of several capsule colliders joined together.
    I think a fair few of these can be resolved by changing layers and settings on the components so I'll have a look and see what can be done.

    Pretty damn cool system though I look forward to poking through the code and learning from it. Good work!

    Quick update: just playing around with putting in my own custom hand. I had to inherit from HandController.cs and override updateAnimimationStates(), as the animator controller that comes with the package just wasn't doing it for me - nothing wrong with it, it just wasn't my style. Now I'm trying to get custom weapons in with custom poses, and I noticed the handPoseDefinition.cs file for adding the enum for custom poses. Could this possibly be better as just an int field in the grabbable.cs script? Rather than changing a file that comes with the package (which will get overridden on update), or inheriting from it, or making a new definition file and then having to modify the grabbable script anyway (which will also get overridden on update)? I'm curious how others are handling this (for learning purposes).
     
    Last edited: Aug 16, 2020
    atomicjoe likes this.
  31. BeardedNinjaGames

    BeardedNinjaGames

    Joined:
    Jan 26, 2020
    Posts:
    176
    Sure, this works with the Rift S. I'm not quite sure what you mean by "moving from the stick" - if you mean reading input from the joystick, you can use something like InputBridge.Instance.LeftThumbstickAxis to get the Vector2 X / Y coordinates. Does that help?

    Thanks for pointing that out! I have a fix for the next update that will only set the scene as dirty if you are not in prefab mode.

    Thanks for the kind words! I think what you are looking for is a "pointer pose". It gets you a steadier alignment based on the index and thumb position. This should get you started : https://developer.oculus.com/docume...acking/?locale=en_US#integrating-pointer-pose

    I guess it depends on which package you mean, as there have been some OpenVR changes / updates recently. I believe Valve is moving forward with OpenXR in the future, but generally speaking, you can use XR Management to install the OpenVR plugin, just make sure you don't have any of the legacy packages installed as well.

    Agreed! It's on the Roadmap, though I've got a few things to wrap up before I loop back around to keyboard / mouse controls. It's definitely doable, though.

    Thanks for all of the feedback! Just a few quick notes to a couple of your points :

    1. I'm currently working on a more robust physical hand option, so you can't stick your hands through walls and such. That should hopefully be in the next update, or at the very least the one after that :)
    2. There isn't a line of sight check for remote grabbables yet, but it is on the list. The simplest solution would be to raycast every frame, but since there could be multiple grabbables I wanted to spend some extra time to optimize it. There are benefits to using colliders over just a raycast or spherecast every frame, but I do encourage you to experiment and see what works for you!
    3. As for Grabbable hand poses / animations, "Could this possibly be better as just an int field in the grabbable.cs script?". I think an int could be a better option, I just thought an enum seemed a bit more readable. It is an extra step and prone to overwrites as you mentioned, however. The end goal is to have all of this handled through the Unity editor, though - You click "Add Pose", do your rotations, "Save Pose", and then the pose is saved as an animation and added to the animator for you. The other option would be to save positions / rotations, which I believe is what the SteamVR hand poser tool does. I did some experimenting and both seem like viable options, so I think it's just a matter of spending some more time on it.
     
    uz986 and atomicjoe like this.
  32. AshyB

    AshyB

    Joined:
    Aug 9, 2012
    Posts:
    191
    Ha, cool. I was going to have a look at making the hands physical myself, but if you're already on it I'll just wait and focus on other stuff :)

    The "Add Pose" / "Save Pose" thing sounds cool. I'm still trying to work out the current pose setup you have on the grab point script. I got it working with my own custom hands, but I'm not sure how to use it well. I'm assuming it's meant to let you spawn in the hand and pose it so you can position the item correctly, etc., but it's the latter part that's left me scratching my head. For now I'm just hitting pause during the game and positioning stuff, but I feel like you've already got something to do it for me and I'm just not utilizing it.

    One thing I noticed with grab points and the ring helper: some of my custom hand poses require me to move the grab transform a bit off from the item itself so that it positions correctly in the animation. This is fine. But when you're in game and look at the item, the ring is not over the actual item but where the grab point transform is (thin air), which is expected. Not sure if you have a better way to do this, but for now all I did was modify one of the ring scripts to say if (customTransform != null) ring.position = customTransform.position, else use the default. (Can't quite remember exactly, sorry - at work :p ) Then I added another empty game object, called it "Ring Position", set it over the visual grip, and it works fine.
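    For anyone wanting the same workaround, the override described above could be sketched like this (the "customTransform" / "Ring Position" naming follows the post; the helper class itself is made up):

    ```csharp
    using UnityEngine;

    // Sketch: let a ring helper display at an optional override transform
    // (an empty child placed over the visual grip) instead of the grab
    // point itself, which may be offset into thin air for pose alignment.
    public class RingPositionOverride : MonoBehaviour
    {
        public Transform customTransform;  // optional "Ring Position" object

        public Vector3 GetRingPosition(Transform grabPoint)
        {
            // Fall back to the grab point when no override is assigned.
            return customTransform != null ? customTransform.position : grabPoint.position;
        }
    }
    ```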

    Keep up the good work, look forward to the next version.
     
  33. DigitalAdam

    DigitalAdam

    Joined:
    Jul 18, 2007
    Posts:
    1,204
    Hey, I read this and I'm trying to find where this is located. I'm using the 'PlayerAdvanced Variant Variant' prefab. Is it on that, or on the 'PlayerController', and in which script? Thanks!
     
  34. amit-chai

    amit-chai

    Joined:
    Jul 2, 2012
    Posts:
    80
    Hi again! Is there a tutorial on how to set up a new character? I have a model, and I also have separate hands + arms. I tried to look at the example, but it's not so clear how to replace it. Please help.
     
  35. AshyB

    AshyB

    Joined:
    Aug 9, 2012
    Posts:
    191
    In the GrabPoint.cs file, is the variable

    Code (CSharp):
    [Tooltip("If specified, the Hand Model will be parented here when snapped")]
    public Transform HandPosition;

    not used?

    Also, where does the "Models" game object get its transform.position value from when grabbing a weapon in the demo scene? For some reason its position is offset slightly from Vector3.zero, when I would expect it to be exactly zero as a child of the GripTransformRight on the rifle.

    - All good, I figured it out: the "Grabber" object on the right controller had a slight initial offset from the demo, which I guess gets applied to the final offset. Once I reset it to Vector3.zero everything worked as expected :)

    On another note, I take it the documentation's

    Code (CSharp):
    Grab Position Offset - A local offset to apply to the Grabbable if the "Snap" Grab Mechanic is selected.
    Grab Rotation Offset - Local euler angles to apply to the Grabbable if the "Snap" Grab Mechanic is selected.
    Mirror Offset for Other Hand - If true, the "other" hand (typically the Left Controller) will have its X position mirrored. Example: a 1, 1, 0 offset would become -1, 1, 0.

    has been replaced with the "Grab Points" array?

    Some more feedback: when a shell ejects from a gun, does the system ignore collisions between the ejected shell and the gun? It looks like if you move the "Eject Position" too close, every time a shell ejects it applies forces to the weapon. Would it maybe be better to do a Physics.IgnoreCollision on them, just to avoid potential issues?
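    A sketch of that Physics.IgnoreCollision suggestion, pairing every collider on the shell with every collider on the weapon (an illustrative helper, not part of the asset):

    ```csharp
    using UnityEngine;

    // Sketch: call this right after spawning an ejected shell so it can
    // no longer push on the weapon that ejected it.
    public static class ShellEjectHelper
    {
        public static void IgnoreWeaponCollisions(GameObject shell, GameObject weapon)
        {
            foreach (var shellCol in shell.GetComponentsInChildren<Collider>())
            {
                foreach (var weaponCol in weapon.GetComponentsInChildren<Collider>())
                {
                    Physics.IgnoreCollision(shellCol, weaponCol);
                }
            }
        }
    }
    ```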

    There also seems to be an odd situation where the audio clips for shells and gun shots double up / play twice. I haven't figured out the cause yet, but it seems to happen after you insert the clip into the weapon. I'll keep playing with it and find the cause eventually. Is an audio source required for each object? I notice RaycastWeapon.cs adds the AudioSource component to the weapon, which means having one on it already would be redundant?

    More feedback, lol: in the demo scene it's not very evident because of the size of the slide on the guns and rifles, but if you use custom weapons with larger slides, when you throw and remote-grab the weapon the slide will move on its own (either due to collision or the rigidbody weight of the slide itself). Not sure which it is. If it's collision, that's cool, because you can use other physical objects to manipulate the slide. But if it's acting on its own due to its weight and the spring/joint not maintaining its position, you might need to increase the strength a bit?

    Update: it seems to be the "weight" of the slide itself. If you grab a large enough weapon like a rifle and hold it so the barrel points directly up (+Y), the slide will slide back towards the grip; then if you tilt the rifle down so the barrel points along -Y, the slide will move back towards the barrel.

    Any chance of adding an "OnFiredLastBullet" event to the RaycastWeapon.cs script? That way we can plug in custom actions when the gun fires the last bullet and the slide opens.
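    A sketch of what such an event could look like if added by hand (this is not in RaycastWeapon.cs; the class and field names are illustrative):

    ```csharp
    using UnityEngine;
    using UnityEngine.Events;

    // Hypothetical sketch: a UnityEvent invoked when the last round fires,
    // showing where such a hook could live if you add it yourself.
    public class LastBulletNotifier : MonoBehaviour
    {
        public UnityEvent OnFiredLastBullet;
        public int BulletsRemaining = 8;

        public void Fire()
        {
            if (BulletsRemaining <= 0) return;
            BulletsRemaining--;
            if (BulletsRemaining == 0)
            {
                // e.g. play a slide-lock sound, flash the HUD, etc.
                OnFiredLastBullet?.Invoke();
            }
        }
    }
    ```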

    Really digging the GrabPoints script. It's so easy and handy for making custom poses for different grip positions and items. Curious, though: I'm planning on implementing a second hand pose when gripping the pistol. For example, you know how when you hold a pistol in real life you use your off hand to help "brace" it? That's what I'm planning on doing. I think it would be easy to do with another grab point, but that relies on actually "gripping" the control. I'm just thinking back to HL:Alyx, where you have doors, and when your hand goes near the edge of a door it changes the pose and sort of "snaps" to the edge of the door. Same with when you put your hand over your mouth - it sort of snaps to that area - or when you put your hands near a railing it snaps to that pose on the railing, and when you brace a pistol your hand snaps under the grip just because you're in range. Is this something you've considered, before I go and add it in? :)

    Grabbing ammo from the ammo slot in the inventory doesn't use the grab point or hand pose of the actual ammo item. I can understand why it is easier to just use "precise" with a single pose, and it does work well (looks pretty good); it just would have been nicer to use the actual hand pose and grab point of the grabbed ammo item.

    Suggestion: change the AmmoDispenser.cs script from hardcoded prefab fields to something like an array of structs:

    Code (CSharp):
    [System.Serializable]
    public struct AmmoStruct
    {
        public string weapon;
        public GameObject ammo;
    }

    public AmmoStruct[] ammo;

    Then the user can create a new array element, put "Rifle" in the string, and drag the rifle ammo prefab in. When the user is holding that weapon, the dispenser will instantiate the matching ammo when they grab it.
     
    Last edited: Aug 20, 2020
  36. ccoutinho

    ccoutinho

    Joined:
    Dec 24, 2016
    Posts:
    20
    Hi
    I would like to use weapon models from external gun packs (handguns, rifles, and shotguns) with VR Interaction Framework v1.4. Is it possible to substitute the weapon models provided within VR Interaction Framework with different ones and have the guns function exactly the same? I believe there are certain components, like those for the slide, trigger, magazine, etc., that could be added directly onto a new weapon model to get it working.
    The magazines would also need to be changed to match the new weapons, so they would probably need some setup too.
    If so, could you advise on the steps to achieve this?

    Thanks,
    Chris
     
  37. AshyB

    AshyB

    Joined:
    Aug 9, 2012
    Posts:
    191
    Sure is, just copy the ones from the demo scene and add your own stuff in.

    Takes a bit to figure out what is and isn't required, but then it's all good.

     
    Last edited: Aug 19, 2020
    BeardedNinjaGames likes this.
  38. ccoutinho

    ccoutinho

    Joined:
    Dec 24, 2016
    Posts:
    20
    Your video looks great. Are you changing only the meshes on the original weapons, or are both your weapons new and you are then adding the components from the originals?
    Could you advise how you went about achieving this?
    Thanks,
     
  39. AshyB

    AshyB

    Joined:
    Aug 9, 2012
    Posts:
    191
    "Are you changing only the meshes on the original weapons" - mostly this. Just duplicate one of the weapons from the demo that resembles yours, move it somewhere, scale your custom mesh to match, and then start removing the demo meshes until you have what you want.

    Some tips I've found:
    • Make an empty gameobject called "Colliders" and then inside that make another gameobject called "Collider" with a box collider component. Use this to duplicate, position and scale the box collider to make up the collision geometry for the weapon. This helps keep the hierarchy clean.
    • The base gameobject for the weapon does not need any colliders, just a trigger.
    • The clips have a collider on the bottom. Be careful where you put it: if you place it too high, insert a clip, and then drop the gun on the ground, the gun will start moving on its own from the clip collider acting against the gun's colliders.
    • Same goes for the eject shell object: don't place it too close to the weapon.
    The rest is just trial and error really.
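    If repositioning the clip collider isn't enough, another option is to tell the physics engine to ignore collisions between the seated clip and the weapon's own colliders. This is a generic Unity sketch, not a VRIF component — the class and method names are made up for illustration:

```csharp
using UnityEngine;

public static class ClipCollisionHelper
{
    // Call with ignore = true when a clip is seated in the weapon,
    // and ignore = false again when it is ejected.
    public static void IgnoreClipCollisions(GameObject weapon, GameObject clip, bool ignore)
    {
        foreach (var weaponCol in weapon.GetComponentsInChildren<Collider>())
        {
            foreach (var clipCol in clip.GetComponentsInChildren<Collider>())
            {
                // Pairwise-disable contacts so the clip can't push the gun around.
                Physics.IgnoreCollision(weaponCol, clipCol, ignore);
            }
        }
    }
}
```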
     
  40. ccoutinho

    ccoutinho

    Joined:
    Dec 24, 2016
    Posts:
    20
    Thank you very much. I will try it and see how it pans out. Thanks.
     
  41. TheStarboxTR

    TheStarboxTR

    Joined:
    Mar 22, 2014
    Posts:
    8
    Hello, I am very new to VR game development and I pre-ordered an HP Reverb G2 yesterday. I would like to develop SteamVR games, and I will also acquire an Oculus Quest later, so I want to publish my game for Quest VR as well.
    Firstly: can I use your framework with my HP Reverb G2?
    Secondly: which specific headsets are not currently supported by this framework?

    If you answer my questions I will be very pleased. Have a good day.
     
  42. BeardedNinjaGames

    BeardedNinjaGames

    Joined:
    Jan 26, 2020
    Posts:
    176
    Hello there! Congrats on the Reverb - that looks like a great headset :) First I should say that I originally developed this framework using Oculus, so getting it to work with Steam was a bit of a chore. However, in the next update (1.5) I have removed the Oculus Integration requirements and moved everything over to Unity XR. Certain Steam devices do, however, need a little extra setup (like the Valve Index), and I believe that applies to the Reverb as well. You would need to install the SteamVR SDK and a SteamVR binding that I've included to get inputs registering properly.

    My advice would be to wait a little bit until that update comes out. It is possible to get it working now with the current version if you're familiar with SteamVR and some of the quirks of Unity VR, but if not you may end up getting frustrated.

    Lastly, since you mentioned wanting to also publish for the Quest, I should mention that you can't publish a Quest game that uses the SteamVR SDK. You would need to remove the SteamVR SDK and build separately.
     
    TheStarboxTR likes this.
  43. Starmind01

    Starmind01

    Joined:
    May 23, 2019
    Posts:
    78
    Dumb question, will we still be able to publish to oculus quest?
     
  44. BeardedNinjaGames

    BeardedNinjaGames

    Joined:
    Jan 26, 2020
    Posts:
    176
    Definitely! There are just more options now:

    1. XR Rig - This is the new, default rig that utilizes Unity XR features and will work with Quest and any other VR devices that Unity supports natively (Rift, HTC Vive, WMR, etc.).

    2. Oculus Integration asset - You can optionally import the Oculus Integration asset if you want to use features specific to that asset, such as Hand Tracking, Spatial Audio, Avatars, etc. Not absolutely required, but if your only target is Quest then there is some useful stuff in that asset.

    3. SteamVR - Certain devices such as the Valve Index require the SteamVR SDK in order to receive proper inputs. To build for Steam you would include the SDK and then add the bindings from VRIF. If you decide to compile for something like the Quest later on, just make sure to remove the SDK, as Oculus doesn't like SteamVR references in its app submissions, from what I hear.

    Hope that clears things up :)
     
    Mark_01 likes this.
  45. lcaExtende

    lcaExtende

    Joined:
    Sep 2, 2019
    Posts:
    25
    Hi,

    I have a problem with Unity 2020 when exporting for Quest.
    If I add a GrabbableHighlight component to an object, the scene renders completely white on the Quest.
     
  46. BeardedNinjaGames

    BeardedNinjaGames

    Joined:
    Jan 26, 2020
    Posts:
    176
    The Highlight script uses a secondary camera to generate an outline, and I wouldn't recommend using it on the Quest as it will tank your fps. The second camera is most likely what is causing the white screen. Instead, I'd recommend using a shader or swapping materials using the GrabbableHighlightMaterials component. The soccer ball on the table from the Oculus Integration asset is a good example of a highlight material.
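    For anyone rolling their own material swap instead, the idea is simply to exchange the renderer's material when the highlight events fire. This is a generic Unity sketch of that approach, not the actual GrabbableHighlightMaterials implementation — the class and method names are illustrative:

```csharp
using UnityEngine;

public class SimpleHighlightSwap : MonoBehaviour
{
    public Material normalMaterial;    // the object's regular material
    public Material highlightMaterial; // e.g. a brighter or outlined variant

    private Renderer rend;

    void Awake()
    {
        rend = GetComponent<Renderer>();
    }

    // Wire these up to your grabbable's highlight/unhighlight events.
    public void Highlight()
    {
        rend.material = highlightMaterial;
    }

    public void Unhighlight()
    {
        rend.material = normalMaterial;
    }
}
```

This avoids the second camera entirely, so it's a much cheaper option on mobile hardware like the Quest.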
     
  47. bcv

    bcv

    Joined:
    Sep 1, 2012
    Posts:
    34
    Hello, I created a new project and I'm getting "The type or namespace OVRHand could not be found" and similar errors for other namespaces. Do you know what could be happening? I'm using Unity 2020.1. Also, I tried the demo with a Valve Index and I was not able to teleport or move. Is that expected?
     
  48. lcaExtende

    lcaExtende

    Joined:
    Sep 2, 2019
    Posts:
    25
    The outline effect is not working on Quest with Unity 2020.
     
  49. AshyB

    AshyB

    Joined:
    Aug 9, 2012
    Posts:
    191
    Did you install the Oculus Integration? I can't remember, but I think I had the same issue when I tried to skip it.
     
  50. BeardedNinjaGames

    BeardedNinjaGames

    Joined:
    Jan 26, 2020
    Posts:
    176
    Hi everyone,

    I am pleased to announce that version 1.5 has been submitted for asset store approval :D.

    This is a big update, hence the change to a major version number. The Oculus Integration asset is no longer required, and some of the prefabs and demo scenes have been renamed to reflect this update. I recommend completely removing VRIF before updating.

    Here is the change log :
    1. Removed Oculus Integration asset requirement. Player prefab has been renamed to "XR Rig" and "XR Rig Advanced". Oculus Player rigs and hand tracking examples have been moved to an integration package.
    2. Oculus and SteamVR integration can now be enabled through a handy window inside the editor. Install the corresponding asset, then enable its integration by going to Window -> VRIF and enabling the integration. This will add the appropriate Scripting Define Symbols for you.
    3. Physics Hands - No more sticking your hands through walls! Check out the new "Physics Hands" scene. There are some tradeoffs to this approach, so I'd recommend evaluating its relevance for your project before automatically making it your default setup :)
    4. Custom UI System - The OVRGazePointer dependency has been removed entirely and replaced with a custom solution called "VRUISystem". Now if you need to interact with a Canvas just make sure it has a GraphicRaycaster component on it.
    5. New Velocity Grab Physics - This grab physics type uses velocity to move the object towards the controller, and respects mass of objects during collision. This makes for a more realistic simulation and also opens up some new opportunities for full body IK.
    6. New Fixed Joint type - pretty straightforward!
    7. Cleaned up various editors with headers and more tooltips
    8. Grab Type and Grab Button are now set on the Grabber. Grabbables can then overwrite that setting.
    9. Added Snapzone events directly to GrabbableEvents
    10. Added controller bindings to snap rotation, locomotion, jump, sprint, toggle locomotion, and more.
    11. Custom screen fader - fade in / out from any color - on Scene Load, when the head is colliding with objects, or via script.
    12. Custom Controller Offsets - Using a special controller and it's coming in at an odd angle when using the TrackedPoseDriver? Use the ControllerOffset component to apply an offset for you during runtime.
    13. Added Air Control speed settings to SmoothLocomotion
    14. Fixed bug with Trackpad Deadzone values
    15. Now using XRInput for haptics (thank you for the Discord submission, D3m0n92!)
    16. Fixed Issue with GrabPoint editor looping in prefab mode
    17. Raycast weapons can now always shoot projectiles
    18. Various physics tweaks such as better friction handling on Grabbables. Prefabs have had their mass updated to more appropriate values.
    Working on more documentation updates before the update goes live and then will post some more info here.

    Cheers!