
BodyLanguageVR - Use gestures to trigger input in VR.

Discussion in 'Assets and Asset Store' started by DarkAkuma, Oct 7, 2019.

  1. DarkAkuma

    DarkAkuma

    Joined:
    Nov 28, 2013
    Posts:
    80

    Asset Store Page

    Finding traditional forms of input unintuitive in VR? Sick of reaching for a keyboard, or of gamepad button presses breaking your player's immersion, when all you wanted was for them to answer "Yes" to an option? Why not use a gesture, like shaking your head!

    With this asset your players can use gestures to trigger input!

    ----------

    Hello! This is an asset I originally created and mostly finished a few years ago, but had to shelve before it was ready. Better late than never, and I'm happy to be bringing it out now! This marks my 3rd asset released on the store, and with this one I've tried to step things up a bit by creating something more advanced. Advanced in the skill it took me to create, but it may also take a more intermediate Unity dev to utilize. Still, thanks to my experience with my other assets, I've done my best to make this as easy to use as possible, even for the newest Unity devs.

    The purpose of this asset is simply to let you get away from immersion-breaking forms of input in VR, like button presses. In real life you have a form of non-verbal communication: common gestures and body language. Simply waving hello at an NPC to trigger an event seems more immersive and intuitive than walking up to them and pressing a button.

    I started development of this asset for a project of my own that never materialized, but thankfully I planned from day one to make it available as an asset. I designed it to let you create your own custom input gestures, but it also includes 3 main example gestures to help you get started: head shake Yes, head shake No, and hand wave Hello!

    Over time, depending on how well this goes, I have plenty of ideas for future improvements and stock input gestures. Enough to hopefully make this asset more and more valuable over time! For now I hope to refine what I have, and I welcome feedback and ideas to help guide the direction of the asset.

    This supports pretty much ALL VR devices, as it's not tied to any one device/SDK. Naturally, if you are targeting, say, a mobile device, hand-based gestures may be out of the question due to the lack of motion-tracked controllers. You can still use it with head-based gestures, like the 2 stock ones, but it will be more limited; it truly shines with a more complete VR hardware setup. Of course, if you can conceive of a use for this not based on gestures, it can do other things too. It currently supports detection of position changes based on distance, rotation changes based on degrees, and a tracked object's facing of a given direction.
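    For anyone curious what those three detection types might look like under the hood, here is a rough illustrative sketch in Unity C#. This is NOT the asset's actual API; the class name and threshold values are hypothetical, and only standard UnityEngine calls are used:

```csharp
using UnityEngine;

// Hypothetical sketch (not this asset's API): per-frame checks for the
// three detection types mentioned above, applied to a tracked Transform.
public class MotionProbe : MonoBehaviour
{
    public Transform tracked;              // e.g. the HMD or a controller
    public float distanceThreshold = 0.2f; // meters of movement to count
    public float degreesThreshold = 30f;   // degrees of rotation to count

    Vector3 lastPosition;
    Quaternion lastRotation;

    void Start()
    {
        lastPosition = tracked.position;
        lastRotation = tracked.rotation;
    }

    void Update()
    {
        // 1. Position change based on distance.
        if (Vector3.Distance(tracked.position, lastPosition) > distanceThreshold)
        {
            lastPosition = tracked.position;
            Debug.Log("Position step detected");
        }

        // 2. Rotation change based on degrees.
        if (Quaternion.Angle(tracked.rotation, lastRotation) > degreesThreshold)
        {
            lastRotation = tracked.rotation;
            Debug.Log("Rotation step detected");
        }

        // 3. Facing a given direction (here: world forward, within a 15° cone).
        if (Vector3.Angle(tracked.forward, Vector3.forward) < 15f)
            Debug.Log("Facing forward");
    }
}
```

    A gesture like a head shake could then be recognized as a sequence of such rotation steps alternating in direction within a time window.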

    ----------

    Limited Time Offer!

    To get this started, I'm giving away 1 free voucher in exchange for a review! First come, first served!
    ----------

    BodyLanguage-SS1.png
    BodyLanguage-SS2.png
     
    Last edited: Feb 23, 2020
  2. raaden89

    raaden89

    Joined:
    Nov 16, 2019
    Posts:
    2
    Hi, I saw your asset.

    Is it able to recognize hands using only a smartphone camera? I'm working on an AR app and I need the ability to recognize hands using the camera.

    Thanks
     
  3. DarkAkuma

    DarkAkuma

    Joined:
    Nov 28, 2013
    Posts:
    80
    First, this part in the opening post should answer it.

    But to further elaborate: no. Using this with hands requires your device/SDK to support hand tracking. The asset simply tracks the movement of a selected object. How that object's position is changed is irrelevant to this asset, though generally it's assumed to be changed by a VR device's positional tracking of a headset or controllers.

    If you had an external solution to track hands by a camera, and update the position of an object in the scene, then it probably could. But this asset does not provide such a thing.
     
  4. DarkAkuma

    DarkAkuma

    Joined:
    Nov 28, 2013
    Posts:
    80
    New update to the asset.

    I had a little more I wanted to do for this update, but I decided to release this early instead. I want to make some changes that will really shift the direction of the asset, and I don't want to rush them. Some of the changes in this version are related.

    To explain and preview: at the moment this asset is focused on motion sequences. I'd like to expand its scope a little to also encompass singular motions. It's also centered on what I currently refer to as a "DoThen" style of input, meaning the user Does a motion, Then a value is returned. This is as opposed to what I'd like to additionally support, which I currently refer to as "WhileDo": return a value While the user Does a motion/sequence.

    Previously this asset was purely based on returning a boolean value for a single frame, and I now want to additionally support returning a more analog type of value, while also supporting such values for more than a single frame. Since this asset is about replacing traditional input (digital buttons and analog sticks/buttons) with VR movement, I want to focus on that goal a bit more. The original setup works well enough for the stock motions, but overall doesn't reflect that goal. It's like pressing a button for a single frame.
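    To make the DoThen/WhileDo distinction concrete, here's a rough sketch of what the two styles could look like to calling code. The interface and gesture names below are hypothetical, not the asset's real API; the analogy is to Unity's own GetButtonDown (one-frame digital) versus GetAxis (continuous analog):

```csharp
using UnityEngine;

// Hypothetical illustration of the two input styles described above;
// the interface and names are NOT the asset's actual API.
public interface IGestureInput
{
    // "DoThen": true for exactly one frame, after the whole motion
    // sequence (e.g. a head shake) has completed.
    bool GestureCompleted(string gestureName);

    // "WhileDo": a continuous, analog-style value (0..1) reported
    // every frame while the motion is being performed.
    float GestureValue(string gestureName);
}

public class GestureConsumer : MonoBehaviour
{
    public IGestureInput input; // assigned elsewhere

    void Update()
    {
        // DoThen: fires once, like Input.GetButtonDown.
        if (input.GestureCompleted("HeadShakeYes"))
            Debug.Log("Player nodded yes");

        // WhileDo: read every frame, like Input.GetAxis.
        float lean = input.GestureValue("LeanForward");
        if (lean > 0f)
            Debug.Log("Leaning forward: " + lean);
    }
}
```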
     