
VR Magic Gestures AI - 3D Gesture Detection and Recognition

Discussion in 'Assets and Asset Store' started by ravingbots, Apr 27, 2017.

  1. ravingbots

    ravingbots

    Joined:
    Aug 20, 2015
    Posts:
    44
    Last edited: May 5, 2017
  2. ravingbots

    ravingbots

    Joined:
    Aug 20, 2015
    Posts:
    44
  3. ravingbots

    ravingbots

    Joined:
    Aug 20, 2015
    Posts:
    44
    Version 1.03 brings the following improvements:
    1. Added event delegates to the WandManager class (gesture events can now be handled without modifying the code, and the class no longer depends on the UI system)
    2. Added optional mirroring for a recognized gesture (support for left-handed players)
    3. Fixed the singleton base class (it triggered assertion when a singleton was in a prefab)
     
  4. ColtonKadlecik_VitruviusVR

    ColtonKadlecik_VitruviusVR

    Joined:
    Nov 27, 2015
    Posts:
    197
    Hey @ravingbots,

    Do you think your plugin would be able to help with recognizing sword swings (i.e. when they start and end) and where they will end up striking an enemy?

    Cheers,
    Colton
     
    Last edited: Aug 24, 2017
  5. ravingbots

    ravingbots

    Joined:
    Aug 20, 2015
    Posts:
    44
    Hi Colton,

    The mechanics you described would require continuous gesture recognition, which is still on the road map. However, you should be able to manage and customize how the recognition system works quite easily. You could continuously register the tracker position, add it to the gesture trail (pruning it when it gets too long), invoke the recognition method every frame (or at intervals), and clear the gesture trail every time a sword swing is successfully recognized. You could also add constraints such as a minimum gesture length or a minimum recognition interval. To summarize, the package does not have that built in, but it should be easy to add.
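    A rough sketch of that loop as a stand-alone MonoBehaviour (the Recognize() call below is a placeholder for whatever recognition entry point you expose, not a method from the package):

    using System.Collections.Generic;
    using UnityEngine;

    public class ContinuousSwingRecognizer : MonoBehaviour
    {
        public Transform tracker;             // controller / sword tip transform
        public int maxTrailLength = 120;      // prune older samples
        public float minGestureLength = 0.5f; // metres; ignore tiny movements

        private readonly List<Vector3> trail = new List<Vector3>();

        void Update()
        {
            // Register the tracker position every frame.
            trail.Add(tracker.position);
            if (trail.Count > maxTrailLength)
                trail.RemoveAt(0);

            if (TrailLength() < minGestureLength)
                return;

            // Invoke recognition continuously; clear the trail on success.
            if (Recognize(trail))
                trail.Clear();
        }

        float TrailLength()
        {
            float length = 0f;
            for (int i = 1; i < trail.Count; i++)
                length += Vector3.Distance(trail[i - 1], trail[i]);
            return length;
        }

        // Placeholder: call into the recognition system here and react to a
        // successful match (e.g. apply the sword hit).
        bool Recognize(List<Vector3> points)
        {
            return false;
        }
    }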

    Best,
    Bartek
     
  6. sunari

    sunari

    Joined:
    Nov 30, 2014
    Posts:
    6
    Hi @ravingbots,
    Can ready-to-use gestures (like the 'm' and '|' gestures in the demo) be added by the developer?

    Thanks
     
  7. ravingbots

    ravingbots

    Joined:
    Aug 20, 2015
    Posts:
    44
    Hi @sunari,

    I will add this to the to-do list for the next update, but for now you should be able to train a new gesture easily.

    Best,
    Bartek
     
  8. sunari

    sunari

    Joined:
    Nov 30, 2014
    Posts:
    6
    Hi ravingbots,

    I got it. I will purchase when the next update is live.
    Can't wait!

    Thanks
     
  9. UniversalGesture

    UniversalGesture

    Joined:
    May 29, 2017
    Posts:
    125
    I get a compile error --
    Assets/RavingBots/Sources/MagicGestures/AI/GesturePreprocessor.cs(132,64): error CS0104: `Vector3Int' is an ambiguous reference between `RavingBots.MagicGestures.Utils.Geometry.Vector3Int' and `UnityEngine.Vector3Int'
     
  10. ravingbots

    ravingbots

    Joined:
    Aug 20, 2015
    Posts:
    44
    Hello,
    The package was uploaded before Unity introduced its own integer vectors (Vector3Int), so these classes were bundled with the package. They are exactly the same, and you can delete them (Vector3Int.cs in the package). After that you will need to change the int vector members from upper case to lower case (replace ".X" with ".x"). This should solve the problem.
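    As an alternative to deleting the files, you can resolve the ambiguity with a using alias at the top of the affected script (a quick sketch; pick whichever Vector3Int you want to keep):

    // Add at the top of GesturePreprocessor.cs (and any other file with the error)
    // to pin the name to the package's own type (no .X -> .x rename needed):
    using Vector3Int = RavingBots.MagicGestures.Utils.Geometry.Vector3Int;

    // ...or pin it to Unity's type instead (then the .X -> .x rename applies):
    // using Vector3Int = UnityEngine.Vector3Int;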
     
  11. UniversalGesture

    UniversalGesture

    Joined:
    May 29, 2017
    Posts:
    125
    Great, the only problem I have now is using this with an Oculus. Is that possible? I tried other products but none of them worked except this one. It would be a big plus for my project if I could get this to work.
     
  12. ravingbots

    ravingbots

    Joined:
    Aug 20, 2015
    Posts:
    44
    At the moment, the package communicates with the SteamVR plugin, which should support both the Vive and the Oculus. However, I haven't tested it with the Oculus; adapting it shouldn't be difficult.
     
  13. UniversalGesture

    UniversalGesture

    Joined:
    May 29, 2017
    Posts:
    125
    Could you tell me which script I need to edit for the controls? This way I could change it to whichever button I need.
     
  14. ravingbots

    ravingbots

    Joined:
    Aug 20, 2015
    Posts:
    44
    Check out ControllerManager and TrackedController in the RavingBots.MagicGestures.Integration namespace. They handle controller events. The events are used by MagicWand class.
     
    UniversalGesture likes this.
  15. UniversalGesture

    UniversalGesture

    Joined:
    May 29, 2017
    Posts:
    125
    Thank you!
     
  16. ravingbots

    ravingbots

    Joined:
    Aug 20, 2015
    Posts:
    44
    And I forgot about the WandManager - you should look there before MagicWand :)
     
    UniversalGesture likes this.
  17. UniversalGesture

    UniversalGesture

    Joined:
    May 29, 2017
    Posts:
    125
    I tried making this work with simple keyboard controls, but the code is fairly hard to understand as it references a lot of ambiguous objects.
     
  18. ravingbots

    ravingbots

    Joined:
    Aug 20, 2015
    Posts:
    44
    Do you have any particular objects in mind? The package has several modules.
     
  19. UniversalGesture

    UniversalGesture

    Joined:
    May 29, 2017
    Posts:
    125
    I just want the start and stop actions to be controlled by keys on my keyboard, but simply calling the methods doesn't seem to work since they take some object ("this") as input.
     
  20. ravingbots

    ravingbots

    Joined:
    Aug 20, 2015
    Posts:
    44
    Maybe this will help:
    1. Open MagicWand.cs
    2. Replace all occurrences of SteamVR_TrackedController with MyCustom_TrackedController
    3. Create a new class MyCustom_TrackedController (MonoBehaviour) and implement the following events:
    - public event ClickedEventHandler MenuButtonClicked;
    - public event ClickedEventHandler TriggerClicked;
    - public event ClickedEventHandler TriggerUnclicked;
    (the events are referenced by the MagicWand in Awake())
    4. Use keyboard or mouse input to invoke the events in MyCustom_TrackedController
    5. Use keyboard or mouse input to move the transform in MyCustom_TrackedController
    6. Remember to replace all SteamVR_TrackedController components with MyCustom_TrackedController components in the scene.

    Now your new class should emulate a VR controller.
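    A rough sketch of steps 3-5 (the ClickedEventHandler / ClickedEventArgs declarations below are assumptions; keep SteamVR's own definitions if they are still in your project):

    using UnityEngine;

    // Assumed stand-ins for SteamVR's event types.
    public struct ClickedEventArgs { }
    public delegate void ClickedEventHandler(object sender, ClickedEventArgs e);

    public class MyCustom_TrackedController : MonoBehaviour
    {
        // The events referenced by MagicWand in Awake().
        public event ClickedEventHandler MenuButtonClicked;
        public event ClickedEventHandler TriggerClicked;
        public event ClickedEventHandler TriggerUnclicked;

        public float moveSpeed = 1f;

        void Update()
        {
            // Keyboard input stands in for the VR controller buttons.
            if (Input.GetKeyDown(KeyCode.M) && MenuButtonClicked != null)
                MenuButtonClicked(this, new ClickedEventArgs());
            if (Input.GetKeyDown(KeyCode.Space) && TriggerClicked != null)
                TriggerClicked(this, new ClickedEventArgs());
            if (Input.GetKeyUp(KeyCode.Space) && TriggerUnclicked != null)
                TriggerUnclicked(this, new ClickedEventArgs());

            // Keyboard/mouse input moves the "controller" transform:
            // WASD/arrows for X-Y, mouse wheel for depth.
            Vector3 delta = new Vector3(
                Input.GetAxis("Horizontal"),
                Input.GetAxis("Vertical"),
                Input.mouseScrollDelta.y);
            transform.position += delta * moveSpeed * Time.deltaTime;
        }
    }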
     
  21. UniversalGesture

    UniversalGesture

    Joined:
    May 29, 2017
    Posts:
    125
    I will try this, thank you very much for your help
     
  22. saltysquid

    saltysquid

    Joined:
    May 1, 2017
    Posts:
    41
    Quick questions: I want to detect when my player swings a fist (grip pressed + arm swinging towards a target). Can MG track this type of movement? (I can handle the grip press myself; it's the swing motion that I'm having trouble with.) Can I define these types of movements pre-run, or does the user have to define all motions in-game? Once defined, can a user 'save' them? For instance, if they save their game I'd like to save all trained spells. Otherwise, this looks like a great library. Thanks!
     
  23. ravingbots

    ravingbots

    Joined:
    Aug 20, 2015
    Posts:
    44
    Hi, sorry for the delayed reply, but I missed the notification.

    Yes, the system should be able to detect a forward swing, but you must increase the "Grid Resolution Z" (let's say to 6) in the Gesture Learner component.

    At the moment, you can define gestures in the Unity Editor - this way the gesture data is saved to a ScriptableObject. Unfortunately, this method will not save gestures defined during the game. To do this, you must write your own serialization method that stores the data in user files.
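    A minimal sketch of such runtime persistence (the SavedGesture fields are placeholders; map them to whatever the Gesture Learner actually stores):

    using System.IO;
    using UnityEngine;

    [System.Serializable]
    public class SavedGesture
    {
        public string name;
        public Vector3[] points;
    }

    [System.Serializable]
    public class SavedGestureSet
    {
        public SavedGesture[] gestures;
    }

    public static class GestureStorage
    {
        // Writes the trained gestures next to the player's other save data.
        public static void Save(SavedGestureSet set, string file)
        {
            string path = Path.Combine(Application.persistentDataPath, file);
            File.WriteAllText(path, JsonUtility.ToJson(set));
        }

        // Returns an empty set if nothing has been saved yet.
        public static SavedGestureSet Load(string file)
        {
            string path = Path.Combine(Application.persistentDataPath, file);
            if (!File.Exists(path))
                return new SavedGestureSet { gestures = new SavedGesture[0] };
            return JsonUtility.FromJson<SavedGestureSet>(File.ReadAllText(path));
        }
    }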

    The asset is still supported, and soon it will be updated to the latest Unity version.
     
  24. Evil-Otaku

    Evil-Otaku

    Joined:
    Oct 17, 2012
    Posts:
    72
    I have a quick question. Can this support 2 handed gestures? Like a street fighter fireball motion?
     
  25. ravingbots

    ravingbots

    Joined:
    Aug 20, 2015
    Posts:
    44
    Two-handed gestures will be supported in a new version of the package. We are planning to release it next week.
     
    Evil-Otaku likes this.
  26. ravingbots

    ravingbots

    Joined:
    Aug 20, 2015
    Posts:
    44
    Unfortunately, the holidays ruined our schedule a bit. The new version of the package is ready to be uploaded, but we did not manage to update the documentation and prepare a new video tutorial. If you purchased the package and urgently need a new version, send us an email to get access to the updated asset.
     
  27. reigngt09

    reigngt09

    Joined:
    Jun 19, 2014
    Posts:
    2
    Hi, I would like to buy this Unity asset, but I need to know two things:
    1. Does it support two-handed gestures?
    2. Is it compatible with SteamVR 2.0 or later?
     
  28. ravingbots

    ravingbots

    Joined:
    Aug 20, 2015
    Posts:
    44
    Hi!

    1. Two-handed gestures will be supported in the updated version.
    2. Also, in the updated version we replaced SteamVR with the built-in OpenVR, so it will be easier to use.

    Unfortunately, the new version hasn't been uploaded yet. I will post here when it's published on the asset store.
     
  29. henriqueranj

    henriqueranj

    Joined:
    Feb 18, 2016
    Posts:
    177
    Hello @ravingbots, could your asset be adapted for 2D gesture recognition based on screen input (mouse/touch)? How modular is your asset in terms of splitting the gesture recognition from the VR input mechanics?
     
  30. ravingbots

    ravingbots

    Joined:
    Aug 20, 2015
    Posts:
    44
    Hi!

    Yes, the gesture recognition itself doesn't depend on the input method, so it's possible to use it without VR and even in 2D. Please be aware that this will require some adjustments to the asset scripts.
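    A minimal sketch of feeding mouse input instead of a VR controller (the Recognize() call is a placeholder for whatever entry point you expose after stripping the VR layer):

    using System.Collections.Generic;
    using UnityEngine;

    public class MouseGestureInput : MonoBehaviour
    {
        private readonly List<Vector3> points = new List<Vector3>();

        void Update()
        {
            if (Input.GetMouseButton(0))
            {
                // Screen-space samples with z fixed at 0 form a flat 3D trail.
                points.Add(new Vector3(Input.mousePosition.x, Input.mousePosition.y, 0f));
            }
            else if (Input.GetMouseButtonUp(0) && points.Count > 1)
            {
                Recognize(points);
                points.Clear();
            }
        }

        // Placeholder: hand the finished trail to the asset's recognition code.
        void Recognize(List<Vector3> trail)
        {
            Debug.Log("Captured " + trail.Count + " points");
        }
    }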

    Btw, the new version of the asset is almost there...
     
    henriqueranj and Evil-Otaku like this.
  31. ravingbots

    ravingbots

    Joined:
    Aug 20, 2015
    Posts:
    44
    Good news for those who have been waiting for the new version - it is finally live at a promotional price! Below is a list of improvements:

    - Published a new video tutorial,
    - Added two-handed gesture tracking,
    - Replaced SteamVR with OpenVR framework,
    - Simplified code structure and reviewed documentation,
    - Upgraded to Unity 2018.3.

    Sorry it took so long, but our team has been working hard on it. We hope you like the update!
     
    Evil-Otaku likes this.
  32. fusecore

    fusecore

    Joined:
    Oct 3, 2013
    Posts:
    16
    I'm planning on getting this asset soon. I'm surprised how few reviews you have, given how polished the asset looks.
     
  33. ravingbots

    ravingbots

    Joined:
    Aug 20, 2015
    Posts:
    44
    Hi! Not sure why there are so few reviews. Maybe a giveaway will help a bit. Check out our social pages to find free keys:

    - https://twitter.com/RavingBots
    - https://www.facebook.com/RavingBots
     
    johaswe likes this.
  34. fusecore

    fusecore

    Joined:
    Oct 3, 2013
    Posts:
    16
    Thank you, that's very kind. I'll write a review once I get some experience with it. :)
     
    ravingbots likes this.
  35. miles99

    miles99

    Joined:
    Sep 22, 2015
    Posts:
    1
    Hello, I wonder if this package still works in Unity 2017.4.3?
     
  36. Nuugames

    Nuugames

    Joined:
    Oct 5, 2012
    Posts:
    11
    Is this free for commercial use? Many of the gesture packages are not. Thank you.
     
    Last edited: Oct 9, 2019
  37. Dr_Evil_

    Dr_Evil_

    Joined:
    Sep 19, 2019
    Posts:
    1
    Hi! I would love to buy your asset if there is a way to use the Oculus SDK instead of OpenVR (SteamVR). I'm doing a project for the Oculus Quest, so it needs that native backend.

    Maybe it's possible to just modify some references.

    The Oculus asset is called Oculus Integration on the Asset Store.

    Thx //Evil
     
  38. henkjan

    henkjan

    Joined:
    Aug 1, 2013
    Posts:
    146
    Hi, I'm thinking of buying the plugin, but I first have to know if it can help with my problem:
    We are building a VR application for Oculus Quest where I need to recognize the poses and gestures the player makes when talking or listening to an NPC. For example, I have to know if the (VR) player brings up both hands in a "what do you mean" gesture.

    We are also using embodiment to give the player presence in the environment. For that I use a rigged character that follows the head and hands using IK (Final IK - VR IK), so I have additional (although interpreted) information about the body.

    Could I use the plugin to detect (by training) the natural gestures a player makes?
     
    Last edited: Oct 24, 2019
  39. GregSquire

    GregSquire

    Joined:
    May 12, 2017
    Posts:
    2
    Hi @ravingbots,

    I purchased this asset a while back, but I'm having trouble getting it to save the training data between play sessions. My understanding is that this should be saved in the file "Game Data.asset", which is what the Magic Game script is pointing to. I can click the "Clear" button to remove spells (and that works), and if I train a couple of spells in a play session they work fine within that session. But if I stop the play session in Unity and then play again, all that training data seems to be gone. What am I doing wrong? Thoughts?

    Also, I'm having trouble getting it to recognize a "push" type gesture - a line moving forward from the player. It seems to handle gestures better in the X-Y plane than in X, Y and Z. I tried setting "Grid Resolution Z" on the Gesture Learner script to 6 (the same as X and Y) thinking this would help, but alas it didn't. Thoughts? Is this designed to do 3D gestures, or just 2D gestures (in a 3D space)?
     
  40. ravingbots

    ravingbots

    Joined:
    Aug 20, 2015
    Posts:
    44
    Hi @GregSquire,

    Please note that saving to the "Game Data.asset" is only for the sake of example, and it works only in the editor (you need to remember to click "Save Project" after you make any changes).

    Yes, the "Grid Resolution Z" needs to be set to have gestures in 3D space. The algorithm works the same, but by increasing the z-resolution you also increased the number of possible configurations (X x Y x Z), so you need much bigger training data.
     
  41. GregSquire

    GregSquire

    Joined:
    May 12, 2017
    Posts:
    2
    Hi @ravingbots,

    Do you guys have any plans to support Unity's new Input System and OpenXR? (This is all in Unity 2021.) The industry seems to be moving to this new standard.