
Feedback PROCEDURAL BIPED ANIMATION FRAMEWORK Any interest?

Discussion in 'Assets and Asset Store' started by ParadoxSolutionsSoftworks, May 8, 2019.

  1. ParadoxSolutionsSoftworks


    Joined:
    Jul 27, 2015
    Posts:
    262
    Feedback wanted (any interest?):
    I have been developing a procedural animation framework for creating high-quality biped humanoid animations procedurally, but I want to make sure there is interest before I put any work into the marketing, docs, and so on for publishing on the Asset Store. I really can't afford to develop this further if there is not significant interest. (Just like the thread to show you want this.)

    Why:
    I wanted a way to animate humanoid characters without having to do it by hand or with mocap. I am a noob animator (yikes, things don't bend that way), and there was no code I could call like OpenLeftHand() or RaiseRightArm(); the best I had seen was IK and maybe value sliders. What I have created takes it to the next level.

    The framework provides the following:
    -A unique algorithm using anatomically correct bone constraints and terminology, ensuring natural-looking animation.

    -Scriptable animations: create animation sequences through code, e.g. something like:
    Code (CSharp):
    AnimationSequence.Add(RaiseRightArm(180));
    AnimationSequence.Add(RightHandMakeFist());
    would raise the character's arm 180 degrees and make a fist. RaiseRightArm() could itself be another sequence that slightly bends then straightens the arm while raising it for more realism. The combined sequence can then be saved as RaiseRightFist() and used in other sequences. (Making a fist is composed of sequences for rotating each finger, which are in turn built from base sequences for each joint.)
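    To make the nesting idea concrete, here is a minimal standalone sketch of how sequences could contain other sequences. None of these types or method names are the framework's actual API; it only illustrates that a sequence is an ordered list of steps and that a composite plays out as its flattened base motions:

```csharp
using System;
using System.Collections.Generic;

// Illustrative only: a sequence holds base motions (here just strings)
// and/or other sequences, which nest as single steps.
public class AnimationSequence
{
    private readonly List<object> steps = new List<object>();

    public void Add(string baseMotion) => steps.Add(baseMotion);
    public void Add(AnimationSequence inner) => steps.Add(inner);

    // Flattens nested sequences into the order the motions would play in.
    public List<string> Flatten()
    {
        var result = new List<string>();
        foreach (var step in steps)
        {
            if (step is AnimationSequence seq) result.AddRange(seq.Flatten());
            else result.Add((string)step);
        }
        return result;
    }
}

public static class Demo
{
    public static void Main()
    {
        var raiseRightFist = new AnimationSequence();
        raiseRightFist.Add("RaiseRightArm(180)");
        raiseRightFist.Add("RightHandMakeFist()");

        var combo = new AnimationSequence();
        combo.Add(raiseRightFist);      // reuse the composite as one step
        combo.Add("LowerRightArm()");

        Console.WriteLine(string.Join(", ", combo.Flatten()));
    }
}
```

    In the real framework the steps would be timed bone rotations rather than strings, but the composition model is the same.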

    -Base sequences for basic movement on the individual bone level.

    -Save animations and animation sequences as assets for reuse, third-party resale, and cross-project use.

    -Create procedural animations at runtime (for use in Cinemachine or whatever)

    -Kitbash animations by stacking sequences; programmers can create animation packs, e.g. hand gestures or martial-arts stances.

    Some things I have not implemented but I would for the Asset Store:
    -Support for Generic rigs (where they correlate to Humanoid rigs); already partially implemented.
    -A visual editor of some kind for creating animation sequences.
    -Convert existing animations into animation sequences (I'd have to look into this, but in theory it should work).


    This is still a bit of a WIP project and I have not finished making all the bones scriptable. (It's about 40 lines of code per bone for 30 bones after optimization, not including base motions, so a lot of tedious typing and find/replace, but the core is done.) If this gains interest I'll post a GIF or something of what this system can make.
     
    SpookyCat and StevenPicard like this.
  2. DebugLogWarning


    Joined:
    Jun 28, 2018
    Posts:
    9
    Sounds interesting, but you should really post some visual examples.
     
  3. mgear


    Joined:
    Aug 3, 2010
    Posts:
    5,178
    sounds interesting, videos would be nice to see!

    and i guess this could be easily used in VR then? (to make hands follow controllers and more)
     
    StevenPicard likes this.
  4. StevenPicard


    Joined:
    Mar 7, 2016
    Posts:
    604
    Sounds interesting, especially like @mgear said, for VR.
     
  5. ParadoxSolutionsSoftworks


    Joined:
    Jul 27, 2015
    Posts:
    262
    I'm not too big on VR yet so I am not too sure. Maybe it would work, but keep in mind that this uses Unity's animation system and not custom interpolation. It can create animations at runtime or bake them, but I have not added any custom interpolation similar to IK. You could use this to create the animations for the hands and then use IK on top of that (IK is typically used in VR for hand tracking). The code is super flexible, so it would not be hard to create a system that acts similarly to IK or on springs like the UFPS camera.

    TL;DR

    A punching animation - Yes
    Grabbing/reaching for something and wrapping fingers around a surface - Not really; use IK on top of the animation
    Holding something that is parented to the hand - Yes

    The main benefit of this is that you could create an animation where hands open/close without having to manually rotate each finger joint; instead you have the computer rotate all of them by a certain amount, e.g. to press a button, make a hand gesture, or play a pre-baked animation.
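    As a rough sketch of "rotate all of them by a certain amount": one curl parameter (0 = open hand, 1 = closed fist) could drive every finger segment instead of hand-rotating each joint. The per-segment ranges below are made-up numbers, not values from the framework:

```csharp
using System;

// Illustrative only: one curl value drives all finger segments.
public static class HandCurl
{
    // Hypothetical maximum flexion per segment in degrees
    // (proximal, middle, distal).
    static readonly float[] maxFlex = { 90f, 100f, 70f };

    // Target angle for one finger segment at the given curl (0..1).
    public static float JointAngle(int segment, float curl) =>
        Math.Clamp(curl, 0f, 1f) * maxFlex[segment];

    public static void Main()
    {
        float curl = 0.5f; // half-closed hand
        for (int segment = 0; segment < maxFlex.Length; segment++)
            Console.WriteLine($"segment {segment}: {JointAngle(segment, curl)} deg");
    }
}
```

    Pressing a button or making a gesture then just means animating that one curl value over time.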
     
  6. ParadoxSolutionsSoftworks


    Joined:
    Jul 27, 2015
    Posts:
    262
    I have the core of the code done but I'm still in the process of creating the base animation sequences for each bone. Once I have the whole body setup I will show something off but it is all code right now so there is not much to see.
     
  7. ParadoxSolutionsSoftworks


    Joined:
    Jul 27, 2015
    Posts:
    262
    An in-depth description of how the system works:

    High level API:
    Each bone has been mapped based on its anatomical constraint (i.e. a neck joint bends differently than the wrist).
    This uses float ranges to determine the minimum and maximum range of motion per motion type for each bone. Some constraints can change depending on the current position of another bone; for example, in the leg chain you can only move your leg outwards away from your body in the direction your toes point. Some constraints can even take into account the size of a character's muscles, which may inhibit movement.

    The high-level API allows the user to rotate a specified bone by degrees within the constraint over time. These are the base animation sequences.
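    A minimal sketch of what such a float-range constraint could look like, assuming the framework stores something similar per bone and per motion type; the type name and the degree values are made up for illustration:

```csharp
using System;

// Illustrative only: one anatomical range per bone per motion type.
public struct MotionConstraint
{
    public float MinDegrees, MaxDegrees;

    public MotionConstraint(float min, float max)
    {
        MinDegrees = min;
        MaxDegrees = max;
    }

    // Clamp a requested rotation into the anatomical range.
    public float Clamp(float degrees) =>
        Math.Clamp(degrees, MinDegrees, MaxDegrees);
}

public static class ConstraintDemo
{
    public static void Main()
    {
        // A neck bends differently than a wrist: different ranges per bone.
        var neckFlexion  = new MotionConstraint(-45f, 45f);
        var wristFlexion = new MotionConstraint(-70f, 80f);

        Console.WriteLine(neckFlexion.Clamp(90f));  // clamped to the neck's max
        Console.WriteLine(wristFlexion.Clamp(60f)); // already within range
    }
}
```

    Constraints that depend on another bone's current position would simply compute MinDegrees/MaxDegrees from that bone's state instead of using fixed values.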

    I have about 500 lines of commented code covering the anatomy and flexibility jargon used in the high-level API; it is fairly scientific and I doubt most people will end up writing code in this pure form.

    Scripting API:
    Creating animations through code takes collections of the base animation sequences and lists them in a ScriptableObject. These animation sequences can contain other animation sequences, allowing users to stack motions and create complex animations. For example, an animation sequence that moves just the tips of the fingers can be added to another sequence that moves the middle finger segment, then another for the last segment. The combined sequence could be either opening or closing the hand based on the values you gave it throughout. That final animation sequence can then be used in any other sequence that needs it.

    Visual API:
    A visual API is something I would add to an Asset Store version; it would allow users to kitbash animations from sequences made by programmers, or even create sequences from existing animation files.

    TL;DR
    In short, Unity only has a way to animate an individual position or rotation; this system figures out where all the positions and rotations in a body need to be, and allows you to make complex animations through code or by clicking buttons.
     
    Flurgle likes this.