Learning Blender vs Maya?

Discussion in 'General Discussion' started by squanch, Jul 10, 2017.

  1. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,553
    Rigify rigs are fairly heavy (they have roughly 3x the number of bones) and are not compatible with the mecanim retargeting system out of the box.

    Because of this I often create my own rigs when making anything for Unity. However, I gotta say that by itself Rigify is a very cool system. I only wish it had simplified game-dev-oriented rigs in it.

    In my experience, Blender's "Automatic weights" vertex weight generation (when you first bind a model to a rig, you can choose how the weights are generated: based on envelopes, automatically, or not at all, meaning you'll hand-paint everything yourself) currently produces reasonable default results that require a minimal amount of tweaking. I'd recommend trying it out if you haven't already.
     
    Deleted User and theANMATOR2b like this.
  2. theANMATOR2b

    theANMATOR2b

    Joined:
    Jul 12, 2014
    Posts:
    7,790
    Yeah - I've been meaning to give Blender another try just to become a bit more proficient in its animation and rigging abilities - if only to be able to assist users who encounter issues.
    I've read the rigging/skinning and animation tools have become more stable and better over the past two versions. But so far it's low priority for me. As long as it stays free - I can do it at my leisure.
     
  3. SunnySunshine

    SunnySunshine

    Joined:
    May 18, 2009
    Posts:
    974
    Ah, that's good. Pardon my ignorance.

    I guess what it all comes down to when selecting software is preference, really. All these tools have what's required to create game assets. I happened to learn Blender before any other 3D authoring software, so naturally I feel the most comfortable in it. In retrospect, I'm happy I started with it, because I think it's a great piece of software, and I have a lot of faith in the Blender devs and community. And it's free. (Although I've actually been donating €5 per month for the past 3 years to the Blender Foundation.)

    When something is open source like Blender is, it has a sense of purity I think commercial products lack. There's always a risk the company developing product X might decide to F*** their users over. But with an open source product, it's different. There's always a risk with open source stuff that it might get abandoned of course, but I think Blender is past that point. It has a solid foundation and community, and now even proper funding.
     
    theANMATOR2b likes this.
  4. Elzean

    Elzean

    Joined:
    Nov 25, 2011
    Posts:
    584
    Ha yes, it's not compatible with the humanoid stuff out of the box. There are a few tutorials on YouTube (the ones I followed) that show the few steps to fix it. Once you've done it once or twice, it's a matter of minutes; it's mostly reparenting a few bones. Once exported correctly it doesn't have more bones than any other humanoid :)
    Rigify's strength is also all the widget stuff that you use for animation.

    Here is one made and animated using Rigify; his tail is also using Rigify, it's like adding an arm. There are a few extra bones for his face, ears and clothes :)
    [attached GIF: unnamed.gif]
     
  5. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,553
    Yes, I'm aware of that. It is also possible to write an asset preprocessor for this, plus there are a few scripts for ditching widget bones; however, it would also be great if this stuff were built into Rigify by default. Basically, if it is something you need to do every time, it should probably be automated.
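
    Something like the sketch below could be a starting point on the Unity side. This is just my own illustration, not part of Rigify or Unity: it assumes the widget helper objects come through the FBX with Rigify's usual "WGT-" name prefix, so adjust the check to whatever your export actually produces (the script goes in an Editor folder).

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEditor;

    // Rough sketch: strips Rigify widget helper objects during model import.
    // Assumes the helpers arrive named "WGT-..." (Rigify's naming convention).
    public class RigifyImportCleaner : AssetPostprocessor
    {
        void OnPostprocessModel(GameObject root)
        {
            var widgets = new List<GameObject>();
            foreach (Transform child in root.GetComponentsInChildren<Transform>(true))
            {
                if (child.name.StartsWith("WGT-"))
                    widgets.Add(child.gameObject);
            }

            foreach (GameObject widget in widgets)
            {
                // A widget may already have been destroyed along with its parent.
                if (widget != null)
                    Object.DestroyImmediate(widget);
            }
        }
    }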

    ----
    This rat seriously reminds me of an idea I had once. I wonder if I should just start working on those ideas...

    Also, it is surprising that mecanim didn't break its neck and didn't put its feet flat on the floor.
     
    theANMATOR2b likes this.
  6. Elzean

    Elzean

    Joined:
    Nov 25, 2011
    Posts:
    584

    Yes, if the guys who make Rigify would do something to make it easier to work with Unity, that would be great.

    The animations on the character work because I made them; it was pointless to make him compatible with other humanoids, because most animations are going to look awkward on him - as you noticed, his feet are not like a normal human's.
    I very rarely do rigs, so I made it compatible just so I don't forget the steps too much XD

    On a side note:

    His clothes are separate; the belt, robe bottom and top are skinned separately on the same rig, then when you import those pieces of cloth you can map them to the bones of the character.
    Here is a piece of code that I found somewhere; I made some changes, but I can't remember what or why right now. Anyway, it's not very complex and might be useful for others:

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;

    // Maps the bones of this SkinnedMeshRenderer (a piece of clothing skinned
    // to the same rig) onto the matching bones of a target character, so the
    // clothing follows the character's skeleton at runtime.
    public class Equipmentizer : MonoBehaviour
    {
        public GameObject target;

        void Start()
        {
            // Build a name -> bone lookup for the target character's skeleton.
            SkinnedMeshRenderer targetRenderer = target.GetComponent<SkinnedMeshRenderer>();
            Dictionary<string, Transform> boneMap = new Dictionary<string, Transform>();
            foreach (Transform bone in targetRenderer.bones)
            {
                boneMap[bone.name] = bone;
            }

            // Swap each of this mesh's bones for the target's bone of the same name.
            SkinnedMeshRenderer thisRenderer = GetComponent<SkinnedMeshRenderer>();
            Transform[] boneArray = thisRenderer.bones;
            for (int idx = 0; idx < boneArray.Length; ++idx)
            {
                string boneName = boneArray[idx].name;
                Transform mapped;
                if (boneMap.TryGetValue(boneName, out mapped))
                {
                    boneArray[idx] = mapped;
                }
                else if (boneMap.ContainsKey(boneArray[idx].parent.name))
                {
                    // Bones that only exist on the clothing (extra cloth bones)
                    // get reparented under the matching bone of the target rig,
                    // keeping their local offset.
                    Vector3 tempPos = boneArray[idx].localPosition;
                    Quaternion tempRot = boneArray[idx].localRotation;

                    boneArray[idx].SetParent(boneMap[boneArray[idx].parent.name]);
                    boneArray[idx].localPosition = tempPos;
                    boneArray[idx].localRotation = tempRot;
                }
            }
            thisRenderer.bones = boneArray; // take effect
        }
    }
     
    theANMATOR2b and SunnySunshine like this.
  7. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,553
    Yeah, I worked with a werewolf character once.

    I'm aware of the inverted knee issues - basically animals stand on their toes, and the backward-facing "knee" is actually the heel of the foot, so plugging this into mecanim will make mecanim straighten the character's back and put the heels on the floor, resulting in an awkward posture. At which point you could just use "generic" animation without mecanim.

    So I've been wondering if you used some trick to make the skeleton mecanim-compatible, but I guess not.
     
  8. Elzean

    Elzean

    Joined:
    Nov 25, 2011
    Posts:
    584
    Well, now that I think about it, there should be ways to force the heel bone to keep an offset from the toes, like how foot placement or other stuff acting on bones works. It would potentially make most human animations work then, no?
     
  9. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,553
    Nothing simple/straightforward I can think of.

    Basically, you could do something like this:
    1. Make the animal's toes "feet" instead of toes, and don't make a bone for the heels.
    2. Make the spine bones not follow the curved spinal cord and instead keep them straight.

    However, this kind of rigging sounds like more trouble than it is worth.

    A safer option would be to create a secondary humanoid rig, and derive the wererat animation from it using IK layers. Similar to having an "IK target" bone in your animation, except that it'll be a whole human skeleton.

    Or, I don't know, it would be great if it was actually explained somewhere how mecanim works, so people would be able to read human animation data directly and convert it to their desired skeleton. As far as I can tell, mecanim parametrizes animation into a series of "muscle contractions" (see the sliders in the T-pose scene), but that's still just a guess.
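
    For what it's worth, Unity does expose part of that muscle space through the HumanPoseHandler / HumanTrait APIs, so you can at least dump the muscle values of a humanoid at runtime and poke around. A minimal sketch (attach it to a humanoid character with an Animator; everything else here is just illustration):

    Code (CSharp):
    using UnityEngine;

    // Sketch: reads mecanim's "muscle" values for a humanoid each frame
    // and logs the first few of them.
    public class MuscleDump : MonoBehaviour
    {
        Animator animator;
        HumanPoseHandler poseHandler;
        HumanPose pose;

        void Start()
        {
            animator = GetComponent<Animator>();
            // Maps between the skeleton under this transform and muscle space.
            poseHandler = new HumanPoseHandler(animator.avatar, transform);
        }

        void LateUpdate()
        {
            poseHandler.GetHumanPose(ref pose);
            // One normalized value per muscle; names come from HumanTrait.
            for (int i = 0; i < 3 && i < pose.muscles.Length; ++i)
                Debug.Log(HumanTrait.MuscleName[i] + " = " + pose.muscles[i]);
        }
    }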
     
    Elzean likes this.
  10. squanch

    squanch

    Joined:
    Jun 24, 2017
    Posts:
    21
    I heard it wasn't all that great (I think from another thread about Maya vs X software), so it's probably just someone's opinion. I've used it before, but I have never used ZBrush, so I wasn't really able to make a comparison. I'll definitely give it a download though to try and learn it.
     
  11. Deleted User

    Deleted User

    Guest

    I had a day or two left of my Maya sub, so I actually sat down and thought I'm going to learn Blender.. I think last time I tried Blender, what put me off was the dire default setup.. This time around I hit the preferences and set it up just like Maya. Some of the preset keys were off, like Alt + X being set to a default extrude which sort of extends faces based upon the camera view..

    If you open the tool menu (left hand side) where it says extrude, then right-click to change the preset key, you can replace it with Region Vertex Normals, which extrudes the way you'd expect it to. In total I spent about half a day learning the basics; there's still much more to learn, but I have to admit once it's set up properly it is quite impressive.

    The documentation leaves something to be desired (especially if you're using Maya presets); it wasn't initially apparent you had to LITERALLY select the edge to grab an edge / face loop with CTRL + ALT. Although it's instant when you figure it out and super quick, so much better than Modo's clicky, all-over-the-place keyboard map: press 3 for polys, then L at the other side of your keyboard to select the loop.

    If you're a Maya user it's already right next to the viewport controls, so it fits like a glove.

    This one's hard to describe in text, but look at the lip that loops over the crate (pic below). Usually if I were to add an edge loop and move / scale it to cover the edges for highpoly mapping, it would to some extent upset the mesh. Blender, for singular loops (CTRL + R), seems to scale / place the loop to match the outer / inner polys without flexing the poly edges, which is pretty sweet for HP / LP workflows; also the ability to change the extrude type to even out bends was super cool.

    Initially I wasn't that impressed with the instant mouse engagement after using a tool - let's say you want to extrude or loop, it will just slip.. Although, as it seems with most things in Blender, there's an answer to it: hold CTRL whilst you move the mouse about and it will use it as a uniform snap segment.. There's even a pivot point directional snap like Maya, which was one of the reasons I kept using it as opposed to Modo's workplane / pivot thing that I really can't get into.

    Here's the thing, I used Blender for half a day.. I've been using Modo on and off for months over quite a few years, I still don't "get it".. Well not completely it seems.

    This is only the beginning; I do quite complex modelling with soft deformers etc., so we'll see how it ultimately stacks up.. Again, still got a lot to learn about it, but from initial impressions I am very impressed. Seems @neginfinity was right.. (As much as I hate to admit it :D)..

    After a bit o' learning, this sci-fi crate mesh thingy took me 3.5 minutes (I timed it).. Sure I bet I could do it with Maya in about 10 minutes but these things add up.

    [attached image: Blender1.jpg]
     
    Last edited by a moderator: Jul 24, 2017
    jabevan, Ryiah, kB11 and 2 others like this.
  12. squanch

    squanch

    Joined:
    Jun 24, 2017
    Posts:
    21
    Sorry, I know this is kind of a late reply but I just thought I'd share my thoughts.

    I think for me, definitely like you said, the default setup is very displeasing. I get that with a bit of setup it might function better than Maya, but maybe I'm just more used to Maya's UI at this point.

    From what I can tell, Blender might actually be generally easier and quicker to learn/use, but isn't necessarily the industry standard.
     
  13. Deleted User

    Deleted User

    Guest

    Unless you're looking to get a job at a specific company, I wouldn't worry about "industry standard", and TBH skills are easily transferable between the two.. I'm still looking into it, but one of the things that made Maya LT so powerful was its integration with Stingray; regardless of whether or not you're using it to complete your full game, even just going from DCC to game for pre-visualisation is great.

    There's no guessing, once you've visualised it in Stingray it will transfer across every engine easily.. Not exactly a new workflow for larger companies, still it does make a lot of difference.

    In all fairness, if you can afford both I don't see why you don't get both.. Especially for the .FBX pipeline alone.

    Anyone got any experience with exporting large environments / levels from Blender to Unity? I try to avoid the one-mesh-per-export / zero-vector approach if I can (as it's crazy time consuming). What about simulation / visualisation in BGE?

    There are a few things I'm still trying to find equivalents of:

    Automatic quad-safe re-topology - I've seen some decent tools, but not like the ones in Modo / Maya.
    ShaderFX + integrated support for Allegorithmic.
    HumanIK, which (sorry) is better than Rigify etc.
    Motion Trails
    Unfold 3D

    I still need to look deeper because there might be other tools available (whether paid or free).. Although, like everything else, there are pros / cons.. I will say Blender is in my top three now, which is cool; if it was my only choice I'd be fine with it.
     
    Last edited by a moderator: Jul 26, 2017
  14. ArachnidAnimal

    ArachnidAnimal

    Joined:
    Mar 3, 2015
    Posts:
    1,760
    Unfortunately, job recruiters do not believe this.
    If the job requirements say Maya experience required, and your resume only mentions you are an expert at Blender, then in the recruiter's mind you are 0% qualified for the position.
     
  15. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,553
    I recently realized that the FBX exporter actually resolves references to linked data correctly.
    Meaning if you make one mesh, create a thousand copies of it in the scene with Alt+D (so editing one modifies all of them), and export it, you'll import - in Unity - a thousand objects and ONE mesh.

    So this should work, but you'll need to slap an asset preprocessor onto it to handle colliders and other stuff like that.
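
    For the collider part, a bare-bones version could look like the sketch below. The "_col" suffix is just an example convention I'm making up here; pick whatever suits your pipeline (again, this goes in an Editor folder):

    Code (CSharp):
    using UnityEngine;
    using UnityEditor;

    // Sketch: during model import, any object whose name ends in "_col"
    // becomes collision-only: a MeshCollider is added and the renderer removed.
    public class ColliderImportProcessor : AssetPostprocessor
    {
        void OnPostprocessModel(GameObject root)
        {
            foreach (Transform child in root.GetComponentsInChildren<Transform>(true))
            {
                if (!child.name.EndsWith("_col"))
                    continue;

                MeshFilter filter = child.GetComponent<MeshFilter>();
                if (filter != null)
                {
                    MeshCollider collider = child.gameObject.AddComponent<MeshCollider>();
                    collider.sharedMesh = filter.sharedMesh;
                }

                // Drop the visual part, keep only the collision geometry.
                MeshRenderer renderer = child.GetComponent<MeshRenderer>();
                if (renderer != null)
                    Object.DestroyImmediate(renderer);
            }
        }
    }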

    As for the Blender game engine, I haven't used it much; plus, it doesn't seem to be the center of the devs' attention at the moment.

    Regarding "there's a few things I'm still trying to find equivalents of":

    Automatic quad-safe re-topology: Remesh can create a quad-only mesh, but it'll be quite ugly and horrible. A fairly popular tool is RetopoFlow, but it is not automatic.

    ShaderFX + integrated Allegorithmic support: Nope, no integration. Nodes or a custom shading language (OSL) only.
    I have high expectations for Eevee, though, which is said to use pretty much the same shader system as Unreal Engine.

    HumanIK: Also nope. I take it you've discovered the horrors of dealing with pole targets.

    Motion Trails: You can use onion skinning with armatures and there was an addon for displaying trajectories, but that's about it.

    Unfold 3D: Not familiar with it.

    It is entirely possible that there's an addon that does some of this somewhere (I discovered LoopTools entirely by accident, for example) and I'm simply not aware of it.
     
    Deleted User likes this.
  16. Deleted User

    Deleted User

    Guest

    That's because a lot of the time HR departments go by keyword searches to get through as many applications as possible, as there's generally a metric ton of applicants and HR aren't generally technically inclined.. If I talked to a technical artist in an interview and they dismissed someone out of hand because their main app was Blender, I'd be concerned about their knowledge and start questioning who I'd applied to get a job with..

    It's like game engines as well really; if you can create a complex game in Unity you can create a complex game in any engine. The fundamentals are pretty much the same, it's just the APIs and finding stuff in the editor you have to worry about.

    @neginfinity

    Yep, I ran into the flipping issues / rotational errors; I re-rolled and stopped using Rigify, although I've still to try something like iTaSC (like someone suggested). Anyway, it still seems that ultimately Maya is better for getting the job done on the whole (although at this point I find Blender to be a quicker box modeller); Maya's animation systems are easier / more efficient and seem to just work as expected, although Blender is usable, if a bit finicky (the IK solvers bit aside)..

    One of the issues I did come across is compatibility with Blender: in LY and UE, Blender did not play nicely.. I had to bake a lot of things down, and LY outright rejected imports; I had to bring it into Maya to merge vertices / freeze everything and export it again w/ the .FBX SDK..

    If you're using Blender with Persona for re-targeting etc., that will be your issue; I personally couldn't get it working right.. HumanIK doesn't have a problem, then again they did release ART for Maya as well, so it somewhat explains things..

    The only issue with Maya LT is Autodesk keep stripping useful stuff out of it. Like, sure, I suppose I can deal with two animation layers, one for the face and the other for the rest, but it's unnecessary.. Also transfer maps were useful, until they took them out.. Can't say the muscle deformers were a massive issue, but originally there weren't cluster deformers either, which was a pain..

    It could also do with proper scale deformers, which are in the full version of Maya.. They need to decide whether it's a tool for indies or an extended, cut-down trial version with a path to full Maya.. In which case I'll write my own IK solvers!

    Anyway, quite contrary to a lot of threads in the general forum this one's been very informative and useful..
     
    theANMATOR2b and ArachnidAnimal like this.
  17. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,553
    @ShadowK: Speaking of which, does MayaLT have support for an equivalent to Blender drivers?
    https://docs.blender.org/manual/de/dev/animation/drivers/index.html

    Basically... you can make a programmable controller for any object parameter. For example, you could reference the position, scale or a member variable of some object in the scene, and use it to automatically control any parameter of something else. The controller is defined via a Python expression but can also be used in conjunction with F-curves.

    This is ridiculously powerful, and in practice it means you can make an armature control blendshapes, and you can make helper bones that will toggle object visibility, for example.
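
    There's no direct Unity equivalent, but for comparison, the "helper bone drives a blend shape" trick is easy enough to hand-roll on the Unity side. A rough sketch, with the bone / blend shape hookups left as placeholders you'd wire up yourself:

    Code (CSharp):
    using UnityEngine;

    // Rough Unity-side analogue of a Blender driver: maps a helper bone's
    // local X rotation onto a blend shape weight every frame.
    public class BoneDrivenBlendShape : MonoBehaviour
    {
        public Transform driverBone;            // e.g. a helper bone in the rig
        public SkinnedMeshRenderer targetMesh;  // mesh that owns the blend shape
        public int blendShapeIndex = 0;
        public float maxAngle = 90f;            // bone angle mapped to weight 100

        void LateUpdate()
        {
            // Read the driving parameter (local X angle, wrapped to -180..180).
            float angle = driverBone.localEulerAngles.x;
            if (angle > 180f) angle -= 360f;

            // Map it onto the driven parameter (blend shape weight 0..100).
            float weight = Mathf.Clamp01(Mathf.Abs(angle) / maxAngle) * 100f;
            targetMesh.SetBlendShapeWeight(blendShapeIndex, weight);
        }
    }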

    ------
    Speaking of flipping issues, the problem with Blender IK is that the secondary target (elbow target etc.) may require entering an "angle" value, which is fairly dumb. In case of improper limb bending, I've noticed that setting rotational limits helps.

    As for Persona retargeting, it is exactly the problem which I hit last time I worked with Unreal (which has been quite some time ago). Unity can happily eat a Blender skeleton without complaining and retarget the animation onto any other skeleton. Unreal couldn't handle those properly.

    It is possible that there are ways to deal with the problem; for example, I definitely recall someone posting an "Unreal rig for Blender" on the Unreal forums somewhere.
     
    Last edited: Jul 29, 2017
  18. Deleted User

    Deleted User

    Guest

    It does, via MEL script (in Maya LT), and you can also create / set preset actions via the animation driver system, so for a lot of it you don't even need to script. (Driven keys are cool for multiple objects: let's say you want a door to swing as the character pulls it, you can use driven keys to do both, and it will interact with the IK component of a character, for example.)

    Also, it has automated path animation and a full timeline editor. If you ever have issues with re-targeting (which I have had with Mecanim on various occasions), you can actually re-target with Maya and then just import multiple rigs as generic; in all fairness, if it's the same rig Unity seems to understand it, and you can use generic rigs anyway.

    Yeah, Unity does seem to play well with Blender, given the fact you can dump a .blend file in and Unity will process it. I even came across this doc:

    https://docs.unity3d.com/Manual/BlenderAndRigify.html

    I did always wonder though, if there's anything that could / should be integrated into an engine it's characters.. I've had hit and miss situations with every engine so far, with every DCC. They all expect you to do things X way and they can be very rigid about it..

    Even with Geppetto (LY / CE), Persona and Mecanim, they all have their own deformer / skeletal mesh setup system, and Unity even has a dope sheet / graph editor system in which to create animations.. You seem to overwrite a lot of animations with rigidbodies / engine inverse kinematics / ragdoll / root motion and PhAT animations anyway..

    The only thing you're really doing is setting up the rig and moving things about in the DCC, and for all the extra effort you have to do to get it working right with the engine's own system, I'm surprised one engine just didn't finish off the bits that were left over with generic biped and quadruped rigs..

    A joint editor isn't difficult to make and you can find them in most consumer (non-pro) pieces of software.
     
    Last edited by a moderator: Jul 29, 2017
    theANMATOR2b likes this.
  19. jtbentley

    jtbentley

    Joined:
    Jun 30, 2009
    Posts:
    1,397
    A few years ago, I'd have said to avoid Blender - not because the tool was immature (which at that time, it was), but because of how entrenched the art world was in Autodesk products; artists who used Maya/Max/XSI just had much more proven workflows and pipelines - and a lot more industry experience in general.

    Nearly a decade has passed since then - and I've seen some excellent work from Blender, there are genuine veteran artists using it now, and it works fine with Unity. Also, although I'm no longer an artist by trade, I've not seen a lot of innovation coming out of Autodesk across any of their products - I feel like the products iterate on a handful of small things per year to warrant ongoing subscription costs.

    So I don't think at its core there's an issue to pick one or the other if you don't have existing entrenched workflows or staff who are already too deep in one or the other to re-skill. My personal preference is always going to be Maya/Max, simply because they're the tools I'm most comfortable with. I've yet to see 'the killer feature' one app has over the other.
     
  20. theANMATOR2b

    theANMATOR2b

    Joined:
    Jul 12, 2014
    Posts:
    7,790
    As a Blender artist - IF you happened to get lucky enough to get past the first 2-3 sorting rounds - to get to the interview phase, any experienced tech artist would be able to tell if the interviewee was knowledgeable and competent with the techniques - even if they worked in different software.
    Unfortunately, for most companies that have 20+ employees, the standard/common application review process does not include any technical people. The very first round is a computer sorting round that looks for specific keywords - no human interaction beyond selecting the keywords to find within a resume.
    The second round of resume sorting consists of HR noobs who are tasked with getting the resume count down to a certain number, usually 20 or less. Only then will someone with ACTUAL working knowledge be available to look at the remaining resumes to select 5-10 applicants to perform tests or phone interviews, or 2-3 to come in for an in-person interview.

    Instead of listing software packages, another option is to detail experience in techniques and workflows and have a kick-butt demo reel that will get noticed - regardless of which software package the artist uses.

    This is still largely fact, except for larger CG companies who have proprietary software.
    I agree - Blender has made leaps and strides playing catch-up for about 10 years just to come up to be considered a viable alternative to other commercial packages. But I don't see Blender ever surpassing those commercial packages. They will always be 1-2 iterations behind the innovative/cutting-edge solutions built into the commercial packages.
    To be honest though - I think Blender has done a great job at advancing their product from a clunky, unstable alternative to a viable option compared to today's 3D content creation tools.

    But software really isn't important - it's all about the artist's ability, creativity and capacity to overcome any software's shortcomings - either within the software or in the import/export pipeline - going into the game engine.
     
    ArachnidAnimal and Deleted User like this.
  21. Fera_KM

    Fera_KM

    Joined:
    Nov 7, 2013
    Posts:
    307

    Adding to that,
    on-project hires have zero interest in paying for your "on the job" learning time, whether that's learning new software or improving one skillset or another.
    Permanent hires are usually a little more flexible in allowing you to learn new off-the-shelf software, and investing in you.
    That said, most jobs are on a project basis, so while Blender is probably excellent in its own regard, picking up fundamental knowledge of 3ds Max or Maya might be worth considering depending on future prospects.
     
  22. Korindian

    Korindian

    Joined:
    Jun 25, 2013
    Posts:
    584
    @ShadowK Have you seen Auto Rig Pro? Looks like they're supporting export to Unity. I haven't tried it yet, but I was looking for a possible rigging solution if I make the switch from Maya LT to Blender.

    I'm really interested in a mesh decal workflow, and I really love how there is a dedicated plugin for that being developed for Blender by one of Unity's Asset Store developers. Haven't found anything like that for Maya LT yet.
     
    Deleted User likes this.
  23. Deleted User

    Deleted User

    Guest

    I have now, thank you very much :).. There's also one for muscle deformations which is cool, I do wonder though why these things never come with face rigs? Not hard to expand I know, but one can wish :)..
     
    Korindian likes this.
  24. Elzean

    Elzean

    Joined:
    Nov 25, 2011
    Posts:
    584
    I checked Auto Rig Pro; there is something for the face, no?
     
  25. Deleted User

    Deleted User

    Guest

    Oops, seems so.. I've been researching 50 things at once so I kinda skimmed it.