Rotation, position VS 3D application coordinate system

Discussion in 'Scripting' started by Shushustorm, Mar 20, 2018.

  1. Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    1,084
    Hey everyone!

    When using files from Blender (.blend), the rotation seems to be calculated differently.
    Unity corrects for the different coordinate system by rotating the imported objects so they align with Unity's axes.

    However, this correction only seems to rotate the object's transform, not the mesh data relative to its local axes.

    This causes various problems:

    - someTransform.rotation = Quaternion.LookRotation(someVector3)
    or
    - someTransform.forward = someVector3

    produce different rotations depending on whether the model is imported as .blend or as .fbx (exported from Blender), making some rotations seemingly impossible. For example, when an object is supposed to look at a position, its down axis faces that position instead; and even when I declare a custom world up, I can only rotate the object around its height axis, which still doesn't make its front face the position.
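    To illustrate why this happens, here is a plain-Python sketch (not Unity code). It assumes Unity's documented convention that LookRotation aligns the local +Z axis with the given direction, and it assumes the imported mesh's visual front sits on local -Y, as described above:

```python
# Sketch: LookRotation builds a basis whose third column (+Z, "forward")
# points along the requested direction.  If the mesh was authored with its
# visual front on a different local axis (here -Y, as with an unconverted
# Blender mesh), that front does not end up facing the target.

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def look_rotation(forward, up=(0.0, 1.0, 0.0)):
    # Basis columns (right, up, forward), as Quaternion.LookRotation would build
    # for unit-length, non-parallel inputs.
    right = cross(up, forward)
    new_up = cross(forward, right)
    return right, new_up, forward

def rotate(basis, v):
    # world = right * v.x + up * v.y + forward * v.z
    r, u, f = basis
    return tuple(r[i] * v[0] + u[i] * v[1] + f[i] * v[2] for i in range(3))

target_dir = (1.0, 0.0, 0.0)         # unit direction toward the target
basis = look_rotation(target_dir)
mesh_front_local = (0.0, -1.0, 0.0)  # visual front authored on local -Y
world_front = rotate(basis, mesh_front_local)
# world_front points along world -Y (straight down), not toward the target,
# matching the "its down axis will face that position" symptom.
```

    The math itself is fine; the problem is purely that the mesh's visible front is not on local +Z, which is the only axis LookRotation cares about.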

    - someTransform.position = someVector3

    This still works, but it is confusing: to move the object along an axis you see in Unity, you have to change a different axis than the one shown in the editor.

    With keyframed animation, this mismatch becomes an order of magnitude harder to deal with.
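    The position confusion can be pictured as an axis remap: Blender is right-handed with +Z up, Unity is left-handed with +Y up. A minimal sketch, assuming a simple swap of Y and Z (the actual remap depends on the importer and the export settings):

```python
# Hypothetical Blender -> Unity coordinate remap: swap Y and Z.
# Swapping two axes is a reflection (determinant -1), which also
# converts the right-handed system into a left-handed one.

def blender_to_unity(p):
    bx, by, bz = p
    return (bx, bz, by)  # Unity up (+Y) comes from Blender up (+Z)

# Moving a point "up" in Blender (+Z) moves it along Unity's +Y...
print(blender_to_unity((0.0, 0.0, 2.0)))  # (0.0, 2.0, 0.0)
# ...so a keyframe on Blender's Y axis ends up animating Unity's Z axis.
```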

    The current workaround I am using (just as described in the Unity documentation: https://docs.unity3d.com/Manual/HOWTO-FixZAxisIsUp.html ):

    - Parenting each .blend that is supposed to be animated to an empty that uses Unity's conventions for forward, up and right.

    This doesn't seem like a good idea, though: animated objects in particular have their positions computed through parent and child transforms, which can have a major impact on performance ( see Unity's blog entry: https://blogs.unity3d.com/2017/06/2...-the-spotlight-team-optimizing-the-hierarchy/ )

    Is there another way to fix this? I tried various approaches, including rotating the mesh data inside Blender, different export settings for .fbx (which I don't want to use at all; it was only for testing purposes, but it didn't work either), and Python scripts that are supposed to change the mesh relative to its coordinates before saving / export. Only rotating the mesh data worked, which is not practical to work with, because then in Blender you end up working from a "top down" perspective. Everything else resulted in the Blender objects being rotated incorrectly without the use of an additional parent.
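    For reference, the "rotate the mesh data" approach amounts to baking a -90° rotation about X into every vertex, so the Z-up mesh becomes Y-up before Unity ever sees it. A plain-Python sketch of that bake (in Blender this would run over the mesh's vertices; the numbers here are only an illustration):

```python
import math

def rotate_x(v, degrees):
    # Standard rotation about the X axis, applied to one vertex.
    a = math.radians(degrees)
    c, s = math.cos(a), math.sin(a)
    x, y, z = v
    return (x, y * c - z * s, y * s + z * c)

# Baking -90 degrees about X turns the authored up axis (+Z) into +Y,
# so no compensating object rotation is needed after import.
up_blender = (0.0, 0.0, 1.0)
baked = rotate_x(up_blender, -90.0)
# baked is (0.0, 1.0, ~0.0): the old up axis now lies on +Y
```

    The downside is exactly what is described above: the mesh data no longer matches Blender's own Z-up convention, so further editing in Blender happens from a top-down view.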

    Best wishes,
    Shu
     
  2. Kurt-Dekker

    Joined:
    Mar 16, 2013
    Posts:
    38,697
    Pretty sure you have explored all the options my friend. This problem arises because Unity is a left-handed coordinate system and Blender3D is a right-handed coordinate system.

    Personally I just use the mesh as-is and put the extra pivot in, make a prefab and call it a day, since 99% of interesting use cases require you to do a prefab in any case. I have yet to see this cause performance issues but then again I haven't really stressed it. I can't imagine it causing you any real issues to have an extra transform in there.

    If you are really nervous about performance, make a simulation of your heaviest REALISTIC imagined use case, i.e., clone as many objects of the complexity you imagine, then run it with and without the extra transform above it.

    My bet is you won't be able to tell the difference. Blender to Unity integration is awesome.

    BUT be forewarned: you will no longer be able to use Unity Cloud Build unless you hand-export all your Blender files to FBX files first, which to me is a non-starter. Unity does not have a license to use Blender in their cloud system, which is a terrible, terrible shame given how easy it is to use Blender files in Unity. Not a day goes by that I don't wish I could use Unity's cloud build WITHOUT fiddling with FBX exports, but alas.
     
  3. Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    1,084
    @Kurt-Dekker
    Thanks for your answer!

    For some reason, I was able to fix this easily today, at least with a test .blend.
    The object doesn't have to be rotated so that you look at it from top down.
    You only have to rotate around Blender's Z axis so that -Y is front, -X is right and +Z is up.
    Then, in Unity, set all rotations to 0 and it should work.
    Hopefully, this wasn't just an obscure coincidence.
    Also, I haven't tested this with code yet, but the axes look right in Unity now (+Z forward, +X right).
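    A quick sanity check of that result in plain Python. The axis mapping below is reverse-engineered from the observation above (-Y → +Z, -X → +X, +Z → +Y); it is an assumption about this particular import path, not something taken from importer documentation:

```python
# Hypothetical mapping implied by the observed result:
#   Blender -Y (front) -> Unity +Z (forward)
#   Blender -X (right) -> Unity +X (right)
#   Blender +Z (up)    -> Unity +Y (up)
# i.e. (bx, by, bz) -> (-bx, bz, -by), a reflection (determinant -1),
# consistent with the right-handed -> left-handed switch.

def observed_mapping(p):
    bx, by, bz = p
    return (-bx, bz, -by)

front = (0.0, -1.0, 0.0)   # Blender front after the Z rotation
right = (-1.0, 0.0, 0.0)   # Blender right after the Z rotation
up    = (0.0, 0.0, 1.0)    # Blender up

assert observed_mapping(front) == (0.0, 0.0, 1.0)  # Unity forward (+Z)
assert observed_mapping(right) == (1.0, 0.0, 0.0)  # Unity right (+X)
assert observed_mapping(up)    == (0.0, 1.0, 0.0)  # Unity up (+Y)
```

    All three Blender axes land on the expected Unity axes, which at least makes the fix internally consistent rather than a coincidence.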

    Even if this works with .blend, I don't know whether it also works with .fbx, since I don't use .fbx (or Unity Cloud Build).

    You may be right that this won't have a huge impact on performance for most projects.
    Personally, I am trying to run about 20 rigged characters per frame on platforms including low-end mobile, so I try to avoid anything unnecessary. I also don't want any additional parenting (as long as it isn't needed) for workflow reasons: it means one more expand / collapse per object in the hierarchy.