Why can't Unity HDRP correctly render AAA looking character models?

Discussion in 'High Definition Render Pipeline' started by cloverme, Sep 9, 2021.

  1. Safemilk

    Safemilk

    Joined:
    Dec 14, 2013
    Posts:
    42
    This looks fantastic.
     
  2. schema_unity

    schema_unity

    Joined:
    Jun 13, 2018
    Posts:
    109
    Thank you! I appreciate it.
     
  3. schema_unity

    schema_unity

    Joined:
    Jun 13, 2018
    Posts:
    109
    [Attached image: SuperSaiyan.gif]
    I might have accidentally created a Super Saiyan effect...
     
  4. HIBIKI_entertainment

    HIBIKI_entertainment

    Joined:
    Dec 4, 2018
    Posts:
    577
    Kamehameha-ir
     
  5. koirat

    koirat

    Joined:
    Jul 7, 2012
    Posts:
    2,033
    Will you also be able to make different pupil shapes, like vertical (snake-like), for example?
    I was kind of missing this feature in the Unity eye shader.
     
  6. cloverme

    cloverme

    Joined:
    Apr 6, 2018
    Posts:
    183
    Yup, I really wish Unity would step up to the plate when it comes to helping indie devs put AAA characters into games; they've really missed the mark, especially for cinematic sequences. Most of the design workflow exists outside of Unity, and I don't see that changing anytime soon. Unity keeps releasing low-level packages like Digital Human while Unreal is releasing high-level tools like MetaHumans. Overall, Unity seems to be missing the point, I think: the tech demos show off amazing visuals with AAA cinematic sequences, but by and large the editor and Asset Store are more or less geared toward making a low-poly cartoon airplane go in circles. So there's somewhat of a disconnect, I think, between Digital Human and the upcoming Ziva (supposedly) and indie devs. I could be wrong, but I doubt we'll see a gallery of characters that are game-ready.
     
    Shizola likes this.
  7. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    Once I figure out how to blend two masks together, I'll definitely add this to my shader. It mainly involves using texture channels to morph their shape. I tried linear interpolation and that wasn't doing the trick.

    All my eye textures are made in Substance Designer, so I can easily add this as a texture. The irises of my eye shaders are separate textures, so you can swap the pupils out for something else.
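
    For the mask-blend approach above, one wrinkle is that lerping hard binary masks only cross-fades them. A minimal sketch (plain Python; the function names are illustrative, not any shader API) of blending two grayscale pupil masks and re-thresholding, which yields a genuinely intermediate shape when the masks are stored with soft gradients:

```python
def lerp(a, b, t):
    """Linear interpolation between two scalar mask samples."""
    return a + (b - a) * t

def morph_pupil(mask_round, mask_slit, t, threshold=0.5):
    """Blend two grayscale pupil masks (0..1) and re-threshold.

    Lerping the raw mask values and thresholding afterwards produces
    intermediate shapes; thresholding first and then lerping only
    cross-fades two fixed shapes.
    """
    blended = lerp(mask_round, mask_slit, t)
    return 1.0 if blended >= threshold else 0.0

# One texel where the round mask reads 0.9 (inside) and the slit mask 0.1:
print(morph_pupil(0.9, 0.1, 0.25))  # 1.0 -- still inside the pupil at t=0.25
print(morph_pupil(0.9, 0.1, 0.75))  # 0.0 -- outside at t=0.75
```

    This works best when the masks are SDF-like soft gradients, which may be why a straight lerp of hard-edged masks "wasn't doing the trick".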
     
    Last edited: Aug 22, 2022
  8. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    I aim to change this in a few months. I've decided to take a stab at the Asset Store and give back to the community. Also, I did something new in Unity using APV (Adaptive Probe Volumes), which was demonstrated in the Enemies demo, and it's impressive!

    I'll do more tests later today

    [Attached images: Unity_CvUDYNCZsy.png, Unity_9UOl9Sxgcn.jpg]
     
  9. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    Your request has been granted :D. In Substance Designer, by adding a custom shape to my irises, I can NOW do ANY shape I want :)

    [Attached images: upload_2022-8-22_12-54-12.png, upload_2022-8-22_12-54-28.png]
     
    hopeful and koirat like this.
  10. koirat

    koirat

    Joined:
    Jul 7, 2012
    Posts:
    2,033
    Can you morph from circular to vertical like animals do ?
     
  11. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    And Cat Eyes

    [Attached image: upload_2022-8-22_14-45-43.png]
     
    koirat likes this.
  12. schema_unity

    schema_unity

    Joined:
    Jun 13, 2018
    Posts:
    109


    I've only had the chance to play with it for a few hours so far, but I'm already a big fan of the hair system. If it already looks so nice with just lines, I can't wait for the HDRP renderer to drop.
     
  13. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    Look at the post above. I've designed a substance that allows me to use ANY shape to create custom pupils
     
  14. impheris

    impheris

    Joined:
    Dec 30, 2009
    Posts:
    1,518
    Ziva is (or will be) a paid tool, if I'm not wrong, so it's definitely not designed with indie devs in mind.

    Looks cool, but the hair needs more weight, I think.
     
  15. cloverme

    cloverme

    Joined:
    Apr 6, 2018
    Posts:
    183
    Probably... Unity is very unfocused right now when it comes to tools for a unified solution for character design and animation inside the editor. Indiedevs are left to try to cobble together solutions as best as we can, for now.
     
  16. HIBIKI_entertainment

    HIBIKI_entertainment

    Joined:
    Dec 4, 2018
    Posts:
    577
    It can save some time. Basic actor day rates for a day's work can easily end up costing about the same as training, staff, gear, studio, etc., if you don't already have that in-house or a studio partner to do it with.

    It can easily be indie/solo oriented; you just really need a solid production pipeline.
     
  17. Safemilk

    Safemilk

    Joined:
    Dec 14, 2013
    Posts:
    42
    I think I fall a little more in the middle on this one. I would rather get mid to lower level tools because frankly... I HATE Metahumans.

    It's a glorified character creator and people tend to just plop one in their game and say, "Look what I made!" But... a huge team of artists and techno wizards made that, you just changed some sliders!

    I think Ready or Not was the first game to use MetaHumans in a non-irritating way: just to fill in gaps, for things like head variations. The rest of their character art is hand-made and brilliantly detailed.

    So I think where Unity could succeed is by taking a middle-ground approach between what they are doing currently and what MetaHumans does. Because really, what makes MetaHumans powerful, in my opinion, is two factors:

    1: Unreal's material, shading, post-processing, and shadow mapping, when set up in a VERY specific way, can make really nice character renders.

    2: Metahumans are made by collections of amazing artists and tech people and they export very nicely to other programs.

    The focus I'd love to see Unity take is more along these lines:

    Improve their IBL BRDFs and have better shader tech and lighting for all of the surface rendering; more bespoke examples of great hair, skin, and eyes like we've been seeing, that we can take apart and understand; and a focus on screen-space effects that enhance character shading. Currently, for example, the AO can either look good for environments or look good for characters, but never both.

    I think HDRP is definitely a step in the right direction, but it's still not quite competitive with some of the high-level output we see from Unreal, and I think that comes down largely to rendering techniques rather than fancy things like pre-baked muscle deformation in animation, though that would certainly get us through the final stretch.

    I think if we could see Enemies as a project to open and inspect, if we could have the Ziva dynamics thing, we might be able to get some more data on how those demos look so great.

    Honestly I think this thread is pretty great because it's something we can point to and say, "Look how hard we are trying to talk about and solve these problems, Unity! Come with us on this journey!" lol
     
  18. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    I'm with you on this one. I'm not hating on MetaHumans, but I can't see myself using it for my character work. Same issue with the Daz Genesis models: you can TELL they're generated, especially when an inexperienced artist uses them and the output looks too similar.
    Me personally, I'd rather build some really solid base body models to start from and build NPCs from there. For a one-man team it seems like a lot of work, but it's satisfying to see your work beautifully hand-crafted, and you actually have an understanding of how it works.
    I've also been reworking my personal shaders to be more artist-oriented and easy to use, with a clean UI. A lot of these shaders in Unity and Unreal have too much S*** going on, with numerous, insane sliders. It can be complex under the hood, but make it simple.
     
    Oguzkagansahin and Safemilk like this.
  19. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Unity is a glorified game maker and people tend to plop their game in and say "Look what I made!" But... a huge team of artists and techno wizards made that.

    To effectively use metahuman you're required to do a lot of additional work, and also use the metahuman mesh function well, which is essential for making unique characters. To this end, my brother was able to reproduce his likeness almost exactly in it. I was really impressed but it takes a lot of high-level work.

    The real steal from it is you get it integrated with the engine, you get all the facial rigging done (with metahuman mesh it can be really stylised but still work). This is no different to working on top of Unity with all the time saving Unity supplies. So metahuman is just a "character engine" like a game engine, or physics engine.

    It's easy to criticise it if you haven't done anything unique in it, or have only seen what people do with 5 minutes of experimenting. Because that's the truth; otherwise I can't see how you'd dismiss it so easily.

    If Unity wants to have char generating as a service or feature, they absolutely will need to tackle realistic humans first - just like metahuman, and then build on it once that's stable with face meshing (as in metahuman) followed by body modification and so on.

    In short, I think metahuman is absolutely the right way to go, and you absolutely have to do plenty of good honest work on top of it, just like using Unity as a game engine.
     
    Safemilk, Faikus and KRGraphics like this.
  20. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    Yep, and also remember that Epic Games bought Cubic Motion, and now they have it integrated into Unreal 4 to work with MetaHuman. I was also able to make myself in it, but I can't use it outside of Unreal. Also, Cubic Motion is in-house, versus Ziva, which is currently an external vendor.

    I could probably get a facial rig working if I could scan my face and wrap it to a character model I've made
     
    hippocoder likes this.
  21. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    These are the Unity forums so it's remiss of me to go into details but in brief, I believe you get out what you put in, namely the mesh feature, which transfers the likeness of your own mesh (and guide lines/coloured areas) to an existing metahuman head, which allows all the rigging, the motion capture, skin textures, etc etc to be retained and used properly with the provided user mesh. It's a clever way to make it properly usable IMHO.

    Combined with something that would make animation perfect like Unity's Ziva, it would be a very interesting development. I don't know Unity's plans, but I do know they have a special alpha group for forward thinking technology. So it's anyone's guess how things will turn out.

    I don't think the game has started for Unity yet, just pieces put in the right places, but I'm expecting Unity to come on in leaps and bounds. Especially if you look at their hair. Proper black / African hair is supported, with the right kind of physics, which IMHO is rather impressive.

    I'd say hats off to Unity but they'd only show a hair raising response.
     
    Safemilk and KRGraphics like this.
  22. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    It's getting there. Just gotta make sure it's artist friendly and doesn't require me to be a rocket scientist to use it
     
    AnomalusUndrdog and hippocoder like this.
  23. Safemilk

    Safemilk

    Joined:
    Dec 14, 2013
    Posts:
    42
    I think I see what you're saying, though the hyperbole wasn't necessary for me to get where you're coming from. That said, there's still a little bit that I differ on in some regards there, for instance, what people feel they can use the different things we are talking about for.

    For instance, MetaHuman tends to be used in its current state as-is, as a whole solution, which is more the fault of the community misunderstanding its purpose, much like an asset-flip game in Unity. So if I'm more precise in my language, I'd say the way people use MetaHuman is not extensive enough to merit the hype it's getting. It's not DONE as-is; it's step one of a process most creators don't take all the way.

    So in the end I think we have the same criticism, which is the various uses the community finds are acceptable as a final state of content.

    Now as a tool system, getting those rigs out of it is great, and it all depends on your intention as a developer, what are you willing to roll a bespoke version of yourself, vs what are you fine using an integrated system for?

    For instance, I love the fact that I can use Dialogue System from the Asset Store rather than doing the learning and research to try to build a far inferior version of it, since I'm a big dummy when it comes to that stuff.

    Just like since I'm more comfortable in the character art department, I might roll my own system over picking something Unity integrated themselves.

    To that end, I think it's about choice. HOWEVER, the real complaint in this thread, I think, is that the ceiling is lower in Unity than in Unreal for the things that make characters shine, like shading and lighting tech, as well as the things that take it to the next level, like Alembic bakes of muscle deformation, yada yada.

    So something like MetaHuman doesn't necessarily solve that on its own; it solves one piece of the puzzle. The example earlier in this very thread of porting a MetaHuman into Unity is a really good illustration of Unity's character rendering in general not holding up, and I think in the end that's a problem worth solving as well.

    Sorry if I seem combative, I just come from a very specific place on this, but I definitely value you and your brother's perspective on the way you've interacted with the tools as well. It can certainly be a blind spot for me, so I'm glad to hear about it.
     
    hippocoder likes this.
  24. Safemilk

    Safemilk

    Joined:
    Dec 14, 2013
    Posts:
    42
    Yeah, I think that's a better way to phrase it. I don't hate the system/solution itself, just the output that people who use it find acceptable. I'd love to see people leverage the aspects of it that make it RENDER super well, and the systems that allow for customization, with their own meshes, which I think is what @hippocoder is pointing out to me above as well.
     
    KRGraphics and hippocoder like this.
  25. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    Exactly why I'm a character artist and have studied my Unity pipeline so extensively that it's artist-driven.

    On the topic of HDRP, the one thing I REALLY miss right now is custom lighting, which is AMAZING for things like rendering eyes. I use Amplify Shader Editor and I know it can be done in URP, but it's a missed opportunity, and I wish Unity would add custom lighting support for HDRP.
     
  26. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Nah not combative - interesting. I'm the same, we are likely passionate in opinion and our work, so that is way more important than keeping it back. We're grown ups sharing stuff about our craft after all.
     
    KRGraphics likes this.
  27. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    This.
     
  28. impheris

    impheris

    Joined:
    Dec 30, 2009
    Posts:
    1,518
    Hi, can you explain what you mean by that?
     
  29. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    Before Unity introduced SRPs, older versions of the engine with Amplify Shader Editor had a node called Custom Lighting where you could do things like car paint and human eyes.

    The cornea was very easy to set up, and you could have custom normals for the iris and cornea, but HDRP makes that task a bit difficult. Also, with custom lighting, I could have clearcoat normal maps and roughness.
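
    To make the two-normal setup concrete: the appeal of a custom lighting path is evaluating one lobe per layer, with diffuse shading driven by the iris normal and a sharp clearcoat specular driven by the cornea normal. A rough sketch of that idea in plain Python (Blinn-Phong for brevity; all names and constants are illustrative, not Amplify's or HDRP's API):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    m = math.sqrt(dot(v, v))
    return tuple(x / m for x in v)

def eye_lighting(n_iris, n_cornea, light_dir, view_dir,
                 albedo=0.5, coat_gloss=256.0, coat_strength=0.8):
    """Two-layer eye shading sketch: diffuse from the iris normal,
    a sharp clearcoat specular from the cornea normal (hypothetical
    constants, Blinn-Phong instead of a physically based lobe)."""
    l = normalize(light_dir)
    v = normalize(view_dir)
    h = normalize(tuple(a + b for a, b in zip(l, v)))  # half vector

    diffuse = albedo * max(dot(normalize(n_iris), l), 0.0)
    coat = coat_strength * max(dot(normalize(n_cornea), h), 0.0) ** coat_gloss
    return diffuse + coat

# Light and camera head-on: both lobes contribute (0.5 diffuse + 0.8 coat)
print(round(eye_lighting((0, 0, 1), (0, 0, 1), (0, 0, 1), (0, 0, 1)), 3))
```

    Separating the normals is the point: perturbing `n_cornea` alone moves the highlight across the wet surface while the iris shading stays put, which is what's hard to express in HDRP's fixed Eye master node.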
     
  30. cloverme

    cloverme

    Joined:
    Apr 6, 2018
    Posts:
    183
    Yeah, I mean, I get what you're saying. I think, though, that what Unreal is doing for character support is a step in the right direction: allowing game developers to put characters into games without a long, complicated character pipeline. Unity has zero character tooling in the native editor, and I don't consider a skin/hair/eye shader good enough without a system to accompany it. For me to put a walking, talking, breathing, AAA-style character into my game is a very long workflow with disparate tools that don't work well with each other; it's all export-import-tweak, repeat.

    Unity is failing hard at improving or providing a clear character workflow, and this thread is a shining testament to that.
     
  31. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    Yep. Sculpt, export, uv, export, etc. That is why my characters take a while to design
     
  32. Safemilk

    Safemilk

    Joined:
    Dec 14, 2013
    Posts:
    42
    True, that's something pretty heavy to think about too. There are layers to it: on one hand it's shaders and lighting models, and on the other it's something unified. I think Asset Store stuff can be tough for creators to lean on as well when the engine changes as drastically as it sometimes does; lots of stuff I got back in the day to help with workflows just doesn't work with Unity in its current incarnation, or Unity just does it natively now...

    So character pipeline stuff is tough, because like you and @KRGraphics mention, the process outside of Unity can already be so painstaking.

    Which is something MetaHumans solves fairly well with its rigs and how it interfaces when it comes back IN engine.

    Definitely good points!
     
  33. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    And sadly, I can't use it outside of Unreal
     
  34. Oguzkagansahin

    Oguzkagansahin

    Joined:
    Feb 19, 2021
    Posts:
    4
    I haven't been able to follow the thread for a while, but I like where it's going. I'm a huge fan of threads like this, where we build up a good amount of knowledge together that in the end carries us all to a certain quality level, or motivates us to do more.
     
    Last edited: Nov 12, 2022
    KRGraphics and Deleted User like this.
  35. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    I'm glad. And as an early adopter of HDRP, I've learned so much about how to get higher quality in my Unity projects, especially shaders. You just gotta have the know-how and actually put in the work to make Unity shine.
     
  36. cloverme

    cloverme

    Joined:
    Apr 6, 2018
    Posts:
    183
    Unity provided some more information at Unite 2023, but it was somewhat brief.

    The first clip talks about the hair system, the lighting and shadows (they show v2023 in the timeline), and the Digital Human package again (which is still unusable for indie devs and small teams). The second section of the first clip talks about Ziva; the interesting part here is that they updated the Enemies demo to use Ziva with Maya (57:55 time mark).


    The second clip talks a bit about Ziva being integrated for real-time support, and about being able to use a mobile phone camera to record motion capture (23:37 time mark).


    I guess the key takeaway here is that Unity has some roadmap to bring Ziva to Unity for games in the 2023 version, and you'll be able to import a design (likely from Maya) and use mocap from mobile phones. I would expect some of the requirements to be around using ray tracing on DX12.

    Overall, it was nice to see Unity finally acknowledging that the character pipeline is pretty much non-existent. You can tell in the presentations that they keep using work-in-progress language and don't provide any detailed information, since it's just a high-level overview. I wouldn't expect to see any of this in a production Unity version for several years yet.
     
    Rewaken and Qleenie like this.
  37. Qleenie

    Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    783
    At least some more information. Ziva will most likely be render-pipeline agnostic, though; its magic happens outside of Unity. Inside Unity it's more or less compute shader only, and it should work with any shader and render pipeline. I tried the Unity demo; it provides all the source code for the Unity integration.
     
  38. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    You know what would be nice for HDRP? If Unity supported UDIMs, and an entire game level could use a single Master shader.
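
    For reference, UDIM is just a tiling convention: each 1x1 UV tile gets a number starting at 1001, ten tiles per row, so an engine only needs the integer part of the UVs to pick the texture. A small sketch of the standard mapping (plain Python, assuming non-negative UVs):

```python
def udim_tile(u, v):
    """Return the UDIM tile number for a UV coordinate.

    Tiles are numbered 1001, 1002, ... left to right in rows of 10,
    so tile = 1001 + floor(u) + 10 * floor(v).
    Assumes non-negative UVs (use math.floor for negative ones).
    """
    return 1001 + int(u) + 10 * int(v)

def local_uv(u, v):
    """UV within the tile (the fractional part of the coordinate)."""
    return u - int(u), v - int(v)

print(udim_tile(0.5, 0.5))   # 1001 -- first tile
print(udim_tile(1.25, 0.0))  # 1002 -- one tile to the right
print(udim_tile(0.25, 1.5))  # 1011 -- first tile of the second row
```

    Engine-side support is the hard part, of course: the shader has to bind or virtually page all those tiles, which is why UDIMs usually ride on virtual texturing.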
     
  39. blueivy

    blueivy

    Joined:
    Mar 4, 2013
    Posts:
    615
    Pour one out for the old Amplify Virtual Texturing asset, which is 5+ years old now. I believe they supported that even in the built-in pipeline. But yeah, that would be great.
     
    KRGraphics likes this.
  40. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    I think they added this to Amplify Shader Editor; I can ask them. I'll take this over insane draw calls.
     
  41. CenobiteShadoweaver

    CenobiteShadoweaver

    Joined:
    Nov 1, 2022
    Posts:
    44
    Yeah, well, if anything those are plugins for Unreal; the standard Unreal editor does not do what's seen in the above pic. I can get above-4K images in Unity's HDRP. It might come down to the mesh and materials you're using and how you have set up the editor's options, which is all too much to explain here, but suffice to say the above Unreal pictures are not standard renders by any means; they have been worked on somewhat compared to the default render options Unreal offers. These would be near-still pictures using ray tracing in Unreal. Show me some moving frames at that resolution from Unreal in any game, not just a couple of stills.
    Unity seems way easier to adjust in settings than Unreal for me, but this might just come down to how effectively someone uses a particular editor rather than one being better than the other when it comes to imaging.
    The character you're comparing with in the above pictures isn't the best quality either. Most of those Japanese manga characters are low quality; better mesh models use high-end textures, and the render can only produce what the texture is scaled to. I can tell you most Unreal renders aren't that detailed; most human characters from UE4 look poxy. It still comes down to where the asset was originally produced, which in most cases is other editors, or plugins that add functions you wouldn't normally have to improve image quality.
     
    Last edited: Nov 19, 2022
    cloverme likes this.
  42. cloverme

    cloverme

    Joined:
    Apr 6, 2018
    Posts:
    183

    While Unity seems to understand this is an issue, they're trying to lower the barrier in a future version of Unity. Whether this will be the "Digital Human" package or Ziva remains to be seen at this point, since they seem to be tooling with both in demos now. Since the post from last year, I've narrowed the "solution" down to a few key things:

    • Micronormals are required.
    • Subsurface scattering is a must.
    • Unity added new shaders to HDRP this year (Skin, Eye, Hair), but thus far, better results have been achieved with 3rd-party shaders.
    • The workflow is very complicated and time-intensive; most of the work must be done outside of Unity using 3rd-party tools.
    • Mocap animation, voice syncing, and emotions are, again, complex and must be done outside of Unity.

    Less-than-ideal contributors:
    • Unity's current approach to a character pipeline workflow is "we put some stuff on GitHub, good luck".
    • Subdivision limitations for high polycounts hurt the overall result when using high-end cinematic models. Models need to be <250k polygons for cinematic cutscenes, <80k for gameplay.
    • Lighting is often a contributing factor in whether models look "wooden" or lifelike when going between indoor and outdoor scenes. While Unity is working on better lighting, the current realtime global illumination package doesn't have the same dynamic range as other engines. This often requires varying the scene or lighting setup to accommodate the character model.
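
    On the micronormals point: these are usually a tiled, high-frequency detail normal map blended over the base normal in the shader. A rough sketch of one common blend, a UDN-style approximation (plain Python with tangent-space normals as tuples; names are illustrative):

```python
import math

def normalize(v):
    m = math.sqrt(sum(x * x for x in v))
    return tuple(x / m for x in v)

def blend_micronormal(base, detail):
    """UDN-style detail normal blend in tangent space: add the
    detail map's XY perturbation onto the base normal, keep the
    base Z, then renormalize."""
    return normalize((base[0] + detail[0],
                      base[1] + detail[1],
                      base[2]))

# A flat base normal perturbed by a slight micronormal tilt
n = blend_micronormal((0.0, 0.0, 1.0), (0.1, 0.0, 0.98))
print(tuple(round(x, 3) for x in n))
```

    The payoff for skin is at the specular level: the micro-perturbation breaks up the highlight so skin reads as pores rather than plastic, without needing that frequency in the baked base normal.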
     
    OBiwer, Faikus, KRGraphics and 2 others like this.
  43. Qleenie

    Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    783
    This sums up the current state pretty well. The only thing I have not encountered is better 3rd-party shaders for skin and such; I'd be interested to know what those could be.
    Quality of indirect lighting and easy availability of detailed models are probably the biggest issues in reaching the level of Unreal Engine.
     
  44. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    For skin, hair, and eyes, I've gotten some amazing results, it just requires a lot of work outside of the engine
     
  45. impheris

    impheris

    Joined:
    Dec 30, 2009
    Posts:
    1,518
    I don't see why Unity needs this (I'm old school; for me this is the logical way). Unity is not a full 3D package for creating everything; it is a game engine. It has tools for animation and such, but even Unity says those tools are basic or limited. Of course, if they released a tool to do that in Unity, I guess it could be a good option. Can you enlighten me, please?

    Maybe that is true, but if those features are not implemented in Unity, even as beta or experimental or whatever, it makes sense that the documentation is limited, doesn't it?

    Can you please also explain that? I just put in a model with more than 1,000,000 polys and it works fine. Also:
    O.O Do you need more than 80,000 polys for an object in a game? Do you know a game that uses more than that?
    How many polygons does the woman from Enemies have?

    [Attached images: ad.jpg, asd.jpg]
     
    KRGraphics likes this.
  46. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    All of this. But I'll add some of my own points:

    For my character projects, things like microtextures for skin haven't been needed, since the mesh is authored in ZBrush with HD Geometry and baked at 8K resolution, with cavity maps generated in Substance Designer.

    Yes. And the level of quality with HDRP is definitely up there if you light your model correctly and tweak your diffusion profiles.

    And a lot of the 3rd-party shaders often look better and have more control. The Unity shaders in 2022 have too many extraneous options, and many of them are unneeded.

    That is, unfortunately, the nature of creating high-quality models, especially characters. A lot of Unity artists are looking for shortcuts to high quality, and that's not how this works.

    A lot of my characters use many programs to get the look I need, such as Substance Painter and ZBrush to create the textures. Hair is another challenge to get through, especially with specialized software like Fibershop, and you need to have well made shaders for them.

    Yep. See above.

    They need to stop that crap. Epic Games doesn't do this with their engine; they are UP FRONT about what works and what doesn't.

    For my current project, my in-game character is about 255k triangles, and most of it is his face and hair. Things like subsurface scattering thrive on higher polygon counts, especially with transmission, where YOU WILL get artefacts on things like ears.

    Also, for deformation and facial rigging, having a denser mesh helps. This is why I LOVE Nanite and Lumen, and Unity needs to find an answer to both, and soon. I often have very detailed environments and characters, and I want them to look good.

    No more unfinished features... I can't even work on my game's lighting because the GI solution feels tacked on. I love APV; get that finished! Get a fleshed-out GI solution that is FAST, easy to set up, and looks really good. As an artist, this is important to me.
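
    On the transmission artefacts mentioned above: real-time transmission is commonly approximated from a baked thickness map, attenuating light exponentially with thickness, so thin features like ears transmit the most light and show undersampled geometry first. A rough sketch of that falloff (plain Python; the absorption constant is illustrative, not HDRP's diffusion profile math):

```python
import math

def transmission(thickness, sigma=8.0):
    """Approximate light transmitted through a surface of a given
    thickness (0..1, as read from a baked thickness map), using
    simple exponential attenuation; sigma is a made-up absorption
    coefficient standing in for a per-channel profile."""
    return math.exp(-sigma * thickness)

# Thin geometry (ear lobe) vs thick geometry (skull)
print(round(transmission(0.05), 3))  # 0.67  -- ears glow strongly
print(round(transmission(0.8), 3))   # 0.002 -- thick areas pass almost nothing
```

    Because the result varies steeply where the mesh is thin, any error in the interpolated thickness or normals there gets amplified, which is consistent with denser ear geometry reducing the artefacts.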
     
  47. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    That's too dense, even for me. Unless you're using a low resolution proxy, that's excessive and its topology doesn't feel right
     
  48. impheris

    impheris

    Joined:
    Dec 30, 2009
    Posts:
    1,518
    i'm not using it, it was just a test
     
    KRGraphics likes this.
  49. CenobiteShadoweaver

    CenobiteShadoweaver

    Joined:
    Nov 1, 2022
    Posts:
    44
    Subsurface is a must for certain materials. When porting in new characters from other editors, the big issues are hair and eyes, as different models require different settings. Hair is usually fixed with opacity settings; it can be a quick change from particle surfaces to particle unit when using hair in HDRP. Eyes are always an issue in every editor and must be fixed in most cases, again often via opacity settings; in some cases subsurface is required for the different layers of the material, depending on the eye used. The Ghost Shell eyes for the DAZ3D Gen 8 model break the eye up into many segments, each requiring its own shading, which is more than the standard eye. Editors like Unity and Unreal have issues picking up these details, and in most cases they have to be fixed manually by changing settings within Unity to get materials to show up. Neither editor shows textures perfectly on import; both require tinkering. Even Unreal loses a lot of the quality other editors produce, but by the same token, Unity and Unreal both run frames at high rates for the quality they produce, so I don't expect Movie Render Queue-grade output, because that's what the posted picture looks like: Movie Render Queue for Unreal, which is yet another plugin for ray tracing and isn't easy to set up properly; in most cases you get a grainy look to the graphics if the settings are not right.

    Most characters don't look that good if you just open Unreal and check one in the overview. The Twinmotion characters are quite good in Unreal, but again that's not the Unreal editor: they are ported from Twinmotion and a plugin runs them in Unreal, just like most other good props or characters need to be supported by a bridge from another editor that fills in what Unity or Unreal doesn't supply. The character in the main post looks like a Pixel Quix character, again not a standard low-res Unreal character that you'd find on their marketplace.
     
    KRGraphics likes this.
  50. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,446
    Oh, thank God. The most I can import for a base mesh is a level 2 subdivision from ZBrush, and it's usually the final model