
Why can't Unity HDRP correctly render AAA looking character models?

Discussion in 'High Definition Render Pipeline' started by cloverme, Sep 9, 2021.

  1. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    132
    Just for my own education, can you please show me the results of a Daz character in Unreal compared side by side with Unity?

    Thank you.
     
  2. blueivy

    blueivy

    Joined:
    Mar 4, 2013
    Posts:
    622
    I agree, they look great!
     
    ARealiti likes this.
  3. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,453
    These are looking good. Unity needs to get on the ball with an ACTUAL GI solution and Path Tracing
     
  4. Qleenie

    Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    832
    The ambient light looks good, but I don't quite understand what you are using. An HDRI skybox?
     
  5. Qleenie

    Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    832
    I completely agree with this. On the Unreal side it seems much better with Lumen, but very expensive, too. I also did some ambient stuff myself, but it's only SDF-based AO so far, which works well, though yours looks better. May I ask what technique you used, and how taxing it is on the GPU?
     
  6. blueivy

    blueivy

    Joined:
    Mar 4, 2013
    Posts:
    622
    Are you shining a bounce light on each of your characters? I think that's a big part of why they look nice in ambient lighting!
     
  7. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,453
    I've been looking for a GI solution and I might go with Bakery, which is as close to Beast as I'll get.
     
  8. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,453
    With path tracing, you can get the lighting from outside to affect interior environments, especially with careful placement of reflection probes in the room. For my future projects, I won't use real-time raytracing because the performance hit isn't worth it.
     
  9. Qleenie

    Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    832
    Isn't this "just" additional positioned area / spot lights in the Heretic demo, like classic key-light character lighting? The Enemies demo has SSGI / RT on top of this, but also fakes all reflections / bounces with additional shadow-casting lights (mostly area lights).
     
  10. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,453
    I think the Enemies demo used Adaptive Probe Volumes, and while that is a quick solution for lighting your scenes, nothing beats a full path tracer.
     
  11. Qleenie

    Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    832
    The Enemies demo is basically throwing everything at it: APV (no lightmaps), SSGI (which you can choose to have ray traced or not), and a highly complicated lighting setup for the character, which changes a lot during the scene and consists of very expensive area lights with PCSS shadows, faking the reflections / indirect light from e.g. the chess board.
     
  12. Kreshi

    Kreshi

    Joined:
    Jan 12, 2015
    Posts:
    443
    According to the documentation, enabling SSGI disables the lightmap and light probe data contribution:

    https://docs.unity3d.com/Packages/c...ion@15.0/manual/Override-Screen-Space-GI.html

    [Attached screenshot of the documentation note: upload_2023-7-14_22-6-15.png]

    This should include APV too, right? Or did I misunderstand something here?
     
  13. Qleenie

    Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    832
  14. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,453
    I've tested SSGI, and while it looks good, it shouldn't be a replacement for an actual GI solution; in certain scenes it falls apart.
     
  15. Qleenie

    Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    832
    This is the case with all screen space solutions; I find it good enough for most things, but as I said only in combination with APV + Reflection Probes. But as I do mostly VR, the performance hit is quite high.
     
    KRGraphics likes this.
  16. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    132
    This debate on realism vs. what looks good in practice made me laugh. Game of Thrones Season 8, Episode 3 was so real no one could see it o_O

     
    Last edited: Jul 17, 2023
  17. MrBigly

    MrBigly

    Joined:
    Oct 30, 2017
    Posts:
    221
    * waiting for a rendering engine that replicates how the human eye adjusts to changes in lighting *
     
    ARealiti likes this.
  18. HIBIKI_entertainment

    HIBIKI_entertainment

    Joined:
    Dec 4, 2018
    Posts:
    591
    In terms of exposure or pupil adjustments, either can absolutely be done in Unity: Shader Graph and the auto camera exposure clamps can be driven parametrically together, given an exposure value, and adjust together or independently depending on which effect you need.

    In many cases, though, it truly depends on your use case.

    For scientific purposes, yeah, probably not the best idea.
    Physically based? It depends on how physically based you need it. @ARealiti hit the nail on the head with the GoT scenes. Shows are getting darker for both technology and art direction reasons, and the response to those choices is always going to be mixed, since we can't all align to the same art direction goals.

    With game projects, an extra layer of gameplay design choices would be made to compensate for certain choices.

    For example, you may want a real-time time of day in your project, but night may not fit the game design, and user feedback might be "too dark, can't play".

    On the other hand, you may intentionally want this, to add a layer of organic difficulty to your game.

    So with the eye thing, you could set the dark-to-light and light-to-dark exposure values to match the cones and rods in your eyes.

    Rods especially, since they're more for light values; cones are typically for colour.

    So in that sense, a saturation post effect could be included.

    Setting dark-to-light to 5-8 minutes and light-to-dark to around 40 minutes is doable; however, you would likely want that choice aligned with your goals and gameplay (if it's a game). If not, it may be generally more accurate, but it could come across as a negative user interaction.
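
    To make that concrete, here's a minimal sketch of the exposure-adaptation side (assuming HDRP's Exposure and ColorAdjustments volume overrides; the adaptation-speed and saturation values are illustrative guesses, not calibrated rod/cone timings):

```csharp
// Sketch: asymmetric auto-exposure adaptation plus a saturation drop, driven through a
// Volume profile. Field names come from HDRP's Exposure / ColorAdjustments overrides.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

[RequireComponent(typeof(Volume))]
public class EyeAdaptationTuning : MonoBehaviour
{
    void Start()
    {
        VolumeProfile profile = GetComponent<Volume>().profile;

        // Automatic exposure with different speeds per direction (values are EV/sec,
        // so eye-like minute-long adaptation means very small numbers here).
        if (!profile.TryGet(out Exposure exposure))
            exposure = profile.Add<Exposure>(true);
        exposure.mode.Override(ExposureMode.Automatic);
        exposure.adaptationSpeedDarkToLight.Override(0.5f);   // brightening: relatively fast
        exposure.adaptationSpeedLightToDark.Override(0.05f);  // darkening: much slower

        // Rods dominate in the dark, so pull saturation down a little as a rough stand-in.
        if (!profile.TryGet(out ColorAdjustments color))
            color = profile.Add<ColorAdjustments>(true);
        color.saturation.Override(-30f);
    }
}
```

    In practice you'd drive those two speeds (and the saturation) from gameplay state rather than hard-coding them, per the "align it with your goals" caveat above.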
     
  19. MrBigly

    MrBigly

    Joined:
    Oct 30, 2017
    Posts:
    221
    My bad, didn't word it well to get my point across.

    * waiting for a rendering engine that [automatically, without any adjustments, scripting, components, probes, volumes, or other tweaking of any game object, textures, materials, lighting, scene, sub scene, or space] replicates how the human eye adjusts to changes in lighting *

    You have to admit, Unreal comes much closer to this today than Unity will any time soon.

    HIBIKI_entertainment, I like your point with respect to the eye adjusting from bright to dark to bright again over time. That is the only factor that I would want an adjustment for in a game to be playable without having to wait.

    If we had such a rendering engine available where the only tweaking or adjustments we would need to add are those to violate natural eye perception, then think of the enormous time and resources a development team could save.
     
    Last edited: Jul 17, 2023
    HIBIKI_entertainment likes this.
  20. Qleenie

    Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    832
    You probably mean simulating light, which is the much harder part? Without light simulation, there is nothing to adjust to ;) My guess is that the only way to get highly realistic lighting is by using AI as a post-processor, but we are not there yet in terms of GPU performance to do it in real time. Until then, we need to live with screen-space-based solutions, or ray / path tracing as the next step. And Unreal is ahead of Unity in this regard.
     
    cloverme likes this.
  21. cloverme

    cloverme

    Joined:
    Apr 6, 2018
    Posts:
    189
    Unity itself acknowledges it has a "light problem". They've been trying to address it by doing a few things: they brought back Enlighten to HDRP and they added Adaptive Probe Volumes... but you also have to try to balance out shadows and reflection probes, try to get rid of light leaks, and do a complicated performance-balancing act to make it "gameable". Unity is trying to move into raytracing, but it has a high bar for hardware, and Unity's solver, even in the latest alpha versions, feels miles away from Blender's.

    A few asset devs have been working on a better lighting solution for Unity for years, trying to implement the two critical components needed to bring "Unreal-like" lighting to Unity: SDFs (signed distance fields) and screen-space indirect lighting with reflections. That's the "recipe" for Lumen-like lighting. A few devs are getting close to a replacement lighting system, but it's probably another couple of years off at best. You know it's bad when Godot 4 has better lighting than Unity 2022.

    Sadly, there's nothing in the Unity roadmap to date to bring "Lumen-like" RTGI or SDFGI to Unity. Like Qleenie states, raytracing is probably the way forward for a quality-focused solution.

    I haven't seen much in the way of ray tracing support from Daz or Reallusion into Unity. It's all based on PBR at this point. Being able to bring a character into a raytraced setup in Unity would be an interesting avenue to explore, but likely not an especially performant one.
     
    Qleenie likes this.
  22. MrBigly

    MrBigly

    Joined:
    Oct 30, 2017
    Posts:
    221
    I'm surprised you are comparing Unity to Blender TBH. Does Blender's open source community really have a lead over Unity?
     
  23. cloverme

    cloverme

    Joined:
    Apr 6, 2018
    Posts:
    189
    I suppose it's a bit of an apples-to-oranges comparison, since Blender isn't focused on gamedev. I think Unity is chasing that "we want to be the tool for video production!" angle, knowing that, for example, Blender with Octane is used for a lot of video production; it was used for the opening credits of Westworld and things like that. You certainly get that feel looking at the Enemies demo, for sure.

    I think, though, that in terms of making a game with a cinematic-quality character that moves from an outdoor to an indoor environment, Unity lighting is very, very difficult. Maybe Unity 2023.1 raytracing would handle this better? I went through the ray tracing tutorial for Unity (granted, it's over a year old) but found it to be... complex and exhausting to set up and use just to get good-quality light.

    What's also interesting is this: https://portal.productboard.com/uni...ng-visual-effects/tabs/64-global-illumination and the mention of "Precomputed Realtime GI" - Precomputed Realtime GI / Dynamic APV

    Feels like a first step toward a "Lumen"-like lighting system for Unity.
     
    MrBigly likes this.
  24. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    132
    Most film cinematographers and photographers don't like real light; they change it, especially in heavy sunlight or low lighting. Why bother to apply real lighting at high simulation cost only to have to control it again at even higher cost? Just asking for a friend.

     
  25. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,453
    This is extremely useful for lighting my levels and I do my best to stay away from chromatic aberration and bloom lighting. I also purposefully desaturate my background a bit in game to give the illusion of depth and distance.
     
    ARealiti likes this.
  26. Unifikation

    Unifikation

    Joined:
    Jan 4, 2023
    Posts:
    1,073
    TL;DR don't wait for improvements in this area from Unity.

    Aside from the fact that Unity appears (at best) to be narcissistic, schizophrenic and manic depressive (in terms of direction, endeavour and objective), it's Unreal they were chasing.

    I think they've given up on that, and they're currently figuring out a way to non-apologise their way back to focusing on lower tiers of gaming and visualisation than Unreal is at, let alone where Unreal will be after its next rounds of improvements and acquisitions.

    Keep in mind, Unity and their new (part) owner are basically out of funds for acquisitions. They can no longer buy their way to the next thing, nor next revenue source.

    That means they need to find money and talent to integrate what they already own in ways that makes money (possibly) sometime in an increasingly uncertain creative future (superior competitors, economic crisis, AI etc).

    Or, they need to start selling things and cutting staff, and start a new hype.

    We've already seen the new hype (integration of AI) and three rounds of layoffs. That's the direction now. They will not, despite the noise around that demo, be trying to get incredible reality and immersion out of visualisation and realtime. It's too hard and has too little benefit: the market's saturated with better competitors, there was never much profit there, and it's all custom, project-based work within somewhat declining industries that are now without writers for an uncertain period of time - all things they're now beginning to realise.

    Having said all that, the options for stylistic rendering (non-photo-real, or NPR) are quite good in Unity... ironically, easiest with Built-in, and quite performant, too.

    Not sure if they'll focus on that because they probably can't conceive of a way to monetise it.

    And Unity needs money. Desperately.

    To that end, I think they'll sell their interest in Weta.
     
    cloverme likes this.
  27. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    132
    Without a light diffusion filter, and without the ability to apply different diffusion filters to different things for a specific light (particularly harsh sunlight), realistic sunlight is too harsh for good lighting, especially on characters - in Unity, in Unreal, and in the real world. The same problem exists in reverse for dark scenes: raising exposure to make things visible overexposes the other very bright things in the scene. GoT showed this - the individual characters could not be lit individually within realistic lighting for the whole scene, so no one could see anything other than the fire. The real issue is that a filter (or exposure adjustment in dark scenes) on the camera is not what's needed; what's needed is a filter (or exposure adjustment) on some of the lighting, affecting only some of the things hit by that light - usually the characters/people, or whatever the scene/game wants the viewer to focus on. As the last video I posted showed, even a camera filter won't solve the diffusion issue for the specific things that need focus in direct sun, in very dark scenes, or anywhere in between; those need different filtering from the overall scene filtering of the sunlight (or exposure in dark scenes). Hence the physical shade/diffuser rig shading the individual in the video in my previous post, and the individual character lighting that couldn't be solved in the real world in the very dark GoT episode.

    Sexy new competitors


    Which one is best? So many in the film industry to choose from.


    And even discussions about their overuse
     
  28. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    132
    An interesting series of articles on ArtStation about UE4 lighting, which can also be applied to Unity. Again, this shows that in Unreal, as well as in the real world, lighting is a complex issue that cannot be solved out of the box, for best quality / artistic effect, as one perfect "real lighting fits all" situation. Diffusion is mentioned again in these articles as a real-world effect that softens the sunlight: "With cloud cover, because clouds are transparent, the light passes through them in a diffused manner. The light rays that hit the clouds bounce around inside of the cloud, then emerging in multiple directions. This diffusion softens the sunlight, turning a small hard light (the sun) into a large soft one (the whole sky)"

    https://www.artstation.com/blogs/el...h-implementation-other-lessons-learned-part-2
    https://www.artstation.com/blogs/el...h-implementation-other-lessons-learned-part-3
    https://www.artstation.com/blogs/el...h-implementation-other-lessons-learned-part-4

    My personal opinion is that the most important thing for Unity would be diffusion of light, and the ability to expose / diffuse light on a per-shader basis for individual objects, to varying degrees, with a set of these settings on the shader for each light in the scene that might hit the surface. I have hunted for shaders that do this but cannot find them. I have tried to use the Lit and other shaders' settings to get the desired effect with no luck, so I end up darkening textures and using other settings as best I can, but this breaks down when characters move between lighting scenarios in the scene.
     
    Last edited: Jul 22, 2023
    impheris likes this.
  29. MrBigly

    MrBigly

    Joined:
    Oct 30, 2017
    Posts:
    221

    Wow, that was succinct.

    It makes me wonder if I should be developing my trilogy's first title with Unity or switch now to Unreal? I don't plan on realism in the rendering, but that fact doesn't take into account the other topics you touched on.
     
  30. blueivy

    blueivy

    Joined:
    Mar 4, 2013
    Posts:
    622
    How performant are your per-character lights? It seems like they all cast shadows as well?
     
    ARealiti likes this.
  31. HIBIKI_entertainment

    HIBIKI_entertainment

    Joined:
    Dec 4, 2018
    Posts:
    591

    For HDRP and interior scenes in general, most of this practice should already be known if you're looking down archviz or general baking routes. Essentially, for interior scenes you should be able to achieve a similarly high-fidelity look with ease in both engines. The workflows are different, of course, but in terms of real-world lighting data you can map and tweak that to HDRP across texturing, scaling, lighting and cameras.

    In terms of light diffusion, the majority of this comes from Lightmass in Unreal. In Unity it might not be as "accurate", but setting the ranges and radius of your lights correctly (if you're using an already existing spec) should give you your base shadow diffusion, especially with PCSS shadows. From there it depends on whether you're using a baked solution (lightmapping vs. APV) or a screen-space solution. Lighting indoor scenes, you also have the extra benefit of path tracing if you're using DX12; this makes for a blazing fast A:B test between offline and real-time rendering that you can use to validate or test real-time decisions. It works particularly well for interior scenes, as the majority of what you will be using is supported by the path tracer.

    This begins to get more complex as you add bigger scenes, dynamic lighting, and internal and external zones; your end result's final fidelity may need more complex ideas or setups to get close to the "on its own" interior scene.

    Lighting a room with artificial lighting versus a room with a window, for example, still throws off some beginners in Unity.

    Lighting is always going to be worth studying, especially if you learn it from another medium, like a real-world lighting engineer or a gaffer; for final rendering results, DP experience is also beneficial. Most of these skills translate very well into HDRP and other engines too, normalised values or not. You'll understand and appreciate lighting and placement more.
    You got this.
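
    To make the A:B idea concrete, here's a minimal sketch of toggling HDRP's Path Tracing override from script. It assumes a DX12, ray-tracing-capable setup, and the field names on the PathTracing override may differ between HDRP versions, so treat them as assumptions to verify:

```csharp
// Sketch: flip between the real-time lighting result and a path-traced reference frame
// on a key press, as a quick offline-vs-realtime validation pass for interior scenes.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

public class PathTracedABTest : MonoBehaviour
{
    public Volume globalVolume;   // the global Volume that holds (or will hold) the override
    PathTracing pathTracing;

    void Start()
    {
        if (!globalVolume.profile.TryGet(out pathTracing))
            pathTracing = globalVolume.profile.Add<PathTracing>(true);
        pathTracing.enable.Override(false);
        pathTracing.maximumSamples.Override(512);  // accumulate a cleaner reference image
    }

    void Update()
    {
        // Press P to swap between the real-time setup and the path-traced reference.
        if (Input.GetKeyDown(KeyCode.P))
            pathTracing.enable.Override(!pathTracing.enable.value);
    }
}
```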
     
  32. HIBIKI_entertainment

    HIBIKI_entertainment

    Joined:
    Dec 4, 2018
    Posts:
    591
    Unfortunately, I'm not aware of a shader that works in that manner.

    Most of our shader work appends to PBR, so we have particular rulesets when we're creating specialised shaders, especially those that contain elements of magic or sci-fi; since those don't exist in reality, it's about striking a balance that can still be perceived as 'believable'.

    HDRP does, however, use per-pixel lighting, meaning that if you took a different approach - like creating local volume profiles with custom tone mapping curves - you could create a wide range of camera ND filters, and the per-pixel lighting could help create a more believable, curated lighting scenario.

    The 'diffusion' of bounced lighting is then calculated from indirect lighting (which it typically is), and, if you're looking for soft shadows for curated lights:

    You can additionally implement some more real-world techniques to achieve this (see the sketch after this list):

    - the inverse square law, and how distance from the subject changes both the intensity and the shadow falloff of a light
    - bounce cards and light blocking, which can help shape lighting and shadows
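
    A quick numeric sketch of the inverse square law from the list above (plain math, nothing HDRP-specific):

```csharp
// For a punctual light of intensity I (candela), illuminance at distance d metres is
// E = I / d^2 (lux). Doubling the light-to-subject distance quarters the light on the
// subject, which is why small distance changes read so strongly on a character.
public static class InverseSquare
{
    public static float LuxAtDistance(float candela, float distanceMeters)
    {
        return candela / (distanceMeters * distanceMeters);
    }
}

// Example: a 1000 cd source gives 1000 lux at 1 m, 250 lux at 2 m, ~111 lux at 3 m.
```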


    The HDRP demo scene may not seem like much at first glance, but the room-to-room exposure difference is actually a fantastic place to test these sorts of scenarios, especially custom tone mapping - the last two rooms in particular, since there's a large dynamic range between them.


    I did see some recent posts about APV GI contribution with non-static characters. I haven't really explored what was brought up there; my understanding was that dynamic objects and skinned meshes do contribute to GI, so it could be a bug for now. I'll certainly check this out in two weeks once I'm back from vacation, mind.

    The character lighting pipeline has its own particulars to deal with, but with lighting anchors to create decent lighting rigs and - if your project supports light layers with no issues (certain setups have light layers causing other translucency bugs) - you can achieve some pretty quick character setups, at least to test a few lighting conditions.

    Hope that gives your mind some fuel to chew on. I understand that answers aren't always straightforward in a digital space when trying to recreate real-world values.
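
    For the lighting-anchor idea, here's a minimal sketch of a per-character rig using only core Transform / Light APIs (an assumed setup for illustration, not HIBIKI's; restricting the rig's lights to one character via light layers is left out because the exact HDRP mask property varies by version):

```csharp
// Sketch: a pivot that follows the character and orients itself relative to the camera,
// with key and rim lights as children, so a portrait-style setup survives the character
// walking between very different scene lighting conditions.
using UnityEngine;

public class CharacterLightAnchor : MonoBehaviour
{
    public Transform character;        // usually the head or chest bone
    public Transform cameraTransform;  // the rendering camera
    public Light keyLight;             // child of this object, offset ~45 degrees off axis
    public Light rimLight;             // child of this object, behind and above the character

    void LateUpdate()
    {
        if (character == null || cameraTransform == null) return;

        // Follow the character, but orient relative to the camera, like a gaffer
        // walking the key light around with the dolly.
        transform.position = character.position;
        Vector3 toCamera = cameraTransform.position - character.position;
        toCamera.y = 0f;
        if (toCamera.sqrMagnitude > 0.001f)
            transform.rotation = Quaternion.LookRotation(toCamera.normalized, Vector3.up);

        // Keep both lights aimed at the character even if their offsets get animated.
        if (keyLight != null) keyLight.transform.LookAt(character);
        if (rimLight != null) rimLight.transform.LookAt(character);
    }
}
```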
     
    ARealiti likes this.
  33. HIBIKI_entertainment

    HIBIKI_entertainment

    Joined:
    Dec 4, 2018
    Posts:
    591
    Not a bad analysis, yeah. One thing renderers do that differs from the real world is that you have control of everything. Photographers can't control the sunlight or atmosphere for a given shot, only manipulate its casting and falloff by other means - filters, diffusers, the exposure triangle, flashes. Unity CAN control the sun and atmospherics, however, so there are a lot more control points; but of course this means setting everything up too, and it's a simulation, not reality, so not everything behaves 1:1 - and depending on the end results you require, it shouldn't need to be 1:1 either (adjustments for gameplay or art direction, for example).

    With HDRP in particular, though, the behaviour of the physically based components means you're less likely to be surprised putting in data. You can collect real-world lighting data, and if you give it the same measured inputs you should see a relative similarity. Of course, capturing reference is only half the battle of getting things 1:1, as you also need camera and environmental data; this is mostly where it takes a little more creativity or balance to match references.
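
    As an illustration of the "same measured inputs" point (a sketch assuming HDRP's physical camera properties and HDAdditionalLightData.SetIntensity with LightUnit.Lux behave as documented; the numbers are just a generic sunny-day reading, not data from this thread):

```csharp
// Sketch: feed metered real-world values straight into the physical camera and a
// directional "sun" light, rather than eyeballing intensities.
using UnityEngine;
using UnityEngine.Rendering.HighDefinition;

public class MeasuredLightingSetup : MonoBehaviour
{
    public Camera shotCamera;
    public HDAdditionalLightData sunLight;  // HDRP data on the directional light

    void Start()
    {
        // Exposure triangle as metered on location (Sunny 16-ish here).
        shotCamera.usePhysicalProperties = true;
        shotCamera.iso = 100;
        shotCamera.shutterSpeed = 1f / 125f;
        shotCamera.aperture = 16f;
        shotCamera.focalLength = 35f;

        // Sun intensity in lux, matching a light-meter reading of direct sunlight.
        sunLight.SetIntensity(100000f, LightUnit.Lux);
    }
}
```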
     
  34. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,453
    Lighting with Unity HDRP is a unique challenge, but it's not impossible. It is difficult to do without a proper lighting system (GI is required), and Unity is inevitably chasing that black race car that no one could catch (remember the game Ridge Racer on PS1?). I still remember how powerful Beast was, and I wish they could recreate something like that.

    Setting up raytracing and path tracing SHOULD NOT be a chore; it should be easily accessible to all users. Stop playing around and let's get this working correctly.
     
  35. HIBIKI_entertainment

    HIBIKI_entertainment

    Joined:
    Dec 4, 2018
    Posts:
    591
    I've been thinking for a few months that a broad overview of the Volume framework needs to be visible for newcomers. The objective of overriding is simple enough, but oftentimes I see client errors or just no insight into what the volume profiles etc. are for.

    I haven't personally had any issues setting up any of the tracing methods. It's more than ticking a box, sure - about 5 steps and a prerequisite - and you're essentially ready to go and tune either of them.

    I have also seen many people exploring features and getting that kind of "why does this not just work" reaction, mixing up activating a feature with tuning it; the latter is of course an iterative process that - with hope - scales well in the Volume framework.

    Maybe we can gamify setups.
    It might be a useful way to onboard people better into the processes.

    Like "prepare the pipeline for HDRP with forward pass and Alpha enabled buffers"

    Etc., etc.
    Plus understanding what the wizard is doing.
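
    For onboarding purposes, here's roughly what the override flow boils down to in script form (a sketch assuming HDRP's Volume / VolumeProfile / Exposure APIs; the priority and EV numbers are placeholders):

```csharp
// Sketch: a local, higher-priority volume that overrides only exposure while the camera
// is inside its trigger; everything else falls through to the global (priority 0) profile.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

public static class VolumeSetupExample
{
    public static void BuildInteriorOverride(GameObject interiorRoot)
    {
        // Local volumes need a trigger collider to define their area of effect.
        var box = interiorRoot.AddComponent<BoxCollider>();
        box.isTrigger = true;
        box.size = new Vector3(10f, 4f, 10f);

        var volume = interiorRoot.AddComponent<Volume>();
        volume.isGlobal = false;      // only applies inside the collider above
        volume.priority = 10f;        // wins over the global baseline volume
        volume.blendDistance = 2f;    // metres of blending at the boundary

        var profile = ScriptableObject.CreateInstance<VolumeProfile>();
        volume.profile = profile;

        // Override just the exposure indoors; nothing else is touched.
        var exposure = profile.Add<Exposure>(true);  // 'true' marks all parameters overridden
        exposure.mode.Override(ExposureMode.Fixed);
        exposure.fixedExposure.Override(7f);         // a rough indoor EV100 value
    }
}
```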
     
  36. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,453
    That should work. I'm not suggesting hand-holding, but it would help get the ball rolling for us Unity users. I've yet to use raytracing in a game, but the Progressive Lightmapper looked promising and I wish they'd kept it up to date.

    I miss Beast
     
  37. cloverme

    cloverme

    Joined:
    Apr 6, 2018
    Posts:
    189
    Also of interest: there are no tutorials or guides for putting a character into Unity ray tracing (skin / hair / eyes) - at least none that I can find by searching or on YouTube.
     
  38. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    132
    Could you show us some of your example renders of characters, using your experience with ray-traced SSGI and APV in a scene?

    I think everyone here would benefit from your excellent character render examples and how you achieved them with global illumination of the scenery.
     
    HIBIKI_entertainment likes this.
  39. MrBigly

    MrBigly

    Joined:
    Oct 30, 2017
    Posts:
    221
    Please correct me if I am wrong, but Unreal uses Lumen and Nanite to significantly reduce development effort while yielding much more realism in the scene, in lighting and terrain detail. I bring this up because Lumen is based on ray tracing.

    It is my understanding that realism is secondary. Primary is the ability to eliminate a hell of a lot of development work in modeling and scene development.

    Ray tracing may require much more performance from the GPU at this time, but perhaps it will be commonplace in 5 years, making true realism in scenes available to three-man indie studios on a short budget. For now I can see punting, but it has a future worth looking forward to.
     
  40. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    132
    I don't need Ray Tracing; my stuff already looks good enough. Feel free to show me some of your renders in Unity with Ray Tracing so I can see if it is better.
     
    Last edited: Jul 27, 2023
  41. impheris

    impheris

    Joined:
    Dec 30, 2009
    Posts:
    1,580
    Ray tracing will always look better than anything screen-space. Now, if you think your project looks good enough and you do not need RT, well, good for you, that is pretty cool, but that is your case and your opinion.
     
  42. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,453
    I just wish it were easier to set up... The last time I saw RT volumes being set up in Unity, it didn't look good and wasn't easy to do.



    Just found this about setting up Raytracing in Unity, and it's SIMPLE and QUICK to set up.
     
    Last edited: Jul 27, 2023
  43. MrBigly

    MrBigly

    Joined:
    Oct 30, 2017
    Posts:
    221
    I didn't mean to imply I was using ray tracing per se. I was just mentioning that it makes development much less work-intensive. As the video KRGraphics shared shows, ray tracing should provide adequate lighting without baking or other work to make the lighting resemble realism.

    Personally I wouldn't try to achieve realism, because you can't on most platforms out there, and that leads people who play the game to subconsciously note that you are trying to achieve realism and failing. Instead, a Halo 3 level of art and texture is more than adequate to immerse a player in a game AS a game and not distract them with the thought that you are failing to achieve realism with this detail and that detail and those details and ...

    Now when RT becomes the norm on 95% of the platforms and realism of models in such an environment is easily implemented with near perfection, then I would consider it, just to cut down on the work involved in developing models and scenes. To me it is not a matter of trying to achieve realism. Again that is a distraction for players at this time. For me it is development effort. Immersion and great memories are never achieved by graphics.

    edit:

    I just want to add one more point. If your game is trying to achieve realism, it is not a game, but a simulator. If you are trying to build a game, it is to have fun. Fun is found in a variety of emotions, the chief being exploration, competition/domination, and just the lure of a good story. In these contexts realism plays absolutely no part in achieving those emotional states that keep people coming back for the next title in the trilogy or the next match to be played.

    What is happening in the realm of graphics for Unity and Unreal is impressive, but it doesn't tell a story, it merely provides the vehicle to do so. And today's state of the art may get close to what you see at the theaters on the screen, but is only the vehicle. The story, the competition, the exploration, they can all be delivered on any number of vehicles.

    In fact, if you actually achieve what looks like a 60 fps television broadcast of actual humans, I would say it needs to be a cinematic feature or it will come across like a news broadcast. You would need outstanding actors - A-list actors - or the story will present like a B movie. A cartoon would be a better vehicle than a B movie.

    These are my thoughts, your mileage may vary...
     
    Last edited: Jul 27, 2023
    KRGraphics likes this.
  44. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,453
    Even in my productions, I don't really care about 4K gaming. I'd keep it at 1080p to keep the performance as high as possible and optimize my assets so I don't need things like DLSS and other dynamic resolution trickery, and have my game FILLED with content.

    I'm capable of producing high quality modelling and rendering, but graphics alone won't save a sparse looking game with horrible gameplay and performance.

    Raytracing is the shiny gem gamers are always shown first, and AAA developers know it. If your game doesn't run smoothly at 1080p WITHOUT any upsampling or AI tricks you need to get back to work.

    Also, Raytracing in a game should be the cherry on top when it comes to game development. Way too many AAA developers (developers period) are putting on the frosting (focusing too much on graphics), before the cake is even finished (the base game mechanics, including optimisation).
     
    Last edited: Jul 28, 2023
    MrBigly likes this.
  45. ElevenGame

    ElevenGame

    Joined:
    Jun 13, 2016
    Posts:
    144
    @ARealiti your characters look really good, but when it comes to the overall light within the room, I personally feel some kind of bounce light would add a lot. To me it seems too dark for a room lit by many candles plus a lot of direct sunlight coming in. If it were my game, I'd add some GI or APV to that in a second, but as I said, it still looks good, and it's easier to be more dynamic with real-time light, so use that to your advantage.
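
    One cheap way to fake that bounce without GI or APV (an assumed approach for illustration, not ARealiti's actual setup) is a dim, shadowless fill light parked where the sun patch lands, tinted toward the floor colour:

```csharp
// Sketch: a low-intensity, non-shadow-casting point light that stands in for the light
// the sunlit floor would bounce back into the room.
using UnityEngine;
using UnityEngine.Rendering.HighDefinition;

public static class FakeBounceLight
{
    public static Light CreateAt(Vector3 sunPatchPosition, Color floorTint)
    {
        var go = new GameObject("Fake Bounce Fill");
        go.transform.position = sunPatchPosition + Vector3.up * 0.5f;

        var fill = go.AddComponent<Light>();
        fill.type = LightType.Point;
        fill.color = floorTint;            // bounce light picks up the surface colour
        fill.range = 6f;
        fill.shadows = LightShadows.None;  // fill should never cast its own hard shadows

        // HDRP intensity in lumens; keep it well below the key light so it reads as bounce.
        var hdData = go.GetComponent<HDAdditionalLightData>();
        if (hdData == null) hdData = go.AddComponent<HDAdditionalLightData>();
        hdData.SetIntensity(800f, LightUnit.Lumen);

        return fill;
    }
}
```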
     
    ARealiti likes this.
  46. impheris

    impheris

    Joined:
    Dec 30, 2009
    Posts:
    1,580
    Nope... the level or quality of hyper-realistic graphics has nothing to do with the genre or the level of fun of a game.
     
  47. impheris

    impheris

    Joined:
    Dec 30, 2009
    Posts:
    1,580
    Right now I'm not working on any project with realistic graphics. I do have an old project testing HDRP on an old laptop; in that project I was using some lightmaps (light bounces) and some minimal SSR (because SSR in Unity is almost useless). Feel free to check it out if you want.

    I did not know you were talking about "direct character lighting"; in fact, I do not know exactly what you mean :/
    EDIT: I guess you are talking about area lights...

    Why do I need to show you my work and not someone else's work? What is the difference? We are talking about a general topic in 3D art, something that is also well known... But from what I can see in your images, some light bounces could definitely improve your work. Then again, it is a matter of taste and opinion, I guess...
     
  48. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,453
    Looks good. Once you get some light probes in there, you should be good, even without real-time raytracing. I still remember the days of placing light sources and faking lighting, and even if those days are behind us, raytracing should be used sparingly. When I hear of even an RTX 4090 STRUGGLING on a lot of games, I lean toward blaming the development team for not optimizing their in-engine assets. No amount of DLSS and upsampling techniques will mask a lack of due diligence in optimization.

    Why do you think Doom Eternal runs SO WELL even on lower-end hardware? Because the id Tech 7 engine scales efficiently, and for a fast-paced game, raytracing is unnecessary.

    The old adage "just because you CAN, doesn't mean you SHOULD" comes to mind.
     
  49. MrBigly

    MrBigly

    Joined:
    Oct 30, 2017
    Posts:
    221
    Let me try again. If your goal is hyper realism, then it is a simulator. If your goal is fun, then it is a game.

    If you say you can have both goals, I would agree, but I don't think someone will take seriously a poor game that looks hyper-real just because that was the goal.

    I think my post was pretty clear that graphics don't define or make the game; fun does.

    Well, anyways, I may not be using the best words to explain myself........
     
    Last edited: Jul 28, 2023
  50. MrBigly

    MrBigly

    Joined:
    Oct 30, 2017
    Posts:
    221
    In this pic, the front of the guy on the left looks way too dark. It seems to me that this is not what you want in a game, am I correct?