
Can you tell which game engine is being used just from a screenshot?

Discussion in 'General Discussion' started by Arowx, Apr 9, 2017.

  1. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
Or do different game engines' lighting/shadow calculations look different enough to be detectable in a scene?
     
  2. Adam-Bailey

    Adam-Bailey

    Joined:
    Feb 17, 2015
    Posts:
    232
    Depends on the game.

    For example using the default scenes as a base there are tell-tale indicators that give something away as being UE3/4, IdTech 4/5, or Unity based. When they show up you can be almost certain.

    At the same time, some games will shake things up enough that those signs aren't around, which makes it hard or impossible to tell from a screenshot.
     
  3. Andy-Touch

    Andy-Touch

    A Moon Shaped Bool Unity Legend

    Joined:
    May 5, 2014
    Posts:
    1,485
If a project isn't using the default scene setup (Skybox, Lighting, Image Effects, etc) it's usually quite difficult to tell which engine it is using. For example, would you have guessed that these screenshots are Unity? ;) https://www.artstation.com/artwork/rB2qL
     
    chelnok, HolBol, frosted and 3 others like this.
  4. Schneider21

    Schneider21

    Joined:
    Feb 6, 2014
    Posts:
    3,512
    I really hate when I get baited into Arowx threads, but I mean... come on.

    Can you tell if an image was edited in Photoshop or GIMP? Can you tell if a song was created in Garage Band or FL Studio? Can you tell if a dish was washed with Dawn or store brand detergent? Can you tell by watching a film if the director is right- or left-handed?

    If a developer is lazy and uses obvious default settings for something, or included assets... maybe? But the lighting/shadow consideration is ridiculous. So much more goes into lighting a scene than just the engine behind the scenes. It's possible to make a weak lighting engine look good with the right style, and a highly advanced engine look horrible with crap textures.

    Also, who cares?
     
    Teila likes this.
  5. ZJP

    ZJP

    Joined:
    Jan 22, 2010
    Posts:
    2,649
Generally yes. There is something 'floating around the light' that gives it away. That said, it is increasingly difficult to tell from a screenshot.
     
  6. Kiwasi

    Kiwasi

    Joined:
    Dec 5, 2013
    Posts:
    16,860
    I don't have the eye for it. But as a coder I can identify a Unity game in about half a second from its file structure.
     
    landon912, frosted and Ryiah like this.
  7. Meltdown

    Meltdown

    Joined:
    Oct 13, 2010
    Posts:
    5,822
    Unity's terrain is a dead giveaway.
     
    HolBol and Billy4184 like this.
  8. Lockethane

    Lockethane

    Joined:
    Sep 15, 2013
    Posts:
    114
UE3 games, for the most part, have that watery look. Also, walking around ITSEC before everyone started moving away from CryEngine, you could tell from the water a mile away.
     
    theANMATOR2b likes this.
  9. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    6,025
That's a great looking scene, and I think it shows how much the gap has narrowed (at least in terms of indoor stuff) between Unity and some of the other engines, yet I can still tell the difference. Especially if you take archviz scenes (which are generally low on post-processing effects) rendered in Unity and compare them to other engines, there's less cohesion, a sort of stark transition in the lighting when looking between occluded and unoccluded areas. I have no idea if it's the case, but my best guess is that Unity's diffuse lighting calculations are lower quality (maybe fewer iterations or something like that) than some of the alternatives, making the lighting less smooth.

    Anyhow, despite the fact that this issue bothers me, I'm pretty happy with the engine in general, and there's probably a lot of stuff you guys have to do to keep things running on every which device and platform, but it sure would be good to have lighting on par with UE or Cryengine, at least as a choice.
     
  10. ikazrima

    ikazrima

    Joined:
    Feb 11, 2014
    Posts:
    320
    Older engines like Quake & Source - yes. UE3 also included, they have that shiny, plastic look.
    Modern engines - harder/impossible
     
  11. Frpmta

    Frpmta

    Joined:
    Nov 30, 2013
    Posts:
    479
3D games, yes.
Unreal Engine 4 has that annoying specular and the lighting looks excessively 'round and volumetric', Unity has that non-sharp, blurry-textures look no matter how high resolution the textures are, and CryEngine has that really hard, sharp look with pretty high contrast.

Stylized games: I just had to look at Firewatch's shadows and immediately knew that it was made in Unity haha.

I can also tell when they are made in Unity when playing them because of the ugly irregular frame-pacing issues that cause stutter no matter how consistent your framerate is.
     
  12. DragonSAR2013

    DragonSAR2013

    Joined:
    Apr 26, 2013
    Posts:
    77
It is probably easy to tell for mobile games, since most mobile games are developed in Unity. :)
Unity rules mobile games.
     
  13. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
That's actually very likely, especially if you're talking about this screen:

Unity's specular highlights have a very specific/unique look to them, and the same applies to reflection probes. This scene looks like it might be using the standard Unity shader, so Unity would be fairly high on the list of candidates.

In the later screens post-processing improves the situation, but it may still be possible to guess the engine.

Now, if the guys replaced the standard shader completely with their own solution, guessing the engine would be much harder.
     
  14. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,044
    That's an awesome reel. Really shows how much Unity has improved in the last few years!
     
  15. theANMATOR2b

    theANMATOR2b

    Joined:
    Jul 12, 2014
    Posts:
    7,790
    Can't tell between engines, but it seems the lighting/shadows are giveaways to other devs.

    I have a weird ability to recognize the software used to create 3D models. Mostly coming from Maya, Blender or Max, usually when the model is still untextured.
     
  16. Frpmta

    Frpmta

    Joined:
    Nov 30, 2013
    Posts:
    479
I can tell when a textured organic model is made in ZBrush vs other modeling software. Mostly to do with how artists go crazy with detail when modeling in ZBrush compared to normal polygon modeling, and end up baking all of it into textures that can't convey its former depth/volume :D

    I remember you also made a post once about how animations made in XYZ versus those made in Quaternion can look different. What was that about?
     
    theANMATOR2b likes this.
  17. Rodolfo-Rubens

    Rodolfo-Rubens

    Joined:
    Nov 17, 2012
    Posts:
    1,197
    Awesome!!
    Escape From Tarkov is also another good example:
     
    frosted likes this.
  18. Frpmta

    Frpmta

    Joined:
    Nov 30, 2013
    Posts:
    479
It has that typical Unity blurry look, made much more noticeable by the AA solution being used on the vegetation.
Other engines' forests look much sharper.
Yes, I am aware that image is a JPEG, but I have noticed the sharpest screenshots of that game are always extremely downsampled, and the low-compression JPGs at 1920x1080 still have that unnatural blur that requires solutions like the Beautify asset to fix.
     
    Last edited: Apr 12, 2017
    Billy4184 likes this.
  19. Kronnect

    Kronnect

    Joined:
    Nov 16, 2014
    Posts:
    2,906
  20. Rodolfo-Rubens

    Rodolfo-Rubens

    Joined:
    Nov 17, 2012
    Posts:
    1,197
    Maybe they are still using the old FXAA, maybe if they use the new TXAA provided by Unity it won't look that blurry.
     
  21. theANMATOR2b

    theANMATOR2b

    Joined:
    Jul 12, 2014
    Posts:
    7,790
I believe that was regarding converting quaternion curves to Bezier curves. In most software packages, converting one animation curve type to another changes the in-between keyframes, causing the animation to look different.
     
  22. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
Euler angles are subject to gimbal lock; quaternions are not.
At extreme rotations (e.g. pointing straight up) this creates a significant difference. Using Euler angles can make certain movements very difficult or even impossible to achieve.
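A quick way to see the gimbal lock being described is to compose plain rotation matrices. This is a sketch with helper names of my own, purely illustrative math, not any engine's API:

```python
import math

def rx(a):  # rotation matrix about X by angle a (radians)
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def ry(a):  # rotation about Y
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rz(a):  # rotation about Z
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def euler_zyx(yaw, pitch, roll):
    # Classic yaw-pitch-roll composition: R = Rz(yaw) * Ry(pitch) * Rx(roll)
    return matmul(rz(yaw), matmul(ry(pitch), rx(roll)))

deg = math.radians
# With pitch locked at 90 degrees ("pointing straight up"), yaw and roll end up
# rotating about the SAME world axis: only their difference matters, so one
# degree of freedom is lost.
a = euler_zyx(deg(10), deg(90), deg(20))
b = euler_zyx(deg(40), deg(90), deg(50))  # yaw and roll both shifted by 30 deg
same = all(abs(a[i][j] - b[i][j]) < 1e-9 for i in range(3) for j in range(3))
print(same)  # → True: two different Euler triples, one identical rotation
```

Quaternions sidestep this because they compose rotations directly rather than stacking three axis rotations in a fixed hierarchy, so no intermediate axis can collapse onto another.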
     
    theANMATOR2b likes this.
  23. TwiiK

    TwiiK

    Joined:
    Oct 23, 2007
    Posts:
    1,729
    Can you show me an example of this blur? Because if you start a new project in Unity it is bare bones with no post processing. There is zero blur. It's just hard aliased lines everywhere which I would consider the opposite of blur, pixel perfect in fact.

    I've always associated Unreal Engine with blur. If a game was blurry that was a clear indication it was Unreal Engine for me. Probably because every post process effect is enabled by default including TAA, I assume. And TAA is like smearing vaseline on the lens, and unsuited for many games in my opinion.

    And if you mean that Unity becomes blurry after you've enabled every conceivable image effect then you can't really blame Unity for that, can you? :p

    I like how bare bones Unity is out of the box. Most of my games don't need post processing so having to turn it off every time would be a hassle. I even think the default camera, directional light, skybox and GI settings should be removed, but perhaps there's a way for me to set that myself? It just hasn't bothered me enough for me to investigate that yet, I guess. :p But I rarely need those for my projects either so I usually end up deleting/resetting them.
     
    Stardog and Ryiah like this.
  24. TonyLi

    TonyLi

    Joined:
    Apr 10, 2012
    Posts:
    12,702
    I think this refers to a general lack of crispness and washed out colors in semi-photorealistic scenes out of the box when compared to Unreal, not to an intentional depth of field postprocess effect. (A commonly-cited example is this page.) Which isn't to say that Unity can't be crisp and have vibrant colors. It just provides different settings out of the box (more bare-bones, like you said) that may need to be adjusted (possibly even using some third party effects) to make it look closer to Unreal's default settings.
     
    Ryiah and Kronnect like this.
  25. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    6,025
    When Unity can do an indoor scene that looks like this:



    Or an outdoor scene that looks like this:



    I will gladly sing its graphical praises.

    Until then, all I can say is that if you can't see a difference, you're missing out.
     
    Peter77 likes this.
  26. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
On a related note... is there a free, high-quality Unreal scene without a "can only be used in Unreal" license clause? Something like that could be used to properly compare the engines.
     
  27. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    6,025
Not that I know of, but at least for outdoors I would suggest making a scene purely out of Quixel Megascans assets. That way at least the raw material quality would not be debatable.

For indoors, I don't know of any complete scenes at the PBR quality of the above picture. Still, a bunch of high-quality tiling brick/concrete/ceramic textures - no baked lighting, no modification of the materials, simple architecture/geo, and materials sourced from a very high-quality place such as Quixel - would make for a good comparison imo.
     
  28. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    Can it?

    Getting the most out of a renderer means developing content for that renderer's strengths. Grabbing anything designed around one renderer and shoehorning it into another doesn't show anything useful to compare the two. It's going to look worse in the renderer it wasn't designed for simply because it wasn't built in a way that takes advantage of how that renderer works.
     
  29. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    6,025
What exactly are these differences, besides the simplification of lighting algorithms for performance reasons? I haven't heard of any case of the lighting in a general-purpose engine being modified for stylistic reasons, or to develop some particular 'strength' in terms of visual quality.

My understanding of the way engine lighting is developed is that there is a simple trade-off between quality (driven not by stylistic influences, but by how closely the lighting algorithm can approach realism as defined by our best understanding of how light works) and performance.

This means (imo) that to the extent that the lighting algorithms in any engine deviate from realistic lighting equations, the engine is not developing a strength but a weakness - and any modification of PBR shading required to force it to look more realistic in some particular circumstance is a crutch and a burden on developers, because all the good material libraries out there are shifting very quickly to PBR precisely to make artists' lives easier and give them better results regardless of how the lights in a scene are set up.

Of course, games need to be performant, which is why approximations are used for lighting algorithms, and my best guess as to why Unity's lighting is somewhat inferior is that they want to retain easy flexibility across platforms, which is of course very important. But I can't see why it shouldn't be possible to switch between different lighting algorithms depending on what sort of game we're making. I certainly don't want them to implement only voxel cone traced lighting and leave everyone's phones smoking, but leaving out high-quality solutions leaves out something that could make a lot of games look much better.
     
  30. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
    Yes, I think so.

    Both engines are PBR based, and there aren't any secret techniques in post-processing either.

    If someone is going after photorealism, that would be a valid comparison.

    Also, indoor scenes are preferred, obviously.
     
  31. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
Is there any documentation on the lighting equations used in Unity? I think I recall a Unity developer working on a project where one of the things they changed for their game was the lighting...

Found it: Satellite Reign, pushing SSR and lighting for a cyberpunk city look (article)

     
  32. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
Download the standard shader source, open it in any text editor and read it.
     
  33. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    6,025
Isn't that only useful for affecting the interaction of a single ray of light with a surface? I mean, I thought the core lighting calculations (calculating the way light propagates through the scene) were done in some non-accessible location? From what I can tell, my best guess is that light propagation is where Unity falls behind - scenes in other engines generally look more cohesive, with a more readable depth to them, probably because they have higher quality and/or more iterations of the interactions between light rays bouncing around the scene.
     
  34. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
Uh, Unity is not a real-time photon tracer, so I'm not quite sure what you mean here.

What you can't access is Enlighten's internals. Then again, I recall someone discussing which shader pass is being used to feed information into Enlighten.

The lighting formula - "given light position, type, parameters and surface data, calculate the color of the final pixel" - is in the standard shader sources. I think there are even references to the documents/papers they used.
     
  35. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    6,025
Well, I'm not familiar enough with lighting internals to know exactly what goes on, but I suppose my point is that the global illumination (Lightmass in UE vs Enlighten in Unity), which calculates the diffuse lighting interactions in the scene, is what really controls the overall quality.

(Just to illustrate my point) direct lighting only is not very interesting (middle) compared to the added diffuse lighting (right):


So (I may be wrong here) I imagine the shader would have a cumulative (or is it a one-off?) effect on the diffuse lighting quality, but it would also greatly depend on the diffuse lighting algorithms themselves, which we cannot change with shaders.

And to further make my point that the difference is not merely a question of shaders and materials: SEGI - which IMO is a huge improvement to visual quality in Unity, and the only time I've ever seen it look competitive with other engines - is a GI solution, not a shader solution; effectively a replacement for Enlighten, not for the standard shader.

So in the end, lighting has two sides: the algorithm that calculates light interactions through the scene, and the shader that calculates what happens to light when it hits a surface - and we can only control the second one with shaders.

So I suppose the GI is the main non-shader-bridgeable gap between Unity and UE?

Corrections welcome.

    Corrections welcome.
     
  36. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
This sentence implies that Unity does photon tracing or something similar. This is not the case.

Unless something changed, Unreal 4 doesn't even include a Global Illumination system by default (there's LPV, but it is experimental and off by default). Its Lightmass lightmapper is much simpler (but much faster and less resource-hungry), and completely static. That does not prevent Unreal 4 from producing superior visuals out of the box.

On the other hand, GI in Unity is incredibly wonky, barely usable, and chokes on larger scenes.

Either way, if there WERE a finished free scene without the "UE4 only" restriction on it, it could be transferred and compared. I don't see much point in discussing GI/no GI in depth.
     
  37. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    6,025
OK, can you clarify something for me: what differentiates Unity and Unreal, if not the GI? Because as I understand it, GI is responsible for the diffuse lighting, the diffuse lighting is what really drives the quality of lighting in a scene, and a shader can only modify the fragment (pixel) according to the diffuse lighting that has already been calculated for that point in space.

Meaning that the shader has only limited capability to affect the visual quality of the scene (and in any case, if it's a PBR shader it should be pretty much all the same anyway).

Anyway, if there's any magic going on in the Unreal shader, shouldn't it be quite possible to reproduce the HLSL in a Unity shader, and reproduce this magic?

PS: Not sure if there's some kind of copyright issue here, of course, so I'm not saying anyone should actually do it...
     
    Last edited: Apr 16, 2017
  38. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
No.

Before post-processing is applied, the color of an individual pixel on the screen is the sum (an arithmetical sum) of all light influences for that pixel. It is literally "Ambient Light" + "Light from source A" + "Light from source B" + "Light from source C" + "Reflection".

Lighting in a PBR system is determined by Albedo + Smoothness + Metalness. Smoothness affects how visible the "Diffuse" and "Specular" components are for the specific light being processed, plus the environmental reflection. You can read up on this on the Marmoset site:
https://www.marmoset.co/posts/physically-based-rendering-and-you-can-too/

A shader can write whatever the hell it wants during the lighting passes, and its ability to affect the look of the scene is unlimited.

For forward rendering, the lighting passes are Base (ambient light AND color from GI), ForwardAdd (one per pixel light), and ShadowCaster.

The ShadowCaster pass is used to render the shadowmap; otherwise it is "Base" + "ForwardAdd(light A)" + "ForwardAdd(light B)" + "ForwardAdd(light C)"... etc.
What GI essentially does is the job of one of the "ForwardAdd" passes, except it computes an approximated sum of all static light sources and their bounces, and adds it as one of the light influences. If I recall correctly, in Unity this is done in the ambient pass. And that's it. If, say, your shiny specular objects look like garbage, GI won't help with that, because that is a function of the "specular" part of the light calculated by the shader, plus environmental reflections.

A shader can write whatever the hell it wants in any of the passes, completely altering the look of the final scene.

So, with that in mind...
The default shader, the exact math used in the default shader, and the post-processing effects available out of the box. Also, the approach to lighting the scene.
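The additive pass structure described above can be sketched in a few lines. This is a toy sketch, not Unity's actual shader code; the function names and the simplified Lambert-only, metalness-workflow math are my own stand-ins for the real BRDF:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade(albedo, metalness, normal, ambient_gi, lights):
    """Toy forward renderer: final color = Base pass + one additive pass per light.

    Colors are (r, g, b) tuples; `normal` and each light's `dir` are unit
    vectors pointing away from the surface. Only the arithmetic-sum structure
    is the point here, not the shading model itself.
    """
    # Metalness workflow (simplified): metals have no diffuse albedo.
    diffuse = [c * (1.0 - metalness) for c in albedo]

    # "Base" pass: ambient light plus baked/realtime GI, one influence among many.
    color = [diffuse[i] * ambient_gi[i] for i in range(3)]

    # "ForwardAdd" passes: each light's contribution is simply added on top.
    for light in lights:
        ndotl = max(0.0, dot(normal, light["dir"]))  # Lambert term
        for i in range(3):
            color[i] += diffuse[i] * light["color"][i] * ndotl
    return color

# One white light shining straight down onto an upward-facing red surface:
c = shade(albedo=(1.0, 0.0, 0.0), metalness=0.0, normal=(0.0, 1.0, 0.0),
          ambient_gi=(0.1, 0.1, 0.1),
          lights=[{"dir": (0.0, 1.0, 0.0), "color": (1.0, 1.0, 1.0)}])
print(c)  # → [1.1, 0.0, 0.0]: the GI term plus the single light's Lambert term
```

Swapping the GI approximation only changes the `ambient_gi` influence; everything a specular highlight or reflection looks like lives in the per-light and reflection terms the shader computes itself, which is the point being made above.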
     
    frosted likes this.
  39. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    6,025
Diffuse is what I referred to. The shader cannot calculate diffuse light, since it deals only with the fragment it is calculating at any given time, and can only use whatever diffuse lighting calculation has already been done outside the shader. It cannot calculate light bounces or provide a GI solution inside the shader. Diffuse lighting calculations, including light bounces etc., are by all accounts what determines the lighting quality of a scene.

Sure, it can write whatever the hell it wants, but it doesn't have access to any means of calculating bounces/diffuse lighting, so its ability to increase the quality of the scene is extremely limited. It can, like, add fog and cover up bad GI, I suppose.

OK, so does that mean the GI calculation does not take into account any material information? That would make sense.

A bunch of monkeys can write Shakespeare, but without access to relevant and useful information, the chances of it are low.

So is your conclusion that the GI quality has a negligible effect on the scene? I can't agree with that. From what I know, the GI and the quality of the light-bounce calculations are the definitive factor in making lighting look good, and by far the most difficult thing to make performant while approximating good results. With good GI, you can even make something without any materials at all look realistic (rendered in iray):



Shaders and post-processing can have a marginal effect, but they cannot create the foundation of good diffuse lighting (although they can certainly destroy it...)
     
  40. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
Yes it can. What you're referring to is indirect light, not diffuse light.
Making a shader that calculates light bounces is also possible, if you're really into it.

You can overwrite the GI results completely, depending on pass settings. Negative light is also possible.

Realtime baked GI in Unity can access material data, but only if you're using a surface shader and a relatively high resolution for it. If you've ever tried to animate a GI material, you'd know...

That doesn't have anything to do with the topic.

If you're making a walking simulator without any dynamic objects in it, sure.

However, when you have dynamic characters displayed up close, your GI becomes a glorified lightmap, and the character shader and its ability to display materials become much more important.

It is even less important in outdoor environments, where you can just fake it.
1.png

Basically, it is glorified ambient lighting, and you can get comparable results with less complicated techniques.

Unreal 4 does not include GI by default, only a less complicated lightmapper. However, the visual results, by default, are superior. That's because the main impact on quality is not global illumination but the shader itself. A reflective surface can create a much bigger impact than a GI scene that took 40 hours to bake.

GI gives you very broad brush strokes. When you're viewing a scene at close distance, smaller details matter more. GI also does not give you any reflections, which are the main thing that makes scenes look real. Post-processing produces massive differences in any situation where HDR and tonemapping are involved. I believe this was covered in one of the Unity talks discussing either Blacksmith or Adam. Basically, those scenes do not look very good without a bunch of post-processing filters.
     
    Last edited: Apr 16, 2017
  41. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    6,025
What I'm referring to is light that has bounced at least twice. Direct light can easily be calculated in a shader (just get the light's relative position and color, and combine it with material properties in some way), but calculating bounces is what, as far as I know, a shader cannot do in a performant way...

So is this just possible, or reasonable as well? I mean, I know all the lighting is calculated on the graphics card in some way, but is it realistic for someone to write a shader in ShaderLab to calculate light bouncing through the scene?

Yes, but will it look good...?

Well, this is an interesting point. So are you saying that UE's shaders calculate light bounces that produce the high-quality diffuse lighting in this image? If so, I would definitely like to know how this feat is achieved.




Yeah, I understand that, but post-processing can only work with lighting information that has already been calculated.
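A back-of-the-envelope way to see why in-shader bounces are so expensive: if each bounce requires sampling some number of directions over the hemisphere, the ray count grows as that sample count raised to the bounce depth. The numbers below are made up for illustration only:

```python
# Rays needed per frame at 1080p if every bounce spawns `samples` new rays.
pixels = 1920 * 1080
samples = 16  # hemisphere samples per bounce (an assumed, illustrative number)

for bounces in range(4):
    rays = pixels * samples ** bounces
    print(f"{bounces} bounce(s): {rays:,} rays")
# Direct light (0 bounces) is ~2 million rays per frame; by 3 bounces it is
# ~8.5 billion - which is why real-time engines approximate or precompute GI
# rather than brute-forcing bounces in a shader.
```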
     
  42. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
NVIDIA's VXGI does that. Full realtime on the GPU, without a precomputation phase. Their papers are available online; you could try to pull this off yourself. It is quite difficult and demanding, though.

It doesn't.
However, Unreal 4 supports screen-space reflections, which produce a massive quality boost. You can actually toggle their debug view in a viewport and see.
At the time when I was actively comparing the two engines, this wasn't even available in Unity:
https://docs.unrealengine.com/latest/INT/Engine/Rendering/PostProcessEffects/ScreenSpaceReflection/
----
Basically, I think the importance of GI is greatly overestimated. Those subtle shadow variations are much harder to notice on textured surfaces, and if you have non-white textures they'll also dissipate very quickly. Meaning it is possible to get good results with a lower-tech approach based on shadowmaps, instead of a full-blown global illumination system.

However, I think talking about it is pointless (a test scene would be better), and in the end it is my opinion on the matter.
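The screen-space reflection idea can be reduced to a toy one-dimensional march. This is my own simplification with made-up buffers; a real SSR pass marches a reflected ray through the actual depth buffer in 2D screen space:

```python
def ssr_march(color_buf, depth_buf, start_col, col_step, start_depth, depth_step):
    """March a 'reflected ray' across screen columns. Return the color of the
    first column whose stored depth the ray passes behind, or None on a miss."""
    col, depth = start_col, start_depth
    while 0 <= col < len(depth_buf):
        if depth >= depth_buf[col]:   # ray went behind the depth buffer: hit
            return color_buf[col]
        col += col_step
        depth += depth_step
    return None  # ray left the screen - the classic source of SSR artifacts

colors = ["sky", "sky", "sky", "red_wall", "sky"]
depths = [9.0, 9.0, 9.0, 3.0, 9.0]
# A ray reflected off a floor pixel, moving one column right and 1.0 deeper
# per step, hits the wall stored at column 3:
print(ssr_march(colors, depths, 0, 1, 0.0, 1.0))  # → red_wall
```

The miss case is why SSR can only reflect what is already on screen, which is the usual caveat attached to the quality boost described above.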
     
    Billy4184 likes this.
  43. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    Not having written any significant rendering code myself in a looong time, I don't feel confident giving a detailed technical answer on that. Suffice to say, Unity itself has multiple different rendering paths with significant differences - including two that have different variations on the same base approach. There are loads of implementation details that can differ, and when optimising or "getting the most" out of something then implementation details can really matter.

    Keep in mind also that it's not just about the pixels. It's also about speed. Even if two engines can end up rendering identical pixels to the screen, chances are that they're going to have different performance characteristics in different scenarios, and that matters because it impacts how much of what stuff your artists can do in those different scenarios.

    Honestly... if you really could get a meaningful comparison out of putting the same scene in different engines and looking at the generated images, don't you think someone would have done so by now and given us a definitive answer to this question?
     
  44. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    6,025
Well, then we need to figure out where exactly the difference lies.

I wasn't disagreeing with you that there are differences between the engines which would make a direct comparison difficult. What I am trying to get at is that, IMO, any meaningful question of rendering quality (by which I mean something removed from stylistic preferences) is simply a question of how closely the renderer comes to evaluating the rendering equation - which is relatively simple to state, but of course very difficult to integrate while maintaining performance, which is why all sorts of approximations have to be developed.

So what I'm trying to say is that, assuming correct PBR values for materials, the idea of having to adjust materials and parameters manually in order to cater to some attribute of an engine's implementation of a rendering pathway seems like a burden, and not what I would call a particular strength of that engine.

Ideally, light would simply follow the rendering equation, PBR values would be calibrated against real-world values, and nothing else would need to be changed to evaluate raw rendering quality. So to the extent that stuff needs to be adjusted manually (such as, for example, the specular values for materials used in Unity, which is a pain), I don't see that as a strength at all; it forces developers to fiddle with stuff they could otherwise leave to the elegance of physics.
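For reference, the rendering equation being invoked here is Kajiya's formulation, which every real-time renderer approximates in some way:

```latex
L_o(\mathbf{x}, \omega_o) = L_e(\mathbf{x}, \omega_o)
  + \int_{\Omega} f_r(\mathbf{x}, \omega_i, \omega_o)\,
    L_i(\mathbf{x}, \omega_i)\, (\omega_i \cdot \mathbf{n})\, d\omega_i
```

Outgoing radiance at a point equals emitted radiance plus the hemisphere integral of the BRDF times incoming radiance times the cosine of the incidence angle. The integral over incoming directions is exactly the part (bounced, indirect light) that GI systems precompute or approximate.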
     
  45. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,619
    I don't know either, but there is this Lighting the Environment page in UE4's documentation, that might provide some hints on this topic.
     
  46. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    Ideally we'd have infinite computing power with which to do exactly that. We don't, so, as you pointed out, compromise is required in order to get some balance of reasonable results in useful timeframes within available resource limits.

    You're saying that it's not a strength based on comparison against a theoretically perfect system that doesn't exist. Nothing compares well against perfection. If you want to compare systems that do exist then it's a complex area, and looking at any one factor in isolation simply won't give the whole story.

    It's not going to be a single thing we can put our fingers on. Imagine you were asked to compare two cars and explain which one is "better" - how would you do that? You'd need to look at more than one thing, and you'd probably need some context - is this for track racing, transporting heavy goods, or family transport? The details will be different, but the overall approach will be similar if you're comparing renderers.

    Furthermore, it's not just the renderer that we're looking at most of the time nowadays, is it? It's the complete package - the whole engine and all of its tools. Even taking it as a given that some renderer can get superior results to another, is that on its own enough to justify one toolset over another? In some cases it may well be, but in others it certainly won't.
     
    frosted likes this.
  47. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    6,025
    The theoretically perfect system does not exist, but a renderer can still be compared to it by its proximity to that perfect system.

    I don't think this issue is as complicated as many people think it is. The crucial thing is that reality (which is governed by relatively simple equations that are not bound by computing power) provides the benchmark for the quality of a renderer. Everything else is a question of personal preference, which an engine should not be designed to cater to just because someone happens to prefer it.

    Not only that, but the most important thing of all, is that pretty much all kinds of stylization can be produced from a foundation of realistic lighting, but realistic lighting cannot be produced from a foundation of stylized lighting. Almost all stylization is a lossy effect which cannot be converted back to the original image without suffering a loss of quality.

    So IMO, a game engine's core rendering tech should ONLY attempt to produce as realistic lighting as possible within performance constraints, and stylization should be left to shaders and camera effects. As far as I know, this is pretty much always the case in practice.

    No disagreements there - there's a reason why I'm still here complaining. Unity's a great engine for many reasons, but graphically it's inferior and I don't think it should be.
     
  48. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    You keep dragging this back to the ability to measure output against an ideal mathematical model. I absolutely get that you can measure output quality this way. What I'm driving at is that you can only measure output quality this way, and the complexity I'm referring to comes from the fact that there are many other factors we care about - starting with (but by no means limited to) performance.

    I think performance deserves a heck of a lot more than the offhanded (by my reading) mention you give it. Keep in mind that if performance weren't an issue then we'd not have to make the various compromises we do. We could all just use Pixar's renderer set up for photorealism and call it a day.
     
    frosted likes this.
  49. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    6,025
    I never said that performance is unimportant. In fact, I think it's the most important issue of all. I have no doubt that Unity have the capability to implement anything that could be found in Unreal or Cryengine, but as I've said earlier, they probably want to keep good flexibility across platforms, which is of course necessary. And I do not want Unity to implement some high-end solution at the cost of performance on low-end/mobile hardware.

    But what about switching between different lighting tech, based on whether you want raw graphics quality, or performance? I mean, in the same way that in Substance Designer you can easily switch between OpenGL and IRay, why can't we switch between high-performance/low-quality and low-performance/high-quality renderers in Unity?

    I'm not sure why this isn't the case, since by doing that Unity could outclass its competitors on just about every major feature, and developers here would be able to take advantage of the graphical capabilities that can only be found right now in UE or Cryengine.
     
  50. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,619
    Billy4184 likes this.