Discussion in 'General Discussion' started by cmorait, Apr 2, 2019.
It has its uses. It really sells artificial looking environments.
Glad somebody else said it. This is why I wasn't gonna take the time to post pictures about nothing.
About ACES, I use it almost exclusively. Just a matter of taste, and probably the screen I work on too. But I think there is a tendency for noob artists to go for higher-contrast overkill in an effort to make their work pop.
There isn't one tool to rule them all. They all have a different purpose. If you don't like the final image, there are a million ways to adjust it. No reason a Unity or Unreal user is stuck with a render they don't like, unless they simply lack the know-how to produce the results they envision. Unity can be crisp and sharp, Unreal can be faded and washy. They are both very versatile, and whatever negligible difference you can point out between the two probably falls within the realm of idiosyncrasy -- in other words, nobody but you really notices.
In general, you are probably gonna find more high-quality art coming from Unreal, because historically that is what the professionals are using. So the high-tier artists are working in Unreal. They know how to make a nice image. Fewer artists of that caliber are using Unity, so more amateurish art. Not a matter of the tool at all.
Not sure why everyone is landing in this thread with such a critical tone. It's just a talk about graphics, nothing to get wound up about. Nobody is hating on Unity, and everyone's got a different opinion about what looks good or not. If we hold it steady maybe the thread will turn out to be something useful for people to read.
It's easy to say that artistic style can produce anything, but the devil's in the details. Of course, technically there is nothing unachievable about any aesthetic that exists; last time I checked, Unreal didn't operate on magic sauce. Unless there's something going on in Unity's sauce code, the question is really one of figuring out how to get what one wants.
In my (Photoshop) defense, I was actually not addressing all the points in the thread, just some of them, i.e. I looked at sharpness, then I looked at the "GI" and "black" observations some people made. Which made the thread actually move beyond those points (for once), so we are in the process of narrowing things down. That is, what's left is an actual render comparison with:
1. Same POV (camera at the same position and direction)
2. Same FOV (check for applied camera lens distortion too)
3. Same lighting conditions (lights with the same type, position, direction, and intensity, white or the same color)
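One gotcha for point 2: engines don't all measure FOV the same way. Unity's Camera.fieldOfView is a vertical angle, while Unreal's camera FOV setting is horizontal, so copying the raw number across engines gives mismatched framing. A small sketch of the conversion (my own helper name, assuming a standard rectilinear projection):

```python
import math

def vertical_to_horizontal_fov(vfov_deg, aspect):
    """Convert a vertical FOV (degrees) to the horizontal FOV for a given
    width/height aspect ratio, assuming a standard rectilinear projection."""
    vfov = math.radians(vfov_deg)
    hfov = 2.0 * math.atan(math.tan(vfov / 2.0) * aspect)
    return math.degrees(hfov)

# A 60-degree vertical FOV at 16:9 works out to roughly 91.5 degrees horizontal.
```

So a Unity camera at 60° vertical FOV should be compared against an Unreal camera set to about 91.5° (at 16:9), not 60°.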
yeah, a rigorous scientific test like that would really be necessary to make a meaningful argument. ^^^
Now don't get me wrong:
I know exactly how to get the results I want using post. [The asset Beautify more specifically.]
The question is why I need post-processing to achieve it in Unity when I didn't in Source/SFM and CryEngine. They had the 'sharp look' by default with no need of any post-processing or image effect. It just rubs me the wrong way to think 'some value at the core of the engine is broken' and it has been bothering me since I first started using it in 2013.
And I do not like Unreal Engine. It just used to have the same thing I am complaining about, and ditched it around version 4.10 when they changed the lighting model; now it has 'a sharper look', even though the shaders in all the demos they have put out look more plastic than any recent one from Unity.
tl;dr: I believe that the current sharpness of Unity is too low by default and I'd prefer not to have to make use of image effects to make up for it.
There is a chance it is easier to manage sharpness as a post-effect than at an engine level implementation.
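For what it's worth, the usual trick behind a sharpen pass is an unsharp mask: blur the image, then add back a fraction of the difference between the original and the blur, which exaggerates local contrast at edges. A toy grayscale sketch (illustrative only; the function names are mine, and a real post effect would do this on the GPU in a shader):

```python
import numpy as np

def box_blur(img, radius=1):
    """Naive box blur with edge clamping (illustrative, not fast)."""
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            out[y, x] = img[y0:y1, x0:x1].mean()
    return out

def unsharp_mask(img, amount=0.5):
    """Sharpen by adding back `amount` times the (image - blur) difference."""
    return np.clip(img + amount * (img - box_blur(img)), 0.0, 1.0)
```

Run it on an image with a soft edge and you can see the overshoot/undershoot on either side of the edge, which is exactly the "crisp" look being discussed.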
So maybe I shouldn't talk as if this isn't for the best.
And assuming Billy's issue with Unity images isn't sharpness...
I found the use of 'Panini projection' distortion in the Heretic demo interesting.
So there is always that: camera lens altering image scale perception, but I don't think there is a difference between Unreal and Unity implementation. Perspective Projection is a standard.
I don't know about SFM but CryEngine applies post-processing effects by default.
They can be disabled using the console. [2.0 even allowed editing shaders. 3.0 was the blackbox.]
After deciding that all those differences just looked like artistic choices, I had a quick play with the legacy pipeline example scene. It seems silly to call them things the engine "does" to the image.
Settings adjusted the way you did (but entirely in-engine):
That looks good.
How did you achieve that sharpening without image effects? (A breakdown of what you did to go from the original to the sharpened one would be appreciated.)
Our discussion is very helpful, but I started this thread for a completely different purpose.
Do you know where someone can find advanced-level tutorials on lighting etc.? All the training materials on known sites like Lynda, Pluralsight and Udemy are only for beginners. I need training material at an intermediate to advanced level that will help upgrade our skills for achieving more realistic results in Unity.
Should I contact Unity by email or phone to get more info? If a person from the Unity team sees this post, I hope they will answer.
I believe most go to the Asset Store, download the best arch viz visualizations and then study and reverse engineer them.
I'm unsubbing from the thread (cos I've nothing more to add), but before I go I urge people to start thinking of their scenes as energy.
Fix your primary light so it's strong, way stronger than you expect.
Fix your sky.
Fix your exposure.
HDRP has problems with procedural sky + exposure but that will be fixed.
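The "think in energy" advice maps onto physical units: HDRP expresses exposure in EV100, and a sunlit exterior sits around EV 15, which corresponds to an average scene luminance in the thousands of cd/m² — hence lights needing to be "way stronger than you expect". A quick sketch of the standard relation (using the common calibration constant K = 12.5):

```python
import math

def ev100_from_luminance(luminance, calibration=12.5):
    """EV100 for an average scene luminance in cd/m^2 (K = 12.5 is the
    common reflected-light meter calibration constant)."""
    return math.log2(luminance * 100.0 / calibration)

def luminance_from_ev100(ev100, calibration=12.5):
    """Inverse: average scene luminance in cd/m^2 for a given EV100."""
    return (calibration / 100.0) * 2.0 ** ev100

# EV 15 ("sunny day") corresponds to about 4096 cd/m^2 of average luminance.
```

Compare that to an indoor scene around EV 5-7 (tens of cd/m²) and it's clear why an eyeballed light intensity is usually orders of magnitude too dim for a convincing daylight look.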
Thanks for your feedback.
I have seen a couple of projects and they are not much of a help. In the promotion videos they look amazing, but when you run them on your PC they feel less appealing and fake.
Lighting Box is an amazing tool if you're a novice. I used it for a period of time, but now I prefer to do everything the classical way.
You've perfectly captured the essence that is Lighting Box 2.
Lighting Box does indeed look like a mess; I was only proposing it as a source for learning from its well-set-up, fast lightmap baking.
But assuming you are already a Lightmapping pro, then the only piece of advice left is:
I assume you've seen this developer's projects:
Away from that machine for the weekend now, but it was all just playing with the default PP. Upped sharpening in the TAA, changed tone mapping, upped contrast, dropped saturation a little and played with the color balance.
It is image effects, but most of the Unreal look is post processing anyway.
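On the tone mapping point: both engines' "filmic" look typically comes from an ACES-style curve (Unreal's default tonemapper is ACES-based, and Unity's post-processing stack offers an ACES mode). A commonly used scalar approximation is Krzysztof Narkowicz's curve fit, sketched here per channel (this is the fit, not the full ACES RRT/ODT):

```python
def aces_tonemap(x):
    """Narkowicz's ACES filmic curve fit: maps HDR input (x >= 0) to [0, 1],
    compressing highlights while keeping midtone contrast."""
    a, b, c, d, e = 2.51, 0.03, 2.43, 0.59, 0.14
    return max(0.0, min(1.0, (x * (a * x + b)) / (x * (c * x + d) + e)))
```

Plugging in values shows the shape: black stays black, midtones get a contrast boost, and bright HDR values roll off smoothly toward white instead of clipping — which is most of the "Unreal look" being described.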
Then start hitting CG, cinematography and photography lessons, not just game engine/industry ones.
TL;DR = "git gud, read"?
The biggest problem in Unity is performance. Heretic can only achieve 30 fps, which is suitable for cinematics but not gameplay. I hope Unity finishes HDRP, DOTS and the many *must*-have features as soon as possible.
What have you actually made in Unity?
Nothing yet. I just tried UUC and the performance was really bad. Unity does not come with an easy-to-use template, so I ended up with UUC and Behavior Designer. The performance is really bad combined with HDRP. I don't know what to say.
I know that DOTS is coming to save performance, but how long will it take to complete? Does it suit all coding? I don't think it suits everything, like player controllers or UI, unless Unity provides that by default.
Actually... Heretic is not optimised and was running on a medium-level GPU. Staff spoke about it on Discord.
I use HDRP extensively for hours every day and it's generally extremely high performance for the work I am asking it to do under load.
On my GPU (a bog-standard NVIDIA 980 with a broken fan, running on a 5-year-old CPU) I usually get around 100fps at native resolution (for me this is 1440p) with realtime shadows extending for 4 miles, a 4-mile view distance, in open world with the full post effect stack (volumetrics etc), time of day and no baking.
So no, you're wrong. But I don't blame you, documentation is thin on the ground and if you do the classic empty scene test then you would probably draw that conclusion, but the only conclusion worth having is if it works for your scenario.
It sounds like something else might be making things slow and we can totally help with that as a community if you make a separate thread for your issue.
So... do you think it's because of third-party assets making performance bad, not HDRP itself? I'm not sure about this.
Or maybe my computer is really bad for HDRP. It's a laptop with 8GB of RAM and an NVIDIA 930MX (not GTX).
Yes, definitely other assets, OR perhaps the GPU setting. HDRP has a lot of controls for performance, but laptops also often switch to the integrated GPU. Can you check in Preferences for me? It should let you pick.
Also, it goes without saying a build will be a lot faster, but you should get 60fps on the HDRP example scene with that hardware (if you switch Unity to using the GPU in prefs).
It's fine on way worse hardware.
This reads a lot like "I do not bother to optimise my code at all, which is the specific reason the data oriented tech stack appeals to me."
Yes, it's all my bad. The Editor was still using the integrated GPU. It's a real shame.
I'm sorry, everyone, for everything. I wish Unity would get their roadmap done.
About DOTS, I'm not even sure that I can code that way. I'll look at it again when the documentation is released. I just heard that they are gonna make it work with a normal style of coding and that everything can be replaced with DOTS. I hope that comes true soon.
Thank you everyone.
They are doing their job on the roadmap by any definition, in that things on the roadmap are in the future, which is why you need a map in the first place.
I was curious too, how much of the rendering quality is just tweaking. I haven't messed around a huge amount with HDRP, but it does seem quite capable. Here is my version of the default scene, with various post effects applied:
I don't know how to put this any other way, but you're running a toaster for a GPU. Laptop hardware is already weaker than similarly numbered desktop components, but that laptop takes it one step further by giving you one of the weakest models available.
NVIDIA's 930MX is basically identical to Intel HD 630. It's approximately half the performance of a GeForce GT 1030 which is the weakest desktop card from the GeForce 10 series. It's slower than most mid-tier graphics cards from ten years ago.
Hell, it's only maybe 30% better than the Intel HD 6000, and that thing is a disaster. I'm pretty sure the only thing the 930MX offers is some feature compatibility.
Wait, GTX 1080 is a medium level GPU....OK
PS: I do agree that Heretic is not optimized (that is the reason it runs at 30 fps on an i9 + RTX 2080 Ti at 1440p)
I'll just leave this here
A GTX 1080 is comparable to an RTX 2060 in many games at 1440p. Since the RTX 2060 is a mid-range graphics card, that makes the GTX 1080 a mid-range graphics card. Original cost is only relevant when it's part of a current generation.
GTX 1080 scores 12,434. RTX 2060 scores 13,177.
Yep, not much of an impressive gain for the cost.
What do you mean by "mid-range"?
Because the cards being discussed are high-end cards.
Current generation graphics cards that have 60 for the last two digits are considered to be mid-range. A quick search for the phrase "nvidia mid-range" gives me tons of results for the RTX 2060 and GTX 1660 cards.
Fair enough. Yep, I have noticed the price tag spans at least $300 between x060 and x080, and performance is one and a half times to nearly double.
The 1660's performance is basically equal to the 1070's, but I think its MSRP is lower than the 1070's was at launch by a pretty significant margin as well.
Sure, "mid-range" as far as NVIDIA is concerned, who want everyone to be buying a card from every series they release.
Consider that, for the rest of the people in the world, you're talking about cards that start at twice the cost of a whole game console. This will vary from game to game, of course, but a current generation "xx60" is going to be very much above median for many audiences, and that matters far more than the words you use to label it.
When I talk about mid-range I consider it a high end GPU made around 3 years ago.
Regarding Heretic, I believe it will run 60fps on most cards like mine with a few tweaks at 1080p, the purpose of the demo was not to provide 60fps.
The reason I'm happy making a statement like that is because my game is using HDRP and I've put it through a fair few paces and it just gets faster over time. Now for .... realtime GI.
I fully understood while typing that post that there were people that couldn't afford the cards, but that doesn't change the fact that they're considered to be mid-range cards. For people that can't afford them there are always the AMD RX 580s which have fantastic performance for the price, or as a final last resort a game console which can be bought used.
I suspect there's some confusion here... how is a game console equivalent to a current gen GPU of an xx60 variety?
If we look at just 1060s and greater (is that "current generation"?), according to the March data in the Steam hardware survey that's around a third of players. I fully understand that people use that label, but it's a misleading one in almost anything other than a PC enthusiast context.
I fully understand that enthusiasts and people trying to sell hardware consider the middle of the current lineup "mid-range". All I'm saying is that it's not at all representative of what your users are likely to have unless you happen to be targeting that particular audience.
From a pure performance perspective it isn't an equivalent to a current generation xx60 GPU, but then I never said that was the case with my post. I said it was an option for anyone that couldn't afford even the budget RX 580 which itself was an option for people that couldn't afford a "mid-range" card.
Why is it an option if it's not equivalent to a current generation GPU? Because it allows you to play the games with nearly the same graphical fidelity and resolution as a "mid-range" graphics card while only having to make a hardware purchase every six to eight years. You can't achieve that level of savings while maintaining that level of performance with a PC.
Incidentally, to my knowledge, NVIDIA doesn't use the term "mid-range". That quick search mentioned several posts back gives results that are almost entirely from the press. Searches directed solely at NVIDIA's own site for that phrase only turn up pages for Quadros (which is just odd).
I feel we're getting off topic here now. To wrap, yes, I feel that the GPUs in consoles are reasonable to aim for as targets for a game to run on, since that hits a really wide audience, and anyone with higher-end PCs can then get a better experience. From a developer perspective, though, note that they're somewhat of special cases - both in terms of being fixed platforms we can (and are expected to!) specifically optimise for, and in terms of differing customer expectations (eg: generally being happy with 30hz rather than 60).
Of course, if you're making something that's specifically aimed at PC gamers or enthusiasts, or purpose-built showroom PCs, or anything like that, go nuts with whatever is at your disposal. There's no reason not to!
New video comparison: Unity HDRP vs Unreal.
What do you think? Does Unity or Unreal feel more realistic?
Define realistic. Most of the differences there seem to be different settings and artistic choices.
I was lazy and careless and thought the Unreal one was the Unity one; then I checked and realized my mistake.
Yea, from reading the comments on that video, the artist just started using Unreal and it shows.
A good archviz example comes from Ruggero Corridori.
He created this scene in the Built-in Pipeline, baked with
Enlighten and Progressive, with a lot of knowledge and some tricks.
This setup runs on Android and VR with the same quality.
He also did an early HDRP 4.6 version with Bakery that only needs the environment IBL for a slightly better-looking result.
Bakes down in 20 min.
So archviz is solved in Unity. :)