Seeking best practices for Built-In Realtime lighting (also w/WebGL!)

Discussion in 'General Graphics' started by Reverend-Speed, May 13, 2022.

  1. Reverend-Speed

    Joined:
    Mar 28, 2011
    Posts:
    284
    Hey folks. I'm a fairly experienced Unity programmer in the middle of building a game which will feature some procedural generation using 3D tiles assembled at runtime (think Tiny Keep etc).

    This is one of the first times I've had to choose the correct pipeline and graphics features for a project. I've consulted a range of documentation (e.g. 1, 2, 3, 4, 5), but I still have some questions, so I'm hoping some old hands on these forums can set me right!

    I probably won't be able to use any kind of GI for my procedural project, as Unity cannot calculate GI for static objects at runtime (feel free to correct me on this!). I'd also like to use Camera Stacking, and ultimately I'd like to port my game to as many platforms as possible. WebGL might be a stretch right now, but I'd love to get the game running in browsers!

    This seems to suggest that I should be using Realtime lighting with the Built-In Pipeline for this project. I've seen some recent games achieve gorgeous visuals with this approach (eg. Tormented Souls), but I'm having a little trouble finding the right balance of realtime lights per scene.

    I know that by default (i.e. forward rendering) surfaces can be lit by up to 4 realtime lights, with the option to raise that to 8 via the project's quality settings (Pixel Light Count). However, this doesn't seem to carry over to WebGL, which appears to support far fewer lights per surface. Looking at this reference, deferred rendering seems like it would solve the multiple-light issue, but I've no experience with that path - can anybody who's used it give me some advice?
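
    For anyone comparing the two setups, both the per-pixel light budget and the rendering path can also be set from script rather than the UI - a minimal sketch (the specific values here are just examples, not recommendations):

    ```csharp
    using UnityEngine;

    public class LightingSetup : MonoBehaviour
    {
        void Start()
        {
            // Raise the forward-rendering per-pixel light budget.
            // Equivalent to Quality Settings > Pixel Light Count.
            QualitySettings.pixelLightCount = 8;

            // Or sidestep the per-surface limit by switching this
            // camera to the deferred shading path instead.
            Camera cam = GetComponent<Camera>();
            if (cam != null)
                cam.renderingPath = RenderingPath.DeferredShading;
        }
    }
    ```

    Note that deferred shading removes the per-surface light cap but has its own constraints (e.g. no hardware MSAA), so it's a trade-off rather than a free win.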

    Emissive materials - am I correct in understanding that with Built-In realtime-only lighting, emissive materials will only appear self-lit, but will NOT light the surrounding area?
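
    For context, here's roughly how emission gets enabled on a Standard-shader material from script (the component setup is hypothetical); my understanding is that without baked or precomputed GI, this only brightens the surface itself:

    ```csharp
    using UnityEngine;

    public class GlowOnStart : MonoBehaviour
    {
        void Start()
        {
            Material mat = GetComponent<Renderer>().material;

            // Turn on the Standard shader's emission channel.
            mat.EnableKeyword("_EMISSION");
            mat.SetColor("_EmissionColor", Color.green * 2f);

            // With realtime-only lighting this makes the surface glow,
            // but it won't illuminate nearby geometry - bouncing that
            // emitted light onto neighbours requires GI.
        }
    }
    ```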

    The majority of the game will take place inside buildings, though I can imagine some brief scenes that take place on the outside of these buildings. With this in mind, I'm choosing to use Distance Shadowmask for my mixed lighting mode - does that seem to make sense?
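
    If it's useful to anyone, the shadowmask variant can also be set per quality level from script (the Mixed Lighting mode itself lives in the Lighting window and still requires a bake, which may be the catch for procedural content):

    ```csharp
    using UnityEngine;

    public class ShadowmaskSetup : MonoBehaviour
    {
        void Awake()
        {
            // Distance Shadowmask: realtime shadows near the camera,
            // baked shadowmask shadows beyond the shadow distance.
            QualitySettings.shadowmaskMode = ShadowmaskMode.DistanceShadowmask;
        }
    }
    ```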

    I've built a test environment inspired by the PS2 game Silent Hill 4: The Room - this link contains a WebGL player of the environment, a 360 reference video of the original location and a link to the entire test project (allowing you to test and inspect the scene in the editor). I'm getting fairly low framerates inside the editor (48-60fps for a single room...!) with framerates of 24fps in the WebGL player.

    * Is there a way to have the WebGL build's lighting match the lighting within the editor project?

    * Are there strategies for maintaining or improving the lighting in the non-WebGL version while improving the FPS performance?

    I really appreciate any help anybody can lend me on this - I'm somewhat overwhelmed by the number of options right now...!

    EDIT 01: Been testing with deferred rendering and I'm now getting a steady, healthy 90fps in-editor, with the WebGL build hitting 30fps most of the time. The lighting quality is MUCH improved in both versions. Seems like deferred is the way to go, though I'd still love to hear from someone who's familiar with the best practices for both...! (Some useful documentation!)
     
    Last edited: May 13, 2022