[COMING SOON] Rove3D: Interactive Production Quality Pathtracing

Discussion in 'Assets and Asset Store' started by rove3d, Sep 27, 2016.

  1. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    Worked on the traversal code over the weekend; it's tracing much larger and more complex scenes now. I promise the video is coming, I've just been prioritizing coding. Eye candy!

     
  2. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    Another render, with high scene and lighting complexity. ~2.5M polygons and ~100 lights as well as the physical sky seeping in through a cracked door.
     
    TooManySugar and laurentlavigne like this.
  3. zelmund

    zelmund

    Joined:
    Mar 2, 2012
    Posts:
    437
    Nice pictures.
    Have you tried rendering a scene with lots of trees/grass? I mean semitransparent objects and cutout shaders. Do such shaders impact performance the same as opaque ones?

    Also, can you say how much time you need to release the first public beta, or something we can use in our projects? Is it possible to see it this year? We would like to combine your graphics solution with our project to show all this to our bosses =)

    Sorry for my English.
     
  4. NERVAGON

    NERVAGON

    Joined:
    Oct 27, 2009
    Posts:
    73
    In that last shot you have 100 overhead lights, yet they are scarcely contributing to the lighting. I'm wondering whether your renderer supports IES lights. If those overheads are emissive materials, are they tweaked to match Unity's internal renderers? Are you using physically accurate cameras and lights? Watts, lumens, candelas for light values, and ISO and f-stop for the camera, etc.? I would love to see a shot with realistic surfaces rather than perfect shiny and perfect diffuse.
     
  5. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    @zelmund - Transparent meshes (made transparent by the alpha channel in a texture) incur an extra intersection test before the ray continues through the object, so there's a little extra computation, but not much.
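
    To make that concrete, the traversal logic is roughly like the following C# sketch. The types and names here are made up for illustration; the real work happens in our native code:

    using UnityEngine;

    public struct CutoutHit
    {
        public float Distance; // distance along the ray to the hit
        public Vector2 Uv;     // texture coordinates at the hit point
        public float Alpha;    // alpha sampled from the material's texture
    }

    public static class CutoutTraversal
    {
        const float Epsilon = 1e-4f;

        // Walk through alpha-cutout surfaces until an opaque hit (or a miss).
        // Each transparent hit costs one extra intersection + texture sample.
        public static bool TraceOpaque(Vector3 origin, Vector3 dir,
                                       System.Func<Vector3, Vector3, CutoutHit?> intersectClosest,
                                       float cutoff, out CutoutHit result)
        {
            CutoutHit? hit;
            while ((hit = intersectClosest(origin, dir)) != null)
            {
                if (hit.Value.Alpha >= cutoff) { result = hit.Value; return true; }
                // Texel is transparent: step just past the surface and re-trace.
                origin += dir * (hit.Value.Distance + Epsilon);
            }
            result = default(CutoutHit);
            return false; // the ray escaped the scene
        }
    }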

    The beta will be released this year. I can't give out an exact date yet, but it will be early December.

    @NERVAGON - The 100 lights are all emissive meshes and use internal intensity values rather than Unity's parameters, since Unity's point lights and such are not physically plausible. We do use proper high-dynamic-range lighting with tonemapping, but our metrics at the moment are intuitive scale values rather than real-world units. For example, instead of ISO/f-stop/etc. for the camera, we use "focal distance" and "aperture size", which will be easier for some to work with. Also, we don't use IES lights at the moment; support for more advanced features seen in offline renderers will be ongoing as Rove3D matures and evolves.
     
    Last edited: Nov 15, 2016
  6. zelmund

    zelmund

    Joined:
    Mar 2, 2012
    Posts:
    437
    Thanks for the reply. It's great news for us. Wishing you luck with development =)
     
  7. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    New video with complex interior scene and dynamic geometry/lighting, as well as Unity physics integration:



    Looks much better without YouTube's compression artifacts, but what can you do.
     
  8. laurentlavigne

    laurentlavigne

    Joined:
    Aug 16, 2012
    Posts:
    6,225
    It's impressive, but to work as a game renderer the dots would need to start bigger, filling the viewport, then get smaller as they refine.
     
  9. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    It isn't shown in the video, but we are working on some temporal filtering techniques for reducing noise further. Our ultimate goal is pathtracing in gaming.
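
    The simplest version just blends each new frame into a history buffer, along these lines (an illustrative sketch, not our shipping code; a real version also reprojects the history using motion vectors):

    using UnityEngine;

    public class TemporalAccumulator
    {
        Color[] history;
        int frameCount;

        // Blend the new noisy frame into the running history. Resetting on
        // camera movement avoids ghosting; clamping the blend factor keeps
        // the image responsive to lighting changes.
        public void Accumulate(Color[] frame, bool cameraMoved)
        {
            if (history == null || cameraMoved)
            {
                history = (Color[])frame.Clone();
                frameCount = 1;
                return;
            }
            frameCount++;
            float blend = 1f / Mathf.Min(frameCount, 32);
            for (int i = 0; i < frame.Length; i++)
                history[i] = Color.Lerp(history[i], frame[i], blend);
        }

        public Color[] Result { get { return history; } }
    }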
     
  10. laurentlavigne

    laurentlavigne

    Joined:
    Aug 16, 2012
    Posts:
    6,225
    Here is what I meant.

    [Attached image: Screen Shot 2016-11-24 at 4.02.43 PM.jpg]

    This gets refined with smaller pixels as it converges. That used to be the way raytracers converged back in the '90s.
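
    As a loop it's just something like this (illustrative C#, with tracePixel standing in for whatever traces one sample):

    using UnityEngine;

    public static class ProgressivePreview
    {
        // Trace one sample per block and splat it over the whole block,
        // halving the block size each pass: big dots first, full res last.
        public static void Render(int width, int height,
                                  System.Func<int, int, Color> tracePixel,
                                  Color[] target)
        {
            for (int block = 16; block >= 1; block /= 2)
                for (int y = 0; y < height; y += block)
                    for (int x = 0; x < width; x += block)
                    {
                        Color c = tracePixel(x, y); // one traced sample
                        int yMax = Mathf.Min(y + block, height);
                        int xMax = Mathf.Min(x + block, width);
                        for (int by = y; by < yMax; by++)
                            for (int bx = x; bx < xMax; bx++)
                                target[by * width + bx] = c; // fill the block
                    }
        }
    }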
     
  11. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    Oh I see, yeah, I've seen some of the major renderers use that (I think Octane uses it when the camera moves). I'd like to get to a point where the noise is reasonable enough not to need those kinds of techniques, as I don't really like the temporary half-resolution effect. We're going to keep optimizing, and GPUs keep getting faster, so I think we'll get there. The video used a GTX 680 (which is notorious for poor compute performance); I'd really like to get my hands on a newer card and see whether the noise is still an issue.
     
    one_one likes this.
  12. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    Robotic bugs: ~2M triangles.

     
  13. TeraBitSoftware

    TeraBitSoftware

    Joined:
    Nov 28, 2016
    Posts:
    4
    This looks very interesting and an impressive achievement. I had a little go at something with RT a while back using NVIDIA OptiX and my own experimental 3D engine. I even integrated it with the Oculus VR SDK (for the DK2) by tracing two viewpoints and bending the rays as they moved away from the lens centre to do the distortion correction.

    Anyway, getting back to Rove, I have a couple of questions:
    • How do you handle texture sampling: are you using radial rays to calculate the mip level?
    • Will Rove handle particles?
    • Do you have any post-process filtering in the works to smooth over the noise?
    A couple of years back, I was working on a denoising shader for path tracing that, in addition to the rendered image, used an albedo colour buffer, a world-normals buffer, and a depth buffer. I never got to finish it, but it looks like the approach has some potential if this is anything to go by.
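
    The core of it was an edge-aware weight along these lines (from memory; the constants are illustrative):

    using UnityEngine;

    public static class CrossBilateral
    {
        // A noisy pixel is averaged with its neighbours, but only where
        // albedo, normal and depth agree, so geometric edges stay sharp.
        public static float Weight(Color albedoA, Color albedoB,
                                   Vector3 normalA, Vector3 normalB,
                                   float depthA, float depthB)
        {
            float wAlbedo = Mathf.Exp(-SqrDist(albedoA, albedoB) / 0.05f);
            float wNormal = Mathf.Pow(Mathf.Max(0f, Vector3.Dot(normalA, normalB)), 32f);
            float wDepth  = Mathf.Exp(-Mathf.Abs(depthA - depthB) / 0.01f);
            return wAlbedo * wNormal * wDepth; // ~1 on flat surfaces, ~0 across edges
        }

        static float SqrDist(Color a, Color b)
        {
            float r = a.r - b.r, g = a.g - b.g, bl = a.b - b.b;
            return r * r + g * g + bl * bl;
        }
    }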

    Will be keeping my eyes peeled for further updates.

    Regards,
    TeraBit Software
     
  14. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    Thanks for the questions, I hope this answers them:

    1.) We calculate mipmaps and filter/interpolate texture samples traditionally, with an option for elliptical filtering, as well as rendering without mipmaps (we do sample over the entire pixel area); there's a sketch of the idea after this list.

    2.) Yes, we plan to fully support Unity's particle systems, both mesh and billboarded particles. Once the beta is released with the essential rendering features, this and skeletal animation will be the priority.

    3.) Yes, we are working on something that avoids auxiliary buffers (depth, normals, etc.) to remove the overhead of rendering to these buffers, and it will be temporal. However, our first priority is getting enough samples in realtime that we don't rely too much (or at all) on a filter to begin with.
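
    For the curious, the sketch mentioned in 1.): choosing a mip level from a ray's footprint boils down to something like the following. This is back-of-envelope C#, not our exact filtering code:

    using UnityEngine;

    public static class RayConeMip
    {
        // spreadAngle ~ vertical FOV / image height: the cone a pixel subtends.
        // uvDensity is how many texels map to one world unit at the hit point.
        public static float MipLevel(float hitDistance, float spreadAngle, float uvDensity)
        {
            // The cone's diameter grows linearly with distance...
            float footprint = 2f * hitDistance * Mathf.Tan(spreadAngle * 0.5f);
            // ...and the mip is the log2 of how many texels it covers.
            float texelsCovered = footprint * uvDensity;
            return Mathf.Max(0f, Mathf.Log(texelsCovered, 2f));
        }
    }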
     
  15. TeraBitSoftware

    TeraBitSoftware

    Joined:
    Nov 28, 2016
    Posts:
    4
    Hi, thanks for the reply,
    1. Not sure what you mean by traditional filtering, but elliptical filtering seems to do much the same as ray radials/differentials, so all good.
    2. Sounds good.
    3. Ok, will be interesting to see how this goes, especially on newer hardware. Sounds like there may be scope for me to dust off my smoothing shader at the less powerful end. :)
    Regards,
    TeraBit Software
     
  16. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    One of the fun things is that Rove is compatible with the post-processing pipeline, so people are free to experiment with their own filters as well as any other post-process (cel shading, tonemapping, glow/bloom, camera effects, etc.), which would look great with pathtraced renders.
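
    Any standard Unity image effect pattern will work on top of the traced frame; for example, a minimal filter component (filterMaterial is whatever post-process shader you want to try):

    using UnityEngine;

    [RequireComponent(typeof(Camera))]
    public class MyPathtraceFilter : MonoBehaviour
    {
        public Material filterMaterial; // material built from your filter shader

        // Unity hands us the rendered frame; blit it through the filter.
        void OnRenderImage(RenderTexture src, RenderTexture dst)
        {
            if (filterMaterial != null)
                Graphics.Blit(src, dst, filterMaterial);
            else
                Graphics.Blit(src, dst); // pass-through when no filter is set
        }
    }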
     
  17. TeraBitSoftware

    TeraBitSoftware

    Joined:
    Nov 28, 2016
    Posts:
    4
    That's what I was hoping. However, if Rove is replacing the default deferred renderer, what buffers are going to be available to the post-processing pipeline? I would expect an HDR colour buffer, but since depth, world normals, and world position aren't needed for RT, I wouldn't expect those to be available for post-processing by default?
     
  18. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    I can't promise it for the initial beta release, but providing those buffers would be trivial, as depth/normal/position are all immediately accessible from the primary rays; it would just be a matter of rendering them to extra buffer textures that could be accessed through script.

    That's been added to the to-do list.
     
  19. TeraBitSoftware

    TeraBitSoftware

    Joined:
    Nov 28, 2016
    Posts:
    4
    Sounds good. Being able to enable those optional buffers will likely make the system more flexible overall.

    Something else you might consider adding at some point, which could open up some interesting use cases, is the concept of a ray "transformation material": a material that, rather than affect the shading of a ray, simply transforms its position and rotation by a specified amount. I was planning to use this for 'Portal'-like effects, where a ray hits one wall and comes out of another (interestingly, with this you could actually get light pouring out of a portal), but it could also be used for effects like cloaking fields, etc.
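
    In Unity terms, the transform step would be something like this (an illustrative sketch, not an existing Rove feature):

    using UnityEngine;

    public class PortalSurface
    {
        public Transform From; // the wall the ray enters
        public Transform To;   // the wall the ray exits

        // Re-express the ray in the paired portal's frame and continue
        // tracing from there. Because light paths follow the same rule,
        // light really would pour out of the portal.
        public void TransformRay(ref Vector3 origin, ref Vector3 direction, Vector3 hitPoint)
        {
            Vector3 localPos = From.InverseTransformPoint(hitPoint);
            Vector3 localDir = From.InverseTransformDirection(direction);
            origin = To.TransformPoint(localPos);
            direction = To.TransformDirection(localDir);
            origin += direction * 1e-4f; // nudge off the exit surface
        }
    }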

    Regards,
    TeraBit Software
     
  20. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    Tera - that would be really interesting! I can imagine a cool game based on light bending.

    Just bumping here as the release date has been announced.

    Rove3D beta will be released December 13 for $349, available first on https://www.rove3d.com
     
  21. arapps3d

    arapps3d

    Joined:
    Jan 13, 2015
    Posts:
    19
    This looks very interesting. Unity with realtime pathtracing is like a dream come true... keep up the good work. Really looking forward to it.
     
  22. AlanMattano

    AlanMattano

    Joined:
    Aug 22, 2013
    Posts:
    1,501
    Now I can start dreaming of my photorealistic game!
     
  23. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    Release is next Tuesday; finishing up some last-minute features before the beta. Here are texture emitters:

     
  24. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    Alpha transparency/translucency added.

     
  25. arapps3d

    arapps3d

    Joined:
    Jan 13, 2015
    Posts:
    19
    I don't know if anybody has already suggested this, but it would be awesome to have a free, watermarked test version...
     
    Last edited: Dec 11, 2016
    kenshin likes this.
  26. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    @arapps3d - That is planned, and will be available soon after the beta release. I know it's important for users to test with their hardware, so it'll be one of the first things I get to once everything stabilizes after the initial release.
     
  27. arapps3d

    arapps3d

    Joined:
    Jan 13, 2015
    Posts:
    19
    Fantastic, thank you!
     
  28. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    Imperfections are essential for making realistic renders; roughness maps for coating and base layer:



     
    Jingle-Fett and blueivy like this.
  29. blueivy

    blueivy

    Joined:
    Mar 4, 2013
    Posts:
    630
    This all looks beautiful! Do you have any human models to play around with in Rove, so we can see how hair, skin, and eyes look?
     
  30. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    No human renders yet; we need to finish our subsurface-scattering material for that, which we will do during the beta phase.
     
  31. arapps3d

    arapps3d

    Joined:
    Jan 13, 2015
    Posts:
    19
    One more question: if I used the beta to build a standalone Windows .exe as a VR project for the Rift, would that work?
     
  32. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    @arapps3d - Not at the moment. I'm getting a Vive for Xmas and will be adding VR support soon after, but it will not be included in the initial release.
     
  33. contempt

    contempt

    Joined:
    Jul 31, 2012
    Posts:
    88
    That's great news because photorealistic VR is what interests me most about Rove3D's potential.
     
  34. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
  35. PlugAndPlay

    PlugAndPlay

    Joined:
    Sep 13, 2013
    Posts:
    6
    Is there a demo available?
    At that price point I want to be sure that the asset meets my expectations before I buy.

    I'm particularly interested in how much my regular workflow would have to be adjusted to get the promised results.
     
  36. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    We will have a watermarked demo version available within the next week; keep an eye on my Twitter.
     
    Detniess and arapps3d like this.
  37. fgnbr

    fgnbr

    Joined:
    Nov 5, 2016
    Posts:
    6
    vivi90 likes this.
  38. fgnbr

    fgnbr

    Joined:
    Nov 5, 2016
    Posts:
    6
    Adaptive Rendering Based on Weighted Local Regression
    Monte Carlo ray tracing is considered one of the most effective techniques for rendering photo-realistic imagery, but it requires a large number of ray samples to produce converged or even visually pleasing images. We develop a novel image-plane adaptive sampling and reconstruction method based on local regression theory. A novel local space estimation process is proposed for employing the local regression, by robustly addressing noisy high-dimensional features. Given the local regression on the estimated local space, we provide a novel two-step optimization process for selecting feature bandwidths locally in a data-driven way. Local weighted regression is then applied using the computed bandwidths to produce a smooth image reconstruction with well-preserved details. We derive an error analysis to guide our adaptive sampling process in the local space. We demonstrate that our method produces more accurate and visually pleasing results than the state-of-the-art techniques across a wide range of rendering effects. Our method also allows users to supply an arbitrary set of features, including noisy ones, and robustly computes a subset of them by ignoring noisy features and decorrelating them for higher quality.

    Source code and project page: http://sglab.kaist.ac.kr/WLR/

    Video:
     
  39. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    Thanks for the links; I have seen some of that research. It's important to focus on what can be done in real time, as that is the goal, so something like temporal AA can help quite a bit, but other methods take a significant amount of time to render. Rove works with Unity post-processing, so we'll have to see as a community what kinds of filters work best.
     
  40. SteveB

    SteveB

    Joined:
    Jan 17, 2009
    Posts:
    1,451
    Okay, I absolutely love the look, but I'm not certain I understand how this is expected to be used, if not just for (gorgeous) screenshots. Realtime environment lighting for games? Is baking even a reasonable option?

    Again looks stunning and well worth the price, as long as I can use it! :D

    -Steven
     
  41. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    Our long-term goal is games, but in the short term it's most useful for applications like interactive architectural rendering, where a little initial noise isn't an issue. Unity is also being used by animation studios these days, so this gives them a higher-quality option. Beyond that, I think there are a lot of game developers who have no experience with complex offline rendering tools, and I'd like to enable them to create great renders, whether for promotional purposes or just for fun.
     
  42. SteveB

    SteveB

    Joined:
    Jan 17, 2009
    Posts:
    1,451
    Makes perfect sense!

    Again looks fantastic (and I have a ton of offline rendering under the belt...)

    Cheers man :D
     
  43. Pode

    Pode

    Joined:
    Nov 13, 2013
    Posts:
    145
    The problem with that kind of nice Shadertoy demo is that you have no way to feed model (mesh) information to the fragment shader.
    It's very nice as a POC, but not very useful in a production setting.
     
  44. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    The Shadertoy demos are cool, but they use very basic primitives with very basic materials, and no acceleration structure. If you were to put a few thousand polygons or a complex material in those scenes, they would fall apart.

    They are fantastic for educational purposes and for getting started, though.
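
    To give a sense of what an acceleration structure buys you: without one, every ray tests every triangle, which is O(n) per ray. A BVH lets a ray skip whole subtrees whose bounding boxes it misses. A toy traversal (illustrative only, nothing like our optimized native version):

    using UnityEngine;

    public class BvhNode
    {
        public Bounds Box;            // bounds of everything below this node
        public BvhNode Left, Right;   // children (null at a leaf)
        public int[] TriangleIndices; // triangles stored at a leaf

        public void Traverse(Ray ray, System.Action<int> testTriangle)
        {
            if (!Box.IntersectRay(ray)) return; // prune the whole subtree
            if (TriangleIndices != null)
            {
                foreach (int tri in TriangleIndices)
                    testTriangle(tri);          // leaf: test its few triangles
            }
            else
            {
                Left.Traverse(ray, testTriangle);
                Right.Traverse(ray, testTriangle);
            }
        }
    }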
     
  45. Berenger

    Berenger

    Joined:
    Jul 11, 2012
    Posts:
    78
    I would really like to get into the beta, but I have to convince my management first, so could you confirm that Rove3D would work in my situation?
    We develop a kitchen builder where the user can customize everything: start with a blank scene, create walls, add doors/windows, add furniture, etc. We'd like to add the option to switch to your renderer once a project is finished. Here is my question: can we set up the scene at that point through script?

    1. Add the Rove API component
    2. Scan the scene to create equivalent Rove3D materials
    3. Add the Rove object components to everything
    4. Set up the lights somehow from Unity's point/directional lights
    And one last question: what is the minimum version of Unity required?
     
    Last edited: Dec 21, 2016
  46. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    Yes, the scripts provided should give you a good starting point for setting up the scene dynamically with your own code. All of the functions for changing materials, importing new meshes, and creating emitting meshes are exposed in the RoveAPI.cs script and are straightforward (i.e. ImportMesh(...), SetMaterialColor(...), etc.).

    If you need any additional help, you can contact me through e-mail.

    Rove was tested on version 5.x but should work with prior versions, as nearly all of the functionality lives in the native C++ plugin.
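
    For example, a setup pass could look roughly like this. ImportMesh and SetMaterialColor are the real names from RoveAPI.cs, but their parameters here are guesses, and CreateEmitter is a placeholder for whatever the emitter call ends up being; check the actual script for the signatures:

    using UnityEngine;

    public class KitchenExporter : MonoBehaviour
    {
        void SwitchToRove()
        {
            // Steps 2-3: import each mesh and mirror its Unity material.
            foreach (MeshFilter mf in FindObjectsOfType<MeshFilter>())
            {
                var rend = mf.GetComponent<Renderer>();
                RoveAPI.ImportMesh(mf.sharedMesh, mf.transform);                    // hypothetical signature
                RoveAPI.SetMaterialColor(mf.sharedMesh, rend.sharedMaterial.color); // hypothetical signature
            }

            // Step 4: Rove lights are emissive meshes, so each Unity point
            // light would become a small emitting mesh (placeholder call).
            foreach (Light light in FindObjectsOfType<Light>())
                RoveAPI.CreateEmitter(light.transform.position, light.color, light.intensity);
        }
    }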
     
  47. Berenger

    Berenger

    Joined:
    Jul 11, 2012
    Posts:
    78
    Awesome! I couldn't get management to greenlight the expense though, so I'll have to wait for the free watermarked version, impress them, and then buy it.
     
  48. unisip

    unisip

    Joined:
    Sep 15, 2010
    Posts:
    340
    I've been following the forum for a while, and Rove3D seems almost like a no-brainer to me. Considering the $350 price tag, though, I'd really like to look at the watermarked demo just to make sure. I believe quite a few people will want to experience the rendering on their machines before purchasing. Other than that, this all looks very cool!
     
  49. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    @Berenger @unisip - I totally understand. I'm going on holiday for Xmas and plan to release a watermarked demo version in the week before New Year's.
     
  50. rove3d

    rove3d

    Joined:
    Sep 12, 2016
    Posts:
    89
    Completed it early: a free watermarked demo version is available from the website (scroll to the bottom), along with a direct link to purchase the beta version: https://www.rove3d.com

    Since the Rove beta is now released along with a demo version, I'll be making the [Released] thread soon to replace this one. Cheers, and happy holidays!