
[Released] MegaSplat, a 256 texture splat mapping system..

Discussion in 'Assets and Asset Store' started by jbooth, Nov 16, 2016.

  1. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    The update 1 video is now available, covering the changes in the first patch (which is being uploaded to the Asset Store now).

     
    Seneral likes this.
  2. Mark_T

    Mark_T

    Joined:
    Apr 25, 2011
    Posts:
    303
    Thanks for the update. It looks really cool.
    One question about the optimization you mentioned in the video, where the macro texture is faded over the splats in distance mode: is this optimization done inside the texture array or in the shader? If I build my own shader with Amplify Shader Editor, will I get the performance optimization?
    I'm also curious to see how the new per-texture scaling works.
    Thanks!
     
  3. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    It's done in the shader, so you'd need to do the same optimization inside ASE if you're building your own shader. However, I don't think ASE supports forced conditional branching yet? And even if it did, you'd need the ability to use gradient samplers for anything inside of the branch, which is also something ASE doesn't support yet. (And technically, Unity doesn't support them for texture arrays yet either; for MegaSplat, I had to add them myself).
     
  4. RandAlThor

    RandAlThor

    Joined:
    Dec 2, 2007
    Posts:
    1,291
    Just saw your new video and have a question.
    Can I now make a shader with Amplify Shader and your asset that can also animate between wet and dry land textures or settings (not only useful for land textures)? Or can I already do this with your asset alone, like the asset from forst ( https://www.assetstore.unity3d.com/en/#!/content/74897 )?
    This would be a nice feature, and it would be nice if it could be added.

    I like the features that are in it very much, and now I can also use splatmaps.
    :)
     
  5. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    Since you have 256 textures available, you could just have a dry/wet version of your texture and paint the wet version where you want it wet. You could combine that with flow and refraction to produce the effect of water moving over the surface as well, causing the texture on the bottom layer to distort. The included shaders will do all of this; the amplify integration makes it possible to do a lot more though; such as doing this without having separate textures, perhaps painting a wet mask over your level into a vertex channel.
     
  6. Deleted User

    Deleted User

    Guest

    Sounds great. Sorry if it's a silly question, but I've seen asset store systems that have issues with this before: are there any problems with replacing the splat map exported from WM via a script, for example, and then painting layers on top of it (without any odd behaviour)? Also, I take it you can select which layers use parallax, and that they are automatically/manually faded based on the camera's world position for performance purposes?
     
  7. RandAlThor

    RandAlThor

    Joined:
    Dec 2, 2007
    Posts:
    1,291
    Thank you jbooth for the fast answer.
    I do understand that I can use some textures for wet and some for dry things.
    What I mean is, like that pack from forst, to animate from one to the other, and I hope that this can be done with yours too in the future.

    For example, I walk through a dry place and then it begins to rain. Now the textures blend from dry to wet and it looks more real. After a while it then blends back from wet to dry once the rain has stopped. I know this is not for everyone, but with all these PBR shaders now it looks really nice.

    There is a video on the asset from forst that shows this better than I can describe, and with more features, but I'm hoping for just an animated wet/dry blending for your asset, if it does not slow it down too much. Maybe as an alternative shader in the future?

    It would help my project come more to life, and is just a feature wish for the (hopefully not too distant) future.
     
  8. Kaneleka

    Kaneleka

    Joined:
    Sep 23, 2013
    Posts:
    18
    Using the technique you've used in your shader, is it possible to remove the apparent seam along tile edges? Wondering if just duplicating the vertex values along adjacent edges would work here...
     
  9. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    The latest submitted patch has a tool which allows you to set up a mapping from a color in an image to a brush to apply to a mesh, so you can take an image output from your preferred landscape generation system and apply texture choices based on this mapping. I don't read in traditional splat maps, where weights are stored in R/G/B/A channels, directly, because MegaSplat doesn't work the same way as those systems, so that data isn't directly compatible.

    I see. I don't have specific support for that effect, though you could do it by animating the vertex data on your meshes. You could also roll this type of shader in Amplify Shader Editor using the MegaSplat nodes. One of the main reasons for adding ASE support was so people could get the benefits of MegaSplat's 256-texture splat map blending, but in whatever custom shader they'd like.

    Seam? You mean if you have textures that don't tile naturally?
     
  10. Kaneleka

    Kaneleka

    Joined:
    Sep 23, 2013
    Posts:
    18
    See center frame @ 10:17 in your latest video at the top of the page. It's not very noticeable during a fly over, but may become an annoyance in an FPS style game with many small tiles.
     
  11. Deleted User

    Deleted User

    Guest

    Not sure I follow you. When you say "mesh", are you referring to Unity's terrain "mesh" once you've imported the heightmap? When you say you'd "map a color to an image brush", do you mean: export one of the colour layers of the splat or per-layer weightmap data, import it to a texture set in your tool, and it will automatically fill the entire terrain with that texture layer, applied to the correct height/gradient? Then you rinse/repeat one layer at a time until you have blended all the splat/weightmap height data?

    Generally you're going to have multiple tiles; you're not going to paint it all manually. The whole point of RGBA splat (Unity) / weightmap (Unreal Engine) systems is to apply a base blended overlay over your entire terrain setup, because it lines up with the terrain generation data, it doesn't look odd, and it won't take you months to paint hundreds of tiles.

    The rest is extremely trivial: macro variation in most major systems tends to be detail textures with a UV scaling offset, and then it's just a matter of adding noise masks and camera world-position offsets to break up any tiling issues. You can literally open UE and copy their grass example to do it.

    Sorry if I'm missing something here, but I do seem to be missing the point a little, I believe?
     
    Last edited by a moderator: Nov 28, 2016
  12. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    Oh, that, yeah, that's just a bad split in the source mesh that I didn't line up properly. That isn't something intrinsic to the shader.

    No, I mean mesh. Unity's terrain system is a closed box, and you can't really use this technique with it unless they open that box up and give developers lower level access to the data.

    Yes, I think you haven't quite grokked the difference between traditional splat map techniques and MegaSplat yet. If you watch the video posted above, I give an example of using a tool to quickly fill the landscape with textures, similar to how you would with a splatmap image. I don't use traditional splatmap images because they only support 4 textures blended by 4 weight values. MegaSplat supports 256 textures per layer (one texture per vertex per layer), with a single blend value between 2 layers, so the data doesn't really map 1:1 with a traditional splat map. Instead, I allow you to map a color value (RGB) to a brush (a collection of textures which can be chosen based on noise, angle, or height, and can paint into either layer or both layers at once).

    So you map green to your grass brush, which might blend several moss textures together on one layer and some rock textures on another layer, then blend those together with a noise function. Then you map yellow to your sand brush, which puts desert sand on flat terrain and sand with rocks on slopes. Etc, etc. You could easily create dozens of mappings, which map to any number of brushes, which combine textures from your 256 possible textures into a massive combination of surfaces.

    The big difference here is driven by the fact that the technique is fundamentally different; we want to take advantage of all the textures we have available, and not be limited to just 4 textures. To do this, the toolset and workflow ends up fundamentally different as well.
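    To make that mapping idea concrete, here's a rough C# sketch of what a color-to-brush lookup could look like. This is purely illustrative: the brush rules, texture indices, and class names are made up, and this is not MegaSplat's actual API.
    Code (csharp):
    using UnityEngine;
    using System.Collections.Generic;

    // Hypothetical example: map a guide-image color to a "brush" that picks one of the
    // 256 texture indices per vertex, based on noise or slope.
    public class ColorBrushMapExample : MonoBehaviour
    {
        public delegate int Brush(Vector3 worldPos, Vector3 normal);

        readonly Dictionary<Color32, Brush> brushes = new Dictionary<Color32, Brush>();

        void Awake()
        {
            // Green -> grass brush: blend two moss textures with noise (indices are made up).
            brushes[new Color32(0, 255, 0, 255)] = (pos, n) =>
                Mathf.PerlinNoise(pos.x * 0.1f, pos.z * 0.1f) > 0.5f ? 12 : 13;

            // Yellow -> sand brush: flat ground gets plain sand, slopes get rocky sand.
            brushes[new Color32(255, 255, 0, 255)] = (pos, n) =>
                Vector3.Dot(n, Vector3.up) > 0.8f ? 40 : 41;
        }

        public int PickTextureIndex(Color32 guideColor, Vector3 worldPos, Vector3 normal)
        {
            Brush brush;
            return brushes.TryGetValue(guideColor, out brush) ? brush(worldPos, normal) : 0;
        }
    }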
     
  13. Deleted User

    Deleted User

    Guest

    No, I do get you; as I said, you can do something extremely similar in Unreal (thereabouts) and it's not something unheard of. UE doesn't use splats, it's blended weightmap data. Also you can single-pass a metric ton of textures via samplers (shared/wrapped) with parallax etc... In all fairness UE's system is a million times better than Unity's terrain system; let's hope they open that box or push it off a cliff.

    I just really wanted to know if it'd work directly with Unity's terrain system, and I have my answer, so thanks.
     
  14. Enoch

    Enoch

    Joined:
    Mar 19, 2013
    Posts:
    198
    I finally got around to getting this and I am in the process of converting my voxel engine to support it.

    I have run into an issue with the example though. I created a single layer material (MegaSplat_MacroDetailFlow), set it to Triplanar and MSEO, enabled per-texture properties, and mapped all of the proper texture arrays (albedo, normal and MSEO). However, whenever I use the Texture Settings Editor and change uvScale, it only seems to change the texture index. Note the preview image doesn't change, but the resulting image on the terrain changes. It's almost like you can use uvScale to map an index to a completely different index, but I doubt that is what was intended. I can change the global Texture Scale just fine; it's just the per-texture uvScale that acts weird. Any ideas?
     
  15. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    That's really strange, and I just tested this on the current version and it seems to be working fine. I can double check in the asset store version when I get home tonight as well. Also, I believe Reiika has this working with a voxel system and per texture scale?
     
  16. Enoch

    Enoch

    Joined:
    Mar 19, 2013
    Posts:
    198
    Hmm, maybe I have an old version, but I think in TextureArraySplat.shader on line 366
    you had this:
    Code (csharp):
    #if _TRIPLANAR
        tpuv0_x *= uvScale.x;
        tpuv0_y *= uvScale.x;
        tpuv0_z *= uvScale.x;
        tpuv1_x *= uvScale.y;
        tpuv1_y *= uvScale.y;
        tpuv1_z *= uvScale.y;
        tpuv2_x *= uvScale.z;
        tpuv2_y *= uvScale.z;
        tpuv2_z *= uvScale.z;
    and I think this is what you wanted:
    Code (csharp):
    #if _TRIPLANAR
        tpuv0_x.xy *= uvScale.x;
        tpuv0_y.xy *= uvScale.x;
        tpuv0_z.xy *= uvScale.x;
        tpuv1_x.xy *= uvScale.y;
        tpuv1_y.xy *= uvScale.y;
        tpuv1_z.xy *= uvScale.y;
        tpuv2_x.xy *= uvScale.z;
        tpuv2_y.xy *= uvScale.z;
        tpuv2_z.xy *= uvScale.z;
    I changed the code and it seems to work. I assumed that the .z component is the texture index in the texture array and you probably don't want to scale that. But I am not the expert, you are :).
     
    jbooth likes this.
  17. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    Yes, thanks! Must have missed testing it in triplanar mode - Time to set up a regression testing scene for all the shader variations..
     
  18. pauldrummond

    pauldrummond

    Joined:
    Oct 30, 2014
    Posts:
    145
    I just downloaded the asset but can't use it because Unity hangs when importing Assets/MegaSplat/Shaders/TextureArraySplatLayered.shader. I've tried re-importing the file, deleting it and importing it again etc. but nothing works. Can you help?
    I'm running the latest Unity 5.4 on macOS 10.12.1 with an Nvidia Titan X card.
    Thanks.
     
  19. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    Wait longer. It takes several minutes to compile that shader (it has a lot of options, and each option basically generates 2x the number of shaders, so you quickly end up with hundreds of variants that need to be compiled)..
     
  20. pauldrummond

    pauldrummond

    Joined:
    Oct 30, 2014
    Posts:
    145
    It just completed now. I'll be more patient next time. While I was waiting I read through the documentation. Have I got it right that I'll need to bake my terrains and objects once I'm finished texturing if I want to mark them as static for light mapping?
     
  21. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    Yup. This will write out a mesh with everything baked into it, but I'm hoping to automate this workflow soon. Basically, have a step where it generates all the meshes for you, replacing the scene's meshes with them and removing the VertexStreamComponents automatically. You can then insert this into your build or baking pipeline and never think about it again. That said, I was hoping Unity would ship their new feature which allows you to store assets in scenes, because then I wouldn't need to write them out to disk at all; but who knows when that will ship.
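    As a rough illustration of what such an automated bake step might do, here's a hedged editor sketch: it copies the painted additionalVertexStreams data onto a duplicate of the source mesh, saves it as an asset, and points the MeshFilter at the baked copy. The asset path, menu name, and the assumption that the paint data lives in vertex colors and UV channel 3 are all made up; this is not the shipped tool.
    Code (csharp):
    #if UNITY_EDITOR
    using UnityEngine;
    using UnityEditor;
    using System.Collections.Generic;

    public static class BakeVertexStreamExample
    {
        [MenuItem("Tools/Bake Painted Mesh (sketch)")]
        static void Bake()
        {
            var go = Selection.activeGameObject;
            if (go == null) return;
            var mr = go.GetComponent<MeshRenderer>();
            var mf = go.GetComponent<MeshFilter>();
            if (mr == null || mf == null || mr.additionalVertexStreams == null) return;

            // Duplicate the source mesh and overwrite the painted channels from the stream mesh.
            Mesh baked = Object.Instantiate(mf.sharedMesh);
            Mesh stream = mr.additionalVertexStreams;
            if (stream.colors != null && stream.colors.Length == baked.vertexCount)
                baked.colors = stream.colors;
            var uv4 = new List<Vector4>();
            stream.GetUVs(3, uv4);
            if (uv4.Count == baked.vertexCount)
                baked.SetUVs(3, uv4);

            AssetDatabase.CreateAsset(baked, "Assets/BakedPaintedMesh.asset");
            mf.sharedMesh = baked;
            mr.additionalVertexStreams = null;
            // A real pipeline would also remove the painter's vertex-stream component here.
        }
    }
    #endif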
     
  22. pauldrummond

    pauldrummond

    Joined:
    Oct 30, 2014
    Posts:
    145
    No problem. I can manually bake assets for now. Looking forward to using MegaSplat on my project.
     
  23. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    So, last night, I managed to get this stuff working on standard Unity terrains. Basically, instead of the mesh processor, I do this work inside a geometry shader. This is pretty nice, but bumps the minimum requirement on the shaders up a bit. (They'd be separate shaders, so no worries.)

    This leaves some open questions around tooling, since you can't use the Vertex Painter to paint the data, and Unity's terrain tools expect a totally different format and don't support hundreds of textures. So I haven't figured out exactly what to do, but I'm sure I can figure something out.

    That said, this would make integration with packages like MapMagic and other systems which only use the standard terrain possible, which is exciting.

    There's also the possibility to ditch the mesh preprocessor entirely. That would certainly be more friendly, but invoking the geometry shader has a heavier runtime cost than the preprocessed meshes do, not to mention the increased system requirements. So with that in mind, I think it's better to preprocess the meshes except in the cases where you can't (terrain). Voxel engines and things with dynamic topology might want to use the geometry shader though.

    After this first patch is approved, I'd love to get some feedback from the people using this package about what would be the most useful things to work on (besides bugs, of course). Some of the things include:

    1. Terrain Support
    2. Auto-Bake out utilities for easier lightmapping/static batching support
    3. Rendering textures from the painted meshes, to use as macro textures in the distance
    4. Integrations with existing landscape generation systems
    5. More shaders or effects, etc..

    Thoughts?
     
  24. Mark_T

    Mark_T

    Joined:
    Apr 25, 2011
    Posts:
    303
    I watched the last video again. The new "Splat from ColorMap" is really nice. Very nice. While you were setting the textures for each color on the map, I was thinking: how cool would it be to have a spawner to instantiate different objects distributed on the terrain/mesh using the very same "Spawn from ColorMap" approach. Trees, boulders, houses, NPCs, you name it. Any prefab you want. And scale them according to the alpha channel of the ColorMap: no alpha, everything is 100%, size-wise. I suppose (and I might be wrong here) that part of the code is already there. You would just have to finish and release a new tool/asset. I would love to use such a tool.
     
  25. jc_lvngstn

    jc_lvngstn

    Joined:
    Jul 19, 2006
    Posts:
    1,508
    jbooth, as far as supporting Unity terrains, one option might be to support going back and forth between the two...sorta. I'm not sure how well creating a megasplat mesh from a Unity terrain would work, but going from a megasplat mesh to a Unity mesh may be doable.
    I know it wouldn't really support realtime updates, but it would work for those who just need static terrains.

    Just tossing that out there...you know your tools better than I :)
     
  26. TechDeveloper

    TechDeveloper

    Joined:
    Sep 5, 2016
    Posts:
    75
    Hello.

    Does this support the metalness workflow?
    I have downloaded many Megascans textures (metalness) and I don't want to have to download the other versions.
     
  27. Mark_T

    Mark_T

    Joined:
    Apr 25, 2011
    Posts:
    303
    "Rendering textures from the painted meshes, to use as macro textures in the distance" would be my first option. It would be very useful to bake all the channels(albedo, normal, etc?), including the heightmap placed in the alpha channel of the painted textures. I was thinking paint with the double layer, render/bake and save the macro-texture with the name of the mesh, something like (mesh01_albedo, Mesh01_normal, etc.) in a chosen folder and automap the freshly baked/rendered textures with the new shader, bake a copy of the mesh, remap/retexture and voila, you have an optimized clone of what you painted. A lot of manual and tedious manual work solved auto magically.
    Just as an innocent question: would it be possible to use fx/postprocessing on the render/bake? Or even 3rd party assets? It would be powerful this way.

    Auto-Bake out utilities for easier lightmapping/static batching support." would be my second choice.
     
  28. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    It only supports metallic workflow right now, so you're all set..
     
  29. pauldrummond

    pauldrummond

    Joined:
    Oct 30, 2014
    Posts:
    145
    Several of my diffuse textures actually use the same detail texture. When creating the texture arrays I can simply add the same detail tex to different slots so everything matches. Would the system spot that some of these are actually the same, and would there be any benefit, or should I just make unique detail textures for everything?

     
  30. ChinChiaYeh

    ChinChiaYeh

    Joined:
    Sep 14, 2016
    Posts:
    23
    Hi,
    Thanks for the great tool. I have some trouble with it.

    1. I use Vertex Painter Pro to deform my terrain, but the meshes become broken when I paint on them.

    2. The Macro MSEO texture doesn't seem to work.

    3. The macro layered Overlay blend mode looks a little strange. Colors become too white when blended with a lighter texture. If the Overlay effect were the same as Photoshop's, it would be perfect.
     
  31. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    There's no benefit to sharing them, as they'll end up duplicated in the array.

    I'll take a look at that soon - I haven't tested the deformation tools in quite some time; they may have gotten broken at some point.

    Which shader are you using? and what settings?

    Are you rendering in Linear or Gamma space? Because in Gamma space, which is Unity's default, the center point of an Overlay blend is not 128, 128, 128, but rather something around 187 or 85 (depending on gamma curve and direction).

    See: http://filmicgames.com/archives/299

    I could do this conversion in the shader, but it would make that blend mode more expensive when running in Gamma, though it would be a closer match to what people expect.
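    For reference, a minimal sketch of a standard Overlay blend and the optional gamma-to-linear conversion being discussed. This is only to illustrate where the neutral point sits; it is not the MegaSplat shader code.
    Code (csharp):
    using UnityEngine;

    public static class OverlayBlendExample
    {
        // Standard Overlay: neutral where the overlay value is 0.5 (per channel).
        public static float Overlay(float baseVal, float overlayVal)
        {
            return baseVal < 0.5f
                ? 2f * baseVal * overlayVal
                : 1f - 2f * (1f - baseVal) * (1f - overlayVal);
        }

        // The "do the conversion in the shader" option: decode the gamma-encoded overlay
        // sample to linear before blending, at the cost of an extra pow() per sample.
        public static float OverlayLinearized(float baseVal, float overlayGamma)
        {
            return Overlay(baseVal, Mathf.GammaToLinearSpace(overlayGamma));
        }
    }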
     
  32. wdw8903

    wdw8903

    Joined:
    Apr 2, 2015
    Posts:
    48
    Hello, glad to see you release a new plugin.
    I'm currently using your vertex painter, and the mesh gets screwed up when I update the mesh (add some edges in Maya) while it has the vertex stream script. I have to bake the mesh in Unity first, then use another script to export the mesh back to Maya. This workflow is not so convenient.
    Does MegaSplat need the same workflow for updating meshes? Or do you have a better solution?
     
  33. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    Same workflow. There's not really a great way to restore data when you fundamentally change a mesh by adding/removing vertices. I've seen people try to do this by mapping to the nearest point, etc., and that can work, but it makes strong assumptions about the type of information being painted. For instance, if you interpolate vertex colors to populate the new vertices then it's likely fine, but if that information is a texture index or a pivot point, then the interpolated value would be totally wrong. So at best, something could be done to make some of the cases work.
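    A minimal sketch of that nearest-point idea, assuming the painted data lives in vertex colors (illustrative only, not a MegaSplat feature): for each vertex of the edited mesh, copy the value from the closest vertex of the original. Copying rather than interpolating keeps packed values like texture indices intact, but it's still only a best-effort transfer.
    Code (csharp):
    using UnityEngine;

    public static class NearestVertexTransferExample
    {
        public static Color[] Transfer(Mesh oldMesh, Color[] oldColors, Mesh newMesh)
        {
            Vector3[] oldVerts = oldMesh.vertices;
            Vector3[] newVerts = newMesh.vertices;
            var result = new Color[newVerts.Length];

            for (int i = 0; i < newVerts.Length; i++)
            {
                int best = 0;
                float bestDist = float.MaxValue;
                for (int j = 0; j < oldVerts.Length; j++)   // brute force; fine for a sketch
                {
                    float d = (newVerts[i] - oldVerts[j]).sqrMagnitude;
                    if (d < bestDist) { bestDist = d; best = j; }
                }
                result[i] = oldColors[best];   // copy, don't interpolate: an index isn't blendable
            }
            return result;
        }
    }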
     
  34. Enoch

    Enoch

    Joined:
    Mar 19, 2013
    Posts:
    198
    I have some questions about using procedural textures in a TextureArray. The only way to use a substance, it seems, is to export all the generated textures to files and then have Unity load them as regular textures. Is this correct? I would like to be able to change some of the procedural properties at runtime, but I can't seem to come up with a workflow for this. Any ideas?

    Even if we don't have runtime support, is there a more efficient way to get procedural textures (substances) into the texture array even within the editor?

    The main issue I see is that TextureArrayConfig uses Texture2D for its sourceTextures as opposed to just Texture. ProceduralTexture inherits from Texture but not Texture2D and ProceduralMaterial.GetGeneratedTexture returns a ProceduralTexture. It looks like in TextureArrayConfigEditor line 61 you do a Graphics.CopyTexture and this will take a Texture as opposed to the more specific Texture2D. Is there any way we could change all Texture2D references to simply Texture references?

    Also, looking at your code, it looks like I could, if I could store a reference to the full Texture2DArray, actually do the Graphics.CopyTexture myself at runtime after any change to a procedural texture is complete (then do a Texture2DArray.Apply). It might be a little tricky, but I think it's possible; this might solve the issue of procedural textures that change at runtime. Any thoughts?

    Thanks for the help.
     
  35. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    Well, texture arrays are a hardware feature and only work with actual textures. You could, however, render out a substance at runtime and construct a texture array with it - but then you wouldn't get texture compression, so that's going to take a ton of memory if the substances are of reasonable size.

    If by efficient you mean workflow, then you could make the TextureArrayConfig take a list of substances and bake them out to textures automatically whenever the array needs to be rebuilt. This would make updating easier, but would be no different at runtime.

    Maybe? But at the end of the day, it all has to map to a Texture2D as that's what the hardware uses. Otherwise you could stuff a cube map or 3d texture in there..

    I think it could all be done, but remember that it means that the entire array would be uncompressed in memory, as every element in the array needs to be in the same format/size/etc. If you're only running on DX11 hardware, you could potentially integrate a GPU encoder for compression and do the compression at runtime as well, but that's deep into uncharted territory...
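    For illustration, a rough sketch of the runtime option described above: build an uncompressed Texture2DArray and fill the slices with Graphics.CopyTexture. It assumes every source texture shares the same size, RGBA32 format, and mip count, which is exactly why the whole array ends up uncompressed in memory. Not shipped code.
    Code (csharp):
    using UnityEngine;

    public static class RuntimeArrayBuildExample
    {
        public static Texture2DArray Build(Texture2D[] slices, int size)
        {
            var array = new Texture2DArray(size, size, slices.Length, TextureFormat.RGBA32, true);
            for (int i = 0; i < slices.Length; i++)
            {
                // GPU-side copy; formats, dimensions, and mip counts must match.
                for (int mip = 0; mip < slices[i].mipmapCount; mip++)
                    Graphics.CopyTexture(slices[i], 0, mip, array, i, mip);
            }
            return array;
        }
    }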
     
    Enoch likes this.
  36. Enoch

    Enoch

    Joined:
    Mar 19, 2013
    Posts:
    198
    This is one of those situations where I really want some introspection into exactly what Graphics.CopyTexture actually does.
    I may try to see if I can make it support both procedural result textures and regular texture assets. The documentation for CopyTexture is now making me think I can, and without the save-out-to-file business for procedural textures (see below). That would be ideal.
    True, but Graphics.CopyTexture requires an element index for both src and dest. For single-index textures (Texture2D) this is always 0, but for the indexable ones you have to specify which face/index you're copying from or to.
    Yeah, this is what I am trying to avoid. If I am working with, say, 100 materials at 1024 I really need them to be compressed (that would be well over a gig of memory for just the 3 textures I need: albedo, normal and MSEO). But here is the thing: I am not convinced Graphics.CopyTexture isn't an all-GPU operation, so it might not need to be uncompressed. The documentation suggests that the textures remain compressed for the operation; why else would they require the formats to be the same? The documentation even goes as far as to say that non-block-based formats can't do partial copies, and that block-based compression formats need to copy in units based on the compression block size.

    This makes me think that what I want to do is possible: regenerate a texture at runtime compressed and copy that GPU texture data to somewhere else on the GPU (in this case to some index of the Texture2DArray). I am probably going to do some research on this and see if I can get it to work.
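    Something along those lines could look like the sketch below, assuming the regenerated texture and the array really were created in the same block-compressed format and size (hypothetical helper, untested). Graphics.CopyTexture stays on the GPU, so no readable CPU copy is needed.
    Code (csharp):
    using UnityEngine;

    public static class CompressedSliceCopyExample
    {
        public static void ReplaceSlice(Texture2DArray array, Texture2D newTexture, int sliceIndex)
        {
            // Formats, dimensions, and mip counts must match for CopyTexture to succeed.
            for (int mip = 0; mip < newTexture.mipmapCount; mip++)
                Graphics.CopyTexture(newTexture, 0, mip, array, sliceIndex, mip);
        }
    }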
     
  37. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    Yes, as long as your formats are the same, Graphics.CopyTexture will just copy the memory in there and it will all work. Do substance surfaces generated at runtime come out as compressed textures? If so, then you're golden.

    Let me know how this goes; it would be nice to support substances as a texture import format as well as Texture2D, even if just for the non-animating, editor only case of compiling a texture array.
     
  38. ChinChiaYeh

    ChinChiaYeh

    Joined:
    Sep 14, 2016
    Posts:
    23
    Meshes are broken after using the MegaSplat Mesh Converter.

    The terrain looks the same even after I removed the Macro MSEO texture.
    The Smoothness and Metallic sliders don't work either.


    I am rendering in Linear.
    This is the example where I blend the textures.
     
  39. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    Ok, I just tested this in Linear and Gamma rendering. The only way I could replicate the effect you're seeing was if I was rendering in linear mode but did not set the TextureArrayConfig "linear" checkbox to true. Can you make sure the checkbox on the TextureArrayConfig is also checked?

    Looking into the other issues now.
     
  40. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    Ok, looking at this there are a few things going on.

    First, the smoothness and metallic sliders are actually in the wrong section. They are the default values for smoothness/metallic across the texture array when none are provided via textures, or the per-texture properties. These should be moved into the proper section, and hidden when those textures are available. I can also add sliders for these on the macro texture level as well, so if you want a smoothness for the macro texture as a whole you can set it.

    The MSEO values for the macro texture are being blended in, essentially, normal blend mode at all times right now. If you switch your blend mode to normal and set the fade so you blend completely into the macro texture, you'll see it coming through. The biggest question is, perhaps, what does overlay/etc mean for specular values? I could switch these to do those blend modes, but it's unlikely that the values you have for the macro texture make sense outside of the blend normal mode.

    For instance, should the grass become metallic because the macro texture says so? Or should the smoothness be cranked up on a rock? How would you expect blending metallic and smoothness to work on a macro texture in these blend modes? I'll do a little research and see if anyone has done any work on this, but if you have a specific use case that would be very useful.
     
  41. ChinChiaYeh

    ChinChiaYeh

    Joined:
    Sep 14, 2016
    Posts:
    23
    I did not check Linear in the last example.
    When Linear is checked, the blend effect is great, but the colors become super white.

     
  42. jrhtcg

    jrhtcg

    Joined:
    Jul 13, 2013
    Posts:
    34
    As for me I use GAIA to create my terrains. To use your shader I have to export the terrain as a mesh, and then run the exported mesh through a splitter, and finally import that mesh into my project. GAIA also has a real nice procedural way to texture the terrain, but there is no way to integrate that with MegaSplat because GAIA currently only exports normal and splat maps, no color map. It would be nice if you could provide a tool to create a colormap from an existing terrain. Something like sample the terrain at every x,y, and generate a colormap from the sample.

    Another pain point is that right now, if I reimport the terrain after making changes in GAIA, I lose any paint jobs I did on the terrain (this is because I recreate the terrain GameObject after I import the mesh changes). Also, using a mesh terrain instead of Unity's terrain, occlusion baking is a must, and to use that you have to flag the mesh as static, which tends to disable the ability to make changes to the paint job.

    I am sure I can simplify some of these work flow issues, but I guess my point is using something like GAIA and MegaSplat together is a MegaPain atm! ;) It is probably worth the effort though, but anything to reduce the pain would be helpful.
     
  43. ChinChiaYeh

    ChinChiaYeh

    Joined:
    Sep 14, 2016
    Posts:
    23
    Maybe metallic/smoothness blending is not necessary for most terrains, but ambient occlusion is needed for detail.
     
  44. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    So, the patch currently waiting in the queue contains a tool to map colors in an image to brushes in the system. Once you set up a mapping, you can then effectively import a paint job from any third party app that will generate a guide or color image. This should make it much easier to take data from something like Gaia and map it to megasplat textures.

    I'm also working on getting everything working with Unity terrains and hope to have something for that in the next patch. (Painting terrains is another interesting problem, since you won't be able to use Unity's tools for that, but I want to solve the shader first).

    I've also been in touch with Adam who creates Gaia, and he's interested in getting a direct integration working, but has been swamped lately and hasn't had any time.

    Agreed, I'll switch that to use the blend modes in the next patch - I can also post the change here so you can replace it on your local copy.

    Try changing the mode to Splat's on Top; this controls if the blend is from macro->Splat or from Splat->Macro, with the 0 value being fully macro/splat. So in macro->Splat with 0% blend, you get 100% macro texture (and no splats). I think you want it the other way, where you get 100% splats at 0% and blended overlay at 100%.
     
  45. wdw8903

    wdw8903

    Joined:
    Apr 2, 2015
    Posts:
    48
    Is it possible to save the painting data into textures rather than into vertex colors/UVs? That way I would only need a mesh with proper UVs, and changing vertices or edges wouldn't affect the existing painting data.
     
  46. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    Good idea! I'm planning on writing a tool which bakes out the diffuse/spec/normal/etc. data into textures for use with macro texturing. When I do that, I can also have it bake out the indexing data, so you can project that back onto the vertices. This will have another side benefit, in that you can paint a terrain using the vertex painting tools, output these textures, then use them with the new Unity terrain shader.
     
  47. Mark_T

    Mark_T

    Joined:
    Apr 25, 2011
    Posts:
    303
    I realize that it might be beyond the intended scope of MegaSplat, but it would be extremely useful, convenient and powerful to render/bake the desired channels, use the standard MegaSplat shaders or point to a different custom shader (Amplify or not), map the baked objects based on naming-convention rules, and reload the new meshes in a new or an already existing/chosen scene (with particular lighting, maybe?) with the newly baked and mapped objects. Or even use the new 5.5 Look Dev window to check the painting in different lighting conditions. This way you might even have different painting stages saved in predetermined locations, and you can always roll back to a previous stage, just in case you're not happy with your final result.
     
  48. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    Can't you just save the thing as a new prefab when you're done painting it? All the data is stored in a component, not in the mesh itself, so you can save off as many copies or unique scenes as you want.
     
  49. Mark_T

    Mark_T

    Joined:
    Apr 25, 2011
    Posts:
    303
    Sorry, but I don't get it. Paint and bake. Save prefab. Paint again. Save prefab. The first prefab is going to be different than the second one?
    And I was also talking about remapping the bakes.
     
  50. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,445
    Yeah, you can make as many prefabs of the mesh or dupe them in the scene and paint across multiple instances of them at once. The data isn't written directly into the vertices (unless you bake the mesh out to disk as a new asset), but rather stored in a component on that instance and applied to the additionalVertexStream. The source mesh is never modified.

    Or are you saying paint one thing, then apply that paint job to a totally different mesh?
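    For anyone following along, here is a minimal sketch of the additionalVertexStreams mechanism described above. This is the general Unity API, not MegaSplat's component: per-instance paint data lives in a second mesh assigned to the renderer, so the shared source mesh asset is never modified.
    Code (csharp):
    using UnityEngine;

    [RequireComponent(typeof(MeshFilter), typeof(MeshRenderer))]
    public class VertexStreamExample : MonoBehaviour
    {
        Mesh stream;

        void OnEnable()
        {
            var mf = GetComponent<MeshFilter>();
            stream = new Mesh();
            stream.vertices = mf.sharedMesh.vertices;   // must match the source vertex count

            // Example paint data: tint every vertex of this instance red.
            var colors = new Color[mf.sharedMesh.vertexCount];
            for (int i = 0; i < colors.Length; i++) colors[i] = Color.red;
            stream.colors = colors;

            GetComponent<MeshRenderer>().additionalVertexStreams = stream;
        }

        void OnDisable()
        {
            GetComponent<MeshRenderer>().additionalVertexStreams = null;
            if (stream != null) Destroy(stream);
        }
    }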