Feedback or comments needed for texture formats

Discussion in 'General Graphics' started by mingwai, Nov 9, 2017.

  1. mingwai

    mingwai

    Small Graphics Potato Unity Technologies

    Joined:
    Jan 16, 2017
    Posts:
    52
    In Unity today we have TextureFormat and RenderTextureFormat, and we also have TextureImporterFormat. I'd like to understand how users work with these, so I have a few questions:

    1. What is your method of choosing a format to use?

    2. Do you struggle with choosing a correct format?
    Here "correct" means the format fits your needs exactly, is supported on your platforms, and so on.

    3. Does the current API help you make the decision? Do you rely on APIs such as SystemInfo.SupportsRenderTextureFormat?

    4. Any other difficulties you have when you work with texture formats?
     
  2. nat42

    nat42

    Joined:
    Jun 10, 2017
    Posts:
    353
    I'm very new to Unity, but looking at the linked APIs I was surprised that EAC formats are available, given that I wasn't seeing them in the editor's texture compression options. Not having much experience with Unity, I'm going to answer based on my gut...

    1. I would imagine that, these being enums, common formats would share the same underlying values, but I would stick with the enum appropriate to the use case (TextureFormat for a straight texture my shaders are going to sample from, RenderTextureFormat for a texture I intend to render to, and presumably TextureImporterFormat would be the formats the texture importer knows how to load?)

    2. It seems like Unity has made certain choices that force my hand not to target multiple fallback cases (i.e. multiple compression methods, linear and gamma). I would probably author maps / combine channels outside of Unity, then import them, apply texture compression, and see whether the result looks acceptable.

    3. For targeting, say, GLES 3.x, I'm probably just going to use what the spec says is more or less guaranteed (I believe Unity has a software decompressor anyway and falls back to uncompressed textures if a format is unsupported?), but as far as the API helping goes, I'd imagine it ensures I use the correct enum class (or at least makes me feel naughty for casting one to the other, possibly).

    4. There's a current thread somewhere here about someone finding that YUV support stopped working; I'll see if I can link it. EDIT: https://forum.unity.com/threads/textureformat-yuy2-21-is-not-supported-on-android-platform.503440/
     
    Last edited: Nov 9, 2017
    mingwai likes this.
  3. Kumo-Kairo

    Kumo-Kairo

    Joined:
    Sep 2, 2013
    Posts:
    343
    Regarding a render texture format:

    1. We choose the main render texture format based on the GPU vendor. NVIDIA Tegras generate a lot of banding and require a 32-bit format, while other vendors seem to be OK with the 565 format.

    2. It needs some hardcoding, but it's not too much of a struggle.

    3. We use SystemInfo.SupportsRenderTextureFormat just to make sure that the target device really supports 565, so we can fall back to 32 bits if it doesn't (see the sketch after this list).

    4. Not exactly related to texture formats, but it would be great to be able to change the "Use 32-bit Display Buffer" setting on the fly on the device, and not only before the Android build.
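    For reference, a minimal sketch of the kind of check described in point 3 (detecting banding-prone GPUs by name, as with "Tegra" here, is just an assumption for illustration):

        using UnityEngine;

        public static class RenderTargetFormatPicker
        {
            // Prefer RGB565 where it is supported and not banding-prone,
            // otherwise fall back to a 32-bit render texture format.
            public static RenderTextureFormat Pick()
            {
                bool bandingProne = SystemInfo.graphicsDeviceName.Contains("Tegra"); // assumed heuristic
                if (!bandingProne && SystemInfo.SupportsRenderTextureFormat(RenderTextureFormat.RGB565))
                    return RenderTextureFormat.RGB565;
                return RenderTextureFormat.ARGB32;
            }
        }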
     
    mingwai likes this.
  4. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    I use a lot of different formats to try to optimize texture use. I put a lot of data into textures for shader work, so uncompressed formats are common, with varying channels or precision amounts. We also optimize for texture size by choosing explicit compression formats (2bpp/4bpp on mobile, etc)

    And I run into a lot of bugs in Unity with texture formats:

    - HDR formats getting clamped to LDR ranges on mobile
    - HDR formats getting written to disk as Gamma when set to linear, and thus not coming back as the same data they were written in
    - Alpha8 Sprites being removed from the Texture Importing UI in recent versions of Unity because SingleChannel is now in the same enum as Sprite. We use Alpha8 Sprites for most of our UI, which is all based on SDF and MSDF techniques.
    - Not being able to write all formats to disk
    - ARGB or RGBA - RGBA throws errors when you Get/Set Pixels and tells you to use ARGB, but ARGB is no longer an import format selectable from the UI (though it still is from code). I write some stuff that has to work across multiple versions of Unity, and nearly every new release means new #ifdefs to fix changes to the texture importer pipeline.
    - TextureArrays seem to get serialized into YAML when saved as asset files and Force ASCII is on, which creates slow load times. There's also no inspector for Texture Array assets (you can write your own, but I can't release it because it will likely conflict with anyone else's).
    - Documentation is very incomplete about what each format's expected capabilities are (is it scalar, signed, or unsigned? Which platforms does it work on, and with what caveats?)

    We also maintain an extensive set of importer scripts to automate the process for our less technically minded coworkers.

    Yes. Between the bugs, the lack of documentation, and the undocumented differences in how formats are treated on different devices, it can be quite time-consuming to test a new format before you can OK its use, especially if that test requires building asset bundles and deploying to 10+ different devices and platforms, because you can't trust that it will actually work in all the cases it's supposed to.

    No, because the format might be supported, but that doesn't mean it's going to work correctly. If you put a bunch of data into an RGBAHalf texture it will work fine on PC, but get truncated to 0-1 values on devices that claim they support the format. You can't rely on the API or the docs to tell you what is going to work.

    Yes, I want better control of what happens to the texture data throughout the pipeline. Let's say I have a normal map which has been cast from a high-res to a low-res model. Often, the bulk of the deviation between these two models is subtle and exists entirely within the 0.4 to 0.6 range of the texture. Normal baking pipelines output a 16bpc image, but Unity quantizes this down to 8 bits, so now we have 8 bits of information representing values that are mostly within 20% of the range. By the time this hits the compressor, the subtle gradients have been squashed to nothing, and your specular response looks crunchy.

    What I would love to be able to do is modify this data before it goes into the compressor and before it gets quantized. Basically, normalize it to use the full 0-1 range through a method of my choosing and store off some data associated with the texture about that normalization, then send it to the compressor, and in the shader, un-normalize it, allowing me to get much higher quality normals with the same amount of storage and only a MULADD in the shader.
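    A rough editor-side sketch of that renormalization idea, assuming an uncompressed, readable tangent-space normal map and a shader that reconstructs with n.xy = sample.xy * range + min (the MULADD mentioned above):

        using UnityEngine;

        public static class NormalRangeRemap
        {
            // Stretch the X/Y channels of a tangent-space normal map to the full 0-1 range
            // before compression; returns (minX, minY, rangeX, rangeY) so the shader can undo it.
            public static Vector4 RemapToFullRange(Texture2D normalMap) // assumes uncompressed + readable
            {
                Color[] pixels = normalMap.GetPixels();
                float minX = 1f, maxX = 0f, minY = 1f, maxY = 0f;
                foreach (Color p in pixels)
                {
                    minX = Mathf.Min(minX, p.r); maxX = Mathf.Max(maxX, p.r);
                    minY = Mathf.Min(minY, p.g); maxY = Mathf.Max(maxY, p.g);
                }
                float rangeX = Mathf.Max(maxX - minX, 1e-5f);
                float rangeY = Mathf.Max(maxY - minY, 1e-5f);
                for (int i = 0; i < pixels.Length; i++)
                {
                    pixels[i].r = (pixels[i].r - minX) / rangeX;
                    pixels[i].g = (pixels[i].g - minY) / rangeY;
                }
                normalMap.SetPixels(pixels);
                normalMap.Apply();
                // Pass this to the material so the shader can reconstruct the original range.
                return new Vector4(minX, minY, rangeX, rangeY);
            }
        }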

    I would also like Unity to compress textures off the main thread, so it doesn't tie up my machine importing a project for an hour, or 18 (one of our games takes 18+ hours to compress textures into Android formats if the cache server cannot be used).

    I want the "Can I fix it for you?" prompt removed from normal maps. I often pack extra data into things which look like normal maps, and have written my own texture controls to prevent that prompt from showing up in the editor - but if you use the "bump" default texture in your property definition, this check will still be triggered. A "grey" default texture value of linear (0.5, 0.5, 0.5, 1) would also be very useful.

    I would like to be able to generate my own mips more easily. Right now we do it by hand and put the result into a DDS file.

    I would like texture type and format to be better separated in the importer UI. Right now, Sprite, Normal Map, Single Channel, etc all exist in the same enum. I blame normal map for this, since it's both a choice of texture format and how to process the texture. Now that enum is all messed up. Anyway, this means you can't have a sprite which is a single channel because it's either sprite or single channel. Will single channel be R8 or A8 on my device? Who knows. If I sample .a in the shader, will that work with an R8 Texture? Only way to find out is to test a bunch of different devices, or not use those options.

    Oh, and everything for managing texture memory is completely outdated in Unity - it treats platforms as if they are a single device, but an Android device could have 512 MB or 4 GB of RAM.

    On the flip side of this, regular users are in my experience overwhelmed by the complexity of texture options. Just linear/sRGB is too much for most people to understand - so I get that the challenge here is not an easy one.
     
    Last edited: Nov 17, 2017
    NotaNaN, R0man, a436t4ataf and 10 others like this.
  5. jvo3dc

    jvo3dc

    Joined:
    Oct 11, 2013
    Posts:
    1,520
    We use PNGs with the mipmaps side by side and combine them on import. It's a fairly quick operation and allows you to change texture settings that you cannot change with DDS files.
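    For anyone curious, a minimal sketch of that combine step (assumptions: a square RGBA32 texture, a CPU-readable sheet, and the mips laid out left to right at y = 0 with widths size, size/2, size/4, ...; the actual importer hookup is left out):

        using UnityEngine;

        public static class MipSheetCombiner
        {
            // Assumed layout: mip 0 is a size x size square at x = 0, mip 1 at x = size,
            // mip 2 at x = size + size/2, ... all anchored at y = 0 of the readable sheet.
            public static Texture2D Combine(Texture2D sheet)
            {
                int size = sheet.height;
                var result = new Texture2D(size, size, TextureFormat.RGBA32, true);

                int x = 0;
                for (int mip = 0; (size >> mip) >= 1; mip++)
                {
                    int s = size >> mip;
                    result.SetPixels(sheet.GetPixels(x, 0, s, s), mip); // write one hand-authored mip level
                    x += s;
                }
                result.Apply(false, false); // updateMipmaps = false: keep the authored mips
                return result;
            }
        }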

    On all other points, +1. Running into similar things here.
     
    mingwai likes this.
  6. The-Last-Orion

    The-Last-Orion

    Joined:
    May 30, 2013
    Posts:
    2
    You just defined the problem I've been pondering for over a week. By any chance, do you know a way to enforce different texture compression formats for different devices on either iOS or Android? For instance, my game crashes due to low memory on an iPhone 5s (1 GB of memory) but runs OK on 2 GB devices. So I packed everything using PVRTC 4-bit, but then some textures are hard to stomach. What I would like is to keep the PVRTC 4-bit format for low-end devices while switching to better-quality compression for devices with 2+ GB of RAM (say, ETC2 8-bit maybe). I tried using OnPreprocessTexture() to no avail (the Editor folder is said to be stripped from builds, so it doesn't run in mobile builds). Any info would be much appreciated.
     
    mingwai likes this.
  7. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Use asset bundles and deliver separate manifests to each device. We do this in a complex build process, where we build several different versions of the asset bundles for each level of device.
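    For Android, a sketch of one way to build such per-tier bundles (the folder names and the memory threshold are made up for illustration; iOS has no equivalent global subtarget, so there you would switch per-texture overrides before each build instead):

        using System.IO;
        using UnityEditor;

        public static class TieredBundleBuild
        {
            [MenuItem("Build/Asset Bundles (per device tier)")]
            public static void Build()
            {
                // Low-end tier: ETC2 textures.
                Directory.CreateDirectory("AssetBundles/low");
                EditorUserBuildSettings.androidBuildSubtarget = MobileTextureSubtarget.ETC2;
                BuildPipeline.BuildAssetBundles("AssetBundles/low",
                    BuildAssetBundleOptions.None, BuildTarget.Android);

                // High-end tier: higher-quality ASTC textures.
                Directory.CreateDirectory("AssetBundles/high");
                EditorUserBuildSettings.androidBuildSubtarget = MobileTextureSubtarget.ASTC;
                BuildPipeline.BuildAssetBundles("AssetBundles/high",
                    BuildAssetBundleOptions.None, BuildTarget.Android);

                // At runtime the client picks a manifest/folder, e.g.:
                // string tier = SystemInfo.systemMemorySize > 1536 ? "high" : "low";
            }
        }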
     
    mingwai likes this.
  8. The-Last-Orion

    The-Last-Orion

    Joined:
    May 30, 2013
    Posts:
    2
    I was afraid you'd say that :) We use a similarly complex pipeline for asset bundles. Needless to say, they have their own drawbacks. Thank you for the answer though.
     
    mingwai likes this.
  9. DrummerB

    DrummerB

    Joined:
    Dec 19, 2013
    Posts:
    135
    1. What is your method of choosing a format to use?
    We're currently moving a research project to Unity (from a custom OpenGL rendering engine). The project relies a lot on 3D textures and voxel data. The formats we use are a compromise between image quality and memory requirements.

    2. Do you struggle with choosing a correct format?
    Yes. For instance, with a 3D texture of size 512x512x512, a single byte per voxel is already ~134 MB. Therefore, we have to be careful to use the minimum number of bytes per voxel.
    For some of our data, we would like to use a single unsigned integer channel (i.e. GL_R16UI), but this does not seem to be supported in Unity. There is a RGBAUShort render texture format, but no corresponding texture format. And there is no RUShort variant at all.

    3. Does the current API help you to make the decision? Do you rely on the APIs
    We don't need to make these decisions at runtime.

    4. Any other difficulties you have when you work with texture formats?
    Loading (3D) textures at runtime is quite cumbersome currently. LoadRawTextureData is only implemented on Texture2D, but not on Texture3D or RenderTexture. Because of this, loading a 3D texture from disk at runtime is more difficult than it should be. You either have to use native rendering plugins just to load a texture, or try hacking together a solution using compute shaders.

    It would be nice to have access to all available texture formats, even if it means losing some platform or graphics library compatibility. Some (research) projects don't need to be compatible with every platform, but they need to have better access to the platform they are built for.

    It would also be very welcome to have a LoadRawTextureData for Texture3D and RenderTexture as well. Thanks!
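    For completeness, the slow CPU-side workaround that does exist today (a sketch; it assumes a raw little-endian R16 file, converts through Color[], and is nowhere near as efficient as a real LoadRawTextureData on Texture3D would be):

        using System.IO;
        using UnityEngine;

        public static class RawVolumeLoader
        {
            // Load a raw little-endian R16 (ushort) volume into a Texture3D by converting
            // through Color[] - workable, but slow and memory-hungry compared to a raw upload.
            public static Texture3D Load(string path, int width, int height, int depth)
            {
                byte[] bytes = File.ReadAllBytes(path);
                var colors = new Color[width * height * depth];
                for (int i = 0; i < colors.Length; i++)
                {
                    int v = bytes[2 * i] | (bytes[2 * i + 1] << 8);
                    colors[i] = new Color(v / 65535f, 0f, 0f, 1f);
                }
                var tex = new Texture3D(width, height, depth, TextureFormat.RFloat, false);
                tex.SetPixels(colors);
                tex.Apply(false);
                return tex;
            }
        }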
     
    gnurk, MNNoxMortem and mingwai like this.
  10. mingwai

    mingwai

    Small Graphics Potato Unity Technologies

    Joined:
    Jan 16, 2017
    Posts:
    52
    (Just in case you want to have an early idea about what we are doing)

    In the 2018.2 beta, we already have the GraphicsFormat enum, which exposes all the available formats. It also comes with a bunch of useful functions for checking the details of each GraphicsFormat. Take a look at the API here:

    GraphicsFormat Enum
    https://docs.unity3d.com/2018.2/Doc...ce/Experimental.Rendering.GraphicsFormat.html

    GraphicsFormatUtility
    https://github.com/Unity-Technologi...time/Export/GraphicsFormatUtility.bindings.cs

    Of course, these don't solve all the problems we have, so we are still working hard on it. Feel free to raise questions.
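    A small usage sketch (calls from the experimental namespace linked above; the exact set of helpers can differ a little between versions):

        using UnityEngine;
        using UnityEngine.Experimental.Rendering;

        public static class FormatInfoExample
        {
            public static void Log()
            {
                // Map a classic TextureFormat to its GraphicsFormat equivalent (sRGB variant here).
                GraphicsFormat fmt = GraphicsFormatUtility.GetGraphicsFormat(TextureFormat.RGBA32, true);

                Debug.Log(fmt);                                            // e.g. R8G8B8A8_SRGB
                Debug.Log(GraphicsFormatUtility.IsSRGBFormat(fmt));        // true
                Debug.Log(GraphicsFormatUtility.IsCompressedFormat(fmt));  // false
                Debug.Log(GraphicsFormatUtility.GetComponentCount(fmt));   // 4
            }
        }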
     
  11. DrummerB

    DrummerB

    Joined:
    Dec 19, 2013
    Posts:
    135
    This looks very promising! Thanks for letting us know.

    One issue I was dealing with recently related to texture formats was a mismatch between the format of the texture and the format of the data I was trying to upload (see feedback). Currently LoadRawTextureData requires the formats to match, but this isn't required by all graphics APIs, certainly not OpenGL. There, you can specify the format of your data when you call glTexSubImage3D.

    In the end, I wrote a native plugin that I can pass a pointer to my raw data to, which then calls glTexSubImage3D. For this, I used a custom enum of formats, similar to the new GraphicsFormat enum (although not as extensive) that I could use to specify the actual format of the data (e.g. GL_R, GL_UNSIGNED_SHORT).

    On a related note, glTexSubImage3D also takes width, height, depth and offsetX/Y/Z parameters, which you can use to upload only a part of a texture at once, e.g. a single slice of a 3D texture. This is especially useful for bigger 3D textures that you cannot load all at once without blocking the main thread for seconds (a big issue in VR). As far as I know, this can currently only be done with native plugins.
     
  12. mingwai

    mingwai

    Small Graphics Potato Unity Technologies

    Joined:
    Jan 16, 2017
    Posts:
    52
    Does Graphics.ConvertTexture() help?
     
  13. DrummerB

    DrummerB

    Joined:
    Dec 19, 2013
    Posts:
    135
    I don't think it does currently. Maybe with the graphics format refactor it will?

    For example, I have a binary pre-generated 3D texture of size 512 x 446 x 459 with a single unsigned short channel. I couldn't figure out a way to load this data into a texture (efficiently) without native plugins. As far as I can tell, I cannot create a Texture3D or RenderTexture of this format (RUShort). Will this be possible with the new graphics formats?

    Currently I create a 3D RenderTexture of type RFloat and use glTexSubImage3D to load the "RUShort" data into it (10 slices per frame to avoid frame drops).
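    The C# side of that slice-by-slice upload looks roughly like this (SetSliceData and GetRenderEventFunc are hypothetical names for the native plugin's entry points; GL.IssuePluginEvent is the real hook that runs the native callback on the render thread):

        using System;
        using System.Collections;
        using System.Runtime.InteropServices;
        using UnityEngine;

        public class SlicedVolumeUpload : MonoBehaviour
        {
            // Hypothetical native entry points: the plugin records which slice to upload,
            // then performs the glTexSubImage3D call when Unity invokes it on the render thread.
            [DllImport("VolumeUploader")] static extern void SetSliceData(IntPtr texture, int slice, IntPtr data);
            [DllImport("VolumeUploader")] static extern IntPtr GetRenderEventFunc();

            public IEnumerator Upload(RenderTexture target, byte[] rawData, int depth, int slicesPerFrame)
            {
                int bytesPerSlice = rawData.Length / depth;
                GCHandle handle = GCHandle.Alloc(rawData, GCHandleType.Pinned);
                try
                {
                    for (int z = 0; z < depth; z++)
                    {
                        IntPtr slicePtr = IntPtr.Add(handle.AddrOfPinnedObject(), z * bytesPerSlice);
                        SetSliceData(target.GetNativeTexturePtr(), z, slicePtr);
                        GL.IssuePluginEvent(GetRenderEventFunc(), z); // run the upload on the render thread
                        if ((z + 1) % slicesPerFrame == 0)
                            yield return null; // spread the cost over frames to avoid hitches
                    }
                }
                finally { handle.Free(); }
            }
        }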
     
  14. dbarnhart-angle

    dbarnhart-angle

    Joined:
    Mar 15, 2018
    Posts:
    6
    I was just looking at this. It would be useful if GraphicsFormatUtility could return the texel size in bytes, when that makes sense. However, none of this is much good if the Unity APIs cannot create any hardware supported texture formats outside of the current set. For example, R8_UINT would be useful to me, and is required by DX11.
     
  15. dbarnhart-angle

    dbarnhart-angle

    Joined:
    Mar 15, 2018
    Posts:
    6
    Relatedly, the inability to set UAV bind flags _without_ also setting render target bind flags is a performance problem for 3D textures (e.g. LUTs, SDFs). I made a feature request about this.
     
  16. Shushustorm

    Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    1,084
    Hey!
    I'm not sure if it's a bug, but using single-channel textures either
    - doesn't work as intended or isn't useful (more RAM usage than RGB), or
    - doesn't seem intuitive and doesn't seem to have good documentation.
     
  17. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    So I filed a bug a long time ago about Unity incorrectly serializing HDR textures by gamma-correcting them when they are linear, and the bug was resolved as won't-fix because they decided someone might be relying on the bad behavior. But for the life of me I can't figure out how anyone could be relying on a format that doesn't load the same data you just saved, to such an extent that it visibly doesn't look like what you saved at all. The reality is you just can't rely on it at all.
     
    R0man and a436t4ataf like this.
  18. ddangelo

    ddangelo

    Joined:
    Oct 17, 2016
    Posts:
    8

    I'm facing the same issue and your native plugin sounds like a good way to solve this. Do you mind sharing it?
     
  19. Everett_P

    Everett_P

    Joined:
    Apr 20, 2020
    Posts:
    1
    Hello, I figured this is the best place to request this feature, but I'm not certain about the logistics of it. Would it be possible to include an RGBA RenderTexture format where the alpha channel is an 8-bit depth value? Essentially an RGBD format? I understand it would be low precision, but for my needs I'm having to generate both a depth texture and an RGBA texture, I'm not using the alpha of the RGBA, and I don't require the higher precision of a 16- or 24-bit depth value.

    If it's the case that the depth texture must first be generated before it has any meaning, then so be it, but I figured if it was possible I may as well ask.
     
  20. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    3,527
    Not sure if this is the right place

    But please can we have GPU Accelerated texture compression?

    Godot is going to implement this, and I think Unity should also invest in this area. Compressing textures is way too slow.

    https://github.com/darksylinc/betsy
     
  21. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,023
    @Lars-Steenhoff
    Some people are looking into it. Will take some time, though ;)
    And, as usual, no promises...
     
  22. Aras

    Aras

    Unity Technologies

    Joined:
    Nov 7, 2005
    Posts:
    4,770
    We actually just looked into this during hackweek. It's a very mixed bag :(

    Pros:
    • GPU compression might be faster,
    • GPU could compress things while CPU is doing something else,
    Cons:
    • GPU compression is sometimes slower (in terms of throughput), especially for formats that are fast on the CPU anyway (DXT1/DXT5).
    • GPU compression is often slower (in terms of latency), especially if the GPU & PCI-E bus have not been busy recently and need quite a bit of time to ramp back up to regular clocks.
    • The potential for "results are different depending on which GPU & which driver you imported with" is massive. Even for the CPU compressor, we had to patch it to produce identical results between Intel CPUs and AMD Ryzen CPUs. GPU compression has the same issue, only 100x worse.
    • Some of the texture formats that are the slowest to compress right now (e.g. ASTC) don't have a good GPU compressor implementation right now. Similarly for BC7: there are some implementations, but they are not great.
    What we also found out: it's possible to speed up CPU-based texture compression for some formats (e.g. BC7) with a very small quality loss (~1 dB) while compressing 10x faster than today. I suspect the same might be true for ASTC and friends.
     
    ferretnt and Lars-Steenhoff like this.
  23. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    3,527
    Anything that makes compressing faster is welcome, and if you could make it happen in the background without blocking the editor, that would be ideal.

    From what I understand, there may not be fully developed GPU compressor implementations yet for some texture formats. Is that not something for Unity to develop further?
     
  24. jvo3dc

    jvo3dc

    Joined:
    Oct 11, 2013
    Posts:
    1,520
    I think that is the bottom line. Texture compression generally comes down to going through a search space of potential solutions and then selecting the best option, which makes it a fairly slow process. The real question is how much you can limit your search space while still keeping the potentially best solution in it. (Pretty similar to motion vectors in video compression, really, but not the same as compressing JPEG, for example, which doesn't need a search.)

    Anyway, back to the background task idea. I'd propose to do a quick compress first and then do a better compress as a background task after that. So:
    - Start import
    - Do quick compress
    - Release import blocking
    - Start full compress
    - Update texture with the full version

    The advantage is that there will be a compressed texture available after the first blocking import, so the rest of Unity can also continue as normal. (Less of a big rewrite on the engine itself.) But on the other side you can have good texture compression without having to wait for it.

    It's generally not that difficult to make a decent estimate of a compressed version. Making it actually coincide with the best compressed version is very difficult, though, so you generally do need a search step for that. I'd say that doing a quick best-guess version first and then running a search that doesn't block the importing process really has a lot of advantages.
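    The editor-side primitive for the quick pass already exists in EditorUtility.CompressTexture; a toy sketch of the two-pass idea (assumes a readable texture, and uses EditorApplication.delayCall as a stand-in for a real background/deferred job):

        using UnityEditor;
        using UnityEngine;

        public static class TwoPassCompression
        {
            // Compress quickly so the import can finish, then redo the compression at best
            // quality afterwards from an uncompressed copy of the source pixels.
            public static void Compress(Texture2D texture, TextureFormat targetFormat)
            {
                // Keep an uncompressed copy to recompress from (CompressTexture is destructive).
                Texture2D source = Object.Instantiate(texture);

                EditorUtility.CompressTexture(texture, targetFormat, TextureCompressionQuality.Fast);

                // Stand-in for a real background job: runs on a later editor update.
                EditorApplication.delayCall += () =>
                {
                    EditorUtility.CompressTexture(source, targetFormat, TextureCompressionQuality.Best);
                    Graphics.CopyTexture(source, texture); // same size and (now) same format: raw copy
                    Object.DestroyImmediate(source);
                };
            }
        }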
     
  25. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    3,527
    The engine could also work with uncompressed textures, right? No need for a quick pass, I would think?
     
  26. Aras

    Aras

    Unity Technologies

    Joined:
    Nov 7, 2005
    Posts:
    4,770
    Yes, that's what the "Compress Assets on Import" being off does -- it does no texture compression at import time, but rather defers that until build time.

    Anyway, yes the comments wrt "keep compressing in the background" or "compress with fast setting first, then continue compressing in the background" all make sense. Just someone has to do it :)

    But it also feels like by now the discussion is nothing like what it started with, which was texture format C# enum APIs, right?
     
    Lars-Steenhoff likes this.
  27. DrummerB

    DrummerB

    Joined:
    Dec 19, 2013
    Posts:
    135
    Thanks for your efforts in expanding the texture format support!

    Another useful improvement to texture formats would be more information about what formats are supported on which platforms. Lots of the texture formats' documentation currently just says "Note that not all graphics cards support all texture formats, use SystemInfo.SupportsTextureFormat to check."

    This is useful to decide at runtime what format to use, but sometimes it would be beneficial to select a lowest common denominator format, which would require knowing which platforms support which formats. Since you have to run unit tests on all supported platforms anyway, it would be nice if this information could be gathered and documented somewhere.
     
    Lars-Steenhoff likes this.
  28. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,023
    This can, unfortunately, be driver-, API- and GPU-dependent, for example on Android. There are certain formats that are guaranteed to be supported, and I guess it could be useful to list at least those.
     
  29. Aras

    Aras

    Unity Technologies

    Joined:
    Nov 7, 2005
    Posts:
    4,770
    Yeah, I've drafted an "overall overview" manual page that gives a 95% correct answer for which texture formats are available where (the remaining 5% is "complicated", as Aleksandr mentions), but that got lost in some docs publishing process, apparently. I'll poke the people involved to actually ship it as part of the actual docs :)
     
  30. Aras

    Aras

    Unity Technologies

    Joined:
    Nov 7, 2005
    Posts:
    4,770
    A very short summary is:
    • PC, PS4, Xbox One: DXT1, DXT5, BC4, BC5 (all hardware), BC6H+BC7 (on GPUs made in the last 10 years).
    • WebGL: DXT1, DXT5.
    • iOS/tvOS: ASTC is preferred on modern Apple devices (since the A8 chip, year 2014). If you need older devices or Crunch compression, use ETC1/ETC2 (since the A7 chip, year 2013). Use PVRTC if you need even older devices.
    • Android: ASTC on most modern devices (Qualcomm GPUs since Adreno 4xx, year 2015; ARM GPUs since Mali T642, year 2012; NVIDIA GPUs since Tegra K1, year 2014; PowerVR GPUs since GX6250, year 2014). If you need older devices or Crunch compression, use ETC1/ETC2; this is supported on all GPUs that can do OpenGL ES 3.0. If you need even older devices (e.g. ARM Mali 4xx), then only ETC1 is available.
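    As a rough runtime translation of that list (a sketch that probes a few representative RGBA formats in roughly the order above):

        using UnityEngine;

        public static class CompressedFormatPicker
        {
            // Pick a 4-channel compressed format, roughly in the order recommended above.
            public static TextureFormat Pick()
            {
                if (SystemInfo.SupportsTextureFormat(TextureFormat.ASTC_6x6))    return TextureFormat.ASTC_6x6; // ASTC_RGBA_6x6 on older Unity versions
                if (SystemInfo.SupportsTextureFormat(TextureFormat.ETC2_RGBA8))  return TextureFormat.ETC2_RGBA8;
                if (SystemInfo.SupportsTextureFormat(TextureFormat.PVRTC_RGBA4)) return TextureFormat.PVRTC_RGBA4;
                if (SystemInfo.SupportsTextureFormat(TextureFormat.DXT5))        return TextureFormat.DXT5;
                return TextureFormat.RGBA32; // last-resort uncompressed fallback
            }
        }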
     
  31. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,791
    Couple of things.

    1. Why would anyone use ETC1 when it looks worse than ETC1? This suggestion seems bizarre to me. (IMO ETC1/2 has no place on iOS, it's ASTC for everything and PVRTC if you care about old devices).

    2. Can we get a way to switch the "base" compression format for iOS (and really for all platforms) without having to write a script that manually adds overrides to every single meta file? (That creates a ton of version control noise, and also triggers re-compressions for everyone on the project, even if they have other platforms selected.)
     
  32. Aras

    Aras

    Unity Technologies

    Joined:
    Nov 7, 2005
    Posts:
    4,770
    Did you mean something else besides comparing ETC1 vs ETC1 :)

    Not sure I understood this request. Can you elaborate?
     
    richardkettlewell likes this.
  33. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,791
    Yeah, oops. I meant ETC1 vs PVRTC.

    Almost always, PVRTC looks better to my eyes, so I'm confused by the suggestion of using the less compatible and worse-looking ETC1 over PVRTC.

    iOS doesn’t have the equivalent of changing the default compression globally (like Android, where in the build settings you can override the compression used without having to change all the meta files in your project).

    iOS now defaults to PVRTC unless it's overridden per texture. I would like to be able to choose what it defaults to, so I don't have to edit thousands of meta files to switch between compression formats.
     
  34. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Also, it would be nice if there were an equivalent of SupportsTextureFormat that used your target platform instead of the editor environment. For instance, if I'm packing texture arrays and caching them on disk, I might need to pack them as ASTC, but I will have the following issues:

    - SupportsTextureFormat will return NO, because OSX does not support that format. Ideally, I want the build settings to have some kind of switch for what the default texture format for that platform should be, and to be able to query that (i.e. newer iOS only == ASTC).
    - When packing the arrays, Unity will throw a constant stream of warnings which say the texture format isn't supported. But again, I'm packing for my target device, not the editor.
    - Packing to ASTC will crash Unity on both my Windows and OSX boxes, even though I can successfully pack other unsupported formats on them. Though some users do seem to be able to pack these formats without crashing.
     
    R0man, AcidArrow and Lars-Steenhoff like this.
  35. bitinn

    bitinn

    Joined:
    Aug 20, 2016
    Posts:
    961
    Quick question for the Unity folks: how do you convert a RenderTextureFormat into a GraphicsFormat?

    It's super annoying that RenderTextureDescriptor will happily take a RenderTextureFormat as a constructor parameter, but there is no API to set the RenderTextureFormat later; only graphicsFormat is exposed.

    I get that Unity is trying to get us to use RenderTextureDescriptor in GetTemporaryRT calls because it already supports more features. My problem is that setting the GraphicsFormat manually means you are on your own when it comes to platform differences.

    And SystemInfo.GetGraphicsFormat isn't helpful, because it takes a DefaultFormat, yet another enum that isn't RenderTextureFormat...

    Why does it have to be this hard?
     
  36. DrummerB

    DrummerB

    Joined:
    Dec 19, 2013
    Posts:
    135
    Use GraphicsFormatUtility.GetGraphicsFormat
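    i.e. something along these lines (a sketch; pass whatever sRGB flag your render texture actually needs):

        using UnityEngine;
        using UnityEngine.Experimental.Rendering;

        public static class DescriptorHelper
        {
            public static RenderTextureDescriptor WithFormat(RenderTextureDescriptor desc, RenderTextureFormat rtFormat)
            {
                // Convert the legacy enum to a GraphicsFormat; 'false' here means a linear (non-sRGB) format.
                desc.graphicsFormat = GraphicsFormatUtility.GetGraphicsFormat(rtFormat, false);
                return desc;
            }
        }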
     
  37. bitinn

    bitinn

    Joined:
    Aug 20, 2016
    Posts:
    961
    Maybe I am crazy, but 2019.4 LTS only has SystemInfo.GetGraphicsFormat in the docs? Is GraphicsFormatUtility deprecated or internal?
     
  38. DrummerB

    DrummerB

    Joined:
    Dec 19, 2013
    Posts:
    135
    It may be experimental or just not documented, I'm not sure. I'm pretty sure it should work in 2019.4.
     
  39. bitinn

    bitinn

    Joined:
    Aug 20, 2016
    Posts:
    961
    Yeah, you are right. I guess that's what we have to face when making a custom SRP :)

    (Kinda feel bad that I am once again relying on the Experimental namespace... I might stick to the old GetTemporaryRT syntax whenever possible.)

    https://github.com/Unity-Technologi...rt/Graphics/GraphicsFormatUtility.bindings.cs
     
  40. bitinn

    bitinn

    Joined:
    Aug 20, 2016
    Posts:
    961
    DrummerB likes this.
  41. ferretnt

    ferretnt

    Joined:
    Apr 10, 2012
    Posts:
    412
    With apologies for bumping an old thread (because I'm dealing with exactly this), AcidArrow recently said:

    > Can we get a way to switch between a "base" compression format for iOS

    I think the request here (really regarding iOS) is that modern devices basically all support ASTC, so if you're building a modern (64-bit only) iOS game, it would be great to set a flag in Player Settings, or otherwise, so that using the "Default" options and specifying compression type "Normal" results in (say) ASTC 6x6 textures rather than PVRTC 4-bit. (I haven't checked the device matrix lately to know if Unity 2020 still supports non-ASTC-capable iOS devices, but I do know we don't!)

    This would give way higher quality textures for modern iOS games by default, without having to manually write a script to run through all textures and set their compression overrides to ASTC, and it would allow re-use of all the work Unity does to handle edge cases of "what if this input texture channel set has this quality setting", which is surprisingly complex to write for a large project.

    So it would be great if the generic "Low Quality", "Normal Quality" and "High Quality" options were actually mapped to some block sizes of ASTC by default. In other words, I guess I'm asking for a way to just say "forget about legacy iOS devices and PVRTC, and pick some sane graphics formats for modern iOS devices automagically without setting iOS-specific texture overrides."

    You can call this flag "IH8PVRTC".
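    Until then, the script-based workaround mentioned above looks roughly like this (a sketch; overriding every texture and picking 6x6 blocks are assumptions, "iPhone" is the platform string Unity uses for iOS, and the enum member is ASTC_RGBA_6x6 on older Unity versions):

        using UnityEditor;

        public class ForceAstcOnIOS : AssetPostprocessor
        {
            // Override the iOS platform settings of every imported texture to ASTC 6x6.
            void OnPreprocessTexture()
            {
                var importer = (TextureImporter)assetImporter;
                var ios = importer.GetPlatformTextureSettings("iPhone");
                ios.overridden = true;
                ios.format = TextureImporterFormat.ASTC_6x6;
                importer.SetPlatformTextureSettings(ios);
            }
        }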
     
  42. florianpenzkofer

    florianpenzkofer

    Unity Technologies

    Joined:
    Sep 2, 2014
    Posts:
    479
    According to our docs team the missing documentation is caused by a bug in our documentation tooling. We are fixing that.
     
    DrummerB likes this.
  43. florianpenzkofer

    florianpenzkofer

    Unity Technologies

    Joined:
    Sep 2, 2014
    Posts:
    479
    @ferretnt We have plans to change the default compression format for both Android and iOS to ASTC. It's not decided yet when that will happen.
     
    ferretnt likes this.
  44. kadd11

    kadd11

    Joined:
    Mar 11, 2018
    Posts:
    33
    Sorry to necro this thread, but something related to texture formats that I have been fighting with for a while is that the editor defaults to using the default format of the current build target, rather than a format supported by the machine the editor is running on (e.g., if my build is set to target Android but I am running Unity on a Windows machine, it will default to ETC2), without an easy way to control this globally.

    Maybe this is desirable in some situations, or maybe many people just don't notice, because I assume Unity silently swaps out the texture for a supported format if it is, say, provided as a material property. But for more advanced uses, it leads to issues such as spamming warnings like
    'ETC2_RGB' is not supported on this platform. Decompressing texture.
    or things just not behaving as they would in either an actual Android build or a Windows standalone build, because it's trying to use Android texture formats on a Windows machine.

    Is there any sane way to deal with this situation without decompressing the textures every time? This applies both to textures included as assets in the project and to textures downloaded via a web request or pulled from the file system and created with Texture2D.LoadImage.
     
  45. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Not that I'm aware of - this has been a problem in MicroSplat for several years now, as the only way to save a texture array is via a scriptable object, and it will spam these errors in the editor when you're set to a non-desktop platform. I submitted bugs on it a long, long time ago.
     
  46. florianpenzkofer

    florianpenzkofer

    Unity Technologies

    Joined:
    Sep 2, 2014
    Posts:
    479
    I don’t know why we have this warning in the Editor. We should probably remove it.
    @kadd11 can you please explain why you would want BCn in the Editor when Android is the build target?
     
  47. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    It's not about having BCn in the editor when Android is the build target; it's that the PC you're running the editor on does not support the ASTC texture format you've saved into the file, which is not a texture in the Library but an asset in the project. This happens because you're not using the normal Library workflow, where the editor knows that it's grabbing something from the Library and needs to uncompress it into a supported format.

    In my case, I was using texture arrays before Unity had any support for them, so I create them and save them into a scriptable object. When an import happens, I check the build target's format and recompress them into the new format if it's changed. So when you are set to desktop you get DXT, and when you are set to Android you get ASTC, for instance. But that means that you have (when set to Android):

    Assets/Foo.asset <- Texture Array in ASTC format

    And when you use that asset, the runtime goes "Hey, we're on desktop, we don't support ASTC" and uncompresses the texture, but throws a warning about it because it's thinking "Hey, the Library should have this in DXT for us, not ASTC" (but you're not loading from the Library).

    Given the limited nature of Unity's texture import pipeline (no 3D textures, texture arrays are done via a large sheet texture instead of a list of textures, hundreds of formats not supported, linear 16-bit EXR textures get completely destroyed by gamma correction when saving and loading if rendering is set to gamma, etc., etc.), this is a problem that will only grow. As well, there are many cases where you want to load textures from external sources already compressed into the native format, all of which will throw this warning in the editor.
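    A condensed sketch of that per-build-target packing (editor-side; it assumes readable, uniformly sized sources with mips, uses DXT5 vs ASTC 6x6 as the two representative targets, and glosses over the ASTC-on-desktop crashes mentioned earlier):

        using UnityEditor;
        using UnityEngine;

        public static class ArrayPacker
        {
            // Compress each source texture for the active build target, then copy the
            // compressed data into a Texture2DArray slice by slice.
            public static Texture2DArray Pack(Texture2D[] sources)
            {
                BuildTarget target = EditorUserBuildSettings.activeBuildTarget;
                bool mobile = target == BuildTarget.Android || target == BuildTarget.iOS;
                TextureFormat format = mobile ? TextureFormat.ASTC_6x6 : TextureFormat.DXT5; // ASTC_RGBA_6x6 on older versions

                int w = sources[0].width, h = sources[0].height;
                var array = new Texture2DArray(w, h, sources.Length, format, true);

                for (int i = 0; i < sources.Length; i++)
                {
                    Texture2D copy = Object.Instantiate(sources[i]);       // leave the source asset untouched
                    EditorUtility.CompressTexture(copy, format, TextureCompressionQuality.Best);
                    for (int mip = 0; mip < copy.mipmapCount; mip++)
                        Graphics.CopyTexture(copy, 0, mip, array, i, mip); // raw copy; formats must match
                    Object.DestroyImmediate(copy);
                }
                return array;
            }
        }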
     
  48. florianpenzkofer

    florianpenzkofer

    Unity Technologies

    Joined:
    Sep 2, 2014
    Posts:
    479
    I agree that the warning is not helpful in the Editor, but what I got from @kadd11's post is that there are other issues caused by the runtime decompression.
    The only thing that I can think of is when you reinterpret data using CopyTexture.
     
  49. kadd11

    kadd11

    Joined:
    Mar 11, 2018
    Posts:
    33
    Ah, sorry, when I said "things just not behaving as they would..." it might have been poor phrasing. I did not necessarily mean "things don't work"; it's just that what happens under the hood does not match what would happen in a build on either platform, e.g. the work/cost of the decompression, or the shader sampling an uncompressed texture, which it might never actually do in a build. Even if things "work", the disconnect between the editor and a build isn't great.

    And to comment on your question as to why you would want a desktop format even if the build target is set to Android (on top of what jbooth said, which I agree with): I'm not sure why you would ever want a format that isn't supported on the platform the editor is running on. At that point, what graphics-related things (i.e., anything other than reading the raw bytes) can I even do with that texture that doesn't cause Unity to decompress it?
     
  50. florianpenzkofer

    florianpenzkofer

    Unity Technologies

    Joined:
    Sep 2, 2014
    Posts:
    479
    The main reason is that by using the same compression settings in the Editor, you get to see how the compressed texture looks. You can see the compression artifacts pretty much exactly as they appear on the target platform.