
Assign 16-bit grayscale pixels to Texture2D

Discussion in 'General Graphics' started by SimonJ9, Apr 17, 2019.

  1. SimonJ9

    SimonJ9

    Joined:
    Feb 5, 2018
    Posts:
    17
    Hi guys, I'm trying to render a DICOM image, which is 16-bit grayscale, with a Texture2D. I have already loaded the pixel bytes. This is what I get (with the Unlit/Texture shader):

    [attached image: upload_2019-4-17_0-0-21.png]

    I divided each pixel by 0xFF to make it 8-bit and put it into the RGB channels of a Color32.
    However, when I convert the image with Photoshop and import it into Unity, this is what it renders:

    [attached image: upload_2019-4-17_0-1-48.png]

    I'm trying to figure out where I went wrong. I couldn't find any texture format that supports 16-bit grayscale, so I'm trying to pack the 16-bit value into the four 8-bit channels of a 32-bit ARGB color. Any suggestions on this problem?
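    A simplified sketch of this 16-to-8-bit packing (the byte array, width and height are assumed to come from the DICOM dataset; a right shift by 8, i.e. divide by 256, is used here instead of a literal divide by 0xFF so the brightest samples still fit in a byte):

    Code (CSharp):
    using UnityEngine;

    // Simplified sketch: pack little-endian 16-bit greyscale samples into 8-bit RGB.
    public static class Grey16To8
    {
        public static Texture2D Convert(byte[] rawPixels, int width, int height)
        {
            var colors = new Color32[width * height];
            for (int i = 0; i < colors.Length; i++)
            {
                // assemble the 16-bit sample from two bytes (little-endian)
                ushort v16 = (ushort)(rawPixels[2 * i] | (rawPixels[2 * i + 1] << 8));
                byte v8 = (byte)(v16 >> 8); // keep the top 8 bits (divide by 256)
                colors[i] = new Color32(v8, v8, v8, 255);
            }

            var tex = new Texture2D(width, height, TextureFormat.RGBA32, false);
            tex.SetPixels32(colors);
            tex.Apply();
            return tex;
        }
    }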
     
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,348
    Unity can take 16-bit greyscale images and keep them in that format, at least since 2018.

    Save the 16-bit greyscale image as a PNG, PSD, or TIF in the Assets folder. Select it, change the Texture Type to Single Channel, and change the Format to R 16 bit.
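    If you want those same settings applied from code, something along these lines should work as an editor script placed in an Editor folder (not tested; the folder filter is just an example, and it assumes the default platform's Format override is honored for Single Channel textures):

    Code (CSharp):
    using UnityEditor;

    // Sketch: set Single Channel / R 16 bit on import for textures in an example folder.
    class Greyscale16Postprocessor : AssetPostprocessor
    {
        void OnPreprocessTexture()
        {
            if (!assetPath.Contains("/DicomTextures/")) return; // example folder name

            var importer = (TextureImporter)assetImporter;
            importer.textureType = TextureImporterType.SingleChannel;

            var settings = importer.GetDefaultPlatformTextureSettings();
            settings.format = TextureImporterFormat.R16;
            importer.SetPlatformTextureSettings(settings);
        }
    }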
     
  3. SimonJ9

    SimonJ9

    Joined:
    Feb 5, 2018
    Posts:
    17
    Hi, thank you for your reply. I found the import settings for the generated texture, but I'm trying to construct the Texture2D from the byte array I extracted from the dataset. Is there anything similar I can do from a script? I found R16 in TextureFormat, but that seems to only work for the red channel.
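    Here is roughly what I'm trying with R16 (the byte array is assumed to be the raw little-endian 16-bit pixel data from the dataset):

    Code (CSharp):
    using UnityEngine;

    // Sketch: build the texture directly from the raw 16-bit samples.
    public static class Dicom16Loader
    {
        public static Texture2D Load(byte[] rawPixels, int width, int height)
        {
            var tex = new Texture2D(width, height, TextureFormat.R16, false);
            tex.LoadRawTextureData(rawPixels); // expects width * height * 2 bytes
            tex.Apply();
            return tex; // the data ends up in the red channel only
        }
    }

    With Unlit/Texture this comes out as shades of red, since only the red channel is populated; I'm guessing a shader that spreads the red channel across RGB (e.g. sampling the texture and using .rrr) would show it as greyscale?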
     
  4. SimonJ9

    SimonJ9

    Joined:
    Feb 5, 2018
    Posts:
    17
  5. DianaRito0304

    DianaRito0304

    Joined:
    May 16, 2022
    Posts:
    4
    @SimonJ9, the site you posted is no longer available. Have you had any updates on this issue? I'm having the same problem... How did you solve it?
    Thanks a lot!