According to the Unity documentation, a shader property can use the "Integer" data type to represent a real integer. But when I use "Integer" in my shader, I get a syntax error. "Int" works, but that type is backed by a float under the hood, and I need a true integer. The shader code is as follows:

Shader "Custom/NewUnlitShader"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
        _ExampleName ("Integer display name", Integer) = 1
    }
    SubShader
    {
        ........
    }
}
What version of Unity are you using? This feature was added in 2021.1. If you're on a 2020 or 2019 version of Unity it won't work, because the option doesn't exist yet.
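For reference, here is a minimal sketch of how the "Integer" property type is used on 2021.1 and later. The property name comes from the original post; the body of the pass is elided, and the HLSL uniform declaration shown is an assumption about how the poster would consume the value (the uniform name must match the property name):

```shaderlab
Shader "Custom/NewUnlitShader"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
        // "Integer" (2021.1+) is a true integer property,
        // unlike the legacy "Int" type, which is backed by a float.
        _ExampleName ("Integer display name", Integer) = 1
    }
    SubShader
    {
        Pass
        {
            HLSLPROGRAM
            // ...vertex/fragment declarations elided...

            // Declare the matching uniform as a real int in HLSL;
            // the name must match the property name above.
            int _ExampleName;

            ENDHLSL
        }
    }
}
```

From script, such a property would be set with Material.SetInteger rather than Material.SetFloat.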
I'm also having this issue. I'm using Unity 2021.3.19f1. The use of "Integer" is wholly unrecognized, despite the documentation saying otherwise: https://docs.unity3d.com/2021.3/Documentation/Manual/SL-Properties.html This is what happens when I attempt to use "Integer":
@swishchee unrecognized by what exactly? From the screenshot it looks like it's not recognized by the syntax highlighting in your editor.
Ah ha, you're right! I initially chose to ignore the error in the IDE (Rider), but then I got some unrelated type definition errors that I thought might be attributable to this unrecognized type. It turned out, however, that the type error was caused by redefinitions across different passes. I'm now getting another error that I'll make a new post about, because I'm unsure how to address it and it's definitely not related to this. In the end, this code works without any Unity errors, except for the one that's fundamentally related to the Metal API.