I'm trying to implement tile-based lighting in my 2D game (similar to RimWorld). Neither the 2D URP lighting system nor any third-party tool really works this way, so I'm building it myself and have run into an issue. This is my fragment-shader code that applies the lighting data (the lightmap texture is generated dynamically at runtime) to a custom tilemap in Unity:

```hlsl
// Pull in the lighting data for this fragment.
float4 lightmapColor = SAMPLE_TEXTURE2D(_LightmapTex, sampler_LightmapTex,
                                        float2(localUVX, localUVY));

// Pure white in the lightmap indicates "no lightmap value here".
int hasLightmapColor = (lightmapColor.r == 1 && lightmapColor.g == 1 &&
                        lightmapColor.b == 1 && lightmapColor.a == 1) ? 0 : 1;

// If there is no lightmap value, fall back to the default light color,
// which is driven by the time of day and updated each frame.
lightmapColor = hasLightmapColor == 1 ? lightmapColor : _DefaultLightColor;

// Sample the unlit fragment color from the main texture.
float4 mainTex = i.color * SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, i.uv);

// Finally, apply the lighting data, using alpha as the intensity.
mainTex.rgb = mainTex.rgb * (lightmapColor.rgb * lightmapColor.a);
```

I want to use bilinear instead of point filter mode because I don't want the lighting to look pixelated, but when I do, the outside edge of the lighting gets a white ring, as in the attached screenshot. That makes sense: the filter is blending from the edge of the lighting data into the white that marks "no lighting data", so the exact-equality sentinel check fails on the blended texels (and no matter what sentinel color I use, I get a fringe at the edge). Is there any way to use bilinear filtering on the light data without this artifact, or a different way to get smooth, unpixelated 2D lighting?
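For reference, here is a sketch of one direction I've been considering, assuming the lightmap generator could write a coverage value (1 = real lighting data, 0 = none) into a separate single-channel texture — `_LightmapMask` and its sampler are hypothetical names, not part of the code above. The idea is to replace the sentinel-color comparison with a `lerp`, so bilinear filtering blends the mask smoothly from 1 to 0 at the edge and the fragment fades toward the time-of-day color instead of toward white:

```hlsl
// Sketch only: _LightmapMask is a hypothetical single-channel coverage
// texture written alongside the lightmap (1 = lit texel, 0 = no data).
float4 lightmapColor = SAMPLE_TEXTURE2D(_LightmapTex, sampler_LightmapTex,
                                        float2(localUVX, localUVY));
float  coverage      = SAMPLE_TEXTURE2D(_LightmapMask, sampler_LightmapMask,
                                        float2(localUVX, localUVY)).r;

// Bilinear filtering interpolates the mask as well, so the edge of the
// lighting data blends toward _DefaultLightColor rather than toward a
// white sentinel, avoiding the ring.
float4 light = lerp(_DefaultLightColor, lightmapColor, coverage);

float4 mainTex = i.color * SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, i.uv);
mainTex.rgb = mainTex.rgb * (light.rgb * light.a);
```

I haven't verified this end to end — in particular I'm unsure whether a separate mask texture is better than just pre-filling the "empty" texels with the current default light color on the CPU each frame, which would let plain bilinear filtering do the blending with no extra sample.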