How is light probe data input into shaders

Discussion in 'General Graphics' started by misharg, Jun 8, 2017.

  misharg

    Joined:
    May 30, 2017
    Posts:
    5
    Hello, I'm trying to use light probes with a completely custom render loop. Currently I'm trying to reproduce how Unity converts light probe data into its shader variables.

    Unity appears to use one SphericalHarmonicsL2 per object, generated by LightProbes.GetInterpolatedProbe, which is then converted into the SH parameters used by shaders (SHAr, SHAg, SHAb, SHBr, SHBg, SHBb, SHC). However, the individual SH coefficients from SphericalHarmonicsL2 don't seem to map directly onto those values; for instance, SphericalHarmonicsL2[0, 0] (red channel, coefficient 0) appears to be different from SHAr.x. How does Unity map the individual coefficients from SphericalHarmonicsL2 onto the shader parameters?

    Alternatively, pseudocode for SphericalHarmonicsL2.Evaluate would also show how Unity generates colors from normals, and I could subtract the parts the shaders already do to recover the same mapping.

    Edit:
    Figured out the values. In case anyone else needs this, the answer is:

    for each color channel c (red c = 0, green c = 1, blue c = 2):
    SHAc = float4(probe[c, 3], probe[c, 1], probe[c, 2], probe[c, 0] - probe[c, 6])
    SHBc = float4(probe[c, 4], probe[c, 5], 3 * probe[c, 6], probe[c, 7])
    and SHC = float3(probe[0, 8], probe[1, 8], probe[2, 8])
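    The repacking above can be sketched numerically. The following is a minimal Python sketch (Python used so it runs outside the engine), treating the probe as a plain 3×9 array indexed as probe[channel][coefficient]; the function name pack_sh and the list representation are illustrative, not Unity API.

    ```python
    def pack_sh(probe):
        """Pack 3x9 SH coefficients into SHA/SHB/SHC-style shader vectors.

        probe[c][i] is coefficient i of channel c (0 = red, 1 = green, 2 = blue).
        Returns (sha, shb, shc): sha and shb hold one float4 per channel,
        shc is a single float3.
        """
        sha = []  # SHAr, SHAg, SHAb
        shb = []  # SHBr, SHBg, SHBb
        for c in range(3):
            # SHAc = float4(probe[c,3], probe[c,1], probe[c,2], probe[c,0] - probe[c,6])
            sha.append((probe[c][3], probe[c][1], probe[c][2],
                        probe[c][0] - probe[c][6]))
            # SHBc = float4(probe[c,4], probe[c,5], 3 * probe[c,6], probe[c,7])
            shb.append((probe[c][4], probe[c][5],
                        3.0 * probe[c][6], probe[c][7]))
        # SHC = float3(probe[0,8], probe[1,8], probe[2,8])
        shc = (probe[0][8], probe[1][8], probe[2][8])
        return sha, shb, shc
    ```

    In a real render loop the nine coefficients per channel would come from LightProbes.GetInterpolatedProbe and the packed vectors would be uploaded as shader constants.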

    then, as in UnityCG.cginc
    L0.c = SHAc.w;
    L1.c = dot(SHAc.xyz, normal);
    L2.c = dot(SHBc, normal.xyzz * normal.yzzx) + SHC.c * (normal.x * normal.x - normal.y * normal.y);
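    Putting the evaluation together, here is a minimal Python sketch of the per-channel math above, mirroring the structure of UnityCG.cginc's ShadeSH9 (linear L0/L1 term plus quadratic L2 term). It takes packed vectors in the layout described earlier; the function name evaluate_sh is illustrative, not Unity API, and no gamma/linear conversion is applied.

    ```python
    def evaluate_sh(sha, shb, shc, normal):
        """Evaluate packed SH lighting for a unit normal (nx, ny, nz).

        sha, shb: one float4 per channel; shc: one float3.
        Returns [r, g, b].
        """
        nx, ny, nz = normal
        color = []
        for c in range(3):
            # L0 + L1: dot(SHAc, float4(normal, 1))
            linear = (sha[c][0] * nx + sha[c][1] * ny +
                      sha[c][2] * nz + sha[c][3])
            # L2: dot(SHBc, normal.xyzz * normal.yzzx)
            #     + SHC.c * (normal.x^2 - normal.y^2)
            vb = (nx * ny, ny * nz, nz * nz, nz * nx)
            quad = (shb[c][0] * vb[0] + shb[c][1] * vb[1] +
                    shb[c][2] * vb[2] + shb[c][3] * vb[3] +
                    shc[c] * (nx * nx - ny * ny))
            color.append(linear + quad)
        return color
    ```

    A quick sanity check: if only coefficient 0 of each channel is non-zero (a constant-ambient probe), the packed SHAc.w carries the whole value and the result is the same for every normal.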
     
    Last edited: Jun 8, 2017