Is there a way to get doubles to work with sine and cosine in GPU compute shaders? I suppose I could roll my own functions to support doubles, but I wonder whether the GPU uses anything faster than a Taylor series or CORDIC to evaluate sine/cosine.

For background: I worked out a formula based on the discrete Fourier transform that produces a frequency-domain output of any size for a given input array of time-domain samples. That lets me sample frequencies at piano pitches rather than the evenly spaced bins the DFT/FFT normally produces (the FFT also requires power-of-two sizes), for complex-number analysis of music. So I'm basically rebuilding Unity's FFT on the GPU, and it works, but I want to test it with doubles. The problem is that HLSL's sin and cos intrinsics only take and return floats.
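In case it helps anyone with the same problem: since HLSL's double support covers add, multiply, and int casts but no transcendental intrinsics, one workaround is the classic reduce-then-polynomial approach. Below is a sketch in C (chosen so it can actually be compiled and checked; the names `dsin`, `sin_poly`, `cos_poly` and the exact coefficients are my own, not from any HLSL library). It reduces the argument to a quadrant of pi/2 with a split-constant (Cody-Waite style) subtraction, then evaluates a Taylor polynomial on |r| <= pi/4. Only +, *, and an int cast are used, so the same code should port line-for-line to an HLSL double function:

```c
/* pi/2 split into high and low parts so the reduction x - n*(pi/2)
   keeps extra precision (Cody-Waite style range reduction). */
static const double PIO2_HI     = 1.57079632679489655800e+00;
static const double PIO2_LO     = 6.12323399573676603587e-17;
static const double TWO_OVER_PI = 6.36619772367581382433e-01;

/* Taylor polynomial for sin on |r| <= pi/4 (Horner form). */
static double sin_poly(double r)
{
    double r2 = r * r;
    double p = -7.6471637318198164759e-13;  /* -1/15! */
    p = p * r2 + 1.6059043836821614599e-10; /*  1/13! */
    p = p * r2 - 2.5052108385441718775e-08; /* -1/11! */
    p = p * r2 + 2.7557319223985890653e-06; /*  1/9!  */
    p = p * r2 - 1.9841269841269841270e-04; /* -1/7!  */
    p = p * r2 + 8.3333333333333332177e-03; /*  1/5!  */
    p = p * r2 - 1.6666666666666665741e-01; /* -1/3!  */
    return r + r * r2 * p;
}

/* Taylor polynomial for cos on |r| <= pi/4 (Horner form). */
static double cos_poly(double r)
{
    double r2 = r * r;
    double p = 4.7794773323873852974e-14;   /*  1/16! */
    p = p * r2 - 1.1470745597729724507e-11; /* -1/14! */
    p = p * r2 + 2.0876756987868098979e-09; /* -1/12! ... alternating */
    p = p * r2 - 2.7557319223985888276e-07; /* -1/10! */
    p = p * r2 + 2.4801587301587301566e-05; /*  1/8!  */
    p = p * r2 - 1.3888888888888889419e-03; /* -1/6!  */
    p = p * r2 + 4.1666666666666664354e-02; /*  1/4!  */
    return 1.0 + r2 * (-0.5 + r2 * p);
}

/* Double-precision sin(x) for moderate |x| (fine for DFT twiddle
   angles). Uses only +, *, and an int cast, so it maps directly
   onto HLSL double operations. */
double dsin(double x)
{
    double q = x * TWO_OVER_PI;
    int n = (int)(q + (q < 0.0 ? -0.5 : 0.5)); /* nearest quadrant */
    double r = (x - n * PIO2_HI) - n * PIO2_LO;
    switch (n & 3) {          /* quadrant selection; works for n < 0 too */
    case 0:  return  sin_poly(r);
    case 1:  return  cos_poly(r);
    case 2:  return -sin_poly(r);
    default: return -cos_poly(r);
    }
}
```

A matching `dcos(x)` is just `dsin(x + PIO2_HI)` or the same switch shifted by one quadrant. Note this naive reduction loses accuracy for very large |x|; for DFT angles of the form 2*pi*k*n/N it is usually more accurate to reduce k*n mod N in integer arithmetic before multiplying by 2*pi/N at all.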