Not gonna lie, I don't understand a single thing about compute shaders, so the main question is "can this be done at all?", and only then "how do I actually do it?".

I have a GameObject with a compute shader attached to it via a C# script. This object is not a camera; it could be absolutely anything, but let's say it's a standard sphere. I need to capture a low-res 360-degree image of the environment, including non-static objects, from that GameObject, in real time of course. Realtime reflection probes do exactly what I need my sphere to do, but for other reasons they don't suit my needs.

I was also thinking about casting a small number of rays in all directions and reading the color of whatever they hit, but I couldn't find any way to get color data back from a raycast hit (and raycasting itself is done in a weird way anyway).

So, the point is: can this be done, and if yes, how? Are there at least some examples of how to project the world to a texture without attaching a script to the camera? Thanks in advance.
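To clarify what I'm imagining: maybe a hidden camera parented to the sphere, rendering a low-res cubemap every frame that I could then feed to the compute shader? Here's an untested sketch of that idea (the class name, the 64-pixel face size, and the per-frame `LateUpdate` capture are just my assumptions, not working code):

```csharp
using UnityEngine;

// Untested sketch: a disabled camera attached to this object renders
// the surrounding environment into a low-res cubemap each frame.
public class EnvironmentCapture : MonoBehaviour
{
    public Cubemap cubemap;   // 360-degree snapshot, readable from shaders
    Camera captureCam;

    void Start()
    {
        // Low-res: 64-pixel faces, no mipmaps (size is arbitrary here).
        cubemap = new Cubemap(64, TextureFormat.RGBA32, false);

        var go = new GameObject("CaptureCam");
        go.transform.SetParent(transform, false); // follow the sphere
        captureCam = go.AddComponent<Camera>();
        captureCam.enabled = false; // never renders to screen; we call it manually
    }

    void LateUpdate()
    {
        // 63 = bitmask for all six cubemap faces.
        captureCam.RenderToCubemap(cubemap, 63);
    }
}
```

If something like this is viable, I'd then pass `cubemap` to the compute shader (I assume via `ComputeShader.SetTexture`), but I have no idea whether that's the right approach performance-wise compared to the raycasting idea.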