I'm curious how much detail you've all been able to push when developing games for the regular (retail) console. I'm seeing performance close to my iPhone 8, and I have no idea why. Maybe it's the 2K textures on my rocks? Deferred rendering? The real-time sun/moon light? The Standard Shader?

Here's an example (image). No rocks in this shot, but I hit the same issue anyway. As you can tell, there's not much going on. Even with the god rays turned off, same problem. I'm getting barely 30 FPS, and it's not a steady 30; it's jerky enough that you can't even turn the player without noticing the hiccups.

I know the console should be able to handle more detail than this. Does getting a dev kit really unlock that much more performance? The specs for the Creators Program don't seem that bad: 4 GB of RAM, full GPU utilization, and 4 primary CPU cores plus some shared cores.

My main goal with the Creators Program is to build a really good demo, or at least a really good test, to show Xbox when applying for a dev kit. But if the full power of the Xbox isn't much beyond this, then console dev just isn't for me. I could have sworn the Xbox could push more than 300K triangles before performance died, and certainly more than 50 shadow casters and 247 batches. My iPhone can handle more detail than this and stay steady.