TL;DR: skinning seems far too slow on a new, powerful machine; got curious about CPU vs. GPU skinning, and about how GPU skinning works with respect to custom vertex shaders etc.

So, the backstory: I got a brand-new, rather powerful work machine (i7-4900MQ, Quadro K2100M, 32 GB RAM) and used the occasion to finally upgrade my project straight from Unity 4.3.4 to 4.5.1, and also to switch from GL-ES to GL for the time being. Three big changes at once, not the brightest of ideas, but I did work out all the kinks in the end.

Then I noticed the scene still renders very much slower than on my previous two-year-old (and in many other ways much lesser) hardware! Toggling through game objects, I realized that a single SkinnedMeshRenderer was halving the framerate. With it disabled I got a framerate in the expected range (60-100 FPS); re-enabled, I was at 15-30 FPS tops. That's in the editor's Game View, with soft shadows, one directional light, and the forward path plus custom deferred post-process lighting.

Changing the bone count (Blend Weights) in Quality Settings from 4 to 2 and 1 and back made no difference whatsoever. And this was regardless of whether GPU skinning was enabled or not (I'm on the Pro edition). The SkinnedMeshRenderer in question was a 4-pass one (9k verts, 12k tris, 4 submeshes). I switched to another, much lower-poly, single-pass SkinnedMeshRenderer, which performed only slightly better, still in the 30-40 FPS range.

Pretty ridiculous, but I suppose it might be a skinning regression in 4.5.1 or some such. C++ gamedevs outside the Unity ecosystem have been skinning tens or hundreds of meshes on both CPU and GPU for perhaps 15 years now, on now-ancient PCs and on various long-gone console generations. But throw a single skinned mesh at Unity on a 2014 high-spec mobile workstation and you get a whopping 20 FPS... anyway, I digress.

Now, why do I post this in ShaderLab?! Here's why: I noticed that GPU vs. CPU skinning didn't seem to make a difference.
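For context on what that Blend Weights setting should be changing: skinning (whether done on CPU or GPU) is generally linear blend skinning, where each vertex is transformed by up to four bone matrices and the results are blended by per-vertex weights. A minimal CPU-side sketch, with bone transforms reduced to pure translations to keep it short (a real implementation uses full 4x4 skinning matrices, i.e. bone-to-world times inverse bind pose), and with made-up function names:

```python
def skin_vertex(position, bone_translations, bone_indices, bone_weights,
                max_influences=4):
    """Blend up to `max_influences` bone transforms for one vertex.

    Dropping max_influences from 4 to 2 or 1 mirrors what the
    'Blend Weights' Quality Setting is meant to do: discard the
    weakest influences and renormalize the rest to sum to 1.
    """
    # Keep only the strongest influences, as an engine would when clamping.
    pairs = sorted(zip(bone_weights, bone_indices), reverse=True)[:max_influences]
    total = sum(w for w, _ in pairs)
    out = [0.0, 0.0, 0.0]
    for w, i in pairs:
        w /= total  # renormalize after dropping influences
        t = bone_translations[i]
        for axis in range(3):
            # Full skinning would be: out += w * (M_i @ position);
            # here M_i is just a translation t.
            out[axis] += w * (position[axis] + t[axis])
    return out
```

For example, a vertex at the origin weighted 0.75/0.25 between two bones translating along X and Y lands at (0.75, 0.25, 0); clamp to one influence and it snaps fully to the dominant bone. The per-vertex cost scales with the influence count, which is exactly why dropping from 4 bones to 1 making *zero* difference smells like the skinning path isn't the real bottleneck.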
That got me wondering: what the heck is actually going on with Unity's skinning internally, and how does it do GPU skinning? Does it notify you in the Console if it falls back to CPU for specific SkinnedMeshRenderers, or for the current game run? Does it ever fall back at all, or is GPU skinning always guaranteed to happen when enabled? Note that I'm running Unity with -force-opengl, but I don't suppose GPU skinning is DX-only; I guess it works for all the Mac users, so it's gonna work on Windows GL with up-to-date NVIDIA drivers!

So I forced my way into this thread, and when @MakeCodeNow replied this: ...then I realized we need a freaking in-depth guide to what exactly Unity's GPU skinning is doing to our custom vertex shaders, if anything. I'm not using Surface Shaders but custom-coded vertex+fragment Cg programs. No problems there, they work nicely. But now I'm hearing "GPU skinning means it's done in a vertex shader", and I'm wondering: not in any of the vertex shaders I have written so far! Does it wrap my vertex shader? How about some basic "in-depth behind-the-curtains internals" documentation?

I now know that "Unity has a lot of magic in its shader pipeline, which is probably why you don't see that code", and probably also why I don't see any high-performance rendering of a simple single SkinnedMeshRenderer. So, details please, anyone at UT and any others with the inside track: What does "GPU skinning" do to my custom vertex shaders? How do I know for sure it's active on the GPU vs. the CPU? And where's the GPU-skinning vertex-shader source code Unity seems to somehow add to vertex shaders? I haven't seen any such stuff in the usual cgincs, and it would be odd for Unity to publish all its other shader sources but not this.
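For anyone else puzzling over the same question, there are really only two common designs here, and only one of them would touch your vertex shader. Either the engine injects skinning code ahead of your vertex program (plausible for generated Surface Shaders), or it runs a separate skinning pre-pass that writes skinned vertices into an intermediate buffer (D3D stream-out / GL transform feedback style), after which your vertex shader runs completely unmodified. The second design would explain why custom shaders never see any skinning code. A toy model of that split, with made-up names, and emphatically NOT Unity source:

```python
def user_vertex_shader(position):
    # Stand-in for a custom vertex program: just adds a model-space offset.
    return [p + 1.0 for p in position]

def skin(position, bone_offset):
    # Stand-in for full matrix-palette skinning (translation only here).
    return [p + o for p, o in zip(position, bone_offset)]

def draw_injected(positions, bone_offset):
    # Design (a): skinning code fused into the same vertex-shader
    # invocation, i.e. the engine "wraps" the user's shader.
    return [user_vertex_shader(skin(p, bone_offset)) for p in positions]

def draw_prepass(positions, bone_offset):
    # Design (b): a separate pass skins into an intermediate buffer
    # (stream-out / transform feedback)...
    skinned_buffer = [skin(p, bone_offset) for p in positions]
    # ...then the draw call runs the untouched user shader on that buffer.
    return [user_vertex_shader(p) for p in skinned_buffer]
```

Both routes produce identical vertices, which is precisely why you can't tell them apart from shader output alone; you'd have to catch it in a frame debugger or profiler (e.g. an extra pass or buffer write before the draw) to know which one is actually happening.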