I am working on a shader that creates geometry per vertex, and I finally have the geometry part working. What I want to do now is apply a texture map to it. When I attempted this, the shader placed the geometry I created at each vertex as expected, but it then applied the entire _MainTex texture to each of the new geometries (spheres), which wasn't what I was expecting. What I am trying to do is sample the color at a given UV coordinate based on the "source" mesh.

To clarify: if I have a plane with a ten-by-ten grid and I place a sphere at each vertex, I want to be able to color all 100 spheres by making a single texture map and applying it to the grid. The spheres would take their color from the texture at the source vertex's UV coordinate, allowing me to recreate a photographic image across the spheres (assume the plane is lying flat and the UVs run 0-1). It would be like using a photograph (texture) to drive the colors of a digital Lite-Brite; I am not interested in coloring the individual lights with an array.

My question: is it possible to access the "source mesh" UVs in a geometry shader and apply them to runtime-created geometry? I couldn't readily find any examples that mimic the effect I was going for. I realize this is a really broad question, but I am not seeking a specific solution so much as an understanding of how the relationships within the geometry shader work.
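In case it helps frame the question, here is a minimal sketch of what I understand the data flow to be. All struct and function names are my own guesses, and I emit a small triangle as a stand-in for the sphere geometry; the point is that the geometry shader's input struct carries the source vertex's UV, which can be copied unchanged to every vertex it emits:

```hlsl
struct v2g {
    float4 vertex : POSITION;
    float2 uv     : TEXCOORD0;   // UV of the source-mesh vertex
};

struct g2f {
    float4 pos : SV_POSITION;
    float2 uv  : TEXCOORD0;      // same UV copied to every emitted vertex
};

v2g vert (appdata_base v) {
    v2g o;
    o.vertex = v.vertex;
    o.uv = v.texcoord.xy;        // pass the mesh UV through unchanged
    return o;
}

[maxvertexcount(3)]
void geom (point v2g input[1], inout TriangleStream<g2f> stream) {
    // Emit a small triangle (stand-in for the sphere) at the source vertex.
    const float s = 0.05;
    float3 offsets[3] = { float3(-s, -s, 0), float3(s, -s, 0), float3(0, s, 0) };
    for (int i = 0; i < 3; i++) {
        g2f o;
        o.pos = UnityObjectToClipPos(input[0].vertex + float4(offsets[i], 0));
        o.uv  = input[0].uv;     // key step: reuse the source UV, don't generate new ones
        stream.Append(o);
    }
}

sampler2D _MainTex;

fixed4 frag (g2f i) : SV_Target {
    // Because every emitted vertex shares one UV, the whole primitive
    // samples a single point of _MainTex -- a flat color per "sphere".
    return tex2D(_MainTex, i.uv);
}
```

If this is the right mental model, then the earlier behavior (the full texture appearing on each sphere) would come from generating fresh 0-1 UVs per sphere instead of reusing the source vertex's UV; but I'd like confirmation that the input struct is indeed the only channel for source-mesh data.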