Sorry, still no actual code or pretty pictures!
I just wanted to write up a quick note related to the second topic from my previous post. I did some Googling to see whether anyone else has implemented a similar system, and indeed some people have. In fact, I found an article by Microsoft that describes exactly what I was talking about.
In the article, they generate procedural materials dynamically, so they have to compute normals dynamically as well.
From the article:
"One solution is to run the shader multiple times and compute the difference in height at each sample point. If we calculated the height one pixel to the right of the currently rasterized pixel and one pixel above the currently rasterized pixel, we could compute tangent and bitangent vectors to the central pixel. Doing a cross product on these would give us the normal for that point."
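The quote describes a plain finite difference: sample the height at the current pixel, one step to the right, and one step up, build tangent and bitangent vectors from the differences, and cross them. Here is a minimal sketch of that math in Python (the real thing lives in an HLSL shader; `height` below is a hypothetical stand-in for whatever procedural height/noise function is being evaluated):

```python
import math

def height(x, y):
    # Hypothetical stand-in for the shader's procedural height function.
    return 0.1 * math.sin(x) * math.cos(y)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    length = math.sqrt(v[0] ** 2 + v[1] ** 2 + v[2] ** 2)
    return (v[0] / length, v[1] / length, v[2] / length)

def normal_at(x, y, step=1.0):
    h0 = height(x, y)
    hr = height(x + step, y)  # one pixel to the right
    ht = height(x, y + step)  # one pixel above
    # Tangent and bitangent from the central pixel to its neighbors.
    tangent = (step, 0.0, hr - h0)
    bitangent = (0.0, step, ht - h0)
    # Their cross product is the surface normal at this point.
    return normalize(cross(tangent, bitangent))
```

With this ordering of the cross product the normal comes out pointing "up" (+z) for a flat height field, which is the usual convention for a tangent-space normal.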
What I found funny is that they start by discussing HLSL's ddx and ddy functions, but in the end they still use the render-target-plus-second-pass method.
"The solution that this sample uses by default is to render the perturbed heights of the objects in the scene into an off-screen render target. That render target is then read back in on another pass. For each pixel on the screen, its right and top neighbors are sampled. Tangent and bitangent vectors are created from the neighbors to the central pixel. A cross product between these will give the normal."
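Translated out of shader terms, that second pass is just a loop over the height render target. A sketch of the idea, assuming the target is a plain 2D array and clamping at the borders (one of several reasonable edge policies; the actual sample's border handling may differ):

```python
import math

def normal_from_samples(h0, right, top):
    # Tangent (1, 0, right - h0) crossed with bitangent (0, 1, top - h0)
    # expands to (h0 - right, h0 - top, 1); normalize the result.
    nx, ny, nz = h0 - right, h0 - top, 1.0
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / length, ny / length, nz / length)

def normals_from_height_target(heights):
    # Second pass: for each pixel of the off-screen height target,
    # sample its right and top neighbors and cross the tangent and
    # bitangent vectors to get the normal.
    rows, cols = len(heights), len(heights[0])
    out = [[None] * cols for _ in range(rows)]
    for y in range(rows):
        for x in range(cols):
            h0 = heights[y][x]
            right = heights[y][min(x + 1, cols - 1)]  # clamp at edge
            top = heights[min(y + 1, rows - 1)][x]    # clamp at edge
            out[y][x] = normal_from_samples(h0, right, top)
    return out
```

A flat height field yields (0, 0, 1) everywhere, and a ramp rising toward +x tilts the normals toward -x, which is a quick sanity check before wiring the same logic into a shader pass.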
I now feel very confident about this method, and I will proceed to implement lighting this way. I will probably branch off my existing planet codebase so I can easily compare the brute-force noise calculation against the "deferred" style.