Alternate Title:
EF BB BF - The Three Bytes of Doom
Last night I decided to whip out a quick SlimDX / DirectX 10 project implementing Perlin Noise using Shader Model 4.0. In my Perlin Noise on the GPU article, I mentioned how much easier it would be to implement Perlin Noise using SM4.0+ vs SM3.0. I had done a quick port of Ken Perlin's Java example implementation in FX Composer a couple months back, so I thought I would be able to implement the stand-alone version in less than an hour.
I wanted to test it first in a pixel shader, so I made my own custom vertex format as well as a class that would build a fullscreen quad vertex buffer. I took the Perlin Noise HLSL code, put it into a header file, and made a simple two-technique shader that included the header.
I fired up the code, but quickly got an exception (E_FAIL: An undetermined error occurred (-2147467259)) at the point where I was trying to create the InputLayout for the vertex buffer from the shader signature. Not a very useful exception.
At first, I thought it might have been a problem with my custom vertex format. After looking it over and comparing against other custom vertex formats I've made in the past, I determined the issue wasn't there.
Next I looked at how I was creating the vertex buffer and index buffer for the fullscreen quad. That all appeared in order too.
After determining that the issue was not in my custom vertex format or my fullscreen quad, I slowly stepped through the code, keeping a close eye on all of the variables. I didn't think there was a problem with my shader because no errors were being output from the compiler, and it was spitting out a valid Effect. However, when stepping through and watching the variables, I saw that even though it had created an Effect and the IsValid property was true, the TechniqueCount was zero! In fact, all of the counts were zero. It was as if the compiler were seeing an empty file.
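Looking back, a quick sanity check right after compiling the effect would have caught this long before the InputLayout call blew up. Here is a rough sketch of what I mean; it leans on the IsValid and TechniqueCount members mentioned above, and the exact property path (effect.Description.TechniqueCount) is an assumption that may differ between SlimDX versions.

using System;
using SlimDX.Direct3D10;

static class EffectChecks
{
    // Sanity check a freshly compiled effect instead of waiting for the
    // InputLayout call to fail with an unhelpful E_FAIL.
    public static void ValidateEffect(Effect effect, string compilationErrors)
    {
        if (effect == null || !effect.IsValid)
            throw new InvalidOperationException(
                "Effect failed to compile: " + compilationErrors);

        // A "valid" effect with zero techniques means the compiler effectively
        // saw an empty file (exactly what a stray BOM causes).
        // Note: Description.TechniqueCount is assumed here; the property path
        // may vary between SlimDX versions.
        if (effect.Description.TechniqueCount == 0)
            throw new InvalidOperationException(
                "Effect compiled but contains no techniques; check the .fx file for a UTF-8 BOM.");
    }
}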
So, I next looked at my shader in detail. I thought maybe something funky was happening with the included header file, so I inlined all of the header code. I still got the exception. I thought it might be some random issue in one of my noise functions, so I changed them all to return 1.0. Exception. I triple-checked all of my input and output structures. I looked over my technique description. I changed my vertex shader to be a simple pass-through and the pixel shader to just return white. Exception.
What the heck was going on? I had other shaders that were compiling just fine. So, just as a "stupid" test, I copied one of my other working shaders and pasted all of its code into my fx file, overwriting everything in it. I still got the exception!
Now I knew something was really messed up somewhere. Here I had two fx files with the exact same HLSL code in them, but one was compiling while the other was not.
I opened both files in the Binary Editor in Visual Studio to compare them byte by byte. The file that was not compiling had three extra bytes at the beginning - EF BB BF. I deleted those three bytes, and everything worked!
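If you would rather not hand-edit bytes in the Binary Editor, a few lines of plain .NET can do the same check and fix. This is just a sketch using standard System.IO calls; StripUtf8Bom is a made-up helper name.

using System;
using System.IO;

static class BomFixer
{
    // Detects a UTF-8 BOM (EF BB BF) at the start of a file and rewrites
    // the file without it. Returns true if a BOM was found and removed.
    public static bool StripUtf8Bom(string path)
    {
        byte[] bytes = File.ReadAllBytes(path);

        bool hasBom = bytes.Length >= 3 &&
                      bytes[0] == 0xEF && bytes[1] == 0xBB && bytes[2] == 0xBF;

        if (hasBom)
        {
            byte[] stripped = new byte[bytes.Length - 3];
            Array.Copy(bytes, 3, stripped, 0, stripped.Length);
            File.WriteAllBytes(path, stripped);
        }

        return hasBom;
    }
}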
It turns out that byte sequence is the Byte Order Mark (BOM), which marks the file as UTF-8 encoded. Apparently this is the default encoding for files created in Visual Studio 2008. Unfortunately, the FX compiler in DirectX can't handle the BOM; it just dies and returns an empty Effect.
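If you ever write shader files out from code, the same gotcha applies there: Encoding.UTF8 in .NET emits the BOM, while new UTF8Encoding(false) does not. A minimal sketch (SaveShaderWithoutBom is just an illustrative name):

using System.IO;
using System.Text;

static class ShaderIo
{
    // Writes HLSL source as UTF-8 *without* the EF BB BF signature, so the
    // FX compiler will accept it. Passing Encoding.UTF8 instead would emit the BOM.
    public static void SaveShaderWithoutBom(string path, string hlslSource)
    {
        File.WriteAllText(path, hlslSource, new UTF8Encoding(false));
    }
}

Inside Visual Studio itself, the equivalent fix is to re-save the file through File > Advanced Save Options and pick the UTF-8 encoding without a signature.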
I did a quick Google search after fixing the problem and saw that several other people had run into the same issue and eventually came to the same solution. I really wish the compiler would return an error in a situation like this.
What I find very interesting is that I have been programming shaders for over two and a half years and have never run into this problem before.
Hopefully this post helps someone else who runs into the same problem.
Until next time...