32-Bit worse than 16-Bit... you're probably thinking I'm on crack, but let me tell you it's true, and it's just a symptom of what turns out to be a very bad hardware bug in the Voodoo 5 6000. More on that later though; first, a bit of background on how I managed to find this problem. Just a note that this 'tale' contains some very light spoilers for Deus Ex.
This story begins in February 2001 when I first got my V56K. One of the first things I did was try out various games with FSAA. I was kind of puzzled that when playing Deus Ex there seemed to be more banding with FSAA than I would otherwise expect. This was particularly noticeable in a couple of places. One of them was the zoom into Bob Page's eye right at the start of the intro. The other was a scene (also in the intro) where the camera scrolls right through the Hell's Kitchen free clinic showing all the bodies. Here is a link to a correct 32-Bit screenshot from Deus Ex that shows what I expected to see (note it's fairly dark so you may want to view it fullscreen). However, this is not what I got at all. Here is a link to a different screenshot showing what I was actually getting. Note that there is now a fairly extreme amount of banding on the near wall, and that the image is brighter overall. This somewhat puzzled me, since it seemed to look worse than what I was getting with my old Voodoo3. So, I decided to test things out with FSAA off, and the output that was produced was normal. At the time I didn't think much of it; in the game itself I can't say I noticed any problems, but then I didn't really play it. I only got about halfway through the Liberty Island mission, and that was it.
Skip forward to earlier this year. I was over at a friend's place, and he had a 'spare' 17" monitor just lying around. Since I only had an old 14" he offered to lend it to me. Obviously I took up his offer and grabbed the monitor. Most unusually, when we started to play Quake 3, bright reds, yellows and whites began to bleed across to the right of the screen. This wasn't anything that I had really noticed with my old monitor. There was a tiny bit of bleeding, but it was 1 pixel at most, so I hardly noticed it. However, with the new monitor the bleeding could be quite severe in places, covering what I would guess to be at least 10 pixels (example). I never took much notice of it since I assumed it was a monitor issue. It mostly only affected interface components, and was rarely visible in the game world itself. In addition, it only occurred, as far as I could tell, in Quake 3. Turning off FSAA seemed to fix the problem.
We now skip forward to last week. I'm sitting here, bored with nothing to do, so I decide to replay Deus Ex. I had been intending to replay it for some time in all the glory that is 4x FSAA. So I start a new game and I notice a tiny bit of banding in places. I didn't take much notice of it, as I was expecting it. So, I was going through the game and had got up to VersaLife in Hong Kong. The banding was beginning to annoy me, so I decided to try the D3D renderer in 32-Bit (I was previously using Glide). So I change the renderer and increase my gamma a fraction since things were looking a tad dark. Most unexpectedly, the white boxes around objects began to bleed red across the screen just like in Quake 3. To compensate, I decreased my monitor's contrast, which I had previously found reduced the bleeding in Quake 3. So I continued to play the game until I came to Paris. This is where I really noticed that something was very obviously wrong. In a huge number of dark places there was absolutely no detail in the textures. The banding was very severe; it looked as bad as doing 2-pass multitexturing in 16-Bit. This made little sense. Deus Ex should be multitexturing, and I was using 32-Bit... something was amiss.
So I did a few experiments. I started to see what effect different FSAA modes would have. 4x and 8x were both producing similar image quality; 8x just had better antialiasing. Interestingly though, no FSAA and 2x FSAA didn't have any banding, and, oddly enough, the output in those two modes was quite a bit darker than 4x and 8x. In fact, they looked much better. This intrigued me quite a bit. Something was obviously amiss, but what could it be...
I play a bit more, and in Morgan Everett's place I notice more banding. Then it hits me... maybe, just maybe, a bit or two is being lost somewhere. My initial thought was that the hardware might be broken in that it's dividing each of the samples first, then combining them, rather than the correct method of combining then dividing. I had already noticed such a bug in the Glide drivers and I was thinking perhaps they were just replicating the behaviour of the actual hardware in the drivers. It seemed somewhat reasonable, and as it turns out, I was on the right track, but that wasn't the problem at all.
I wrote a small program, called Illuminate (download here if you want it), that produces a series of gradients. What I was aiming to do was see if the 4x FSAA modes were losing bits. If they were, dark gradients with gamma correction set reasonably high should reveal a combining problem. So, I run the program, and as suspected, Illuminate revealed that all colour channels in all 4x and 8x FSAA modes were actually losing 1 bit. I was quite shocked. I was even more shocked when I ran it in 16-Bit mode with no FSAA, and the resulting gradients were actually BETTER than what was produced in 32-Bit 4x and 8x FSAA. Don't believe me? Here's a normal 32-Bit no-FSAA shot, here's a shot that simulates the output in 32-Bit FSAA, and finally here's a shot from 16-Bit no FSAA. If you look carefully, you'll notice that 3dfx's 16-Bit is actually giving effective 22-Bit output, while the 32-Bit FSAA shot is giving an effective output of only 21-Bit! Note that the 16-Bit shot has higher green precision than the 32-Bit FSAA shot. I was obviously stunned. All 3 shots have had gamma correction applied to make it easier to see the colours.
4x and 8x FSAA are only outputting 21-Bit, not the 24-Bit expected. The results were the same with both 16-Bit and 32-Bit FSAA. Very strange, I thought. The problem didn't exist with 2x FSAA. The next thing I did was attempt to isolate which bit was actually causing the problem. I modified Illuminate to highlight (by messing with the gamma table) the pixels that had a certain bit set (* key on the keypad). The results were initially strange. I fully expected the least significant bit to be the one missing, which would suggest a simple combining error... but the results were anything but. The actual missing bit is not the least significant; it's the MOST SIGNIFICANT bit. This was very, very strange indeed.
So I messed with the gamma table a bit more and blanked out all of the colours >127, and it had no effect at all on the screen output. Then it hit me: is it possible that the inputs into the gamma table are being divided by 2, and the gamma-adjusted value then being multiplied by 2? Why this would be occurring, I don't know, but if it were, it would cause serious problems. Any gamma curve that wasn't flat would be incorrect when halving the input and doubling the output; you would begin to get outputs above 1.0!
I modified Illuminate a bit more, and started to increase the gamma correction (+ and - keys). The results were as I feared. As I increased the gamma correction, all of the colours became brighter, including colours that were already at maximum. As I pushed gamma even higher, colours started to bleed across to the right, as I had previously experienced in Quake 3 and Deus Ex. The card definitely seemed to be doing:
output = 2*gamma_table[input/2];
I was shocked, because it also appeared that the card was doing the multiplication in analog space. The output to the monitor was becoming way out of spec. In fact, it was getting up to 2 times out of spec in some cases.
The next thing to try was to modify Illuminate to produce a gamma table for only colours 0 to 127. In theory this gamma table should work properly if the hardware is prescaling then postscaling the output. The results ended up being as I expected. The output from Illuminate was 100% correct with the 'special' gamma table (toggled with the space bar). There was no bleeding when increasing gamma correction, and other than the 1 bit lost, the output was identical to no FSAA and 2x FSAA. I then thought back to Deus Ex: maybe this messed-up gamma correction is what was actually causing all the banding problems.
I modified the GlideXP drivers to compensate for the messed-up gamma handling in 4x and 8x modes and ran the game. Almost like magic, all of the problems that I had previously noticed disappeared. No longer was the banding so obvious. It was still there if you knew exactly where to look, but it looked much better than what I was getting before.
In the end, the one thing I have yet to work out is why the hardware is doing this in the first place. Having had various people try out Illuminate, I've found the problem only shows itself on Voodoo 5 6000s; it does not occur on Voodoo 5 5500s. As a guess, I'm thinking it's related to the analog SLI, but it's impossible for me to actually know.
So, there you have it, my tale about how I found a rather severe bug in my Voodoo 5 6000. After using this card for about 18 months now, I have to say it was worth the amount I paid for it: $0. There is no way I could ever recommend someone buy one of these beasts. Too many image quality problems, too many incompatibility problems, and the card is just too damn big.