1. johnwa+ (OP) 2023-03-06 10:58:46
As soon as other cards had 32-bit color, Voodoo cards with 16-bit color looked a lot worse.
replies(1): >>rasz+gi
2. rasz+gi 2023-03-06 13:30:39
>>johnwa+(OP)
Nvidia's 16-bit color mode, up to and including the GF256, was rendered internally at 16-bit precision and looked really bad, while their 32-bit mode was mostly a marketing bullet point because of the roughly 100% performance hit when enabled. 3dfx used a 16-bit frame buffer but rendered internally at 32 bits, and the output was dithered, giving effectively more than 20-bit color: https://www.beyond3d.com/content/articles/59/
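To make the dithering point concrete, here is a small sketch in C. This is not 3dfx's actual hardware path, just an illustration of how an internally higher-precision color value can be ordered-dithered down to an RGB565 frame buffer so that the average over a small tile preserves more precision than the 16-bit storage suggests:

```c
#include <stdint.h>
#include <stdio.h>

/* 4x4 Bayer threshold matrix, values 0..15. */
static const uint8_t bayer4[4][4] = {
    { 0,  8,  2, 10},
    {12,  4, 14,  6},
    { 3, 11,  1,  9},
    {15,  7, 13,  5},
};

/* Quantize an 8-bit channel down to 'bits' bits with ordered dithering.
 * The per-pixel threshold decides whether to round up or down, so the
 * average over a neighborhood keeps some of the discarded precision. */
static uint8_t dither_channel(uint8_t v, int bits, int x, int y)
{
    int drop = 8 - bits;                                  /* bits being discarded   */
    int threshold = (bayer4[y & 3][x & 3] << drop) >> 4;  /* scaled to 0..2^drop-1  */
    int q = (v + threshold) >> drop;                      /* round up or down       */
    int maxq = (1 << bits) - 1;
    return (uint8_t)(q > maxq ? maxq : q);
}

/* Pack dithered channels into one 16-bit RGB565 frame-buffer word. */
static uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b, int x, int y)
{
    uint16_t r5 = dither_channel(r, 5, x, y);
    uint16_t g6 = dither_channel(g, 6, x, y);
    uint16_t b5 = dither_channel(b, 5, x, y);
    return (uint16_t)((r5 << 11) | (g6 << 5) | b5);
}

int main(void)
{
    uint8_t v = 100;   /* falls between two 5-bit levels (96 and 104) */
    int sum = 0;
    for (int y = 0; y < 4; y++)
        for (int x = 0; x < 4; x++)
            sum += dither_channel(v, 5, x, y) << 3;   /* back to 8-bit scale */
    printf("input %d, dithered tile average %.1f\n", v, sum / 16.0);

    (void)pack_rgb565;   /* unused in this tiny demo */
    return 0;
}
```

Averaged over the 4x4 tile, the dithered 5-bit output reproduces the input value 100 exactly, which is the sense in which the effective depth exceeds the stored 16 bits.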

"Voodoo2 Graphics uses a programmable color lookup table to allow for programmable gamma correction. The 16-bit dithered color data from the frame buffer is used an an index into the gamma-correction color table -- the 24-bit output of the gamma-correction color table is then fed to the monitor or Television."
