The thread is about comparisons of GeForce and Radeon texture filtering, mostly regarding DirectX 9 and later cards. Radeon HD cards have a texture LOD bug with the Source engine that causes extreme texture aliasing. This occurs in at least HL2 and Dark Messiah of Might & Magic. You need to drop the mip map quality down to Performance in the control panel; from what I understand, this makes these cards behave more like NVIDIA's. I did some web searches, and apparently this issue stirred up some controversy back then. It's not limited to Source, but there is something extra special going on there. I am not sure how the 5000 series behaves. You will also certainly run into OpenGL issues with older games and a Radeon, especially after Catalyst 7.11 (which can only be used with the 3870 and older); 7.12 has an overhauled OpenGL driver that removed some old extensions. That is also a breaking point for bump mapping in the D3D8 Star Wars Republic Commando. I did find that I needed to use older drivers for MSAA to work with HL2, though. It can obviously run on quite a wide range of driver releases, and it conveniently works great with the last driver released for the 8800 series, which I also played with a bit.

Did you have a main link that talked about the issues? Will the issue be obvious with a normal install of HL2 and standard settings?

The problems I was seeing seemed to be a rather straightforward LOD bias quirk. For some reason, with some games, ATI has things cranked way too sharp, and you get tons of texture aliasing. I think you will see the problems with any version of Half-Life 2; if you can see that aliasing in a still image, guess how it looks in motion. Probably on the X850 and X1950 too, but my brain is fuzzy on that. Some links about the filtering on the HD cards. I think there are a few different issues that were identified with the cards. The LOD thing is probably just their driver doing what they want (for some reason). Perhaps ATI identified that sharper LOD is more pleasing in screenshots, and if you don't enable AF, it looks better than plain bilinear/trilinear in most cases. This is remedied by dropping the mip mapping slider to Performance (IIRC). I am not sure if that is lower quality than what NVIDIA renders by default, but it makes the Radeon image much less irritating. There are also some quirks with the mip mapping in general, but those would be less obvious than this LOD issue. There was actually an official statement from ATI regarding the LOD sharpness on the X800 cards. So yeah, it seems like a conscious decision to cause this problematic sharp filtering.

I've recently acquired an S3 Virge. While I already had the DX version, I know that the 3D engine was improved for that card, so I wanted to see what kind of framerates I could really get from the original decelerator. Also, as my DX has merely 2MB of video memory, I went out of my way to find a 4MB Virge this time, so I could test a few games all the way up to 800×600 (why I would want to suffer like that, nobody knows), or at least run certain tests without the constant threat of dropped textures. In Half-Life, the card outright refuses to texture the environment, even at 320×240… so I just took this picture at 640×480 for the extra crispness.

It should be pointed out that its popular nickname might be something of a misnomer, as the card did indeed "accelerate" rendering. Back then, it was sometimes hard to get even 320×240 to run well in software mode. If you stuck a Virge in there, kept the resolution low, and didn't bother with filtering or anything else, your game was gonna look a lot like software rendering… but faster. The only way to run Blood 2 properly: at 320×240 on Low settings, with lightmaps disabled. Unfortunately, many S3-specific games wanted to run at 512×384 or above, with bilinear filtering. Those engineers must have seriously overestimated their chances. Texture filtering! On a card that, by most accounts, needs 8 cycles for bilinearly filtered textures! It would be a bit like trying to run a 4K game on a regular Xbox One. Tough luck, CPU guys: Turok requires a 3D card. It runs decently at 320×240, as long as you disable texture filtering and fog and fancy stuff and… just about everything else… you know, maybe you should just take your chances with an N64. Enabling bilinear filtering incurs a lower framerate (those numbers up there didn't tell the whole story anyway), and an unneeded look at S3's dithering solution. Hardly the worst I've ever seen, at least. Steer clear of any fancy effects, and your games might actually run a bit faster than before.
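For readers wondering why a too-sharp LOD bias causes the texture aliasing described in the Radeon thread, here is a minimal sketch of mip level selection. Everything in it is illustrative: the function name, the bias values, and the use of rounding are my simplifications, not ATI's actual driver logic (real hardware typically blends between adjacent levels when trilinear filtering is on).

```python
import math

def mip_level(texels_per_pixel: float, lod_bias: float, max_level: int) -> int:
    """Pick a mip level from the screen-space texel footprint.

    A footprint of 1 texel per pixel maps to mip 0; each doubling of the
    footprint steps down one mip. A negative LOD bias (a "sharper" setting)
    keeps higher-resolution mips in use for longer, which looks crisper in
    screenshots but shimmers and aliases in motion, because the texture is
    no longer prefiltered to match the pixel footprint.
    """
    level = math.log2(max(texels_per_pixel, 1.0)) + lod_bias
    return max(0, min(max_level, round(level)))

# A distant surface covering ~8 texels per pixel:
print(mip_level(8.0, 0.0, 10))   # unbiased: mip 3 (texture prefiltered 8x)
print(mip_level(8.0, -1.5, 10))  # sharp bias: mip 2, undersampled -> aliasing
```

Dropping the mip map slider to Performance, as suggested above, plausibly works by pulling this bias back toward (or past) zero, trading sharpness for stability.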
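As for why bilinear filtering was such a burden on the ViRGE: each bilinearly filtered sample needs four texel reads and three linear interpolations, versus a single read for point sampling. The sketch below shows the arithmetic on a plain 2D array of floats; it is a reference implementation for illustration, not how the chip itself is wired, and the 8-cycle figure is the review's claim, not something this code derives.

```python
def bilinear_sample(tex, u, v):
    """Sample a texture (2D list of floats, u/v in [0, 1]) bilinearly.

    Four texel fetches plus three lerps per sample -- the extra memory
    traffic and math that cheap mid-90s hardware struggled to hide.
    """
    h, w = len(tex), len(tex[0])
    x = u * (w - 1)
    y = v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    # Four texel fetches (point sampling would need just one).
    t00, t10 = tex[y0][x0], tex[y0][x1]
    t01, t11 = tex[y1][x0], tex[y1][x1]
    # Three lerps: two across x, one across y.
    top = t00 + (t10 - t00) * fx
    bot = t01 + (t11 - t01) * fx
    return top + (bot - top) * fy

tex = [[0.0, 1.0],
       [0.0, 1.0]]
print(bilinear_sample(tex, 0.5, 0.5))  # 0.5, halfway between the two columns
```

Disabling filtering, as the review recommends, collapses all of this to one fetch per pixel, which is exactly why the unfiltered ViRGE output looks like fast software rendering.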