Thread: Geforce FX
12-19-2002, 03:49 PM   #12
airspirit

Notice how I said 8500 and up? The old Radeons weren't anything to write home about, and they had a ton of problems, both hardware and driver-related. Oddly enough, at that time the Geforce cards were perfectly on spec. Now the tables have turned, and NVidia is reduced to buying off developers rather than making a proper product. These days, the only reason a Geforce card will run better than an ATi card (8500 and up) is that the game was specifically written to do so. For example: NVidia paid out their ass to make sure UT2K3 ran well on their GF4 Ti cards and was crippled on the Radeons (some of this was fixed in the latest patch). In the end, it is becoming harder for them to cripple ATi cards, because doing so would mean mangling the DX and GL code paths their own hardware relies on, crippling NVidia cards right along with them. In the UT2K3 case, even though the game was optimized for the GF4, the Radeon cards weren't hurt too badly (and now run perfectly with the new "fixer" patch).
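To give a rough idea of how a game ends up "written for" one vendor without touching DX or GL themselves, here's a hypothetical sketch in C. It's my own illustration, not code from UT2K3 or any real engine: the function name and PATH_* names are made up, though glGetString is standard OpenGL and GL_NV_vertex_array_range is a real NVidia-only extension from this era. The renderer checks the driver's vendor string after the GL context is created, and only one vendor gets the hand-tuned branch.

/*
 * Hypothetical sketch: vendor-specific fast path selection.
 * Call after the OpenGL context has been created, since glGetString
 * needs a current context to return the driver's strings.
 */
#include <GL/gl.h>
#include <string.h>

typedef enum { PATH_GENERIC, PATH_NV_TUNED } render_path;

render_path pick_render_path(void)
{
    /* glGetString is standard GL; these strings come straight from the driver. */
    const char *vendor = (const char *)glGetString(GL_VENDOR);
    const char *exts   = (const char *)glGetString(GL_EXTENSIONS);

    if (vendor && strstr(vendor, "NVIDIA") &&
        exts && strstr(exts, "GL_NV_vertex_array_range")) {
        /* NV-only extension present: take the branch the devs actually tuned. */
        return PATH_NV_TUNED;
    }

    /* ATi, Matrox, whoever: the untuned baseline.  Nothing is broken,
     * it just never gets the same attention. */
    return PATH_GENERIC;
}

The point is that the generic path keeps working on every card, so this kind of "optimization" never has to damage DX or GL. It just means one vendor's hardware gets the fast branch and everyone else gets the baseline.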

Whenever you see an NVidia label on a game box, you really need to ask why it's there, because the only thing that label gets you in the end is a more expensive graphics card if you buy one to play the game. If NVidia didn't pay off developers, their cards could cost as much as 30% less than they do now, but on the flip side, they wouldn't run games anywhere near as well as they do today.

I urge you guys not to support this kind of Microsoftish behavior.