ATI came to us a couple of days ago and mentioned this. Certainly ATI wants this story spun in their favor. We will be looking into this water rendering issue, but we are unsure if it is one of those points that really makes a difference in gameplay. Remember that HardOCP actually uses gameplay to evaluate games, so our editors have logged many hours playing Crysis, not just running the canned demos included. We have not noticed any obvious impact yet. Obviously driver optimizations are nothing new to the industry; you just have to ask yourself, "At what price?" Not the first time this has come up...
http://enthusiast.hardocp.com/articl...hlbnRodXNpYXN0
Our original Crysis comparisons are noted here in our 8800 GT article. And there are some water screen shots in there as well.
http://enthusiast.hardocp.com/articl...50aHVzaWFzdA==
Also, I think it is worth noting that we have championed gameplay quality as a backbone of video card testing for a while now. Is NVIDIA cheating, or are they simply making an acceptable IQ trade-off to deliver better performance? I would say that YOU need to be the judge of that, and how it impacts your gaming experience should be the basis for your opinion. Cheating the cheaters, anyone?
I will be interested in seeing whether the Elite Bastards article is a regurgitation of what ATI reported to them....
I think it is worth noting that the 169.0X drivers used in all this testing are clearly marked as "Beta." And lastly, is it worth ATI's time to start slinging mud now, when the 2900 series card is doing better in actual gameplay than it ever has before? Maybe ATI should have thought about that before it seeded the issue with EliteBastards.
Let's all get the WHQL driver and see what happens then.