aths
2005-08-17, 19:24:01
Please read through and correct serious language errors.
________________________________________________________________
High-End Chip G70: Only flickering AF?
With the launch of the GeForce 7800 GTX, Nvidia showed that the strong performance of its former SM3 flagship, the GeForce 6800 Ultra, could still be topped. Besides more and improved pipelines, the G70 offers higher clock speeds. Useful new antialiasing modes were added as well.
In terms of texture quality, however, Nvidia seems to rate the demands of its high-end customers fairly low. The new high-end chip G70 produces texture shimmering with anisotropic filtering (AF) enabled. Just as a reminder: the NV40 (GeForce 6800 series) behaves the same way at default driver settings, but there the problem can be remedied by activating the "High Quality" mode.
On the G70 chip, activating "High Quality" does <I>not</I> bring the desired effect: the card still shows texture shimmering. Has Nvidia, with the NV40's default settings, already accustomed users to AF texture flickering, so that the G70 no longer even offers a real option to produce flicker-free AF?
The anisotropic filter is supposed to improve texture quality, but on your new 500-dollar card the result is texture shimmering. One can hardly consider this a real "quality improvement". Good quality naturally has an impact on rendering speed. "Performance" in the sense of "power" is the amount of work W done per unit of time t; "performance" in the sense of an "act" or "appearance" includes its quality. That is why performance does <I>not</I> increase with such "optimized" AF.
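To spell out the physics definition invoked above (our own notation, not a quote from Nvidia's material):

P = W / t

Reducing the filtering work per frame raises only this P; "performance" in the sense of an act or appearance also depends on the quality of the result, which such "optimizations" lower.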
Someone spending several hundred bucks on a new card probably doesn't want to play without anisotropic filtering. One should also know that the 16x AF mode of the G70, as well as of the NV40, renders some areas with at most 2x AF, even though 4x or more would be needed for better textures there. 16x AF naturally does not mean that every texture in the entire image is sampled with 16x AF; the applied degree depends on the degree of distortion of the resampled texture. But at some angles, even a 16:1 distortion gets only 2x AF, resulting in blurry textures there.
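The relation between distortion and AF degree can be sketched in a few lines of Python. This is our own simplified illustration of an angle-dependent AF implementation as described above, not Nvidia's actual hardware logic; the function name, the 5-degree thresholds and the hard 2x cap are assumptions chosen only to reproduce the behaviour described in the text.

<pre>
import math

def af_degree(anisotropy_ratio, angle_deg, max_af=16):
    """Illustrative model only: pick the AF degree for a pixel footprint.

    anisotropy_ratio -- ratio of the footprint's major to minor axis (e.g. 16 for 16:1)
    angle_deg        -- orientation of the major axis relative to the screen axes
    """
    # Ideal, angle-independent AF: use as many samples as the distortion
    # demands, capped at the selected maximum degree.
    ideal = min(max_af, 2 ** math.ceil(math.log2(max(anisotropy_ratio, 1))))

    # Hypothetical angle-dependent cap, mimicking the behaviour described
    # above: only near-axis-aligned and near-45-degree footprints keep the
    # full degree, other angles are clamped hard (down to 2x in the worst case).
    a = angle_deg % 90
    if min(a, 90 - a) < 5 or abs(a - 45) < 5:
        cap = max_af
    else:
        cap = 2
    return min(ideal, cap)

# A 16:1 distorted texture at an "unlucky" angle only gets 2x AF:
print(af_degree(16, 22.5))   # -> 2
print(af_degree(16, 90))     # -> 16
</pre>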
That may have been acceptable in the days of the R300 (Radeon 9500/9700 series); today it is of course outdated (sadly, the R420 chip also suffers from this restriction; in fact, since the NV40 Nvidia has mimicked ATI's already poor AF pattern). For SM3 products like the NV40 or G70, we do not understand such trade-offs regarding AF quality.
Right at the launch of the GeForce 7800 GTX, only one site on the entire net described this problem of flickering textures: the <a href="http://www.hardware.fr/articles/574-5/nvidia-geforce-7800-gtx.html" target="_blank">article on Hardware.fr</a> by Damien Triolet and Marc Prieur. For the launch, we at 3DCenter offered, apart from benchmarks in the Quality mode only (out of carelessness, we believed what Nvidia wrote in the Reviewer's Guide), only technical material such as the improvements in the pixel shader. But as long as this texture problem exists on the G70, we see no reason to publish more about the shader hardware: first the multitexturing has to be satisfactory, then we can talk about other things, too.
In older news we stated that the cause of the G70's AF flickering was undersampling. Thanks to Demirug's investigations, we now know that the matter is more complicated than initially thought. The necessary texel data is actually read from the cache, <I>all needed texels</I> are read, but they are combined in the wrong way. Nvidia's attempt to produce the same quality with less work was a good idea, but unfortunately it failed here. This could be a hardware bug or a driver error.
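How exactly the G70 combines its samples internally is not known to us. Purely to illustrate how "all texels read, but combined the wrong way" can still produce shimmering, here is a small Python sketch; the function names, the sample data and the skewed weights are our own assumptions, not the actual hardware behaviour.

<pre>
def combine_correct(texels):
    # Reference: every fetched texel contributes with an equal weight, the
    # weights sum to 1, so the filtered value represents the whole footprint.
    w = 1.0 / len(texels)
    return sum(w * t for t in texels)

def combine_skewed(texels):
    # Hypothetical faulty combination: the same texels are fetched, but the
    # weights are concentrated on the first half of them (they still sum to 1).
    # The result then depends far more on the exact sub-texel position and
    # changes from frame to frame under motion, which is visible as shimmering.
    n = len(texels)
    weights = [2.0 / n] * (n // 2) + [0.0] * (n - n // 2)
    return sum(w * t for w, t in zip(weights, texels))

# Eight texels fetched along an 8x AF sampling line (made-up values):
line = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9]
print(combine_correct(line))  # 0.5  -> uses the whole footprint
print(combine_skewed(line))   # 0.25 -> effectively undersampled despite all fetches
</pre>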
In the end, it does not matter: the G70, delivering a texel power of more than 10 gigatexels per second, uses only 2x AF at some angles, although the user enabled a mode called "16x AF"; add to that the tendency of textures to flicker, and you have evidence of the G70's shortcomings. We also have to criticize Nvidia's statement in the Reviewer's Guide. It reads (bold typing as in the original):
<B>"Quality"</B> mode offers users the highest image quality while still delivering exceptional performance. <B>We recommend all benchmarking be done in this mode.</B>
This setting results in flickering on the NV40 and the G70, yet Nvidia calls it "the highest image quality". An image that claims to show "quality" (even without the "high") <I>must of course not tend to texture flickering</I>, but it does on Nvidia's SM3 product line. About High Quality, Nvidia says:
<B>"High Quality"</B> mode is designed to give discriminating users images that do not take advantage of the programmable nature of the texture filtering hardware, and is overkill for everyday gaming. Image quality will be virtually indistinguishable from <B>"Quality"</B> mode, however overall performance will be reduced. Most competitive solutions do not allow this level of control. Quantitative image quality analysis demonstrates that the NVIDIA "Quality" setting produces superior image fidelity to competitive solutions <B>therefore "High Quality" mode is not recommended for benchmarking.</B>
However, we cannot confirm this either: the anisotropic filter of the G70 chip <I>does</I> flicker even under High Quality, while the current Radeon cards do <I>not</I> flicker even with the "A. I." optimizations enabled. We would be really interested to learn how Nvidia's "quantitative image quality analysis" examines image quality.
<B>MouseOver: 16x AF à la GeForce 6/7 vs. 8x AF à la GeForce 3/4/FX</B>
The image shows a tunnel built of 200 segments (which causes the "fan out" effect at the border of the picture). Each color represents a new <a href="http://www.3dcenter.org/artikel/grafikfilter/index3.php" target="_blank">MIP level</a>. The closer to the center a new color begins, the more detailed (with a higher AF level) the texture is filtered. At several angles, e.g. 22.5°, the MIPs on the GeForce 6/7 start very early (near the border), because only 2x AF is applied at these angles. The GeForce 3/4/FX has an angle "weakness" at 45°, but already shows a far more detailed image at 8x AF, since the colored MIPs appear later, closer to the center.
Essentially, the GeForce 3/4/FX delivers the selected 8x AF for most of the image, while only a few parts are treated with 4x AF. The GeForce 6/7, in contrast, delivers the selected 16x AF only for minor parts of the image; only the 90° and 45° angles really get 16x AF. Most other angles are filtered far less than the enabled 16x AF would suggest. Large areas of the image are filtered with just 2x or 4x AF, so the overall quality improvement is not as good as the 8x AF of the GeForce 3/4/FX provides. In fact, the overall quality improvement of the NV40's and G70's so-called 16x AF is much smaller than that of 8x AF on the GeForce3 through FX. ATI does not reach the traditional GeForce quality either, but at least ATI improved its AF capabilities with every genuinely new chip generation (the R420 is a beefed-up R300 and not a truly new generation), while Nvidia lowered the hardware capabilities of its AF implementation compared to its own previous generation. The Reviewer's Guide is silent about that fact; it only highlights the availability of 16x AF since the NV40, making the reviewer think this must be better than the 8x AF provided by traditional GeForce products.
This angle dependency of the GeForce 6/7's AF adds an extra "sharpness turbulence" to the image: imagine very well filtered textures (with 16x AF) right next to only weakly filtered ones (with 2x AF), which strikes even the untrained eye unpleasantly. The geometry of "World of Warcraft" is a good example for such artifacts (on any Radeon as well as on any GeForce since the NV40). Shouldn't an SM3 chip be able to do better, or at least provide the texture quality of 2001's GeForce3?
This is very important, though: the AF tunnel allows no conclusion concerning underfiltering; it only shows which MIP level is used where. Normally, you should be able to tell which AF level is in use by comparing the pattern with a LOD-biased pattern. But in the case of undersampling, fewer texels are sampled than are actually needed. Undersampled 8x AF, for instance, is no true 8x AF: the right MIP map is chosen, but the wrong number of texels is used. As already stated, this tunnel only shows which MIP level is used where, not whether the AF is implemented correctly.
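To make the distinction clearer, a small Python sketch: it separates the MIP selection (which the colored tunnel visualizes) from the number of samples actually taken along the anisotropy axis (which the tunnel cannot show). The formulas are simplified assumptions of our own; real hardware uses more elaborate LOD math.

<pre>
import math

def mip_and_samples(du, dv, af_degree):
    """Illustrative only: what the tunnel shows vs. what it cannot show.

    du, dv    -- lengths of the pixel footprint's axes in texel space
    af_degree -- AF degree actually applied along the major axis
    """
    major, minor = max(du, dv), min(du, dv)

    # The MIP level is picked from the minor axis; this is what the colored
    # tunnel visualizes (smaller LOD = sharper texture).
    lod = math.log2(max(minor, 1.0))

    # A correct AF implementation also has to take enough samples along the
    # major axis; the tunnel cannot show whether this actually happens.
    samples_needed = math.ceil(major / max(minor, 1.0))
    samples_taken = min(af_degree, samples_needed)
    undersampled = samples_taken < samples_needed
    return lod, samples_needed, samples_taken, undersampled

# A 16:1 footprint that only receives 2x AF: the MIP level (and thus the
# tunnel image) can look fine, while the sample count is far too low.
print(mip_and_samples(16.0, 1.0, 2))   # -> (0.0, 16, 2, True)
</pre>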
==== Page 2 ====
<B>Videos demonstrating the effect of undersampling on GeForce 6/7 series graphics cards</B>
Note: if you experience stuttering during video playback, you can lower the playback speed by pressing Ctrl+Cursor Down in <a href="http://sourceforge.net/project/showfiles.php?group_id=82303&package_id=84358" target="_blank">Media Player Classic</a>. The required codec can be obtained by installing the latest version of <a href="http://www.3dcenter.org/downloads/fraps.php" target="_blank">Fraps</a>. Media Player Classic should be configured to repeat the video automatically. The video will probably stutter during the first run, but after that it should play smoothly.
We also advise disabling overlay playback and using VMR instead; this way you ensure a proper 1:1 rendering of the video stream.
The videos were made by Damien Triolet, and we have his permission to publish them here; we would like to thank him for his efforts. They were captured in Unreal Tournament 2003. In HQ mode (or with A. I. off), the first texture layer gets full trilinear filtering, while the other texture layers only get reduced trilinear. Since the videos use a demo with only a single texture layer, that layer is filtered with full trilinear in HQ mode, but real games use more layers. This looks like an application-specific "optimization" (meaning quality reduction). Nvidia is silent about it, since AF test tools show full trilinear filtering on every layer. Remark: do not trust texture quality test tools, since the driver treats games differently.
= New MouseOver =
Normal: prev_layer1.png, with MouseOver: prev_layer2.png
[Caption] For easier use, these images are scaled down. You can see both pictures in full resolution here: [Link: layer1.png and layer2.png] These images show the difference in the treatment of texture stages 1 and 2. The primary texture stage still gets full trilinear filtering in HQ mode. This is true for every texture stage if you check it with a texture test tool. In UT, however, every non-primary stage only gets heavily reduced trilinear.
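The "reduced trilinear" filtering mentioned above narrows the blend zone between two MIP levels. The following Python sketch shows the principle only; the blend-band width of 0.25 and its centering are our own illustrative assumptions, the actual driver thresholds are not known to us.

<pre>
def trilinear_weight(lod_fraction):
    # Full trilinear: blend linearly between the two adjacent MIP levels
    # across the whole fractional LOD range.
    return lod_fraction

def reduced_trilinear_weight(lod_fraction, blend_band=0.25):
    # "Brilinear"-style reduction (illustrative values): outside a narrow band
    # around the MIP transition only one MIP level is used (pure bilinear),
    # which saves samples but creates visible bands between sharp and blurry
    # areas, as described for the non-primary texture stages above.
    lo = 0.5 - blend_band / 2
    hi = 0.5 + blend_band / 2
    if lod_fraction <= lo:
        return 0.0
    if lod_fraction >= hi:
        return 1.0
    return (lod_fraction - lo) / (hi - lo)

for f in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f, trilinear_weight(f), round(reduced_trilinear_weight(f), 3))
</pre>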
The videos were not rendered with UT2003's standard LOD bias, but with a (correct) LOD bias of 0. This means: if the texture filter works correctly, there should not be any flickering. All videos were recorded with 8x AF enabled. The video resolution is 1024x768; the original speed was 20 fps (captured with a slow-motion tool), and the playback speed was set to 30 fps.
We advise you <B>to download just a single video first</B> to check whether your machine can play it properly. Both the high video resolution and the lossless codec result in a high system load. We therefore also offer a short description of what can be seen in each video.
"Quality" on a GeForce 6800 results in flickering. Furthermore, one can see the only partially applied trilinear filter: "Flickering bands" are followed by "Blurry bands" (areas where the texture is too blurry). In our opinion, this mode shouldn't be named "Quality.", but Nvidia decided to offer this "quality" as standard and advise do all benchmark with such poorly rendered textures.
"High Quality" on the GeForce 6800 is a borderline case: The textures looks if they are just starting to flicker, while they in fact do not flicker. Like all cards of the NV40 and G70 series, the 6800 also shows angle-dependant differences in sharpness, caused by the inferiour AF pattern compared to the GeForce3-FX series graphic cards.
Nvidia's new card features far greater raw texture power than the GeForce 6800, but also shows remarkably worse textures: the annoying flickering is obvious. According to Nvidia's Reviewer's Guide, though, this mode delivers "the highest image quality while still delivering exceptional performance". In our opinion, this quality is too poor to be offered to anyone.
In the GeForce 7800's "High Quality" mode, the flickering is reduced, and the image now looks better than in the GeForce 6800's default mode (which, however, delivers poor image quality). Yet the GeForce 6800's just barely flicker-free HQ mode is not reached: the user cannot configure the GeForce 7800 to use AF without flickering textures.
ATI's Radeon X800, even at default settings, already appears far superior to any GeForce 6800 or 7800. There are areas that tend to flicker faintly, but overall only the angle-dependent AF reduction in the tunnel is distracting. The GeForce 7800's "High Quality" image is clearly surpassed.
With A. I. turned off on the X800, no notable differences compared to activated A. I. can be seen.
As a reference, a GeForce FX in "High Quality" mode. This shows two things: not all GeForce cards show flickering; see the ground and wall textures, which are absolutely "stable". Furthermore, thanks to the superior AF implementation, the whole tunnel is textured as sharply as it should be with 8x AF.
<B>Conclusion:</B>
ATI's Radeon X800 shows that even with activated "optimizations" (meaning quality reductions) there are no flickering textures. While full trilinear filtering is not used, this is not noticed that quickly. Even though ATI's texture filtering hardware does not compute as exactly as a GeForce's, the overall image quality is better, because there are not as many questionable "optimizations". Angle-dependent AF, however, should no longer be considered acceptable for modern graphics cards; ATI's advertising of "High Definition" gaming can thus be seen as an unfulfilled promise straight from the marketing department.
Nvidia's current 7800 series offers graphics cards that cannot be recommended to lovers of texture quality, even though texel performance has increased by a factor of 2.5 compared to the GeForce FX 5800 Ultra! On top of the angle dependency (inspired by ATI's R300), there is now the tendency to texture flickering. The GeForce 6800 (or GeForce 6600) has to be set to "High Quality" to avoid texture flickering as far as possible. On the 7800 this appears to be futile: even in "High Quality", the new chip tends to texture flickering.
The quoted passages from Nvidia's Reviewer's Guide are easily disproved: Nvidia makes claims that are clearly refuted by the videos above. That means all benchmarks at default settings pitting a GeForce 7800 or 6800 against a Radeon are flawed, because Nvidia currently offers by far the worse AF quality. The Radeon's default settings are better (in terms of image quality) than the 6800's default settings, while the 7800's default settings are even worse. Consequently, the so-called "performance" should not be compared either. One should also not compare 7800 default vs. 6800 default or 7800 HQ vs. 6800 HQ, since the 7800's texture quality is lower. Real "performance" includes the image quality on screen.
What good is 16x AF if certain angles get at most 2x AF and if textures can flicker while other cards deliver textures free of flickering? All benchmarks using the default setting of the NV40 and G70 against the Radeon are <b>invalid</b>, because the Nvidia cards use general undersampling, which can result in texture flickering. The GeForce 7 series cannot be configured to deliver flicker-free AF textures, while the GeForce 6 series and the Radeon series can (of course) render flicker-free AF.
Should new driver versions bring any changes, we will try to keep our readers up to date.