nVidia: new (and old) filtering stunts
Hakkerstiwwel
2006-03-23, 11:34:31
Et voilà, page 1. Corrections and comments welcome :biggrin:
While anti-aliasing (AA) and anisotropic filtering (AF) only caught on among PC users a few years ago, these features have been available for more than half a decade. In those early days, however, they were hardly used, either because their value was misjudged or because the video cards simply lacked the power to deliver playable frame rates with them enabled. As an illustration, 3dfx's supersampling method, which until recently was unrivalled, used to be considered equivalent to a trivial blur filter such as nVidia's "Quincunx" modes. Even specialised hardware sites, which were well aware of the advantages of these IQ features, were unable to evaluate AA quality properly.
While the use of both AA and AF is only now emerging on gaming consoles, nowadays every available PC graphics card offers these options. Back in 2000 you needed a card such as the legendary Voodoo5 to obtain a flicker-free image with 4x SGSSAA. Today even a sub-100-euro card can deliver smooth edges and hence a relatively flicker-free image.
How is it possible, then, that it has become common knowledge that nVidia's flagship cards (G70/G71) tend to produce shimmering textures? Many people certainly wonder what you need to do to obtain the best possible image quality, who started all these "optimizations", and how you can escape this dilemma.
The optimal Image Quality
Without IQ-enhancing measures, a moving on-screen image tends to suffer from edge aliasing. This undesirable effect is by now more or less under control, thanks among other things to the new transparency AA modes. For an appealing visual impression, however, you need more than jag-free edges: good textures are at least as important. This is where the next refinement comes into play: a sharper presentation of distorted textures through anisotropic filtering. And this filter has been the subject of discussion for a long time.
It would be much easier if high-end graphics cards weren't benchmarked against each other. For the sole reason that the options shown in the control panels look similar at first glance, all possible settings have been pitted against one another in the past: nVidia's 16-bit vs 3dfx's 24-bit rendering, nVidia's trilinear filter against 3dfx's mipmap dithering, the NV20's trilinear AF against the R200's bilinear AF, and the R200's supersampling AA against the NV20's multisampling AA.
From time to time, screenshots were presented for image quality comparisons. The ingenuous reader was unaware that these screenshots were not very meaningful, so it was no problem to cater to such readers and present badly compressed screenshots. Spotting differences between test scenes on screenshots is not always easy, which is why the relevant areas are often magnified with a bilinear or bicubic filter. As a consequence, the differences, which often amount to just one or two pixels, are smeared into the adjacent pixels. (The correct filter to use would be nearest neighbour.)
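The difference between the two magnification filters can be sketched in a few lines of Python (the tiny 2x2 "screenshots" are of course just stand-ins): nearest neighbour replicates pixels and keeps a one-pixel difference at full contrast, while bilinear interpolation smears it into the neighbouring pixels.

```python
def nearest_upscale(img, k):
    """Upscale a 2D grayscale image by factor k by replicating pixels."""
    return [[img[y // k][x // k] for x in range(len(img[0]) * k)]
            for y in range(len(img) * k)]

def bilinear_upscale(img, k):
    """Upscale a 2D grayscale image by factor k with (edge-clamped)
    bilinear interpolation, as many image viewers do by default."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(h * k):
        sy = max(0.0, (y + 0.5) / k - 0.5)   # source position of this output pixel
        y0 = min(h - 1, int(sy)); y1 = min(h - 1, y0 + 1); fy = sy - y0
        row = []
        for x in range(w * k):
            sx = max(0.0, (x + 0.5) / k - 0.5)
            x0 = min(w - 1, int(sx)); x1 = min(w - 1, x0 + 1); fx = sx - x0
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

# Two 2x2 "screenshots" that differ in exactly one pixel:
a = [[0, 0], [0, 0]]
b = [[0, 255], [0, 0]]
# nearest_upscale keeps the difference at full contrast (only 0s and 255s);
# bilinear_upscale produces intermediate greys that hide its true extent.
```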
The actual benchmarks, by contrast, are presented in the most sophisticated ways. Bars fade into the scene, multicoloured graphs distract from the fps, yet all these figures and charts are meaningless as long as there is no common denominator: the image quality. For AA, a consensus has meanwhile been reached: gamma-2.2-adjusted 4x MSAA, with the option to activate transparency AA to smooth the texture aliasing caused by alpha testing, is comparable between ATI and nVidia. For AF, however, only the number in front of the setting counts, regardless of the degree of filtering actually applied. This fact is of course exploited by the GPU vendors to obtain higher fps.
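As a side note on why the AA consensus insists on gamma adjustment: averaging samples directly in gamma space darkens edge gradients, so the samples should be averaged in linear light. A minimal Python sketch (assuming a pure power-law gamma of 2.2, which ignores the sRGB linear segment):

```python
def downsample_gamma_naive(samples):
    """Average 8-bit AA samples directly in gamma space (comes out too dark)."""
    return sum(samples) / len(samples)

def downsample_gamma_correct(samples, gamma=2.2):
    """Decode to linear light, average, re-encode: gamma-adjusted downsampling."""
    linear = [(s / 255.0) ** gamma for s in samples]
    return 255.0 * (sum(linear) / len(linear)) ** (1.0 / gamma)

# An edge pixel where two of four samples hit a white polygon:
edge = [255, 255, 0, 0]
# naive:   127.5  (looks too dark on a gamma-2.2 display)
# correct: ~186   (perceptually halfway between black and white)
```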
Although ATI cannot be considered a choirboy in this regard either, its "A.I. Low" setting offers a convincing combination of high performance and good image quality. So while ATI's filtering optimizations, as a rule, have no really adverse impact on IQ and are therefore hardly noticed by gamers, on nVidia cards just a couple of "incorrect" clicks in the control panel are enough to end up with shabby, fps-optimized IQ, even though the best possible IQ has apparently been selected.
Let's take a closer look at the by now rather old control panel, which incidentally is already unofficially available in a new, refined version. Straight after installation, the panel looks as follows:
Screenshot
The system performance slider is apparently coupled to the various optimizations. If the slider is set to "High Quality", the individual optimizations (trilinear optimization, anisotropic mip filter optimization, anisotropic sample optimization) can no longer be adjusted; these settings are greyed out and set to "off".
If, however, the title is on nVidia's "optimization list", the maximum quality you can obtain is what nVidia allows, not what the hardware permits. UT2004, for example, only allows reduced trilinear (= "brilinear") filtering on non-primary texture stages. With Far Cry, too, a degradation of IQ could be observed starting with the 77-series drivers.
Our plea to nVidia: please give us a driver setting that enables the maximum possible image quality, in other words one that actually applies the "High Quality" mode in-game when it is selected and displayed in the control panel. After all, who is happy to see a "blister" suddenly appear in his favourite oldie title after spending several hundred euros on a new graphics card?
Such a blister appears when the transition between MIP maps is not interpolated as it should be. While ATI's cards tend to oversample with AF enabled, and consequently hide their imperfect implementation of the trilinear filter, nVidia's mathematically more correct implementation tends to produce more visible artefacts when reduced trilinear or bilinear filtering is applied.
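To make the difference concrete, here is a toy Python model of the blend weight between two adjacent MIP levels. Full trilinear filtering ramps smoothly across the whole LOD range; "brilinear" interpolates only inside a narrow band around the transition and samples a single level elsewhere. The band width of 0.4 is a made-up illustration value, not a known driver constant.

```python
def trilinear_weight(lod):
    """Full trilinear: the blend fraction towards the next smaller MIP
    level is simply the fractional part of the LOD (assumes lod >= 0)."""
    return lod - int(lod)

def brilinear_weight(lod, band=0.4):
    """Reduced ("brilinear") filtering: blend only inside a narrow band
    around the MIP transition and use a single level outside it. The
    kinks at the band edges are what shows up as visible seams."""
    f = lod - int(lod)
    lo, hi = 0.5 - band / 2.0, 0.5 + band / 2.0
    if f <= lo:
        return 0.0           # pure bilinear on the nearer level
    if f >= hi:
        return 1.0           # pure bilinear on the next level
    return (f - lo) / band   # compressed ramp inside the band
```

The smaller the band, the closer the result is to plain bilinear filtering, and the more clearly the MIP transitions stand out on the ground texture.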
Under certain circumstances such a bilinear filter may even show up in the driver's "HQ" setting. By way of explanation, here is a short summary of nVidia's sampling optimizations:
1. trilinear optimization
- enabled by default
- reduced interpolation between MIP-maps (may result in blisters)
2. Anisotropic Mipfilter Optimization
- disabled by default
- switches to a brilinear on TS>0 (D3D) or all texture stages (OGL) (tends to produce MIP banding)
3. Anisotropic Sampling Optimization
- enabled by default
- general undersampling (flickering output)
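The effect of the third "optimization" is easy to reproduce in a toy Python model: if a pixel whose footprint covers several stripes of a high-frequency texture is shaded from a single sample instead of enough AF taps, its colour flips between black and white as the camera pans by a fraction of a pixel, which is exactly the flickering described above. (The stripe texture and tap counts are illustrative, not taken from any driver.)

```python
def stripes(u):
    """A high-frequency test texture: black/white stripes of width 1/8."""
    return 255 if int(u * 8) % 2 == 0 else 0

def shade_pixel(u_center, footprint, taps):
    """Shade one pixel by averaging `taps` samples across its footprint.
    taps=1 models the undersampling "optimization"; more taps model
    proper anisotropic sampling."""
    if taps == 1:
        return stripes(u_center)
    step = footprint / taps
    start = u_center - footprint / 2 + step / 2
    return sum(stripes(start + i * step) for i in range(taps)) / taps

# The same pixel over 8 consecutive frames while the camera pans slowly:
undersampled = [shade_pixel(0.3 + t * 0.02, 0.5, taps=1) for t in range(8)]
filtered     = [shade_pixel(0.3 + t * 0.02, 0.5, taps=8) for t in range(8)]
# undersampled jumps between 0 and 255 (texture flicker);
# filtered stays at a steady grey from frame to frame.
```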
"Optimization" is used by nVidia as a euphemism for a speed gain obtained through reduced quality (to be fair, this is ATI's interpretation as well). This is why we put the word "optimization" in quotation marks whenever it does not denote a true optimization but merely a speed gain bought with reduced quality.
Selecting AA and AF right after installing the driver, and only then switching to the "High Quality" mode, will result, as already mentioned, in MIP banding in OpenGL titles such as Doom 3 or Quake 3/4. Two peculiarities of the Forceware control panel are responsible for this:
- If the "Quality" setting is selected and AF is activated in the control panel, the "Anisotropic MIP filter optimization" is enabled automatically. As far as we know, nVidia copied ATI's former driver behaviour here, with the difference that nVidia applies the "optimization" even if AF has been selected in-game (contrary to ATI).
- If the "optimizations" are not all deactivated individually before switching to HQ, they remain active in OpenGL. Anyone who does not know this, or has not discovered these peculiarities by chance, will never obtain true "High Quality" in OpenGL titles on nVidia hardware, and this has been the case for a considerable time now (tested with the major driver releases from 66.72 to 83.40).
Vertigo
2006-03-23, 12:41:38
Page 2
So, what does it mean for users forcing AF via nVidia's driver control panel?
How to get „High Quality“ AF:
1. Disable all filtering optimisations (trilinear, anisotropic and mip filtering).
2. Enable "High Quality".
3. Enable AA/AF.
4. Enable LOD-clamp.
How to get bilinear AF at "High Quality" settings:
1. Enable AA/AF.
2. Enable "High Quality".
3. Run a game using OpenGL.
low-grade "High Quality" | real "High Quality"
How to get "brilinear" filtering in Unreal Tournament 2004:
1. Disable all filtering optimisations (trilinear, anisotropic and mip filtering).
2. Enable „High Quality“.
3. Enable AA/AF.
4. Download this UT2004 testlevel [download].
5. Start the game, open the UT2004-console (tab key) and enter "firstcoloredmip 1".
Screenshot : testlevel | testlevel with "firstcoloredmip 1" enabled
How to get flickering textures in Riddick (this can be reduced to a minimum by enabling "High Quality" via nVidia's driver control panel):
1. Don't select a high resolution; 1024*768, for instance, will do.
2. Don't use 8xS or any supersampling-anti-aliasing mode.
3. Force 8x or 16x anisotropic filtering.
4. Run Riddick.
Detail taken from this scene. | flickering textures as animated GIF
Due to its crisp bump maps, Riddick is well suited to demonstrating nVidia's poor filtering quality at default driver settings. The effects are much the same in any other game.
How to get flickering textures in racing games:
1. Disable all filtering optimisations (trilinear, anisotropic and mip filtering).
2. Enable "High Quality".
3. Enable AA/AF .
4. Disable LOD-clamp (in fact it is disabled at default driver settings).
LOD-Clamp enabled | LOD-Clamp disabled
According to our tests, programmers of racing games often use a negative LOD bias for their textures. There are two possible reasons for this. First of all, the asphalt textures are so strongly distorted by the player's viewing angle that they absolutely require AF. As this filter was either unavailable or cost too much performance on older cards, the textures were instead sharpened with a negative LOD bias.
On nVidia graphics cards, however, this causes textures to flicker to such an extent that nVidia had to add a LOD clamp switch to their driver control panel, which fixes the global texture LOD bias at 0.0. nVidia recommends activating the clamp when using anisotropic filtering.
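What the clamp does can be modelled in a few lines of Python. The footprint-to-LOD mapping is the usual log2 rule; the function name and the numbers are illustrative only.

```python
import math

def mip_lod(texels_per_pixel, lod_bias=0.0, clamp=False):
    """Pick the mipmap LOD from a pixel's texture footprint.
    `texels_per_pixel` = texels covered along the most compressed axis;
    LOD 0 is the full-resolution texture. With `clamp`, a negative
    application LOD bias is ignored, like nVidia's panel switch."""
    if clamp and lod_bias < 0.0:
        lod_bias = 0.0
    return math.log2(max(texels_per_pixel, 1.0)) + lod_bias

# A strongly distorted asphalt texture, 8 texels under one pixel:
# mip_lod(8.0)              -> 3.0   the correct level
# mip_lod(8.0, -1.0)        -> 2.0   sharper, but undersampled: flicker
# mip_lod(8.0, -1.0, True)  -> 3.0   the clamp restores a bias of 0.0
```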
On the other hand, with a texture LOD bias of 0.0, these games obviously switch too early to the next smaller mipmap, causing blurry textures. So in certain cases even the best quality offered by nVidia graphics cards is not as good as ATI's standard settings with activated optimisations. This issue has been discussed in several threads on our forum.
___
Page 3
What is nVidia's intention?
This time, we don't believe in unintentional driver bugs. Apparently, these filtering tricks are meant to pull the wool over hardware reviewers' eyes. All these driver bugs have one thing in common: they always lead to higher framerates and lower quality. Curiously, some of our colleagues published review images showing obvious MIP banding. The supposedly better (sharper) image quality was bought dearly with flickering textures.
Most of our colleagues run their reviews at nVidia's default driver settings. This is all the more remarkable because nobody interested in the best image quality would use those settings. If you want to get rid of flickering textures, the first remedy is to enable "High Quality" in nVidia's driver panel. Yet most reviewers don't seem to notice the texture flickering caused by the default settings. We think that's a problem, because a benchmark must guarantee that every single graphics card is tested in a comparable manner; on Radeon cards at default driver settings, textures don't flicker and are still sharp.
The potential customer is likely to buy the graphics card that offers the best performance at "Quality" settings. Many reviewers then add that this card is powerful enough to handle "High Quality" as well. And if you wonder why performance drops by about 30% at "High Quality" in Quake 4 (the worst case), you are told that even reputable hardware magazines measured a drop of only 2-3%.
There is a simple solution: every reviewer plays the benchmark game for at least half an hour and keeps an eye on the image quality. He then chooses a scene with sharp textures and takes a screenshot of it with each graphics card. If he notices differences between the images, he checks the driver settings again and tries to enable the best quality. He then takes new screenshots, and once the image quality is comparable, the driver settings fit the bill and the benchmarks can be run with exactly these settings.
What's the cause for alarm? My card is doing well without flickering.
It is hard to judge something as subjective as "3D quality on a computer monitor" in an objective way, so we won't say that nVidia's image quality on NV40/G70-based graphics cards is inadequate or insufficient. Especially on CRTs of different manufacturers, the tube itself has a greater influence on the image than texture filtering optimisations. In fact, texture filtering optimisations are quite useful: in games like Serious Sam 2 or Quake 4, the combination of filter optimisations and the "Quality" mode runs up to 50% faster than "High Quality" without optimisations.
Nevertheless, filtering optimisations worsen image quality for the benefit of framerate and to the detriment of comparability. Furthermore, not everybody spends 500€ on a graphics card that runs considerably slower at higher quality settings. So it is up to the hardware reviewers to run their benchmarks at comparable quality settings.
What made us do that?
The current problem was not introduced with the G70, but nVidia pushed the optimisations so far that many users noticed them right away. Forceware 78.03 solved the problem at least at "High Quality"; the texture flickering at default driver settings ("Quality") remained, however. Yet it was ATI that initiated the filter optimisations several years ago with the R200 (Radeon 8500). The R200 delivered poor quality when filtering textures rotated by 45°; furthermore, it filtered these textures only bilinearly.
nVidia's NV20 (GeForce3), by contrast, filtered textures independently of their angle, and fully trilinearly at that. Nevertheless, both settings were compared, and of course the R200 won the benchmarks more often than not. With AA added, nVidia's 2x RGMSAA was faster than ATI's supersampling, which balanced the scores. Yet the image quality was not comparable.
ATI's R300 (Radeon 9700) astonished the experts, because with AA/AF enabled it was up to three times faster than the GeForce4 Ti 4600. Although the R300 also had better features than the NV25, such as Pixel Shader 2.0 and 6x sparse MSAA, part of the performance was once again gained through filtering optimisations. The AF quality was better than on the R200, and trilinear filtering could be combined with AF, but the LOD was still determined improperly. Catalyst 3.2 introduced a texture-stage optimisation that reduced filtering to bilinear on texture stages >0. This feature has since been removed.
For all that, the image quality was roughly equal to nVidia's, at least on screenshots. Once again, reviewers ran their benchmarks at differing image quality without reservation. nVidia's NV3x still filtered largely independently of texture angles, but nVidia felt forced to add more and more "optimisations" to its drivers, increasing performance by sacrificing quality bit by bit. As we all know, this still wasn't enough to catch up with the R300.
The NV40 introduced Shader Model 3.0 and, at last, rotated-grid MSAA up to 4x. Unfortunately, the NV40 also adopted angle-dependent anisotropic filtering. ATI's R420, however, was even more powerful (thanks to Catalyst A.I., which improved the performance even of the R300 by up to 40%), and so nVidia adopted all of ATI's filtering optimisations.
When ATI introduced the Catalyst Control Center, the texture-stage optimisation was removed, and the R520 (Radeon X1800) offers the option "High Quality AF", which is almost as good as the AF of the GeForce3.
Notwithstanding, this filtering is much faster than you would expect. We wonder whether this will prompt nVidia to remove their "optimisations".
Conclusion
Seemingly both ATI and nVidia are researching efficient AF intensively. In our opinion nVidia is slightly ahead, but this advantage is cancelled out by their GPU design, which lacks parallelisation of sampling and ALU clocks. On nVidia cards, an anisotropic filtering operation that takes several sample clocks may stall the other ALUs. ATI does not have these problems, because their ALUs are more independent: the additional sample clocks inevitably required by the AF logic are used for overfiltering without affecting the computing power.
Even though the High Quality filtering on nVidia graphics cards has improved since Forceware 78.03, huge framerate drops in certain situations seem inevitable, and the quality still isn't faultless. On the other hand, we don't like that ATI doesn't allow application-specific optimisations without filtering optimisations and vice versa: either all useful application-specific optimisations are disabled along with all filtering optimisations, or the filtering optimisations are enabled whenever Catalyst A.I. is in use.
nVidia's G70 is a very strong performer, but its texture quality is not entirely convincing. ATI's area AF on the R520 and R580 is outstanding, and choosing the graphics card with the best image quality seems simple. On the other hand, nVidia's SLI offers great performance with supersampling anti-aliasing or the hybrid modes, which may compensate for flickering textures. Supersampling does not remedy MIP banding, but it does reduce flickering in general. Running an SLI system just to fight texture flickering with supersampling doesn't seem very economical, but that's a common issue with high-end hardware.
So, what remains after all this? On the one hand, ATI and nVidia offer extremely powerful high-end hardware; despite different architectures, the performance of their graphics cards is quite similar. On the other hand, most hardware reviewers out there run benchmarks at default driver settings that don't provide comparable image quality. There are some exceptions: we want to compliment ComputerBase and PC Games Hardware on their current testing methods, and the users of our forum who provide the community with comparable benchmark results.
We want to appeal to our colleagues: image quality deserves closer attention, and the quality of their future reviews can be raised by exacting comparison, even if this takes more time.
Acknowledgement: Thanks a lot to aths for several corrections and suggestions, and thank you, xfire, for running the benchmarks. Finally, I want to thank several members of our forum; without their help, writing this article would not have been possible.
If you need the .RTF, just send me a PM with your e-mail address.
Mr. Lolman
2006-03-23, 12:51:45
A big thank-you to both of you :up:
I'll send Leo a PM right away.
Hakkerstiwwel
2006-03-23, 14:17:50
Proposed changes in bold
Notes:
- please don't start sentences with "but" or the like
thomas62
2006-03-23, 14:19:19
Proposed changes in bold
Notes:
- please don't start sentences with "but" or the like
hello
and for those whose English isn't so great, could this perhaps also be done in GERMAN?
regards tomtom
Vertigo
2006-03-23, 14:34:39
Hakkerstiwwel, thanks for the corrections!
Hakkerstiwwel
2006-03-23, 14:35:21
hello
and for those whose English isn't so great, could this perhaps also be done in GERMAN?
regards tomtom
Sorry, I'm a bit lost here. What are you trying to tell me?
thomas62
2006-03-23, 19:56:27
Sorry, I'm a bit lost here. What are you trying to tell me?
hello
well, this mile-long text is in English!
and there are still people out there whose English isn't all that good
regards tomtom
For the title: "stunts" (instead of "tricks") might be more fitting.
Hakkerstiwwel
2006-03-23, 21:23:37
hello
well, this mile-long text is in English!
and there are still people out there whose English isn't all that good
regards tomtom
Mr. Lolman was kind enough to translate the whole thing into German :redface:
http://www.3dcenter.de/artikel/neue_filtertricks/
Hakkerstiwwel
2006-03-23, 21:24:54
For the title: "stunts" (instead of "tricks") might be more fitting.
ack, maybe a mod can adjust the thread title.
Filtering Stunts or Filter Stunts?
Hakkerstiwwel
2006-03-24, 08:54:32
Filtering Stunts or Filter Stunts?
"filtering" is perfect
I'm still working on a few minor corrections to page 3...
thomas62
2006-03-24, 09:38:12
Mr. Lolman was kind enough to translate the whole thing into German :redface:
http://www.3dcenter.de/artikel/neue_filtertricks/
T H A N K S ;D ;D ;D :cool: :cool: :cool: :cool:
Leonidas
2006-03-24, 09:52:49
"filtering" is perfect
I'm still working on a few minor corrections to page 3...
Awesome! Please contact me via PM when it's ready to go.
Vertigo
2006-03-24, 10:05:43
So, I've inserted the corrections into page 2 and cleaned up a few things on page 3.
Hakkerstiwwel
2006-03-24, 10:20:27
I'm still working on a few minor corrections to page 3...
Page 3
What does nVidia want to achieve?
This time, we can't believe in unintentional driver bugs. The intention behind these filtering stunts seems to be to pull the wool over hardware reviewers' eyes (great expression). All these panel bugs have one thing in common: while they degrade image quality, they lead to higher framerates. Curiously, some of our colleagues have in the past even published review images showing obvious MIP banding. Paradoxically, the reviewers then used these sharper textures, achieved by undersampling and accompanied by shimmering, as a criterion to demonstrate nVidia's superior image quality.
Even worse, most of our colleagues use nVidia's default driver settings for their reviews. This is all the more remarkable because no user interested in a clean image will ever be satisfied with these settings. If you want to get rid of flickering textures, the first remedy is to enable "High Quality" in nVidia's driver panel. Sadly, most reviewers seem not to notice the texture flickering caused by the default settings. In our opinion this is a problem, because a benchmark must guarantee that every single graphics card is tested in a comparable manner; on Radeon cards, textures do not flicker at default settings and are nevertheless still sharp.
The ingenuous reader, who thinks he has considered all aspects, is most likely to buy* the graphics card that offers the best performance at "Quality". If he is then not satisfied with the shimmering textures, he is told that the card is powerful enough to handle even the "High Quality" setting. Inquiries about the astonishing frame rate drops, which may reach up to 30%, are brushed aside with the argument that even reputable print magazines have only measured drops of 2-3%.
chart
There would be such a simple solution: every reviewer plays the benchmark game for at least half an hour and keeps an eye on the image quality. He then chooses a scene with sharp textures and takes a screenshot of it with each graphics card. If he notices differences between the images, he checks the driver settings again and tries to enable the best quality. He then takes new screenshots, and once the image quality is comparable, the driver settings fit the bill and the benchmarks can be run with exactly these settings.
What's the cause for alarm? My card is doing well without flickering.
It is hard to judge something as subjective as "3D quality on a computer monitor" in an objective way, so we won't say that nVidia's image quality on NV40/G70-based graphics cards is inadequate or insufficient. Especially on CRTs of different manufacturers, the tube itself has a greater influence on the image than texture filtering optimisations. In fact, texture filtering optimisations may be quite useful: in games like Serious Sam 2 or Quake 4, the combination of filter optimisations and the "Quality" mode runs up to 50% faster than "High Quality" without optimisations.
Nevertheless, filtering optimisations worsen image quality for the benefit of framerate and to the detriment of comparability. Furthermore, not everybody spends 500€ on a graphics card that runs considerably slower at higher quality settings. So it is up to the hardware reviewers to run their benchmarks at comparable quality settings.
Where is the root of the problem?
The current problem was not introduced with the G70, but nVidia pushed the optimisations so far that they became apparent at first glance. Forceware 78.03 solved the problem at least at "High Quality"; nevertheless, the texture flickering at default driver settings ("Quality") remained unchanged. The kick-off for the optimisations came several years ago with ATI's R200 (Radeon 8500). The R200's AF quality deteriorated massively when textures were rotated by 45°; furthermore, these textures were only filtered bilinearly.
nVidia's NV20 (GeForce3), by contrast, filtered all textures equally regardless of their angle, and even fully trilinearly. Despite these massive differences, both settings were pitted against each other, and of course the R200 won the benchmarks more often than not. Had the R200 possessed logic for multisampling AA, it would have won all "quality" benchmarks. As this was not the case and ATI had to rely on supersampling, nVidia's 2x RGMSAA was faster once AA was added, which balanced the scores. Yet the image quality was not comparable.
ATI's R300 (Radeon 9700) astounded the experts, because with AA/AF enabled it was up to three times faster than the GeForce4 Ti 4600. Although the R300 had better features than the NV25, such as Pixel Shader 2.0 and 6x sparse MSAA, part of the performance was once again achieved with filtering optimisations. The AF quality was better than on the R200, and trilinear filtering could be combined with AF, but the LOD was still determined improperly. Catalyst 3.2 introduced a texture-stage optimisation that reduced filtering to bilinear on texture stages >0. This feature has since been removed.
For all that, the image quality was roughly equal to nVidia's, at least on screenshots. Once again, reviewers ran their benchmarks at differing image quality without reservation. nVidia's NV3x filtering algorithm was still largely independent of texture angles, but nVidia felt forced to add more and more "optimisations" to its drivers, increasing performance by sacrificing quality bit by bit. As we all know, this still wasn't enough to catch up with the R300.
The NV40 introduced Shader Model 3.0 and, at last, rotated-grid MSAA up to 4x. Unfortunately, the NV40 also adopted angle-dependent anisotropic filtering. ATI's R420, unfortunately for nVidia, was even more powerful (helped by Catalyst A.I., which improved the performance even of the R300 by up to 40%), and so nVidia adopted all of ATI's filtering optimisations.
When ATI introduced the Catalyst Control Center, the texture-stage optimisation was removed, and the R520 (Radeon X1800) offers the option "High Quality AF", which is almost as good as the AF of the GeForce3. Astoundingly, this AF proves to be much faster than expected, without giving us cause to point a finger at one or several "optimisations" responsible for the speed. We are eager to see whether nVidia will take this as an opportunity to concentrate its future efforts on true quality rather than on "optimisations".
Conclusion
Seemingly both, ATI and nVida, do intensive research into efficient AF. In our opinion, nVidia slightly has the edge, but this advantage is annihilated by their GPU-design, that lacks in parallelisation of sample- and ALU-clocks. On nVidia cards, an anisotropic filtering operation, that takes several sampleclocks, may jam the other ALUs. ATI doesn't have these problems, because their ALUs are more independent. Additionally available sampleclocks are used for oversampling,which is badly needed due to their AF-logic, without affecting the computing power.
Even though the high quality filter on nVida graphics cards became better since Forceware 78.03, huge framerate drops in certain situations seem to be inevitable and quality still isn't faultless. On the other hand we don't like that ATI doesn't allow application specific optimisations with disabled filtering optimisations and vice versa. Either all useful optimisations for the application are disabled as well as all filtering optimisations, or the filtering optimisations are enabled when Catalyst A.I. is in use.
nVidia's G70/71 is a very strong performer, but its texture quality is not totally convincing. ATI's area AF on the R520 and R580 is so outstanding that the choice of the graphics card with the best image quality seems obvious to the quality-loving gamer. On the other hand, nVidia's SLI offers great performance with supersampling anti-aliasing or the hybrid modes, which may compensate for flickering textures. Supersampling anti-aliasing does not remedy MIP banding, but flickering textures are reduced in general. Running an SLI system to fight texture flickering with supersampling anti-aliasing doesn't seem very economical, but that's a common issue with high end hardware.
So, what remains after all? On the one hand, ATI and nVidia offer extremely powerful high end hardware; despite different architectures, the performance of their graphics cards is pretty similar. On the other hand, almost every hardware reviewer runs benchmarks using default driver settings that don't provide comparable image quality. There are some exceptions: we want to compliment ComputerBase and PC Games Hardware on their current testing methods, as well as all users of our forum who provide our community with comparable benchmark results.
We hereby wish to appeal to our colleagues that image quality deserves closer attention and that the quality of their future reviews can be increased by exacting comparisons, even if this takes more time.
Acknowledgements: Thanks a lot to aths for several corrections and suggestions, and thank you, xfire, for running the benchmarks. Finally, I want to thank several members of our forum without whose help writing this article would not have been possible.
* "purchase" would also be an option, but I prefer "buy" here because of its closeness to "to buy an argument"
I would be grateful for further corrections and suggestions
11:25: "The R200’s AF had poor quality when filtering textures that were rotated by 45°." replaced by: "The R200’s AF caused a massive deterioration of its filtering quality if textures were rotated by 45°", and I'm still not happy with it
11:40: nVidia's NV20 (GeForce3) by contrast filtered all textures equally regardless of their angle, and even fully trilinear.
11:50: Still nVidia's NV3x filtering algorithm
Vertigo
2006-03-24, 10:25:46
I would be grateful for further corrections and suggestions
first line: ... want to achieve
Hakkerstiwwel
2006-03-24, 10:27:35
first line: ... want to achieve
fixed
tx
Vertigo
2006-03-24, 10:33:31
So, here is the text again as a whole with all corrections and changes so far - could someone from the "translation team" check it over once more?!
Page 1
While the use of anti-aliasing (AA) and the anisotropic filter (AF) only emerged among PC users a few years ago, these features have been available for more than half a decade now. In those early days, however, they were hardly used, either because of a misjudgement or because the video cards simply did not have enough power to allow playable frame rates with these features enabled. As an illustration, 3dfx’s supersampling method, which until recently was unrivalled, used to be considered equivalent to a trivial blur filter such as nVidia’s “Quincunx” modes. Even specialised hardware sites, which were well aware of the IQ features’ advantages, were unable to properly evaluate the AA quality.
While on gaming consoles the use of both AA and AF is only now emerging, nowadays every available PC graphics card features these options. Back in Y2K you needed a card such as the legendary Voodoo5 to obtain a flicker-free image with 4x SGSSAA. Today, however, even a sub-100 Euro card can deliver smooth edges and hence relatively flicker-free images.
How is it then possible, many people certainly wonder, that nVidia’s flagship card (G70/71), as common knowledge has it, tends to produce shimmering textures? What do you need to do to obtain the best possible image quality, who was at the origin of all these “optimisations”, and how can you escape this dilemma?
The optimal Image Quality
Without IQ-enhancing measures, an image in motion on screen tends to suffer from edge aliasing. This undesirable effect is by now more or less under control, thanks among other things to the new transparency AA modes. For an appealing graphical impression, however, you need more than jag-free edges: good textures are at least as important. This is where the next innovation comes into play: a sharper presentation of distorted textures through the use of the anisotropic filter. And this filter has been the subject of discussion for a long time.
It would be much easier if high-end graphics cards weren’t benched against each other. Merely because at first glance the options shown in the control panels look similar, all possible settings have been pitted against one another in the past: nVidia’s 16 bit vs. 3dfx’s 24 bit rendering, nVidia’s trilinear filter against 3dfx’s mipmap dithering, the NV20’s trilinear AF against the R200’s bilinear AF, as well as the R200’s supersampling AA against the NV20’s multisampling AA. (the following sentence has been ignored)
From time to time, screenshots were presented for image quality comparisons. The ingenuous reader ignored the fact that these screenshots were not very meaningful. As a consequence, it was no problem to adapt to the readers and present badly compressed screenshots. (please, someone improve this part, it sounds cr..) Spotting differences between the test scenes on screenshots is not always an easy task. This is why the relevant scenes are very often magnified using a bilinear or bicubic filter. As a consequence, the differences, which very often are just 1 or 2 pixels in size, are distorted as they are mixed with the adjacent pixels. (the correct filter to use would be nearest neighbour)
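The point about magnification filters can be shown with a small sketch (our own illustration, not part of the article): a bilinear upscale blends a 1-2 pixel difference into its neighbours, while a nearest-neighbour upscale preserves it unchanged.

```python
def upscale_nearest(img, factor):
    # Each source pixel becomes a factor x factor block; no new pixel
    # values are invented, so single-pixel differences survive intact.
    return [[img[y // factor][x // factor]
             for x in range(len(img[0]) * factor)]
            for y in range(len(img) * factor)]

def upscale_bilinear(img, factor):
    # Linear interpolation between neighbouring pixels; a single
    # deviating pixel gets mixed into its surroundings.
    h, w = len(img), len(img[0])
    out = []
    for y in range(h * factor):
        sy = y / factor
        y0 = min(int(sy), h - 1)
        y1 = min(y0 + 1, h - 1)
        fy = sy - int(sy)
        row = []
        for x in range(w * factor):
            sx = x / factor
            x0 = min(int(sx), w - 1)
            x1 = min(x0 + 1, w - 1)
            fx = sx - int(sx)
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

# A 3x3 screenshot crop with a one-pixel difference in the centre.
crop = [[0, 0, 0],
        [0, 255, 0],
        [0, 0, 0]]
nn = upscale_nearest(crop, 4)
bl = upscale_bilinear(crop, 4)
```

The nearest-neighbour result contains only the original values 0 and 255, while the bilinear result contains intermediate grey values: the single-pixel difference has been smeared over its neighbours.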
The actual benchmarks, by contrast, are presented in the most sophisticated ways. Bars fade into the scene, multicoloured amplitudes distract from the fps, but all these figures and charts are meaningless as there is no common denominator: the image quality. In the meantime, a consensus has been reached for AA: gamma-2.2-adjusted 4x MSAA, with the option to activate transparency AA in order to smooth texture aliasing caused by alpha testing, is comparable between ATI and nVidia. Concerning AF, however, only the number in front of the setting counts, regardless of the actually applied degree of filtering. Of course this fact is exploited by the GPU vendors to obtain higher fps.
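As an aside, the "gamma 2.2 adjusted" resolve mentioned above can be sketched as follows (our simplified model, not vendor code): the AA samples are averaged in linear light instead of directly on the gamma-encoded values, which keeps half-covered edge pixels from coming out too dark on a gamma-2.2 display.

```python
GAMMA = 2.2

def resolve_plain(samples):
    # Naive resolve: average the gamma-encoded 0..255 values directly.
    return sum(samples) / len(samples)

def resolve_gamma_adjusted(samples, gamma=GAMMA):
    # Gamma-adjusted resolve: decode to linear light, average, re-encode.
    linear = [(s / 255.0) ** gamma for s in samples]
    return 255.0 * (sum(linear) / len(linear)) ** (1.0 / gamma)

# An edge pixel with 4 subpixel samples, half covered by white geometry:
samples = [255, 255, 0, 0]
plain = resolve_plain(samples)              # 127.5
adjusted = resolve_gamma_adjusted(samples)  # ~186, a visibly lighter edge
```

The gamma-adjusted value is noticeably brighter, which is why edge gradients on cards using this resolve look smoother on a typical monitor.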
Although ATI cannot be considered a choirboy in this regard either, the “A.I. Low” setting can be attested a convincing combination of high performance and good image quality. So while ATI’s filtering optimisations, as a general rule, have no really adverse impact on IQ and are therefore hardly noticed by gamers, on nVidia cards just a couple of “incorrect” clicks in the control panel can leave you with shabby, fps-optimised IQ, despite the fact that apparently the best possible IQ has been selected.
Let’s have a closer look at the by now rather old control panel, which, by the way, is meanwhile unofficially available in a new, refined version. Straight after the installation, this panel looks as follows:
Screenshot
The “Systemleistungsslider” (system performance slider) is apparently coupled to the various optimisations. So if the slider is set to “High Quality”, the various optimisations (trilinear, anisotropic mip filter, anisotropic sample) can no longer be adjusted. These settings are then greyed out and set to “off”.
If, however, the title is part of nVidia’s “optimisation list”, the maximum quality you can obtain is what nVidia allows, not what the hardware permits. UT2004, as an example, allows only reduced trilinear (= “brilinear”) filtering on non-primary texture stages. With Far Cry, too, a degradation of IQ could be noted with the appearance of the 77-series drivers.
Our plea to nVidia is: please give us a driver setting which enables the maximum possible image quality; in other words, actually deliver the “high quality” mode in-game if it is selected and displayed in the control panel. After all, who is happy to notice the sudden appearance of a “blister” in his favourite oldie title after having spent several hundred Euro on a new graphics card?
Such a blister appears if the transition between MIP maps is not interpolated as it should be. While ATI’s cards tend to oversample with activated AF, and consequently hide their imperfect implementation of the trilinear filter, nVidia’s mathematically more correct implementation tends to produce more visible artefacts if reduced trilinear or bilinear filtering is applied.
Under certain circumstances such a bilinear filter may even be noticed in the driver’s “HQ” settings. For explanation purposes, here is a small summary of nVidia’s sampling optimizations:
1. trilinear optimization
- enabled by default
- reduced interpolation between MIP-maps (may result in blisters)
2. Anisotropic Mipfilter Optimization
- disabled by default
- switches to brilinear filtering on TS>0 (D3D) or on all texture stages (OGL) (tends to produce MIP banding)
3. Anisotropic Sampling Optimization
- enabled by default
- general undersampling (flickering output)
“Optimisation” is an expression used euphemistically by nVidia to describe a speed gain resulting from reduced quality (to be fair: this is ATI’s interpretation as well). This is why we put the word “optimisation” in quotation marks whenever it does not mean a true optimisation, but just a speed gain obtained through reduced quality.
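The first optimisation in the list, reduced trilinear ("brilinear") filtering, can be sketched as follows. This is our illustration only: the width of the blend window is an assumption for demonstration purposes, as the real value is internal to the driver.

```python
def trilinear_weight(lod):
    # Full trilinear: the blend weight between the two nearest MIP
    # levels is simply the fractional part of the LOD.
    return lod - int(lod)

def brilinear_weight(lod, band=0.25):
    # "Brilinear" sketch (band width is our assumption): blend only
    # inside a narrow window around the MIP transition; elsewhere fall
    # back to pure bilinear filtering of a single MIP level.
    f = lod - int(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if f <= lo:
        return 0.0          # finer MIP level only
    if f >= hi:
        return 1.0          # coarser MIP level only
    return (f - lo) / band  # steep ramp inside the window
```

Since most of the LOD range then needs samples from only one MIP level, texel bandwidth is saved; the price is the abrupt change in sharpness around the MIP transition that shows up as the banding and "blisters" described above.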
Selecting AA and AF right after installing the driver, and only afterwards switching to the “High Quality” mode, will result, as already mentioned, in MIP banding in OpenGL titles such as Doom 3 or Quake 3/4. Two peculiarities of the Forceware control panel are responsible for these phenomena:
- if the “Quality” setting is selected and AF is activated within the control panel, the “Anisotropic MIP Filter Optimisation” is enabled automatically. As far as we know, nVidia has copied ATI’s former driver behaviour here, with the difference that nVidia applies the “optimisation” even if AF has been selected in-game (contrary to ATI).
- if you do not deactivate all “optimisations” separately before switching to HQ, they remain active in OpenGL. Everyone who does not know about this, or has not discovered these peculiarities by chance, will never obtain true “High Quality” in OpenGL titles with nVidia - and this has been the case for a considerable time now (tested with the major driver releases from 66.72 to 83.40).
___
Page 2
So, what does this mean for users forcing AF via nVidia's driver control panel?
How to get "High Quality" AF:
1. Disable all filtering optimisations (trilinear, anisotropic and mip filtering).
2. Enable "High Quality".
3. Enable AA/AF.
4. Enable LOD-clamp.
How to get bilinear AF at "High Quality" settings:
1. Enable AA/AF.
2. Enable "High Quality".
3. Run a game using OpenGL.
low-grade "High Quality" | real "High Quality"
How to get "brilinear" filtering in Unreal Tournament 2004:
1. Disable all filtering optimisations (trilinear, anisotropic and mip filtering).
2. Enable "High Quality".
3. Enable AA/AF.
4. Download this UT2004 test level [download].
5. Start the game, open the UT2004-console (tab key) and enter "firstcoloredmip 1".
Screenshot : testlevel | testlevel with "firstcoloredmip 1" enabled
How to get flickering textures in Riddick (this can be reduced to a minimum by enabling "High Quality" via nVidia's driver control panel):
1. Don't select a high resolution; 1024*768, for instance, will do.
2. Don't use 8xS or any supersampling anti-aliasing mode.
3. Force 8x or 16x anisotropic filtering.
4. Run Riddick.
Detail taken from this scene. | flickering textures as animated GIF
Due to its clear-cut bump maps, Riddick is well suited to demonstrating nVidia's poor filtering quality at default driver settings. The effects are much the same in any other game.
How to get flickering textures in racing games:
1. Disable all filtering optimisations (trilinear, anisotropic and mip filtering).
2. Enable "High Quality".
3. Enable AA/AF.
4. Disable LOD-clamp (in fact it is disabled at default driver settings).
LOD-Clamp enabled | LOD-Clamp disabled
According to our tests, programmers of racing games often use a negative LOD bias for textures. There are two possible reasons for this. First of all, the asphalt textures are so heavily distorted by the player's angle of view that they absolutely require AF. As this filter was either not available or cost too much performance on older cards, the textures are sharpened by using a negative LOD bias instead.
This, however, causes textures to flicker on nVidia graphics cards to such an extent that nVidia needed to add an LOD clamp switch to their driver control panel, which fixes the general texture LOD bias to 0.0. nVidia recommends activating the clamp switch when using anisotropic filtering.
On the other hand, a texture LOD of 0.0 obviously switches too early to the next lower mipmap, causing fuzzy textures. Finally, in certain cases even the best possible quality offered by nVidia graphics cards is not as good as ATI's standard settings with activated optimisations. This issue has been discussed in several threads on our forum.
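A simplified model of the interplay between a game's LOD bias and nVidia's LOD clamp described above (our sketch, not driver code; real hardware computes the LOD per pixel quad):

```python
import math

def effective_lod(texels_per_pixel, app_bias=0.0, lod_clamp=False):
    # Base MIP level from the texture footprint: log2 of how many
    # texels fall under one screen pixel.
    base = math.log2(texels_per_pixel)
    if lod_clamp:
        # The clamp fixes a negative bias to 0.0, so the game can no
        # longer force undersampling (the cause of the shimmering).
        app_bias = max(app_bias, 0.0)
    return base + app_bias

# A racing game sharpening its asphalt with a -1.0 bias:
sharpened = effective_lod(4.0, app_bias=-1.0)                # 1.0: one MIP level too fine
clamped = effective_lod(4.0, app_bias=-1.0, lod_clamp=True)  # 2.0: negative bias ignored
```

With the clamp enabled, the sharpening trick is neutralised: the texture is sampled at the computed LOD, trading the artificial crispness for a flicker-free image.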
___
Page 3
What does nVidia want to achieve?
This time, we can’t believe in unintentional driver bugs. These filtering stunts seem intended to pull the wool over the hardware reviewers' eyes. All panel bugs have one thing in common: while they degrade image quality, they lead to higher framerates. It is a curious thing that some of our colleagues have in the past even published images in graphics card reviews showing obvious MIP banding. Paradoxically, the reviewers then used these sharper textures, achieved by undersampling and accompanied by shimmering, as a criterion to demonstrate nVidia’s superior image quality.
Even worse, most of our colleagues use nVidia's default driver settings for their reviews. This is all the more remarkable because no user interested in a clean image will ever be satisfied with these settings. If you want to get rid of flickering textures, the first remedy is enabling the "High Quality" setting in nVidia's driver panel. Most reviewers, sadly, seem not to notice the texture flickering caused by the default driver settings. In our opinion this is a problem, because a benchmark test must guarantee that every single graphics card is tested in a similar manner; on Radeon cards, after all, textures do not flicker and are nevertheless still sharp.
The ingenuous reader, who thinks he has considered all aspects, is most likely to buy* the graphics card that offers the best performance at "Quality". If he is then not satisfied with the shimmering textures, he is told that the card is powerful enough to handle even the "High Quality" setting. Inquiries about the astonishing framerate drops, which may reach up to 30%, are brushed aside with the argument that even reputable print magazines have measured drops of only 2-3%.
chart
There would be such a simple solution: every reviewer plays the benchmarked game for at least half an hour and keeps an eye on the image quality. Then he chooses a scene with sharp textures and takes a screenshot of this scene (one per graphics card). If the reviewer notices differences between the images, he checks the driver settings again and tries to enable the best quality. Afterwards he takes new screenshots, and once the image quality is comparable, the driver settings fit the bill and the benchmarks can be run with exactly these settings.
What's the cause for alarm? My card is doing fine without flickering!
It is hard to judge something as subjective as "3D quality on a computer monitor" in an objective way, so we won't say that nVidia's image quality on NV40/G70-based graphics cards is inadequate or insufficient. Especially on CRTs of different manufacturers, the tube itself has a greater influence on the image than texture filtering optimisations. Actually, texture filtering optimisations may be quite useful: in games like Serious Sam 2 or Quake 4, the combination of filter optimisations and quality mode runs up to 50% faster than the high quality mode without optimisations.
Nevertheless, filtering optimisations worsen image quality for the benefit of framerate and to the disadvantage of comparability. Furthermore, not everybody spends 500€ on a graphics card that runs considerably slower at higher quality settings. So it's up to the hardware reviewers to run their benchmarks at comparable quality settings.
Where is the root of the problem?
The current problem was not introduced by the G70, but nVidia pushed the optimisations so far that they have become apparent at first glance. Forceware 78.03 solved the problem at least at "High Quality"; nevertheless, the texture flickering at default driver settings ("Quality") remained unchanged. The kick-off for the optimisations was given several years ago by ATI’s R200 (Radeon 8500). The R200’s AF caused a massive deterioration of its filtering quality if textures were rotated by 45°. Furthermore, these textures were only filtered bilinearly.
nVidia's NV20 (GeForce3), by contrast, filtered all textures equally regardless of their angle, and even fully trilinearly. Despite these massive differences, both settings were pitted against each other, and of course the R200 won the benchmarks more often than not. Had the R200 had logic for multisampling AA, it would have won all “quality” benchmarks. As this was not the case and ATI had to rely on supersampling, with additional AA nVidia's 2x RGMSAA was faster than ATI's supersampling, which balanced the scores. Yet the image quality was not comparable.
ATI's R300 (Radeon 9700) astounded the experts, because with AA/AF enabled it was up to 3 times faster than the GeForce4 Ti 4600. Though the R300 had better features than the NV25, like Pixel Shader 2.0 or 6x sparse MSAA, part of the performance was, again, achieved with filtering optimisations. The quality of the AF was better than on the R200, and trilinear filtering could be used in combination with AF. Still, the determination of the LOD was improper. Catalyst 3.2 introduced an optimisation for texture stages that set filtering to bilinear on texture stages >0. This feature has since been removed.
For all that, image quality was quite equal to nVidia's - at least in screenshots. Once again, reviewers ran their benchmarks at differing image quality without reservation. nVidia's NV3x filtering algorithm was still largely independent of texture angles, but nVidia felt forced to add more and more "optimisations" to its drivers, increasing performance by sacrificing quality bit by bit. As we all know, this still wasn't enough to catch up with the R300.
The NV40 established Shader Model 3.0 and, at last, rotated-grid MSAA up to 4x. Unfortunately, the NV40 adopted angle-dependent anisotropic filtering. ATI's R420, unfortunately for nVidia, was even more powerful (helped by Catalyst A.I., which improved the performance of even the R300 by up to 40%), and so nVidia added all of ATI's filtering optimisations.
When ATI introduced the Catalyst Control Center, the texture stage optimisation was removed, and the R520 (Radeon X1800) offered a "high quality AF" option which is almost as good as the AF of the GeForce3. Astonishingly, this AF proves to be much faster than expected, without giving us cause to point the finger at one or several optimisations responsible for this speed. We are eager to see whether nVidia will take this as an opportunity to concentrate its future efforts more on true quality than on optimisations.
Conclusion
Seemingly, both ATI and nVidia do intensive research into efficient AF. In our opinion nVidia has a slight edge, but this advantage is cancelled out by their GPU design, which lacks parallelisation between sampling and ALU clocks. On nVidia cards, an anisotropic filtering operation that takes several sample clocks may stall the ALUs. ATI doesn't have this problem because their ALUs are more independent; additionally, spare sample clocks are used for the oversampling that their AF logic badly needs, without affecting computing power.
Even though the high quality filter on nVidia graphics cards has improved since Forceware 78.03, huge framerate drops in certain situations seem to be inevitable, and the quality still isn't faultless. On the other hand, we don't like that ATI doesn't allow application-specific optimisations with filtering optimisations disabled, and vice versa: either all useful application-specific optimisations are disabled along with all filtering optimisations, or the filtering optimisations are enabled whenever Catalyst A.I. is in use.
nVidia's G70/71 is a very strong performer, but its texture quality is not totally convincing. ATI's area AF on the R520 and R580 is so outstanding that the choice of the graphics card with the best image quality seems obvious to the quality-loving gamer. On the other hand, nVidia's SLI offers great performance with supersampling anti-aliasing or the hybrid modes, which may compensate for flickering textures. Supersampling anti-aliasing does not remedy MIP banding, but flickering textures are reduced in general. Running an SLI system to fight texture flickering with supersampling anti-aliasing doesn't seem very economical, but that's a common issue with high end hardware.
So, what remains after all? On the one hand, ATI and nVidia offer extremely powerful high end hardware; despite different architectures, the performance of their graphics cards is pretty similar. On the other hand, almost every hardware reviewer runs benchmarks using default driver settings that don't provide comparable image quality. There are some exceptions: we want to compliment ComputerBase and PC Games Hardware on their current testing methods, as well as all users of our forum who provide our community with comparable benchmark results.
We hereby wish to appeal to our colleagues that image quality deserves closer attention and that the quality of their future reviews can be increased by exacting comparisons, even if this takes more time.
Acknowledgements: Thanks a lot to aths for several corrections and suggestions, and thank you, xfire, for running the benchmarks. Finally, I want to thank several members of our forum without whose help writing this article would not have been possible.
Mr. Lolman
2006-05-16, 22:08:12
So, here is the checked version:
While the use of anti-aliasing (AA) and the anisotropic filter (AF) only emerged among PC users a few years ago, these features have been available for more than half a decade. In those days, however, they were hardly ever used, either because their benefit was underestimated or because the video cards simply did not have enough power to allow playable frame rates while using them. As an illustration, 3dfx’s supersampling method, which until recently was unrivalled, used to be considered equivalent to a trivial blur filter such as nVidia’s “Quincunx” modes. Even specialised hardware sites, which were well aware of the IQ features’ advantages, were unable to properly evaluate the AA quality.
While on gaming consoles the use of both AA and AF is only now emerging, every available PC graphics card nowadays already features these options. Back in Y2K you needed a card such as the legendary Voodoo5 to obtain a flicker-free image with 4x SGSSAA. Today, however, even a sub-100 Euro card can deliver smooth edges and relatively flicker-free images.
Many people must wonder how it is even possible that nVidia’s flagship card (G70/71) has a tendency to produce shimmering textures. What do you need to do to obtain the best possible image quality? Who was at the origin of all these “optimisations” and how can you escape this dilemma?
The optimal Image Quality
Without IQ-enhancing measures, an image in motion on screen tends to generate edge aliasing. By now, this undesirable effect is more or less under control, thanks among other things to the new transparency AA modes. For an appealing graphical impression, however, you need more than jag-free edges; good textures are just as important. This is where the newest innovation joins the game: a sharper presentation of distorted textures through the use of the anisotropic filter. This filter has been the subject of discussion for quite a while.
It would be much easier if high-end graphics cards weren’t benched against each other. Merely because at first glance the options shown in the control panels look similar, all possible settings have been pitted against one another in the past: nVidia’s 16 bit vs. 3dfx’s 24 bit rendering, nVidia’s trilinear filter against 3dfx’s mipmap dithering, the NV20’s trilinear AF against the R200’s bilinear AF, as well as the R200’s supersampling AA against the NV20’s multisampling AA.
From time to time, screenshots were presented for image quality comparisons. An inexperienced reader would not realise that these screenshots did not represent the true quality of the image. Consequently, the companies adapted to the readers' lack of demands, easily impressing them with low quality screenshot comparisons. Spotting differences between the test scenes on screenshots is not always an easy task. This is why the relevant scenes are very often magnified using a bilinear or bicubic filter. As a consequence, the differences, which very often are just 1 or 2 pixels in size, are distorted as they are mixed with the adjacent pixels. (the correct filter to use would be “nearest neighbour”)
In contrast, the actual benchmarks are presented in the most sophisticated ways. Bars fade into the scene, multicoloured amplitudes distract from the fps, but all these figures and charts are meaningless as there is no common denominator: the image quality. In the meantime, a consensus has been reached for AA: gamma-2.2-adjusted 4x MSAA, with the option to activate transparency AA in order to smooth texture aliasing caused by alpha testing, is comparable between ATI and nVidia. Concerning AF, only the number in front of the setting counts, regardless of the actually applied degree of filtering. Of course this fact is exploited by the GPU vendors to obtain higher fps.
Although ATI cannot exactly be considered innocent in this regard either, the “A.I. Low” setting can be seen as a convincing combination of high performance and good image quality. As a general rule, ATI’s filtering optimisations do not really have an adverse impact on IQ and are therefore hardly noticed by gamers. With nVidia cards, however, it is possible to end up with shabby, fps-optimised IQ by just a couple of “incorrect” clicks in the control panel, despite the fact that the “best possible IQ” has apparently been selected.
Let’s have a closer look at the rather old Control Panel, which by the way, is unofficially available now in a new, more refined version. Straight after the installation, this Panel looks as follows:
Screenshot
The “Systemleistungsslider” (system performance slider) is apparently coupled to the various optimisations. So if the slider is set to “High Quality”, the various optimisations (trilinear, anisotropic mip filter, anisotropic sample) can no longer be adjusted. These settings are then greyed out and set to “off”.
However, if the title is part of nVidia’s “optimisation list”, the maximum quality you can obtain is what nVidia allows, not what the hardware permits. As an example, UT2004 allows only reduced trilinear (= “brilinear”) filtering on non-primary texture stages. With Far Cry, too, a degradation of IQ could be noted with the appearance of the 77-series drivers.
Our plea to nVidia is: please give us a driver setting which allows us to enable the maximum possible image quality; in other words, actually deliver the “high quality” mode in-game if it is selected and displayed in the control panel. After all, who is happy to notice the sudden appearance of a “blister” in his favourite oldie title after having spent several hundred Euro on a new graphics card?
Such a blister appears if the transition between MIP maps is not interpolated as it should be. While ATI’s cards tend to oversample with activated AF and consequently hide their imperfect implementation of the trilinear filter, nVidia’s mathematically more correct implementation tends to produce more visible artefacts if reduced trilinear or bilinear filtering is applied.
Under certain circumstances such a bilinear filter may even be noticed in the driver’s “HQ” settings. For explanation purposes, here is a small summary of nVidia’s sampling optimizations:
1. trilinear optimization
- enabled by default
- reduced interpolation between MIP-maps (may result in blisters)
2. Anisotropic Mipfilter Optimization
- disabled by default
- switches to brilinear filtering on TS>0 (D3D) or on all texture stages (OGL) (tends to produce MIP banding)
3. Anisotropic Sampling Optimization
- enabled by default
- general undersampling (flickering output)
“Optimisation” is an expression used euphemistically by nVidia to describe a speed gain resulting from reduced quality (to be fair: this is ATI’s interpretation as well). This is why we put the word “optimisation” in quotation marks whenever it does not mean a true optimisation, but just a speed gain obtained through reduced quality.
After installing the driver, selecting AA and AF and only afterwards switching to the “High Quality” mode will result in MIP banding in OpenGL titles such as Doom 3 or Quake 3/4. Two peculiarities of the Forceware control panel are responsible for these phenomena:
- if the “Quality” setting is selected and AF is activated within the control panel, the “Anisotropic MIP Filter Optimisation” is enabled automatically. As far as we know, nVidia has copied ATI’s former driver behaviour here, differing, however, in the fact that nVidia applies the “optimisation” even if AF has been selected in-game (contrary to ATI).
- not deactivating all “optimisations” separately before switching to HQ causes them to remain active in OpenGL. Everyone who does not know about this or has not discovered these peculiarities by chance will never obtain true “High Quality” in OpenGL titles with nVidia - and this has been the case for a considerable time now (tested with the major driver releases from 66.72 to 83.40).
Vertigo
2006-05-17, 09:59:09
:eek: and here I thought the English version was six feet under :wink: - is publication in sight now?
Mr. Lolman
2006-05-17, 11:04:43
Heinrich[/POST]']:eek: and here I thought the English version was six feet under :wink: - is publication in sight now?
Sry. But my English-speaking connection hardly had any time.
Nothing stands in the way of publication any more. Imo.
/edit: I just noticed that the "Systemleistungsslider" is still in German...
Vertigo
2006-05-17, 11:26:56
Mr. Lolman[/POST]']Sry. But my English-speaking connection hardly had any time.
Nothing stands in the way of publication any more. Imo.
/edit: I just noticed that the "Systemleistungsslider" is still in German...
I recently read "quality slider bar" in this context, which captures it very well
EDIT: somehow I remembered the text being longer ...
Mr. Lolman
2006-05-17, 13:08:53
Heinrich[/POST]']I recently read "quality slider bar" in this context, which captures it very well
EDIT: somehow I remembered the text being longer ...
Shit. True. :redface:
That rascal actually withheld pages 2 and 3 from me...
Hakkerstiwwel
2006-05-23, 09:26:57
I had already thought nothing would ever happen here again. My original seems to have been quite passable after all. Please add a comma in the third line between "Features" and "or", since the "or" is inclusive.
Mr. Lolman
2006-05-23, 11:28:28
Ok. The problem is that the guy who proofreads it with his (by now) ex-girlfriend won't have time until next week... ;(
Maybe you know someone else. My other English-speaking contacts either don't know the subject matter at all, or would rewrite it at their own discretion.
Sry for the delays...
Vertigo
2006-05-23, 14:57:17
Mr. Lolman[/POST]']Ok. The problem is that the guy who proofreads it with his (by now) ex-girlfriend won't have time until next week... ;(
Maybe you know someone else. My other English-speaking contacts either don't know the subject matter at all, or would rewrite it at their own discretion.
Sry for the delays...
So how about the original culprits going back to work on it themselves? You'd have the necessary distance by now, and the text isn't that bad after all.
Hakkerstiwwel
2006-05-23, 16:08:10
Touchdown[/POST]']So how about the original culprits going back to work on it themselves? You'd have the necessary distance by now, and the text isn't that bad after all.
I don't think that would be such a good idea (hello ex-Heinrich). Unfortunately, one always gets fixated on one's own translations/phrasings. It would have been ideal if, for example, Zeckensack had read the whole thing over once more.
Mr. Lolman
2006-05-23, 20:42:02
Yes, having Zecki read through it would be a fine idea.
It won't be long before the article is terribly outdated (new NV control panel) ;(
I haven't seen Zecki in a while, though. Does anyone have a good line to him?
Mr. Lolman
2006-05-25, 11:02:16
Since I don't know if and when Zecki will answer:
We are still urgently looking for proofreaders for the article!
/edit: Since nobody looks in here, I'll open a thread in Offtopic...
Vertigo
2006-05-26, 15:13:30
Mr. Lolman[/POST]'] Since nobody looks in here, I'll open a thread in Offtopic...
Oh yes, they'll be beating down your door. Has anyone e-mailed yet? Somehow nobody seems to be interested ... the "translation team" (http://www.forum-3dcenter.org/vbulletin/showthread.php?t=103273) - why can't one of them do it?! Surely the 20 minutes can be found ...
Melbourne, FL
2006-05-28, 17:21:51
Touchdown[/POST]']Surely the 20 minutes can be found ...
Sorry for putting it like this... but that text needs more than 20 minutes. I'll get to work on it. In the meantime, can someone tell me the tag for striking text through... i.e. like underlining, only so that the word is struck through.
Alexander
Melbourne, FL
2006-05-28, 21:09:38
Ok... I've reworked the first page. If the other pages are similar, a lot of work definitely still needs to be invested. Overall, many things are ambiguous or awkwardly phrased, so the original meaning is not always preserved (I also had to consult the original article to understand some things). Even the version I just finished would still urgently need some polishing:
While the use of Anti-Aliasing (AA) and Anisotropic Filtering (AF) has only emerged among PC users a few years ago, these features have been available for more than half a decade. In those days, however, they were hardly ever used, either because their ability to improve image quality was underestimated, or because the video cards simply did not have enough power to allow playable frame rates while using these features. As an illustration, 3dfx’s supersampling method, which until recently was unrivalled, used to be considered equivalent to a trivial blur filter such as nVidia’s “Quincunx” modes. Even specialized hardware sites, which were well aware of the IQ features’ advantages, were unable to properly evaluate the AA quality.
While on gaming consoles the use of both AA and AF is only now emerging, nowadays every available PC graphics card already features these options. Back in Y2K you were required to have a card such as the legendary Voodoo5 in order to obtain a flicker-free image using 4xSGSSAA. However, today even a sub-100-Euro card can deliver smooth edges and therefore relatively flicker-free images.
I find the transition a bit abrupt... but I can't think of a good phrasing right now.
Many people must wonder how it is even possible that nVidia’s flagship card (G70/71) has a tendency to produce shimmering textures. What needs to be done to obtain the best possible image quality? Who started all these “optimizations”, and how can you escape this dilemma?
The optimal Image Quality
A moving image on the screen, without the application of IQ-enhancing techniques, always has the tendency to generate edge aliasing. Nowadays, this undesirable effect is more or less under control thanks to AA, and has recently been reduced even more by the introduction of the new transparency AA modes. However, for an appealing graphical sensation you need more than jag-less edges; good textures are just as important. Anisotropic filtering was introduced long ago to obtain a sharper presentation of distorted textures, but this filter has been the subject of discussions for quite a while.
Everything would be much easier if high-end graphics cards weren’t benched against each other. For the only reason that at first glance the options shown in the control panels looked similar, all possible settings have been pitted against one another in the past: nVidia’s 16-bit vs 3dfx’s 22-bit rendering, nVidia’s trilinear filter against 3dfx’s mipmap dithering, the nV20’s trilinear AF against the R200’s bilinear AF, as well as the R200’s supersampling AA against the nV20’s multisampling AA. The reviewers probably thought that the obtained scores were close enough to the expected values ;)
From time to time screenshots were presented for image quality comparison purposes. These pictures often showed scenes that were not representative of the effect they were supposed to show. An unexperienced reader would not realize this and had to believe the words of the reviewers. Consequently, the reviewers adapted to the readers’ lack of experience, easily impressing them with low-quality screenshot comparisons. Spotting differences between the test scenes on screenshots is not always an easy task. Therefore, quite often the relevant scenes are magnified using a bilinear or bicubic filter. As a consequence, the differences, which very often have a size of just a few pixels, are distorted as they are mixed with the adjacent ones (the correct filter to use would be “nearest neighbour”).
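The point about magnification filters can be illustrated with a small, self-contained Python sketch (a toy model with assumed 8-bit values, not code from any reviewing tool): nearest-neighbour upscaling copies every source pixel verbatim, while a bilinear lookup mixes the four surrounding pixels and thereby smears exactly the one- or two-pixel differences a screenshot comparison is supposed to reveal.

```python
def nearest_neighbor_upscale(img, factor):
    """Upscale a 2D image (list of rows) by an integer factor.
    Every source pixel is copied verbatim -- no values are mixed,
    so pixel-sized differences survive the magnification."""
    out = []
    for y in range(len(img) * factor):
        src = img[y // factor]
        out.append([src[x // factor] for x in range(len(src) * factor)])
    return out

def bilinear_sample(img, u, v):
    """Bilinear lookup at fractional coordinates (u, v): blends the
    four nearest pixels, which blurs one-pixel differences into
    their neighbours."""
    h, w = len(img), len(img[0])
    x0, y0 = int(u), int(v)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = u - x0, v - y0
    top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
    bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
    return top * (1 - fy) + bot * fy
```

With a 2x2 checkerboard, nearest neighbour keeps only the original values 0 and 255, whereas a bilinear sample between the pixels produces an intermediate grey that exists nowhere in the source.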
In contrast, the actual benchmark scores are presented in the most sophisticated ways. Bars fade into the scene, multicoloured amplitudes distract from the fps values, but all these figures and charts are meaningless if there is no common denominator: the image quality. In the meantime, consensus has been reached for AA: gamma-2.2-adjusted 4xMSAA, with the option to activate transparency AA in order to smooth texture aliasing caused by alpha testing, is comparable between Ati and nVidia. However, concerning AF, still only the number in front of the setting counts, regardless of the actually applied degree of filtering. Of course this fact is exploited by the GPU vendors to obtain higher fps.
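Why the gamma adjustment matters for a comparable AA resolve can be shown with a hedged Python sketch (the plain 2.2 exponent and the 0..255 value range are simplifying assumptions for illustration; real hardware uses the piecewise sRGB transfer curve): averaging the AA samples in linear intensity gives a brighter, perceptually correct edge, while naively averaging the gamma-encoded values darkens it.

```python
def average_gamma_correct(samples, gamma=2.2):
    """Gamma-adjusted resolve: decode the coverage samples to linear
    intensity, average there, then re-encode. This is the idea behind
    a 'gamma 2.2 adjusted' MSAA downfilter."""
    linear = [(s / 255.0) ** gamma for s in samples]
    mean = sum(linear) / len(linear)
    return 255.0 * mean ** (1.0 / gamma)

def average_naive(samples):
    """Naive resolve: average the stored (gamma-encoded) values
    directly; edges between dark and bright come out too dark."""
    return sum(samples) / len(samples)
```

For a half-covered edge pixel between black (0) and white (255), the naive resolve yields 127.5 while the gamma-adjusted one yields roughly 186, which matches the perceived half-brightness much better.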
Although in this regard Ati cannot exactly be considered innocent either, the “A.I. Low” setting can be regarded as a mostly convincing combination of high performance and good image quality. As a general rule, Ati’s filtering optimizations have only little impact on IQ and are therefore hardly noticed by gamers. In contrast, with nVidia cards it is possible to obtain fps-optimized shabby IQ by just a couple of “incorrect” clicks in the control panel, despite the fact that the seemingly “best possible IQ” has been selected.
Let’s have a closer look at the rather old control panel, which, by the way, is now unofficially available in a new, more refined version. Directly following the installation, this panel looks as follows:
Screenshot
The system performance slider is apparently coupled to the various optimizations. So, if the slider is set to “High Quality”, the various optimizations (trilinear, anisotropic mipfilter, anisotropic sample) can no longer be adjusted. These settings are now greyed out and set to “off”.
However, if the title is part of nVidia’s “optimization list”, the maximum possible quality which can be obtained is the one which nVidia allows, and not what the hardware permits. As an example, UT2004 allows only reduced trilinear (= brilinear) filtering on non-primary texture stages. With Far Cry, a degradation of IQ could be noted with the appearance of the 77-series drivers, too.
Our plea to nVidia is: please provide a driver setting that allows the user to enable the maximum possible image quality without any restrictions. That means that the selected “High Quality” is actually translated into the high-quality mode in-game. After all, who is happy to notice the sudden appearance of a “blister” in his favourite oldie title after having spent several hundred Dollars on a new graphics card?
Such a blister appears if the transition between MIP maps is not interpolated as it should be. While Ati’s cards tend to oversample the textures with activated AF and consequently hide their imperfect implementation of the trilinear filter by applying a slightly negative LOD bias, nVidia’s mathematically more correct implementation is subject to producing more visible artefacts if reduced trilinear or bilinear filtering is applied.
Under certain circumstances such a bilinear filter may even be noticed in the driver’s “High Quality” setting. For explanation purposes, here is a small summary of nVidia’s filtering optimizations:
1. Trilinear optimization
- enabled by default
- reduces the interpolation between MIP maps (may result in blisters)
2. Anisotropic mipfilter optimization
- disabled by default
- forces brilinear filtering on texture stages > 0 (D3D) or on all texture stages (OpenGL) (tends to produce MIP banding)
3. Optimization of the anisotropic sampling pattern
- enabled by default
- general undersampling is applied (the image is more prone to flickering)
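How a reduced (“brilinear”) trilinear filter differs from the full one can be sketched in a few lines of Python (the 0.25 band width is an assumption for illustration; the actual reduction factor is driver-internal and not public): full trilinear blends the two adjacent mip levels across the entire fractional LOD range, while brilinear only blends inside a narrow band around the transition and otherwise snaps to plain bilinear, which is what produces the visible MIP banding.

```python
def trilinear_weight(lod_frac):
    """Full trilinear: the blend weight for the next-smaller mip
    level grows linearly across the whole transition (0.0 .. 1.0),
    so there is never a hard seam between mip levels."""
    return lod_frac

def brilinear_weight(lod_frac, band=0.25):
    """'Brilinear' (reduced trilinear): blend only inside a narrow
    band centred on the mip transition; outside it the filter falls
    back to plain bilinear on a single mip level. The band width is
    a made-up illustration value, not nVidia's real parameter."""
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if lod_frac <= lo:
        return 0.0              # pure bilinear on the larger mip
    if lod_frac >= hi:
        return 1.0              # pure bilinear on the smaller mip
    return (lod_frac - lo) / (hi - lo)  # short, steep linear ramp
```

The steep ramp saves texture samples over most of the LOD range, but the abrupt jump to single-mip filtering is exactly the discontinuity that shows up as a band (or “blister”) moving through the scene.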
The term “optimization” is an expression euphemistically used by nVidia to describe a speed gain resulting from reduced image quality (to be fair: this is Ati’s interpretation of the term as well). This is why we generally put the word “optimization” between quotation marks when it does not mean a true optimization, but just a speed gain obtained through reduced quality.
After installing the driver and selecting AA and AF, switching to the “High Quality” mode will result in MIP banding in OpenGL titles such as Doom3 or Quake3/4. Responsible for this phenomenon are two peculiarities of the Forceware control panel:
- If the “Quality” setting is selected and AF is activated within the control panel, the anisotropic mipfilter optimization is automatically enabled. As far as we can tell, nVidia tried to reproduce Ati’s former driver behaviour. However, there is the difference that nVidia applies the “optimization” even if AF has been selected in-game (contrary to Ati).
- Not deactivating all “optimizations” separately before switching to the “High Quality” mode causes them to remain active in OpenGL. Everyone who does not make the selections in the right sequence will never obtain true “High Quality” in OpenGL titles with nVidia – and this has been true for a considerable amount of time now (tested with most major driver releases from 66.72 to 83.40).
Mr. Lolman
2006-05-29, 10:33:38
Wow. Thx.
Mr. Lolman
2006-05-31, 10:20:44
Pages 2 & 3 are supposedly easier than page 1.... :naughty:
Melbourne, FL
2006-05-31, 13:30:53
Mr. Lolman[/POST]']Pages 2 & 3 are supposedly easier than page 1.... :naughty:
Um... er... *whistles*... *looks away*
I'm really sorry, but at the moment I simply don't have the time to do the other pages as well. Maybe someone else will step up...
Alexander
mofhou
2006-06-01, 22:09:25
So, here is page 2
Page 2:
"1. Disable all filtering optimisations (trilinear, anisotropic and mip filtering)."
Should be "optimizations" (occurs a few more times)
"4. Download this UT2004 testlevel [herunterladen]."
Should be "test level", shouldn't it? (also occurs a few more times)
" which fixes the general texture-LOD bias to 0.0. nVidia recommends to activate the clamp switch when using anisotropic filtering."
NVidia should be capitalized, imho.
All without guarantee...
Regards
mofhou
PS: :up: for the efforts of the translation team
Lord_ofThe_Shit
2006-06-01, 22:50:39
mofhou[/POST]']
"1. Disable all filtering optimisations (trilinear, anisotropic and mip filtering)."
Should be "optimizations" (occurs a few more times)
In this exceptional case, however, "optimisation" is the British spelling; "optimization" would be the American counterpart, so it simply depends on which one you want to follow.
edit: I just noticed that partly the American, partly the British spelling is used; that should of course be handled consistently ;)
Lord_ofThe_Shit
2006-06-01, 23:02:59
I don't know which version of page 2 is the current one, but in one version I noticed the following in the third-to-last sentence on page 2:
"On the other hand, a texture LOD of 0.0 obviously changes too early<--> to the next lower mipmap causing fuzzy textures."
That's how it should be, at least in my opinion; but it's been a while since I last studied English intensively, so no guarantee ;)
mofhou
2006-06-02, 01:02:15
Lord_ofThe_Shit[/POST]']In this exceptional case, however, "optimisation" is the British spelling; "optimization" would be the American counterpart, so it simply depends on which one you want to follow.
True. I'd be in favour of using the American spelling, since the graphics card manufacturers come from North America and the American variant also wins the Googlefight ( http://www.googlefight.com/index.php?lang=en_GB&word1=optimisations&word2=optimizations ).
Regards
mofhou
Melbourne, FL
2006-06-02, 09:45:19
mofhou[/POST]']True. I'd be in favour of using the American spelling, since the graphics card manufacturers come from North America and the American variant also wins the Googlefight ( http://www.googlefight.com/index.php?lang=en_GB&word1=optimisations&word2=optimizations ).
Regards
mofhou
Yup... I see it the same way. I also tried to make the first page "American English"... but I may have overlooked a few spots.
Alexander
Basti_1985
2006-10-11, 16:54:57
I don't know which version of page 2 is the current one, but in one version I noticed the following in the third-to-last sentence on page 2:
"On the other hand, a texture LOD of 0.0 obviously changes too early<--> to the next lower mipmap causing fuzzy textures."
That's how it should be, at least in my opinion; but it's been a while since I last studied English intensively, so no guarantee ;)
As far as I understood it, the "causing" is meant to introduce a reduced relative clause referring to "mipmap". "Mipmap" and "causing" form a unit.
The "causing" can also refer to "texture LOD of 0.0", because commas are not always mandatory for separation. The sentence would still be correct.
Lord_ofThe_Shit
2006-10-12, 21:00:02
As far as I understood it, the "causing" is meant to introduce a reduced relative clause referring to "mipmap". "Mipmap" and "causing" form a unit.
The "causing" can also refer to "texture LOD of 0.0", because commas are not always mandatory for separation. The sentence would still be correct.I don't think you can see it that way; at least not that "mipmap" and "causing" form a unit. It is without question a relative clause, but it refers to the complete main clause, namely to the change of the LOD. The mipmaps themselves don't do anything; only the (too early) change causes the quality loss described in the relative clause.
Best regards
edit: I don't want to insist that the described arrangement is wrong, though. I've been away from English too long for that. Besides, I understand exactly what you mean.
Mr. Lolman
2007-04-24, 11:34:24
*delete*
Mr. Lolman
2007-04-24, 11:34:54
*delete*