Leonidas
2001-09-27, 21:18:39
I'm reposting the statements the U2 programmers made in the Unreal 2 forum.
http://ina-community.com/forums/showthread.php?threadid=134263
Unfortunately it's somewhat scattered, so I've summarized it here:
Chris Hargrove
Look guys, here's the deal. We've already talked about this publicly in interviews and such, so this is not new information, but I figure it's worth reiterating.
First, I'll repost what Mike Verdu said last month:
quote:
--------------------------------------------------------------------------------
Q: What are the system requirements looking like at this point?
A: Right now we're developing a game that will be playable on a PIII-500 with any hardware accelerator card that supports T&L (e.g. GeForce I, 2, 3, ATI Radeon, etc.). The game will look fantastic on a high end system, but should be playable on a good mid-range machine. We are still evaluating how to support lower end systems with TNT2 and Voodoo3/4 class cards... Infogrames will make a final determination later this year.
--------------------------------------------------------------------------------
Translation: If you don't have a card that supports T&L, we will not make any promises at this point that the game will be playable. Everything we're doing tech-wise assumes that T&L is present. This isn't just an assumption of our own technology, it's also an assumption of the Unreal Warfare engine itself. If your card doesn't have T&L, you might be able to run the game, but probably not as well as you might like.
Seriously guys, T&L cards are the baseline these days. Even GeForce2 cards are less than a hundred bucks now. By the time this game ships next year, if you can't afford a T&L card you probably can't afford to buy the game... that's how little they cost.
quote:
--------------------------------------------------------------------------------
it supports Software T&L, which means the CPU is doing the calculations. Today's CPUs, 1 GHz and over, can do what hardware T&L can do, and in some cases even better
--------------------------------------------------------------------------------
In terms of the argument about how fast CPUs are these days, yes they're fast... but we expect T&L because the CPU is still going to be very busy doing *other things*. If the CPU gets bogged down doing transformation and lighting calculations that otherwise can be offloaded to the video card (and that's a lot of calculations, given our high poly count), it can be a *big* drain on performance.
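To make that CPU-cost argument concrete, here is a rough sketch of what software T&L boils down to per vertex: a matrix transform plus a per-light calculation, repeated for every vertex, every frame. This is purely illustrative pseudocode-in-Python, not Legend's actual engine code.

```python
# Rough sketch of the per-vertex work that software T&L imposes on the CPU.
# Illustrative only -- the real pipeline is far more involved.

def transform(m, v):
    """Apply a 4x4 row-major matrix to a 3D point (w assumed to be 1)."""
    x, y, z = v
    return tuple(m[r][0]*x + m[r][1]*y + m[r][2]*z + m[r][3] for r in range(3))

def light_diffuse(normal, light_dir, intensity=1.0):
    """Lambertian diffuse term: one dot product per light, per vertex."""
    d = sum(n*l for n, l in zip(normal, light_dir))
    return max(0.0, d) * intensity

def software_tnl(vertices, normals, world_matrix, lights):
    """Transform and light every vertex on the CPU, every frame."""
    out = []
    for v, n in zip(vertices, normals):
        pos = transform(world_matrix, v)          # the "T"
        shade = sum(light_diffuse(n, ld) for ld in lights)  # the "L"
        out.append((pos, min(shade, 1.0)))
    return out
```

Every iteration of that loop is CPU time stolen from AI, physics, and game logic; a T&L card absorbs all of it in dedicated hardware instead.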
The bottom line is, get a T&L card, really. It's not just for Unreal 2, you know... plenty of games out there already expect it, and that trend will continue.
Chris Hargrove
quote:
--------------------------------------------------------------------------------
Why not just say "Okay we'll do it but if they turn out to be too slow, don't say I didn't warn you" instead of "Bah I bet they'll be too slow so I won't even bother" ?
--------------------------------------------------------------------------------
There isn't any "we'll do it" involved here; we're going through Direct3D, and we're trying to keep the card-specific checks to a minimum. So how well the card performs depends on the hardware and the D3D drivers. The only thing that we control is which Direct3D features we take advantage of, and those (along with other related things like polygon counts) are chosen based on what we feel is now a "standard" baseline. T&L falls in this baseline.
I'm not saying that you shouldn't even try using a non-T&L card if you want... you can feel free to do that if it makes you happy. But I'm very familiar with the amount of data we're pushing through the pipeline, and I'm telling you that your expectations shouldn't be too high if you don't have T&L. In terms of the "I won't even bother" thing, the only way we could get around a lack of T&L is by scaling back our content, and I'm sorry but we are not about to do that.
Matthias Worch
Cyborg: Dude, don't you think you're embarrassing yourself by trying to tell us why something in our game works/doesn't work? Nobody would like to support the widest possible range of available cards more than us - that does include the Kyro2 and we'll definitely look into making it work. But right now it's not supported, and that's because the game fails to find a T&L unit on the 3D card, not because of some texture compression scheme or something else. End of story.
The decision to make the UW engine rely on hardware T&L was made over a year ago when it looked like ALL major card manufacturers were implementing T&L chips in their upcoming cards. 3DFX had already announced it for their post-Voodoo5 card (and the company was struggling anyway) and the two other major 3D card manufacturers (ATI and nVidia) already had it in their GeForce and Radeon lines. And then (just a few months ago) PowerVR released a new 3D card that defies that convention by not having hardware T&L on-board. And we somehow have to deal with it. That's regrettable and as I said we'll certainly look into adding support for Kyro2 (and possibly other low-end 3D cards), but this isn't a "just do it real quick" thing, we're talking about changing a major assumption that the engine is built on. An assumption that was made based on the fact that all major 3D card companies had T&L-only cards in their product lineup/pipeline. PowerVR breaking that rule isn't exactly our fault...
Chris Hargrove
quote:
--------------------------------------------------------------------------------
An assumption that was made based on the fact that all major 3D card companies had T&L only cards in their product lineup/pipeline. PowerVR breaking that rule isn't exactly our fault...
--------------------------------------------------------------------------------
Not only is this not our fault, but as far as I'm concerned it was a very bad idea on the part of PowerVR.
No offense to PowerVR or anything, but this isn't the first time they've made a somewhat flawed decision in terms of 3D hardware development. ZBuffer-less HSR, anyone?
Instead of whining at us for wanting T&L, you might be better off complaining to PowerVR for releasing a substandard product. I don't care how much else the card has or how it may seem to perform in some games... hardware T&L is a standard these days.
Having an old card which predates widespread T&L is one thing. But having an old card is analogous to having an old computer; sometimes you just need to upgrade to get good performance in the newest games (I've had to live with it just like everybody else).
Having a relatively new card which *still* doesn't have T&L is something else, and if that new card doesn't perform all that well because of the lack of a now-standard feature, then it's no better in my eyes than an old card. Return it or throw it out, and get something better.
P.S. Oh, and Cyborg... your repeated "it has T&L it's just doing it in software" statements are pretty ridiculous since I've already stated twice why that is not sufficient for us. Do I need to cut&paste that same text over and over before it sinks in?
Chris Hargrove
quote:
--------------------------------------------------------------------------------
People don't care whether their cards support T&L or not because most of them either don't know what T&L is or don't know if their cards support it. What's important in the end is how your whole system will perform. You cannot throw away a new card with no T&L support just because someone tells you so. You can, however, throw away a card because it's simply not performing well. But I've yet to see a game with bad performance on a Kyro I/II. Even a new game like Max Payne, as mentioned by someone else previously, runs surprisingly well. I will take your word if you're saying that it's not gonna run U2, but is there any way you can test it at the moment? Or would the code need to be modified before you can try? I think it would be interesting if you could perhaps benchmark the Kyro before the official game comes out so that we can figure out in advance whether or not it will be fast enough to be playable.
--------------------------------------------------------------------------------
I'll be honest in saying that the Kyro hasn't really been tested here, so it's quite possible that I could be completely wrong in my estimation of its capabilities. That said, the lack of T&L gives me a very strong feeling that it will not be up to the task of handling U2. There are several reasons for this:
One:
Unreal 2 (and in many cases the Unreal engine in general) is largely CPU-bound and bus-bound as far as performance goes, not hardware-bound. Many games are not CPU-bound, and the CPU has some room to spare to do software transform & lighting calculations. That's not the case for us.
Two:
While the transform side of T&L has been taken advantage of by plenty of games over the past couple of years (since even the software emulation of it can be performed fairly quickly), the use of the lighting side has been a more recent development. Prior to decent T&L cards, asking the hardware (via an API like Direct3D or OpenGL) to do lighting calculations of any kind was often a death sentence for performance. For example, Direct3D has a large number of lighting & material state settings, and without T&L the calculations using all these settings can be horribly slow. Hence many game engines have bypassed this form of lighting and instead performed lighting manually in software using simpler calculations. This style of lighting may not look as good as the (more accurate) lighting mentioned above, but it was necessary for a long time if your game was to perform acceptably. The resulting vertices would be passed to the card in "Untransformed and Lit" form, so only the "T" in T&L was used.
But times are changing, the hardware has gotten better, and it's time we start taking advantage of it. So unlike many previous games, we are using the "L" in T&L, and very extensively. When a card is missing T&L, it's not the lack of the "T" that concerns me most as far as performance goes, it's the lack of the "L".
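The distinction Chris draws here — most older engines using only the "T" — can be sketched as a difference in what data each vertex carries when it is handed to the card. The helper names below are hypothetical, not real Direct3D calls; only the data layout matters.

```python
# Two ways of feeding vertices to the card, sketched with hypothetical
# helpers (not the actual Direct3D API). The point is what each vertex
# carries, and therefore who does the lighting math.

def submit_untransformed_prelit(vertices, colors):
    """'T'-only path: the engine already computed (simple) lighting
    itself, so each vertex carries a final color. The card, or the
    driver's software fallback, only has to transform positions."""
    return [{"pos": v, "color": c} for v, c in zip(vertices, colors)]

def submit_untransformed_unlit(vertices, normals):
    """Full 'T&L' path: vertices carry normals instead of colors, and
    the card evaluates the full lighting equation per vertex using the
    API's light and material state. Without a hardware lighting unit,
    the driver must do all of that math on the CPU."""
    return [{"pos": v, "normal": n} for v, n in zip(vertices, normals)]
```

An engine like Unreal 2 that takes the second path is handing the card far more per-vertex work, which is exactly why a missing "L" unit hurts more than a missing "T".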
Three:
We are rendering a huge number of polygons. The FPSs of a couple of years ago hardly ever had models that exceeded 1000 triangles or so. In contrast, plenty of our models are over 3000 triangles, and a few go as high as 5000 or more. Many of our levels are packed full of complex geometry, and there are some scenes where we're pushing the 100,000-triangle barrier, with lighting, multitexture, and so on. And then there are particle effects...
When you combine all this with the fact that the CPU has to handle running a *game* in addition to the graphics, it's easy to see that we're pushing the envelope here.
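A back-of-the-envelope calculation puts those numbers in perspective. The figures below are illustrative assumptions (frame rate, light count, ignoring vertex sharing), not Legend's measurements:

```python
# Back-of-the-envelope T&L throughput estimate. All inputs here are
# illustrative assumptions, not measured engine figures.
triangles_per_scene = 100_000    # the "100,000-triangle barrier" above
vertices_per_triangle = 3        # upper bound; shared vertices reduce this
target_fps = 30                  # assumed playable frame rate
lights_per_vertex = 2            # assumed average for the "L" stage

vertex_visits = triangles_per_scene * vertices_per_triangle * target_fps
lighting_ops = vertex_visits * lights_per_vertex
print(vertex_visits)   # 9,000,000 vertex transforms per second
print(lighting_ops)    # plus 18,000,000 per-vertex light evaluations
```

Millions of transform and lighting evaluations per second is work a T&L card handles in dedicated silicon; in software it competes directly with the game simulation for CPU time.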
Once you combine these three things together, you can see why my confidence in any non-T&L card is not all that high.
Matthias Worch
quote:
--------------------------------------------------------------------------------
I understand your point of view completely. It would be much easier if we all used the same video card because then compatibility wouldn't be a problem anymore. But we all know the Kyro is different from traditional renderers, and that leads us to an interesting topic. What are your views regarding PowerVR technology? In a non-biased way if possible, I'd like to have an honest opinion.
--------------------------------------------------------------------------------
All I'd say would repeat what Tim Sweeney said in an Interview, so I'll just point you there: http://www.voodooextreme.com/3dpulp...eytbr_2001.html
Keep in mind that this was done in April before the Kyro2 was released and that they don't touch the issue of T&L support. Kyro2 using TBR isn't really the issue for us, that works fine. The missing T&L chip on the Kyro2 (in a time when all other 3D manufacturers equip their cards with T&L) is the issue.
Scott Dalton
I want to stress that we're not out to avoid supporting any cards. We want and will try to support the largest number possible. It would be ludicrous of us to do anything other than that as it hurts the number of potential people that can play this game.
What Chris and Matt are reacting to above are the assertions of Cyborg11 that T&L won't be an issue. In games that truly take advantage of T&L, a card lacking true hardware T&L is going to suffer performance-wise against an otherwise identically equipped system with a good hardware T&L GPU.
We don't want people to get the wrong idea and think that T&L won't be a large factor in determining the game's performance, because it will be. Any amount of time your CPU uses having to emulate T&L hardware will be draining performance away from all the other CPU intensive things we're doing, and you really don't want that.
The games mentioned throughout this thread as "making heavy use of T&L" push a fraction of the polygons we're looking at. Those games were designed to run without T&L acceleration, and as such don't give the GPU a lot of work to do. Our typical scenes have hundreds of times more polys than UT scenes. Sure, if you have to software emulate T&L for 500 level polys and 2500 character polys you're going to be alright. Up that to 80,000 level polys and 15,000 character polys and your CPU is going to have a lot harder time keeping up, given that we're counting on it to handle numerous other demanding tasks.
If you feel that the Kyro 2 is best for you, then go for it. However, keep in mind that it may not be the ideal card for Unreal 2, or any other game utilizing heavy hardware T&L. That's not to say that it won't run, but that it might not run as well as some other hardware out there.
I can't give you any absolute comparison specs at this time, but know that we take into account hardware T&L support as our baseline. If your card lacks it, you're falling below that baseline and may or may not be happy with your performance.
Chris Hargrove
quote:
--------------------------------------------------------------------------------
AFAIK Kyro III will have hardware T&L but don't forget those on a budget like me are those who buy cards like Kyro II so they can have some money left over to buy games too.
--------------------------------------------------------------------------------
This is not a valid argument, since a GeForce2 can be purchased for less than the Kyro II.
quote:
--------------------------------------------------------------------------------
It's not an impossible herculean task - it just requires optimisation in your engine and letting the KyroII drivers do the rest.
--------------------------------------------------------------------------------
"Just"? The emulation of non-existant hardware features is handled by the card driver, not by us, so code optimization on our end is not relevant in this situation. We are responsible for the code that works with the data before it goes through the rendering API, not after. Hence the only real optimization we could do in this case is reducing the amount of data we send down the pipe, and that would reduce the graphical quality. We're not about to do that.
quote:
--------------------------------------------------------------------------------
So it's not going to be lack of time, resources or anything else but just a complete lack of interest in supporting a Kyro II.
--------------------------------------------------------------------------------
There is no "support" that we can do for the Kyro II aside from reducing our requirements into the realm of non-T&L cards. That would mean redoing just about every asset that we've made for the game thus far. That will not happen.
This is not a lack of interest in supporting the Kyro II. It is a lack of interest in supporting any non-T&L card. We're supposed to be moving forward, not backward.
quote:
--------------------------------------------------------------------------------
Yes let's all look forward to a monopolistic future dominated by one company so we can all pay over the odds.
--------------------------------------------------------------------------------
More than one company is making T&L cards, you know. I don't care if your card is by nVidia, ATI, PowerVR, or Ye Olde Video Card Shoppe, it doesn't matter to me. What matters is that your card supports the features that will be "standard" by the time this game is released. T&L is one of those. If your card supports our baseline, it should do well. If it doesn't, then it probably won't. That's the bottom line. And it's not an unreasonable expectation, given that you can get T&L cards for only a little more than the price of a single game.
I'm tired of wasting any more time discussing this issue with the few people who don't seem to "get it". Every point that we (any of us at Legend) are trying to make has been made several times already, and that should be more than enough for it to sink in by now.