I will be excited to see actual benchmarks, or are there some already?
And very glad I took my gf5600 back
quote:
ACES! Another post by Warlord Darius:
So what's this mean for us computer illiterate people?
nvidia cards are going to get crappy performance in Half-Life 2 compared to similarly priced ATi cards, and Valve insists that they didn't do it on purpose.
quote:
From the book of Kegwen, chapter 3, verse 16:
and Valve insists that they didn't do it on purpose.
Yeah and how many actually believe that?
[ 09-10-2003: Message edited by: Kegwen ]
quote:
Kegwen had this to say about Tron:
nvidia cards are going to get crappy performance in Half-Life 2 compared to similarly priced ATi cards, and Valve insists that they didn't do it on purpose.
I have an NVidia GeForce 3 > : (
quote:
Azizza had this to say about Pirotess:
Yeah and how many actually believe that?
They must be teamed up with ATI!!!!!!!!
It's a conspiracy.
quote:
Warlord Darius's account was hax0red to write:
I have an NVidia GeForce 3 > : (
I have a GeForce4 Ti4200, which isn't enough either.
quote:
Kegwen had this to say about Knight Rider:
I have a GeForce4 Ti4200, which isn't enough either.
So... are we talking about having all the graphics and clip planes jacked to hell and running smoothly, or minimum requirements here?
quote:
This insanity brought to you by Iulius Czar:
Mmm, sweet vindication for sucking in EverQuest.
No, it's just unfair. You can't release a game that's obviously biased towards one of two major competitors when it's going to be this popular. That's not cool at all.
quote:
Kegwen had this to say about Robocop:
No, it's just unfair. You can't release a game that's obviously biased towards one of two major competitors when it's going to be this popular. That's not cool at all.
Yes you can.
It's called "making money". Most of the Nvidia users either won't know this or won't care -- it's Half-Life 2! The few that do care will switch vid card companies, meaning more money for Valve if there is a partnership.
If ATI did indeed partner with Valve, it's smart of them to do this (make the game best on Radeon cards). This game is so popular, people WILL get Radeons for it, at least a lot will.
A fun fact, too -- at E3, Valve was near the ATI booth, if I remember right, and HL2 was being shown on a Radeon.
FYI, I use a GeForce card, and likely always will. So I'm not saying this because I'm an ATI fanboy or anything. [ 09-10-2003: Message edited by: Falaanla Marr ]
quote:
Falaanla Marr enlisted the help of an infinite number of monkeys to write:
Yes you can. It's called "making money".
If ATI did indeed partner with Valve, it's smart of them to do this (make the game best on Radeon cards). This game is so popular, people WILL get Radeons for it, at least a lot will.
No, that's LOSING MONEY for Valve on loyal nVidia customers, and nVidia also loses money because everyone wants their uber system to run HL2 well, and as such they will go ATi.
ATi makes money here, not Valve. I don't see what they have to gain from it unless they are indeed partnered.
It may be a driver issue with nVidia though, and if they're actually willing to work with nVidia on the issue, they may be able to resolve it before release, and then all this mess can go away.
quote:
Kegwen had this to say about (_|_):
No, that's LOSING MONEY for Valve on loyal nVidia customers, and nVidia also loses money because everyone wants their uber system to run HL2 well, and as such they will go ATi. ATi makes money here, not Valve. I don't see what they have to gain from it unless they are indeed partnered.
It may be a driver issue with nVidia though, and if they're actually willing to work with nVidia on the issue, they may be able to resolve it before release, and then all this mess can go away.
Exactly, they likely are partnered if this is intentional. I mean, ATi would WANT to push NVidia out of business.
It's a win-win situation for Valve. Sure, they lose a few sales from people that actually check out benchmarks and won't sway from an Nvidia card, but they keep most of 'em, and ATi gets some more card sales.
Counterstrike is a cesspool. Valve seems to be the breeding ground for stupidity.
Face it, the "special code" they had to write for FX cards is because they fucked up the original game code in the first place and are covering their asses.
quote:
ACES! Another post by Tatsukaze:
I <3 my Radeon 9800 Pro.
I bet you'll <3 all that time you're gonna have to play HL2, too.
[ 09-10-2003: Message edited by: Kegwen ]
quote:
Warlord Darius probably says this to all the girls:
So... are we talking about having all the graphics and clip planes jacked to hell and running smoothly, or minimum requirements here?
You can't run best performance, that's about it. The new Nvidia card can run it, but because of the fuck-up with the coding it doesn't run it as well as the ATI. I think the only ones that are going to care are the hardcore gamers running Nvidia's card. It's due to the drivers. I'm pretty sure Gamespy had a large article about how horrible the drivers are for Nvidia's 5900. My uncle has a beta of Half-Life 2 and ran it on a GeForce FX and had little to no problems, unless he ran the game at best performance. [ 09-10-2003: Message edited by: ToastedFritters ]
Yes, you'll need to upgrade from your GeForce 3 no matter what.
Buy a 9800 pro, it's cheaper and better.
quote:
Mr. Duck wrote this then went back to looking for porn:
The Benchmarks
So, my TI4400 might actually run HL2 playably? How strange.
quote:
Hello Cuthy had this to say about (_|_):
Slightly old, but what do you think the chances are of my ATI Rage Pro working with the game, since it hates Nvidia so much?
0% chance
quote:
And I was all like 'Oh yeah?' and Peter was all like:
You know, even if HL2 is skewed towards ATi, isn't Doom 3 written in favor of nVidia? Seems it would be an even matchup then.
Nobody cares about Doom3
quote:
Warlord Darius had this to say about Cuba:
I do
ho noes
quote:
Mr. Duck had this to say about Cuba:
The Benchmarks
btw, in case any of you think these are applicable, they aren't. Reason?
quote:
DX9, PS 2.0, Full precision floating-point
2.8 GHz P-IV, 800 MHz FSB
Average of several test DEM files
NVIDIA Detonator 45.23
ATI Catalyst 3.7
Detonator 50s will show us how it's REALLY gonna be. This is just in case you didn't bother to read the other threads where this was already said. [ 09-13-2003: Message edited by: Kegwen ]
Consequently, the 24-bit precision that ATI has used for a while became the minimum precision for DX9, and standard DX9 shader code is more suited to a typical ATi architecture (the standard shader commands are more easily compiled for ATi hardware).
Nvidia got pissed. Their FX series of cards cannot do 24-bit precision. They do 16-bit faster than ATi, and 32-bit slower, so in order to maintain at least the spec's level of precision in new games, the FX is forced to run in its slower, higher-precision mode. Also, the FX engine has a LOT of extra shader commands that are Nvidia-specific; they can replace the functionality of the DX9 standard commands and run much more quickly on the FX engine. Another disadvantage is that DX9 calls for 8 pixel shader pipelines: ATi implemented 8 individual pipelines, each capable of one operation per clock, and Nvidia implemented 4, each capable of 2. The downside is that in the Nvidia hardware, both operations must be completed on the same pixel. If you have a pixel that needs three operations, then you waste a half clock on the Nvidia hardware (see the sketch below).
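To put rough numbers on that pairing penalty, here's a toy back-of-the-envelope model (my own Python sketch with made-up pixel counts, not anything from Valve or the benchmarks): both designs peak at 8 ops per clock, but the 4x2 arrangement falls behind whenever a shader needs an odd number of operations per pixel.
code:
import math

def clocks_ati(num_pixels, ops_per_pixel):
    # 8 independent pipelines, 1 op each per clock:
    # throughput is a flat 8 ops/clock however the ops group.
    return math.ceil(num_pixels * ops_per_pixel / 8)

def clocks_nvidia(num_pixels, ops_per_pixel):
    # 4 pipelines, 2 ops per clock, but both ops must hit the SAME pixel,
    # so each pixel costs ceil(ops/2) clocks and an odd op count wastes
    # the second slot on its final clock.
    return math.ceil(num_pixels / 4) * math.ceil(ops_per_pixel / 2)

for ops in (2, 3, 4):
    print(ops, clocks_ati(800, ops), clocks_nvidia(800, ops))
# ops=3: the 8x1 design finishes 800 pixels in 300 clocks, the 4x2 needs 400.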
Valve coded HL2 strictly according to the DirectX 9 specification. That means 24-bit precision, using standard DX9 shader operations (instead of ATi- or Nvidia-specific ones), and just generally following the spec to the letter. If they couldn't do something without using a hardware-specific method, they found another way.
This is very good for ATi, because when they were designing R3xx, they had DX9 in mind, and optimized the core to the spec. A core matched to spec and code matched to spec work well together.
This is very bad for Nvidia, because even though it can run DX9 code, it's not a true DirectX 9 part. It can't do 8 pixels per pass or 24-bit precision, and it's slow at standard operations. It's fast at 16-bit precision and whups on the ATi there, and if you use the Nvidia-specific code you get improved performance, but as it stands, it's not a DX9 part.
Valve ran the original codepath (pure DX9) on both cards, and were shocked all to hell that it ran like ass on the GFX (well, maybe not), but it prompted them to write all-new shaders exploiting the faster Nvidia-specific commands. This gave a considerable improvement, but still not up to the same performance as a comparably priced ATi part.
Should Valve have to write a different shader for each architecture on the market? I wouldn't think so. I'd think that it's the responsibility of the cardmaker to ensure that it performs well with industry standards.
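For the non-coders, the burden being argued about looks something like this (a hypothetical sketch of my own, not Valve's actual code; the names and paths are made up). Every vendor-specific path is one more shader set somebody has to write and keep in sync:
code:
# Hypothetical illustration: picking a shader path per piece of hardware.
SHADER_PATHS = {
    "dx9_generic": "shaders/dx9/",   # pure-spec path, runs anywhere
    "nv3x_mixed": "shaders/nv3x/",   # hand-tuned Nvidia-specific path
}

def pick_path(vendor, fast_at_full_precision):
    # Prefer the standard path; fall back to the hand-tuned one only
    # for hardware that is slow on pure-spec code.
    if vendor == "nvidia" and not fast_at_full_precision:
        return SHADER_PATHS["nv3x_mixed"]
    return SHADER_PATHS["dx9_generic"]

print(pick_path("ati", True))       # shaders/dx9/
print(pick_path("nvidia", False))   # shaders/nv3x/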
I'm pissed at ATi as well, because they have all but forsaken OpenGL performance on the new core, and get spanked to hell and back by the Nvidia card, which excels at OpenGL.
I'm expecting that the new Det 50s will either have even more optimized shaders to replace the Valve ones (increasing performance without touching visual quality), or they may decide to run the whole thing in 16-bit precision; if that's the case, expect a large increase in speed, but with a big hit to visual quality as well. Other than that, I don't know what to expect.
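If you want a feel for what that precision drop means, here's a minimal sketch using NumPy's float16/float32 types as stand-ins for shader precision (there's no 24-bit float type to test with, and the numbers are just illustrative):
code:
import numpy as np

x = np.linspace(0.0, 1.0, 11, dtype=np.float64)

# Round-trip through each storage precision and measure what's lost.
err16 = np.abs(x - x.astype(np.float16).astype(np.float64)).max()
err32 = np.abs(x - x.astype(np.float32).astype(np.float64)).max()

print(f"worst float16 round-off: {err16:.1e}")  # on the order of 1e-4
print(f"worst float32 round-off: {err32:.1e}")  # on the order of 1e-8

# One rounding is invisible on screen, but a long shader chain repeats it
# on every operation, and that accumulation is the visual-quality hit.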
(And a big part of why Nvidia's current part is getting beaten by ATi's is that it just doesn't have the same raw shader performance. In a DX8 environment it performs much better, but the shader engine on the 5x00 series of cards is a little subpar. Let's hope the next generation is better.)
No, Really. Bite me.
quote:
We were all impressed when Warlord Darius wrote:
Slightly off-topic question: let's say I were to buy a Radeon 9800, mostly for Half-Life 2. At some point down the road a game doesn't play so well with it; can I pop my Nvidia back in freely, or will my computer bitchslap me for changing video cards back and forth?
When you uninstall video drivers, there are normally remnants left over in the registry and such that will hamper performance. Ex: a 5900 performs better after a clean install than after being put into a machine that just had a clean install with an ATI card in there. (Normally it's best to format and reinstall when switching cards; otherwise, use a reg cleaner in between.)
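If you want to see what's left behind without a full format, here's a rough sketch of my own (assuming Python's winreg module on Windows; the GUID is the standard display-adapter device class) that lists the driver entries Windows keeps around, so you can spot leftovers from the old card:
code:
import winreg

# Standard device-class key for display adapters; subkeys (0000, 0001, ...)
# are driver instances, including leftovers from previously installed cards.
DISPLAY_CLASS = (r"SYSTEM\CurrentControlSet\Control\Class"
                 r"\{4D36E968-E325-11CE-BFC1-08002BE10318}")

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_CLASS) as cls:
    i = 0
    while True:
        try:
            sub = winreg.EnumKey(cls, i)
        except OSError:  # no more subkeys
            break
        try:
            with winreg.OpenKey(cls, sub) as k:
                desc, _ = winreg.QueryValueEx(k, "DriverDesc")
        except OSError:  # non-driver subkey or access denied
            desc = "(no description)"
        print(sub, desc)
        i += 1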
No, Really. Bite me.
quote:
Kegwen had this to say about (_|_):
I'll take Business Ethics for 500, Alex.
"The answer is: Something not found in business."
"...what are ethics?"
"CORRECT!"
/bonk
As far as shooters go, I'm looking forward to Battlefield Vietnam, Call of Duty, and DOOM 3.
-Tok [ 09-13-2003: Message edited by: Toktuk ]
quote:
Naimah had this to say about dark elf butts:
TF2 is based on the HL2 engine. They are going to release it shortly after HL2 from what I understand.
Have mad linkage from a reputable source as proof?
-Tok