
New Graphics Card update, quick help!

pandamoan

Banned
A GF4Ti will _not_ look as good in Half-Life 2 as the DX9-capable cards. HL2's visuals are designed around high dynamic range (HDR) rendering, which uses floating-point precision (PS2.0), and according to many sources are noticeably worse on fixed-point DX8 cards.
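Roughly speaking, here's the difference in miniature (a toy C sketch of the idea; the clamp-and-quantize step is a simplification of DX8 fixed-point behaviour, not HL2's actual shader math):

```c
#include <stdio.h>

/* Toy illustration: DX8-era fixed-point color behaves roughly like a value
 * clamped to [0,1] and quantized, while a PS2.0 floating-point pipeline
 * carries values above 1.0 all the way through, which is what HDR needs. */
float fixed_point(float c)
{
    if (c < 0.0f) c = 0.0f;                           /* clamp to [0,1] */
    if (c > 1.0f) c = 1.0f;
    return (float)(int)(c * 255.0f + 0.5f) / 255.0f;  /* quantize to 8 bits */
}

int main(void)
{
    float bright   = 3.75f;  /* an overbright HDR intensity, e.g. a bloom */
    float exposure = 0.5f;

    /* Fixed point clamps BEFORE the exposure scale, so the highlight is lost. */
    printf("fixed-point:    %.3f\n", fixed_point(bright) * exposure); /* 0.500 */

    /* Floating point keeps the overbright value through the whole pipeline. */
    printf("floating-point: %.3f\n", bright * exposure);             /* 1.875 */
    return 0;
}
```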

i was TRYING to say that! i was also TRYING to say that it didn't matter, since those games aren't OUT yet, and with the source code leak, it may be some time until it IS out.

and dude, quake RULED when it came out. even today it is fun to play, despite its relative antiquity. what are you, an RTS guy? :)

and just so you know, QUAKE RULES!

so have some respect for your game elders... :) (quake, not me of course)

oh and speed and quality BOTH matter to a gamer; in fact, finding the balance between them is key, and nvidia throwing one away (quality) will ENSURE they get none of my money (for their current crap FX cards).

jamie
 

Doomulation

?????????????????????????
pandamoan said:
oh and speed and quality BOTH matter to a gamer; in fact, finding the balance between them is key, and nvidia throwing one away (quality) will ENSURE they get none of my money (for their current crap FX cards).
Oh, don't give me that crap! What matters is gameplay and decent speed. Obviously it isn't much fun to play a game while it's lagging. As for gfx - I wouldn't care whether it's a 2d scroller or not.
And making games for the latest hardware is just plain stupid.
 

pandamoan

Banned
Doomulation said:
Oh, don't give me that crap! What matters is gameplay and decent speed. Obviously it isn't much fun to play a game while it's lagging. As for gfx - I wouldn't care whether it's a 2d scroller or not.
And making games for the latest hardware is just plain stupid.

none of that is valid when you are considering buying a new gfx card.

the ONLY reasons i buy gfx cards are price/eye candy/speed.

any built-in vga card can run nes emus, and there are some great nes games!

jamie
 

Hyper19s

Banned
what is this, a freakin' contest!!!

ATI versus NVIDIA - who will win the next fight?

nvidia with 10 points, ati with 9
 

scotty

The Great One
I think that ATI is getting a lot better; I'd say they both have great quality. For N64 emulation, NVidia is better. I know when I get my new computer, I'm probably going to get an FX5600
 

flow``

flow``
it's hard to bias a review ya know? it's either this number or that number when comparing. it's not a judgement call. run the benchmark, put down the scores. end of story?

i was a hardcore quaker about 5 years ago. but i couldn't care less about others' opinions on games, since it takes all types.

it's not worth upgrading; there are bigger and better things than wasting time sitting on your ass doing nothing productive.
 

pandamoan

Banned
lol, sadly i'd have to agree, flow. though i'd LOVE to get a new vid card, all this divx will ensure that my money is spent on new hard drives... LOL

or i could fix the oil leak on my car.... hmmmm....

jamie
 
OP
Dogman5

Dogman5

The card should be here by Monday or Tuesday. (9600 PRO)

My Dell Dimension 8100 PC has an AGP 4X slot. Even though it is only a 4X AGP slot, will it still yield me good results? Also - if not, is there any way to install an AGP 8X slot, if it would dramatically increase performance?

And this thing is a breeze to set up, I hear, correct?

Thanks!
 

Hexidecimal

Emutalk Bounty Hunter.
Well, good thing I was going to buy a new card next month anyways, because mine has finally decided to hate me. One of the VGA ports just went kaput - so much for never having a problem with the MX series....
 

Xade

Irrelevant Insight
*Sigh*

There are always going to be the ATI fanboys, and there are always going to be those loyal to the nVidia camp.

As you can see below, my allegiance is clear. Despite what just about everybody seems to think, the 9800 Pro and company simply aren't destroying the 5900 Ultra at all. Both cards have clear advantages and disadvantages.

It's also worth noting that recent benchmarking comparisons between the two with regard to HL2 were done using the 45 series of Detonator drivers, and NOT the 50 series. Thus, it would seem presumptuous to simply draw conclusions based on early demonstrations and tests. Bear in mind that as far as Doom 3 goes, the situation has been very much the reverse.

Only when HL2 (finally) arrives and nVidia (finally) get their drivers out will true comparisons be possible.

Incidentally, my GF5900 Ultra has been serving me just fine, thank you very much, and will do for the foreseeable future, the way I see it.
 

flow``

flow``
we won't see hl2 until december or much, much later (gg source leak). i'd expect nvidia to be much more prepared at that point in time, with drivers and with optimizing the nv3x line for dx9 games in general

should be interesting to see how far hl2 is delayed; we could have a nice head-to-head between id and valve :)
 

flow``

flow``
no, you can't "install" an 8x agp slot
yes, 4x agp should be fine even with an 8x-compatible card... i can't say there have been any 8x vs 4x benchmarks out recently, but the ones i do recall showed little difference
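fwiw the spec-sheet peak numbers are simple arithmetic (a rough sketch; sustained bandwidth in practice is far lower, which is part of why 4x vs 8x barely shows up in benchmarks):

```c
#include <stdio.h>

/* Peak AGP bandwidth from the spec: 66 MHz base clock, 32-bit (4-byte) bus,
 * times the 4x or 8x transfer multiplier. Real-world rates are much lower. */
int main(void)
{
    const double base_hz   = 66.0e6; /* AGP base clock */
    const int    bus_bytes = 4;      /* 32-bit bus = 4 bytes per transfer */

    printf("AGP 4x: ~%.2f GB/s\n", base_hz * bus_bytes * 4 / 1e9); /* ~1.06 */
    printf("AGP 8x: ~%.2f GB/s\n", base_hz * bus_bytes * 8 / 1e9); /* ~2.11 */
    return 0;
}
```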
 
OP
Dogman5

Dogman5

flow`` said:
no, you can't "install" an 8x agp slot
yes, 4x agp should be fine even with an 8x-compatible card... i can't say there have been any 8x vs 4x benchmarks out recently, but the ones i do recall showed little difference

ok thank you :paperbag: :icecream:
 

Tagrineth

Dragony thingy
pandamoan said:
and dude, quake RULED when it came out. even today it is fun to play, despite its relative antiquity. what are you, an RTS guy? :)

and just so you know, QUAKE RULES!

so have some respect for your game elders... :) (quake, not me of course)

oh and speed and quality BOTH matter to a gamer; in fact, finding the balance between them is key, and nvidia throwing one away (quality) will ENSURE they get none of my money (for their current crap FX cards).

jamie

'kay... a few things re: Quake 1.

It's crap. It's BORING.

If you think I should "Respect my game elders", I should tell you that DOOM is one of my absolute all-time favourite games, period. Hell, I even enjoy playing the _Super Nintendo_ port of DOOM. ^_^

Quake is just plain awful compared to DOOM. The only thing Quake has over DOOM is the real 3D graphics... and the only level in the entirety of Quake that takes advantage of them is Ziggurat Vertigo - which is frankly the only fun level in the whole damn game.

Oh, and I'm an every-game-genre-except-sim-and-sports-gal. =)

flow`` said:
it's hard to bias a review ya know? it's either this number or that number when comparing. it's not a judgement call. run the benchmark, put down the scores. end of story?

Sure it's easy to bias a review. Mess with the settings. "Forget" to restart between runs. Deliberately use a slightly questionable driver (though this has lessened in recent times). Deliberately leave out the results which show Vendor B clearly defeating Vendor A. Discuss all the bugs you encountered when you deliberately made a mistake installing the card, while the perfect installation of Vendor A's card should be hammered home incessantly.

In fact, I think somewhere online there are benchmarks that are deliberately skewed to show a Pentium Classic overpowering an Athlon XP. By a lot. It's all in how you do your testing.
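For illustration, here's the "forget some runs" trick from above as a toy C sketch (every fps number is invented; the point is only the selective reporting):

```c
#include <stdio.h>

/* Run the benchmark several times, then report only the best run for
 * Vendor A and only the worst run for Vendor B. Two roughly equal cards
 * suddenly look very different. */
static double best(const double *r, int n)
{
    double b = r[0];
    for (int i = 1; i < n; i++) if (r[i] > b) b = r[i];
    return b;
}

static double worst(const double *r, int n)
{
    double w = r[0];
    for (int i = 1; i < n; i++) if (r[i] < w) w = r[i];
    return w;
}

int main(void)
{
    /* Five timedemo runs each (fps); the two cards are about equal. */
    double vendor_a[] = { 58.1, 61.3, 59.7, 60.2, 62.0 };
    double vendor_b[] = { 59.0, 60.8, 61.5, 58.5, 60.1 };

    printf("Vendor A: %.1f fps\n", best(vendor_a, 5));   /* 62.0 */
    printf("Vendor B: %.1f fps\n", worst(vendor_b, 5));  /* 58.5 */
    return 0;
}
```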

Xade said:
It's also worth noting that recent benchmarking comparisons between the two with regard to HL2 were done using the 45 series of Detonator drivers, and NOT the 50 series. Thus, it would seem presumptuous to simply draw conclusions based on early demonstrations and tests. Bear in mind that as far as Doom 3 goes, the situation has been very much the reverse.

wrt the 50 series, did you see Valve's slide on what was wrong with them? They had about 10 separate bullet points on the cheats it introduced to boost performance, including adding clipping planes to timedemos (which doesn't benefit in-game performance AT ALL and never can without potentially destroying the visual quality).

And according to Driverheaven, they STILL HAVE THAT STUPID UT2003 AF cheat... and in fact, it now applies to all Direct3D games, not just UT2003. Wow, the det50's are so amazing...
 

Xade

Irrelevant Insight
And your explanation for the clear reversal of benchmark results in Doom 3?
 

flow``

flow``
i love your close-mindedness when it comes to games, tag. quake>doom, doom>quake, who cares. people play what people play. you can play the sims and i wouldn't really care.

oh, intel/amd.. well yeah, intel is doing a pretty good job at beating amd in most benchmarks (and usually with x86 chips). and that's been found on a lot of respected sites

your ati fanboy.. er.. girl attitude comes through nicely. oh, fuck hl2 :) gg april '04.

i couldn't care less about that extremely hyped-up-the-ass game.

they used the old 45.23 drivers in those comparisons. why not read yourself a review of the newer 51.xx's and 52.xx's on _released_ games they've benchmarked? the 5900 is right up there in almost all benchmarks and only slightly behind ati's 9800xt. nvidia is right up there now with their drivers, and they've only been getting better the past few revisions.

also, for the record: ati paid 8 million to bundle hl2 with the 9800xt cards. maybe that's why nvidia got blasted the way they did? if nvidia had put out the money, i don't think you would've seen those slides released, but instead some nice nvidia-favored benchmarks and anti-ati slides.

doom 3 is opengl with a little dx7(?) in the mix, i believe. nvidia, having superior ogl drivers, should come out ahead.

oh tag, what about those new ati 3.8's? gotta love the new redundant gui they added :)
 

Tagrineth

Dragony thingy
Xade said:
And your explanation for the clear reversal of benchmark results in Doom 3?

DOOM3 is fixed-point.

me said:
nVidia's FP16 isn't very much faster than FP32 (the only thing faster is register use!), while fixed-point (DX8 and lower) math is VERY fast. FP24 has nothing to do with it... NV3x has a broken floating-point shader, plain and simple.

It's literally the beefiest DX7-class game ever.

flow`` said:
i love your close-mindedness when it comes to games, tag. quake>doom, doom>quake, who cares. people play what people play. you can play the sims and i wouldn't really care.

I was responding to

this is a valid point because ONLY ONE of those 4 titles is out right now, and quake is just fine in its original incarnation, and really great in FUHQUAKE, thank you very much.

because he mentioned Tenebrae.

Then again, I use Tomb Raider AOD as an example of GFFX's horrid floating-point shader performance. ^^;

oh, intel/amd.. well yeah, intel is doing a pretty good job at beating amd in most benchmarks (and usually with x86 chips). and that's been found on a lot of respected sites

Usually with x86 chips? Yay, so occasionally they aren't even x86...

your ati fanboy.. er.. girl attitude comes through nicely. oh, fuck hl2 :) gg april '04.

i couldn't care less about that extremely hyped-up-the-ass game.

So? It shows how awful GFFX floating-point shader performance is.

I don't like HL myself, nor do I like Tomb Raider, but both of the new games clearly show the FX line sucking horribly with real DX9-class feature usage.

they used the old 45.23 drivers in those comparisons. why not read yourself a review of the newer 51.xx's and 52.xx's on _released_ games they've benchmarked? the 5900 is right up there in almost all benchmarks and only slightly behind ati's 9800xt. nvidia is right up there now with their drivers, and they've only been getting better the past few revisions.

Nice to know people don't read my posts anymore. From my PREVIOUS POST IN THIS THREAD:

me said:
wrt the 50 series, did you see Valve's slide on what was wrong with them? They had about 10 separate bullet points on the cheats it introduced to boost performance, including adding clipping planes to timedemos (which doesn't benefit in-game performance AT ALL and never can without potentially destroying the visual quality).

And according to Driverheaven, they STILL HAVE THAT STUPID UT2003 AF cheat... and in fact, it now applies to all Direct3D games, not just UT2003. Wow, the det50's are so amazing...

If I had the time right now, I could find that slide for you. It's really quite damning.

also, for the record: ati paid 8 million to bundle hl2 with the 9800xt cards. maybe that's why nvidia got blasted the way they did? if nvidia had put out the money, i don't think you would've seen those slides released, but instead some nice nvidia-favored benchmarks and anti-ati slides.

Mmmhmm. Please look up WHEN ATi paid that 8 million. Hint: It didn't happen until after most of the serious engine work was already done. In fact IIRC it was even after most of those big HL2 performance tests appeared.

doom 3 is opengl with a little dx7(?) in the mix, i believe. nvidia, having superior ogl drivers, should come out ahead.

Look up.

Also, wrt DOOM3, there's another factor: Stencil shadows.

If you'll remember, the GFFX line basically has a 4-pixel, 2-texture-per-pixel architecture just like the GeForce4 Ti... but when doing only Z calculation (no colour), it acts like an 8-pixel, 1-texture-per-pixel part (like the DX9-class Radeons, other than the 9600 line and the 9500 non-Pro).

DOOM3 does several entire passes without actually drawing anything.

Basically the GFFX line is a DOOM3 Accelerator. =)
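To make "passes without drawing anything" concrete, here's a bare-bones OpenGL sketch of that pass structure (draw_scene() and draw_shadow_volumes() are placeholder stubs, and real engines use two-sided stencil and per-light interaction passes; this is just the shape of it):

```c
#include <GL/gl.h>

/* Stubs standing in for real geometry submission. */
static void draw_scene(void)          { /* submit scene geometry */ }
static void draw_shadow_volumes(void) { /* submit extruded shadow volumes */ }

void doom3_style_passes(void)
{
    /* Pass 1: depth only - no colour writes at all. This is where a chip
     * that doubles its pixel rate on z-only work gets its free lunch. */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_TRUE);
    glEnable(GL_DEPTH_TEST);
    draw_scene();

    /* Pass 2: rasterize shadow volumes into the stencil buffer -
     * still no colour writes, and no depth writes either. */
    glDepthMask(GL_FALSE);
    glEnable(GL_STENCIL_TEST);
    glStencilFunc(GL_ALWAYS, 0, ~0u);
    glEnable(GL_CULL_FACE);

    glCullFace(GL_FRONT);                    /* back faces: increment on z-fail */
    glStencilOp(GL_KEEP, GL_INCR, GL_KEEP);
    draw_shadow_volumes();

    glCullFace(GL_BACK);                     /* front faces: decrement on z-fail */
    glStencilOp(GL_KEEP, GL_DECR, GL_KEEP);
    draw_shadow_volumes();

    /* Pass 3 (per light): finally draw colour, only where stencil == 0. */
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glStencilFunc(GL_EQUAL, 0, ~0u);
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
    draw_scene();
}
```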

But um... it isn't the OpenGL driver that's doing the trick, it's Carmack's massive time and effort.

Carmack HIMSELF (the quote shouldn't be hard to find) said that nVidia's cores were about HALF as fast as the equivalent ATi parts when using the 'standard' OpenGL floating-point shader path ("ARB2"). The FX line is only faster when it uses Carmack's vendor-specific path.
 

fivefeet8

-= Clark Kent -X- =-
Tagrineth said:
wrt the 50 series, did you see Valve's slide on what was wrong with them? They had about 10 separate bullet points on the cheats it introduced to boost performance, including adding clipping planes to timedemos (which doesn't benefit in-game performance AT ALL and never can without potentially destroying the visual quality).

Actually, Valve spotted those with the 51.xx beta detonators. They haven't said anything about the 52.xx dets yet. HL2 seems to be delayed until next year anyway. By then, there will be new hardware and new drivers from both companies.

Tagrineth said:
And according to Driverheaven, they STILL HAVE THAT STUPID UT2003 AF cheat... and in fact, it now applies to all Direct3D games, not just UT2003. Wow, the det50's are so amazing...

As I understand it, Radeons force Bilinear filtering in UT2k3 if you use the display properties to select AF quality, but do use Trilinear filtering when set to application. The problem with Nvidia's drivers is that they are forcing pseudo bi/tri filtering in UT2k3 no matter what. But then again, let's do a few comparison shots in UT2k3, shall we?

http://f1.pg.briefcase.yahoo.com/bc...rc=bc&.done=http://f1.pg.briefcase.yahoo.com/

The first shot is from me, using the new leaked 52.13 detonators: AF set to application-specific, 8x quality AF per application.

The second shot is taken from an article at 3dcenter.org, which used an anti-cheat script + modified drivers to force Trilinear filtering.

See for yourself. The IQ in my UT2k3 shot is just as good as, if not better than, the fully Trilinear-filtered shot.

And as noted on many hardware message boards by actual users testing these drivers, they have improved IQ in Aquamark, Halo, and Tomb Raider, all with performance increases.

Anandtech has done some IQ comparisons and performance tests with the 52.14. Take a look at the performance increases and IQ.

http://www.anandtech.com/video/showdoc.html?i=1896&p=1
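For reference, here's roughly what "trilinear + 8x AF" means when the application itself requests it (a C/OpenGL sketch, since that's the easiest place to show it; UT2k3 is Direct3D, but the filtering settings are the same idea):

```c
#include <GL/gl.h>

/* From the EXT_texture_filter_anisotropic extension; defined here in case
 * the system header predates it. */
#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT 0x84FE
#endif

/* Request trilinear filtering plus 8x anisotropy for the bound texture.
 * Whether the driver honours this literally - or silently substitutes a
 * cheaper bi/tri blend - is exactly what the argument above is about. */
void request_trilinear_8x_af(void)
{
    /* Trilinear: linear filtering within AND between mipmap levels. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* 8x anisotropic filtering, as in the comparison shots above. */
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 8.0f);
}
```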
 