New video card time...



agallag
30th July 2004, 21:13
I'm probably going to pick up an MSI GeForce FX 5900XT 128MB tomorrow. If anyone has any reason why I shouldn't, or why a Sapphire 9800 Pro 128MB would be a better buy, speak now or forever hold your peace. :D

Chrono_Wanderer
30th July 2004, 21:17
Get the 6800GT :p

I wouldn't get the 5900XT; I'd rather go for the 9800 Pro if you don't need good Linux/BSD support. The 5900XT is slow in shaders, and I don't think it will run Half-Life 2 well...

ATI is rumoured to be rewriting their OpenGL drivers, so OpenGL performance on ATI cards may get better soon... (Doom 3!)

mmp121
30th July 2004, 22:22
You didn't say what you're going to use the 5900XT for. If it's Linux, then go for it. If you're mainly going to game and use Windows, then I don't see why you wouldn't get the ATI 9800 Pro. It's flat-out faster than the 5900XT for gaming, except maybe for Doom 3.

agallag
30th July 2004, 23:06
It's for Windows only. I do plan to get Doom 3, but that certainly won't be the only game I'll be playing.

The 6800GT isn't an option; I refuse to spend $600 (Canadian) on a video card. So the 9800 Pro really is faster than the 5900XT at most things?

rugger
31st July 2004, 01:18
Absolutely. The 9800 Pro is faster, and it has much better rendering quality.

agallag
31st July 2004, 07:32
Alrighty then... 9800 Pro it is.

gangster
31st July 2004, 10:00
Or a standard 6800 for less than $300; it's much faster than a 9800 Pro, and the image quality is equal now, IMO.

ZokesPro
31st July 2004, 16:27
What are you going to do with your old card eh? Maybe give it to Zokes, you know...;)

agallag
31st July 2004, 21:16
Originally posted by gangster
Or a standard 6800 for less than $300; it's much faster than a 9800 Pro, and the image quality is equal now, IMO.

$300? Not in Canadian dollars it ain't.

agallag
31st July 2004, 21:16
Originally posted by ZokesPro
What are you going to do with your old card eh? Maybe give it to Zokes, you know...;)

You really want a Radeon 8500LE with a broken fan? :p

leech
2nd August 2004, 04:14
I'm loving my 6800 GT. It's pretty sweet, though I still need to reinstall Windows or something, since I'm pretty sure I should be getting more than just 14k in 3DMark01... in 3DMark03 I'm getting around 10k, though.

Odd thing is, I had it overclocked to 6800 Ultra speeds, but the 3DMark03 score didn't get any higher... maybe because Overnet was running in the background... :D

Leech

ZokesPro
2nd August 2004, 04:52
Originally posted by agallag
You really want a Radeon 8500LE with a broken fan? :p

Sure, why not? :)

agallag
2nd August 2004, 21:06
What do you have now, Zokes?

By the way, I haven't bought the new card yet. They were sold out. I'll probably order one this week if they don't get more in stock.

Chrono_Wanderer
2nd August 2004, 23:28
The 6800 non-Ultra/GT should be quite a lot cheaper than the 6800GT/Ultra variants, because it only has 12 pipes "enabled".

Performance is about like a 9800XT, IIRC.

On a side note, NVIDIA rendering quality is not worse than ATI rendering anymore. I think you can force real trilinear filtering in the new ForceWare drivers (I've seen it in mine).

I haven't run any 3D games these days, though, so I can't give you guys the absolute answer on 3D rendering quality (too much work!!!!)

I am waiting for a GeForce 6600 personally... I think those won't cost an arm and a leg, and they can play both HL2 and D3 reasonably well, IMO.

Mehen
2nd August 2004, 23:52
New ForceWare drivers = evil.

With either of the 61.7x drivers, TV-out doesn't work with my card. They also mess up all of my nView stuff. That said, the 56.72 drivers (or whatever the latest were before 6x.xx) work great, and the IQ is awesome.

Been playing Doom 3 a bit today; it's running great!

As for your video card decision, the 9800 Pro is a wise choice.

leech
3rd August 2004, 08:52
From everything I've read on various forums (mostly nvnews.net), unless you have a GeForce 6 series card there really isn't any reason to use the 6x.xx drivers. TV-out doesn't work with these drivers on the older GeForce cards, yet I'm pretty sure it works OK on the 6 series... maybe if I get bored one day I'll try it out. I'm currently using the 62.20s and they seem to work fine here, though I haven't set up the second monitor or TV-out yet. Been too lazy... well, actually sick.

I did give Far Cry a shot... I don't know if it's due to me being sick or what, but I didn't find the game all that fun. Granted, the outside visuals look fantastic, but the gameplay was just meh, and the building interiors just looked OK. Also, I had an issue with the game: every time I shoot someone on a new level, it pauses while it loads something, which is REALLY irritating. It does it for the first few shots, as if my system doesn't have enough memory to store a muzzle flash or something. I have a 3.06GHz Pentium 4 with 512MB of RAM... and now I can honestly say that my CPU/memory is the bottleneck: after I overclocked them and ran several benchmarks, the scores went up from stock. If I overclock my 6800 GT now, it doesn't do any good!

Maybe it's now time for that AMD64... :D

Leech

GNEP
3rd August 2004, 09:02
Haven't played the game, but could those pauses be sound related? I'm sure I've seen similar behaviour in other games, and it was the gun sounds being loaded the first time they were used...

gangster
3rd August 2004, 14:01
Originally posted by leech
...every time I shoot someone on a new level, it pauses while it loads something... as if my system doesn't have enough memory to store a muzzle flash or something. I have a 3.06GHz Pentium 4 with 512MB of RAM...


I saw a test of Far Cry vs. memory usage, don't recall where, but it said it was optimized for 768 meg.

gangster
3rd August 2004, 14:02
Originally posted by agallag
$300? Not in Canadian dollars it ain't.

Yes, sorry, $285 US is what I paid.

gangster
3rd August 2004, 14:04
Originally posted by Chrono_Wanderer
On a side note, NVIDIA rendering quality is not worse than ATI rendering anymore.

Image quality on my Leadtek 6800 std. is equal to my previous Sapphire 9700; they really have come a long way.

Tjalfe
3rd August 2004, 14:12
Originally posted by gangster
I saw a test of Far Cry vs. memory usage, don't recall where, but it said it was optimized for 768k.

768M, I am sure :p

I have 1GB in my machine, but it still stutters the first time you fire a gun after loading the game, as it hasn't loaded the sound file yet.

ZokesPro
4th August 2004, 05:24
Originally posted by agallag
What do you have now, Zokes?

By the way, I haven't bought the new card yet. They were sold out. I'll probably order one this week if they don't get more in stock.

Oops, sorry for replying so late.

I have a GeForce2 Pro AGP 64MB, woohoo, hehe.

agallag
4th August 2004, 12:35
9800 Pro 128MB has been ordered. ETA = 2 days (give or take). Old card is being donated to the Zokes Gaming Fund. Yay. :)

ZokesPro
4th August 2004, 15:09
http://gloverlab.biochem.ualberta.ca/~dave/bmn/Source%20Files/woohoo.gif

leech
4th August 2004, 16:20
Honestly, I didn't notice much of a difference going from my Parhelia to my 6800 GT. It has excellent 2D image quality, a lot better than I was expecting. I did notice that text doesn't look quite as crisp, but that may be due to not enabling any AA for text. It looks great in Linux though :D I always did prefer Matrox's glyph anti-aliasing over ClearType.

Leech

Chrono_Wanderer
4th August 2004, 20:32
Hey leech, which LCD are you using?

leech
5th August 2004, 04:17
I'm not; I'm using a 21" CRT. I was highly impressed that the 2D image quality didn't kill my eyes. I currently have it set to 1280x1024. I haven't bothered to set up dual-head yet (been too lazy, and, well, I caught bronchitis recently, YUCK!). If I overclock my PC to 3.22GHz (which runs pretty stable, but I usually just leave it at 3.06GHz since I don't want my room any hotter than it already is), I get about 12k in 3DMark03.

Leech

Wombat
5th August 2004, 08:43
Originally posted by leech
I'm not; I'm using a 21" CRT. I was highly impressed that the 2D image quality didn't kill my eyes. I currently have it set to 1280x1024.

Blah, set it to 1280x960.

Kooldino
5th August 2004, 10:33
Originally posted by Chrono_Wanderer
ATI is rumoured to be rewriting their OpenGL drivers, so OpenGL performance on ATI cards may get better soon... (Doom 3!)

Doom 3 doesn't run OpenGL, AFAIK. It needs DirectX 9.0b or better.

Wombat
5th August 2004, 10:48
Originally posted by Kooldino
Doom 3 doesn't run OpenGL, AFAIK. It needs DirectX 9.0b or better.

Don't know how they'd be releasing an OS X version, then, which they are.

bsdgeek
5th August 2004, 10:51
I think the DX9 requirement is for sound, isn't it?

leech
5th August 2004, 16:31
@ Wombat:

Why 1280x960? It's looking pretty nice at 1280x1024@100Hz.

About Doom 3...

All of id's games use OpenGL, for the very reason that it's far more cross-platform than DirectX.

And I would guess that for the sound engine they're using OpenAL?

Leech

bsdgeek
5th August 2004, 16:58
Originally posted by leech
And I would guess that for the sound engine they're using OpenAL?

I was just guessing because they do say it requires DX9. In truth, I don't know.

Wombat
5th August 2004, 17:00
Originally posted by leech
@ Wombat:

Why 1280x960? It's looking pretty nice at 1280x1024@100Hz.
Your CRT is meant for 4:3 resolutions, and 1280x1024 is the only common non-4:3 resolution (it's 5:4). You're "squishing" more data onto the screen, so things come out shorter vertically than they were meant to be.
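
To put numbers on it:

1280 / 960  = 4/3 = 1.333...  (matches a 4:3 tube exactly)
1280 / 1024 = 5/4 = 1.25      (everything drawn roughly 6% shorter than intended)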

leech
5th August 2004, 17:49
OK, that makes sense. I just got used to 1280x1024 because that's the resolution triple-head puts it in. Quick question: why does Windows always detect my monitor's refresh capabilities wrong? It says I can only do 1600x1200@75Hz, but right now I'm running it at 85Hz. I installed the INF files for it, and Device Manager lists it properly (it's a Compaq P110), but I always have to uncheck the box that says "Hide modes that this monitor cannot display".

Leech

Wombat
5th August 2004, 18:49
Probably the monitor's fault. I think the EDID data it sends over DDC to the video card is wrong.
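
If you're curious what the monitor is actually claiming, you can dump the EDID block (for example with the Linux read-edid tools: get-edid > edid.bin) and pull the range-limits descriptor out of it. A rough Python sketch, assuming a 128-byte base EDID dump saved as edid.bin (the filename is just an example):

# print_edid_ranges.py - print the refresh ranges a monitor reports in its EDID.
with open("edid.bin", "rb") as f:
    edid = f.read(128)

# Every EDID block starts with the fixed header 00 FF FF FF FF FF FF 00.
if edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
    raise SystemExit("not a valid EDID block")

# Four 18-byte descriptors live at offsets 54, 72, 90 and 108.
# A display-range-limits descriptor starts with 00 00 00 FD.
for off in (54, 72, 90, 108):
    d = edid[off:off + 18]
    if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
        print("vertical refresh: %d-%d Hz" % (d[5], d[6]))
        print("horizontal scan : %d-%d kHz" % (d[7], d[8]))
        print("max pixel clock : %d MHz" % (d[9] * 10))
        break
else:
    print("no range-limits descriptor found")

If the numbers it prints don't match the monitor's manual, the EDID really is wrong and it's not the card's fault.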

Admiral
5th August 2004, 20:02
Originally posted by Chrono_Wanderer
The 6800 non-Ultra/GT should be quite a lot cheaper than the 6800GT/Ultra variants, because it only has 12 pipes "enabled".

It seems they just found a way to enable those pipelines, and the unlock will be included in the next RivaTuner.
Guess I'll be getting myself a Leadtek 6800 NU in the end instead of a GT, for $150 less. :)

Greebe
5th August 2004, 22:06
I don't think the EDID is the problem, Rob/Leech. The Parhelia has problems with most Sony monitors (Sony being the OEM).

I've got an IBM P260 (same as the Dell P11x0 and Gateway P11x0 series, aka the Sony G-series chassis, if memory serves), and even with the Parhelia customization file specific to this monitor, it won't give me either its maximum capabilities or, like you, the right default refresh rate. Even if I make a custom resolution profile, it doesn't work.

If driven by a G550 or G400 MAX it'll do the upper resolutions, but it still has the same default refresh rate issue.

leech
6th August 2004, 05:09
I don't have the Parhelia anymore, Greebe. I have an eVGA 6800 GT.

And it is the EDID: at least in Linux it won't read it, and I get errors about not being able to read it properly. But in Linux I can at least enter the refresh rate ranges into the XF86Config file, and it'll choose the optimum refresh rate for each resolution I switch to. With Windows XP I don't know if there's a way to do the same thing. I've downloaded the drivers from here (http://h18007.www1.hp.com/support/files/monitors/us/locate/64_5817.html) but that didn't help.
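
In case anyone wants to do the same, the ranges go in the Monitor section of XF86Config. Something like this; the sync values below are placeholders rather than the P110's real specs, so pull the actual numbers from your monitor's manual before using them (overdriving a CRT can damage it):

Section "Monitor"
    Identifier  "Compaq P110"
    # Placeholder limits -- replace with the ranges from
    # your monitor's manual before restarting X.
    HorizSync   30.0 - 107.0
    VertRefresh 50.0 - 150.0
EndSection

With those lines in place, X picks the highest refresh rate each resolution allows within the ranges.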

Leech

agallag
11th August 2004, 14:11
Take a look at the system specs link in my sig; it has been updated :)

Unfortunately, I had to back out of the donation to the Zokes Gaming Fund. :( Unforeseen circumstances. I now owe him lots of beer the next time he's in Toronto. Good thing I was able to talk him out of the sexual favours :p

Mehen
12th August 2004, 23:09
Oooo, the 6600GT is looking sweet. It might be a couple of months before an AGP part comes out, though. Check it out on www.hardocp.com

UtwigMU
13th August 2004, 03:08
Based on The Tech Report's claim of around 42 FPS at 1600x1200, this should sit between the vanilla 6800 and the Radeon 9800.

Basically a higher-performing card at the R9800's price point.