View Full Version : Wake me when the "G800" gets here. Zzzzzzz.

Steve Snyder
6th October 1999, 07:53
I love Matrox cards for 2D display. I've had 2 of them (Millennium, G200) and they've always worked great with OS/2, Win9x and Linux. They are prompt in keeping their drivers and BIOSes updated. Matrox has earned my loyalty.

So why do I have 3 (!) video cards in my machine? In addition to the G200 I have SLI Voodoo2 cards because I play FPS action games, which require a good frame rate. I've paid close attention to the G400(MAX) benchmarks posted and they simply can't touch SLI/V2 for 3D performance.

I'm not asking for the world: if Matrox can *equal* (not surpass) SLI/V2 at the same resolutions (16-bit @ 800x600 or 1024x768) when playing Quake3, I'll dump the V2 cards in a heartbeat. I *want* the image quality that the G400 provides, but I *need* the performance of SLI/V2.

(Point of reference: I get 57fps @ 800x600 on Q3Test v1.08 demo #1 with all eye-candy enabled and sound disabled. My 1024x768 rate is 39fps.)

I would very much like to replace my 3 video cards with the G400's successor. I hope it gets here soon.

6th October 1999, 08:33
Stay tuned for the Turbo, to be released this Friday. Tell us your CPU and let someone compare.
BTW, in your tests, try sound at high quality and textures at MAX.

In Harm's Way

Steve Snyder
6th October 1999, 09:27
My machine has a P3/550 CPU and 256MB of RAM. It is the 3D video subsystem that is the fps bottleneck on my system. And that's with SLI/V2; any current Matrox card would slow it down even more. I can afford to spend some of those CPU clocks on a video card that needs them for performance (as apparently the G400 does).

Also: I noted that I had the sound disabled when running the Q3Test benchmark to assure those reading my message that I was really comparing *video* performance, unhampered by any audio bottlenecks. Either you or I may have the more CPU-efficient sound card, but that's not a factor in the numbers I quoted.

This is the first I've heard of the Turbo. What is it?

6th October 1999, 10:20

SLI is quicker only in OpenGL, the G400 in D3D; plus SLI has not the colour capability of the Matrox--see Q3--oops--you can't, can you?

General Veers
6th October 1999, 10:35
What are you talking about? V2 SLI works fine with Q3, at least on my comp. I get playable framerates at 640x480 with high quality on, or 800x600 at medium quality.

Steve Snyder
6th October 1999, 11:38
Troop: "SLI is quicker only in OpenGL, the G400 in D3D; plus SLI has not the colour capability of the Matrox..."

Yes, that's the basis of my desire for a "G800". A Matrox video card that could hold its own against SLI/V2 or V3/3000 on their terms (as I said, 16-bit @ 800x600 or 1024x768). That would still give me the option of gaining visual quality at the expense of performance by selecting 32-bit resolutions.

Troop: "...oops--you can't, can you?"

Don't be snide. I am aware that the speed of the V2/V3 comes at the expense of visual quality. Remember all that gnashing of teeth when the specs for the V3 were first revealed?

My regret is that I currently don't have the option of selecting speed vs. quality. I can either have a high frame-rate with my SLI/V2 or I can have a dazzling display with a G400.

6th October 1999, 12:01
Well on my PIII 500/G400 MAX with the new TurboGL driver that is to be released on Friday I get the following scores in Q3Test v1.08 demo #1, 32-bit colour, everything set to max.

640x480 = 80.7fps
800x600 = 66.0fps
1024x768 = 42.4fps
1280x960 = 25.2fps
1600x1200 = 14.9fps
2048x1536 = 8.0fps

6th October 1999, 12:18
Average frame rate doesn't count for much. What matters is the LOWEST frame rate. It's one thing if it dips once or twice because of something in the background, but if every time you walk into a room with guns blazing the framerate drops to 15FPS, the game won't be playable online.
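The averages-hide-dips point is easy to see with some back-of-the-envelope arithmetic (the numbers below are hypothetical, just to show how little a short dip moves the mean):

```python
# Hypothetical per-frame rates: 95 smooth frames at 57fps, then a
# 5-frame dip to 15fps when the room fills with rocket spam.
frames = [57] * 95 + [15] * 5

average = sum(frames) / len(frames)  # the number benchmarks report
lowest = min(frames)                 # the number you actually feel

print(f"average: {average:.1f}fps, lowest: {lowest}fps")
# average: 54.9fps, lowest: 15fps
```

A benchmark average near 55fps looks healthy, but the 15fps moments are exactly when you're getting shot at.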

The D3D argument isn't valid for every game, either. For UT, the G400 in D3D mode can't hang with a V2 SLI rig using the exact same settings as the V2 - 16/high/high/detailed/HUD, all the sound options on hardware, etc.

I have hopes for the soon-to-be-released drivers. Then again, if my games are still unplayable, I'll just go back to the V2 SLI until the V4 comes out. When I bought the G400 MAX, I was really hoping to be able to chuck the V2 SLI rig - but I can't stand getting spanked by someone in HL or UT because their piddly little 8MB V2 can update faster than my G400 MAX.

Then again, my 2D desktop is really purdy. :-)

Steve Snyder
6th October 1999, 12:27
Ant: "Well on my PIII 500/G400 MAX with the new TurboGL driver that is to be released on Friday I get the following scores in Q3Test v1.08 demo #1, 32-bit colour, everything set to max."

Those are pretty impressive numbers. The figures for 800x600 and 1024x768 rival those that I get on my SLI/V2.

What accounts for the improvement over the posted benchmarks: the TurboGL driver plus a PIII?

Also: if the new driver is to be given credit for the performance improvement, is there (or will there be) something like it for Linux/XFree86?

Thanks for the info.

P.S. 8fps @ 2048x1536x32bpp? That resolution would turn my monitor into a heap of smoking rubble. :-)

6th October 1999, 12:59
Steve: They rival your V2/SLI? Well, considering that the V2 doesn't do 32-bit color or 512x512 textures, the G400 whomps V2/SLI butt.

6th October 1999, 14:48
Another one thinking 16bit+16bit=32bit?? http://forums.murc.ws/ubb/wink.gif

I'm kidding, Steve, don't attack me on that, but I got to side with Rick & Ant here.

The G400 in 32-bit kicks ass over a V2/SLI combination. But you already said you only want raw speed, so why not stay at that, and let us enjoy the colours, lighting, bump mapping and the slightly slower speed?


6th October 1999, 15:08
That's going a bit (sic) far. In fact, in (still) the best FPS, SLI rulz, unfortunately. Now, HL is @ 16bpp, but even so, the current Matrox ICD dogs it. And it's ugly.
That's why I told the original poster to wait for Friday. I too want to disassemble my SLI, but I have him beat here: mine's on a MAX.

BTW, anyone posting fps test scores might as well show the screen size along with the CPU. The score, @ fullscreen, is much better on a 17" than on a 19" monitor.

In Harm's Way

6th October 1999, 20:56
In Unreal (I am not sure about UT), comparing my G400 MAX to an SLI Voodoo2 using D3D on both cards, I get higher fps on my MAX. In Glide, the Voodoo rules Unreal; in D3D it's slow and ineffective. Also, the G400 beats the hell out of Voodoo2 SLI in Quake3. Considering that the 3dfx card does not support the highest image quality settings (everything on max), the G400 seems like a good idea for you.

7th October 1999, 01:03
My scores were fullscreen on a ViewSonic P817 21".

7th October 1999, 01:43
I know this isn't directly related to this topic, but I have to ask.

Ant, is the P817 really worth the price tag?
I was looking at one as I'm updating my monitor soon, but baulked a bit at the price ($4037 AUS)...

Gigabyte GA6BX, 128MB PC100 Ram, 16MB G200 Millennium AGP, Creative Labs SBLive!, Netgear FA310TX, Sony 32X CD-Rom, Imation LS120, 5.2 & 12.7GB Quantum HD's, Everything being driven by Win98...

Bored Yet?

[This message has been edited by Delany (edited 10-07-1999).]

7th October 1999, 05:00

I can corroborate Ant's story. On a P3-550 with TurboGL, I'm getting mid-to-high 40's at 1024x768 with sound ON and all eye candy ON (texture slider back one, they still need a little work in that area) in Quake3. My old maximum framerates were in the 50's, but it would slow to 10fps in some areas, bringing the average down to mid-20's.

Now, the maximum is still in the 50's, but the minimum is in the high 20's or low 30's, so the average is in the 40's.

And that's with 32-bit color, 32-bit textures, shadows, etc.

So, not to gloat, but I beat the snot out of your V2 SLI rig, in a color depth you haven't got.

And I've got a REGULAR G400.

- Gurm

G. U. R. M. It's not hard to spell, is it? Then don't screw it up!
The word "Gurm" is in no way Copyright 1999 Jorden van der Elst.

7th October 1999, 05:26
Yup--we got some thing to look forward to: http://www.anandtech.com/html/review_display.cfm?document=1054&pagenum=8

In Harm's Way

7th October 1999, 06:35
Anyone know if the drivers out tomorrow are going to be DX7-optimized, and exactly how optimized (a once-over, or a full polish and spit-shine)?

As for 16-bit vs 32-bit, I'll trade bit-depth for resolution any day in a multiplayer game (16-bit being as low as I'll go). Hard to see those heads peeking around objects at 100 paces when sniping on doublecross/rats in HL, even at 1024x768. :-)

7th October 1999, 06:44
What's funnier than bragging about V2 performance on a Matrox board? BWWWWAAAAAHHAHA

7th October 1999, 07:45
You just keep right on laughing while I launch a rocket into your chest in UT and you see your frame rate go down the tubes. :-)

7th October 1999, 16:12
I wouldn't pay the asking price for it, mine is on loan. It is a nice monitor but personally I'd probably go for a high quality 19" and put the rest of the money to good use.

7th October 1999, 16:51
Good point.

For the price of one P817 I could buy 2 Sony 400PS 19" monitors and a G400MAX...

Hmmm...now there's a thought...

the G400MAX is already on the way anyway...
