View Full Version : Matrox Millenium G200 overclockspeed!

5th October 1999, 06:10
what are the maximum settings for a G200? I'm using "Performance Tuner".

5th October 1999, 06:34
Well, I've gotten mine up to 246 MHz GClk (160 MHz MClk), but your mileage may vary.

I say "Here's a monkey wrench... If you bop me over the head with it long enough I might wake up for a second."

5th October 1999, 12:28
Uhm... I'm really new to all this overclocking stuff. I use Performance Tuner, and all I can adjust are "memory clock speed" and "engine clock speed". If I put my MClk at 160MHz, the engine goes to 120MHz... is this OK?

5th October 1999, 14:39
I would suggest downloading MGATweak from http://mgatools.matroxusers.com/ - it was designed for the G200/G400 chipset and will probably give you the best results. If Performance Tuner is telling you 160MHz memory clock (MClk) and 120MHz graphics clock (GClk), it is not accessing the chip quite right. Those values sound backwards, as the MClk on G200s is lower than the GClk (usually divided by 1.5). Give MGATweak a try - I managed to increase my benchmark scores by about 1.5×, mostly from using it for overclocking. If you are going to get serious about it, make sure you have an adequate heatsink (and maybe a fan) on the graphics chip.

I say "Here's a monkey wrench... If you bop me over the head with it long enough I might wake up for a second."

6th October 1999, 08:01
Alrighty, I'll try that! :)
Thanks, mate!

6th October 1999, 16:06
Anid - The memory clock will never be lower than the graphics clock.

6th October 1999, 20:20
Oops... talk about a lapse of thought... I'm used to thinking of the SClk, not the GClk... so, in the real world (where I don't visit very often), the values of 160 and 120 would actually be correct. The 120 MHz of the "graphics engine" would correspond to an SClk of 240MHz, which is what I was thinking of. Sorry about the (gross) error, and thank you Ashely for mentioning it. (Read the sig and it all becomes clear...)
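For anyone keeping track of the arithmetic, the relationship above can be sketched in a few lines of Python. This is just an illustration (the `derived_clocks` helper is hypothetical), assuming the default 2/1.5/2 divider set mentioned later in the thread:

```python
# Sketch of the G200 clock relationships discussed above, assuming the
# default divider set of 2 / 1.5 / 2 (GClk / MClk / WClk).
# SClk is the PLL system clock in MHz; the other clocks are derived from it.

def derived_clocks(sclk_mhz, gclk_div=2.0, mclk_div=1.5, wclk_div=2.0):
    """Return the graphics, memory and WARP clocks derived from SClk."""
    return {
        "GClk": sclk_mhz / gclk_div,
        "MClk": sclk_mhz / mclk_div,
        "WClk": sclk_mhz / wclk_div,
    }

# The 240MHz SClk from the post above gives the 120/160 pair in question.
print(derived_clocks(240.0))  # {'GClk': 120.0, 'MClk': 160.0, 'WClk': 120.0}
```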

I say "Here's a monkey wrench... If you bop me over the head with it long enough I might wake up for a second."

7th October 1999, 05:46
OK, so now I'm using Performance Tuner... but uhm... what would be a nice setting? Would 150MHz memory clock and 110MHz engine clock do fine?

8th October 1999, 00:32
Not sure about Performance Tuner, but I use MGATweak with the fast settings for SGRAM optimisation on my G200. I am able to o/c up to 205, which translates to ~3300 in 3DMark99. The SGRAM optimisation alone was able to bump up the benchmarks significantly.

Has anyone changed the multipliers at all? Any chance of G200 owners posting their settings to see what is possible?

With the older PowerStrip (v2.5x) Maggi, with his Rev. 5 Mystique, was able to o/c to 164 (?) - I would max out at 140...


8th October 1999, 07:30
Never had any luck boosting performance by changing the clock dividers. I have found that I get the best performance (and fewer artifacts) by placing clock speed ahead of RAM optimization. That is how I've gotten my Mill G200 SD 16MB to a 246MHz SClk. Quake2 will run stably with it at 240MHz, with minimal to no artifacting.
But for RAM optimization, I wanted to try and find the default settings of the PD drivers, so I could still change the refresh rate and try to tweak the RAM a little without slowing it to the BIOS default RAM timings. I don't know that these are the actual values, but they are the settings that most closely mimic the baseline (not using any OC) in all benchmarks I could run.
CL: 3
RAS-RAS: 3
RAS-CAS: 3
RAS Min Act: 6
RAS Precharge: 3
Write Recovery: 2
Read to Precharge: Fast
Special Mode: 1
Block Write Cycle: 1
Block Write to Precharge: 3
Base Read Delay 0: 0.2
Base Read Delay 1: 0.2

Taking the RAS-CAS down lower seemed to crash it quicker than anything. I can run it at RAS-RAS: 2 and CL: 2, but there are a lot of artifacts. Enhanced PCI Master Read doesn't seem to boost performance much, but it does decrease stability and visual quality.

9th October 1999, 09:59
Because changing the G200's WClk divider will crash the system, the G200 doesn't have the G400's flexibility in changing dividers. Some G200 owners change only the GClk and MClk dividers and leave the WClk untouched, which results in lower throughput from the WARP engine. Although they get better results in 3DMark99 thanks to the higher GClk and MClk, I don't recommend such a configuration.

However, on the G200 PCI the WClk divider can be changed without causing a crash. There have been very few G200 PCI cards around, and the one and only G200 PCI beta tester of MGATweak hasn't had any problems using different WClk dividers.

As for Enhanced PCI Master Read: G200 PCI performance as measured by 3DMark99 improves by about 5% when it is enabled. I saw no noticeable improvement on the G200 AGP when I tested this option.

9th October 1999, 21:42
Whoa, I'm kinda confused about what I should/can change.

168.75 is what the system clock seems to be at by default...

(marvel g200)

clock dividers? huh? it's at 2/1.5/2

9th October 1999, 22:16
Yeah, the clock dividers are where they are supposed to be, and so is the SClk. I wouldn't worry about changing the dividers; you'll get the maximum benefit from upping the SClk. I've set mine to a 192MHz default in the BIOS (thanks gbm) and can run it as high as 246MHz if I need extra juice. Basically, just increase the SClk in 5% increments until you find the maximum stable speed. After that you can try playing around with the RAM timings if you feel adventurous.
And Kamzter, to your original question: 120 graphics engine and 160 memory is up there pretty high (240MHz SClk). I've gotten mine to run at 248, but with no stability. It'll do 246 stable, but with artifacts. If you wanna stay with Performance Tuner, 120 is a fast setting; you might be able to get to 123 or 124. I would still recommend using MGATweak if you wanna try and get all you can out of your card.
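The "increase by 5% until it's no longer stable" procedure above can be sketched as a small search loop. This is a rough illustration only: `find_max_stable_sclk` and `is_stable` are hypothetical stand-ins, and really testing stability means running a game or benchmark at each speed; the 168.75MHz default and 246MHz ceiling are the figures from this thread.

```python
# Sketch of the 5%-increment overclocking search described above.
# is_stable is a stand-in for actually benchmarking the card at a speed.

def find_max_stable_sclk(start_mhz, is_stable, step=1.05, limit_mhz=300.0):
    """Raise SClk in 5% steps and return the last stable value in MHz."""
    sclk = start_mhz
    while sclk * step <= limit_mhz and is_stable(sclk * step):
        sclk *= step
    return sclk

# Pretend the card is stable up to 246 MHz, as reported in this thread.
best = find_max_stable_sclk(168.75, lambda mhz: mhz <= 246.0)
print(round(best, 2))  # 237.45 - the last 5% step that stays under 246
```

In practice you would narrow in from there with smaller steps, since the true ceiling (246 here) can sit between two 5% increments.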

I say "Here's a monkey wrench... If you bop me over the head with it long enough I might wake up for a second."

Lucky Luke
11th October 1999, 11:05
Great forum!

Hi, I'm from Italy. I read all the information on this site about overclocking the G200, and I'm perplexed...
I've got an AGP G200 with 8MB SDRAM, and it's now running with these settings:
- PLL SClk: 255.00
- GClk: 127.50
- MClk: 127.50
- WClk: 127.50
- PCI Retry & Enhanced ... Read = ON
- everything set to the lowest value, except:
- Memory Refresh Counter = 3521
- RAS Minimum Active Time = 6

Is this possible (especially the WClk)?

3DMark99 gives me ±1950.
AGP is at 1x; 2x crashes even if I don't overclock the G200.


11th October 1999, 13:30
Hey, I have the same crash when I put my G200 8MB SDRAM on 2x AGP!!! But when I turn off bus mastering, it doesn't crash! It does go slightly slower then, though! What's up with this??!?

16th October 1999, 20:42
I have been messing with MGATweak on my Marvel G200-TV AGP. I have found that with the default divider, when I get to an MClk of about 122-125, I start getting defects (damn SDRAM) - and that's with all SDRAM optimizations maxed. I have a fan on my G200, so I figured I should be able to push the chip higher than that, so I increased the memory divider to 2.0. That way I can raise my SClk higher and not hit that memory limit. I found that I could increase it all the way up to a 249.75 SClk (124.88 on all clocks) and everything still runs completely stable. I will play with it further, but I know my memory doesn't like going much faster, so I'm sure it won't be near as stable.
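The divider trick above comes down to simple arithmetic: the memory clock is SClk divided by the MClk divider, so raising that divider from the default 1.5 to 2.0 raises the SClk ceiling that the ~125MHz SDRAM limit allows. A minimal sketch (`max_sclk_for_memory` is a hypothetical helper, and the 125MHz limit is the figure reported in this post):

```python
# Sketch of how the MClk divider sets the SClk ceiling, given an SDRAM
# speed limit. MClk = SClk / divider, so SClk_max = MClk_limit * divider.

def max_sclk_for_memory(mclk_limit_mhz, mclk_div):
    """Highest SClk (MHz) that keeps MClk at or under its limit."""
    return mclk_limit_mhz * mclk_div

print(max_sclk_for_memory(125.0, 1.5))  # 187.5 - memory-bound with the default divider
print(max_sclk_for_memory(125.0, 2.0))  # 250.0 - room for the 249.75 SClk above
```

The trade-off is that at the same SClk the higher divider runs the memory slower, so it only pays off if the chip can climb enough to compensate.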

One other note: I have an SB Live! Platinum, and I found that at times when the video card is hogging the bus, I get sound breaks and loops. I turned off the PCI Retry setting, which gave a VERY slight decrease in graphics performance but cleared up my sound perfectly.


16th October 1999, 21:27
Why doesn't my Matrox tweak tool save my settings? Or am I just hallucinating? Every time I change my settings, select "save settings" (or something) and then hit Apply, it doesn't seem to save them! When I re-open it, the settings are as they were... the standard settings...