
Coexistence of Matrox Millennium & GeForce


  • Coexistence of Matrox Millennium & GeForce

    Yesterday I bought an Asus GeForce with DDR RAM as a replacement for my "not fast enough" Millennium + Voodoo 2 SLI combination. Man, what a disappointment the 2D was (I run 1600x1200, small fonts, on a ViewSonic P815). Fonts were unsharp and the colors were nowhere near as clear. Quake 3 was, however, as expected, very smooth.

    So what I want some help with is running the still, surprisingly, superior four-year-old Millennium for 2D and the GeForce for 3D. I use Win2K and the 2D is no problem - it works. But none of the games I have will accept the GeForce - try running Q3 on the Millennium.

    I have two inputs on the monitor, VGA and BNC, so I'm basically running a multiple-monitor setup on one monitor.

    Was I a nut for not buying a Matrox G400 MAX instead of the GeForce? I mean, is the MAX fast enough?

    Any thoughts would be appreciated.

    Louis


  • #2
    Hi, is your Matrox card PCI?

    And your GeForce AGP? If ya put them in at the same time, you can use a little app to change which card gets used for what:
    http://www.3dfiles.com/utility/3dcontrolcenter.shtml (for 3D Control Center)
    http://www.3dfiles.com/utility/ (go to "multiple video cards")


    ------------------
    P2 333 (not o/c yet <G>)
    128mb 66mhz sdram
    gigabyte bx2000
    matrox g400 max (not o/c)
    ....................
    P3 600e @ 660 (6*110)
    128mb 100mhz sdram
    abit be6-2
    Radeon 32ddr (biding time till the g800)
    voodoo 3 2000 pci (166)
    soundblaster 16pci
    4.3gb seagate udma 33
    15.3 wd udma 66
    creative modem blaster 56k ext
    win me
    ie5
    direct x 8.0
    4013.71

    ....................



    • #3
      can I have your voodoos?



      • #4
        "Was I a nut not buying a Matrox G400 Max instead of the G-force - I mean is the Max fast enough?"

        Yup.
        You see how superior the Mill2 is in 2D visuals... the G400 is even better. And all G400s are plenty fast enough. Ya nut.

        You should be able to get those cards to work together if you want to, though. Since you want to use the GF for 3D and the Mill2 for 2D, you will only need to use the program listed above once, to set D3D to use the GF. Then you will have to set up a dual desktop for 2D, and use the BNC/VGA input toggle on your monitor to switch between the two.

        Or you could take the GF back and get a G400. It's not like these are slow cards. The GF may score higher average and maximum fps in some (mostly OpenGL) games, but the G400 usually holds a higher minimum fps, plus you get the Matrox visuals...
        Core2 Duo E7500 2.93, Asus P5Q Pro Turbo, 4gig 1066 DDR2, 1gig Asus ENGTS250, SB X-Fi Gamer ,WD Caviar Black 1tb, Plextor PX-880SA, Dual Samsung 2494s



        • #5
          3DControl Center will only let you switch between Direct3D cards. For Quake 3 (and other OpenGL games) you can try GL Switcher... use it at your own risk though (I haven't tried it).
          http://home.earthlink.net/~heavensrage/ogls.html

          Games Box
          --------------
          Windows 2000Pro, ASUS A7Pro, Duron 750@950, 192MB Micron PC133, OEM Radeon DDR, 15gb Quantum Fireball+ LM, Fujitsu 5.25gb, Pioneer 32x slot load CDROM, SB Live! Value, LinkSys LNE100, Altec Lansing ACS45.2, Samsung Syncmaster 955DF, Sycom 300va UPS

          Video Box
          ------
          Windows 2000Pro, PIII700 on ASUS CUBX, 256mb Micron PC133, Vanilla G400/32 (PD5.14), Hauppage WinTV-DBX, LinkSys LNE100, 8.4gb Maxtor HD, 40gb 7200 Western Digital, Diamond Fireport 40 SCSI, Pioneer 32x SCSI Slot load CDROM, Pioneer 10x Slot load DVD, Yamaha 4416s burner, MX300, Panasonic Panasync S70

          Feline Tech Support
          -------------
          Jinx the Grey Thundercat, Mischa (Shilsner?)(still MIA)

          ...currently working on the world's first C64 based parallel computing project



          • #6
            If you replace your GeForce with a G400, the G400 will also compare poorly to the Millennium.

            Just switch your BNC cable from the Millennium to the GeForce...



            • #7
              G400 IS FAST!!
              http://forums.murc.ws/ubb/Forum4/HTML/003229.html

              Check it out.

              ------------------
              The Wonderful thing about tIgGers is tIgGers are fast deadly and sneaky. ;-)



              • #8
                Man, we all wish we had a GeForce here, what a silly question...



                • #9
                  I just upgraded to a 600e Coppermine that runs swell at 800 MHz. I ran the Wintune98 benchmarks with this result. (My computer is me.)
                  http://www.teleport.com/~zetra303/WINTUNF.jpg

                  HEADLINE-
                  16MB G400 slays GeForce DDRs again!

                  ------------------
                  The Wonderful thing about tIgGers is tIgGers are fast deadly and sneaky. ;-)



                  • #10
                    When I plugged in the GeForce I had to reinstall Windows, even though I did uninstall the G400 first. There was trouble with OpenGL on the GeForce; D3D worked fine. Since I have a 15-inch Nokia monitor I can't see the 2D picture quality difference, but 3D is WAY faster on the GeForce, especially in multitexture games (G400: 245 Mtex fill rate, GeForce: 470).
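                    Fill-rate figures like these follow from core clock × pixel pipelines × texture units per pipeline. A minimal sketch of that arithmetic (the clock and pipeline counts below are illustrative assumptions, not verified specs for either card):

```python
# Theoretical multitexture fill rate in megatexels/second:
# core clock (MHz) x pixel pipelines x texture units per pipeline.
def fill_rate_mtex(core_mhz, pipelines, textures_per_pipe):
    return core_mhz * pipelines * textures_per_pipe

# Illustrative numbers only (check the actual datasheets):
# a 2-pipe, 1-TMU-per-pipe part at 125 MHz
print(fill_rate_mtex(125, 2, 1))  # 250 Mtex/s
# a 4-pipe, 1-TMU-per-pipe part at 120 MHz
print(fill_rate_mtex(120, 4, 1))  # 480 Mtex/s
```

                    Numbers in that ballpark line up with the 245 vs. 470 Mtex figures quoted above.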



                    • #11
                      Wait a minute !!!

                      Look into your mobo's BIOS and search for an entry like "VGA Boot Sequence" or something like that.
                      It is a switch that determines which slot is used as your primary display adapter, and it usually defaults to PCI/AGP... set it to AGP first (AGP/PCI) and see if that helps you play 3D games.

                      PS: You were a nut ...

                      ------------------
                      Cheers,
                      Maggi

                      Despite my nickname causing confusion, I am not female ...

                      ASRock Fatal1ty X79 Professional
                      Intel Core i7-3930K@4.3GHz
                      be quiet! Dark Rock Pro 2
                      4x 8GB G.Skill TridentX PC3-19200U@CR1
                      2x MSI N670GTX PE OC (SLI)
                      OCZ Vertex 4 256GB
                      4x2TB Seagate Barracuda Green 5900.3 (2x4TB RAID0)
                      Super Flower Golden Green Modular 800W
                      Nanoxia Deep Silence 1
                      LG BH10LS38
                      LG DM2752D 27" 3D



                      • #12
                        I personally don't see any difference in 2D between my G400 and my former i740 chip. But OK, I have only a 15" monitor, so you can't see the difference.

                        However a friend bought a 21" monitor and compared both the TNT2 and G400 and acknowledged that the G400 has a somewhat sharper image at 1600x1200 but that the difference was very small.

                        I personally feel that the crisp and clear image of Matrox cards is somewhat exaggerated. But others might think otherwise.

                        Frank



                        • #13
                          Hi All

                          It seems this thread has taken a turn - and is now about visual qualities. Fine with me.

                          I can add that the Millennium (Matrox's first Millennium, with 4 MB WRAM) and the Asus GeForce 32 MB DDR are working in the same system (Win2K), and it is possible to switch between the two without rebooting. The problem was that games which cannot select an adapter themselves always use the primary display adapter, which in my case was the Millennium. It is possible to make that change in the Windows display settings and then switch the VGA/BNC input on the monitor.

                          Visual quality:
                          The difference between the two adapters is big. In fact, like night and day. The good old Millennium is brighter, sharper, whiter, and clearer.

                          However, it maxes out at 75 Hz at 1600x1200 (I used to get 80 Hz in Win98; does anyone know how to hack the registry?), can only do 16 bit, and is probably somewhat slower. The GeForce can probably do 120 Hz in 32 bit. There is no difference between the VGA and the BNC input.
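                          A refresh ceiling like this is consistent with RAMDAC bandwidth: the required pixel clock is roughly width × height × refresh rate × a blanking-overhead factor. A rough sketch (the 1.32 overhead factor and the RAMDAC speeds mentioned are assumptions for illustration, not specs from this thread):

```python
# Approximate pixel clock (MHz) needed for a video mode.
# The 1.32 factor is a rough allowance for horizontal/vertical blanking.
def pixel_clock_mhz(width, height, refresh_hz, overhead=1.32):
    return width * height * refresh_hz * overhead / 1e6

print(round(pixel_clock_mhz(1600, 1200, 75)))  # ~190 MHz
print(round(pixel_clock_mhz(1600, 1200, 85)))  # ~215 MHz
# A ~220 MHz RAMDAC would therefore top out around 85 Hz at 1600x1200,
# while a 300+ MHz RAMDAC has headroom for 100 Hz and beyond.
```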

                          There is also not a very big difference at resolutions up to 1280x1024, but once we get to 1600x1200 and above… When I first saw it I thought that something in the setup must really be wrong, especially because I read everywhere that the GeForce was second only to the G400 in visual quality. If it is, it's laps ahead!

                          On this thread Ashley wrote that the G400 would compare poorly against the Millennium. I thought every step in the Millennium evolution brought better visual quality? (Millennium -> Millennium II -> G200 -> G400) Isn’t it so?

                          About speed
                          I tried to run UTbench.dem and got 33 frames/sec at 1024x768x32 on the GeForce. What could I expect from a MAX? (I have a PIII 500, 256 MB, lots of disk space, and Win2K.) What about Q3 and Win2K?

                          Thanks to everyone for participating.
                          Louis



                          • #14
                            Louis, I was alluding to the issue of cabling, not card, differences. (I assume both cards are run at the *same* refresh rate, or your comparison isn't valid at all.)

                            At 1600x1200x75Hz, you say you can NOT tell the difference between using a BNC cable and using a VGA cable??? The GeForce is inferior either way?

                            Nah... I don't buy that.



                            • #15
                              Ashley,

                              I'm sorry, but that's the way things are. There is no difference whether I use VGA or BNC; 75 Hz, 85 Hz, or 90 Hz; or 16 or 32 bit planes. If there are differences they are marginal, and as I mentioned, we're talking night and day here!

                              Louis

