
Explanation of Millennium - G200 era

  • Explanation of Millennium - G200 era

    I jumped on the Matrox bandwagon with the G200. Their previous cards are a bit unknown to me (I wasn't following the hardware scene at that time).

    So could someone please briefly sum up each card's release date, features, drawbacks, and how one led to the next?

    One of the reasons I'm posting this is that I won a Millennium in an online auction for $4.65. It's a decent secondary PCI card and should be great if/when I turn this machine into a server.

    If I'm correct:

    Millennium (when?)

    Mystique (same core as the Millennium? (Why) was the Mystique worse than the Millennium?)

    Millennium II

    The infamous M3D

    Productiva G100 (Which core?)

    Millennium G200 (Had it, it died, and I replaced it with a G400. The 3D difference was that OMG feeling - quite substantial.)
    Last edited by UtwigMU; 6 March 2003, 07:13.

  • #2
    this should help: http://www.matrox.com/mga/archive_st...ct_history.cfm
    OFFICIAL EX-EMPLOYEE

    "So now I'm dreaming
    For myself I'm understanding
    Performing there, one hundred thousand fans would gather one and all
    And so decided, we could rule it all if we should
    Dance all away across the greatest city in the nether world..."

    - Central Park 09/24/03



    • #3
      Re: Explanation of Millennium - G200 era

      Briefly, and from memory (I don't have dates):

      Originally posted by UtwigMU
      Millennium

      MGA2064W. The first card to use WRAM (the first from Matrox, anyway). Could be configured with 2 to 8MB of memory. This was a very fast VGA and Windows card, as opposed to previous Matrox cards, which were very fast under Windows but horrifically slow as plain VGAs. The Millennium took over from the Tseng ET4000 as the card to have for games and scene demos since it was so fast. It also had some basic 3D abilities (lines, Gouraud shading), but I don't think they ever got used.

      Mystique

      MGA1064. Matrox's 3D gaming card. At the time no one thought much of it, but these days it's probably the only pre-Voodoo consumer 3D card you can talk about with a straight face. Had 2 or 4MB of memory. There was also an MGA1164 version which bumped the RAMDAC from 170MHz to 220MHz.

      Millennium II

      MGA2164W. Like the name suggests, a new improved Millennium. More memory (up to 16MB), faster RAMDAC, etc. This was a very very fast card for Windows (apparently even faster than the G200). Supposedly it also had texture mapping on top of the Millennium's 3D features, but again nothing made use of it.

      The infamous M3D

      The only Matrox card using someone else's chip (a NEC PowerVR, ancestor of the Kyro). Suffice to say, they're probably not going to do that again in a hurry.

      Productiva G100 (Which core?)

      Basically a G200 without the WARP engine, I think. With this card Matrox abandoned WRAM and went with SDRAM/SGRAM. Not sure what you're asking by "which core?"

      Millennium G200

      Matrox' next attempt at a gaming-oriented chip after the Mystique. Actually it did pretty well - I seem to recall it held the speed crown for about...oooh....three weeks or so, while still looking better than everything else, of course.
      Blah blah blah nick blah blah confusion, blah blah blah blah frog.



      • #4
        So the Millennium came out in 95/96? When exactly?

        I recall CADdie (a member of the Ars forums) boasting about running AutoCAD in '95 or '96 with dual 21s and two Millenniums @ 2x1200x1600 on a P90.



        • #5
          The Millennium must have come out early in 1995. I remember it was around before Win95 was.
          Blah blah blah nick blah blah confusion, blah blah blah blah frog.



          • #6
            Re: Re: Explanation of Millennium - G200 era

            Originally posted by Ribbit
            Briefly and from memory (I don't have dates),
            Millennium II

            MGA2164W. Like the name suggests, a new improved Millennium. More memory (up to 16MB), faster RAMDAC, etc. This was a very very fast card for Windows (apparently even faster than the G200). Supposedly it also had texture mapping on top of the Millennium's 3D features, but again nothing made use of it.
            Ahem... I had a Millennium II way back, and it supported Direct3D just fine, thank you very much. No bilinear filtering, though; it worked just fine for most D3D games of the day.



            • #7
              Thanks for clearing that up.

              As a defence against further nitpicking I'll also mention that the Mystique didn't use WRAM as I might have implied.
              Blah blah blah nick blah blah confusion, blah blah blah blah frog.



              • #8

                These are my cards. You can find the Millennium, Mystique, Millennium II, G100, G200, and so on.
                PC:Intel P4 3G |Intel D875PBZ|Geil PC3200 256MB Golden Dragon x 2| matrox Parhelia-512 R 128MB|Creative SB! Audigy2 Platinum|Seagate Barracuda 7200.7 SATA 120GB x 2 Raid0|WesternDigital WDC WD1200JB-00EVA0|LG 795FT Plus|LG HL-DT-ST RWDVD GCC-4480B|LG HL-DT-ST CD-ROM GCR-8523B|LGIM-ML980|LGIM-K868|SF-420TS
                DataCenter:Intel PIII 450|Intel VC820|Samsung RDRAM PC800 256MB x 2|matrox Millennium G450 DualHead SGRAM 32MB|Adaptec 2940UW|NEC USB2.0 Extend Card|Intel pro100 82557|Samsung Floppy Disk|Fujitsu MAN3367MP|Seagate Barracuda ST136475LW|IBM DTLA-307030|Sony CU5221|SevenTeam ST-420SLP|LGIM-ML980|LGIM-K868



                • #9
                  Ahh...the G200, the fastest for a week or two....but the fecking best-looking output for at least 6 months.....well, except OpenGL....and that got trippy with the patented rainbow effect.



                  • #10
                    It was the only time you could get the best of both worlds:

                    G200 + Voodoo2 SLI

                    Best 2D, best 3D. 'Nuff said.



                    • #11
                      Lol Kurt.
                      I had that, too. Pricey as HELL tho!



                      ~~DukeP~~



                      • #12
                        Yeah, and that's without factoring in the monitor with dual inputs (BNC + HD15).
