Geforce2 mx kopykat


  • Geforce2 mx kopykat

    Two months after introducing its latest high-end graphics processor, the GeForce2 GTS, Nvidia today unveiled a chip designed for the mainstream home and business markets. The GeForce2 MX brings performance at the level of last year's star, the GeForce 256 SDR, down to a starting retail price of about US$120.

    With the GeForce2 MX, Nvidia is making its most serious move into the mainstream home and business markets to date, and it has already announced deals with NEC and Fujitsu Siemens to have the chip integrated into those companies' PCs. Important features for business-oriented users include dual-monitor and digital flat-panel support. Nvidia's new TwinView dual-monitor feature pushes resolutions up to 2048 x 1536 at 60Hz on the primary display and 1600 x 1200 at 100Hz on the second. Nvidia is also introducing a feature called Digital Vibrance Control, which is designed to make colors more brilliant in 2D applications, 3D games, and video.

    In terms of performance, the MX includes most of the generational innovations of the GeForce2 GTS. The 175MHz processor features two pixel pipelines (compared to the GTS's four), both with dual texturing. This yields 700 megatexels per second of texture-rendering power, placing it between the GeForce 256 (480 megatexels) and the GeForce2 GTS (1600 megatexels).
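    A minimal sketch of that fill-rate arithmetic, using only the figures quoted in the article (core clock, pipeline count, textures per pipeline):

        # Peak texel fill rate = core clock x pixel pipelines x textures per pipeline.
        core_clock_mhz = 175      # GeForce2 MX core clock (from the article)
        pixel_pipelines = 2       # the GTS has four
        textures_per_pipe = 2     # dual texturing on each pipeline

        fill_rate_megatexels = core_clock_mhz * pixel_pipelines * textures_per_pipe
        print(fill_rate_megatexels)  # 700, vs. 480 (GeForce 256) and 1600 (GeForce2 GTS)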

    One of the steps Nvidia has taken to keep down the MX's price, however, is to pair it with the slower memory architecture of the GeForce 256 SDR. With the 128-bit memory bus running at 166MHz, the MX has exactly the same memory bandwidth as that earlier card. Memory bandwidth, the speed of the critical path feeding the graphics processor with texture data, is a key factor in determining performance in the current generation of graphics cards.
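    As a rough sketch of the numbers behind that claim (assuming SDR memory moves one transfer per clock):

        # Peak memory bandwidth = bus width in bytes x memory clock x transfers per clock.
        bus_width_bits = 128
        memory_clock_mhz = 166
        transfers_per_clock = 1   # SDR: one transfer per clock cycle

        bandwidth_mb_s = (bus_width_bits / 8) * memory_clock_mhz * transfers_per_clock
        print(bandwidth_mb_s)     # 2656.0 MB/s, the same figure as the GeForce 256 SDR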

    The lower-powered MX design will allow Nvidia to use the chip for new applications. The MX generates only four watts of heat - half the level generated by the GTS and one-quarter the level generated by the GeForce 256 - making it a natural candidate for notebooks. Nvidia is confident that the chip won't even need a heatsink in desktop systems. The GeForce2 MX is also the first Nvidia component to natively support Macintosh color formats. Nvidia representatives asserted that the company is not planning on introducing a Mac product without Apple's backing, but that native hardware and software support is a necessary first step towards that market.


    Read the text and you'll see it has TwinView (in other words, DualHead) and the vibrant-color thing (Matrox has that too).

    What's next... the GeForce 3??

  • #2
    I'm surprised that it has taken until now for any of Matrox's rivals to offer that feature; Matrox was well ahead of its time when they presented their masterpiece, the G400.
    And they've probably made many, many people green with jealousy, given that people still today persist in comparing the G400 with the GeForce2! Comparing the G400 with the TNT2 and other 3D processors of the same age would be fair.

    ------------------
    My system:
    • Athlon 700 MHz
    • 192MB RAM
    • Asus K7V mb
    • Matrox G400 (oc @ 150/200)
    • Western 10.2GB and Seagate 6.5GB (removable) HDDs
    • Soundcrapster PCI64 (soon Aureal A3D)
    • Logitech Force Feedback wheel (cool)
    • 300W ATX case




    • #3
      The reason it gets compared to the G400 is, well... let's see, what is Matrox's latest card on the market? Hmmm, not the G800 or the G1600. It's the G400. You should be yelling at Matrox to get off their arses and make something competitive more than once every few years.
      Asus K7V
      Athlon 700
      128mb PC133 HSDRAM
      Matrox Millennium g400max
      Adaptec 2940U2W
      IBM 9gb U2W
      Plextor 8/20 cdr
      Diamond MX300
      3com 905b-tx



      • #4
        Forget that $120 price tag; for the dual-head version it's $200.



        • #5
          I actually think it is refreshing for a developer in such a fast-moving part of the industry to be taking their time. I have no complaints about the fact that Matrox hasn't actually released anything after the G400. I am just going to sit back, wait to see what new thing they bring to the market, and buy the card when it arrives.

          Athlon 1GHz
          Abit KT7A
          512MB RAM
          Kyro 4500



          • #6
            mj12 - Actually, the G400 is more than just competitive. It is in a class by itself: clearly the best card on the market, unless you are in the niche of users who need the most fps you can get. (Actually, no one needs more fps than the MAX provides; they just think they do.)

            IMHO



            • #7
              It was just another card to me; without dual head, it's basically a better TNT2. Hardly in a class by itself; let's not get too carried away with the brand-name loyalty.

              As for taking their time, well, I don't think they have that luxury, if they ever did; NVIDIA's goal is total market domination à la Intel. I see the GeForce2 MX as a clear indication that Matrox is directly in their line of sight as a target.



              • #8
                Matrox is known for its technology and innovation; only now is Nvidia starting to copy what Matrox achieved a year ago.
                My point is that no matter what Nvidia and the others come up with, you can always count on Matrox to develop something different and innovative that will take the competition years to match.

                Cheers,
                Elie



                • #9
                  From what I've read, the G450 should "spank" the GF2 MX, strictly because of Nvidia's cheapo approach of strangling the memory.
                  jim

                  ------------------
                  Abit BE6-2 with P3 700(cb0) @ 7*143=1002 using an MC-1000!
                  15.3 gb Maxtor ata66 and Twinmos 128mb pc-133 ram
                  G400 vanilla 32mb @ 168/210 @ 2x
                  Sblive with Altec Lansing speaker combo
                  384k DSL and Realtek nic
                  Windows 98se with DX7a
                  Worn out reset button :O)





                  • #10
                    To be fair, the G400 cannot be compared to a GeForce in terms of 3D performance. In most cases the G400 hangs around with the TNT2 Ultra, but it has other features that make us users of it not want to upgrade (DualHead especially).

                    The GeForce2 MX's Digital Vibrance Control is NOT the same thing as Matrox's VCQ2. The Matrox feature means 32-bit color calculation internally regardless of the final color depth, while the Nvidia feature is nothing more than a glorified gamma control.



                    • #11
                      Erm, from what I read, the MX can use 64-bit DDR, just like the G450; not sure how it could be choking one card and not the other.
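                      A quick sketch of why those two configurations come out even (assuming equal memory clocks; the 166MHz figure is from the article above):

                          # 64-bit DDR moves two transfers per clock, so at the same
                          # clock it matches 128-bit SDR for peak bandwidth.
                          def peak_bandwidth_mb_s(bus_bits, clock_mhz, transfers_per_clock):
                              return (bus_bits / 8) * clock_mhz * transfers_per_clock

                          print(peak_bandwidth_mb_s(128, 166, 1))  # 128-bit SDR: 2656.0 MB/s
                          print(peak_bandwidth_mb_s(64, 166, 2))   # 64-bit DDR:  2656.0 MB/s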



                      • #12
                        Matrox was not the first company to put dual displays on a single card; Appian Graphics had it before them. So, judging by the attitude of the original poster, Matrox copied it from them. Arguments about who is copying whom are so childish.



                        • #13
                          OK, I agree with the last post, but Nvidia took the dual-head thing because it became popular with the G400. No risk there, because people already like it.

                          Like 3dfx, they're afraid of implementing new technology.



                          • #14
                            I'm sure Matrox will remain innovative and will put out a few surprises; the worry is that they are getting slower and slower at getting these things out onto the market. Let's face it, the G450 is not much more than a G400 with a fresh lick of paint and should have been out well before now.



                            • #15
                              mj12, I'm sure I don't know what you mean; haven't Matrox just announced a new chipset, the G450?

