G400/Max Mod?


  • G400/Max Mod?

    There have been reports of poor 2D image quality on the new GeForce MX cards. Someone posted a hardware modification at http://www.geocities.com/porotuner/imagequality.html which bypasses the card's low-pass filter. It got me wondering if a similar mod would improve my G400 Max's 2D quality at high resolutions (1600x1200). It's not bad, but even my lowly Millennium I outperforms the G400 (at least on my system).

    Has anyone ever tried it?
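To put rough numbers on why 1600x1200 is a stress test for the card's analog output stage, here is a quick sketch; the 1.35x blanking-overhead factor is an assumed rule of thumb, not a figure from this thread:

```python
# Rough estimate of the pixel clock (and hence analog bandwidth)
# a video mode demands. The 1.35x blanking-overhead factor is an
# assumption, not a measured value.

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.35):
    """Approximate pixel clock in MHz, including blanking intervals."""
    return width * height * refresh_hz * blanking_overhead / 1e6

print(f"~{pixel_clock_mhz(1600, 1200, 85):.0f} MHz")  # ~220 MHz
```

A low-pass filter whose cutoff sits near that pixel clock will soften single-pixel detail such as text edges, which is the kind of difference described above.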

  • #2
    It's theoretically impossible for a Millennium I to be better in 2D than the G400 MAX. There's something wrong with your G400, or else why would you need to improve the image quality of the best-in-its-class 2D card in the universe?



    • #3
      Impossible? Why? The Mill-I has an external RAMDAC, the G400's is internal.

      And look elsewhere on this website, I'm not the only one who sees a difference.



      • #4
        What does that have to do with it? If anything, having the DAC internal would help prevent noise from affecting it.

        There is no way a Mill 1 beats any G400!
        "Be who you are and say what you feel, because those who mind don't matter, and those who matter don't mind." -- Dr. Seuss

        "Always do good. It will gratify some and astonish the rest." ~Mark Twain



        • #5
          Thanks for the replies, guys. Has conventional wisdom changed over the last year or two? I doubt it. This is from: http://www6.tomshardware.com/guides/graphic/index1.html

          "On the video card side, the RAM DAC is the part that is responsible to send the data for a decent picture to the monitor. Two factors are important, the quality of the RAM DAC, e.g. is it stand alone or integrated into the video chipset, and the max. pixel frequency, measured in MHz. A 220 MHz RAM DAC is not neccessarily but most likely better than a 135 MHz one and it certainly offers higher refresh rates - will tell you why further down on this page. RAM DACs tend to be included into the graphic chips more and more now, since it can decrease costs of graphic cards considerably and the quality of modern internal RAM DACs is coming close to the external ones."

          There are other sites that discuss the filtering that is necessary when using internal RAMDACs, which results in degraded signals. Let me know if you're interested in learning about it.
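The link the quote draws between RAMDAC clock and refresh rate can be sketched directly; the 1.35x blanking overhead below is an assumed rule of thumb, not a figure from the quoted article:

```python
# Why a faster RAMDAC allows higher refresh rates: the DAC must emit
# one pixel per clock, and blanking intervals add overhead on top of
# the visible pixels. The 1.35x overhead factor is an assumption.

def max_refresh_hz(ramdac_mhz, width, height, blanking_overhead=1.35):
    """Approximate ceiling on refresh rate for a given RAMDAC clock."""
    return ramdac_mhz * 1e6 / (width * height * blanking_overhead)

# At 1600x1200, a 220 MHz RAMDAC tops out near 85 Hz,
# while a 135 MHz part manages only about 52 Hz.
print(round(max_refresh_hz(220, 1600, 1200)))  # ~85
print(round(max_refresh_hz(135, 1600, 1200)))  # ~52
```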



          • #6
            FrankDC, who are you? Be careful with how you address people you don't know. I'm a Matrox beta tester and can tell you point blank that you're wrong (not entirely, just when it comes to Matrox).

            What monitor are you comparing them on?



            • #7
              Sorry, did I address someone improperly?

              If you can make a statement that internal RAMDACs are better than external ones (Matrox or otherwise), you should be able to cite references for your claim. What I'm telling you is that this is not the case, at least from everything I've ever seen or read.

              BTW, the monitor is a new 22" Mitsubishi 2040U. As I said earlier, the quality is not bad on the G400 Max; it's simply a little better with the Mill-I. Text is slightly sharper and graphics are slightly clearer/brighter.



              • #8
                Is that with a BNC cable?

                I don't have to prove it; it's well documented. While external DACs were a prominent chip in the past, all manufacturers will integrate them in the future. This saves board space and complexity = $$$ on the final product. If the identical circuit were integrated, why should its quality drop?

                The external filters on Matrox cards were designed specifically for them, and you want to modify yours based on a bunch of old, non-applicable hearsay? I do know why this mod worked on other cards: most of these mods copy Matrox's (old) filter design part for part and apply it to nVidiot cards... it works one way, not the other.



                • #9
                  Yes, the monitor has dual inputs, so I tried swapping the D-sub and BNC between cards. Same results.

                  I didn't intend this to become a debate over RAMDACs. I was hoping someone had tried bypassing the G400's low-pass filter, to see whether its 360 MHz DAC was being throttled over possible RFI concerns and whether 2D quality improved at high resolutions.
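For anyone curious what bypassing the filter actually changes: the output filter behaves roughly like a low-pass pole, and its -3 dB cutoff follows the standard RC formula. The component values below are purely hypothetical illustrations, not measured G400 values:

```python
import math

# Sketch of what an output low-pass filter does. Component values
# here are hypothetical; the real filter network on the card is
# more complex than a single RC pole.

def rc_cutoff_mhz(r_ohms, c_farads):
    """-3 dB cutoff frequency of a first-order RC low-pass, in MHz."""
    return 1.0 / (2 * math.pi * r_ohms * c_farads) / 1e6

# With the standard 75-ohm video impedance and an assumed 10 pF of
# shunt capacitance, the cutoff lands right around a high-resolution
# pixel clock, so fine single-pixel detail gets attenuated:
print(f"{rc_cutoff_mhz(75, 10e-12):.0f} MHz")  # ~212 MHz
```

Removing shunt capacitance raises the cutoff, letting more of the pixel clock's harmonics through (sharper edges) at the cost of more radiated emissions, which is the RFI/FCC trade-off raised in this thread.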



                  • #10
                    Good thread.

                    I thought at the time, when I changed from a Mill 2 AGP to my G200, that the 2D wasn't quite as sharp. I put it down to my mobo not giving enough power to the AGP slot (back in the LX days, when TNT cards wouldn't run on 50% of the motherboards out there).

                    I must say the G400 is very sharp, but I haven't got the Millennium 2 anymore, so I can't compare.

                    Is it possible that the old RAMDAC gave a lighter colour, or something? Maybe it's just a personal preference between the two.

                    Would it be a difference between the VQC and VQC2?

                    I don't think it's fair or nice to say 'you are wrong!' without looking into the issue a bit more.

                    Has anybody worked out what that orange wire on the G450 is for? Maybe there is some interference with the internal RAMDAC, and that earth fixes it. Some reviews do say the G450 has better output than the G400.

                    Ali



                    • #11
                      Frank,

                      You stated that your Mill 1 OUTPERFORMS a G400.

                      Wrong.

                      Then, you cited as a reference Tom "I can't tell my ass from my elbow" Pabst, who will gladly repeat anything anyone with money tells him, no matter how wrong it is.

                      Lots of us have (or have had) both cards. Lots of us can tell the difference. Lots of us say you're wrong. Perhaps you have a bum G400?

                      - Gurm

                      ------------------
                      Listen up, you primitive screwheads! See this? This is my BOOMSTICK! Etc. etc.
                      The Internet - where men are men, women are men, and teenage girls are FBI agents!

                      I'm the least you could do
                      If only life were as easy as you
                      I'm the least you could do, oh yeah
                      If only life were as easy as you
                      I would still get screwed



                      • #12
                        Gurm, I don't mean to disagree with you, but your response was a bit extreme.

                        See my points here:
                        Back in the Millennium/Mystique era, people always regarded the Millennium as having better image quality than the Mystique, thanks to its external, high-quality TI RAMDAC. The Mystique used an internal RAMDAC.

                        When Matrox first announced that the G200 would have an integrated RAMDAC, some people were disappointed, thinking the G200's image quality would fall to the Mystique's level.

                        Today, the G400 uses an integrated RAMDAC. In fact, I can hardly think of any graphics chip without an integrated RAMDAC now.

                        Noise is not a significant issue with an external RAMDAC. The graphics chip communicates with the external RAMDAC digitally; it's the analog signals generated by the RAMDAC and output to the monitor that are most susceptible to noise. Is today's integrated Matrox RAMDAC better than yesterday's high-quality TI RAMDAC? I don't know. In fact, you can call me blind, as I can't really tell slight differences in 2D image quality apart. I can't afford a high-quality, big monitor either.

                        Ali's response seems more reasonable.

                        And FrankDC, I have read about such a modification to the old Matrox Mystique too, but I can't remember the URL, or whether it even exists now. But AFAIK, if you perform such a modification, your hardware will no longer be compliant with FCC regulations. Is it worth performing? Will you see a heaven-and-earth difference? I don't know.

                        KJ Liew



                        • #13
                          Don't try to use the G200 against the G400 on the RAMDAC for 2D quality. I just checked it, using the newest drivers. The surprising part is that my 8MB G200 beat my 32MB G400 in 2D, despite its slower RAMDAC. 3D textures, however, were a large change: the G400 creamed the G200. I think I might have a bum G400, though; I don't have OpenGL working for some reason, while the G200's OGL is working. The computer was the same for the testing. Also, my G400 is benching too slow, but that's more the 3D side of the house. The 2D is different.



                          • #14
                            Airsen, I have a G400 Max and a G200 in two different machines running a common monitor (Cornerstone 50/95 21") and can say point blank that the G400's 2D is superior to the G200's. Anyone wishing to make a comparison should run a resolution of 1280x960 as a minimum (and at 32-bit color depth) to see the difference. I'm at 1600x1200 (my normal desktop resolution).



                            • #15
                              I owned a G200 up to a month ago (now I have a G400) and I noticed a difference mostly in color depth at 1024x768 32-bit 75Hz, which is the resolution that I work in. The G400 is slightly better. And my SONY HMD-A200 TRINITRON monitor is good at telling the differences between cards.

                              I saw some tests somewhere, and in 2D speed the G400 beats the G200 too, but by a small margin.
