Matrox's IQ deserves its reputation!

  • #1

    I've currently got three desktops at my new workplace. I use one keyboard/mouse/monitor for all three, via a Belkin KVM switch.

    I have the following graphics cards:

    -Integrated ATI stuff (old)
    -Matrox G450
    -Quadro 4 900XGL

    As I'm using the exact same monitor, it's really easy to compare image quality (one push of a button). I can tell you my desktop looks HORRIBLE on the ATI and the Quadro. I'd heard people say the GeForce/Quadro 4 had good IQ and that these problems were a thing of the past, but from what I see that's not true at all. The text is blurry as hell and the colors washed out. It gets worse as the resolution increases, but I'm still only at 1280x1024, for crying out loud.
    I've had coworkers tell me what they think, and most find the G450's output on the Trinitron monitor looks as sharp as an LCD.

    Kudos Matrox!

  • #2
    Keep in mind a KVM switch will degrade the IQ.
    Not sure how this affects your test though.
    P4 Northwood 1.8GHz@2.7GHz 1.65V Albatron PX845PEV Pro
    Running two Dell 2005FPW 20" Widescreen LCD
    And of course, Matrox Parhelia | My Matrox history: Mill-I, Mill-II, Mystique, G400, Parhelia

  • #3
    Well, if it degrades it, it degrades it equally for all cards!

  • #4
    No, actually, it doesn't.
    Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.

  • #5
    Care to be more explicit?

  • #6
    Also... old ATI integrated stuff was absolutely horrible...
    "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz

  • #7
    They don't actually degrade equally; it all depends on a few factors: the output impedance, drive strength, etc. But the whole idea is that when they validate the card, they take the signal quality under a few different circumstances into account and modify the design to work best all around.

    I agree the output quality on the Parhelia is amazing. The guys who designed and validated the output stages are, well, geniuses!!
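
    As a rough illustration of the output-impedance point, here's a minimal sketch of the standard reflection-coefficient formula; 75 ohms is the nominal line impedance of analog VGA, and the mismatched load values are invented purely for illustration:

    ```python
    # Fraction of a signal edge that reflects at an impedance discontinuity.
    # The mismatched load values below are made up for illustration only.
    def reflection_coefficient(z_load: float, z_line: float = 75.0) -> float:
        """Gamma = (ZL - Z0) / (ZL + Z0) for a line of impedance Z0."""
        return (z_load - z_line) / (z_load + z_line)

    for zl in (75.0, 60.0, 100.0):
        gamma = reflection_coefficient(zl)
        print(f"ZL = {zl:5.1f} ohm -> {abs(gamma) * 100:4.1f}% of the edge reflects")
    ```

    Reflected energy shows up as ghosting and ringing, so how well a card's output stage matches the line (plus whatever the switch adds) decides how bad it looks.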

  • #8
    You should try comparing it with the original Radeons and the later 9700 etc. You might find that there wouldn't be much difference. Actually I preferred my Radeon to either of my two G400s whilst running at 1280x1024 and above.

    Regards Michael
    Interests include:
    Computing, Reading, Pubs, Restaurants, Pubs, Curries, More Pubs and more Curries

  • #9
    Unfortunately that's not an option right now, but you're right; I'd be curious to see the quality of the Radeons.

  • #10
    Originally posted by mdhome
    You should try comparing it with the original Radeons and the later 9700 etc. You might find that there wouldn't be much difference. Actually I preferred my Radeon to either of my two G400s whilst running at 1280x1024 and above.

    Regards Michael
    Got to disagree there; both my G400 and G550 whip my Radeon 8500 and 9700 Pro's ass in 2D. 3D is a different kettle of fish, of course.
    Chief Lemon Buyer no more Linux sucks but not as much
    Weather nut and sad git.

    My Weather Page

  • #11
    Originally posted by OasisPlaces
    I agree the output quality on the Parhelia is amazing. The guys who designed and validated the output stages are, well, geniuses!!
    Rarely do businesses get to the top by genius alone... a much larger portion comes down to how anal the engineers were when designing it...
    "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz

  • #12
    Well, then praise needs to go to Matrox's "Anal Engineers"
    DM says: Crunch with Matrox Users@ClimatePrediction.net

  • #13
    Originally posted by ElDonAntonio
    Care to be more explicit?
    I could be. The KVM switch is an RLC box, so it can be modeled as an impedance function for whatever signals it passes. The different video cards all put out different video signals - different rise & fall times, etc. The KVM will have much stronger effects on some signals than others.
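
    A minimal sketch of that idea, with purely illustrative component values: treat the switch as a single first-order RC pole (the real thing is a messier RLC network) and combine rise times with the usual root-sum-of-squares rule:

    ```python
    import math

    # Treat the KVM switch as one first-order RC pole - a crude stand-in
    # for its real RLC impedance function. R and C are illustrative guesses.
    R = 75.0       # ohms: nominal analog VGA line impedance
    C = 30e-12     # farads: assumed parasitic capacitance added by the switch

    tau = R * C            # time constant of the extra pole
    t_switch = 2.2 * tau   # the pole's own 10-90% rise time (~5 ns here)

    def rise_through_switch(t_card: float) -> float:
        """Combine the card's edge with the switch's (root-sum-of-squares)."""
        return math.sqrt(t_card**2 + t_switch**2)

    # Two hypothetical cards: one with fast DAC edges, one with slow ones.
    for name, t in [("fast-edge card", 1.0e-9), ("slow-edge card", 4.0e-9)]:
        out = rise_through_switch(t)
        print(f"{name}: {t * 1e9:.1f} ns -> {out * 1e9:.1f} ns ({out / t:.1f}x slower)")
    ```

    The switch adds a roughly fixed penalty, which hits a clean, fast edge proportionally much harder than an already-slow one - hence cards don't degrade equally.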
    Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.

  • #14
    And any signal-quality test done with such a switch in the path is IMO completely worthless, as I've yet to see a switchbox that doesn't give a damn lousy picture at high resolutions (e.g. the popular 1600x1200@85Hz).
    My G400 gives a really crappy image at 1600x1200 through an electronic switch but looks fine without it. Same goes for my Radeon, so I've abandoned those switches and use monitors / TFTs with two inputs instead.
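
    For a sense of why high resolutions are so unforgiving, here's a back-of-the-envelope pixel-clock estimate (the 1.35x blanking overhead is an assumption; exact GTF/VESA timings differ a little):

    ```python
    # Rough pixel clock for 1600x1200@85Hz: each pixel lasts only a few
    # nanoseconds, so extra capacitance in a switchbox smears every pixel
    # into its neighbours.
    h_active, v_active, refresh_hz = 1600, 1200, 85
    blanking_overhead = 1.35   # assumed ratio of total to active pixels

    pixel_clock_hz = h_active * v_active * refresh_hz * blanking_overhead
    pixel_time_ns = 1e9 / pixel_clock_hz

    print(f"pixel clock ~ {pixel_clock_hz / 1e6:.0f} MHz")  # ~220 MHz
    print(f"one pixel   ~ {pixel_time_ns:.2f} ns")          # ~4.5 ns
    ```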
    But we named the *dog* Indiana...
    My System
    2nd System (not for Windows lovers )
    German ATI-forum

  • #15
    I know this fact (you MURCers told me before), but why is that?
    Isn't a KVM switch just a switch of wires?
    Or is there any processing inside?
    P4 Northwood 1.8GHz@2.7GHz 1.65V Albatron PX845PEV Pro
    Running two Dell 2005FPW 20" Widescreen LCD
    And of course, Matrox Parhelia | My Matrox history: Mill-I, Mill-II, Mystique, G400, Parhelia
