Geforce 3 Image quality


  • Geforce 3 Image quality

    I offer the following for your perusal. I tested the following cards myself.

    I know some of you may be contemplating buying a GF3 in light of recent comments that the image quality on GF3 cards is vastly improved.

    Well, you can see what I found out below.

    I set up a G450 AGP 32MB (bulk), an ATI Radeon 64MB DDR AGP (bulk) and an Elsa
    Gladiac 920 AGP (retail) inside one computer (an Iwill KK-266R based machine
    with an Enermax 450W PSU), one by one, each with the latest reference drivers
    running under Win98.

    I then asked three of my colleagues (who were not aware of which
    card was inside the machine) to judge the image quality by writing
    down, on a sheet of paper, a value ranking each card relative to
    the others. Furthermore, they were prompted to write down
    qualitative words describing the image they saw from each card.

    They used a combination of bitmap graphics, black text on a white
    background (and vice versa), a color/hue purity test pattern and
    crosshatch & line convergence patterns to evaluate the image quality.

    On all of the systems, a hardware-calibrated Mitsubishi Diamond Pro 2040u
    was used as the display device. The display was connected with a
    double-shielded, 75 Ohm terminated UltraVGA cable with BNC connectors.

    Cards were manually uninstalled and installed with complete removal
    of drivers and registry entries, along with any accompanying software.
    Display settings were scaled to fill the whole viewable area at
    both tested resolutions.

    The jurors were allowed to change their preferred color temperature
    from the default 6500K during the evaluation - they were not allowed
    to touch any monitor controls other than color temperature, brightness
    and contrast (remember, the display is calibrated for maximal accuracy
    when fed a perfect signal from a signal generator).
    Everybody started the evaluation at 6500K and was
    advised to concentrate on that temperature if unsure. Gamma was at 2.2.
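
    (Aside, not part of the test procedure: for anyone wondering what the 2.2
    figure means, here is a minimal sketch, assuming the usual power-law display
    model, of how a gamma-2.2 monitor maps signal level to light output.)

    /* gamma.c - illustration only: with a 2.2 display gamma, luminance is
       roughly the normalized input signal raised to the power 2.2, so a
       50% grey signal yields only about 22% of full light output. */
    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        double signal = 0.5;                 /* normalized video level, 0..1 */
        double luminance = pow(signal, 2.2); /* relative light output        */
        printf("signal %.2f -> luminance %.2f\n", signal, luminance);
        return 0;
    }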

    They were presented with the cards in the following order, with
    display settings at 1600x1200@85Hz and 1280x1024@100Hz (in
    that sequence; a rough pixel-clock sketch for these two modes
    follows the card list):

    A. G450
    B. Elsa 920 GF3
    C. Radeon 64 DDR
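
    (A back-of-the-envelope sketch, assuming roughly 35% blanking overhead as in
    typical VESA-style timings - the real modelines will differ - of the pixel
    clocks these two modes demand. It is at these clock rates that a weak analog
    output stage or over-aggressive output filtering starts to show.)

    /* pixclock.c - rough pixel-clock estimate for the two tested modes */
    #include <stdio.h>

    static double pixel_clock_mhz(int width, int height, double refresh_hz)
    {
        const double blanking_overhead = 1.35; /* assumed, not measured */
        return width * height * refresh_hz * blanking_overhead / 1e6;
    }

    int main(void)
    {
        printf("1600x1200@85Hz  ~ %.0f MHz\n", pixel_clock_mhz(1600, 1200, 85.0));
        printf("1280x1024@100Hz ~ %.0f MHz\n", pixel_clock_mhz(1280, 1024, 100.0));
        return 0;                              /* prints ~220 MHz and ~177 MHz */
    }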

    The jurors did not see each other's markings, nor did they discuss
    the image quality with each other or see each other's evaluations
    before the test ended.

    The results are as follows:

    1. G450 (2 highest ratings, 1 2nd highest rating)
    Adjectives used: "sharp", "steady", "vibrant", "[color] pure", "contrasty"

    2. Radeon 64 DDR (1 highest rating, 2 2nd highest ratings)
    - "sharp", "steady", "not as contrasty as A", "clearly better than
    B", "slight lightness ghosting at 1600x1200 with black on white"

    3. Elsa GF3 (3 3rd highest ratings)
    - "unfocused", "color fringing", "soft", "color focus problems e.g. with
    blue and yellow transition", "less contrasty than A", "wavy image"
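
    (One simple way to tabulate those rankings - a sketch only, since the jurors
    wrote free-form relative values rather than using this exact point scheme:
    3 points for a card's best rating, 2 for second, 1 for third, summed per card.)

    /* tally.c - aggregate the three jurors' rankings reported above */
    #include <stdio.h>

    int main(void)
    {
        const char *cards[3] = { "A: G450", "B: Elsa GF3", "C: Radeon 64 DDR" };
        /* ranks[juror][card]: 1 = best, 3 = worst, as reported above */
        int ranks[3][3] = {
            { 1, 3, 2 },   /* juror 1 */
            { 1, 3, 2 },   /* juror 2 */
            { 2, 3, 1 },   /* juror 3 */
        };
        int c, j;

        for (c = 0; c < 3; c++) {
            int points = 0;
            for (j = 0; j < 3; j++)
                points += 4 - ranks[j][c];   /* 1st -> 3 pts, 3rd -> 1 pt */
            printf("%-18s %d points\n", cards[c], points);  /* A:8, B:3, C:7 */
        }
        return 0;
    }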

    After the test I put the Radeon in one computer (with a Hitachi 823F 21"),
    the G450 in another (Hitachi CM772 19"), and the Elsa 920 in a third
    (Mitsubishi 2040u 22") and looked at them side by side. The 2040u is
    hardware calibrated, the others only in software.

    At this point jurors were allowed to give additional comments
    on the quality of the images (they still didn't know which machine
    had which card). Monitors were different, but they were side by side.

    People now had more trouble discerning between A and C (G450 and Radeon)
    especially at 1280x1024@100Hz.

    People commented on how much softer B (Elsa) now looked (more so at
    1600x1200). One person said that the image of B at 1280x1024@100Hz
    was fuzzier than the image of A or C at 1600x1200@85Hz, although
    he thought that A and C were "pin-sharp" at 1280x1024@100Hz.

    One juror said that looking at the slightly fuzzier image of B (Elsa)
    was pleasant with graphics, but not when reading small text or details.

    Two people also noted that A (G450, running on the 19" Hitachi) seemed a bit
    unsharp compared to the original test (when it was running on the 2040u)
    when the brightness was increased and the resolution was 1600x1200@85Hz.

    In conclusion, they all felt that if they had to pick one, they
    would pick either A or C for their own displays. Only one
    person said he was absolutely sure about picking A (G450) over C (Radeon).
    All of them thought B was too soft to be used at higher resolutions
    and when looking at small details.

    What does this tell you - if anything?

    I leave that for you to discuss. I won't participate in
    the discussion anymore. I think I've contributed more useful data
    than anybody else in this thread and if you don't like it, you can
    do your own double blind studies. Best of luck in getting them published
    (perhaps you can include them in your PhD studies).

    I'm through with this and I know what card to put inside my computer
    to satisfy my needs and my eyesight. This took half of my working day and
    I need to catch up.

    cheers,
    Halcyon

    PS I'm still not claiming one card is better than another in all
    respects. I was only trying to find an answer to the question:
    "Which card has the best image quality, especially for
    doing a lot of text work at high resolutions and high refresh
    rates, while still not sacrificing 3D speed completely?" I've
    found my answer - I hope you'll find yours too.



  • #2
    Originally posted by Halcyon:
    ... I think I've contributed more useful data
    than anybody else in this thread and if you don't like it, you can
    do your own double blind studies. ...
    Considering yours is the only post in this thread up to this point, I would wholeheartedly agree with you. Is this a copy of a post from another forum?

    Nice testing. I think the results agree with what most MURCers have been saying.

    The world just changed, Sep. 11, 2001



    • #3
      I'm sorry, I should have been more clear.

      Yes, that is indeed a re-post from a thread I participated in on comp.sys.ibm.pc.hardware.video.

      I first told them that my Elsa Gladiac 920 had sub-par image quality at high refresh rates and resolutions (other people had claimed that it was "picture perfect").

      Many thought I was biased and full of shite.

      So, in order to please them and to check for myself if I had that much bias, I decided to do this little test.

      In retrospect I was a tad tired and even angry when I wrote that concluding paragraph.

      So no offence meant to anyone at MURC; I was just annoyed by some characters at csiph.video, and that paragraph was included in my cut & paste copy here.

      Anyway, I hope it was useful to someone.

      cheers,
      Halcyon

      PS As much as I'm ashamed to admit this here at MURC, I must say that I decided to keep the Radeon in the end. Still, I try to keep my options open and keep an eye out for new Matrox hardware.



      • #4
        Very good test indeed. I wish all the hardware sites would do something like that before they test a card and say that the 2d image quality of the card is "Geforce2 perfect".

        [This message has been edited by Novdid (edited 30 May 2001).]



        • #5
          Hey don't feel too bad about using the Radeon. I got an AIW Radeon in December, and pretty much as everyone says, the hardware is top notch, but the drivers suck! I still have my trusty G400Max and I refuse to sell it or give it up. I'm waiting on my Hercules Kyro 2 card to show up and try it out in my system (for $112 bucks you can't go wrong), plus having another video card gives me the excuse to start piecemealing my next system together (still waiting on that Athlon4 desktop chip to show up). In all likelihood the Kyro will get sold in a couple months to get whatever new Matrox product is out there, if the 3D is up to snuff....

          Scott


          ------------------
          Abit BH6 rev 1.0 Celeron 2-566@877mhz, 256mb RAM, ATI Radeon AIW, SB Live! with Klipsch Promedia v2-400, Optiquest V95 19in monitor, Asus 40x CD-ROM, Aopen 5x DVD-ROM, HP9110i 8x4x32 CD-RW, SupraMax 56k modem, WinME on Western Digital 30GB drive
          Why is it called tourist season, if we can't shoot at them?



          • #6
            Wow Halcyon, quite insightful. Good job.
            Primary system specs:
            Asus A7V266-E | AthlonXP 1700+ | Alpha Pal8045T | Radeon 8500 | 256mb Crucial DDR | Maxtor D740X 40gb | Ricoh 8/8/32 | Toshiba 16X DVD | 3Com 905C TX NIC | Hercules Fortissimo II | Antec SX635 | Win2k Pro



            • #7
              Hmmm... The only question this raises is whether this reflects badly on ELSA (the board maker) or NVidia (the chip maker)...

              A lot of components that affect image quality are at the discretion of the board maker, and one has only to compare, e.g., a PowerColor GF2 or Kyro to ones from, say, Canopus and VideoLogic, to see there can be a world of difference between cards from different manufacturers...

              (How many people here ever compared a G400 from Gigabyte to one from Matrox?)



              • #8
                Hercules' Kyro2 board has decent 2D quality - I have one, I know... But their GF2 boards have lousy 2D quality. I don't think they use filters that are worse than those on the Kyro (considering they are marketing it as a budget solution).

                That leads me to the obvious conclusion that there is something very shitty about Nvidia's chips.
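
                (A purely hypothetical illustration - these are not the actual component
                values on any of these boards - of why the output filter components a
                board maker picks matter: even a single-pole RC low-pass with R = 75 ohm
                and C = 10 pF has its -3 dB point near the ~220 MHz pixel clock that
                1600x1200@85Hz needs, so over-aggressive filtering visibly softens the picture.)

                /* rcfilter.c - -3 dB cutoff of a single-pole RC low-pass filter */
                #include <stdio.h>
                #include <math.h>

                #ifndef M_PI
                #define M_PI 3.14159265358979323846
                #endif

                static double cutoff_hz(double r_ohm, double c_farad)
                {
                    return 1.0 / (2.0 * M_PI * r_ohm * c_farad);
                }

                int main(void)
                {
                    printf("R = 75 ohm, C = 10 pF -> f(-3dB) ~ %.0f MHz\n",
                           cutoff_hz(75.0, 10e-12) / 1e6);   /* ~212 MHz */
                    return 0;
                }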



                • #9
                  I came to an identical conclusion doing a similar test quite a while ago when the Radeon first came out. It was against a GF1 at the time, but still...

                  And I came to the same conclusion as Novdid just did:

                  There is something INTRINSICALLY wrong with nVidia chips. NOBODY makes one with a perfect display. The BEST nVidia-based cards on the market are not as nice (display quality-wise) as the cheapest Radeon. Why is this?

                  There must be something about nVidia chips. Forget that MOST manufacturers use nVidia's board designs. One of them surely wouldn't, right?

                  Visiontek's OWN WEB SITE refers to their nVidia cards in the following manner:

                  "Your card is poorly designed, and has bad 2D output"

                  "Dodgy card design"

                  "Numerous 2D problems"

                  So what do you expect?

                  - Gurm

                  ------------------
                  Listen up, you primitive screwheads! See this? This is my BOOMSTICK! Etc. etc.
                  The Internet - where men are men, women are men, and teenage girls are FBI agents!

                  I'm the least you could do
                  If only life were as easy as you
                  I'm the least you could do, oh yeah
                  If only life were as easy as you
                  I would still get screwed



                  • #10
                    Well I don't have eagle eyes. (I wear glasses). But when I pump my Geforce 3 up to 2048X1536 the image looks fine to me. I don't do professional level graphics but I do stare at a computer monitor all day long.

                    The "crappy" Geforce 3 quality is fine for anyone who doesn't do 2D graphics I guess.
                    C:\DOS
                    C:\DOS\RUN
                    \RUN\DOS\RUN



                    • #11
                      Right on Freak! Beauty is in the eyes of the beholder. Why spend money on quality you can't appreciate? It's like buying an '85 Martha's Vineyard Heitz Cabernet when you can't tell it from this week's Thunderbird.

                      [This message has been edited by xortam (edited 01 June 2001).]
                      The world just changed, Sep. 11, 2001



                      • #12
                        I don't have a GF3, so that's not why I'm posting this!
                        I do not get it... why do you keep bringing this subject up?
                        Is it because the only things the G400 has going for it are 2D and DualHead?
                        What about people who use their GFX card for 3D? Does a G400 card have superior 3D as well? I don't think so.
                        Recently there have been some tests on the web regarding 2D performance, mainly in 3D Studio Max... well, the G450 takes a serious beating when compared to a Radeon and a GF2.
                        So I simply don't get it, why do you keep on bringing this subject up?


                        Just for the record, I now use a GF2 GTS and I previously owned a Millennium, a G200 and a G400 Max.
                        As far as the GF2 goes... it looks and runs pretty smooth..... so I'm

                        [This message has been edited by Kosh Naranek (edited 01 June 2001).]
                        Fear, Makes Wise Men Foolish !
                        incentivize transparent paradigms



                        • #13
                          Kosh, I do believe it comes up because some people believe, or at least say, that the 2D of the GeForce3 is as good as that of the G400. The people who tend to bring it up are people who believe that is not so. I don't see what your problem is with that, really. Clearly you realise that the majority of regulars here do not have 3D speed as their top priority.

                          I have played games on a GTS and a G400, and while the gameplay was smoother on the GTS, I enjoyed playing more on the G400. Why is this? Well, to me it just looked better on the G400. I have friends who preferred the GTS experience, and some who preferred the G400 experience.

                          Do you hear people in the nv forums talking about the great picture quality in 2D? Probably not a lot, because most nv users are after gaming speed primarily.
                          D3/\/7YCR4CK3R
                          Ryzen: Asrock B450M Pro4, Ryzen 5 2600, 16GB G-Skill Ripjaws V Series DDR4 PC4-25600 RAM, 1TB Seagate SATA HD, 256GB myDigital PCIEx4 M.2 SSD, Samsung LI24T350FHNXZA 24" HDMI LED monitor, Klipsch Promedia 4.2 400, Win11
                          Home: M1 Mac Mini 8GB 256GB
                          Surgery: HP Stream 200-010 Mini Desktop,Intel Celeron 2957U Processor, 6 GB RAM, ADATA 128 GB SSD, Win 10 home ver 22H2
                          Frontdesk: Beelink T4 8GB



                          • #14
                            I don't visit the NV forums !!!

                            I have never said nor will I ever say that the 2D quality on a GF3 is as good as on a G400.

                            However, 2D quality on a GF2/GF3 is not as bad as G400 owners say it is.....
                            As for the thing with colors out of focus etc., I can only speak for myself when I say I have never seen that, but maybe that's because I'm using a Trinitron CRT monitor.

                            Having played on a G400 Max and a GF2 GTS, I disagree with you when you say that a G400 looks better... it looks different, but I wouldn't say better (3D).
                            Reason... have you tried playing Giants on a G400 and then on a GF2?
                            The difference is HUGE and it looks so much better on a GF2, not because a GF2 has better picture quality but because Giants takes advantage of all the features offered by a GF2.

                            About UT... again I wouldn't say it looks better on a G400, but it does look different, especially when playing in OpenGL.
                            The GF2's color saturation is much higher/stronger than on a G400... not always a good thing.

                            And finally... the GeForce series of cards has an almost infinite number of configuration tweaks which can be applied on the fly. If you are not careful you can wreak havoc with a GFx's picture quality, but when everything is set to MAX quality (filtering, user mipmaps, anisotropic, texture compression etc.) then it does look very good. (Most GFx owners set their GFX card up for max speed while sacrificing quality... I don't.)
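
                            (A minimal sketch, not taken from any post above: roughly what the
                            application-side equivalents of those "max quality" settings look like in
                            OpenGL - trilinear filtering, anisotropic filtering via
                            GL_EXT_texture_filter_anisotropic, and a quality hint for texture
                            compression. It assumes a current GL context, a bound 2D texture and that
                            the extensions are present; driver control-panel settings can still override it.)

                            /* maxquality.c - request high-quality texture filtering from OpenGL */
                            #include <GL/gl.h>

                            #ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
                            #define GL_TEXTURE_MAX_ANISOTROPY_EXT      0x84FE
                            #define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT  0x84FF
                            #endif
                            #ifndef GL_TEXTURE_COMPRESSION_HINT_ARB
                            #define GL_TEXTURE_COMPRESSION_HINT_ARB    0x84EF
                            #endif

                            void request_max_texture_quality(void)
                            {
                                GLfloat max_aniso = 1.0f;

                                /* trilinear: blend between mipmap levels as well as between texels */
                                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
                                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

                                /* anisotropic filtering, clamped to the driver-reported maximum */
                                glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);
                                glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, max_aniso);

                                /* if textures get compressed, ask the driver to favour quality */
                                glHint(GL_TEXTURE_COMPRESSION_HINT_ARB, GL_NICEST);
                            }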

                            [This message has been edited by Kosh Naranek (edited 02 June 2001).]
                            Fear, Makes Wise Men Foolish !
                            incentivize transparent paradigms



                            • #15
                              It is very unfair to compare the 3D capabilities of a GeForce with a G400 (performance-wise and quality-wise). You have to compare the GeForce to something newer and more equal in terms of speed as well as features. Enough about that.

                              Those who are still holding on to the G4x0 are very happy with what the card has to offer. Even if they want more speed, the card's benefits (excellent 2D, stable and good drivers etc.) outweigh that by a good margin. I'm not speaking for everyone here, but I think most Matrox owners reason that way.

