
  • Look: Athlon 500 + G400 doing T&L very close to the Ge-Farce.

    Kyle from www.hardocp.com has done a wonderful job demystifying the much-hyped Ge-Farce GPU T&L. In his recent article http://www.hardocp.com/articles/nvid...shine_pg1.html it seems that the Ge-Force does a nice job with non-SIMD CPUs, but strangely enough, when SSE comes into play, a PIII performs the T&L job faster than the Ge-Force's hardware T&L. So much for the 15 MTriangles/sec raw power! LOL!

    He was using Test Drive 6, an average game, but one of the first to use hardware T&L (i.e. DX7 T&L).
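    For anyone wondering what "the CPU doing T&L" actually involves: below is a minimal sketch of the per-vertex transform-and-light work, in Python purely for illustration (this shows the general idea, not Test Drive 6's actual code).

```python
# Illustrative sketch of per-vertex software T&L (not the game's real code).
# Transform each vertex by a 4x4 world-view-projection matrix, then apply a
# simple directional diffuse light -- the math the CPU must repeat for every
# vertex of every frame when the video card has no hardware T&L unit.

def transform_and_light(vertex, normal, wvp, light_dir, ambient=0.1):
    x, y, z = vertex
    # 4x4 matrix * (x, y, z, 1) -> transformed position
    pos = [wvp[r][0] * x + wvp[r][1] * y + wvp[r][2] * z + wvp[r][3]
           for r in range(4)]
    # Diffuse term: N . L clamped at zero, plus a small ambient floor
    n_dot_l = sum(n * l for n, l in zip(normal, light_dir))
    intensity = min(1.0, ambient + max(0.0, n_dot_l))
    return pos, intensity

# One vertex through an identity matrix: the position passes through unchanged.
identity = [[1.0, 0.0, 0.0, 0.0], [0.0, 1.0, 0.0, 0.0],
            [0.0, 0.0, 1.0, 0.0], [0.0, 0.0, 0.0, 1.0]]
pos, lit = transform_and_light((1.0, 2.0, 3.0), (0.0, 0.0, 1.0),
                               identity, (0.0, 0.0, 1.0))
```

    A hardware T&L unit pulls exactly this loop off the CPU, which is why toggling it changes the Celery results so much.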

    Here's the result of a Celery 500 doing T&L alone:

    [bench graph]

    Here's some Celery 500 + Ge-Force T&L action:

    [bench graph]

    Now the best part (for me): my Athlon 500 + G400 T&L action:

    [bench graph]

    Not too shabby, eh? So do I really need a Ge-Force DDR? Methinks not.


  • #2
    Hi nuno,

    I'm afraid I cannot follow you ...

    1st: you have lower minimum and average FPS
    2nd: your CPU costs about 3-4 times the amount of the Celery (I suppose it's a 300A@112MHz > 504MHz)

    so what did you expect?

    Also, it is known that NVidia disabled sideband addressing by default due to compatibility issues on SS7 boards. Furthermore, they use 'load balancing', which means the work is split between the GPU and CPU, so they don't use the GPU to its full potential ...
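    Purely as a hypothetical sketch of what such 'load balancing' could mean inside a driver (the function and the 70/30 split below are invented for illustration; NVidia's actual scheme is not public), the idea is just splitting each frame's vertex batch between the hardware and software T&L paths:

```python
# Hypothetical sketch of driver-side T&L "load balancing" (names and the
# 70/30 split are invented; this shows the concept, not NVidia's driver).

def split_batch(vertices, gpu_share):
    """Split one frame's vertices between the GPU and CPU T&L paths.

    gpu_share is the fraction handed to the hardware unit; if the driver
    tunes this below 1.0, the GPU never sees 100% of the geometry work.
    """
    cut = int(len(vertices) * gpu_share)
    return vertices[:cut], vertices[cut:]   # (gpu_batch, cpu_batch)

verts = list(range(1000))                   # stand-in for 1000 vertices
gpu_batch, cpu_batch = split_batch(verts, 0.7)
```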

    Don't get me wrong, I surely think you got awesome performance, but IMHO it is not correct to compare apples with peaches, is it?



    ------------------
    Cheers,
    Maggi

    Despite my nickname causing confusion, I am not female ...

    ASRock Fatal1ty X79 Professional
    Intel Core i7-3930K@4.3GHz
    be quiet! Dark Rock Pro 2
    4x 8GB G.Skill TridentX PC3-19200U@CR1
    2x MSI N670GTX PE OC (SLI)
    OCZ Vertex 4 256GB
    4x2TB Seagate Barracuda Green 5900.3 (2x4TB RAID0)
    Super Flower Golden Green Modular 800W
    Nanoxia Deep Silence 1
    LG BH10LS38
    LG DM2752D 27" 3D



    • #3
      Interesting comments, Maggi:

      "Also it is known that NVidia disabled sideband addressing by default due to compatibility issues on SS7 boards and furthermore they use 'load balancing' which means that the work is split to be done by the GPU and CPU, meaning they don't use the GPU to its full potential ..."
      The first part about sideband addressing was correct until the most recent set of drivers was released; users with GeForces can now get 2x and even 4x AGP when available... I believe Kyle of HardOCP was using the latest...

      But the part about 'load balancing' is the part that got me... If you go over and look at the review, it shows that with a faster CPU (a 666 MHz P3) and hardware T&L on, the FPS is lower than when software T&L is used...
      So I don't think there is much 'load balancing' going on, and that term sounds too much like marketing to me....

      I know this is a little off the Matrox topic, but it is important to Matrox hardware: if Matrox jumps on this T&L bandwagon and there is no real improvement to the gaming experience, then why pay for the R&D costs? Or the board costs...


      Craig
      BTW the article mentioned is a good read, interesting and very fairly written IMO...
      1.3 Tualatin @1600 - Watercooled, DangerDen waterblock, Eheim 1046 pump, 8x6x2 HeaterCore Radiator - Asus TUSL2C - 256 MB Corsair PC150 - G400 DH 32b SGR - IBM 20Gb 75GXP HDD - InWin A500



      • #4
        No, no, Maggi, you are getting it all wrong. I'm not comparing the Celery with the Athlon! What I'm trying to say is that the Ge-Force doesn't have all that wonderful T&L performance that NVidia hyped so much. In those Celery results with T&L on, the Celery isn't doing anything; the T&L is being done by the Ge-Force. And my minimum framerate is lower, but if you watch my graph closely, it must have been an HDD access to the swap file or something (the other guy has 300+ MB of RAM!). The framerate remains consistently above 30.

        So my point is that the T&L in the Ge-Force isn't that powerful after all. Of course the Ge-Force results are higher, but not by that much, so a good CPU (and the K7 500 isn't that expensive) can be *competitive* with Ge-Force T&L performance. I would expect Ge-Force performance to be better, but I would also expect it to be MUCH better than it really is. So the Ge-Force can do 15 Mtri/s? Don't think so.

        Load balance or no load balance, the fact is that when they said the Ge-Force had FPU processing power around 5 times that of the fastest CPU available, they were lying. A PIII 666 using SSE is already faster than the Ge-Force doing T&L. So the point is, it seems that if you put a Ge-Force in a high-end system, you risk the video card becoming the bottleneck. Either that, or NVidia's drivers suck beyond any recognition.
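        Some back-of-envelope arithmetic on that 15 Mtri/s claim (the triangles-per-frame figure below is an assumed round number for a game of this era, not a measurement from Kyle's benchmark): if geometry throughput were really the limit, the claimed rate would leave enormous headroom, so T&L clearly isn't where the card spends its time.

```python
# Back-of-envelope check on the "15 MTriangles/sec" claim. The 10,000
# triangles-per-frame figure is an assumed round number, not a measurement
# from the Test Drive 6 benchmark.

claimed_rate = 15_000_000      # NVidia's claimed triangles per second
tris_per_frame = 10_000        # assumed scene complexity

# If T&L were the only limit, the claim implies this frame rate:
fps_if_geometry_bound = claimed_rate / tris_per_frame

# Conversely, at 30 FPS the claim implies this much headroom per frame
# before the T&L unit becomes the bottleneck:
tris_at_30fps = claimed_rate // 30
```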


        [This message has been edited by Nuno (edited 09 February 2000).]



        • #5
          Compared to the G400, I would say the GeForce, and even the DDR boards, have far fewer features. Hardware bump mapping? Nope. DualHead? Nope. QUALITY TV OUT? Nope. The only game the GeForce is much faster in is Quake 3, and guess what, you know NVidia tweaks the crap out of their drivers to give the fastest Quake 3 benchmark. I wish people would accept the fact that Quake 3 is only one game; there is much better out there. In everything else, the G400, and even the TNT2 Ultra, are neck and neck in FPS.....

          If you own a G400, you won't see much of a difference until at LEAST late next summer; then we will have to decide between the IMPROVED GeForce (NV20, I think), which on paper looks hot, and the Matrox G800.
          Don't count out 3dfx; I have a Banshee in my 2nd system, and even though it is not the fastest, EVERYTHING WORKS, OUT OF THE BOX.

          The real necessity to upgrade will come with the 1 GHz processors and the NEXT generation from NVidia, Matrox, or 3dfx.....

          My humble opinion, soon to be proven in the fields of all-out 3D wars.....



          • #6
            Nosuchluck:

            When you say "EVERYTHING WORKS, OUT OF THE BOX", you forgot about the OpenGL support, didn't you? Unless you bought the card in the last couple of months, you didn't have any real OpenGL capability, other than the basic miniGL for Quake.

            I used to frequent the 3dfx forums back before I got my Max, keeping up with info and updates for the V2 I have. Many times after the V3 was released, I saw word wars between the 3dfx supporters (I hesitate to say fanatics, though that's the way I felt at the time) and the 3dfx users who complained that they'd bought the Banshee and it was supposed to have an OpenGL driver right in the box (it said so on the box, but many waited more than 6 months to get the beta).

            Even now the OpenGL driver (the full ICD, not the WickedGL) has a lot of problems, from what I've been reading recently. Even more problems, it seems, than the Matrox G200/G400 has had. To make matters worse, those same sources mention errors with the ICD if the wrong version of the Glide drivers is installed (why? The ICD ISN'T supposed to depend on the Glide drivers, the same as the D3D component doesn't; it was the miniGL that converted OpenGL calls to Glide calls).
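            For anyone unclear on that last point: a miniGL is a thin translation layer that buffers a small subset of OpenGL-style immediate-mode calls and re-issues them through the native API (Glide, in 3dfx's case). A toy sketch of the idea follows; the backend function name is invented, since the real Glide entry points aren't reproduced here.

```python
# Toy sketch of a miniGL-style translation layer: buffer OpenGL-like
# immediate-mode calls, then hand complete triangles to a native backend.
# draw_triangle_native stands in for the real Glide entry point.

issued = []

def draw_triangle_native(v0, v1, v2):
    issued.append((v0, v1, v2))          # pretend this hits the 3dfx hardware

class MiniGL:
    def __init__(self):
        self._verts = []

    def glBegin(self):                   # start collecting a triangle list
        self._verts = []

    def glVertex3f(self, x, y, z):       # buffer one vertex
        self._verts.append((x, y, z))

    def glEnd(self):                     # flush: every 3 vertices = 1 triangle
        for i in range(0, len(self._verts) - 2, 3):
            draw_triangle_native(*self._verts[i:i + 3])

gl = MiniGL()
gl.glBegin()
for v in [(0, 0, 0), (1, 0, 0), (0, 1, 0),
          (0, 0, 1), (1, 0, 1), (0, 1, 1)]:
    gl.glVertex3f(*v)
gl.glEnd()                               # two triangles reach the backend
```

            A full ICD, by contrast, has to implement the whole OpenGL API against the hardware directly, which is why it shouldn't need Glide at all.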

            Anyway, sorry for sounding like I'm beating on your hardware. My point is that most of the current cards have had missing or incomplete drivers out of the box that masked their true potential, and only time will tell how the current GeForce looks in a few months. The only way to find out is to wait until then.

            ------------------
            Ace
            "..so much for subtlety.."

            System specs:
            Gainward Ti4600
            AMD Athlon XP2100+ (o.c. to 1845MHz)



            • #7
              Good point on 3dfx's OpenGL; it IS a total joke.



              • #8
                What I want to know, Nuno, is where do ya get the bench graphics? Kyle's was the first I've seen of it. Now you're sporting this spiffy GIF???

                ------------------
                "Hi, my name is Gary and I'm an OC addict"

                "If your still stock,... ya' best stay on the porch"



                • #9
                  The graphs are from Test Drive 6 demo. It has a benchmark option that displays that graphic when the bench is finished.


                  Here, try for yourself: http://www.3dfiles.com/games/testdrive6.shtml



                  • #10
                    Sorry for the delay and thanx for the feedback, guys!!!



                    Obviously my knowledge was not up to date, Stringy, and I also misinterpreted your post, Nuno.

                    Sorry for that!!!

                    ...

                    Nuno, could you also add a graph showing the performance of your Athlon in software mode (like the one for the Celery)?

                    I'd love to see the difference between Athlon & G400 and the Athlon alone, because the GeForce seems to boost the graph quite heavily.

                    Cheers,
                    Maggi
                    Despite my nickname causing confusion, I am not female ...




                    • #11
                      Hi there Maggi

                      I think you're still misinterpreting it... Those two Celeron 500 graphs are BOTH in D3D hardware. The difference is that in the first, the Ge-Force T&L engine is disabled, so the Celery 500 is doing all the geometry. The results are not that good, as you can see; the game is heavily CPU-dependent. The second is with the Ge-Force doing T&L, and it boosts the results quite a bit...

                      Now, as the G400 doesn't have T&L in hardware (despite those rumors going around), I only have the choice of software T&L. That's the graph I posted.

                      The point I was trying to make is that my graph was comparable (of course not as good, but in the same performance range) to a CPU@500MHz + a Ge-Force doing T&L.



                      • #12
                        I can't speak for the Banshee, but the Voodoo3 was the most maintenance-free board I've ever used. I like my G400's a lot better, but I believe the Voodoo3 delivered everything 3dfx claimed it would. I do think the V3 3500 is kind of silly, but the V3 3000 was a good choice for many people.

                        To be fair, Matrox-users commenting about the Banshee and OpenGL "right out of the box" gets us into a real "pot calling the kettle black" situation.

                        Paul
                        paulcs@flashcom.net



                        • #13
                          If you turn off AGP 2x, SBA, and the texture features, every VGA chipset is just like the Voodoo3 -- problem-free! So I think the reason the Voodoo3 is the most stable of all is simply that it is a PCI device running at 66 MHz...

                          Instead, I think the Voodoo3's big problem is that it is not 100% compatible with Glide games...


                          ------------------
                          Celeron 300P@558/2.0v, P3B-F, G400DH/32MB@148/166.5

                          P4-2.8C, IC7-G, G550



                          • #14
                            I still have my Marvel G400, and it is nice, but I also picked up a Gloria II (Quadro chip). Granted, it is a significant price increase, but the T&L module does wonderful things for 3D apps. With the Gloria II I can push around a scene in 3D Studio MAX with 2.2 million polygons at a nice pace. With the G400's ICD, the thing choked at pretty much any number of polygons (choking meaning it was slower than using a software-only driver).
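                            To put that 2.2-million-polygon figure in perspective (the frame rate below is an assumed "nice pace" for illustration, not a measured number), even a modest interactive rate over a scene that size implies a geometry throughput beyond the hyped 15 Mtri/s discussed above:

```python
# Rough throughput arithmetic for the 2.2M-polygon scene. The 10 FPS
# figure is an assumed "nice pace" for illustration, not a measurement.

polys_in_scene = 2_200_000
assumed_fps = 10

# Geometry throughput needed to redraw the whole scene at that rate:
polys_per_second = polys_in_scene * assumed_fps   # 22 million per second
```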



                            • #15
                              I agree. The Voodoo3 isn't the most feature-rich board out there, but what it does do, it does very well. (Except for that stupid V3 3500. What a terrible idea!)

                              Again, I think the G400 is, hands down, a better board. Certainly the best (and in many cases fastest) board of its generation.

                              Doesn't the Elsa board have 64 MB of RAM? And the Marvel 16 MB? It could be more than the ICD slowing you down. I don't think the Marvel pretends to be aimed at the CAD market.

                              Paul
                              paulcs@flashcom.net

