
nVidia 3DFarq2k3 scandal part II


  • nVidia 3DFarq2k3 scandal part II

    in response to http://forums.murc.ws/showthread.php?s=&threadid=41877

    Futuremark has released an update that circumvents the application-detection mechanism in the nVidia drivers, forcing the _default_ DirectX codepath on that driver, just as any application would normally use:



    More info here, with the speed difference between the 'optimized' 3DMark codepath in the drivers and the normal path:



    ROFL

    edit: though of course this might also prove that synthetic/non-optimised benchmarks are useless for giving a good idea of real-world performance, depending on how many of the optimisations done in the nVidia drivers can be called 'cheats'.
    Last edited by dZeus; 23 May 2003, 10:40.
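
    A toy sketch (Python, with purely hypothetical names, not nVidia's actual driver code) of the kind of application detection Futuremark's patch circumvents: if the driver keys a special codepath off the benchmark's identity, any change to that identity drops it back to the default path.

```python
# Toy sketch of executable-name application detection. Hypothetical:
# a real driver reportedly matched on more than just the file name.
KNOWN_APPS = {"3dmark03.exe": "optimized_benchmark_path"}

def pick_codepath(exe_name: str) -> str:
    # Anything not on the list gets the default DirectX codepath --
    # which is why renaming or patching the benchmark defeats the trick.
    return KNOWN_APPS.get(exe_name.lower(), "default_dx_path")

print(pick_codepath("3DMark03.exe"))          # -> optimized_benchmark_path
print(pick_codepath("3DMark03_patched.exe"))  # -> default_dx_path
```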

  • #2
    Lol. Then the newest card from nVidia gets lower scores than my low-budget ATI 9500...

    Sheeesh.

    Then again, ATI might also have a benchmark-detection routine; notice the 1.9% loss of score with the new version.

    ~~DukeP~~



    • #3
      A sort of debate that could go on forever. When you consider that most card manufacturers optimise their drivers for certain games as well, how do you tell what the real-world performance is?
      Chief Lemon Buyer no more Linux sucks but not as much
      Weather nut and sad git.

      My Weather Page



      • #4
        heh, yeah. I guess people should just ask whether a certain card runs the games they want to play decently enough.

        I don't mind my card being optimised for games, especially not for games I play!

        If a hardware producer wants to spend money on optimising for a benchmark, then it's their privilege. It's my privilege to consider whether I want to buy their brand afterwards.

        ~~DukeP~~



        • #5
          I think nvidia and all the other cheaters are being unfair towards Futuremark. I mean, Futuremark's goal is to benchmark GPUs in a fair and unbiased manner; what nvidia is doing can be viewed as sabotage (IMO).
          What would you say if you made a program, and some other company put a lot of effort into invalidating your program?


          Synthetic benchmarks do have their uses (measuring the capabilities of a GPU, in this case). Maybe most gamers don't find them useful, but I would actually like to know what kind of performance I can expect from a card before I buy it.

          The reason why game benchmarks can't give me that information is exactly what has been hinted at before: application-specific optimizations.
          Game benchmarks will (probably) only show me what kind of performance I can expect if the driver writer bothers to optimize for that game. But where can I get information about performance in games that aren't "trendy" enough to get such optimizations? Maybe I would like to play an "untrendy" game once in a while, and I would like to know which GPU is best in games like that.

          Do you guys think that the game benchmarks we see in reviews have anything to do with performance in games in general?
          The few benchmarks we see have driver tweaks written all over them; 90% of the games out there probably don't.
          That is what synthetic benchmarks are good for: they are supposed to measure the performance of a GPU, not which drivers are tweaked the most.
          Last edited by TdB; 23 May 2003, 12:00.
          This sig is a shameless attempt to make my post look bigger.



          • #6
            TdB: exactly.
            And the results of 3DMark don't seem too far off, since they are confirmed by every other program that makes heavy use of Pixel Shader 2.0 - even nVidia's own Dawn demo seems to run faster on the 9500 Pro than on the GF FX, now that some guys got it to run (I'm sure nVidia will hate them forever...).

            As it seems, nVidia has just totally f*cked up the pixel and vertex shader engines of the NV30/35 and is now trying to hide that fact with very obscure driver tricks - which might work as long as there are no real games using these techniques.

            I'd be interested in benchies of ATI's DX9 demos on GF FX cards (those should definitely run, since they use generic DX9 code and not proprietary OGL extensions like the nVidia ones) - I'm quite sure they would confirm the picture that 3DMark03 and the Dawn demo are painting.
            Last edited by Indiana; 24 May 2003, 10:39.
            But we named the *dog* Indiana...
            My System
            2nd System (not for Windows lovers )
            German ATI-forum



            • #7
              But this is in 3DMark... You can't see it, right? I have read that you need a special developer's version of 3DMark to see this "cheating", as they call it. If this is true, then I don't care about this at all; if a normal user like me can't see the difference, then nVidia has just made a good driver in my opinion, and then I hope they will put new code into the drivers that "detects" the new version of 3DMark too!



              • #8
                You've gotta be joking, right?

                Originally posted by [GDI]Raptor
                But this is in 3DMark... You can't see it, right? I have read that you need a special developer's version of 3DMark to see this "cheating", as they call it. If this is true, then I don't care about this at all; if a normal user like me can't see the difference, then nVidia has just made a good driver in my opinion, and then I hope they will put new code into the drivers that "detects" the new version of 3DMark too!



                • #9
                  nvidia also replaced some of the more demanding shaders with shader code they wrote themselves. I don't know exactly what these shaders do, but it may well be that nvidia lowered the color precision; heck, maybe they even replaced the DX9 shaders with DX8 shaders, or something equivalent.

                  if a normal user like me can't see the difference, then nVidia has just made a good driver in my opinion
                  So if Matrox had hacked 3DMark2001 and replaced the pixel shader in game test 4 with EMBM, so that it looked just like real pixel shaders, and the G400MAX got a 2000-mark increase for being fast at pixel shaders it didn't have, you wouldn't mind?

                  OK, I know that was an extreme example, but the purpose of 3DMark is to benchmark GPUs with a predefined workload, not to benchmark the drivers' ability to alter the workload - because then it would be running a different test than the other GPUs.
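
                  The workload substitution described above can be sketched like this (hypothetical Python, not the actual driver code): the driver fingerprints known shader bytecode and silently swaps in its own cheaper version; change one byte of the shader, as a benchmark patch does, and the swap stops firing.

```python
import hashlib

# Hypothetical sketch of shader substitution. The byte strings below
# stand in for real shader bytecode.
REPLACEMENTS = {
    hashlib.sha1(b"EXPENSIVE_PS20_SHADER").hexdigest(): b"CHEAP_HANDWRITTEN_SHADER",
}

def compile_shader(bytecode: bytes) -> bytes:
    # Fingerprint the incoming shader; known benchmark shaders get
    # replaced, everything else compiles as submitted.
    fingerprint = hashlib.sha1(bytecode).hexdigest()
    return REPLACEMENTS.get(fingerprint, bytecode)

print(compile_shader(b"EXPENSIVE_PS20_SHADER"))       # swapped out
print(compile_shader(b"EXPENSIVE_PS20_SHADER_v330"))  # patched: untouched
```

                  The benchmark then no longer runs the same predefined workload on every GPU, which is exactly the objection above.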

                  edit: post nr 1200!
                  Last edited by TdB; 24 May 2003, 17:30.
                  This sig is a shameless attempt to make my post look bigger.



                  • #10
                    Yes, sure you cannot see the difference. You also cannot tell the difference between a Ferrari and a Ford then, I guess...

                    Just some examples of blatant errors due to nVidias driver tricks:



                    (this thread has a VERY funny title, btw )
                    Last edited by Indiana; 24 May 2003, 18:51.
                    But we named the *dog* Indiana...
                    My System
                    2nd System (not for Windows lovers )
                    German ATI-forum



                    • #11
                      Originally posted by Indiana
                      http://www.rage3d.com/board/showthre...eadid=33687990
                      (this thread has a VERY funny title, btw )
                      Agree T'was funny
                      P4 Northwood 1.8GHz@2.7GHz 1.65V Albatron PX845PEV Pro
                      Running two Dell 2005FPW 20" Widescreen LCD
                      And of course, Matrox Parhelia | My Matrox history: Mill-I, Mill-II, Mystique, G400, Parhelia



                      • #12
                        Originally posted by WyWyWyWy
                        Agree T'was funny



                        ROFLMAO
                        But we named the *dog* Indiana...
                        My System
                        2nd System (not for Windows lovers )
                        German ATI-forum



                        • #13
                          ROTFLMAO
                          If there's artificial intelligence, there's bound to be some artificial stupidity.

                          Jeremy Clarkson "806 brake horsepower..and that on that limp wrist faerie liquid the Americans call petrol, if you run it on the more explosive jungle juice we have in Europe you'd be getting 850 brake horsepower..."



                          • #14
                            Good one.
                            Chief Lemon Buyer no more Linux sucks but not as much
                            Weather nut and sad git.

                            My Weather Page



                            • #15
                              When you benchmark a card, ALL YOU GET IS THE CARD'S BENCHMARK PERFORMANCE.

                              Real-world performance tends to differ vastly.

                              Benchmarking as we have it today is a joke that does nothing but cheat end users. A 3DMark run only benchmarks that card's performance in 3DMark. One card will beat another in 3DMark, then in another benchmark it's vice versa. Then we need these "elitists" like Anand's and HardOCP to tell us why???

                              No, I say cut out the middleman and benchmark right in the first place. But the problem is, nVidia spends tons of money on the benchmarks that Anand's and HardOCP use. My stance is: ask someone who bought the thing to take Unreal 2, turn the timedemo on, play it for about 15 minutes, and tell me what the result is, because I trust the next Joe a hell of a lot more than Anand's and [H]'s.
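
                              The timedemo idea above boils down to simple arithmetic over frame times. A minimal sketch (the numbers are made up for illustration) that also shows why an average alone can hide stutter:

```python
# Average FPS from per-frame render times, as a timedemo would report.
# These frame times are invented for illustration.
frame_times_ms = [16.0, 18.0, 22.0, 16.0, 28.0]

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
min_fps = 1000.0 / max(frame_times_ms)  # worst frame: the stutter an average hides

print(f"average: {avg_fps:.1f} fps, worst frame: {min_fps:.1f} fps")
```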
                              I am the 1 and the 0, the bit and the byte.
                              No computer is unbendable to my will, as hacking is not so much skill as psychology. Much like the lawmaker and the money that drives him to do as anyone would wish with it.
