3dmark03 and NV saga continues

  • 3dmark03 and NV saga continues



    They (Futuremark) are disabling scores obtained using "suspect" optimised drivers from NV.
    Last edited by Marshmallowman; 24 March 2003, 21:20.

  • #2


    If there's artificial intelligence, there's bound to be some artificial stupidity.

    Jeremy Clarkson "806 brake horsepower..and that on that limp wrist faerie liquid the Americans call petrol, if you run it on the more explosive jungle juice we have in Europe you'd be getting 850 brake horsepower..."

    Comment


    • #3
      Don't they all do that ??????
      Chief Lemon Buyer no more Linux sucks but not as much
      Weather nut and sad git.

      My Weather Page

      Comment


      • #4
        Not like nv, it seems...

        Comment


        • #5
          Yes, in the past those driver hacks were more minor things with slightly lowered LOD and/or texture-detail settings (see ATI's "Quack" and also some earlier NVidia driver releases).

          But what they are apparently doing in those 3DMark03 drivers is much more than that: basically they are simply not using DX9, but hacking the whole thing down to DX8 level.
          A minimum of 24-bit floating-point rendering precision is required for DX9, and NVidia's pixel and vertex shaders perform absolutely terribly when floating-point precision is used.
          So apparently they not only force it down to their 16-bit floating-point rendering path (which would be bad enough, and a sure reason for the driver to NEVER get DX9 WHQL certification), but they seem to force the use of their 12-bit integer pipeline there!
          (And just look at those shader tests: while the GF FX has lackluster shader performance at floating-point precision, it shines when integer is used.)

          It's basically like saying 4 x 4 = ~15. Well, that's not correct, but it's close, and you did it very FAST, didn't you...

          When NVidia has to release a WHQL driver there will be a great speed loss because of this. But I'm sure they will leak a non-WHQL'ed "beta" driver a week afterwards which will provide a major performance boost (by putting all those hacks back in...)
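
          A rough back-of-the-envelope sketch in Python/NumPy (purely illustrative - nothing here is taken from the actual drivers or DirectX, and the bit layouts are assumptions) of what happens to a simple shader-style lighting term when it is forced from 32-bit float down to a 16-bit float or a 12-bit fixed-point path:

          Code:
          # Illustrative only: simulating reduced shader precision on the CPU.
          import numpy as np

          def quantize_12bit_fixed(x, frac_bits=8):
              # Assumed signed 12-bit fixed-point layout (4 integer bits, 8 fractional bits).
              scale = 2 ** frac_bits
              return np.clip(np.round(x * scale), -2048, 2047) / scale

          # A toy per-pixel diffuse term: dot(normal, light_direction)
          normal = np.array([0.267, 0.534, 0.801])
          light  = np.array([0.577, 0.577, 0.577])

          fp32  = np.dot(normal.astype(np.float32), light.astype(np.float32))
          fp16  = np.dot(normal.astype(np.float16), light.astype(np.float16))
          int12 = np.dot(quantize_12bit_fixed(normal), quantize_12bit_fixed(light))

          print(f"fp32 : {fp32:.6f}")   # full-precision reference
          print(f"fp16 : {fp16:.6f}")   # rounded to half precision
          print(f"int12: {int12:.6f}")  # coarsest of the three - the "4 x 4 = ~15" effect

          The bit layouts above are assumptions for illustration; the point is just that each step down in precision moves the result further from the reference, which is exactly the trade such hacked drivers appear to be making.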
          But we named the *dog* Indiana...
          My System
          2nd System (not for Windows lovers )
          German ATI-forum

          Comment


          • #6
            NV30 is nvidia's biggest flop since the Riva128 (NV1 doesn't count). What's more interesting is that it isn't a bad chip at all - a good set of features and, let's face it, fast enough.
            What Nvidia didn't see coming was the ATi R300, which was (and the NV30 just proves it) ahead of its time.
            In fact, it's a little pathetic that the "6-month-cycle company" can't release a card faster than one that is... 6 months old.
            What's even sadder is that all the hype, marketing and driver cheats do nothing but fool the customer.

            edit: 2000 posts

            Comment


            • #7
              LOL. I think it was the super NV-Boeing-USAF-jet-engine that disappointed many. hehe... in fact, the Parhelia is supposedly a lot more disappointing, but they receive the same level of criticism ROFLMAO.

              BTW, congrats Nuno!

              Comment


              • #8
                Yes, as the "Matrox w/o ATI" thread has also shown: had the R9700 merely equalled the Ti 4600, the Parhelia and especially the GeForce FX would not be perceived as failures.

                Comment


                • #9
                  well, IMO the thing that made the NV30 a failure isn't speed, it's the noise.
                  what nvidia should have done:
                  #1 don't overhype
                  #2 release it with a normal cooler and clock it at 400MHz
                  #3 price it reasonably
                  and just accept that they can't win every time.
                  the NV30 isn't bad; their PR hype is bad and the noisy cooler is bad, but the chip itself isn't bad at all.
                  and if they really want some PR advice: pimp the FP32 precision and say "yes, we don't get 122fps in xxx, but we get 97fps at 32-bit precision" - they could get away with that.
                  Last edited by TdB; 25 March 2003, 16:45.
                  This sig is a shameless attempt to make my post look bigger.

                  Comment


                  • #10
                    I'm glad they did this. I'm tired of nVidia's PR bs. I mean all PR is pretty much bs, but they really push the limits...
                    System Specs:
                    Gigabyte 8INXP - Pentium 4 2.8@3.4 - 1GB Corsair 3200 XMS - Enermax 550W PSU - 2 80GB WDs 8MB cache in RAID 0 array - 36GB Seagate 15.3K SCSI boot drive - ATI AIW 9700 - M-Audio Revolution - 16x Pioneer DVD slot load - Lite-On 48x24x48x CD-RW - Logitech MX700 - Koolance PC2-601BW case - Cambridge MegaWorks 550s - Mitsubishi 2070SB 22" CRT

                    Our Father, who 0wnz heaven, j00 r0ck!
                    May all 0ur base someday be belong to you!
                    Give us this day our warez, mp3z, and pr0n through a phat pipe.
                    And cut us some slack when we act like n00b lamerz,
                    just as we teach n00bz when they act lame on us.
                    For j00 0wn r00t on all our b0x3s 4ever and ever, 4m3n.

                    Comment


                    • #11
                      Nvidia have not released a WHQL driver in nearly 6 months; they are going to pieces.

                      I thought all the 3dfx/nvidia analogies were funny, but every day they just seem more accurate. I mean, schedules are slipping - and even 3dfx never released such dodgy drivers.

                      I hope nvidia get a bit of a caning, but I don't want to see them go down. It's about time they took their heads out of their arses.

                      Comment


                      • #12
                        I don't think nVidia is going down like 3dfx did, because they have a wider market and their nForce chipsets are doing very well. But they are definitely not top dog anymore in the video card market (if they ever really were). I think they just can't handle not having the fastest card out there. They're just gonna have to get used to it.
                        System Specs:
                        Gigabyte 8INXP - Pentium 4 2.8@3.4 - 1GB Corsair 3200 XMS - Enermax 550W PSU - 2 80GB WDs 8MB cache in RAID 0 array - 36GB Seagate 15.3K SCSI boot drive - ATI AIW 9700 - M-Audio Revolution - 16x Pioneer DVD slot load - Lite-On 48x24x48x CD-RW - Logitech MX700 - Koolance PC2-601BW case - Cambridge MegaWorks 550s - Mitsubishi 2070SB 22" CRT

                        Our Father, who 0wnz heaven, j00 r0ck!
                        May all 0ur base someday be belong to you!
                        Give us this day our warez, mp3z, and pr0n through a phat pipe.
                        And cut us some slack when we act like n00b lamerz,
                        just as we teach n00bz when they act lame on us.
                        For j00 0wn r00t on all our b0x3s 4ever and ever, 4m3n.

                        Comment


                        • #13
                          Originally posted by TdB
                          the thing that made the NV30 a failure isn't speed, it's the noise
                          It's speed as well. The GFFX gets dismal performance on everything that's DX9-related. If you call for high-precision floating-point shaders or pixel shaders 2.0, it performs at roughly half the speed of the R9700. It does get away with DX8.1 pixel and vertex shaders (using integer calculations), which is why it doesn't look so bad in most games and benchmarks - they're only DX8 at best. And when DX9 is benchmarked (a small part of 3DMark03) they have to use ugly hacks in order to score closer.
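
                          To see why image quality suffers when a driver falls back to an integer-precision path, here is a small hypothetical NumPy sketch (not real shader or driver code; the 8-bit path below is just a coarse stand-in for a low-precision DX8-level pipeline) showing the banding you get on a specular highlight:

                          Code:
                          # Hypothetical sketch of shader banding from reduced precision.
                          import numpy as np

                          # A smooth N.L gradient across 256 pixels, near a specular highlight.
                          ndotl = np.linspace(0.90, 1.00, 256, dtype=np.float32)

                          # PS 2.0-style path: pow() evaluated in floating point,
                          # then written to an 8-bit framebuffer.
                          spec_fp = np.round((ndotl ** 32) * 255) / 255

                          # Low-precision path: N.L is first squeezed into 8-bit fixed point,
                          # so pow() only ever sees ~26 distinct inputs across this gradient.
                          ndotl_q = np.round(ndotl * 255) / 255
                          spec_q = np.round((ndotl_q ** 32) * 255) / 255

                          print("distinct shades, float path  :", len(np.unique(spec_fp)))
                          print("distinct shades, integer path:", len(np.unique(spec_q)))
                          print("worst-case difference        :", float(np.abs(spec_fp - spec_q).max()))

                          Fewer distinct shades across the same gradient shows up on screen as visible banding.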

                          Comment


                          • #14
                            Originally posted by TdB

                            and if they really want some PR advice: pimp the FP32 precision and say "yes, we don't get 122fps in xxx, but we get 97fps at 32-bit precision" - they could get away with that.
                            Yes, that would've been much wiser (not to mention much more truthful...)
                            The clearly reduced image quality (easily visible even to the untrained eye) you get with those "benchmark drivers" will have a negative impact on NVidia's image - even more so as they are not really that much faster than the cheaper R9700 Pro even with those nasty hacks.
                            But we named the *dog* Indiana...
                            My System
                            2nd System (not for Windows lovers )
                            German ATI-forum

                            Comment
