3DMark2000 CPU test a big spoof?


  • 3DMark2000 CPU test a big spoof?

    Word on the street is that if you adjust the FSB with SoftFSB after starting 3DMark, it has no effect on the CPU mark. It sounds like the app actually measures your clock speed and CPU type when launched, and uses that to produce the "benchmark".

    Note that I can't confirm this myself, since SoftFSB doesn't work on my board, so I would greatly appreciate it if someone else could test it and post the results here.
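If the rumour is right, the "measurement" would amount to something like this sketch (Python purely for illustration; the class name, the 0.30 scale factor, and the 600 MHz reading are all made up, not MadOnion's actual code):

```python
def sample_clock_mhz():
    # Hypothetical stand-in for whatever the app reads once at launch
    # (e.g. a spin loop timed against the CPU's timestamp counter).
    return 600.0

class SuspectedCpuMark:
    """Sketch of the suspected behaviour: the score is derived from the
    clock speed sampled once at startup, so lowering the FSB with
    SoftFSB afterwards cannot move the result."""

    def __init__(self):
        self.launch_mhz = sample_clock_mhz()  # read once, then cached

    def score(self):
        # Purely a function of the cached launch-time reading -- no
        # actual work is timed here.
        return round(self.launch_mhz * 0.30)  # 0.30: arbitrary scale

bench = SuspectedCpuMark()
before = bench.score()
# ...imagine SoftFSB drops the real clock here; the cached value stays...
after = bench.score()
# before == after, no matter what the CPU is actually running at
```

That would explain why only restarting the app changes the number.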


  • #2
    Well, considering that its 'CPU test' maintains /an absolutely constant/ '7.50 fps', I'd certainly be inclined to say of this so-called benchmark...
    ...That Boy Just Ain't Right :-)



    • #3
      SDG, they limit the frame rate so the result is not affected by your graphics card. By limiting the frame rate and timing the canned scene, they can estimate the performance of the CPU. Don't go by the 30 seconds you see in the test, as this is not the actual time that's measured.
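A legitimate version of what's described above could look roughly like this sketch (the frame count, workload, and 30 fps cap are assumptions, not 3DMark's real internals): the display rate is capped, but the scored quantity is only the CPU time spent on a fixed, canned workload.

```python
import time

FRAMES = 20  # length of the hypothetical canned scene

def frame_cpu_work(i):
    # Deterministic per-frame workload standing in for game logic,
    # physics, and geometry setup.
    return sum((i + k) * (i ^ k) for k in range(2000))

def run_canned_scene(display_fps_cap=30.0):
    frame_budget = 1.0 / display_fps_cap
    cpu_time = 0.0
    for i in range(FRAMES):
        t0 = time.perf_counter()
        frame_cpu_work(i)
        dt = time.perf_counter() - t0
        cpu_time += dt                     # only compute time is scored
        if dt < frame_budget:
            time.sleep(frame_budget - dt)  # cap the *visible* frame rate
    return cpu_time

elapsed = run_canned_scene()
cpu_score = FRAMES / elapsed  # work-frames per second of CPU time
```

Note that the wall-clock length of the run is dominated by the frame-rate cap, which is exactly why the 30 seconds you watch would not be the measured quantity.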
      "Be who you are and say what you feel, because those who mind don't matter, and those who matter don't mind." -- Dr. Seuss

      "Always do good. It will gratify some and astonish the rest." ~Mark Twain



      • #4
        I should have said 'the frame rate /display/' - the actual updates are clearly much faster (and variable)...

        Erh...either
        1) the speed of the scene is set internally, so fps isn't an issue with speed (in which case it invalidates results gleaned from 'timing the canned scene')
        or
        2) the speed of the scene is left open to be changed as the fps change (in which case limiting the fps invalidates results)
        ...erh...I don't get it :-)

        To be honest I don't really care anymore - I'm already sick of slagging off the benchmark that isn't...:-)



        • #5
          My brother's AMD K6-2 550 got a better result in the CPU test than my neighbour's Coppermine 600!
          Both of them have the same type of 3D card (Voodoo 2 12MB) but different 2D cards (Matrox Mystique 170 PCI 8MB / ATI "something" AGP 8MB).

          And I have twice the CPU score of either of them, and I have a "Katmai" 550! And a G400.

          It seems to depend on the 2D card as well (ATI SUCKS!)

          ------------------
          INTEL PIII550 MSI 6163
          G400Mill 32MB SGRAM + RRG
          SBlive
          256 MB RAM CAS2
          43GB HDD Space!(Actual 40GB) (13+30 Quantum drives)
          Pioneer 104S DVD 10x CD 40x SLOT IN
          SONY CRX100E 4/2/24 CDRW

          If there's artificial intelligence, there's bound to be some artificial stupidity.

          Jeremy Clarkson "806 brake horsepower..and that on that limp wrist faerie liquid the Americans call petrol, if you run it on the more explosive jungle juice we have in Europe you'd be getting 850 brake horsepower..."



          • #6
            ...Just as likely it depends on the phases of the moon...:-)



            • #7
              I just tried it and can confirm this: the CPU benchmark scores of 3DMark2000 are COMPLETELY BOGUS!!

              Here are test results (Athlon600, K7M):
              1. Started 3DMark2000 with my usual 106 MHz FSB -> 181 CPUMarks
              2. Lowered the FSB to 100 MHz without leaving 3DMark2000 -> 181 CPUMarks
              3. Lowered the FSB to 90 MHz, again without quitting 3DMark2000 -> 182 CPUMarks!
              What the heck is going on here?!?
              4. Quit 3DMark2000 and restarted it without changing the FSB (i.e. leaving it at 90 MHz) -> 154 CPUMarks!

              I think this is one of the worst cases of fooling a lot of users with a totally bogus so-called benchmark program.
              Well, at least the demo looks good...
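For what it's worth, the drop on restart is almost exactly proportional to the FSB change, which fits the "clock speed sampled at launch" theory (Python used here only as a calculator on the numbers above):

```python
score_ratio = 154 / 181  # restarted at 90 MHz vs. launched at 106 MHz
fsb_ratio = 90 / 106
print(round(score_ratio, 3), round(fsb_ratio, 3))  # 0.851 0.849
```

The two ratios agree to within a couple of tenths of a percent.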
              But we named the *dog* Indiana...
              My System
              2nd System (not for Windows lovers )
              German ATI-forum



              • #8
                Yeah, I agree - I thought 3DMark99 was a reasonable benchmark - using that for D3D testing and Q2/3 for GL, we got a good estimate of a card's performance in general circumstances - the 2000 version, however, is incredibly unrealistic and f****d up.

                Let's just stick to Q2/3 then and maybe UT/HL for D3D?

                Anybody used the Video2000 benchmark yet?

                Paul.
                Meet Jasmine.
                flickr.com/photos/pace3000



                • #9
                  I don't like the CPU mark thing (the results being as incoherent as they are), but the rest of the benchmark appears to be valid, stressing both the video card and the CPU, using DX7 T&L and the SIMD instruction sets (SSE, 3DNow!).

                  BTW, it uses a real game engine (Max Payne's).
                  But I agree about the CPUMark, which looks seriously buggy.

                  ------------------
                  Corwin the Brute



                  • #10
                    If you think MadOnion farqed up 3DMark 2000, then it won't come as a surprise to you when I say that Video 2000 is bogus too. I'll quote,
                    "Video2000 is based completely on standard DirectX 7 and DirectX Media 6 interfaces, so everything Video2000 measures is exposed via one of the two APIs. The features you're describing are proprietary and not accessible in a standardized way and thus not supported by Video2000."

                    The features he's mentioning here are none other than the hardware codecs that set Matrox cards out in front of their competition.

                    This wonderful tidbit of information was supplied to me by none other than...

                    Ilkka Koho, Project Manager
                    MadOnion.com
                    Kappelitie 6, 02200 Espoo, Finland
                    tel: +358 9 4355 0475
                    fax: +358 9 4355 0445

                    This is especially sad considering a G200 will beat a G400 due to their pathetic testing methodology, and if you haven't guessed it, virtually all other cards on the market come out above it too. And they call it an "Industry Standard". BS is what I call it! Geesh
                    "Be who you are and say what you feel, because those who mind don't matter, and those who matter don't mind." -- Dr. Seuss

                    "Always do good. It will gratify some and astonish the rest." ~Mark Twain

