
Parhelia got owned by Xabre 600!




    Notice how the Xabre 600 64 MB eats the Radeon 9500 64 MB, Radeon 8500 128 MB, Parhelia 128 MB, etc. for breakfast
    And notice the Parhelia is LAST, even BEHIND THE GEFORCE4 MX!

  • #2
    Parhelia was never designed for gaming at that rez tbh, that's what FAA 16x is for



    • #3
      Parhelia was never designed for gaming at that rez tbh, that's what FAA 16x is for
      Still, it is pretty sad that the Parhelia can't even muster half the fps of cards that cost 1/4 of its price. Hell, even the Xabre 400 owns it. I sure hope this is due to some driver issues. Otherwise, this means that the Parhelia's design is very inefficient.



      • #4
        How did any of those cards score at 3072x768?
        Oh yea...
        Core2 Duo E7500 2.93, Asus P5Q Pro Turbo, 4gig 1066 DDR2, 1gig Asus ENGTS250, SB X-Fi Gamer ,WD Caviar Black 1tb, Plextor PX-880SA, Dual Samsung 2494s



        • #5
          Probably driver issues: http://forums.murc.ws/showthread.php...highlight=rtcw
          Main: Dual Xeon LV 2.4GHz@3.1GHz | 3x21" | NVidia 6800 | 2GB DDR | SCSI
          Second: Dual PIII 1GHz | 21" Monitor | G200MMS + Quadro 2 Pro | 512MB ECC SDRAM | SCSI
          Third: Apple G4 450MHz | 21" Monitor | Radeon 8500 | 1.5GB SDRAM | SCSI



          • #6
            "Hell, even the Xabre 400 owns it. I sure hope this is due to some driver issues."

            Maybe, but Matrox is better known for good drivers than SiS...



            • #7
              Maybe, but Matrox is better known for good drivers than SiS...
              True, but Matrox has always had problems in OpenGL.



              • #8
                Rather, it is issues that the Parhelia has with the Q3 and Q2 engines...
                Let us return to the moon, to stay!!!



                • #9
                  Originally posted by 3dfx
                  Parhelia was never designed for gaming at that rez tbh, that's what FAA 16x is for
                  I've got one, and I thought the whole idea with it was that it performed rubbish at 800x600 but you could crank up the res and eye candy without taking a hit. 16x FAA is a cop-out if it can't manage decent frame rates at those resolutions without any AA. (I love the card as it plays the games I use fine, but I'm not going to defend it on this one.)
                  is a flower best picked in its prime or greater withered away by time?
                  Talk about a dream, try to make it real.



                  • #10
                    Originally posted by Kruzin
                    How did any of those cards score at 3072x768?
                    Oh yea...
                    right on!!!
                    is a flower best picked in its prime or greater withered away by time?
                    Talk about a dream, try to make it real.



                    • #11
                      Originally posted by Kruzin
                      How did any of those cards score at 3072x768?
                      Oh yea...
                      You know... according to that chart the Parhelia should run at ~20 FPS (probably less) at 3072x768... while all the other cards would remain playable if they could run at that resolution...

                      K6-III: what Thomaz said is more accurate in this case... it's not just the Q2/Q3 engines that are having issues; there are problems in OpenGL that are affecting way more than just those...

                      3dfx: I wish the Parhelia wasn't designed to run at those resolutions. Unfortunately, that is not the case. Especially considering that 1600x1200 is less demanding than the 3072x768 surround gaming mode that Kruzin felt the need to bring into this....
                      "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



                      • #12
                        I realize my example below has many flaws, but oh well, it's an idea...

                        1600x1200
                        = 1920000 pixels

                        3072x768
                        = 2359296 pixels

                        So... a 1600x1200 scene is approx. 81.4% of a 3072x768 scene...

                        Now let's take the second-worst card on the chart, the Radeon 8500LE 64MB, for comparison...

                        42.5fps * 81.4%
                        = 34.6fps...

                        34.6 fps vs. 22.9 fps for the R8500LE to render 2,359,296 pixels...

                        **sniff sniff**
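The back-of-the-envelope scaling above can be checked with a few lines of Python. This is only a sketch of the poster's rough model — fps is assumed to scale inversely with pixel count, i.e. a purely fill-rate-limited card, which the poster admits is flawed — and the 42.5 fps figure is the chart value quoted above.

```python
# Estimate fps at a different resolution by scaling inversely with
# pixel count -- a rough rule of thumb that assumes the card is
# entirely fill-rate limited.
def scaled_fps(fps: float, from_res: tuple, to_res: tuple) -> float:
    from_pixels = from_res[0] * from_res[1]  # 1600*1200 = 1,920,000
    to_pixels = to_res[0] * to_res[1]        # 3072*768  = 2,359,296
    return fps * from_pixels / to_pixels

# Radeon 8500LE 64MB: 42.5 fps at 1600x1200 (chart figure quoted above)
print(round(scaled_fps(42.5, (1600, 1200), (3072, 768)), 1))  # → 34.6
```

The same scaling applied to the Parhelia's 1600x1200 score is presumably where the earlier "~20 FPS at 3072x768" estimate comes from.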



                        • #13
                          What good does it do to calculate "theoretical" possibilities, when the card can't do it?
                          Even if P got 5 FPS, it would be 5 more than the others can dream of doing.
                          Core2 Duo E7500 2.93, Asus P5Q Pro Turbo, 4gig 1066 DDR2, 1gig Asus ENGTS250, SB X-Fi Gamer ,WD Caviar Black 1tb, Plextor PX-880SA, Dual Samsung 2494s



                          • #14
                            You know what... I don't know why this thread is even here.
                            It's clearly a benchmark thread, so it belongs in the benchmark forum.
                            Off ya go...
                            Core2 Duo E7500 2.93, Asus P5Q Pro Turbo, 4gig 1066 DDR2, 1gig Asus ENGTS250, SB X-Fi Gamer ,WD Caviar Black 1tb, Plextor PX-880SA, Dual Samsung 2494s



                            • #15
                              Kruzin - who gives a shit if you can play the game on 3 monitors if it is unplayable at that resolution?
                              "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz
