Radeon HD 2900 XT Review

  • Radeon HD 2900 XT Review

    [H] is the first out of the gate with a review.

    Bottom line: the card is a flop, sad to say. They say it overclocks like crazy, though, because it is really an "XTX" that they decided not to release simply because 300W of power consumption is not justifiable. But if you overclock it, you will hit that 300W. Also, to overclock you will need both a 6-pin and an 8-pin connector to power the beast.
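
    If you're wondering where that 300W ceiling comes from, here is the rough connector math as I understand it, using the PCIe spec limits (75W from the slot, 75W per 6-pin plug, 150W per 8-pin plug); this is a back-of-the-envelope sketch, not measured draw:

```python
# Rough power ceilings by connector configuration (PCIe spec limits, not measured draw).
SLOT_W = 75        # a PCIe x16 slot can deliver up to 75 W
SIX_PIN_W = 75     # each 6-pin PCIe plug adds up to 75 W
EIGHT_PIN_W = 150  # an 8-pin PCIe plug adds up to 150 W

two_six_pins   = SLOT_W + 2 * SIX_PIN_W             # 225 W -- reportedly enough at stock clocks
six_plus_eight = SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 300 W -- the headroom overclocking needs

print(two_six_pins, six_plus_eight)  # 225 300
```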

    Ladies and gentlemen, take my advice, pull down your pants and slide on the ice.

  • #2
    Actually, I'm wrong; two or three other sites have already come out with reviews.

    This sucks, because now I don't know what to buy. Maybe I'll get the 2600, assuming it competes performance-wise, to tide me over until the R700 comes out. Worse yet, the image quality (IQ) is inferior on the new ATI cards compared to Nvidia's. Wow, they really blew this one.
    Ladies and gentlemen, take my advice, pull down your pants and slide on the ice.

    • #3
      The card is good; the drivers are not. Wait a few weeks. Some sites have tests at really high res, 2560x1600 or something, where the R600 beats both the 8800 GTS and GTX, but then it actually gets lower fps at 1280x1024 than it did at double the res. Makes no sense.
      Q9450 + TRUE, G.Skill 2x2GB DDR2, GTX 560, ASUS X48, 1TB WD Black, Windows 7 64-bit, LG M2762D-PM 27" + 17" LG 1752TX, Corsair HX620, Antec P182, Logitech G5 (Blue)
      Laptop: MSI Wind - Black

      • #4
        Regardless, with any of the newer powerhouse cards (8800/2900), make sure you have adequate power.
        Q9450 + TRUE, G.Skill 2x2GB DDR2, GTX 560, ASUS X48, 1TB WD Black, Windows 7 64-bit, LG M2762D-PM 27" + 17" LG 1752TX, Corsair HX620, Antec P182, Logitech G5 (Blue)
        Laptop: MSI Wind - Black

        • #5
          Well, the only resolution I really care about now is 1920x1200, and all the tests show the 2900 getting pounded.
          Ladies and gentlemen, take my advice, pull down your pants and slide on the ice.

          • #6
            But then you raise the res and it gets higher fps; it makes no sense. Wait until the drivers get better before making any decisions.
            Q9450 + TRUE, G.Skill 2x2GB DDR2, GTX 560, ASUS X48, 1TB WD Black, Windows 7 64-bit, LG M2762D-PM 27" + 17" LG 1752TX, Corsair HX620, Antec P182, Logitech G5 (Blue)
            Laptop: MSI Wind - Black

            • #7
              Originally posted by |Mehen|
              The card is good; the drivers are not. Wait a few weeks. Some sites have tests at really high res, 2560x1600 or something, where the R600 beats both the 8800 GTS and GTX, but then it actually gets lower fps at 1280x1024 than it did at double the res. Makes no sense.
              I've been waiting for ATI/AMD to fix their drivers for years. It still hasn't happened. Matrox's Parhelia did the same sorts of stupid shit. It's entirely unjustifiable in today's market.
              "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz

              • #8
                300W!? People, this is absolutely crazy.
                If a dual-core laptop with half-decent graphics can run on a tiny little power supply, there's absolutely no reason why you shouldn't be able to run a mean machine on 400W total.
                "For every action, there is an equal and opposite criticism."

                • #9
                  Overall it would be a disappointment if, over the last month, I hadn't already come to expect the HD2900 XT to be power hungry and close to 8800GTS performance.
                  Sure, the 8800GTX beats it in all tests, but in its price range it competes with the 8800GTS 640MB. It falls a bit short of that card in most tests; I guess there's room for driver improvement.
                  In the end it could be just power consumption that separates them.
                  Another thing ATI introduces is tessellation; to be fair, it's sort of their second attempt. Let's hope it actually gets used, unlike N-patches (and I'm not forgetting Matrox, which was just ahead of its time).

                  TBH, if I were to buy a new card now it would be an 8800GTS 640MB.

                  • #10
                    sigh...

                    It would be nice if people learned from their mistakes about this...

                    Why has tessellation not caught on in the past? Because content developers have little to no control over the final output. It makes it difficult to guarantee what the end user will see, and it adds a lot of testing to the code. Not to mention the fact that it traditionally breaks geometry-based hit detection/physics implementations... Generally speaking, it's a lot of ****ing around for not much benefit.

                    Displacement mapping is a much better feature...
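
                    To make the hit-detection point concrete, here is a toy sketch (plain Python, every name and number made up): the tessellator bulges the rendered surface away from the flat triangle the artist authored, but the CPU-side collision code still tests against that flat triangle, so what the player sees and what the engine registers as a hit no longer agree.

```python
# Toy illustration of why tessellation breaks geometry-based hit detection:
# the GPU pushes the rendered mesh outward, while the physics/collision mesh
# is still the coarse, flat triangle the content developer authored.

def lerp(a, b, t):
    """Linearly interpolate between two 3D points."""
    return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

def tessellated_midpoint(v0, v1, normal, bulge=0.5):
    """Split an edge and push the new vertex out along the surface normal --
    roughly what an N-patch / tessellation unit does on the GPU."""
    mid = lerp(v0, v1, 0.5)
    return tuple(mid[i] + normal[i] * bulge for i in range(3))

# Flat edge in the z = 0 plane: this is all the collision code knows about.
v0, v1 = (0.0, 0.0, 0.0), (2.0, 0.0, 0.0)
normal = (0.0, 0.0, 1.0)

rendered = tessellated_midpoint(v0, v1, normal)  # where the player actually sees the surface
collision_z = 0.0                                # where the engine thinks the surface is

print(rendered[2], collision_z)  # 0.5 vs 0.0 -> visible and collidable surfaces disagree
```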
                    "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz

                    • #11
                      The 2600 is competitive performance-wise and draws "only" 45 W. C't has a review; I got it Saturday.
                      There's an Opera in my macbook.

                      • #12
                        Am I the only one seeing the good side of this? At REALLY high resolutions the HD2900XT is comparable to OR FASTER than the 8800GTX.

                        Two things need to happen, and they will:
                        1. The 65nm part. What we are seeing reviewed is the 80nm part, which is horrible: some chips from this process come out OK, others are leaky and hot. It's just bad. The 2600 series is built on the 65nm process and, as previously stated, has low power draw. The R650 (the 65nm version) will likely come out later this summer, and THAT is what we have to be excited about.
                        2. Drivers. Once they get them up to par for the HD2900 series, things should improve (they are fine for every other card; not sure what DGhost is talking about...).

                        The R600 is a bust with a "but." Buying the HD2900XT would be stupid, but the card does show us what DAAMIT (AMD/ATI) is capable of. Once the R650 comes out in a few months, hopefully with #1 and #2 fixed, we will see a monster of a card.


                        IMO they shouldn't have released the R600 at all. All it does is piss people off (more so than just delaying would have). The R650 on 65nm will show massive power drops, which they can use either to up the clock speed or just to significantly lower power consumption.
                        Q9450 + TRUE, G.Skill 2x2GB DDR2, GTX 560, ASUS X48, 1TB WD Black, Windows 7 64-bit, LG M2762D-PM 27" + 17" LG 1752TX, Corsair HX620, Antec P182, Logitech G5 (Blue)
                        Laptop: MSI Wind - Black

                        • #13
                          "For every action, there is an equal and opposite criticism."

                          • #14
                            Originally posted by |Mehen|
                            Two things need to happen, and they will:
                            1. The 65nm part. What we are seeing reviewed is the 80nm part, which is horrible: some chips from this process come out OK, others are leaky and hot. It's just bad. The 2600 series is built on the 65nm process and, as previously stated, has low power draw. The R650 (the 65nm version) will likely come out later this summer, and THAT is what we have to be excited about.
                            2. Drivers. Once they get them up to par for the HD2900 series, things should improve (they are fine for every other card; not sure what DGhost is talking about...).
                            Number 2 I have no doubt about, but number 1... WHEN? If they keep as quiet about it as they kept about the R600, they might lose even more customers. I have put my hopes in a 65nm part since the beginning, and I was hoping for an 80nm XTX and a 65nm XT.
                            The last rumored ETA is September; that's a bit far off for me, and by that time a G100 or R700 could be in the works for an early 2008 release.

                            • #15
                              Mehen, are you telling me that you would rather buy the 2900 XT over the 8800 GTS because of equal or better performance at high resolutions?

                              Let me point out a few things to you.

                              1. A BFG 8800 GTS costs $329 after rebate vs. $399 and up for the 2900 XT.
                              2. The 8800 GTS has better IQ. It's a sad day when ATI has to play catch-up to Nvidia in the IQ category.
                              3. The 8800 GTS runs way cooler than the really hot 2900 XT.
                              4. In almost all benchmarks that 90% of the market will run, the 2900 XT loses.
                              5. The 2900 XT is waaaaay late to the game.

                              I really wanted to like the R600. I don't like Nvidia, but Nvidia's DX10 cards are superior in every way that I can think of except for the one you pointed out about high resolutions.
                              Ladies and gentlemen, take my advice, pull down your pants and slide on the ice.
