
8800GS - 3850 competitor/killer


  • 8800GS - 3850 competitor/killer

    Well it might not be a killer, we'll have to see what the street pricing and availability is like.

    More reviews should come tomorrow.



    The 8800GS 384MB delivers about 80% of the performance of an 8800GT, and should be priced at around $165 USD.
    The 768MB GS (which also has higher clock speeds) will probably be closer to 85% of the GT and priced just over $200.

    Prices don't seem THAT great when you compare them to Boxing Day 8800GT prices, but the 384MB GS in particular may find its niche.
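A quick perf-per-dollar check on those figures — the GS numbers are the ~80%/~85% relative-performance estimates above, while the 8800GT street price used here is a placeholder assumption, not a quoted figure:

```python
# Rough price/performance comparison from the figures above.
# The 8800GT street price is a placeholder assumption; the GS entries
# use the ~80% / ~85% relative-performance estimates quoted.
cards = {
    "8800GT 512MB": {"perf": 1.00, "price": 230.0},  # assumed street price
    "8800GS 384MB": {"perf": 0.80, "price": 165.0},
    "8800GS 768MB": {"perf": 0.85, "price": 205.0},
}

for name, c in cards.items():
    # scale by 100 so the numbers are readable
    print(f"{name}: {100 * c['perf'] / c['price']:.3f} perf per dollar (x100)")
```

Under that assumed GT price, the 384MB GS edges out the GT on perf per dollar while the 768MB GS falls slightly behind it, which fits the "may find its niche" read above.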
    Q9450 + TRUE, G.Skill 2x2GB DDR2, GTX 560, ASUS X48, 1TB WD Black, Windows 7 64-bit, LG M2762D-PM 27" + 17" LG 1752TX, Corsair HX620, Antec P182, Logitech G5 (Blue)
    Laptop: MSI Wind - Black

  • #2
    OMFG! A new video card that costs twice what I would pay for a video card, that isn't as good as the one that costs 10x what I'd pay! OMFG! AMAZING! It's better than the other one that also costs twice what I'd pay!

    Uh... sorry. I'm in a bit of a snit tonight. I just can't keep up with this whole 6-month dev. cycle on video hardware thing.
    The Internet - where men are men, women are men, and teenage girls are FBI agents!

    I'm the least you could do
    If only life were as easy as you
    I'm the least you could do, oh yeah
    If only life were as easy as you
    I would still get screwed



    • #3


      It's annoying to keep track of, especially when there are, for example, three different 8800GTSs: a 320MB, a 512MB, and a 640MB (plus a 640MB SSC). And the 512MB one is the best.



      • #4
        It used to be that when you bought a high-end product, it stayed high end for a while... now you sneeze, and you're mainstream...

        I think it's mainly because there hasn't been a straightforward upgrade path for the manufacturers. In DX9 and before, it was very clear: more pipelines, and more complex pipelines.

        With unified shaders, they aren't sure which is beneficial, or even what...

        Open rant at reviewers:
        And I really wish all the reviewers out there would stop benching at 1024x768 with no AA or AF, then showing us 1280x1024 with 2xAA/8xAF, and then 1600x1200 at 4xAA and 16xAF...
        Why do they muddy their benchmarks with AA at high resolutions?
        I, like many people, would rather have higher framerates than AA, especially on a fixed-resolution TFT...
        /end rant
        PC-1 Fractal Design Arc Mini R2, 3800X, Asus B450M-PRO mATX, 2x8GB B-die@3800C16, AMD Vega64, Seasonic 850W Gold, Black Ice Nemesis/Laing DDC/EKWB 240 Loop (VRM>CPU>GPU), Noctua Fans.
        Nas : i3/itx/2x4GB/8x4TB BTRFS/Raid6 (7 + Hotspare) Xpenology
        +++ : FSP Nano 800VA (Pi's+switch) + 1600VA (PC-1+Nas)



        • #5
          My biggest gripe with current high(?) end video cards is the amount of power they draw.
          I don't see why a modern gaming computer should need more than a quality 380W PSU.
          "For every action, there is an equal and opposite criticism."



          • #6
            I'm not sure they really do. A buddy of mine just used a watt meter on his Q6600 + 2x HD 2900 XT computer: 300W at idle. To me that means that if he had 3870s instead, it would probably be around that rate under load. I have yet to get his load numbers.
            We have enough youth - What we need is a fountain of smart!


            i7-920, 6GB DDR3-1600, HD4870X2, Dell 27" LCD



            • #7



              200W under load for a single graphics card is simply outrageous.
              IMHO, they should take no more than 60W when idle and under 150W at peak load.
              You should be able to run a quad-core, SLI gaming rig on a 400W PSU and still have some watts to spare for USB devices etc.
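As a sanity check on that 400W claim, here's a back-of-envelope component budget. Every wattage figure below is an illustrative assumption, not a measurement:

```python
# Hypothetical DC power budget for a quad-core SLI rig, in watts.
# All figures are illustrative assumptions chosen to test the
# "400W PSU with watts to spare" claim above.
budget_w = {
    "quad-core CPU (gaming load)": 80,
    "two GPUs under the proposed caps": 2 * 115,
    "motherboard + RAM": 40,
    "drives, fans, USB devices": 25,
}

total = sum(budget_w.values())
psu_rating = 400
print(f"total draw: {total} W, headroom on a {psu_rating} W PSU: {psu_rating - total} W")
```

With those assumed figures the rig lands at 375W, leaving a thin 25W of headroom — so the claim only holds if the cards really do stay under the proposed caps.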
              "For every action, there is an equal and opposite criticism."



              • #8
                Originally posted by TransformX View Post
                http://arstechnica.com/journals/hard...hd-3800-series

                200W under load for a single graphic card is simply outrageous.
                IMHO, they should take no more than 60W when idle and under 150W at peak load.
                You should be able to run a quad core, SLI gaming rig on a 400W PSU and still have some watts to spare for USB devices etc.
                You'll probably never see that in a desktop. In a laptop, however... we just have to wait till they release a mobile quad core, since SLI is already available for laptops. But as for performance...
                Last edited by ZokesPro; 3 January 2008, 13:39.
                Titanium is the new bling!
                (you heard from me first!)



                • #9
                  Originally posted by TransformX View Post
                  http://arstechnica.com/journals/hard...hd-3800-series


                  200W under load for a single graphic card is simply outrageous.
                  IMHO, they should take no more than 60W when idle and under 150W at peak load.
                  You should be able to run a quad core, SLI gaming rig on a 400W PSU and still have some watts to spare for USB devices etc.
                  Please read the article carefully. I followed your link: those wattage measurements are for the [b]entire computer[/b]. Also, since they were measured at the wall, the power supply's inefficiencies are not taken into account. So even the idle vs. load comparison has an error factor.
                  Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.
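The wall-vs-DC distinction matters more than it looks. A minimal sketch of the conversion — the efficiency figure is an assumed example, not a measured one:

```python
# Converting a wall-socket (AC) reading to the DC power the components
# actually draw. PSU losses become heat inside the supply, so the
# components see only a fraction of what the watt meter reports.
def dc_power(wall_watts: float, psu_efficiency: float) -> float:
    """DC load delivered to components = AC draw x PSU efficiency."""
    return wall_watts * psu_efficiency

# 300 W at the wall (the idle figure quoted above) through an assumed
# 75%-efficient PSU of that era:
print(dc_power(300, 0.75))  # 225.0 W actually delivered to components
```

And since efficiency itself varies with load, idle and load readings taken at the wall aren't even scaled by the same factor — which is the error the post above is pointing at.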



                  • #10
                    While it might change the exact figures, it doesn't change the fact that there's been a disturbing trend of rising power consumption in gaming video cards.

                    Power-wise, my system is not significantly different from a modern system apart from the video card (a P650), and it doesn't pull more than 120W stressed. And that's with a CPU that was regarded as having very high power consumption back in its day (a Northwood 2.8C). IMO, anything over 100-150W under full load these days is rather silly for a full system. If you carefully select components, less than 80W for a very fast system should be possible. Of course, this excludes the one component where power consumption has been truly problematic: the video card.

                    Maybe if mobile video cards were usable in desktop machines it wouldn't be a problem; otherwise, a power budget of say 30W won't get you more than a Radeon HD 2600 Pro.



                    • #11
                      One thing to keep in mind: it's only in the last year or two that PSUs have been required to operate at about 80% efficiency or above.
                      You don't really ever want to peak above 80% of the rating, and you want some headroom. For the most part, a quality 450W unit from the last year will be plenty.
                      If you have an older PSU, even if it is rated at 400-500W, keep in mind the efficiency might be in the low 70s or even 60s (%).

                      IMO the 600/700/1000W etc PSUs are overkill, with few exceptions - namely large overclocks and SLI.
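That sizing rule of thumb — keep sustained load under about 80% of the rating — can be sketched as a one-liner (the load figure in the example is an assumption):

```python
# Rough PSU sizing per the rule of thumb above: keep sustained load
# under ~80% of the unit's rating, leaving headroom for spikes.
def min_psu_rating(load_watts: float, max_load_fraction: float = 0.8) -> float:
    """Smallest PSU rating that keeps load_watts under the target fraction."""
    return load_watts / max_load_fraction

# An assumed 360 W system load:
print(min_psu_rating(360))  # 450.0 -> a quality 450 W unit suffices
```

Which lines up with the post above: a realistic single-GPU load fits comfortably inside a quality 450W unit, and only big overclocks or SLI push you toward the 600W+ class.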



                      • #12
                        The 3850 and 3870 appear to use around the same power at idle as an 8600GT. However, they have 667 million transistors compared to 289 million for the 8600. Now, I find that impressive. Apparently the new Radeons do use very advanced power-conservation tech, as you can read here.

                        So the RV670 includes mojo for many markets. One of those markets is mobile computing, and this time around, AMD is pulling in a mobile-oriented feature to make its desktop chips more power-efficient. The marketing name for this particular mojo is "PowerPlay," which wraps up a number of power-saving measures under one banner. PowerPlay is to GPUs as Intel's SpeedStep is to CPUs. At the heart of the mechanism is a microcontroller that monitors the state of the GPU's command buffer in order to determine GPU utilization. With this info, the controller can direct the chip to enter one of several power states. At low utilization, the GPU remains in a relatively low-power state, without all of the 3D bits up and running. At high utilization, obviously, the GPU fires on all cylinders. The RV670 also has an intermediate state that AMD calls "light gaming" where some portions of the graphics compute engine are active, while the rest are disabled in order to save power. PowerPlay can also scale core and memory clock speeds and voltages in response to load. These things are handled automatically by the chip, and niftily, AMD has included a GPU utilization readout in its driver control panel.
                        From the looks of it, these are extremely power-efficient GPUs. Certainly when you compare them to NV.


                        How does a 98W improvement in load power (measured at the wall) sound? The 3870 does just that compared to its very similar predecessor, the 2900.
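The PowerPlay mechanism described in the quote — a microcontroller watching command-buffer utilization and picking a power state — can be modeled roughly like this. The state names follow the quote; the thresholds and clock/voltage values are invented for illustration:

```python
# Toy model of utilization-driven power-state selection, loosely after
# the PowerPlay description quoted above. State names come from the
# quote; thresholds and clock/voltage figures are invented.
STATES = {
    "low_power":    {"core_mhz": 300, "volts": 1.00},  # 3D bits mostly off
    "light_gaming": {"core_mhz": 550, "volts": 1.15},  # partial engine active
    "full_3d":      {"core_mhz": 775, "volts": 1.30},  # all cylinders
}

def select_state(utilization: float) -> str:
    """Map command-buffer utilization (0.0-1.0) to a power state."""
    if utilization < 0.2:
        return "low_power"
    if utilization < 0.7:
        return "light_gaming"
    return "full_3d"

for u in (0.05, 0.5, 0.95):
    state = select_state(u)
    print(f"util {u:.2f} -> {state} @ {STATES[state]['core_mhz']} MHz")
```

The real chip does this in hardware and also scales memory clocks, but the shape — a monitor feeding a small state machine that trades clocks and voltage against load — is the same idea SpeedStep applies to CPUs.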

                        As for the pricing arguments, perhaps you guys haven't looked at the engineering involved in an 8800 GTX board in comparison to a G400 MAX. Yikes. I'm sure there's still a lot of markup there, but I am also sure those cards are very low volume. High complexity + low volume != low prices. Besides, an 8600 GT has much more value to it than a TNT2 M64 did when it was $100. And a $200 Radeon 3850 or 8800GT 256MB is certainly a decent option. The ultra high end is just one more option on top of that.
                        Last edited by Heiney; 5 January 2008, 03:45.

