Doom 3 vs. Half Life 2: Regardless of Which is Better, We All Lose


  • Doom 3 vs. Half Life 2: Regardless of Which is Better, We All Lose

    Excellent article IMO.

    Dave
    Ladies and gentlemen, take my advice, pull down your pants and slide on the ice.

  • #2
    a good read, thx for posting
    P4b@2.7, AOpen ax4spe max II, 4X Parhelia 128 with Zalman zm80c and fan -or- ATI Radeon X800GTO, 1024mb.

    • #3
      I think statements like these make the article a bit interesting...
      "Did ATI gain from its Valve partnership? No. Will it, in the long run? Only if final performance figures vindicate the purchasing decisions of people who bought HL2-enabled cards as long as 18 months ago. If they don’t, ATI and Valve both end up looking like liars, if they do, we end up with a bifurcated system where you buy NVIDIA for Doom, ATI for Half Life, and suffer if you wanted both. "

      He makes it sound like those who bought the ATI cards expecting good performance out of HL2 will suffer!!!
      Why are they going to suffer??? Is Doom3 the only game on earth? So ATI plays Doom3 a few frames short of NVIDIA, but the image quality is identical; where's the problem here?

      I guess we'll wait until HL2 is released September 2, and find out what happens.

      Regards,
      Elie

      • #4
        It's a good article, but they are still stuck in the 3dfx way of thinking - more FPS = better video card - which totally shoots down their own argument if both cards are otherwise equal.
        Why is it called tourist season, if we can't shoot at them?

        • #5
          I'm getting bit by stuff like this right now. I just bought Star Wars: KOTOR. It crashes all the time on ATI cards. I know it's the game's fault, and not ATI's, but damn it's annoying.
          Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.

          • #6
            Originally posted by Elie
            I think statements like these make the article a bit interesting...
            "Did ATI gain from its Valve partnership? No. Will it, in the long run? Only if final performance figures vindicate the purchasing decisions of people who bought HL2-enabled cards as long as 18 months ago. If they don’t, ATI and Valve both end up looking like liars, if they do, we end up with a bifurcated system where you buy NVIDIA for Doom, ATI for Half Life, and suffer if you wanted both. "

            He makes it sound like those who bought the ATI cards expecting good performance out of HL2 will suffer!!!
            Why are they going to suffer??? Is Doom3 the only game on earth? So ATI plays Doom3 a few frames short of NVIDIA, but the image quality is identical; where's the problem here?

            I guess we'll wait until HL2 is released September 2, and find out what happens.

            Regards,
            Elie
            I think the writer has a point. When we saw the virtually unplayable fps in the early HL2 numbers, many of us just stopped considering NV3X cards. The problem? It's not that ATI doesn't perform as well in Doom 3; it's that many buyers rejected the NV3X irrationally, forgetting there is more to a graphics card than just games, like Linux/BSD drivers, dual-head configuration flexibility, video playback performance/quality, frequency of driver updates, etc.

            And then a year later, when the game finally releases, if the final numbers turn out to be decent, they'd think: gee, the NV3X is actually playable, and now I want better Linux drivers; I should have gotten an NV3X card instead of this R3X0 card.

            ATI doesn't do badly in D3 at all; the writer emphasized that many times in the article. It's just that IF (a big if) the game doesn't do so badly on NVIDIA hardware after all, there may be some disappointed customers who wanted the other things NVIDIA cards have to offer. And if that happens, it will haunt ATI's sales in the future (at least the sales to those who have read this article).

            • #7
              Originally posted by GT98
              It's a good article, but they are still stuck in the 3dfx way of thinking - more FPS = better video card - which totally shoots down their own argument if both cards are otherwise equal.
              I didn't realize that until you said it. But when the HL2 and ATI teams released a certain set of graphs/numbers, it's the same idea, right? Graphs/numbers don't tell you anything about quality, and by releasing them they are also assuming FPS = better video card. (I'd think that on ATI's Shader Day they didn't do a direct comparison between the two brands in HL2.) On the other hand, IIRC they did touch on the trilinear cheats NVIDIA did with their ForceWare 4X.XX drivers, just not a more specific comparison in HL2.

              One last thing: both NVIDIA and ATI suck. Matrox forever! (But then they did miss the speed category, whatever.)

              edit: Thanks Dave! That was a very interesting read!
              Last edited by Chrono_Wanderer; 22 August 2004, 22:25.

              • #8
                I guess the performance delta in the final HL2 will be about the same as it is in Doom3, and all new cards - ATI and NVIDIA - will be at a very playable level.

                Since HL2 uses lots of shaders, but only short ones (well suited to ATI), we will likely see an "fps victory" for ATI that depends on the ATI cards' higher fillrate (~30% max., most likely smaller due to other things like AI, CPU dependency, etc.). This goes for the NV40 vs. the R420, though, since NVIDIA finally corrected their shader performance with the NV40.

                The older NV cards of the NV30 series will still most likely suck at HL2 because of their inferior shader performance.
                So the writer is not fully correct: if you wanted to play HL2, it surely was a good idea NOT to get an NV3x-based card.
                Whether those buyers from that time should feel betrayed depends more on the question: how well will HL2 play on the old R300-based ATI cards?
                If it doesn't play well on these cards, they would have every right to feel betrayed...
                Last edited by Indiana; 24 August 2004, 06:26.
                But we named the *dog* Indiana...
                My System
                2nd System (not for Windows lovers )
                German ATI-forum
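
                As a rough sanity check on the "~30% max." figure above: it roughly matches the gap in theoretical peak pixel fillrate between the two flagship cards, assuming the commonly quoted reference specs (16 pixel pipelines each, about 520 MHz core for the X800 XT PE vs. 400 MHz for the 6800 Ultra). A back-of-the-envelope sketch under those assumptions, not anything taken from the article:

                [code]
                // Rough peak pixel fillrate comparison (assumed reference clocks and pipeline counts).
                #include <cstdio>

                int main()
                {
                    const double x800xt_mpix  = 16 * 520.0;  // X800 XT PE: 16 pipes * 520 MHz = 8320 Mpixels/s
                    const double gf6800u_mpix = 16 * 400.0;  // 6800 Ultra: 16 pipes * 400 MHz = 6400 Mpixels/s
                    // 8320 / 6400 = 1.30, i.e. about a 30% advantage before CPU/AI limits kick in.
                    printf("Fillrate advantage: %.0f%%\n", (x800xt_mpix / gf6800u_mpix - 1.0) * 100.0);
                    return 0;
                }
                [/code]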

                • #9
                  Matrox only lost me as a customer due to their crappy Linux support. Here it is, a little more than 8 months later, and they still haven't fixed their driver so that it works properly with the 2.6.x kernels.

                  The fact that some people DO buy their video card based on how FAST it's going to run a game over another card is sad but true. Personally, I figure any high-end card (especially those that are marketed for gamers (we should all burn Matrox for burning us on this)) should be able to play games nicely. I guess I can't blame Matrox too much, since they did release their Parhelia more than 2 years ago. Doesn't really matter now, I wouldn't go back to them due to the bad linux support.

                  Leech
                  Wah! Wah!

                  In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship.

                  • #10
                    A quick question on HL2... I've read in several places that at least Vampire: The Masquerade - Bloodlines (which is what I'm waiting for; HL2 may be cool, but I'm waiting on the Vamp game that uses its engine!) will have support for SM 3.0. So if that's the case, wouldn't that give the NV40 at least comparable performance to the R420? Or maybe even better? I'd guess it all depends on how much of the game uses it.

                    Leech
                    Wah! Wah!

                    In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship.

                    • #11
                      Depends on the shaders used. The rare examples of Shader 3.0 being used didn't bring great performance gains to the NV40, so I wouldn't expect too much.

                      Still, with longer shaders, the NV40 should gain performance compared to the R420.
                      But we named the *dog* Indiana...
                      My System
                      2nd System (not for Windows lovers )
                      German ATI-forum
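
                      On the SM 3.0 question in the last couple of posts: whether a game can use a 3.0 path at all comes down to what the card reports, so the engine typically queries the device caps at startup and picks a shader path from that. A minimal Direct3D 9 sketch of that idea (a hypothetical illustration only, not code from HL2, FarCry, or Bloodlines):

                      [code]
                      // Hypothetical shader-path selection via Direct3D 9 device caps (link against d3d9.lib).
                      #include <d3d9.h>
                      #include <cstdio>

                      int main()
                      {
                          IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
                          if (!d3d) return 1;

                          D3DCAPS9 caps;
                          if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
                          {
                              // PixelShaderVersion encodes the highest pixel shader model the driver exposes.
                              if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
                                  printf("Using the SM 3.0 path (NV40-class hardware)\n");
                              else if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
                                  printf("Using the SM 2.0 path (R3x0/R420, NV3x)\n");
                              else
                                  printf("Falling back to a DX8-class path\n");
                          }

                          d3d->Release();
                          return 0;
                      }
                      [/code]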

                      • #12
                        Originally posted by Indiana
                        Depends on the shaders used. The rare examples of Shader 3.0 being used didn't bring great performance gains to the NV40, so I wouldn't expect too much.

                        Still, with longer shaders, the NV40 should gain performance compared to the R420.
                        I thought that the 5800 did badly in FarCry when 3.0 shaders were used?
                        Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.

                        • #13
                          Do you mean the 6800? If so, well, before the 1.2 patch all NVIDIA boards defaulted to a lower precision mode (if I'm not mistaken) that produced lower-quality images than the latest patch, which added 3.0 shaders.

                          I haven't checked this myself though, as I own a Radeon.

                          • #14
                            Actually, from everything I've seen, the 1.2 patch (the one that was recalled) added support for SM 3.0 and ran a lot faster on the 6800 cards. The reason they recalled it is that all of the ATI users whined that it broke the game (I don't remember exactly what issues ATI users were having, but they were pretty bad). I'm sure over at nvnews.net there are a lot of people who posted comparisons.

                            Leech
                            Wah! Wah!

                            In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship.

                            • #15
                              Originally posted by Novdid
                              Do you mean the 6800? If so, well, before the 1.2 patch all NVIDIA boards defaulted to a lower precision mode (if I'm not mistaken) that produced lower-quality images than the latest patch, which added 3.0 shaders.
                              I should have added that the 3.0 patch improved rendering speed over a pure 2.0 path.
