Unreal II to be GeFarce III optimized?

Collapse
X
 
  • Filter
  • Time
  • Show
Clear All
new posts

  • Unreal II to be GeFarce III optimized?

    What the heck?! Isn't this what we have been trying to get away from for several years now?!

    http://www.voodooextreme.com/article...formation.html

    I'm sure most of you remember the bajillion different versions of Quake that were released during its reign of terror. There was the S3 version, the Verite version, the Glide version, the straight OpenGL version, and the classic software renderer.

    And who can forget the madness with the MechWarrior 2 optimization crap that went on for a good two years? Every retail card out there came with a version optimized for that card!

    So WHY?! After all that madness, and after we finally standardized on common 3D APIs (OpenGL and D3D), are we going BACK?!

    Someone needs a SEVERE beating!

    To be fair, this isn't entirely nVidia's fault. It's also Epic and id's fault for supporting this idiocy. To make matters worse, who knows how many other companies are going to blindly follow along because these two power-house developers are doing it. And anyone that wants to use the Doom3 or Unreal II engine is going to be supporting it whether they like it or not.

    Sigh...
    OK, that's the end of my rant. I hope someone realizes that this is a mistake before it's too late.

    Ian
    Primary System:
    MSI 745 Ultra, AMD 2400+ XP, 1024 MB Crucial PC2100 DDR SDRAM, Sapphire Radeon 9800 Pro, 3Com 3c905C NIC,
    120GB Seagate UDMA 100 HD, 60 GB Seagate UDMA 100 HD, Pioneer DVD 105S, BenQ 12x24x40 CDRW, SB Audigy OEM,
    Win XP, MS Intellimouse Optical, 17" Mag 720v2
    Secondary System:
    Epox 7KXA BIOS 5/22, Athlon 650, 512 MB Crucial 7E PC133 SDRAM, Hercules Prophet 4500 Kyro II, SBLive Value,
    3Com 3c905B-TX NIC, 40 GB IBM UDMA 100 HD, 45X Acer CD-ROM,
    Win XP, MS Wheel Mouse Optical, 15" POS Monitor
    Tertiary System:
    Offbrand PII Mobo, PII 350, 256MB PC100 SDRAM, 15GB UDMA66 7200RPM Maxtor HD, USRobotics 10/100 NIC, RedHat Linux 8.0
    Camera: Canon 10D DSLR, Canon 100-400L f4.5-5.6 IS USM, Canon 100 Macro USM, Canon 28-135 f3.5-5.6 IS USM, Canon Speedlite 200E, tripod, bag, etc.

    "Any sufficiently advanced technology will be indistinguishable from magic." --Arthur C. Clarke

  • #2
    I expect "GeForce III optimized" just means that it makes use of DX8 features that the GF3 happens to support.
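    That claim is easy enough to check in code, too: under DX8 a game never addresses a "GF3" by name, it queries device caps. A minimal sketch of such a check, assuming the DirectX 8 SDK headers (helper name mine, error handling trimmed):

        #include <d3d8.h>

        // True if the default adapter exposes the DX8 programmable pipeline
        // (vertex/pixel shaders 1.1) -- true on a GF3, but equally true on
        // any later card from any vendor that supports those caps.
        bool SupportsDX8Shaders(IDirect3D8* d3d)
        {
            D3DCAPS8 caps;
            if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
                return false;
            return caps.VertexShaderVersion >= D3DVS_VERSION(1, 1)
                && caps.PixelShaderVersion  >= D3DPS_VERSION(1, 1);
        }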

    • #3
      Ooh, dammit, Raptor beat me to it.

      Yeah, there's no such thing as "optimizing" for a particular card anymore. At least not like it used to be.

      - Gurm

      ------------------
      Listen up, you primitive screwheads! See this? This is my BOOMSTICK! Etc. etc.
      The Internet - where men are men, women are men, and teenage girls are FBI agents!

      I'm the least you could do
      If only life were as easy as you
      I'm the least you could do, oh yeah
      If only life were as easy as you
      I would still get screwed

      • #4
        Quote: "I expect 'GeForce III optimized' just means that it makes use of DX8 features that the GF3 happens to support."
        If that is the case, then they should simply say that it is optimized for DX8. As it is, I'm paranoid about the claim that it's optimized for the GFIII. I wouldn't put it past some companies to do something that stupid.

        The fact that only the GFIII can currently take full advantage of DX8 is irrelevant. Other cards will eventually do so too, at which point the game is no longer "GFIII Optimized", unless there really are some direct hardware calls that fuxor everything up for other cards.

        So I'm just going to sit here and be cynical about it until M and ATI release their next cards and (hopefully) lay the smack down on nVidia.

        Ian
        Primary System:
        MSI 745 Ultra, AMD 2400+ XP, 1024 MB Crucial PC2100 DDR SDRAM, Sapphire Radeon 9800 Pro, 3Com 3c905C NIC,
        120GB Seagate UDMA 100 HD, 60 GB Seagate UDMA 100 HD, Pioneer DVD 105S, BenQ 12x24x40 CDRW, SB Audigy OEM,
        Win XP, MS Intellimouse Optical, 17" Mag 720v2
        Secondary System:
        Epox 7KXA BIOS 5/22, Athlon 650, 512 MB Crucial 7E PC133 SDRAM, Hercules Prophet 4500 Kyro II, SBLive Value,
        3Com 3c905B-TX NIC, 40 GB IBM UDMA 100 HD, 45X Acer CD-ROM,
        Win XP, MS Wheel Mouse Optical, 15" POS Monitor
        Tertiary System:
        Offbrand PII Mobo, PII 350, 256MB PC100 SDRAM, 15GB UDMA66 7200RPM Maxtor HD, USRobotics 10/100 NIC, RedHat Linux 8.0
        Camera: Canon 10D DSLR, Canon 100-400L f4.5-5.6 IS USM, Canon 100 Macro USM, Canon 28-135 f3.5-5.6 IS USM, Canon Speedlite 200E, tripod, bag, etc.

        "Any sufficiently advanced technology will be indistinguishable from magic." --Arthur C. Clarke

        • #5
          There's no such thing as optimizing for the GeForce. Please, if anyone thinks there is, I welcome them to tell me how to do it. These cards no longer have native APIs.

          The closest you can come is tweaking your code so that you get optimal results on the GeForce. That doesn't mean it won't look perfect on a Radeon - in fact it will likely look better.

          The only way you could optimize for one card and NOT another is if you purposely exploited bugs in that card's drivers and/or implementation. BUT that would make your game fail to run on any other card, and they can't do that - it would ruin them.
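          The closest legal version of "per-card" work looks something like sniffing the driver's identification strings and tuning around them - a hypothetical sketch in plain OpenGL 1.1 (valid only once a GL context exists; helper name mine):

              #include <GL/gl.h>
              #include <string.h>

              // The rendering calls stay standard OpenGL on every card; only
              // tuning knobs (batch sizes, texture sizes, etc.) would branch here.
              bool IsNvidiaRenderer()
              {
                  const char* vendor = (const char*)glGetString(GL_VENDOR);
                  return vendor && strstr(vendor, "NVIDIA") != NULL;
              }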

          - Gurm

          ------------------
          Listen up, you primitive screwheads! See this? This is my BOOMSTICK! Etc. etc.

          [This message has been edited by Gurm (edited 06 March 2001).]
          The Internet - where men are men, women are men, and teenage girls are FBI agents!

          I'm the least you could do
          If only life were as easy as you
          I'm the least you could do, oh yeah
          If only life were as easy as you
          I would still get screwed

          • #6
            Well, which sounds better from a marketing point of view:
            saying that your upcoming game is optimized for the latest and "greatest" in video cards, or that it makes use of DX8 features?

            Lots of people won't know or care what these DX8 features are or do, but most of them will have heard of the GF3 through nVidia's marketing bandwagon and how it has all these "new" features. The fact that they are the same thing is irrelevant; the majority of people recognize one name and not the other.

            ------------------
            Unreal Fortress developer

            • #7
              Doesn't nVidia have its own OpenGL implementation of vertex shaders and the like, accessible through extensions called nv_blablabla?

              If a game used those extensions and no others (provided that competitors have their own versions of those extensions), wouldn't the game theoretically be 'nVidia optimized'? Or are those nv_blablabla extensions open source, so that any company can implement them in their drivers?
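              It does - GL_NV_vertex_program is a real nVidia extension along those lines, and a game can probe for it at runtime. A sketch of what such a check might look like (the fallback noted in the comment is illustrative, not any shipping engine's code):

                  #include <GL/gl.h>
                  #include <string.h>

                  // Probe the extension string for nVidia's vertex program extension.
                  // On other vendors' cards this returns false, and a game would take
                  // a vendor-neutral path (e.g. fixed-function) instead.
                  bool HasNVVertexProgram()
                  {
                      const char* ext = (const char*)glGetString(GL_EXTENSIONS);
                      return ext && strstr(ext, "GL_NV_vertex_program") != NULL;
                  }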

              • #8
                Originally posted by dZeus:
                "Doesn't nVidia have its own OpenGL implementation of vertex shaders and the like, accessible through extensions called nv_blablabla?

                If a game used those extensions and no others (provided that competitors have their own versions of those extensions), wouldn't the game theoretically be 'nVidia optimized'? Or are those nv_blablabla extensions open source, so that any company can implement them in their drivers?"
                Baldur's Gate II, anyone? Runs on a crappy TNT1 but not on a G400 Max!
                According to the latest official figures, 43% of all statistics are totally worthless...

                • #9
                  Guru,

                  Baldur's Gate II runs fine on a G400 Max.

                  - Gurm

                  ------------------
                  Listen up, you primitive screwheads! See this? This is my BOOMSTICK! Etc. etc.
                  The Internet - where men are men, women are men, and teenage girls are FBI agents!

                  I'm the least you could do
                  If only life were as easy as you
                  I'm the least you could do, oh yeah
                  If only life were as easy as you
                  I would still get screwed

                  • #10
                    Yeah, BG2 runs fine on my MAX. I had to tweak it around a bit first though, since it initially ran like it was on a Cirrus Logic chip :P

                    • #11
                      With 3D acceleration or without?
                      According to the latest official figures, 43% of all statistics are totally worthless...

                      • #12
                        ...and PlanetMoon should be shot for doing a modified version of Giants to be bundled with the Elsa GF3 Card.

                        These people are rightly proud of their games, and if they can be tweaked to look even better, then who can blame them?

                        Are you forgetting the Expendable game that came with the G400 MAX? As far as I can tell, the standard retail version didn't have an EMBM option. There was a bump mapping option in the video settings, but I don't think it was EMBM-aware.

                        I could be wrong about this last point, though.
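                        For what it's worth, EMBM support is also just a capability bit a game can query, rather than a Matrox-specific call - a sketch along the same DX8 lines (helper name mine):

                            #include <d3d8.h>

                            // Environment-mapped bump mapping shows up as a texture-op
                            // capability flag; the G400 exposed it first, but the check
                            // itself is vendor-neutral.
                            bool SupportsEMBM(IDirect3D8* d3d)
                            {
                                D3DCAPS8 caps;
                                if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
                                    return false;
                                return (caps.TextureOpCaps & D3DTEXOPCAPS_BUMPENVMAP) != 0;
                            }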
                        Phils PC Mods - a rough guide

                        • #13
                          So, does anyone know if they are using nVidia's GL extensions or just features of DX/GL that any card will eventually support?

                          Sounds a little like the EMBM/DualHead situation, but of course only for a >$500 card.

                          Now then, we want to see if they'll announce a patch for G-FX support. Assuming Matrox releases it before the game, anyway!

                          And, finally, why doesn't a 134th generation GeSpot come with render nude?

                          P.
                          Meet Jasmine.
                          flickr.com/photos/pace3000

                          • #14
                            If any company releases a game with the words 'Optimised for DirectX 8' on the box can't you see what would happen?
                            "I bought this game, it syas optimised for DirectX 8, but it doesn't look very optimised to me"
                            "What graphics card do you have?"
                            "I've just downloaded Directx 8 from Microsoft's site"
                            "What card sir?"
                            "Well it's an ATI Xpert@Play, but I do have DirectX 8......"

                            They need to tell the uninformed user that they mean DirectX hardware optimised; the easiest way of doing this is to mention the actual hardware that uses it.
                            It's like those old games optimised for Glide: you would read the box, and the Diamond Voodoo II would be listed, but never any mention of my Orchid Voodoo II card. If I hadn't known that Voodoo was just Voodoo, I might have been tempted not to purchase.
                            It cost one penny to cross, or one hundred gold pieces if you had a billygoat.
                            Trolls might not be quick thinkers, but they don't forget in a hurry either.

                            • #15
                              Guru,

                              Baldur's Gate 2 runs fine (well, with the 5.33 drivers anyway - damn those 5.51s!) with hardware acceleration.

                              But you have to use OpenGL optimized for accuracy, and page flipping MUST be ON.

                              (Actually, I don't use the accuracy part, but some people have to or they get texture corruption.)

                              - Gurm

                              P.S. And DO NOT USE the "BackwardsCompatible" thing that Interplay/BioWare tell you to use. They are on crack saying the G400 can't run their pathetic semi-accelerated OpenGL engine. *wink*

                              ------------------
                              Listen up, you primitive screwheads! See this? This is my BOOMSTICK! Etc. etc.

                              [This message has been edited by Gurm (edited 07 March 2001).]
                              The Internet - where men are men, women are men, and teenage girls are FBI agents!

                              I'm the least you could do
                              If only life were as easy as you
                              I'm the least you could do, oh yeah
                              If only life were as easy as you
                              I would still get screwed
