And the Carmack speaks again...

  • And the Carmack speaks again...

    Somewhere in his last .plan file (here) he mentioned some things that will probably be of interest to us Parhelia users...

    Doom has dropped support for vendor-specific vertex programs (NV_vertex_program and EXT_vertex_shader), in favor of using ARB_vertex_program for all rendering paths. This has been a pleasant thing to do, and both ATI and Nvidia supported the move. The standardization process for ARB_vertex_program was pretty drawn out and arduous, but in the end, it is a just-plain-better API than either of the vendor specific ones that it replaced. I fretted for a while over whether I should leave in support for the older APIs for broader driver compatibility, but the final decision was that we are going to require a modern driver for the game to run in the advanced modes. Older drivers can still fall back to either the ARB or NV10 paths.
    Bad news indeed, as this means that the vertex shader units on the Parhelia will go entirely unused by Doom 3 or any derivative engine. Last time I checked, the only way to use vertex shaders in OpenGL on the Parhelia was through EXT_vertex_shader, and also last I checked the Parhelia cannot support ARB_vertex_program.
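
    To illustrate what that means in practice, here's a rough sketch (my guess at the general idea, not Doom 3's actual code) of how an engine might pick its vertex-program path at startup by looking at the extension string; it assumes a current OpenGL context:

    Code:
    /* Hypothetical sketch (not Doom 3's actual code): pick a vertex-program
     * path from the extension string.  Assumes a current OpenGL context. */
    #include <GL/gl.h>
    #include <string.h>

    typedef enum {
        PATH_ARB_VERTEX_PROGRAM,  /* the standardized path Doom 3 now requires for advanced modes */
        PATH_EXT_VERTEX_SHADER,   /* the only vertex-shader extension the Parhelia ICD exposes */
        PATH_FALLBACK             /* fixed-function / CPU fallback */
    } render_path_t;

    static int has_ext(const char *all, const char *name)
    {
        /* crude substring check, good enough for a sketch */
        return all != NULL && strstr(all, name) != NULL;
    }

    render_path_t pick_vertex_path(void)
    {
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);

        if (has_ext(ext, "GL_ARB_vertex_program"))
            return PATH_ARB_VERTEX_PROGRAM;
        if (has_ext(ext, "GL_EXT_vertex_shader"))
            return PATH_EXT_VERTEX_SHADER;  /* a path Doom 3 no longer has */
        return PATH_FALLBACK;
    }

    If GL_ARB_vertex_program never shows up in that string, the EXT_vertex_shader branch no longer helps, and the Parhelia lands on the fallback path.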

    This throws more work at the processor on a card that is already impressively CPU-bound. Not a good thing, as it should drop performance a little.

    I am gonna go out on a limb and say that Parhelia support in Doom 3 is gonna be slim to non-existent.

    Edit: I take that back; the Parhelia probably can support ARB_vertex_program. It should (in theory) be able to handle floating-point vertex shader code, especially as it is supposedly VS 2.0 compliant. That said, it would take a fair amount of work on Matrox's part, including updating their ICD to OpenGL 1.4... if anyone wants to take guesses as to whether that will happen (especially considering Matrox's history of fairly poor OpenGL support), they are more than welcome to...
    Last edited by Roark; 30 January 2003, 01:16.

  • #2
    That's bad news for P owners if I ever heard any.
    no matrox, no matroxusers.

    Comment


    • #3
      Especially since rumours say they are down to two OGL coders.

      Comment


      • #4
        Hmmm, last I checked ARB_vertex_program WAS listed among the supported extensions in the OpenGL ICD. If I were in Windows I could tell you for sure.
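
        Something like this quick diagnostic (just a sketch, assuming a current OpenGL context) would dump exactly what the installed ICD reports:

        Code:
        /* Hypothetical quick check: print what the installed ICD reports.
           Assumes an OpenGL context is already current. */
        #include <GL/gl.h>
        #include <stdio.h>
        #include <string.h>

        void dump_gl_info(void)
        {
            const char *vendor   = (const char *)glGetString(GL_VENDOR);
            const char *renderer = (const char *)glGetString(GL_RENDERER);
            const char *version  = (const char *)glGetString(GL_VERSION);
            const char *exts     = (const char *)glGetString(GL_EXTENSIONS);

            printf("GL_VENDOR   : %s\n", vendor   ? vendor   : "(null)");
            printf("GL_RENDERER : %s\n", renderer ? renderer : "(null)");
            printf("GL_VERSION  : %s\n", version  ? version  : "(null)");
            printf("GL_ARB_vertex_program %s\n",
                   (exts && strstr(exts, "GL_ARB_vertex_program"))
                       ? "is exposed" : "is NOT exposed");
        }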

        Leech

        P.S. Carmack is switching from vendor-specific APIs to a standard one. This is a GOOD thing for Parhelia owners, not a BAD thing.

        Edit: Oops, not there. Too many of the listed extensions look the same, but I still think this is a good thing; they just need to update their ICD to 1.4.
        Last edited by leech; 31 January 2003, 12:34.
        Wah! Wah!

        In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship.

        Comment


        • #5
          It is a good thing overall. It forces vendors who want to stay competitive in the gaming market to update their drivers and add support for an industry standard. It also moves closer to the idea of cross-hardware programmable pipeline capabilities in OpenGL instead of vendor-specific extensions.
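
          As a small illustration of what that standard path buys you (a minimal sketch, assuming the usual <GL/glext.h> typedefs and Windows-style wglGetProcAddress loading; nothing here is taken from Doom 3), an ARB_vertex_program is just an assembly string fed through the same entry points on any vendor whose driver exposes the extension:

          Code:
          /* Minimal ARB_vertex_program sketch: the same code runs on any
             driver that exposes the extension.  Entry points are fetched at
             runtime; error handling trimmed for brevity. */
          #include <windows.h>
          #include <GL/gl.h>
          #include <GL/glext.h>
          #include <string.h>

          static PFNGLGENPROGRAMSARBPROC   pglGenProgramsARB;
          static PFNGLBINDPROGRAMARBPROC   pglBindProgramARB;
          static PFNGLPROGRAMSTRINGARBPROC pglProgramStringARB;

          /* trivial vertex program in the cross-vendor ARB assembly language:
             transform the vertex by the modelview-projection matrix and pass
             the colour through */
          static const char *vp_src =
              "!!ARBvp1.0\n"
              "PARAM mvp[4] = { state.matrix.mvp };\n"
              "DP4 result.position.x, mvp[0], vertex.position;\n"
              "DP4 result.position.y, mvp[1], vertex.position;\n"
              "DP4 result.position.z, mvp[2], vertex.position;\n"
              "DP4 result.position.w, mvp[3], vertex.position;\n"
              "MOV result.color, vertex.color;\n"
              "END\n";

          GLuint load_vertex_program(void)
          {
              GLuint prog;

              pglGenProgramsARB   = (PFNGLGENPROGRAMSARBPROC)  wglGetProcAddress("glGenProgramsARB");
              pglBindProgramARB   = (PFNGLBINDPROGRAMARBPROC)  wglGetProcAddress("glBindProgramARB");
              pglProgramStringARB = (PFNGLPROGRAMSTRINGARBPROC)wglGetProcAddress("glProgramStringARB");

              pglGenProgramsARB(1, &prog);
              pglBindProgramARB(GL_VERTEX_PROGRAM_ARB, prog);
              pglProgramStringARB(GL_VERTEX_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                                  (GLsizei)strlen(vp_src), vp_src);
              glEnable(GL_VERTEX_PROGRAM_ARB);
              return prog;
          }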

          However, in the case of Matrox this is not necessarily a good thing, as they don't even support VS 2.0 or *any* part of DX9 in their drivers yet, and we all know how bad they are at OpenGL support. The G4xx speaks tons about the quality of Matrox's OpenGL support, and the Parhelia I have speaks volumes too.

          It could perhaps also be read as a sign of how little Carmack cares about the Parhelia for Doom 3. Not to start a flame war against him, but I am betting that if he considered it a competitive part, or even worth the effort, it would have factored into the decision more, instead of it being mostly an NV/ATI thing. And I honestly cannot blame him, considering the state of the Parhelia drivers right now. They have improved a ton since launch, but they are still far from perfect in terms of quality and speed.

          just a couple of thoughts...

          Comment


          • #6
            Uh, what's wrong with G-series OGL support? I know it was flaky like 4 years ago, but come on, it is pretty sweet now.

            Comment


            • #7
              Those are my thoughts on Matrox and OpenGL as well. I haven't had any problems with the Parhelia's OpenGL, and who gives a hairy monkey's ass if it doesn't support any DX9 yet; NO games will for the next few years. Hell, games are just barely starting to take advantage of DX8-specific things. Doom 3 is OpenGL anyhow; I just hope that by the time it is released I can play it in Linux. (Hurry up, Matrox, and get some killer GL drivers for Linux.)

              Leech
              Wah! Wah!

              In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship.

              Comment


              • #8
                Uh, what's wrong with G-series OGL support? I know it was flaky like 4 years ago, but come on, it is pretty sweet now.
                The G400 OpenGL support is very stable and much better than it used to be. Most major issues have been fixed. For example:
                1. Vsync is now supported
                2. 32-bit z-buffer is now supported
                3. Multitexturing in Serious Sam now works.
                4. Jedi Knight II no longer crashes during the intro.
                5. Texture "jiggling" or "swimming" on polygons was fixed.
                6. GL extensions in RTCW are now supported.

                However, it still has some minor limitations:

                1. It's about 5-10% slower than what would be expected from Direct3D performance.
                2. Hardware fog is not supported (that's why parts of RTCW are very slow).
                3. Incompatibilities between the 32-bit z-buffer and dynamic lights in some games.
                4. Vsync does not seem to work right... basically the performance hit is quite significant even in older games, whereas Direct3D vsync works great. (A sketch of how vsync is requested in OpenGL follows below this list.)
                5. Not all extensions are supported (of course in some cases this is due to hardware limitations).
                6. Some graphics artifacts remain: polygon boundaries are sometimes visible, etc.
                7. Most emulators (ePSXe, SNES9x, etc.) do not work in OpenGL mode.

                (Note: Some of these may have been fixed by now... I haven't played games on my g400 for quite a while.)
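
                On the vsync point (item 4 above), here is roughly how an OpenGL game asks for it on Windows, through the WGL_EXT_swap_control extension. This is just a sketch of the mechanism, not an explanation of why the Matrox ICD takes such a big hit from it:

                Code:
                /* Sketch: requesting vsync in OpenGL on Windows via the
                   WGL_EXT_swap_control extension.  Assumes a current context. */
                #include <windows.h>
                #include <GL/gl.h>

                typedef BOOL (APIENTRY *PFNWGLSWAPINTERVALEXTPROC)(int interval);

                void enable_vsync(void)
                {
                    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
                        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");

                    if (wglSwapIntervalEXT)
                        wglSwapIntervalEXT(1);  /* wait for one vertical retrace per buffer swap */
                }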

                Comment


                • #9
                  The reason hardware fog (used in RTCW) is so slow is that it uses an NV_... extension (I forgot which one), doh.
                  Hey! You're talking to me all wrong! It's the wrong tone! Do it again...and I'll stab you in the face with a soldering iron

                  Comment
