Nvidia CG...breaking DirectX Rule??


  • Nvidia CG...breaking DirectX Rule??

    I got this from Hardocp.com

    I don't know if you understand French, so here's the translation:

    "NVIDIA's marketing Dan Vivoli has announced that the next big thing from the 3D chip giant is the launch of NVIDIA Cg.

    What is this? It's simply a new development kit based on a new language that should greatly simplify developers' lives... This new language wants to be the C++ of graphics cards.

    Dan Vivoli said that this kit should permit the replication of certain effects used in CG movies, in real time on NVIDIA's next 3D chip (NV30). He has also stated that graphics cards available in 5 to 10 years will reproduce any CG movie.

    Developers can get a beta version of the kit this week. This autumn, with DirectX 9 and NVIDIA's new 3D chip, a final version of the kit should be available."
    The way I look at this is: are we going back to the days of certain games being coded for certain cards, etc.?
    Last edited by GT98; 12 June 2002, 09:14.
    Why is it called tourist season, if we can't shoot at them?

  • #2
    Seems like a really stupid idea!

    BTW:

    Nvidia has always tried to imply that their cards can do, in realtime, the same CG effects that take huge server farms to produce
    If there's artificial intelligence, there's bound to be some artificial stupidity.

    Jeremy Clarkson: "806 brake horsepower... and that on that limp-wrist faerie liquid the Americans call petrol; if you run it on the more explosive jungle juice we have in Europe you'd be getting 850 brake horsepower..."



    • #3
      what!
      haven't you heard?
      Pixar traded their renderfarm for a GF4 Ti!!!

      j/k

      seriously though, the last thing developers need is yet another API; why not improve the two we've already got?

      of course, IF they make Cg open source, and IF it actually is better, and IF it isn't too optimized for nVidia cards, then it might be a good idea.

      but there are a lot of "IF"s
      This sig is a shameless attempt to make my post look bigger.



      • #4
        Originally posted by TDB
        of course, IF they make Cg open source
        Then you would call it OpenGL...


        If companies like Nvidia would just spend more time actually developing APIs that are open source, then they could at least give some direction to the graphics industry (which would benefit them as well, and bring support to all OSes), and card makers could make OpenGL cards with MS DX wrappers instead of the other way around... OpenGL, in my opinion, works better than DX-anything.
        Last edited by {PainCresT}DAn; 12 June 2002, 11:07.
        What was the error? Well, it's the ID10T error.



        • #5
          hmm, so buying 3dfx paid off, they have rebadged GLIDE



          • #6
            Originally posted by Marshmallowman
            hmm, so buying 3dfx paid off, they have rebadged GLIDE
            kinda what it sounded like to me....

            granted, i don't think that glide was that great for dealing with cinematics... but it could have been expanded and patched up a bit...

            go figure... nice one NVidia... if this is the innovation that is coming with the NV30, you can gag me with a spoon...
            "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



            • #7
              I think they're trying to resurrect Glide in a new form.
              Glide had its merits, if you all remember. It was very clean and efficient.
              I also remember that when the first PowerVR chipsets arrived, they also had some sort of proprietary language of their own.

              For better or worse, they're doing it. I don't think we should be quick to sneer at it; maybe (just maybe) they're doing something right (for once) and will create a new API that'll be nicer and simpler to use.

              I bet that if they do it alone, they'll end up alone, because there are other big-time players in the market such as ATI, Creative, [M]atrox etc., and nobody would like to put his money down just to find himself on the incompatible side of the market later.



              • #8
                hehe... [M]atrox...

                i just read something saying that microsoft worked with them on it... perhaps it is not a resurrection of glide, but it is odd that microsoft worked with them on it at all, except for maybe directx interactions...

                Edit: it appears that Cg sits on top of OpenGL or DirectX, providing an easy interface to either without the developer having to worry about it. This is rather contrary to what Glide did.
                Last edited by DGhost; 13 June 2002, 09:31.
                "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



                • #9
                  Press release at Nvidia http://www.nvidia.com/view.asp?IO=IO_20020612_6724

                  And an interview http://www.gamespydaily.com/news/fullstory.asp?id=3529
                  Main: Dual Xeon LV2.4Ghz@3.1Ghz | 3X21" | NVidia 6800 | 2Gb DDR | SCSI
                  Second: Dual PIII 1GHz | 21" Monitor | G200MMS + Quadro 2 Pro | 512MB ECC SDRAM | SCSI
                  Third: Apple G4 450Mhz | 21" Monitor | Radeon 8500 | 1,5Gb SDRAM | SCSI



                  • #10
                    As I understand it, this is just high-level coding for the pixel/vertex shaders, not a whole new API. This has traditionally been done in asm, AFAIK. (A rough sketch of what that looks like follows below.)
                    -Slougi
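                    To make the point above concrete, here is a minimal, illustrative sketch of what such a high-level vertex program looks like in Cg-style syntax, based on NVIDIA's published examples. The struct, entry-point and parameter names (including modelViewProj) are made up for this sketch rather than taken from the beta kit; the point is only that a few lines of C-like code replace the hand-written vertex-shader assembly mentioned above.

                    // Illustrative Cg-style vertex program (all names invented for this sketch)
                    struct VertOut {
                        float4 position : POSITION;   // clip-space position handed to the rasterizer
                        float4 color    : COLOR0;     // vertex colour passed straight through
                    };

                    VertOut main(float4 position : POSITION,
                                 float4 color    : COLOR0,
                                 uniform float4x4 modelViewProj)  // combined modelview-projection matrix
                    {
                        VertOut OUT;
                        OUT.position = mul(modelViewProj, position);  // one mul() instead of four dp4 asm instructions
                        OUT.color    = color;                         // no manual register juggling
                        return OUT;
                    }

                    A compiler for a language like this would then emit the corresponding DirectX or OpenGL shader assembly for whichever target the developer picks, which is where the performance and compatibility questions raised further down come in.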



                    • #11
                      <i>"In short, the Cg Run Time Compiler will optimize code to run on any major GPU, from the likes of a GeForce4 to a Radeon 8500, R300, Kyro II or the up coming Matrox Parhelia."</i>

                      They actually admit the P card is next-gen

                      <i>"We can render at HD resolution, with over 1 million polygons per frame, covering each pixel 20+ times with pixel shaded detail, at 60+ Hz, ....."</i>

                      60hz kinda sux

                      <i>"I’ll look forward to seeing it when the product is available.
                      "</i>

                      Aww, you don't know, do you?
                      Last edited by Parhelia Pro256; 13 June 2002, 15:14.
                      I'm with the ugly guy below me

                      (It's amazing how many threads I kill with that line )



                      • #12
                        Yeah, right - it will optimize pixel shader code to run on Kyro II-based cards...

                        This is just a high-level language to replace the assembler coding that has been necessary for pixel/vertex shaders till now - just a tool for game developers to ease development. nVidia does this to ensure pixel & vertex shaders become more widely used, to generate demand for future graphics hardware, of which they will have quite some market share. In the end, this will benefit the customer from a "nicer graphics" point of view, but it will also shorten product life cycles.

                        I only have two concerns: performance - will it be as fast as assembler code, or near enough? And flexibility - will it really work well with competitors' products, too? In theory, it should...

                        AZ
                        There's an Opera in my macbook.



                        • #13
                          If it's just a compiler, then its output should be DX8.1- or DX9-compliant (depending on the target source) and therefore work on any competitor's hardware (I would be very surprised if PS1.4 is supported, though; sorry ATI).

                          I'm guessing that it just outputs a .asm file that can then be used as per normal.

                          Just a guess, and I doubt nVidia will do it nicely, though. Then again, I was surprised that the nForce AGP seems to work with other manufacturers' cards.

                          Ali



                          • #14
                            Originally posted by Slougi
                            As I understand it, this is just high-level coding for the pixel/vertex shaders, not a whole new API. This has traditionally been done in asm, AFAIK.
                            Yeah - it's hardly Glide II or another OpenGL! Should be cool - getting more use out of the pixel and vertex shaders!



                            • #15
                              Why Nvidia's Cg won't work


