NV20 will be programmable???????

  • NV20 will be programmable???????

    Nvidia's next graphics chip, codename NV20, will be the best thing since sliced bread - what new 3D accelerator isn't? - according to specs leaked to ZDNet's German subsidiary.

    But while everyone appears to be getting all hot and bothered by the numbers, one small detail proves rather interesting: the chip is programmable.

    This is interesting because of recent developments with DirectX 8.0, the latest version of Microsoft's games and multimedia API. Earlier this week, Nick Trevett, head of UK graphics chip developer 3DLabs, claimed DirectX 8.0 was going to render - pun not intended - all existing chip architectures obsolete.

    Why? Because DirectX 8.0 gives games developers control over the rendering pipeline. Where once accelerator chip designers determined how their silicon would speed up geometry, texturing, shading and lighting, and software writers had to work with that, now the coders can decide how best to organise the pipeline for the needs of their software.
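
    To make that concrete, here is a minimal, purely conceptual C++ sketch - not Nvidia's hardware or Microsoft's actual DirectX 8.0 API - of the difference between a fixed-function pipeline, where the vendor's silicon dictates the per-vertex maths, and a programmable one, where the application supplies its own routine:

        #include <functional>

        struct Vertex { float x, y, z; };

        // Fixed-function style: the chip vendor hard-wires the maths; the game can
        // only toggle a few predefined states (lighting on/off, fog mode, etc.).
        Vertex fixedFunction(const Vertex& v) {
            return { v.x * 2.0f, v.y * 2.0f, v.z };   // stand-in for a baked-in transform
        }

        // Programmable style (the DirectX 8.0 idea): the game supplies its own
        // per-vertex routine and the pipeline simply runs it.  On real hardware this
        // would be a compiled vertex/pixel shader, not a C++ callback.
        using VertexProgram = std::function<Vertex(const Vertex&)>;

        Vertex runPipeline(const Vertex& v, const VertexProgram& program) {
            return program(v);   // the developer decides how geometry and shading work
        }

        int main() {
            Vertex v{1.0f, 2.0f, 3.0f};
            Vertex a = fixedFunction(v);                        // the vendor's choice
            Vertex b = runPipeline(v, [](const Vertex& in) {    // the developer's choice
                return Vertex{ -in.x, -in.y, in.z };
            });
            (void)a; (void)b;
            return 0;
        }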

    "There is a real discontinuity coming in the graphics market. Applications vendors will decide what the geometry and pixel pipelines look like. It makes every graphics chip in the world obsolete," said Trevett.

    Trevett's claim - made in an interview with Electronics Times - has an agenda, of course. It's all about promoting 3DLabs' solution to accelerating DirectX 8.0: a programmable chip core.

    "We believe it is a bigger discontinuity than between 2D and 3D," he said. "Every time you get a discontinuity, you get an opportunity... We are working to make sure we are one of the first to have a programmable architecture."

    And, if the ZDNet Germany leak is anything to go by, so is Nvidia. Interestingly, we asked Nvidia to comment on Trevett's claims - oddly enough, no one from the company chose to speak to us.

    NV20 certainly is a DirectX 8.0-oriented chip, supporting the API's pixel shaders, according to the leak. It also adds support for 3D textures - something we'd predicted Nvidia was going to incorporate into its Xbox chip design - high order surfaces, real reflective bump-mapping and layered fogging for much smoother mist effects than has been possible to date.
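
    For reference, the pixel shaders DirectX 8.0 exposes are short assembly-style programs that the application assembles and hands to the driver at run time (via D3DXAssembleShader and CreatePixelShader in Direct3D 8). A rough, illustrative ps.1.1 fragment, stored as a string the way a game might embed it - full device setup omitted:

        // Illustrative DirectX 8-era pixel shader in ps.1.1 assembly, kept as a
        // string so it can be assembled and bound at run time.  Sketch only.
        const char* kSimplePixelShader =
            "ps.1.1\n"
            "tex t0\n"              // sample the texture bound to stage 0
            "mul r0, t0, v0\n";     // modulate the texel with the interpolated colour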

    Nvidia has also built in technology to reduce overdraw - drawing one polygon then drawing another on top of it because it's 'closer' to the viewer - which apparently leads to improved performance as the triangle count increases. That, reckons Nvidia, will lead to a two- to sevenfold improvement over the GeForce 2 Ultra, depending on the number of polygons in the image - the more there are, the better the improvement.
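
    The overdraw problem is easy to sketch in software: if a pixel is shaded and written, then covered by a nearer polygon later, that first lot of work was wasted. The leak doesn't say exactly how NV20 attacks this, but the general idea of rejecting hidden fragments before spending shading work on them looks roughly like this toy C++ model (an assumption, not Nvidia's implementation):

        #include <cfloat>
        #include <vector>

        struct Fragment { int x, y; float depth; unsigned colour; };

        // Early depth test: if something closer has already been drawn at this pixel,
        // reject the fragment before any texturing/shading cost is paid.  Without a
        // test like this, every covered pixel gets shaded again and again (overdraw).
        void drawFragment(std::vector<float>& zbuf, std::vector<unsigned>& frame,
                          int width, const Fragment& f) {
            int i = f.y * width + f.x;
            if (f.depth >= zbuf[i]) return;   // hidden: skip it, no wasted shading
            zbuf[i]  = f.depth;               // new nearest surface at this pixel
            frame[i] = f.colour;              // only now pay for shading/writing
        }

        int main() {
            const int w = 4, h = 4;
            std::vector<float>    zbuf(w * h, FLT_MAX);
            std::vector<unsigned> frame(w * h, 0);
            drawFragment(zbuf, frame, w, {1, 1, 0.9f, 0xff0000u});  // far polygon drawn
            drawFragment(zbuf, frame, w, {1, 1, 0.2f, 0x00ff00u});  // nearer polygon wins
            drawFragment(zbuf, frame, w, {1, 1, 0.5f, 0x0000ffu});  // rejected early
            return 0;
        }

    The claim that the benefit grows with triangle count follows directly: the more polygons overlap, the more fragments a scheme like this can throw away before they cost anything.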

    Whatever, it should make for an interesting comparison with 3dfx's upcoming Gigapixel-based technology, codenamed Mosaic. That approach improves performance by focusing on only a small part of the to-be-rendered scene at a time, another attempt to reduce the cycles 3D chips usually waste on overdraw.
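
    That 'small part of the scene at a time' approach is usually organised by binning triangles into screen tiles first, then resolving each tile completely before moving on, so per-pixel depth and colour state can live in fast on-chip memory. A schematic C++ sketch of the binning step - an assumption about how such a design typically works, since the article gives no Mosaic internals:

        #include <vector>

        // Screen-space bounding box of a triangle, already clipped to the screen.
        struct Triangle { int minX, minY, maxX, maxY; };

        // Bin triangles into small screen tiles; each tile is then rendered on its own,
        // so overdraw within a tile is resolved before anything reaches the frame buffer.
        std::vector<std::vector<Triangle>> binIntoTiles(const std::vector<Triangle>& tris,
                                                        int screenW, int screenH,
                                                        int tileSize) {
            int tilesX = (screenW + tileSize - 1) / tileSize;
            int tilesY = (screenH + tileSize - 1) / tileSize;
            std::vector<std::vector<Triangle>> bins(tilesX * tilesY);
            for (const Triangle& t : tris)
                for (int ty = t.minY / tileSize; ty <= t.maxY / tileSize; ++ty)
                    for (int tx = t.minX / tileSize; tx <= t.maxX / tileSize; ++tx)
                        bins[ty * tilesX + tx].push_back(t);   // triangle touches this tile
            return bins;
        }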

    The NV20 is also said to offer significantly improved anti-aliasing - it's up to four times faster than the GeForce 2 Ultra.

    Much of NV20's performance gains will come from its support of 250MHz double data rate (DDR) SDRAM. The chip's core clock is said to be higher than that of the GeForce 2 Ultra, which runs at 250MHz.
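
    For a sense of scale: DDR memory moves data twice per clock, so at the leaked 250MHz and with the 128-bit memory bus typical of GeForce-class boards (an assumption - the leak only quotes the clock), peak bandwidth works out to roughly 8GB/s:

        #include <cstdio>

        int main() {
            // Back-of-the-envelope peak memory bandwidth for the leaked spec.
            const double clockHz  = 250e6;        // DDR base clock from the leak
            const double perClock = 2.0;          // DDR: two transfers per clock
            const double busBytes = 128.0 / 8.0;  // assumed 128-bit bus width
            std::printf("Peak bandwidth: %.1f GB/s\n",
                        clockHz * perClock * busBytes / 1e9);   // ~8.0 GB/s
            return 0;
        }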

    Both DirectX 8.0 and NV20 give a taster of the technology that will sit at the heart of Xbox, so the console is clearly shaping up quite nicely, and Nvidia is doing rather well out of its relationship with Microsoft. In processor terms, NV20 does more than one trillion operations per second and 100 billion floating-point ops per second.


    I HOPE G800 gets the same features!!!1
    Athlon Thunderbird 1.1Ghz@1.2~1.3+GHz Socket A 256Kb,Asus A7V dipswitches,GlobalWin FOP32-1 heatsink,GlobalWin 802 Advance ATX Case, 17" Sony Multiscan 200PST,384MB Crucial PC133 CAS=2,ATI Radeon 32Mb DDR,(Matrox Millenium G400 MAX 32MB 5ns SGRAM),IBM Deskstar 75GXP 15Gb UltraATA/100, Quantum Firebal EL 10.2Gb,Hewlett Packard DeskJet 970Cxi,Epson Perfection 1240U Scanner,Sound blaster Live!,Cambridge Soundworks 5.1,Creative PC-DVD 5X,CDR-RW Ricoh MP7040S@MP7060S(Tweaked from 4x--->6x with no problem),Adaptec SCSI 2920C,Diamond SupraExpress 56e PRO,Iomega Zip Drive.

  • #2
    I wonder if they can program display quality?
    Interesting info though thanks.
    Chief Lemon Buyer no more Linux sucks but not as much
    Weather nut and sad git.

    My Weather Page



    • #3
      Originally posted by The PIT
      I wonder if they can program display quality?
      Interesting info though thanks.


      I believe in the next generation of cards 2D quality will be on an even keel. I don't think that Nvidia, 3DFX, and ATI are ignoring the cries of the people.


      Let me just put it like this. My AIW Radeon does have better image quality than my G400 did! Yea, go ahead and flame me for the truth! All of my CAD systems (MasterCam, ShopCam, etc.) even look better. As for Q3A, UT, and the like... high resolutions as smooth as a baby's bottom!

      Blasted Matrox!! You better make me a G800 Marvel soon because I sound like a SELLOUT!!! *whimper*

      WMTJ



      http://www.skynary.com/winstonco1
      Not all will capitalize on the web's greatest opportunity! Only the few with a vision!



      • #4
        I ain't going to flame you but someone else might. To be honest, someone somewhere posted a load of images comparing various cards. Myself and a colleague couldn't tell the difference. Someone else younger than us looked at them and chose the G400 each time.
        The images included shots from the latest graphics cards, including the Radeon. He didn't know which one was the G400 and, to be honest, he didn't care.
        I'm thinking of putting my G400 in my machine and then buying another graphics card, possibly the gfarce MX, until the G800 comes out. If me peepers ain't too good I won't know the difference. I can always ask for a demo of the two at my local computer store.
        Sometimes the best image comes down to personal preference as well.
        Chief Lemon Buyer no more Linux sucks but not as much
        Weather nut and sad git.

        My Weather Page



        • #5
          It's about time that this feature made it into more hardware.

          The SB Live! cards have programmable chips onboard, so it's about time it moved into video cards.
          Phils PC Mods - a rough guide



          • #6
            Problem is, programmable logic circuits are usually harder to make fast and efficient than their fixed-function counterparts...

            For sound cards, this is obviously much less of a concern...




            --------------------
            ABIT BF6, Pentium III Katmai @663MHz, GW VOS32 Cooler, 256MB Crucial 7E SDRAM, Quantum Fireball Plus LM, Matrox G400 DH@160/200, Enlight 7237, 300watt TurboCool PS, and some fun with a Dremel!



            • #7
              The geforce is already somewhat programmable. Proof of that is the pixel shader thing in the gf2. The gf2 chip is quite a bit faster than the gf1, so they programmed the pixel shader into it. So there is really nothing new with the gf2, just pumped-up stuff (including more texture units).

              Closer to home for most people here, the g400 is also somewhat programmable. That's why about 6 months ago some people were talking about the g400 possibly doing T&L. It's just that the g400 already looks slow; implementing a T&L routine in it would just kill it.

              I think that the gf3 will be programmable in the sense that game programmers are going to be able to talk directly to the card using some hardware API.

              It is much better to develop a super-fast chip and program it than to try to develop a chip that does all the graphics stuff in hardware. It's much cheaper, and since hardware technology improves so fast, Nvidia is sure to have a great chip, and all they have to do is add stuff to the code and optimize it. Also, development gets done a lot faster, so no worries about the 6-month cycle.
              Salmonius



              • #8
                One thing about owning an NVIDIA card: when you read anything from the company as you keep tabs on it, you want to cringe at the excessively hyper-positive marketing speak. If you ask them what colour the sky is, they will tell you how their technology renders skies in every possible colour, from visible to infrared and ultraviolet. And you'll notice you still don't have an answer to the question.

                If NVIDIA would just fhut the suck up I'd consider buying from them again. Consider it, at any rate. I like not owning an NVIDIA card; that way I don't have to read NVIDIA marketing BS.

                Of course, Matrox is the other extreme. I can understand why SETI is so popular here; they should target Matrox headquarters for signs of life instead.

                There are companies that have marketing people who know what a question is and what an answer is, and what honesty is, but they don't have anything I want anyway.

