The AMD/ATI "why?" emerges

  • The AMD/ATI "why?" emerges

    Last year there was great interest in an ATI prototype video compressor that ran not in the CPU but in the GPU's pixel shaders on ATI x1000 series graphics boards.

    That little app could encode MPEG video up to 5x faster than most AMD or Intel CPUs. It was called the Avivo Transcoder. Impressive... and AMD took notice at last year's Computex in Taipei.

    The Avivo Transcode article from that time appeared on ExtremeTech.


    This kind of retasking, not only by ATI but others as well, gave rise to a new term: the GPGPU.


    GPGPU


    General-Purpose Computing on Graphics Processing Units (GPGPU, also referred to as GPGP and, to a lesser extent, GP^2) is a recent trend in computer science that uses the Graphics Processing Unit to perform computations rather than the CPU. The addition of programmable stages and higher-precision arithmetic to the GPU rendering pipeline has allowed software developers to use the GPU for non-graphics applications. Because of the extremely parallel nature of the graphics pipeline, the GPU is especially useful for programs that can be cast as stream processing problems.
    http://en.wikipedia.org/wiki/GPGPU
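
    To make "cast as stream processing problems" concrete, here's a rough plain-Python sketch of my own (not real shader code): a pure per-element kernel applied to a whole stream of data. Because no element depends on any other, a GPU could run every element in parallel; `map()` just expresses the same independence sequentially.

    ```python
    # Stream processing sketch: one pure kernel applied independently to
    # every element of a data stream. This is the shape of computation a
    # GPU's parallel pipeline is built for.

    def kernel(pixel):
        # Hypothetical per-pixel operation: brighten by 20% and clamp to 8 bits.
        return min(int(pixel * 1.2), 255)

    stream = [10, 100, 200, 250]          # input stream (e.g. pixel values)
    result = list(map(kernel, stream))    # no element depends on any other
    print(result)                         # [12, 120, 240, 255]
    ```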

    Now comes the merger.


    AMD, ATI and the GPU

    Breaking the monopoly

    By Guy Kewney, Newswireless.net
    Published Monday 24th July 2006 13:27 GMT

    Comment "We may lose business on Intel boards, but we will break the Intel monopoly." With these words, AMD's CFO Bob Rivet announced the takeover of graphics chip maker, ATI, offering a future of joined-up shared processing, split between CPU and GPU.

    The deal, announced today, goes back some time. Last year, at Computex in Taipei, it was apparent that ATI and AMD were falling in love with the idea of using the powerful graphics processor to run computer programs, not just for animating video.

    At that show, software developers were invited to the launch of the new dual-core AMD processors, with prototype applications that ran, not on the x86 central processor, but on the graphics chip. Examples included video editors which could handle the output stream live, in real time.


    This concept is probably beyond the grasp of the typical financial analyst, and in the short term, the City and Wall Street will probably panic, seeing only the probability that ATI will lose customers who make Intel motherboards, coupled with the possibility that end-users who want Nvidia graphics will have to buy Intel.

    "In 2008 and beyond, AMD aims to move beyond current technological configurations to transform processing technologies, with silicon-specific platforms that integrate microprocessors and graphics processors to address the growing need for general-purpose, media-centric, data-centric and graphic-centric performance," said the official statement.
    And the GPU isn't just for drawing pictures. Talk to any crypto expert and you'll find they are all trying to find ways of harnessing that extraordinary power. To quote Wikipedia: "Recent developments in GPUs include support for programmable shaders which can manipulate vertices and textures with many of the same operations supported by CPUs, oversampling and interpolation techniques to reduce aliasing, and very high-precision colour spaces. Because most of these computations involve matrix and vector operations, engineers and scientists have increasingly studied the use of GPUs for non-graphical calculations."

    And: "Because all these applications exceed an actual GPU's usage target, a new term, GPGPU is usually employed to describe them. While GPGPUs are the same chips as GPUs, there is increased pressure on manufacturers from "GPGPU users" to improve hardware design, usually focusing on adding more flexibility to the programming model."
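
    The point about matrix and vector operations in that quote is easy to see: each output element of a matrix-vector product is an independent dot product, so all rows can be computed at once. A minimal plain-Python illustration (mine, not from the article):

    ```python
    # Matrix-vector multiply: each output element y[i] is an independent
    # dot product of row i with x, so a GPU could compute every row in
    # parallel. Plain Python just does them one after another.

    def matvec(M, x):
        return [sum(m_ij * x_j for m_ij, x_j in zip(row, x)) for row in M]

    M = [[1, 2],
         [3, 4]]
    x = [5, 6]
    print(matvec(M, x))  # [17, 39]
    ```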

    The black hole of the processor has, at last, started to attract the GPU and the GPGPU. AMD feels that it has to move, now, before it becomes part of Intel, rather than part of a generic processor platform. If it is right, then the question of "how much did you pay for ATI?" is irrelevant. It may be a question of "How can you expect to survive, without ATI?"
    Applications

    The following are some of the non-graphics areas where GPUs have been used for general purpose computing:
    * Physically based simulation - Game of life, Cloth simulation, Incompressible fluid flow by solution of Navier-Stokes equations
    * Segmentation - 2D and 3D
    * Level-set methods
    * CT reconstruction
    * Fast Fourier Transform
    * Tone mapping
    * Sound Effects Processing
    * Image/Video Processing
    * Raytracing
    * Global Illumination - Photon Mapping, Radiosity, Subsurface Scattering
    * Geometric Computing - Constructive Solid Geometry (CSG), Distance Fields, Collision Detection, Transparency Computation, Shadow Generation
    * Neural Networks
    * Database operations
    * Lattice Boltzmann Method
    * Cryptography
    and more....
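
    To see why something like the Game of Life from that list maps so well to a GPU: each cell's next state is a pure function of the previous generation, so every cell can be updated in parallel (think one GPU "pixel" per cell). A small sketch of my own, treating cells outside the grid as dead:

    ```python
    # One Game of Life step. Every cell's new value depends only on the
    # previous grid, never on other cells' new values, so all cells could
    # be computed in parallel. Out-of-bounds neighbours count as dead.

    def life_step(grid):
        h, w = len(grid), len(grid[0])

        def cell(r, c):
            # Count the up-to-8 live neighbours inside the grid.
            n = sum(grid[r + dr][c + dc]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0)
                    and 0 <= r + dr < h and 0 <= c + dc < w)
            return 1 if n == 3 or (grid[r][c] and n == 2) else 0

        return [[cell(r, c) for c in range(w)] for r in range(h)]

    # A "blinker": three live cells flip between horizontal and vertical.
    blinker = [[0, 0, 0],
               [1, 1, 1],
               [0, 0, 0]]
    print(life_step(blinker))  # [[0, 1, 0], [0, 1, 0], [0, 1, 0]]
    ```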

    My take is that, with this tech coming on strong in not only scientific but also pro and consumer apps (compressors like the Avivo Transcoder, and later games, editing software, 3D etc.), AMD/ATI want to get an open-source GPGPU platform into the market before Intel comes out with a proprietary one that would require licensing.
    Last edited by Dr Mordrid; 27 July 2006, 19:13.
    Dr. Mordrid
    ----------------------------
    An elephant is a mouse built to government specifications.

    I carry a gun because I can't throw a rock 1,250 fps

  • #2
    So what happens to the relationship between Nvidia and AMD?
    Will Nforce cease to exist?
    Intel doesn't want Nvidia, and now that AMD bought ATI they wouldn't want Nvidia either, hmmm.

    I guess it's back to producing only GPUs for Nvidia!



    • #3
      Probably continue; they need each other, plus I doubt AMD/ATI will lock out other boards. It'll probably continue as it is now: the embedded GPU is the default unless another board is plugged into a PCIe slot.

      AGP backward compatibility? I doubt it severely. Too low a bandwidth for this kind of thing.

      No.

      Dunno. My guess is that Intel is already working on GPGPU in house.

      NForce has some advantages over ATI's motherboard chipsets in the short run, though with AMD working hand in glove with ATI their upcoming chipsets should be a blast, maybe with proprietary features NVIDIA can't duplicate (patented etc.). We'll see.

      Now think of this: GPGPU is not too different from what Matrox has done in the RT.x cards... offloading realtime calculations to the card's G-processor, but on a larger scale and at faster speeds.

      IMO this could mean that realtime video output, even in HD, could be offloaded to a fast GPGPU instead of depending on proprietary hardware like the AXIO. All that would really be needed is the I/O hardware.

      Yes, the MPEG encoding was that impressive, enough to demonstrate that much, much more is very possible.
      Last edited by Dr Mordrid; 24 July 2006, 19:41.
      Dr. Mordrid
      ----------------------------
      An elephant is a mouse built to government specifications.

      I carry a gun because I can't throw a rock 1,250 fps



      • #4
        I totally agree with you, Doc!

        I'm talking the long run as well; we'll see what happens.



        • #5


          courtesy of theinquirer.net
          Q9450 + TRUE, G.Skill 2x2GB DDR2, GTX 560, ASUS X48, 1TB WD Black, Windows 7 64-bit, LG M2762D-PM 27" + 17" LG 1752TX, Corsair HX620, Antec P182, Logitech G5 (Blue)
          Laptop: MSI Wind - Black



          • #6
            We can only hope for both

            If they pull this off we could see a new open source platform for GPGPU develop instead of the old Intel hegemony rearing its ugly head again.

            Yes, AMD/ATI could have proprietary features on an open-source platform, but I'd rather have that than Intel locking the whole platform up again.

            Dr. Mordrid
            ----------------------------
            An elephant is a mouse built to government specifications.

            I carry a gun because I can't throw a rock 1,250 fps



            • #7
              I honestly don't want AMD to capture the market. I want them to co-own the market with Intel. The competition between the two companies has really pushed the CPU market forward at warp speed while keeping prices down. As long as there are only two standards, I don't mind either company. Both are being forced to adapt, which is good for everyone.
              “Inside every sane person there’s a madman struggling to get out”
              –The Light Fantastic, Terry Pratchett



              • #8
                And in reality this won't change things that much for us, in the near future or beyond...
                While ATI might stop making Intel chipsets, I doubt Nvidia will stop making AMD chipsets (or that AMD will try to stop them), and if Nvidia decides to shoot itself in the foot, VIA and SiS would love to eat their market share.
                What probably will happen fast is interesting things in the laptop world.
                If there's artificial intelligence, there's bound to be some artificial stupidity.

                Jeremy Clarkson "806 brake horsepower..and that on that limp wrist faerie liquid the Americans call petrol, if you run it on the more explosive jungle juice we have in Europe you'd be getting 850 brake horsepower..."



                • #9
                  Originally posted by Jammrock
                  I honestly don't want AMD to capture the market. I want them to co-own the market with Intel...
                  Relax - I work in an IT dept for a company with ~US$60m turnover, and when I wanted an AMD machine I had to build it myself!

                  Remember, Intel bosses the market and has most end-suppliers by the balls. My preferred resellers of IBM, HP, Fujitsu-Siemens etc. all don't sell AMD servers at all; I even asked for a quote on an Opteron server and they sent me Xeon ones!

                  Hence having to build an Opteron server myself, which really is a bit of a waste of time when we should just be ordering them in (although it is always fun if there's no firefighting to do).



                  • #10
                    Interesting, Doc!

                    Anyone else seeing similarities with co-processors? (conceptually that is)

                    Now what is preventing manufacturers from making a multi-core where one of the cores has a GPU-like architecture?

                    No doubt that Intel will follow soon (I too like a bit of competition).

                    Jörg
                    pixar
                    Dream as if you'll live forever. Live as if you'll die tomorrow. (James Dean)



                    • #11
                      Originally posted by Whirl-Secret
                      ...my preferred resellers of IBM, HP, Fujitsu-Siemens etc etc all don't sell AMD servers at all
                      Even though all three make Opteron-based servers
                      When you own your own business you only have to work half a day. You can do anything you want with the other twelve hours.



                      • #12
                        Originally posted by VJ
                        Interesting, Doc!

                        Anyone else seeing similarities with co-processors? (conceptually that is)

                        Now what is preventing manufacturers from making a multi-core where one of the cores has a GPU-like architecture?

                        No doubt that Intel will follow soon (I too like a bit of competition).

                        Jörg
                        Not a thing stopping them from placing circuits akin to programmable shaders on the die in addition to the onboard graphics. One or two of those and it would scream, though I'd first look at reworking the MPU section.

                        Dr. Mordrid
                        ----------------------------
                        An elephant is a mouse built to government specifications.

                        I carry a gun because I can't throw a rock 1,250 fps



                        • #13
                          Now what is preventing manufacturers from making a multi-core where one of the cores has a GPU-like architecture?
                          But it wouldn't be a very good idea to do so. CPUs are mostly general-purpose machines. Video cards are very specialized. It just so happens that video card architecture is highly compatible with certain software problems, such as massively parallel calculations. Their functional units and memory systems are pretty favorable for these things.

                          However, putting that functionality on a CPU would be a negative overall. You can't put something on a CPU without taking something away.
                          Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



                          • #14
                            /me is dreaming

                            AMD will remake the Geode CPUs and put a simple but fast 2D graphics core in them, so they won't need a separate board with a graphics chip. Thanks to some ATI technology the Geode will be smaller and eat less energy. New PDAs will arrive with a nice CPU with the graphics chip inside, so they'll be fast and use less power.

                            /me wake up
                            A CRAY is the only computer that runs an endless loop in just 4 hours...



                            • #15
                              We'll see. The Geode used to be done (pretty sure) by the AMD site in Longmont, which they've recently fired 180 people from. But they're opening up shop in FtC, with some 200+ planned jobs there. Some of the Longmont people are already there, but *nobody* knows what for.
                              Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.

