Programmable engine

  • Programmable engine

    If I'm not mistaken, even the G400 had a programmable engine in it. It was called WARP.
    Looking through these forums I read that it is probably used for triangle setup (even if nobody is sure about that) and that it could be used to do almost anything, but it lacked horsepower.
    I am just speculating now. But what if the new card had a very powerful programmable engine? It would be possible to ship a D3D 8.1 (OpenGL 1.3) card right now, and later switch it to D3D 9 or 9.1 (OpenGL 2) with just a BIOS change.
    What do you guys think?
    NocturnDragon
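    To give a feel for the kind of per-triangle math a microcoded setup engine (like WARP's rumored triangle-setup role) would run, here is a minimal edge-function sketch. This is a hypothetical Python illustration of the general technique, not anything from Matrox's actual microcode.

```python
# Hypothetical sketch of triangle setup via edge functions -- the sort of
# fixed per-triangle math a programmable setup engine could be microcoded
# to compute. Illustration only, not real WARP code.

def edge(ax, ay, bx, by):
    """Coefficients (A, B, C) of the edge function
    E(x, y) = A*x + B*y + C for the directed edge (a -> b)."""
    return (ay - by, bx - ax, ax * by - ay * bx)

def setup_triangle(v0, v1, v2):
    """Triangle setup: one edge function per side."""
    return [edge(*v0, *v1), edge(*v1, *v2), edge(*v2, *v0)]

def inside(edges, x, y):
    """A sample point is inside if all three edge functions agree in sign."""
    vals = [a * x + b * y + c for (a, b, c) in edges]
    return all(v >= 0 for v in vals) or all(v <= 0 for v in vals)

edges = setup_triangle((0, 0), (10, 0), (0, 10))
print(inside(edges, 2, 2))   # point inside the triangle -> True
print(inside(edges, 9, 9))   # point outside the triangle -> False
```

    Once the three (A, B, C) triples are set up, the rasterizer only needs three multiply-adds per pixel, which is why setup is a natural job for a small programmable engine.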

  • #2
    yeah, I remember that too, there was a lot of discussion about whether it would be possible to implement T&L on the G400.

    back to topic:
    if they plan on keeping the Parhelia core for the next 2-3 years, then that approach would make it very easy for them.
    and if they would offer next-gen (and next-next-gen) features with a simple BIOS update, then the Parhelia would be a really, really great card.
    a BIOS update like that would probably please a lot of customers, and about adding features with patches: isn't that what they are already doing with the effects add-on for the RT2500? just on a higher level?
    just saying the idea isn't new to them.
    This sig is a shameless attempt to make my post look bigger.



    • #3
      Depends on how programmable the GPU was. The BIOS just controls parts of the I/O, not the actual processing of information. If the GPU was highly programmable it would be possible, but it wouldn't be faster than, say, a dedicated DX9 card. They would have to emulate the additional DX9 functionality instead of using hardcoded paths. It could still potentially be pretty fast... depending on the implementation.

      Of course, if Matrox left space on the die for "future expansion" they could easily add functionality for future DX9/10 abilities on future cores.

      Jammrock
      Last edited by Jammrock; 22 April 2002, 08:43.
      “Inside every sane person there’s a madman struggling to get out”
      –The Light Fantastic, Terry Pratchett
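      The hardcoded-versus-emulated split above can be sketched as a capability dispatch: features the silicon implements natively take the fast fixed path, anything newer falls back to the slower programmable engine. A hypothetical Python illustration (the feature names and the set of native features are invented for the example):

```python
# Hypothetical capability-dispatch sketch: native features run a hardcoded
# path, newer API features are emulated on the programmable engine.
# Feature names and the NATIVE_FEATURES set are invented examples.

NATIVE_FEATURES = {"dx8_vertex_shader", "dx8_pixel_shader"}

def run_feature(feature):
    """Pick the execution path for a requested API feature."""
    if feature in NATIVE_FEATURES:
        return f"{feature}: hardcoded path (fast)"
    return f"{feature}: emulated on programmable engine (slower)"

print(run_feature("dx8_pixel_shader"))
print(run_feature("dx9_pixel_shader_2_0"))
```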



      • #4
        Originally posted by TDB
        yeah, I remember that too, there was a lot of discussion about whether it would be possible to implement T&L on the G400.
        Yes there was, and even on the G200 if I remember correctly!
        According to the latest official figures, 43% of all statistics are totally worthless...



        • #5
          Yes, it was present even on the G200.
          NocturnDragon



          • #6
            programmable engine makes sense to me.
            what about swappable GPUs? like you keep the board but change your old GPU for an updated one for some $. will probably save some money and will ensure people keep buying Matrox and not nVidia and ATI. sounds like an incredibly good idea to me

            does it even make sense?
            no matrox, no matroxusers.



            • #7
              Originally posted by thop
              programmable engine makes sense to me.
              what about swappable GPUs? like you keep the board but change your old GPU for an updated one for some $. will probably save some money and will ensure people keep buying Matrox and not nVidia and ATI. sounds like an incredibly good idea to me

              does it even make sense?
              If I remember correctly there was some talk about swappable GPUs in the big G800 threads a long time ago... They mentioned all future GPUs being pin-compatible so they could use the same PCB and easily switch GPUs..

              then again it doesn't make sense to me, because you won't gain anything by replacing a GPU if the memory interface stays the same.
              PIII 1Ghz|AbitSa6R|512mb Kingston|Matrox Parhelia 512 Retail|80gb WD & 30gb IBM 75gxp|Diamond MX300 A3d 2.0|36xcdrom|6x32AopenDVD|Sony DRU500A|Intel Pro 10/100 S|IIyama Vision Master Pro 450 | Celly 300a@450 'server' powered by a G400MAX



              • #8
                Originally posted by Michel


                If I remember correctly there was some talk about swappable GPUs in the big G800 threads a long time ago... They mentioned all future GPUs being pin-compatible so they could use the same PCB and easily switch GPUs..

                then again it doesn't make sense to me, because you won't gain anything by replacing a GPU if the memory interface stays the same.
                So an AMD XP 1500+ is no faster than an AMD XP 2200+ using the same mobo and memory?
                According to the latest official figures, 43% of all statistics are totally worthless...



                • #9
                  No it isn't, not by much anyway. You're likely to do better by taking an XP1500 off a standard SDRAM board and putting it on a DDR board.

                  P.S. The 2200 should be the faster chip there, buddy.
                  Last edited by Wombat; 22 April 2002, 13:38.
                  Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.
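                  The point about the DDR board is really a memory-bandwidth argument: at the same clock, double data rate doubles peak bandwidth. A rough back-of-the-envelope calculation with the standard nominal figures for PC133 SDRAM and PC2100 (DDR266) on a 64-bit bus, used here purely as an illustration:

```python
# Rough peak-bandwidth arithmetic for the SDRAM-vs-DDR point.
# Nominal figures (64-bit bus, 133 MHz), used only as an illustration.

def peak_bandwidth_mb_s(bus_bits, clock_mhz, transfers_per_clock):
    """Peak bandwidth in MB/s: bytes per transfer * transfers per second."""
    return (bus_bits // 8) * clock_mhz * transfers_per_clock

pc133  = peak_bandwidth_mb_s(64, 133, 1)  # single data rate SDRAM
ddr266 = peak_bandwidth_mb_s(64, 133, 2)  # same clock, double data rate

print(f"PC133 SDRAM: {pc133} MB/s")   # 1064 MB/s
print(f"DDR266:      {ddr266} MB/s")  # 2128 MB/s
```

                  Same chip, same clock, roughly twice the peak bandwidth to feed it, which is why the board swap can matter more than a faster CPU on the same memory.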



                  • #10
                    think about this from a business point of view.

                    If Matrox say "buy this DX8.1 card now, and in 6 months we will upgrade it for free to DX9 specs", and then MS does something strange to DX9 that means they can't upgrade, where is Matrox then?

                    On the other hand, if Matrox sell us a DX8.1 part now, and say it's upgradable, at a cost of course, then they could be on to a winner.

                    By the time the card is relegated to the bottom end (e.g. the G200 at the moment), wouldn't it be nice to know you could give Matrox say $50 for a firmware upgrade that would make it at least able to play the most recent games, even if it was painfully slow?

                    Have updates/fixes free for 3 years, while the card is under warranty, then charge users for an upgrade. It's a win-win situation.

                    The marketing guys would love it. They could sell a G200-type card (relative in speed to the newest card) and say it's DX8.1 compliant, even though it could only do 0.1 fps in current games.

                    The thing I'm worried about at the moment is power usage though. These GPUs are getting to the stage where they are more complicated and powerful than CPUs that were current only a year or two ago. How can they do that much processing using only AGP slot power?

                    I still think 3dfx had the right idea with the external power supply. Costs more, but removes ALL the power problems.

                    Ali
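                    For scale on the power question: the AGP slot is commonly cited as delivering roughly 25 W, so any card drawing more than that needs an auxiliary connector or, as 3dfx did, an external supply. A hedged back-of-the-envelope check (the 25 W budget is the commonly cited spec figure; the card wattages below are invented examples, not measurements):

```python
# Illustrative power-budget check for the AGP slot-power question.
# AGP_SLOT_BUDGET_W is the commonly cited ~25 W figure; the card draws
# below are invented examples, not measured numbers.

AGP_SLOT_BUDGET_W = 25

def needs_extra_power(card_draw_w, slot_budget_w=AGP_SLOT_BUDGET_W):
    """True if the card cannot run from slot power alone."""
    return card_draw_w > slot_budget_w

for name, watts in [("modest card", 18), ("hungry card", 40)]:
    verdict = "needs aux power" if needs_extra_power(watts) else "slot power ok"
    print(f"{name} ({watts} W): {verdict}")
```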



                    • #11
                      I think the problem would be the connection from the chip to the card: a connection without direct soldering works with processors at bus speeds of up to 133 MHz (and no, the P4 isn't more than 100 MHz quad-pumped), but I doubt it'd work with the high bus speeds graphics chips use.

                      As always, please correct me if I'm wrong.

                      AZ
                      There's an Opera in my macbook.



                      • #12
                        Ali, don't sweat the power issue... it'll be fine, no external power connector needed
                        "Be who you are and say what you feel, because those who mind don't matter, and those who matter don't mind." -- Dr. Seuss

                        "Always do good. It will gratify some and astonish the rest." ~Mark Twain



                        • #13
                          I completely agree with Ali on this topic!
                          NocturnDragon



                          • #14
                            I don't, and the new GPUs already have more transistors than the current CPUs, at least I think they do. And power isn't a problem; the new gen is made at 0.13 micron, and thus consumes less power.
                            Main Machine: Intel Q6600@3.33, Abit IP-35 E, 4 x Geil 2048MB PC2-6400-CL4, Asus Geforce 8800GTS 512MB@700/2100, 150GB WD Raptor, Highpoint RR2640, 3x Seagate LP 1.5TB (RAID5), NEC-3500 DVD+/-R(W), Antec SLK3700BQE case, BeQuiet! DarkPower Pro 530W



                            • #15
                              The latest graphics chips aren't even close to the latest processors. Graphics chips haven't even broken 100 million transistors yet.

                              I disagree with Ali. Socketing chips makes them slower and more expensive. Also, can you imagine the CS nightmare when people f*ck up installing their own chips?
                              Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.
