
FiringSquad writing about Matrox


    The grand patriarch of the PC graphics market, Matrox has long been well regarded for its high-end, high-quality products. I used to use Number Nine cards when AutoCAD was young, but the industry moved almost universally to Matrox MGA cards when they hit the scene. The 2D clarity was beyond incredible, and support for high resolutions at that level of clarity made the demanding task of creating and editing complex CAD files much easier for those of us who toiled for hours in front of the screen.

    One of the things that I always appreciated about Matrox was the quality of their drivers for CAD and desktop environments. In those days, if you had trouble, you could call them up and get expert help in minutes. They consistently worked on their drivers so that they interacted well with AutoCAD’s complex ADI interface, and later, when we made the move from DOS to Windows (a real productivity killer for us old command-line script-writing geeks), they were right there with us as we forged through the mess that was Autodesk’s accelerated display architecture.

    Matrox gained an excellent reputation through word of mouth, and later through industry publications. Consumers were begging for a broader all-purpose solution from Matrox, and they delivered in spades. Their Millennium and Mystique products handled the business and consumer markets at the same time, with the Mystique actually adding basic video capture capabilities to boot. Both products were successful, but the Millennium in particular gained widespread praise from all facets of the industry.

    Their Rainbow Runner series added TV tuner capabilities and enhanced video capture features to the product line, and later, after the updated Matrox Millennium was introduced, came an incredible advancement that they are still heralded for: Matrox introduced multi-monitor support in a single-card solution. This was a huge, huge boon for business users, particularly the CAD and financial services markets. The next step was to move this technology to the consumer side, which they did with the highly revered G400 series. This line of cards is still looked at fondly by anyone who got their hands on one. Not only did they offer their famous “Dual-Head” option so consumers could run multiple monitors, but their 3D quality and speed were also very competitive with other 3D cards on the market. The card became so popular in the gaming market that they came out with a higher-speed version called the G400 Max. They had a sure-fire winner on their hands, and they found themselves with a few more options than they were used to having, and potentially an exciting new direction.



    At The Crossroads
    Matrox had found themselves smack in the middle of the gaming market, and the rewards were potentially huge. However, focusing resources on the volatile consumer gaming market would be a very risky move and require a great deal of capital investment to remain cutting edge. They were faced with a dilemma. They were already cutting edge in the business market, and that demanded almost all of their attention. Should they enter a potentially lucrative market, or should they take a step back and look at what got them there in the first place?

    Much to their credit, Matrox decided to put their established customer base first and continue to devote maximum resources to the type of products they had long been known for. Yes, giving up on the high-end gaming market may have been an unpopular decision for some, but it was a heralded move by Matrox loyalists and longtime customers.

    It takes guts to make a call like that, knowing that you can’t please everybody, but Matrox stood true to their convictions and rediscovered their roots. They refocused their vision with laser precision on the business market, producing affordable, versatile products with their Dual-Head technology and top-notch 2D quality and performance. So far their decision seems to have paid off, as evidenced by the massive casualties in the 3D gaming market. As it sits, Matrox is still a well-respected name in the corporate market, and is also fondly thought of in the consumer market thanks to their eTV line of products. An example of how to do things right, perhaps?

  • #2
    That was *too* favorable. Matrox has done a lot of cool things, but the mistakes were left out, too.

    ICDs? eDualhead? HEADCASTING?
    Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



    • #3
      That article was a bit something and nothing for me... Matrox - great, but now let's not bother with them... ATI - wonderful, but let's (once more) reiterate their failings... nVidia - just how can anyone think or even say anything wrong about them...

      Just what was the point? Matrox haven't made a decent (all-round capability) card since the G400 Max - their inclusion in the article was plain waffle. It takes guts to turn your back on your customers? Hmm, not quite what I'd say...

      And on the subject of 'massive' casualties in the 3D market - WHERE? OK, 3dfx died, but that was really a marketing issue - you have a very happy deal with lots of OEMs, so you pull the rug and decide to make the cards yourselves... I didn't think that plan was all that sound - it was based on brand, but Voodoo was WELL thought of before they made that move?! There aren't all that many other actual chip manufacturers that have ceased trading (due to the games/3D market). Most didn't even make it into the market - for example Number Nine, who properly tried once then gave up.
      Last edited by Reckless; 24 February 2002, 03:43.
      Cheers, Reckless



      • #4
        Well, PowerVR just left the building...

        About Matrox: most of their customers truly are from the business/CAD world. So Matrox dumped the gaming world (which isn't as big) in favor of their old reliable business/CAD world. FS thinks it was probably the right thing to do, since Matrox still exists.

        To count all the guys that left which I still remember:

        Tseng Labs
        S3
        STB (bought by 3Dfx)
        No. 9
        Hercules (bought by Guillemot - they even had their own chips)
        Diamond (bought by ATI?, they had the FireGL and Stealth families)
        Real3D (Intel chipset)

        There were many, many others back in the days when the Millennium (1) was the king.



        • #5
          Sorry, but my point was that there were no major manufacturers providing cards at the time of the crossover from 2D acceleration to 3D game support. There were a lot of wannabes, but did anyone really try to run 3D back then (on an STB Trio64V)? I did, and it was a laugh. Your list (albeit accurate) would really only identify those manufacturers who didn't really hit the mark at all - most manufacturers were concerned with bit-blitting speeds at the time. Even the name now synonymous with 3D speed - Hercules - was in this camp.

          Tho true enough, S3 I'd forgotten about - eek, but did they make some naff cards.

          Matrox probably did the right thing about not focusing on the gaming market, but they should've released another G400 Max-equivalent card by now. For the casual gamer this would have sufficed, but each successive release has not matched the Max for performance. So, if the Max has dual head, good RAMDACs, reasonable 3D speed, as good as you'd want 2D speed, and good solid drivers (as ever from Matrox) - PLEASE tell me the point of the G450 and even the G550?

          And DAMN, ST Microelectronics have indeed left the building - BUT backing a console that got put to rest wasn't really in their favour. Their PC kit was always very late (probably Dreamcast-influenced) and therefore under-powered/under-regarded by the market.

          P.S. I'm not ranting, angry or bitter. I'd simply like a bit more choice when I come to replace my very reliable G400 Max. I don't consider 10 different manufacturers each with 5 variants of the latest nVidia chipset a 'choice'.
          Cheers, Reckless



          • #6
            Hehe, what flavor would you like your nVIDIA, sir?

            Damn, I had to choose between Mill1 / Imagine 128 / ATI Mach64, etc. Our children will choose an nVIDIA flavor...

            If you check sites like www.2cpu.com you'll see they still use their old trusty G400 because it works best in SMP.

            The Jews have a saying about the lack of good leaders that goes:

            "From Moshe (Moses) to Moshe (the Rambam), there was none like Moshe."

            Well, we at the MURC can say that about the Millennium 1, 2 and the Millennium G400. Too bad for us.



            • #7
              Without 3D gaming, an nVidia-based adapter's worth does not match its price when compared to a Matrox card...

              To get both quality and performance, you have to pay the combined amount of money for a Canopus-made adapter. But it costs $$$$, and some Matrox-only features are still unavailable...

              With professional 3D applications, I believe most companies will not care about the price but about the actual performance and reliability of a 3D chipset's OpenGL functions. As a result, 3Dlabs is still the best choice, although its low-end market is being eaten by nVidia.

              But I have to admit, nVidia still keeps improving rapidly. Although the quality of some of its features still cannot match Matrox's, that does not mean it will never catch up with Matrox.

              From my point of view, it is unwise for Matrox to totally ignore 3D performance, even if it is not useful for certain users. From a marketing standpoint, it is also very dangerous that their 3D performance cannot even match cards costing half the price of a Matrox adapter. It also seems that 3D performance cannot be easily or dramatically increased without a revolutionary architectural redesign. Once the competitors offer the same features and powerful 3D performance in Matrox's price range, Matrox will no longer be competitive.
              P4-2.8C, IC7-G, G550



              • #8
                I agree that Matrox made the right decision about dropping 3D for a while.

                The G450 makes sense. Not much extra cost in development, especially the drivers, which are pretty much identical to the G400 ones; therefore Matrox had a card that costs significantly less to produce than the G400, yet has the same features (give or take).

                The G550 makes no sense. The drivers must have cost some serious $$ to make, as there is a T&L engine in there, even if it is disabled/cut out. I realise it was meant to be a G800, but why spend all the money writing drivers for the 64-bit version, when just by adding a few more pins/traces they could have had a 128-bit version which would at least be as competitive as a GeForce2 MX400? What they ended up with is a card that has the same features as a G450, but cost more to make, and therefore is less desirable in the market.

                Does anybody know what the R&D cost breakdown is for hardware vs software?

                I'm guessing 80% of resources spent on hardware, 20% on software (drivers, eDualhead etc). The other thing to remember is each card released must have an ongoing cost related to support, both in terms of staff for the forum/phone, and updating drivers for 3-4 years (G200 drivers still come out, etc).

                This makes the G550 even more confusing. If they weren't going to do the full G800 release, why not just drop it completely? I can't imagine it's a big seller, yet the support/driver cost would be the same.

                Just my thoughts

                Ali



                • #9
                  Originally posted by Ali

                  I agree that Matrox made the right decision about dropping 3D for a while.

                  The G550 makes no sense. The drivers must have cost some serious $$ to make, as there is a T&L engine in there, even if it is disabled/cut out. I realise it was meant to be a G800, but why spend all the money writing drivers for the 64-bit version, when just by adding a few more pins/traces they could have had a 128-bit version which would at least be as competitive as a GeForce2 MX400? What they ended up with is a card that has the same features as a G450, but cost more to make, and therefore is less desirable in the market.

                  I don't know where I read it, but I believe someone said that the manufacturing costs for each G550 were significantly lower than those of a G450. That might have had something to do with it.

                  /Ulf



                  • #10
                    Originally posted by Ali
                    The G550 makes no sense. The drivers must have cost some serious $$ to make, as there is a T&L engine in there, even if it is disabled/cut out. I realise it was meant to be a G800, but why spend all the money writing drivers for the 64-bit version, when just by adding a few more pins/traces they could have had a 128-bit version which would at least be as competitive as a GeForce2 MX400? What they ended up with is a card that has the same features as a G450, but cost more to make, and therefore is less desirable in the market.
                    Ant made a comment on the Crystal Ball that when the "G800" (i.e. G550) was being developed, the R&D budget was cut back big time for some reason or another... that's why we wound up with such an oddball card... guess they had to try and get something out of it.

                    Scott
                    Why is it called tourist season, if we can't shoot at them?



                    • #11
                      Sorry, but my point was that there were no major manufacturers providing cards at the time of the crossover from 2D acceleration to 3D game support. There were a lot of wannabes, but did anyone really try to run 3D back then (on an STB Trio64V)? I did, and it was a laugh. Your list (albeit accurate) would really only identify those manufacturers who didn't really hit the mark at all - most manufacturers were concerned with bit-blitting speeds at the time. Even the name now synonymous with 3D speed - Hercules - was in this camp.
                      I remember playing Jedi Knight with my Matrox Mystique (with the 2MB upgrade); it could run Jedi Knight at 512x384x16 as fast as software could run 320x200x8. This was my first gaming experience with 3D acceleration, and even though it wasn't fast (about 30 fps I think, I didn't know how to benchmark back then), it was still a lot better than software. 16-bit color was really a big improvement!
                      I later got a Voodoo, and I never used the Mystique's 3D acceleration since.
                      This sig is a shameless atempt to make my post look bigger.

