My Perfect Video Card (TM)


  • My Perfect Video Card (TM)

    Here is an example of "My Perfect Video Card (TM)"

    A Matrox G-400 Marvel with 32MB of RAM and a 350MHz RAMDAC.

    OpenGL Drivers that give you the same performance no matter what CPU you are using.

    For example, a Celeron 500 will give you similar frame rates to a Pentium II 500, which should be similar to a Pentium III 500. They don't necessarily have to be exactly the same, but I think it is a bit of a cop-out not to be able to produce similar frame rates on similarly rated (in MHz) CPUs. Other video manufacturers can do it; Matrox should be able to as well!

    Some decent NT OpenGL drivers might be nice as well. (For those who don't know, NT video drivers are a very difficult beast to write, and I almost forgive Matrox for their failings here, as I am aware of the technical difficulties involved with the way NT works. Someone else can explain why, if you are really interested to know.)

    What do you all think: am I a dreamer, am I asking for too much, or do some of you think that it will actually occur? (Earth soon, not Matrox soon.)

    To be honest I hope "My Perfect Video Card (TM)" becomes a reality soon, as the Voodoo 4 is not going to make it for Christmas, and I have a sneaking suspicion that the GeForce may struggle to be out in quantity by then as well.

    If this is so, then "My Perfect Video Card (TM)" should sell like hot cakes, as nothing can match it for features, speed and video quality. That is why I call it "My Perfect Video Card (TM)", and it is something I would be prepared to put my money down for, and I suspect a huge number of other people would be as well!

    Here's hoping.

  • #2
    Where is Ant anyway?

    Nothing about the G-400 Marvel on the News site, and definitely no review either. I thought he would have been the first!

    Damn

    [This message has been edited by Whisper (edited 09-25-1999).]



    • #3
      Are you, Greebe, or anybody, seriously telling me that if you picked up just about any other brand of video card, you would not get similar frame rates using a Celeron 500 compared to a Pentium III 500?

      There will be only a 10% difference at worst, and I think I am being generous to the Pentium III by saying it would give you that much extra performance.

      I think it is you, Greebe, who should rethink what you know about processor technology, because either Matrox has a serious hardware issue with OpenGL, i.e. it is not as good as a TNT2, or it is a software issue (read: drivers).

      These are the facts:

      1. A TNT2 will give you similar frame rates in OpenGL as long as you are using similarly rated (in MHz) CPUs; it does not matter what type, only the speed.

      2. A Matrox G400 is almost as fast as, if not faster than, a TNT2 in DirectX.

      3. A Matrox G400 is only as fast as a TNT2 in OpenGL as long as it "borrows" CPU processing power by using SSE or 3DNow instructions, i.e. the Turbo GL driver.

      These are the facts as I see them, does anybody deny that they are correct?

      I am no expert on video software APIs and how video chipsets deal with them, but it seems to me that either the Matrox G-400 has a basic hardware issue with OpenGL (I sincerely hope not) or Matrox still cannot write a decent OpenGL driver. (As much as I hate to say it, I hope this is still the case; at least that is the more acceptable option.)

      Anyway, this is supposed to be the Matrox users' bulletin board, not an arm of the Matrox Graphics Inc. marketing department. If we are not going to get the truth and the facts here, then where else are we going to get them from?

      My initial post was about what my perfect video card was, and I specifically said it was a Matrox Marvel G-400 as long as it came up to standard in other areas.

      If you did not get it, some of it was a wish list, but other parts, if you have a brain, you could work out were absolutely essential; otherwise, no deal.

      I forgot to add, welcome back Ant, nice to see you and your computer are still alive.

      [This message has been edited by Whisper (edited 09-29-1999).]



      • #4
        Hey Whisper, I would have been the first if it were not for a major Windows meltdown that trashed my system. Took me two days to recover: HD reformat, reinstall, etc. etc.



        • #5
          Hey Ant, I just edited my last post, but you beat me to it.

          Yes, formatting is a bitch. Have you tried Ghost? At least it gets you up to speed quicker in the event of a disaster.

          You realise that when I saw the announcement for the G-400 Marvel, your site was the first site I came to!

          Oh well, good to see you back, hope to see that review real soon.

          [This message has been edited by Whisper (edited 09-27-1999).]



          • #6
            1. A TNT2 will give you similar frame rates in OpenGL as long as you are using similarly rated (in MHz) CPUs; it does not matter what type, only the speed.


            TNT2 drivers have 3DNow!/SSE support built in, and fill rate limits Q3Test, for example. The G400 just has an ICD that is not optimised for games; that is apparently why there will be a MiniGL.

            2. A Matrox G400 is almost as fast as, if not faster than, a TNT2 in DirectX.


            Depends on your system. I have found that a normal TNT1 is faster in D3D at lower resolutions (800x600) than a G400 overclocked to maximum levels. Same deal with OpenGL. Friends with TNT1s and the same CPU as mine are getting around 4000 in 3DMark99 Max at 800x600, for instance, versus my all-time best score of around 3400 for the G400 at 145/194 (K6-III at 448). Obviously the drivers or the hardware like faster CPUs more than the TNT2 does, and it is only the fill rate that saves it at higher resolutions. Just think how much faster those higher resolutions would be with even faster drivers, and I'm talking D3D here, not OGL.

            [Oh, and I'd expect replies here to go along the lines of: get an Intel CPU, get an even faster Intel CPU, get a faster AMD CPU, stop whining, stop whining about whining, stop whining about whining about whining, who cares about low resolutions, yeah but look at the image quality, Matrox rules, everything else sucks, don't talk about foreign hardware in Matrox hardware forums, don't talk about anything to do with hardware, what you should be talking about is the minutiae of your personal lives, or you should just pass insider comments back and forth like ops in an IRC channel, etc.]

            3. A Matrox G400 is only as fast as a TNT2 in OpenGL as long as it "borrows" CPU processing power by using SSE or 3DNow instructions, i.e. the Turbo GL driver.


            I agree that there is a difference in the drivers; what that difference really is, I have no idea. Maybe it really is a hardware issue (rumours of an inefficient hardware triangle setup engine still abound); until the drivers prove otherwise, I can't rule it out. I have learned from NVIDIA not to trust what a spec sheet seems to imply.

            PS. Ghost is cool, and a freeware program called ZPart does a similar task.



            • #7
              I use PowerQuest Drive Image. 15 minutes to restoration if I need it.
              MSI K7D Master L, Water Cooled, All SCSI
              Modded XP2000's @ 1800 (12.5 x 144 FSB)
              512MB regular Crucial PC2100
              Matrox P
              X15 36-LP Cheetahs In RAID 0
              LianLiPC70



              • #8
                Himself

                1. I still stand by my original statement that a TNT2 running on a Pentium III at, e.g., 500MHz does not generate significantly faster frame rates than on a Celeron at, e.g., 500MHz. (Frames per second is all I care about; people can stick their fill rates, triangle rates and synthetic benchmarks up their backsides, because all I am interested in is the final number of frames per second that I actually get to see on my screen.) My point being that SSE and 3DNow do not seem to make huge differences, except in artificial benchmarks and the odd application, which I concede do exist.

                This leads me to then ask, why the hell can't Matrox do the same??

                They also managed to make huge inroads in frames per second by using SSE and 3DNow, so it is good to hear that the driver writers are not completely incompetent. I know that's harsh, but I am still a bitter G-200 owner. Or maybe the guy(s) on the Turbo GL driver are ****ing gurus, who knows.

                2. I liked the end of your point 2. It sometimes seems to be the case around here and is quite an accurate summation of what sometimes occurs. As for your problems at low resolutions when competing against guys with TNT1s, I don't know; I sympathise with you if this is in fact the case, but how do your numbers compare to other people's numbers with similarly specced hardware?

                Oh well.

                Cheers

                [This message has been edited by Whisper (edited 09-27-1999).]



                • #9
                  Hmmm. Good idea, Whisper: a CPU-independent video card. You are talking about a geometry setup processor. The GSP would process all the triangles, cubes, etc., in other words a full geometry setup engine.

                  Most recent 3D cards just have a triangle/cube setup engine. No 3D card has a full geometry setup engine, including the GeForce 256 (excluding professional workstation 3D cards like SGI's). So most 3D cards' speed depends on CPU speed, because the geometry setup work gets passed to the CPU. That is why any 3D card will get a good result with a fast CPU, even if the card has a very low fill rate.

                  The fill rate just tells us how many lines or pixels the card can produce in a second. If a card had a 1000Mpixel/s fill rate but no setup engine, all the setup work would be passed to the CPU. If we tested Quake III on a PIII 500, I think it would only get 2-3 fps at 640x480x64k, because 99% of the CPU power would be needed to calculate matrix after matrix to set up the polygons and the geometric detail. (At 640x480, even 60 fps with some overdraw is well under 100Mpixel/s, so the fill rate is nowhere near the limit; the CPU is.)

                  So I think the next generation of 3D cards will have a fast GSP. It will let a card on a plain Pentium run like it does on a PIII, with the CPU used only to run the program logic (events, if-then-else). All 3D calculation would be passed to the GSP, not only to the T&L processor.
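
                  Just to illustrate (this is only a rough sketch of my own in C, not anything from Matrox or NVIDIA), here is more or less the per-vertex work that lands on the CPU when the card has no geometry setup processor:

                  /* Hypothetical sketch: push one vertex through a 4x4 transform matrix.
                     The CPU has to repeat this for every vertex of every triangle when
                     the card cannot do the setup itself. */
                  typedef struct { float x, y, z, w; } Vec4;

                  static Vec4 transform(const float m[4][4], Vec4 v)
                  {
                      Vec4 r;
                      r.x = m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z + m[0][3]*v.w;
                      r.y = m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z + m[1][3]*v.w;
                      r.z = m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z + m[2][3]*v.w;
                      r.w = m[3][0]*v.x + m[3][1]*v.y + m[3][2]*v.z + m[3][3]*v.w;
                      return r;
                  }

                  /* 16 multiplies and 12 adds per vertex; at tens of thousands of vertices
                     per frame and 30+ frames per second, that is millions of floating point
                     operations per second the CPU must supply, regardless of the card's
                     fill rate. */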


                  [This message has been edited by Javert (edited 09-27-1999).]
                  P!!! 450 OC To 495. ASUS P2B V1.1, PC-133 256MB RAM. Adaptec 2940U2W SCSI Controller, IBM 9.1GB(2Mb Cache) LVD SCSI HardDisk X 2. Panasonic 20R,8W CD-R, Sony 5X ATAPI-DVD Rom, 3M LS-120 ATAPI-Floppy, Diamond Monster Sound Mx300 with four speaker, And My Matrox G400MAX



                  • #10
                    Intel will never allow the load to be taken away from the CPU. What would they do if no one bought a new CPU, everyone stuck with their 300MHz PII for the rest of eternity, and the latest games still ran at the same frame rate as on a PIV at 1666MHz?

                    ------------------
                    Cheers,
                    Steve

                    PS: Some or all of the above message may be wrong, or, just as likely, correct. Depends on what mood I'm in. And what you know. ;¬)




                    • #11
                      My point being that SSE and 3DNow do not seem to make huge differences, except in artificial benchmarks and the odd application, which I concede do exist.

                      This leads me to then ask, why the hell can't Matrox do the same??


                      Why can't Matrox do what? Have faster Celeron drivers without resorting to SIMD? I have no idea; it would be nice though, wouldn't it? Perhaps the rumours of a crap triangle setup in hardware are true? Or it could be that all the time NVIDIA have spent working on their drivers has gone into tweaking for the Slot 1 architecture; I know they did nothing for SS7 for around a year, so concentrating on only one platform could be the reason.

                      As for your problems at low resolutions when competing against guys with TNT1s, I don't know; I sympathise with you if this is in fact the case, but how do your numbers compare to other people's numbers with similarly specced hardware?

                      Well, I know it's the case, since I ran several 3DMark99 Max 800x600 benchmarks on my machine with both a TNT1 and an overclocked G400, and did it again with a K6-2 350: same deal. Without trying, the TNT1 got around 3400 with my CPU at 400MHz using a 100MHz bus, which I can only match with the G400 by going up another 50MHz using a 112MHz bus. (There is a significant speed difference between using a 100 and a 112MHz bus on my system, almost 100 points.) It could be 3DNow!, or simply that the TNT has drivers tweaked to use the full-speed L2 cache and AGP bus, and of course cheating on features like trilinear, etc. (Note: with my TNT1 and K6-2 at 392 I always got around 2800-2900 on my system; getting 3000-3100 with a K6-III 400 and a G400 at quasi-max says something large about the drivers or the hardware.) BTW, I also tried combinations of 32-bit Z, vsync off/on, and bilinear/trilinear; none made any real difference.

                      Some Q3Test numbers: http://home.thezone.net/~bm/benchmarks.html



                      • #12
                        Himself

                        I just read "The Firing Squad's" review of the Matrox G-400 Marvel, and from the looks of it, the synthetic benchmarks are not worth a damn, as the results made little sense whatsoever.

                        When it came to the real deal of playing games, it was doing pretty damn well against a TNT2 and a Voodoo 3500; I guess that is what you call the difference between theory and real life.

                        I would not get hung up about it.

                        Here is the link if you are interested
                        http://www.firingsquad.com/hardware/marvelg400/

                        Cheers



                        • #13
                          The other thing that is really confusing me is the difference in performance between the normal OpenGL driver and the TurboGL driver.

                          If Matrox can get such a huge benefit from using SSE and 3D-Now, it makes you wonder what NVidia and 3DFx are up to. It is the difference between night and day.

                          Maybe I was right when I said the TurboGL driver programmer(s) may just happen to be exceptionally gifted.

                          If I am right in my speculation, can the person/persons be moved over to the normal OpenGL driver team? Pretty Please with sugar and a cherry on top!

                          What would also be nice is for somebody who knows what they are talking about with regard to OpenGL programming to speculate for us exactly what Matrox have done to get these huge performance increases by using SSE and 3DNow for OpenGL acceleration.



                          • #14
                            Whisper, think about that:

                            The SSE, 3DNow or even MMX instructions are optimised for multimedia and 3D operations! A driver that uses these extensions and can't benefit from them says a lot against the programmer of that driver, and not against programmers who are able to use their potential!
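
                            For example (just a rough sketch of my own in C with SSE intrinsics, not Matrox's actual TurboGL code), the per-vertex transform a driver has to do maps straight onto these instructions, working on four floats at once instead of one at a time:

                            #include <xmmintrin.h>  /* SSE intrinsics, Pentium III onwards */

                            /* Hypothetical sketch: transform one vertex by a 4x4 matrix stored
                               as four columns (col[0]..col[3]).
                               result = x*col0 + y*col1 + z*col2 + w*col3, four lanes at a time. */
                            static void transform_sse(const __m128 col[4], const float in[4], float out[4])
                            {
                                __m128 x = _mm_set1_ps(in[0]);
                                __m128 y = _mm_set1_ps(in[1]);
                                __m128 z = _mm_set1_ps(in[2]);
                                __m128 w = _mm_set1_ps(in[3]);

                                __m128 r = _mm_add_ps(
                                    _mm_add_ps(_mm_mul_ps(x, col[0]), _mm_mul_ps(y, col[1])),
                                    _mm_add_ps(_mm_mul_ps(z, col[2]), _mm_mul_ps(w, col[3])));

                                _mm_storeu_ps(out, r);  /* write the transformed vertex back */
                            }

                            /* A 3DNow path would do the same thing two floats at a time on a
                               K6-2/K6-III or Athlon. */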

                            Helmchen



                            • #15
                              Greebe you started it with

                              "Please go and read up on this (ie processor technology), once you understand, you shouldn't be whining anymore..."

                              And it makes me wonder how anybody could get anywhere in an engineering field if they can't count: I replied to you only once. Where you get the idea of repeated abuse on my part is beyond me.

                              Moreover, somebody who has to resort to big-noting how much experience they have ("someone that's an EE and has over 20 years in the computer & electronic design business") is usually struggling to prove their point.

                              All this says to me is that you are asking me to believe you just because YOU say you have 20 years in the computer & electronic design business.
                              Why don't you just stick to the facts and argue your case accordingly, instead of trying to impress us all with how much you say you know and all of the experience you say you have? In other words, cut out the crap; you aren't impressing anybody, well, not anybody worth impressing.

                              Also, the crap about reverse engineering and industrial espionage: where the hell that came from, and what it has got to do with the discussion in question, I will never know. Yes, I realise I am leaving myself open to criticism by admitting my ignorance, but I cannot fathom what this has to do with the G-400's lack of OpenGL performance.

                              As for the huge feature set that the Matrox G-400 has over the TNT2, your arguments may have merit, but, as has been pointed out by so many review sites already, the G-400 does DirectX just fine compared to a TNT2, so why can it not do OpenGL?

                              The use of SSE and 3DNow in the G-400 Turbo driver for OpenGL makes me sceptical as to why the G-400 chipset basically has to borrow CPU power to compete with a TNT2 in OpenGL. Is it a fundamental design failing? Or is it a quick fix for a longer-term driver problem? Will we be seeing MiniGL drivers using SSE- and 3DNow-optimised code from NVidia that will make TNT2 cards using the old drivers look second-rate?

                              Who knows; maybe NVidia don't care, as they are probably concentrating on putting all their resources into the GeForce chip.

                              I have already posted my speculations about the TurboGL driver, none of which you seem to have cared to engage me on, Greebe.

                              I may not have 20 years of EE experience, Greebe, but I have enough experience to know what questions to ask, and your 20 years of EE experience certainly is not answering them, not to my satisfaction anyway, nor, I suspect, to that of anybody else who is not a sycophantic yes-man. (Sorry for being sexist, ladies, but no woman I know would believe this crap.)


                              Helmchen: Tell me if I read what you said correctly.

                              If you are saying programmers who cannot take advantage of SSE and 3DNow are bad,

                              and

                              if you are saying programmers who can take advantage of SSE and 3DNow are good,

                              then I have to say I agree with you there 100%, and I think I have said almost as much previously.

                              Please correct me if I misinterpreted your statement.

                              [This message has been edited by Whisper (edited 09-29-1999).]

