New Matrox drivers have texture compression

• New Matrox drivers have texture compression

  Proof?

  I'm simply shocked that I haven't yet heard all the 'eagle eye' Matrox users moaning about this.

  Not that I'm complaining, but I'd like the option to TURN IT OFF...

  [This message has been edited by Rob M. (edited 08 November 2000).]

• #2

  Did it give you a speed boost?

  I'm a bit disappointed it looks the same as the GeForce stuff.

  Quite impressed that they got it working at all, though. I wonder if it's hardware compression, or a software hack to get a little more speed out of the ICD.

  Ali

• #3

  Yeah, I'm sure that's where all these 'major' speed increases are coming from. With the Q3 texture detail slider set to max, my framerate jumped 10 fps. With the slider set down one, the new drivers are about the same.

• #4

  So, does the texture compression give you the high-detail texture maps in UT & Q3 that the GeForce can run?

  Matrox must be working on texture compression for their G800, and they just carried it over to the G4XX. It would be cool if it allows you to use the 2nd CD in UT. I might just have to try that tonight.

  BTW, the 5.20 drivers seem to be the best (for me) since 5.04.

• #5

  I don't play Q3. Is it just the sky that's messed up with the GeForce & the new G400 drivers, or do other objects look goofy too?

• #6

  If texture compression works in OpenGL, then Matrox must have licensed S3TC. I think it's free to use under DirectX, because M$ owns it there.

  It would make sense. If Matrox paid S3 lots of money to use texture compression under OpenGL for the G800, then why not stick it onto the G400 if you have the ability?

  This suggests that the G800 is getting close. They would have put off paying S3 for as long as possible so they had the use of that money (even if it was just sitting in the bank earning interest), then coughed up the dosh when the cards were ready to go out.

  Only a theory, but it makes sense to me.

  Also, from that screenshot it looks like the G400 doesn't handle texture compression quite right, so they might also just be playing around to make sure they get it right with the G800.

  Ali

• #7

  Why do you think that sky is texture-compressed? How can you see it? What does this picture show you? Is it a nice sky, or a highly detailed sky, or what?

• #8

  I think DirectX has its own texture compression now, called DXTC. I think it's based on S3TC. Anyway, Matrox wouldn't have to pay anyone.

• #9

  You can tell the sky is compressed because it's all grainy. It looks the same as a GeForce card.

  The sky texture in Q3 is a 128x128 texture, and it is transparent; apparently this means it doesn't like being compressed.

  All the other textures in Q3 are 256x256 or 512x512, and they compress better.

  The Radeon ignores textures below 256x256, so the sky looks good on them.
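
  If it helps to see where the grain comes from, here's roughly what an S3TC (DXT1) block looks like. The struct and names are just my own sketch of the published layout, not anything from the drivers:

  ```c
  #include <stdint.h>

  /* Each 4x4 pixel block is packed into 8 bytes: two RGB565 endpoint
     colors plus a 2-bit index per pixel choosing one of 4 palette
     entries (the two endpoints and two interpolated between them; if
     color0 <= color1 it switches to 3 colors + transparent, which is
     the 1-bit alpha mode a transparent sky texture would hit). */
  typedef struct {
      uint16_t color0;   /* RGB565 endpoint A */
      uint16_t color1;   /* RGB565 endpoint B */
      uint32_t indices;  /* 16 pixels x 2 bits each */
  } dxt1_block;

  /* 4x4 pixels at 32-bit RGBA = 64 bytes squeezed to 8 (8:1). A smooth
     sky gradient gets quantized to at most 4 colors per block, which is
     exactly the banding/grain in the screenshot. */
  ```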

  Also, DirectX DXTC is exactly the same algorithm as S3TC, because Microsoft licensed it from S3. This means anybody can use it when writing drivers for DirectX (and making the hardware).

  Under OpenGL you have to pay S3 for the privilege of using it.

  3dfx made their own compression algorithms (actually four of them, I think) and have made them open source. This means anybody can use them for free. For some reason nobody seems too interested in them.

  Ali

• #10

  In which drivers did you find texture compression? With the PD5.20 drivers (Win2k), I get "GL_s3_s3tc not found", which implies that these drivers do NOT support S3TC. Don't the Win9x and Win2k drivers use the same OpenGL implementation?
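
  For what it's worth, you can check what the ICD reports yourself. A minimal sketch (it assumes a GL context is already current; the extension names are the standard S3TC ones):

  ```c
  #include <stdio.h>
  #include <string.h>
  #include <GL/gl.h>

  /* Returns nonzero if `name` appears in the driver's extension string.
     (A plain strstr can false-positive on prefixes; good enough here.) */
  static int has_extension(const char *name)
  {
      const char *ext = (const char *)glGetString(GL_EXTENSIONS);
      return ext != NULL && strstr(ext, name) != NULL;
  }

  void report_s3tc(void)
  {
      /* The old S3 extension and the newer EXT name; a driver might
         expose either one (or both). */
      printf("GL_S3_s3tc: %s\n",
             has_extension("GL_S3_s3tc") ? "yes" : "no");
      printf("GL_EXT_texture_compression_s3tc: %s\n",
             has_extension("GL_EXT_texture_compression_s3tc") ? "yes" : "no");
  }
  ```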

• #11

  The sky ALWAYS looks grainy at 16-bit textures and color! I can't seem to tell any difference, just in speed.

  I moved up to 1024x768 32-bit and the game varies between 20-60 fps, though the mouse movement is jerky and out of sync with the image. Can someone tell me if there's a workaround for this?

• #12

  To be honest, I haven't even tried these new drivers myself.

  I'll do some Q3 benchmarks tonight before and after updating the drivers, with the texture detail slider in different positions, and see what happens.

  Something fun to do tonight.

  Ali

• #13

  1. The sky looks exactly as it does with GeForce S3TC turned on.

  2. That pic was taken with the 5.20 Win2k drivers at 32-bit colour with the texture detail slider set to max. I dunno what the story is for the new Win9x drivers; I don't have that OS installed.

  3. There is a speed increase that can only be caused by texture compression. I made a Q3 level that specifically tested AGP speed (33 MB of textures): on the old driver set I got 2.9 fps, with the new drivers I get 163 fps. Rough numbers on why, below.
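
  The numbers line up with the textures simply not fitting on the card before. A back-of-the-envelope sketch (the 16 MB of free local memory and the 8:1 DXT1 ratio are my assumptions, not measured):

  ```c
  #include <stdio.h>

  int main(void)
  {
      double textures_mb  = 33.0;  /* the test level's texture set */
      double local_mem_mb = 16.0;  /* assumed free G400 memory after frame/z buffers */
      double dxt1_ratio   = 8.0;   /* 32-bit RGBA -> 4 bits/pixel under DXT1 */

      double compressed_mb = textures_mb / dxt1_ratio;  /* ~4.1 MB */

      printf("Uncompressed: %4.1f MB -> %s\n", textures_mb,
             textures_mb > local_mem_mb ? "spills over AGP every frame" : "fits on card");
      printf("DXT1:         %4.1f MB -> %s\n", compressed_mb,
             compressed_mb > local_mem_mb ? "spills over AGP every frame" : "fits on card");
      return 0;
  }
  ```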

• #14

  Testing my own system using Q3 test (v1.11?), I did not get the pixelated clouds your screenshot shows. But after upgrading to the PD6.10 driver package (doing a near-complete driver removal and overwrite of registry settings), I can tell you I re-enabled the 32-bit Z-buffer in Win98's Advanced Display property sheet. Did you do the same?

  This brings back to mind the 32-bit color of ATi's Rage Pro/128, where they had claimed 32-bit all the way, but in reality had a 16-bit Z-buffer. Remember those screenshots? (I know, I know... they go way back, in this very forum, I believe.)

  I am more inclined to think this, because the "Driver Info" page in system settings did not mention any texture compression. Why would a game implement a feature that it didn't detect during OpenGL driver initialization?

  BTW, Quake3 test crashed hard when running the timedemo benchmark on my system.

  ------------------
  Abit BH6 r1.1
  Celeron2-566 o/c to 850MHz+Slotket!!! 1.8v
  256 Megs PC-133 Cas3
  Matrox G400 SH OEM (not oc'd) rev=03h
  Diamond Monster Sound MX400
  ECS K7S5A Pro, Athlon XP 2100+, 512 Megs PC-3200 CAS2.5, HIS Radeon 9550/VIVO 256Meg DDR

  Asus A7N8X-E Deluxe C Mobile Athlon 2500+ @ 2.2GHz, 1GB PC-3200 CAS2.5, Hauppauge MCE 150, Nvidia 6600 256DDR

  Asus A8R32 MVP, Sempron 1600+ @ 2.23GHz, 1 Gig DDR2 RAM, ATI 1900GT

• #15

  In 32-bit I don't get the pixelated clouds either.

  But you do indeed need to enable "use 32 bit z-buffer" in the PD properties, even though its description says it "enables the use of a 32-bit z-buffer for Direct3D games", which is _not_ true (it will enable the 32-bit z-buffer for OpenGL too).
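
  For the curious, this is roughly how an OpenGL game asks Windows for depth precision. The driver decides what you actually get, and the PD toggle apparently gates whether 32-bit is on offer; this is only a sketch, and the fallback behaviour is my assumption:

  ```c
  #include <windows.h>

  HGLRC create_gl_context(HDC hdc)
  {
      PIXELFORMATDESCRIPTOR pfd = {0};
      pfd.nSize      = sizeof(pfd);
      pfd.nVersion   = 1;
      pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
      pfd.iPixelType = PFD_TYPE_RGBA;
      pfd.cColorBits = 32;
      pfd.cDepthBits = 32;   /* a request, not a guarantee */

      int fmt = ChoosePixelFormat(hdc, &pfd);  /* closest format the driver offers */
      SetPixelFormat(hdc, fmt, &pfd);

      /* Read back what was actually granted. */
      DescribePixelFormat(hdc, fmt, sizeof(pfd), &pfd);
      /* pfd.cDepthBits now shows the real z-buffer depth (32, 24, or 16). */

      return wglCreateContext(hdc);
  }
  ```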
