What do you think about DXTC and G450??

  • #1

    Do you think the G450 will support the standard DirectX-based texture compression (DXTC) algorithms? Or do you think DXTC will have to wait for the G800 later on?

    Please give me your ideas!

  • #2
    I've heard that texture compression just rocks. If the G450 doesn't have it... the G800 sure as hell had better have it!

    Just my opinion.

    ------------------
    Abit BH6
    Celeron 550
    Matrox G400 32mb "MAX"
    256MB PC100 RAM
    IBM 10GB 7200rpm HDD
    Creative Labs DVD 5x
    Mitsumi 4x/2x/8x CD-RW
    Monster Sound MX300
    USR 56K Modem
    Creative Labs DTT 2500
    ADi 6P (19" Monitor)
    Win98 SE




    • #3
      The detail on the S3TC textures in Unreal Tournament is amazing (based on screenshots I've seen). If more games adopt DXTC, we'd be in for some sweet eye candy.

      The Rock
      Bart



      • #4
        I think we don't need such a thing as texture compression.

        The G400 is capable of 2048x2048 as max texture dimensions, which is more than enough to fulfill all needs for detail.
        Furthermore, there is its excellent AGP DiME capability, and the newer cards even have AGP 4x texture transfer.

        The reasons above make support for S3TC obsolete, imho. What we'd need is a converter to unpack those textures manually ... or could it be that the textures are stored uncompressed and will only be compressed on demand to reduce the transfer times?

        Anybody remember those special Q2 levels that S3 designed to show off their compression cards?
        Well, those levels ran amazingly fluidly on my G400 simply by not using the bundled OpenGL32.dll that came with those maps.

        I guess I have to do some tests and figure out if it's possible to use the S3TC textures that came on the 2nd disk of UT ...

        ------------------
        Despite my nickname causing confusion, I am not female ...

        ASRock Fatal1ty X79 Professional
        Intel Core i7-3930K@4.3GHz
        be quiet! Dark Rock Pro 2
        4x 8GB G.Skill TridentX PC3-19200U@CR1
        2x MSI N670GTX PE OC (SLI)
        OCZ Vertex 4 256GB
        4x2TB Seagate Barracuda Green 5900.3 (2x4TB RAID0)
        Super Flower Golden Green Modular 800W
        Nanoxia Deep Silence 1
        LG BH10LS38
        LG DM2752D 27" 3D



        • #5
          AFAIK, Unreal has a different executable for use with S3TC. Anybody wanna correct me, since my memory of this may be shaky?

          - Gurm

          ------------------
          Listen up, you primitive screwheads! See this? This is my BOOMSTICK! Etc. etc.
          The Internet - where men are men, women are men, and teenage girls are FBI agents!

          I'm the least you could do
          If only life were as easy as you
          I'm the least you could do, oh yeah
          If only life were as easy as you
          I would still get screwed



          • #6
            2048x2048x32 = 16MB for one texture, which kinda makes the case for texture compression that can lower that to 4MB. Not that texture compression has to mean huge textures; it just allows more of them, or gives you more fillrate with the regular amount.

            AGP is never as good as local video mem storage of textures and never will be, and it's not just local vs. AGP mem: bandwidth consumption from local mem to the chip is much reduced as well. Texture compression is not only a good feature, it is inevitable and necessary. It's a non-issue; all cards will have it eventually, and the next card I buy will have it. It's not hard to do, since 3dfx has it, S3 has it, and NVIDIA has it.
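
            The arithmetic above can be sketched in a few lines of Python. This is just an illustration, assuming 32-bit uncompressed texels and an 8-bits-per-texel DXT3/DXT5-style scheme for the 4:1 figure (DXT1 halves that again):

```python
# Texture memory for a square texture, uncompressed vs. compressed.
# Assumes 32-bit RGBA texels; DXT3/DXT5 store 8 bits per texel (4:1),
# DXT1 stores 4 bits per texel (8:1 vs. 32-bit).
def texture_bytes(width, height, bits_per_texel):
    return width * height * bits_per_texel // 8

uncompressed = texture_bytes(2048, 2048, 32)  # 16 MiB
dxt5 = texture_bytes(2048, 2048, 8)           # 4 MiB
dxt1 = texture_bytes(2048, 2048, 4)           # 2 MiB

print(uncompressed // 2**20, dxt5 // 2**20, dxt1 // 2**20)  # 16 4 2
```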



            • #7
              Wow I just love it when professionals answer me at these forums.

              Hey Maggi, after you get .S3T based textures to run through the standard texturing path at 2048x2048 texture size, let me know.

              Better yet, if you write a program that provides a special set of commands/instructions to run it, please send it to me!! Pleaz!!



              • #8
                Himself is right... Your vidcard can have 128 megs of RAM and do textures as big as 4096x4096, but if you compress them you'll eventually fit 4, 5 or even 6 times more of them, which means cooler graphics with amazing details... TC rocks and it's a must for every new board that comes out. Or imagine a cheap version of the G400MAX with 8 megs of RAM and texture compression performing as well as the normal one that is much more expensive due to the 32 megs of hi-speed RAM on board...

                Just my SQRT0.0001$
                _____________________________
                BOINC stats



                • #9
                  Maybe I'm missing something here? How would the avg gamer with a 17" monitor even be able to see those textures? I have a 21" and know it's not feasible to see that kind of detail... let alone the lack of games that have textures that size, and the CPU power I don't even have to run them if they were out!
                  Let's get real here, people, and wake up to the reality of what's practical.
                  "Be who you are and say what you feel, because those who mind don't matter, and those who matter don't mind." -- Dr. Seuss

                  "Always do good. It will gratify some and astonish the rest." ~Mark Twain



                  • #10
                    Hi Guys,

                    I manually copied the textures from CD 2 into the textures folder of UT and, when prompted, overwrote the existing ones.

                    Now I have the biggest texture at ~60MB, but I have no clue which map uses it ... d'oh!

                    Anyway, no probs so far ...

                    Does anybody have an idea about which textures belong to which map ???

                    ------------------
                    Cheers,
                    Maggi

                    Despite my nickname causing confusion, I am not female ...

                    ASRock Fatal1ty X79 Professional
                    Intel Core i7-3930K@4.3GHz
                    be quiet! Dark Rock Pro 2
                    4x 8GB G.Skill TridentX PC3-19200U@CR1
                    2x MSI N670GTX PE OC (SLI)
                    OCZ Vertex 4 256GB
                    4x2TB Seagate Barracuda Green 5900.3 (2x4TB RAID0)
                    Super Flower Golden Green Modular 800W
                    Nanoxia Deep Silence 1
                    LG BH10LS38
                    LG DM2752D 27" 3D



                    • #11
                      Greebe - Say you're playing at 1024x768. Imagine you have a 2048x2048 texture on a wall. When you are far away, you can't tell the difference between it and a 256x256 texture; that's why you have mip-mapping. When you get close and the wall (or whatever) more than fills the screen, you only see part of that texture on your 1024-wide display. Say you see half of the texture. That's 1024 texels across your 1024 pixels, so now you can make out all the detail, whereas a 256 texture would be blocky as hell.

                      Extreme example I know, but it makes the point.

                      Maggi - we use textures that run into hundreds of MB at work, and my G400MAX can't handle much above 100MB before it starts getting really slow. Let us know how you get on.
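
                      The screen math above can be sketched in a couple of lines. A rough Python illustration (the function name and numbers are for this example only, not from the thread):

```python
# On-screen texel density for a wall texture: if a fraction
# `visible_fraction` of the texture's width spans the full screen width,
# density = texture_size * visible_fraction / screen_width.
def texels_per_pixel(texture_size, visible_fraction, screen_width):
    return texture_size * visible_fraction / screen_width

# Half of a 2048 texture across a 1024-wide display: 1 texel per pixel.
print(texels_per_pixel(2048, 0.5, 1024))  # 1.0

# The same view with a 256 texture: one texel smeared over 8 pixels.
print(texels_per_pixel(256, 0.5, 1024))   # 0.125
```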



                      • #13
                        Yeap, you are right, Raptor^.

                        Mip-mapping is generally a process where the same texture is rendered at lower resolutions for scenes at further distances, mainly to minimize overall memory bandwidth. But hey, what does that have to do with Maggi's idea of going from the default 256x256 texture size to a customized OpenGL32.dll that runs UT with 2048x2048 (highest available) S3TC textures? I know this seems impossible logically, but hey, programmers/engineers get real creative with software development.



                        • #14
                          I'd rather have 256 128x128 textures, 64 256x256 textures, 16 512x512 textures, or 4 1024x1024 textures than one 2048x2048 texture; 2048x2048 is more along the lines of infinite size in terms of hardware capability and practical use. It's the total amount of memory taken up that matters for performance, not the size of individual textures. For perspective, 128x128 is a large texture in today's games.

                          You could fill up a screen with a 512x512 texture and not have very noticeable blockiness; it's not far off the 720x480 resolution used for DVDs, even. Texture compression is along the lines of bump mapping and T&L for achieving the goal of better-looking games and more gfx card performance; they are all equally important right now.

                          I agree that the G400 has a nice AGP implementation and that its texture handling is top notch; it could be even better with texture compression. Given a choice, I would rather see 3dfx's open version of texture compression implemented in future products.

                          [Mipmaps are progressively reduced, scaled textures generated from a source texture; they add performance when texture mapping distant objects by reducing the fetches for texels. The whole process brings into play things like trilinear filtering to avoid mipmap stepping artifacts, etc.; it's a basic part of gfx card technology today. It's unrelated to texture compression: with or without it, you'll still have mipmaps, which of course can also be in compressed format. If necessary, mipmaps can get very tiny.]
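
                          The mipmap note above can be checked with a short sketch: each level halves both dimensions down to 1x1, so the full chain costs about a third more memory than the base texture alone. A rough Python illustration (names are for this example only):

```python
# Texel counts for a square texture's mipmap chain: 2048, 1024, ..., 1.
def mip_chain_texels(size):
    levels = []
    while size >= 1:
        levels.append(size * size)
        size //= 2
    return levels

chain = mip_chain_texels(2048)
print(len(chain))                       # 12 levels
print(round(sum(chain) / chain[0], 3))  # 1.333 -> about 1/3 extra memory
```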



                          • #15
                            Good info, Himself. I agree with you that the next card should use texture compression, as well as all the goodies that Matrox has become known for. That interview someone posted a link to a while back seems to indicate that Matrox is planning to have it too: the rep who was interviewed said it was one of the features that the next generation of graphics cards (anybody's) should include.

                            ------------------
                            Ace
                            "..so much for subtlety.."

                            System specs:
                            Gainward Ti4600
                            AMD Athlon XP2100+ (o.c. to 1845MHz)

