
Thread: 32-bit vs. 24-bit colour?

  1. #1
    Jon McGarry
    Guest

    Default 32-bit vs. 24-bit colour?

    This may sound like a stupid question to many of you, but it's one that has nagged me ever since this 32-bit rendering argument started.

    I keep hearing over and over that 32-bit rendering means 16.7 million colors, but the math doesn't seem to support that. A few years ago, 24-bit was the standard desktop color depth, and 24-bit also gave 16.7 million colors. If you do the math, raising 2 to the power of 24 gives 16.7 million (2^24 = 16,777,216 colors).

    Now this seems very strange to me, because 32-bit should have a hell of a lot more colors than 24-bit (2^32 = 4,294,967,296 colors)!

    So could someone please explain why 32-bit rendering only supports 16.7 million colors? And if that's the case, why even bother with 32-bit? Why not just use 24-bit? As far as I know, games would look perfectly fine in 24-bit color and would probably run faster.
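
    As a quick sanity check on those powers of two, a minimal C snippet (illustrative only):

        #include <stdio.h>

        int main(void)
        {
            /* 24 bits of colour: 8 bits each for red, green and blue */
            unsigned long long colors24 = 1ULL << 24;  /* 16,777,216 */
            /* 32 bits read naively as pure colour */
            unsigned long long colors32 = 1ULL << 32;  /* 4,294,967,296 */

            printf("2^24 = %llu\n", colors24);
            printf("2^32 = %llu\n", colors32);
            return 0;
        }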

  2. #2
    Super MURCer SteveC's Avatar
    Join Date
    Aug 1999
    Location
    Rayleigh, Essex, UK
    Posts
    3,537

    Default

    Hi Jon,

    The extra 8 bits are used for the alpha channel.
    And on Matrox cards at least, 24 bit is non-accelerated.
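
    In other words, a 32bpp pixel still carries only 24 bits of colour; the spare byte holds the alpha. A minimal sketch of one common ARGB8888 packing in C (illustrative; the exact byte order varies by card and API):

        #include <stdint.h>

        /* Pack 8-bit alpha plus 24 bits of colour into one 32bpp pixel. */
        static uint32_t pack_argb(uint8_t a, uint8_t r, uint8_t g, uint8_t b)
        {
            return ((uint32_t)a << 24) | ((uint32_t)r << 16) |
                   ((uint32_t)g << 8)  |  (uint32_t)b;
        }

        /* The colour part alone never exceeds 2^24 = 16.7 million values. */
        static uint32_t color_bits(uint32_t pixel)
        {
            return pixel & 0x00FFFFFFu;
        }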

    ------------------
    Cheers,
    Steve

    My PC? Not that bad, got all sorts of crap in it, and all sorts of crap around it and my desk is also messy. Now what does that say about me? ;^)

  3. #3
    Jon McGarry
    Guest

    Default

    Okay, I'll accept that answer. So with the Voodoo3's 16-bit rendering there's no alpha channel?

    Also, what's with the 32-bit DESKTOP display depths used on most new video cards? I remember my old Matrox Mystique supported 24-bit only, but my new Fire GL supports 32-bit. I can't see any difference between the two. Image files like .jpg and .bmp only use 16.7 million colors anyway, so in theory 32-bit color wouldn't benefit them.

  4. #4
    The Berserker Jammrock's Avatar
    Join Date
    Aug 1999
    Location
    Right behind you.
    Posts
    8,985

    Default

    The alpha channel is used mainly for transparency and other effects. There are still only 16.7 million true colors. Or at least that's what everybody tells me.

    Jammrock

    ------------------
    PIII 540, 256 MB SDRAM, ASUS P3B-F, Winblows 98 SuckyEdition, 18 GB HDD, 6x DVD w/ decoder, (TEMPORARY!!!) Voodoo 3 2000 which will be replaced by a Matrox G400, SB Live!

  5. #5

    Default

    Now I knew that, but I've always wanted to ask: how exactly are those 8 bits used in practice? I know that alpha means transparency, but what does it have to do with 3D graphics?

    Oh, now I get it! You mean that, for example, clouds are rendered separately and then layered on top of the background using the alpha information... eh?
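
    That is the usual idea: each layer carries a per-pixel alpha value saying how opaque it is, and the renderer blends it over whatever is already in the frame buffer. A minimal sketch of the standard 'over' blend in C, assuming 8-bit channels (illustrative only):

        #include <stdint.h>

        /* Blend one 8-bit source channel over a destination channel:
           out = src * a + dst * (1 - a), kept in integer arithmetic.
           Adding 127 before dividing by 255 rounds to nearest. */
        static uint8_t blend_over(uint8_t src, uint8_t dst, uint8_t alpha)
        {
            return (uint8_t)((src * alpha + dst * (255 - alpha) + 127) / 255);
        }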

    B

  6. #6
    Helmchen2000
    Guest

    Default

    You have mixed up two things:
    1. Desktop resolution with 24/32-bit colour depth, where the difference is only in how the pixels are stored in video memory. In 24-bit mode (packed pixel mode) the 3 bytes of a pixel are stored one after the other. This conflicts with the memory organisation: the controller can only handle 32-bit, 64-bit, 128-bit... chunks of memory, so it has a lot of extra calculating to do. In 32-bit mode the 3 bytes of a pixel are stored together with a 'spare' byte (see the addressing sketch below). Back when video cards had only 1, 2 or 4 MB of memory you could reach a higher resolution in 24-bit mode, but the display worked much more slowly. Today there's no need for a 24-bit mode, because the cards have enough memory to display all resolutions at 32-bit.
    2. 32-bit rendering on 3D cards is something completely different: there a 24-bit colour mode is used so that the other 8 bits can serve as a stencil buffer.
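
    The addressing difference from point 1, as a minimal C sketch (illustrative only; width is in pixels):

        #include <stddef.h>

        /* 24bpp packed: 3 bytes per pixel. The offset needs a multiply
           by 3, and pixels straddle 32/64-bit memory boundaries. */
        static size_t offset_24bpp(size_t x, size_t y, size_t width)
        {
            return (y * width + x) * 3;
        }

        /* 32bpp padded: 4 bytes per pixel. The offset is a cheap shift
           and every pixel sits aligned on a 32-bit word. */
        static size_t offset_32bpp(size_t x, size_t y, size_t width)
        {
            return (y * width + x) << 2;  /* times 4 */
        }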

    Hope this helps

    Helmchen

  7. #7

    Default

    Helmchen2000 got it right... I read somewhere that 32-bit is faster on the desktop than 24-bit because it's a power of two while 24 isn't, so the card needs to do more calculations.
    Dunno about the 3D difference though.


    ------------------
    GigaByte 6BXE, celeron 300A@464, 128 PC100 RAM,
    G200 8 M SD @112.5 core. driver 5.13, bios 1.6.



  8. #8
    Super MURCer Maggi's Avatar
    Join Date
    Jan 2000
    Location
    somewhere lost in the middle of nowhere, Lüneburger Heide, Germany
    Posts
    5,087

    Default

    see my answer over here:

    http://forums.gagames.com/forums/For...ML/002895.html



    ------------------
    Cheerio,
    Maggi

    Asus P2B-S @ 112MHz FSB - Bios 1009 final
    Celeron300A @ 504Mhz
    128MB 7ns SDRAM
    G400 DualHead 32MB SGRAM @ 201 MHz memory clock

  9. #9
    Super MURCer Maggi's Avatar
    Join Date
    Jan 2000
    Location
    somewhere lost in the middle of nowhere, Lüneburger Heide, Germany
    Posts
    5,087

    Default

    Forgot to mention that there is a way to use 32bpp in 2D...

    Use a 24bpp RGB image and add an 8-bit alpha channel for transparency... the additional 8 bits can then be used by Photoshop via the hardware.

    For example:
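
    A minimal sketch of that idea in C (illustrative only; the ARGB packing order and the separate 'mask' input are assumptions):

        #include <stddef.h>
        #include <stdint.h>

        /* Merge a packed 24bpp RGB image with a separate 8-bit alpha
           mask into one 32bpp buffer: 24 bits of colour + 8 bits alpha. */
        static void merge_rgb_alpha(const uint8_t *rgb,  /* 3 bytes/pixel */
                                    const uint8_t *mask, /* 1 byte/pixel  */
                                    uint32_t *out, size_t npixels)
        {
            for (size_t i = 0; i < npixels; i++) {
                uint8_t r = rgb[i * 3 + 0];
                uint8_t g = rgb[i * 3 + 1];
                uint8_t b = rgb[i * 3 + 2];
                out[i] = ((uint32_t)mask[i] << 24) |
                         ((uint32_t)r << 16) |
                         ((uint32_t)g << 8)  |
                          (uint32_t)b;
            }
        }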

    ------------------
    Cheerio,
    Maggi

    Asus P2B-S @ 112MHz FSB - Bios 1009 final
    Celeron300A @ 504Mhz
    128MB 7ns SDRAM
    G400 DualHead 32MB SGRAM @ 201 MHz memory clock
