I keep hearing people say this but I have never seen official documentation about this:
Will RGB textures automatically be stored as RGBA? If so, I would like to make sure I store useful information in all my alpha channels instead of the GPU filling in 1's.
It sounds insane to me that you would take that much more memory when it's not needed. I keep hearing it helps with cache lines, but since you are adding more memory, there is also more memory to cache. (Not a CPU designer, but that is my initial thought.)
If someone replies, is there any documentation of this?
That is hardware dependent, but yes, that is done most of the time.
That is also documented, at least it was for older GPUs.
Please have a look at the “Texture Formats” tables in the NVIDIA GPU Programming Guides (in the older one, at the bottom of this site):
Then there is this texture format table which lists which GPU supported which OpenGL format natively.
Note that only the NV44 GPU can do RGB8 textures natively (because it was a low-end board and needed to save memory, but that costs performance).
I don’t get how having an extra byte of data that needs to be transferred every time I sample a texture is faster. I have heard that it has to do with the cache, but if you have more data, aren’t you going to have more caching problems, since RGBA means jumping around 33% more memory than RGB?
And to be clear then, should I always use the alpha channel for some kind of useful information? Or should I just use RGB5 if I really don’t need an alpha channel?
I forced all my textures from RGB to R5G6B5 and had no performance increase. Interesting.
In the CPU/GPU world there is no native 24-bit data type. So more often than not, you will end up with unaligned memory accesses when using something like an RGB texture. Yes, it may take less space memory-wise, but if you are paying the price for unaligned access, that may outweigh the benefit of the smaller memory footprint.