I lost a few days wondering why some textures were completely distorted when loaded in OpenGL.
The thing is, they were only distorted when the colour components were packed as GL_UNSIGNED_SHORT_5_5_5_1 or GL_UNSIGNED_SHORT_4_4_4_4. When packing the colour components as GL_UNSIGNED_BYTE (RGBA8888), the textures were loaded correctly.
Why?
Since I'm using a small personal Ruby hack to generate raw textures from BMP files with the desired colour packing, I really thought the problem was in the Ruby code. After verifying that the generated 4444 and 5551 textures were the exact counterparts of the working 8888 textures, and tracing the OpenGL glTexImage2D calls to make sure the data were sent correctly, I started to wonder whether glTexImage2D needed a special parameter after all.
Ok, maybe I missed something in the glTexImage2D manual...
Sure did...
width × height texels are read from memory, starting at location data. By default, these texels are taken from adjacent memory locations, except that after all width texels are read, the read pointer is advanced to the next four-byte boundary. The four-byte row alignment is specified by glPixelStorei with argument GL_UNPACK_ALIGNMENT, and it can be set to one, two, four, or eight bytes.
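To see why tightly packed 16-bit rows go wrong, here is a small standalone sketch of the row arithmetic; the width of 5 is just a made-up example, not one of my actual textures.

```c
#include <stdio.h>

int main(void)
{
    int width           = 5;   /* hypothetical texture width */
    int bytes_per_texel = 2;   /* GL_UNSIGNED_SHORT_4_4_4_4 or 5_5_5_1 */
    int alignment       = 4;   /* default GL_UNPACK_ALIGNMENT */

    /* A tightly packed row is 10 bytes, but with the default 4-byte
     * alignment OpenGL advances to the next 4-byte boundary after each
     * row, so it reads 12 bytes per row. Every row is then shifted a
     * little further, which produces the distorted texture. */
    int tight_row  = width * bytes_per_texel;
    int padded_row = (tight_row + alignment - 1) & ~(alignment - 1);

    printf("tightly packed row: %d bytes, row OpenGL expects: %d bytes\n",
           tight_row, padded_row);
    return 0;
}
```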
The solution
Either:
- have textures whose row size is a multiple of 4 bytes (for these 16-bit formats, an even width is enough),
- call glPixelStorei(GL_UNPACK_ALIGNMENT, 2); before calling glTexImage2D.
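For the second option, here is a minimal sketch of what the upload might look like; the function name and the pixels/width/height parameters are placeholders (my actual loader differs), and a texture object is assumed to be bound already.

```c
#include <GL/gl.h>

/* Hypothetical helper: 'pixels' points to tightly packed 5551 texel data,
 * and a texture is assumed to be bound to GL_TEXTURE_2D. */
void upload_5551_texture(const void *pixels, GLsizei width, GLsizei height)
{
    /* Rows of 16-bit texels are only guaranteed to start on 2-byte
     * boundaries, so drop the default 4-byte unpack alignment. */
    glPixelStorei(GL_UNPACK_ALIGNMENT, 2);

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_SHORT_5_5_5_1, pixels);
}
```

Setting the alignment to 1 also works and is the most general choice; 2 simply matches the natural alignment of 16-bit texels.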
RTFM, as they always say!