gl.TexImage2D(level, internalformat, w, h, border, format, type, pixels)
Texturing maps a portion of a specified texture image onto each graphical primitive for which texturing is enabled. To enable or disable two-dimensional texturing, call gl.Enable() or gl.Disable() with argument #GL_TEXTURE_2D.
To define texture images, call gl.TexImage2D(). The arguments describe the parameters of the texture image, such as height, width, width of the border, level-of-detail number (See gl.TexParameter for details.), and number of color components provided. The last three arguments describe how the image is represented in memory; they are identical to the pixel formats used for gl.DrawPixels().
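As an illustration, here is a minimal sketch of such a call; it assumes that pixels already points to a block of 64*64*4 bytes of RGBA image data obtained as described under Working with pointers:

; Upload a 64x64 RGBA image as mipmap level 0 with no border.
; "pixels" is assumed to point to 64*64*4 bytes of image data.
gl.TexImage2D(0, #GL_RGBA, 64, 64, 0, #GL_RGBA, #GL_UNSIGNED_BYTE, pixels)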
Data is read from pixels as a sequence of signed or unsigned bytes, shorts, or longs, or single-precision floating-point values, depending on type, which can be #GL_UNSIGNED_BYTE, #GL_BYTE, #GL_BITMAP, #GL_UNSIGNED_SHORT, #GL_SHORT, #GL_UNSIGNED_INT, #GL_INT, or #GL_FLOAT. These values are grouped into sets of one, two, three, or four values, depending on format, to form elements. If type is #GL_BITMAP, the data is considered as a string of unsigned bytes (and format must be #GL_COLOR_INDEX). Each data byte is treated as eight 1-bit elements, with bit ordering determined by #GL_UNPACK_LSB_FIRST (See gl.PixelStore for details.).
The first element corresponds to the lower left corner of the texture image. Subsequent elements progress left-to-right through the remaining texels in the lowest row of the texture image, and then in successively higher rows of the texture image. The final element corresponds to the upper right corner of the texture image.
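In other words, for a tightly packed #GL_RGBA/#GL_UNSIGNED_BYTE image this ordering means the byte offset of the texel at column x and row y can be computed as in the following sketch (the variable names are illustrative only):

; Byte offset of texel (x, y) in a tightly packed RGBA byte image;
; row 0 is the bottom row of the texture image
offset = (y * width + x) * 4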
format determines the composition of each element in pixels. It can assume one of the following symbolic values. (In the constant names #GL_c_SCALE and #GL_c_BIAS below, c stands for the component being modified, e.g. #GL_RED_SCALE and #GL_RED_BIAS.)
#GL_COLOR_INDEX: Each element is a single value, a color index. It is converted to fixed point (with an unspecified number of bits to the right of the binary point), shifted left or right depending on the value and sign of #GL_INDEX_SHIFT, and added to #GL_INDEX_OFFSET (See gl.PixelTransfer for details.). The resulting index is converted to a set of color components using the #GL_PIXEL_MAP_I_TO_R, #GL_PIXEL_MAP_I_TO_G, #GL_PIXEL_MAP_I_TO_B, and #GL_PIXEL_MAP_I_TO_A tables, and clamped to the range [0, 1].

#GL_RED: Each element is a single red component. It is converted to floating point and assembled into an RGBA element by attaching 0 for green and blue, and 1 for alpha. Each component is then multiplied by the signed scale factor #GL_c_SCALE, added to the signed bias #GL_c_BIAS, and clamped to the range [0, 1] (See gl.PixelTransfer for details.).

#GL_GREEN: Each element is a single green component. It is converted to floating point and assembled into an RGBA element by attaching 0 for red and blue, and 1 for alpha. Each component is then multiplied by the signed scale factor #GL_c_SCALE, added to the signed bias #GL_c_BIAS, and clamped to the range [0, 1] (See gl.PixelTransfer for details.).

#GL_BLUE: Each element is a single blue component. It is converted to floating point and assembled into an RGBA element by attaching 0 for red and green, and 1 for alpha. Each component is then multiplied by the signed scale factor #GL_c_SCALE, added to the signed bias #GL_c_BIAS, and clamped to the range [0, 1] (See gl.PixelTransfer for details.).

#GL_ALPHA: Each element is a single alpha component. It is converted to floating point and assembled into an RGBA element by attaching 0 for red, green, and blue. Each component is then multiplied by the signed scale factor #GL_c_SCALE, added to the signed bias #GL_c_BIAS, and clamped to the range [0, 1] (See gl.PixelTransfer for details.).

#GL_RGB: Each element is an RGB triple. It is converted to floating point and assembled into an RGBA element by attaching 1 for alpha. Each component is then multiplied by the signed scale factor #GL_c_SCALE, added to the signed bias #GL_c_BIAS, and clamped to the range [0, 1] (See gl.PixelTransfer for details.).

#GL_RGBA: Each element contains all four components; they are converted to floating point. Each component is then multiplied by the signed scale factor #GL_c_SCALE, added to the signed bias #GL_c_BIAS, and clamped to the range [0, 1] (See gl.PixelTransfer for details.).

#GL_LUMINANCE: Each element is a single luminance value. It is converted to floating point, then assembled into an RGBA element by replicating the luminance value three times for red, green, and blue, and attaching 1 for alpha. Each component is then multiplied by the signed scale factor #GL_c_SCALE, added to the signed bias #GL_c_BIAS, and clamped to the range [0, 1] (See gl.PixelTransfer for details.).

#GL_LUMINANCE_ALPHA: Each element is a luminance/alpha pair. It is converted to floating point, then assembled into an RGBA element by replicating the luminance value three times for red, green, and blue. Each component is then multiplied by the signed scale factor #GL_c_SCALE, added to the signed bias #GL_c_BIAS, and clamped to the range [0, 1] (See gl.PixelTransfer for details.).

#GL_DEPTH_COMPONENT: Each element is a single depth value. It is converted to floating point, multiplied by the signed scale factor #GL_DEPTH_SCALE, added to the signed bias #GL_DEPTH_BIAS, and clamped to the range [0, 1] (See gl.PixelTransfer for details.).
If an application wants to store the texture at a certain resolution or in a certain format, it can request the resolution and format with internalformat. internalformat specifies the internal format of the texture array. See Internal pixel formats for details. The GL will choose an internal representation that closely approximates that requested by internalformat, but it may not match exactly.
(The representations specified by #GL_LUMINANCE, #GL_LUMINANCE_ALPHA, #GL_RGB, and #GL_RGBA must match exactly. The numeric values 1, 2, 3, and 4 may also be used to specify the above representations.)
A one-component texture image uses only the red component of the RGBA color extracted from pixels. A two-component image uses the R and A values. A three-component image uses the R, G, and B values. A four-component image uses all of the RGBA components.
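As a sketch of the equivalence just described, the following two calls request the same four-component representation, once by numeric value and once by symbolic constant:

; Four components requested numerically ...
gl.TexImage2D(0, 4, 64, 64, 0, #GL_RGBA, #GL_UNSIGNED_BYTE, pixels)
; ... and the same request using the equivalent symbolic constant
gl.TexImage2D(0, #GL_RGBA, 64, 64, 0, #GL_RGBA, #GL_UNSIGNED_BYTE, pixels)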
Texturing has no effect in color index mode.
The texture image can be represented by the same data formats as the pixels in a gl.DrawPixels() command, except that #GL_STENCIL_INDEX and #GL_DEPTH_COMPONENT cannot be used. gl.PixelStore() and gl.PixelTransfer() modes affect texture images in exactly the way they affect gl.DrawPixels().
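For instance, an unpack setting such as the row alignment applies to a texture upload exactly as it would to gl.DrawPixels(). A sketch, assuming the standard #GL_UNPACK_ALIGNMENT constant and an RGB image whose rows are tightly packed:

; Rows in "pixels" start on byte boundaries instead of the default
; 4-byte boundaries, so relax the unpack alignment before uploading
gl.PixelStore(#GL_UNPACK_ALIGNMENT, 1)
gl.TexImage2D(0, #GL_RGB, width, height, 0, #GL_RGB, #GL_UNSIGNED_BYTE, pixels)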
Please note that this command operates directly on memory pointers. There is also a version which works with tables instead of memory pointers, but it is slower, of course. See gl.TexImage for details. See Working with pointers for details on how to use memory pointers with Hollywood.
Please consult an OpenGL reference manual for more information.
#GL_INVALID_ENUM is generated if format is not an accepted format constant. Format constants other than #GL_STENCIL_INDEX are accepted.
#GL_INVALID_ENUM is generated if type is not a type constant.
#GL_INVALID_ENUM is generated if type is #GL_BITMAP and format is not #GL_COLOR_INDEX.
#GL_INVALID_VALUE is generated if level is less than 0.
#GL_INVALID_VALUE may be generated if level is greater than log2(max), where max is the returned value of #GL_MAX_TEXTURE_SIZE.
#GL_INVALID_VALUE is generated if internalformat is not 1, 2, 3, 4, or one of the accepted resolution and format symbolic constants.
#GL_INVALID_VALUE is generated if width or height is less than 0 or greater than 2 + #GL_MAX_TEXTURE_SIZE, or if either cannot be represented as 2^k + 2*border for some integer value of k (see the sketch after this list).
#GL_INVALID_VALUE is generated if border is not 0 or 1.
#GL_INVALID_OPERATION is generated if gl.TexImage2D() is executed between the execution of gl.Begin() and the corresponding execution of gl.End().
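The width/height requirement above can be checked in plain Hollywood code. A minimal sketch; the function name p_IsValidTexSize is illustrative only and not part of the library:

; Returns True if size can be written as 2^k + 2*border for some
; non-negative integer k, as gl.TexImage2D() requires
Function p_IsValidTexSize(size, border)
    Local k = 1
    While k < size - 2 * border
        k = k * 2
    Wend
    If k = size - 2 * border
        Return(True)
    Else
        Return(False)
    EndIf
EndFunction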
To check whether two-dimensional texturing is currently enabled, call gl.IsEnabled() with argument #GL_TEXTURE_2D.