gl.ReadPixelsRaw(x, y, width, height, format, type, pixels)
The pixels argument must be a memory buffer allocated by Hollywood's AllocMem() function and returned by GetMemPointer(). See Working with pointers for details on how to use memory pointers with Hollywood.
See gl.ReadPixels for a list of supported values for the format parameter.
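As an illustration only, the following sketch shows one way such a buffer could be prepared and passed to gl.ReadPixelsRaw(). The exact call forms AllocMem(id, size), GetMemPointer(id) and FreeMem(id), the #GL_RGBA format and the buffer-size arithmetic are assumptions made for this example, not taken from this page.

    ; Hedged sketch: read a 64x64 RGBA block from the frame buffer into a
    ; raw memory buffer of unsigned bytes (call forms and sizes assumed)
    Local w = 64
    Local h = 64
    AllocMem(1, w * h * 4)        ; assumed 4 bytes per pixel for #GL_RGBA with #GL_UNSIGNED_BYTE
    Local ptr = GetMemPointer(1)  ; raw pointer to the memory block
    gl.ReadPixelsRaw(0, 0, w, h, #GL_RGBA, #GL_UNSIGNED_BYTE, ptr)
    ; ... process the pixel data in the memory block here ...
    FreeMem(1)                    ; release the memory block when finished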
Additionally, gl.ReadPixelsRaw() allows you to define the data type that should be used when reading pixels from the frame buffer. The type argument can assume the following values: #GL_UNSIGNED_BYTE, #GL_BYTE, #GL_BITMAP, #GL_UNSIGNED_SHORT, #GL_SHORT, #GL_UNSIGNED_INT, #GL_INT, or #GL_FLOAT.
gl.ReadPixels() always uses #GL_FLOAT. With gl.ReadPixelsRaw() you can adjust this parameter to your specific needs.
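To make the consequence of the type choice concrete, here is a hedged sketch: the required buffer size depends on both format and type, so reading the same region as #GL_FLOAT needs four times as much memory per component as #GL_UNSIGNED_BYTE. The #GL_RGBA format and the size arithmetic are assumptions for this example.

    ; Hedged sketch: size the buffer to match both format and type
    Local w = 64
    Local h = 64
    AllocMem(1, w * h * 4)        ; #GL_RGBA with #GL_UNSIGNED_BYTE: 4 x 1 byte per pixel (assumed)
    AllocMem(2, w * h * 4 * 4)    ; #GL_RGBA with #GL_FLOAT: 4 components x 4 bytes per pixel (assumed)
    gl.ReadPixelsRaw(0, 0, w, h, #GL_RGBA, #GL_UNSIGNED_BYTE, GetMemPointer(1))
    gl.ReadPixelsRaw(0, 0, w, h, #GL_RGBA, #GL_FLOAT, GetMemPointer(2))
    FreeMem(1)
    FreeMem(2)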
See gl.ReadPixels for details.
Please consult an OpenGL reference manual for more information.
#GL_INVALID_ENUM is generated if format or type is not an accepted value.
#GL_INVALID_VALUE is generated if either width or height is negative.
#GL_INVALID_OPERATION is generated if format is #GL_COLOR_INDEX and the color buffers store RGBA color components.
#GL_INVALID_OPERATION is generated if format is #GL_STENCIL_INDEX and there is no stencil buffer.
#GL_INVALID_OPERATION is generated if format is #GL_DEPTH_COMPONENT and there is no depth buffer.
#GL_INVALID_OPERATION is generated if gl.ReadPixelsRaw() is executed between the execution of gl.Begin() and the corresponding execution of gl.End().
#GL_INDEX_MODE