I am trying to use an ArrayBuffer to create a texture with texImage2D, and
process/display it in a fragment shader.
The ArrayBuffer contains values for a single channel, i.e.
numPixels = width * height
size of ArrayBuffer = numPixels
and not numPixels * 3
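(For illustration, the buffer is allocated with one byte per pixel, roughly like this; the fill loop is only a placeholder:)
var numPixels = width * height;
var buffer = new Uint8Array(numPixels);  // one byte per pixel, single channel
for (var i = 0; i < numPixels; i++) {
    // placeholder data: a horizontal gradient, 0-255
    buffer[i] = Math.floor((i % width) * 255 / (width - 1));
}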
I created the texture as-
var lumTexture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, lumTexture);
// no mipmaps, no wrapping, so non-power-of-two sizes stay renderable
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
console.log("float texture support: " +
            gl.getExtension("OES_texture_float"));
// single-channel upload: one byte per pixel
gl.texImage2D(gl.TEXTURE_2D, 0, gl.LUMINANCE, width, height,
              0, gl.LUMINANCE, gl.UNSIGNED_BYTE, buffer);
gl.bindTexture(gl.TEXTURE_2D, null);
Looking at the above created texture in WebGL Inspector, I see a
grayscale image for it.
But when I try to sample the texture and draw to the canvas, the
output is simply black.
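(The draw-time setup isn't shown above; I assume it needs to look roughly like the following, with my own program/uniform names:)
gl.useProgram(program);
// bind the luminance texture to unit 0 and point the sampler uniform at it
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, lumTexture);
gl.uniform1i(gl.getUniformLocation(program, "tex_v"), 0);
gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);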
Here's the test shader-
uniform sampler2D tex_v;
varying vec2 vTexCoord;
void main(void) {
    vec4 colorv = texture2D(tex_v, vTexCoord);
    gl_FragColor = vec4(colorv[0], colorv[0], colorv[0], 1.0);
}
The shader program has a warning in WebGL Inspector:
implicit truncation of vector type
Can someone please shed some light on using single-channel textures
in WebGL, and on what I may be doing incorrectly here?
Thanks,
Seema
I was able to get textures to load correctly when using a buffer of unsigned bytes.
However, the shader gives errors when using a Uint16Array buffer with
the data type gl.SHORT or gl.UNSIGNED_SHORT.
// create buffer
var YBuffer = new Uint16Array(data, offset, size_Y_channel);
// create texture
gl.texImage2D(gl.TEXTURE_2D, 0, gl.LUMINANCE, width, height, 0,
              gl.LUMINANCE, gl.SHORT, YBuffer);
The texture elements panel in WebGL Inspector shows the correct pixel
values, but the format appears as ?? 0x1402 ??
(0x1402 is the enum for the gl.SHORT type.)
The texture fails to draw, with the error "unsupported texture type". I am
running Chrome 17.0.963.56 m.
What is the correct way to create textures from a Uint16Array with values in the range 0-65000?
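(One workaround I am experimenting with is splitting each 16-bit value into two bytes, uploading it as LUMINANCE_ALPHA, and reassembling the value in the shader; this is only a sketch and the names are mine:)
// split each 16-bit sample into a high byte (luminance) and a low byte (alpha)
var packed = new Uint8Array(numPixels * 2);
for (var i = 0; i < numPixels; i++) {
    packed[2 * i]     = YBuffer[i] >> 8;
    packed[2 * i + 1] = YBuffer[i] & 0xFF;
}
gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1);  // rows are tightly packed
gl.texImage2D(gl.TEXTURE_2D, 0, gl.LUMINANCE_ALPHA, width, height, 0,
              gl.LUMINANCE_ALPHA, gl.UNSIGNED_BYTE, packed);
and in the fragment shader:
vec4 la = texture2D(tex_v, vTexCoord);
// reassemble: luminance holds the high byte, alpha the low byte
float value = (la.r * 255.0 * 256.0 + la.a * 255.0) / 65535.0;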
Thanks.
RGBA              UNSIGNED_BYTE
RGB               UNSIGNED_BYTE
RGBA              UNSIGNED_SHORT_4_4_4_4
RGBA              UNSIGNED_SHORT_5_5_5_1
RGB               UNSIGNED_SHORT_5_6_5
LUMINANCE_ALPHA   UNSIGNED_BYTE
LUMINANCE         UNSIGNED_BYTE
ALPHA             UNSIGNED_BYTE

RGBA              FLOAT
RGB               FLOAT
LUMINANCE_ALPHA   FLOAT
LUMINANCE         FLOAT
ALPHA             FLOAT
I use these on iOS devices and I'd love to use them in WebGL.
Thanks -
Sean
Is it possible for you to use OES_texture_float in the interim? If
not, is the memory consumption too high for your application, or is
there some other problem?
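Something along these lines, for example (just a sketch; sizes and names are placeholders):
var ext = gl.getExtension("OES_texture_float");
if (ext) {
    // 32-bit float data: 4 bytes per channel instead of 2 for half float
    var data = new Float32Array(width * height * 4);  // RGBA
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                  gl.RGBA, gl.FLOAT, data);
}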
-Ken