Thanks for your quick answer, Alon!
Data2 starts out at the normal size of 64 bytes, but sometimes the
returned value is reused again as a parameter. When I use the
debugger to inspect the sizes of the array buffers,
data.buffer.byteLength, data2.buffer.byteLength and
targetdata.buffer.byteLength all report 64 bytes. But when I check
result.buffer.byteLength, it reports exactly 16777216 bytes.
So this line: var result = new Float32Array(targetHeap.buffer,
targetHeap.byteOffset, 16);
produces a result with the expected value:
[0.000001385850055157789, -1, 7.756424391658356e-10, 0, -1,
-0.000001385850055157789, 0.0000013868001360606286, 0,
-0.0000013868001360606286, -7.775643462437642e-10, -1, 0, 0, 0, 0,
1]
but checking result.buffer.byteLength in the debugger console, the
length is 16777216 bytes, the same as targetHeap.buffer. Now it's
impossible to feed it back into this function (its buffer will be
used in this line: dataHeap2.set( new Uint8Array(data2.buffer) )).
Is it a bug in Float32Array that it doesn't return a buffer with a
byteLength of 64, is it that Uint8Array should automatically
truncate its input, or is the design of my function wrong?
Following the "Uint8Array should automatically truncate input" idea,
I made a change to limit the amount of data going into the
Uint8Array by adding the data2.byteOffset and nDataBytes parameters:
dataHeap2.set( new Uint8Array(data2.buffer, data2.byteOffset,
nDataBytes) );
What's interesting is that the game is now running, but the screen
is white. At least it's not crashing. Something is still very
wrong, but it looks like tons of calculations are happening now,
just wrong ones.
The returned value after checking it with debugger() on first run