/** {@inheritDoc} */
@Override public RFuture<ByteBuffer> getByteBuffer(final String path) {
  final RPromise<ByteBuffer> result = RPromise.create();
  final String fullPath = pathPrefix + path;
  doBinaryXhr(fullPath, result);
  return result;
}
private void doBinaryXhr(final String path, final RPromise<ByteBuffer> result) {
  if (!TypedArrays.isSupported()) {
    throw new UnsupportedOperationException("TypedArrays aren't supported by your browser");
  }
  final XMLHttpRequest xhr = XMLHttpRequest.create();
  xhr.setResponseType(ResponseType.ArrayBuffer);
  xhr.setOnReadyStateChange(new ReadyStateChangeHandler() {
    @Override public void onReadyStateChange(final XMLHttpRequest xhr) {
      final int readyState = xhr.getReadyState();
      if (readyState == XMLHttpRequest.DONE) {
        final int status = xhr.getStatus();
        // status code 0 will be returned for non-http requests, e.g. file://
        if ((status != 0) && ((status < 200) || (status >= 400))) {
          platform.log().error("xhr::onReadyStateChange[" + path + "]"
              + "(readyState = " + readyState + "; status = " + status + ")");
          result.fail(new Exception("Error getting " + path + " : " + xhr.getStatusText()));
        } else {
          if (HtmlAssetsConverter.LOG_XHR_SUCCESS) {
            platform.log().debug("xhr::onReadyStateChange[" + path + "]"
                + "(readyState = " + readyState + "; status = " + status + ")");
          }
          result.succeed(TypedArrayHelper.wrap(xhr.getResponseArrayBuffer()));
        }
      }
    }
  });
  if (HtmlAssetsConverter.LOG_XHR_SUCCESS) {
    platform.log().debug("xhr.open('GET', '" + path + "')...");
  }
  xhr.open("GET", path);
  if (HtmlAssetsConverter.LOG_XHR_SUCCESS) {
    platform.log().debug("xhr.send()...");
  }
  xhr.send();
}
To handle browsers that do not support TypedArrays and binary XHR, I wrote a simple servlet that transcodes the binary file to BASE64 text; the client fetches that text using the getText(...) method and translates it back to a ByteBuffer by decoding the received BASE64. Simple as that.
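The client side of that fallback can be sketched roughly like this (a minimal illustration, not my actual code: the class name Base64Fallback is made up, and a GWT client would use an emulated decoder rather than java.util.Base64):

```java
import java.nio.ByteBuffer;
import java.util.Base64;

public final class Base64Fallback {
  // Decode the BASE64 text returned by the transcoding servlet back into
  // the ByteBuffer the rest of the asset pipeline expects.
  public static ByteBuffer decode(String base64Text) {
    byte[] bytes = Base64.getDecoder().decode(base64Text);
    return ByteBuffer.wrap(bytes);
  }
}
```

The nice property is that the rest of the code never knows whether the bytes arrived via a binary XHR or via the text fallback.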
On the Java platform, I managed to use OGG Vorbis audio files by adding a single JAR to the Maven project (com.googlecode.soundlibs:vorbisspi).
On the HTML platform, although the cookbook page says to use MP3 files, I couldn't make them work in all browsers; some simply don't support MP3. I ended up using OGG Vorbis there too, since it is supported by at least Chrome and Firefox.
On Android, MP3 files work just fine, as described in the cookbook.
On iOS, I am using MP3 files for the moment and they work as-is, but I understand it would be better to use CAFF files. From what I know (maybe I'm wrong), CAFF is just a container format, like AVI or MKV, so which audio codec is best on iOS? AAC? AIFC? MP3?
Moreover, on the mobile (and HTML?) platforms, is it efficient enough to use compressed audio formats? Doesn't decoding put too much load on the CPU? Maybe it's better to use compressed audio for music (sounds longer than a few seconds) and uncompressed audio for sound effects?
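To put rough numbers on that trade-off, here is a back-of-the-envelope calculation (assuming the common 44.1 kHz, 16-bit, stereo format; actual game assets may differ):

```java
public final class AudioSizeEstimate {
  public static void main(String[] args) {
    // Uncompressed 16-bit stereo PCM at 44.1 kHz:
    int sampleRate = 44_100;      // samples per second
    int bytesPerSample = 2;       // 16-bit
    int channels = 2;             // stereo
    int bytesPerSecond = sampleRate * bytesPerSample * channels;
    System.out.println("PCM bytes/sec: " + bytesPerSecond);   // 176,400 (~172 KB/s)
    // A 3-second sound effect kept uncompressed:
    System.out.println("3s effect: " + (3 * bytesPerSecond)); // 529,200 (~517 KB)
  }
}
```

So a short sound effect costs about half a megabyte uncompressed, which is cheap to keep in memory and free to play, while a multi-minute music track at ~10 MB/minute clearly wants compression; this is the usual reason for the split described above.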
Maybe I'm just overthinking this, but it would help if someone who has shipped a real commercial game with PlayN could say whether this is really a problem or not.
Audio Data chunk, containing the audio data for the file. If the data chunk’s size isn’t known, it must be the final chunk in the file. If this chunk’s header specifies the size, the chunk can appear anywhere after the Audio Description chunk. See Audio Data Chunk.
I converted the audio file using ffmpeg with these options: ffmpeg -i soundfx.wav -f caf -acodec pcm_s16le soundfx.caf, as recommended by the Apple developer documentation (CAF with uncompressed 16-bit little-endian audio), but it keeps throwing an exception inside the CAFLoader class.
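One way to rule out a botched conversion before blaming CAFLoader is to check the file header yourself: per Apple's CAF specification, every CAF file starts with the four ASCII bytes 'caff', followed by a big-endian 16-bit file version (1) and 16-bit flags (0). A minimal sketch (CafHeaderCheck is a hypothetical helper, not part of PlayN):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public final class CafHeaderCheck {
  // Returns true if the first 8 bytes look like a valid CAF file header:
  // magic 'caff', mFileVersion == 1, mFileFlags == 0 (all big-endian).
  public static boolean looksLikeCaf(byte[] header) {
    if (header == null || header.length < 8) return false;
    ByteBuffer buf = ByteBuffer.wrap(header).order(ByteOrder.BIG_ENDIAN);
    return buf.get() == 'c' && buf.get() == 'a'
        && buf.get() == 'f' && buf.get() == 'f'
        && buf.getShort() == 1    // mFileVersion
        && buf.getShort() == 0;   // mFileFlags
  }
}
```

If the header passes this check, the exception is more likely about a chunk inside the file (for example the Audio Data chunk layout quoted above) than about the container itself.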