HTML Assets getBytes(...) + Clarification about sound formats for each platform


Mickael Barbeaux

Dec 1, 2015, 3:33:39 AM
to PlayN
Hey there,

Two questions for the PlayN maintainers today :)

  • Why is there still no implementation of the getBytes(...) method on the HTML platform? I know some browsers still don't implement TypedArrays or binary XHR (IE11 on my Win7 machine still doesn't), but the most widely used ones do by now (Chrome, Firefox, Safari...). For my project I had to load binary files that are neither images nor sounds, so I wrote an implementation of getBytes(...) based on the getText(...) implementation, changing only a little bit of code. I used ByteBuffer instead of byte[] because I found it handier in my case, but it could easily be adapted to return byte[] instead:

/**
 * {@inheritDoc}
 */
@Override public RFuture<ByteBuffer> getByteBuffer(final String path) {
  final RPromise<ByteBuffer> result = RPromise.create();
  final String fullPath = pathPrefix + path;
  doBinaryXhr(fullPath, result);
  return result;
}

private void doBinaryXhr(final String path, final RPromise<ByteBuffer> result) {
  if (!TypedArrays.isSupported()) {
    throw new UnsupportedOperationException("TypedArrays aren't supported by your browser");
  }

  final XMLHttpRequest xhr = XMLHttpRequest.create();
  xhr.setResponseType(ResponseType.ArrayBuffer);
  xhr.setOnReadyStateChange(new ReadyStateChangeHandler() {
    @Override public void onReadyStateChange(final XMLHttpRequest xhr) {
      final int readyState = xhr.getReadyState();
      if (readyState == XMLHttpRequest.DONE) {
        final int status = xhr.getStatus();
        // status code 0 will be returned for non-http requests, e.g. file://
        if ((status != 0) && ((status < 200) || (status >= 400))) {
          platform.log().error("xhr::onReadyStateChange[" + path + "]" +
              "(readyState = " + readyState + "; status = " + status + ")");
          result.fail(new Exception("Error getting " + path + " : " + xhr.getStatusText()));
        } else {
          if (HtmlAssetsConverter.LOG_XHR_SUCCESS) {
            platform.log().debug("xhr::onReadyStateChange[" + path + "]" +
                "(readyState = " + readyState + "; status = " + status + ")");
          }
          result.succeed(TypedArrayHelper.wrap(xhr.getResponseArrayBuffer()));
        }
      }
    }
  });
  if (HtmlAssetsConverter.LOG_XHR_SUCCESS) {
    platform.log().debug("xhr.open('GET', '" + path + "')...");
  }
  xhr.open("GET", path);
  if (HtmlAssetsConverter.LOG_XHR_SUCCESS) {
    platform.log().debug("xhr.send()...");
  }
  xhr.send();
}

To handle browsers that don't support TypedArrays and binary XHR, I wrote a simple Servlet that transcodes the binary file to Base64 text; the client fetches that via the getText(...) method and then decodes the received Base64 text back into a ByteBuffer. Simple as that.
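For reference, the decode step can be sketched like this (a minimal sketch using java.util.Base64, which is available on the JVM since Java 8; a GWT client would need an emulated or hand-rolled decoder, so treat the class choice as an assumption):

```java
import java.nio.ByteBuffer;
import java.util.Base64;

public class Base64AssetFallback {
    // Decode the Base64 text returned by the servlet back into a ByteBuffer.
    public static ByteBuffer decode(String base64Text) {
        return ByteBuffer.wrap(Base64.getDecoder().decode(base64Text));
    }

    public static void main(String[] args) {
        // Round-trip a small payload to show the idea.
        byte[] original = {0x01, 0x02, 0x03, 0x04};
        String encoded = Base64.getEncoder().encodeToString(original);
        ByteBuffer decoded = decode(encoded);
        System.out.println(decoded.remaining()); // prints 4
    }
}
```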


  • I need some clarification about how to store audio files on the different platforms. I know the cookbook page explains it a little, but it isn't clear to me. In my project sources I store all audio files as uncompressed WAV, and in each platform's build I use the maven-exec-plugin to call the FFMPEG binaries to encode the audio files into the right compression format for that platform.
On the Java platform, I managed to use OGG Vorbis audio files by adding a simple JAR to the maven project (com.googlecode.soundlibs:vorbisspi).
On the HTML platform, although the cookbook page says to use MP3 files, I couldn't make that work on all browsers; some just don't support it. I finally settled on OGG Vorbis there too, as it is supported by at least Chrome and Firefox.
On Android, MP3 files just work fine as described in the cookbook.
On iOS, I'm using MP3 files for the moment and it works as-is. But I gather it would be better to use CAFF files. From what I know (maybe I'm wrong), CAFF is just a container format, like AVI or MKV, so what would be the best audio codec to use on the iOS platform? AAC? AIFC? MP3?

Moreover, on the mobile (and HTML?) platforms, is it efficient enough to use compressed file formats for audio? Doesn't decoding put too much load on the CPU? Maybe it's better to use compressed audio files for music (sounds that can be longer than a few seconds) and uncompressed audio files for sound effects?
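To put rough numbers on that trade-off: 16-bit stereo PCM at 44.1 kHz costs 44100 samples × 2 channels × 2 bytes ≈ 172 KB per second, so a two-minute music track is ~20 MB uncompressed, while a short sound effect stays small either way. A quick sanity check of that arithmetic:

```java
public class PcmSize {
    // Bytes per second of uncompressed PCM audio.
    static long bytesPerSecond(int sampleRate, int channels, int bitsPerSample) {
        return (long) sampleRate * channels * (bitsPerSample / 8);
    }

    public static void main(String[] args) {
        long bps = bytesPerSecond(44100, 2, 16);
        System.out.println(bps);                       // prints 176400 (~172 KB/s)
        System.out.println(bps * 120 / (1024 * 1024)); // two minutes, in MB: prints 20
    }
}
```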

Maybe I'm just overthinking this, but I'd love to hear from someone who has shipped a real commercial game with PlayN whether this is actually a problem or not.


Mickael 

Mickael Barbeaux

Dec 3, 2015, 4:29:34 PM
to PlayN
Something else I noticed...

I can't get a simple CAF file to be read by my application. I converted the audio file using FFMPEG with these options: ffmpeg -i soundfx.wav -f caf -acodec pcm_s16le soundfx.caf, as recommended by the Apple Developer specs (CAF with uncompressed 16-bit little-endian audio), but it keeps throwing an exception inside the CAFLoader class.

It seems that the "data" chunk size read by the CAFLoader class is always -1, so it keeps crashing inside the data.getSubdata(...) method with a BufferOverflow.
The Apple specs say:

  • Audio Data chunk, containing the audio data for the file. If the data chunk’s size isn’t known, it must be the final chunk in the file. If this chunk’s header specifies the size, the chunk can appear anywhere after the Audio Description chunk. See Audio Data Chunk.


So the "data" chunk size may be unspecified, in which case it should be computed from the data offset to the end of the buffer, right? Or maybe it's an FFMPEG bug that always writes -1 as the data chunk size? But I don't think so.
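For what it's worth, the fix I have in mind just follows the spec's wording: when the chunk header carries -1, derive the size from what remains in the buffer. A minimal sketch (the names declaredSize, buffer, and dataOffset are mine, not the actual CAFLoader fields):

```java
import java.nio.ByteBuffer;

public class CafDataSize {
    // Per the CAF spec, a data chunk size of -1 means "extends to the end of
    // the file": fall back to the bytes remaining after the chunk header.
    static int resolveDataSize(long declaredSize, ByteBuffer buffer, int dataOffset) {
        if (declaredSize == -1L) {
            return buffer.limit() - dataOffset;
        }
        return (int) declaredSize;
    }

    public static void main(String[] args) {
        ByteBuffer fake = ByteBuffer.allocate(1000);
        System.out.println(resolveDataSize(-1L, fake, 44));  // prints 956
        System.out.println(resolveDataSize(512L, fake, 44)); // prints 512
    }
}
```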

Mickael Barbeaux

Dec 5, 2015, 5:09:10 AM
to PlayN
I just created a pull request on the official GitHub PlayN repository for the CAFLoader bug ;)



On Tuesday, December 1, 2015 at 9:33:39 AM UTC+1, Mickael Barbeaux wrote:

Michael Bayne

Dec 5, 2015, 12:12:47 PM
to pl...@googlegroups.com
On Thu, Dec 3, 2015 at 1:29 PM, Mickael Barbeaux <mbar...@gmail.com> wrote:
I converted the audio file using FFMPEG with these options: ffmpeg -i soundfx.wav -f caf -acodec pcm_s16le soundfx.caf, as recommended by the Apple Developer specs (CAF with uncompressed 16-bit little-endian audio), but it keeps throwing an exception inside the CAFLoader class.

I would recommend using afconvert to generate CAF (or AIFF):

afconvert -f caff -d LEI16 $IN $OUT

Mickael Barbeaux

Dec 6, 2015, 4:33:31 AM
to PlayN
I know that would be the best way to do it.

The problem is that it was decided internally that all assets would be delivered as a single binary file containing all the data needed to render a scene (images, sounds, localization, etc.).
That's why I needed a method to load a CAF file from a ByteBuffer instead of a file.

When building that big asset file, I need to transcode WAV to CAF without actually writing a file to the hard drive; I just need to take the data produced by the conversion and save it to an OutputStream.
I checked the documentation for afconvert and couldn't find a way to send the converted data to stdout instead of a file.
FFMPEG allows that, which is why we use it here; it's also quite modular and cross-platform, which is why we decided on it.
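For the record, FFMPEG writes to stdout when you give it pipe:1 as the output, so the conversion can be captured entirely in memory. A minimal sketch of that capture (the FFMPEG invocation in the comment mirrors the one from my earlier post; the demo in main uses echo so it runs even without FFMPEG installed):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Arrays;
import java.util.List;

public class PipeTranscode {
    // Run an external command and copy its stdout into the given OutputStream,
    // avoiding a temporary file on disk.
    static void pipeStdout(List<String> command, OutputStream out)
            throws IOException, InterruptedException {
        Process proc = new ProcessBuilder(command).start();
        try (InputStream in = proc.getInputStream()) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) out.write(buf, 0, n);
        }
        if (proc.waitFor() != 0) throw new IOException("command failed");
    }

    public static void main(String[] args) throws Exception {
        // Real use would be something like:
        // pipeStdout(Arrays.asList("ffmpeg", "-i", "soundfx.wav",
        //     "-f", "caf", "-acodec", "pcm_s16le", "pipe:1"), assetStream);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        pipeStdout(Arrays.asList("echo", "hello"), out);
        System.out.println(out.toString().trim()); // prints hello
    }
}
```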