Cloud Composer app updated and on Github


Greg

Aug 8, 2011, 12:53:56 PM
to CHI-HTML5-Hackathon
Hi Guys,

I had a lot of fun developing the cloud composer app at the hackathon
last week. Since we didn't get the app finished at the hackathon I
decided to spend a couple more days developing it and uploaded it to
Github here: https://github.com/GregJ/HTML5-Cloud-Composer

You can check out the demo here: http://gregj.github.com/HTML5-Cloud-Composer/

and I also wrote a blog post about the app:
http://www.gregjopa.com/2011/08/html5-cloud-composer-app/

-Greg

Raphael

Aug 8, 2011, 12:57:45 PM
to chi-html5...@googlegroups.com
Nicely done!

Chris Sutton

Aug 8, 2011, 10:35:23 PM
to chi-html5...@googlegroups.com
It looks great

Agnel Antony

Aug 9, 2011, 12:00:09 PM
to chi-html5...@googlegroups.com
That's really cool! Nice work!

John Smith

Aug 9, 2011, 1:01:30 PM
to chi-html5...@googlegroups.com
Awesome work!

John Smith
Sr. Interactive Developer

Elijah Snyder

Aug 9, 2011, 2:35:46 PM
to chi-html5...@googlegroups.com
I haven't looked at the source... :\
... but I was under the impression you can't signal a Web Audio sample
to play at a specific time.

How did you overcome that in your app? Got any tips on how someone
else can get the same great results you did?

Gregory Jopa

Aug 10, 2011, 10:16:03 AM
to chi-html5...@googlegroups.com

Actually, this app isn't doing any audio sampling. Instead it's doing audio synthesis with the new Web Audio API that is being developed. I'm using the Audiolet.js library to do the synthesis, which basically produces a sine wave and applies effects to it to try and simulate the sound of a note. Check out this presentation from the Mozilla Summit that explains the new audio APIs: http://www.youtube.com/watch?v=1Uw0CrQdYYg
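To make the idea concrete, here's a minimal sketch (not the app's actual code, and not using Audiolet's API) of what "synthesizing a note" means: compute sine-wave samples at a given pitch and shape them with a simple decay envelope. The `midiToFrequency` and `renderNote` names are my own for illustration; Audiolet.js handles this kind of work for you.

```javascript
const SAMPLE_RATE = 44100;

// Equal-temperament frequency for a MIDI note number (A4 = note 69 = 440 Hz).
function midiToFrequency(midiNote) {
  return 440 * Math.pow(2, (midiNote - 69) / 12);
}

// Render `seconds` of a decaying sine tone into a plain array of samples.
function renderNote(midiNote, seconds) {
  const frequency = midiToFrequency(midiNote);
  const total = Math.floor(SAMPLE_RATE * seconds);
  const samples = new Array(total);
  for (let i = 0; i < total; i++) {
    const t = i / SAMPLE_RATE;
    const envelope = Math.exp(-3 * t); // exponential decay softens the raw sine
    samples[i] = envelope * Math.sin(2 * Math.PI * frequency * t);
  }
  return samples;
}
```

In a browser you'd hand samples like these to the audio API (or let a library like Audiolet generate them in real time) instead of building arrays yourself.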

 

One tip I have: when developing a project that is a mashup of several different technologies, try to get each technology working separately first and then mash them together. With our Cloud Composer app we first got the audio working in a basic web page, then got the canvas click events working in a separate page, and then got the add-notes feature working with a basic VexFlow canvas in another separate page. Once we had all of these features working on their own, we mashed them together and added the jQuery Mobile framework. With this approach it's much easier to find bugs and troubleshoot your code.
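As one example of working on a piece in isolation, the canvas click-event step can be reduced to a pure function you can test without any canvas, VexFlow, or audio code at all. Everything below is hypothetical: the staff layout numbers and function names are made up for illustration, not taken from the app.

```javascript
// Map a canvas click's y-coordinate to a note name on a treble staff.
// STAFF_TOP and LINE_SPACING are assumed layout values for this sketch.
const STAFF_TOP = 40;     // y of the top staff line (F5)
const LINE_SPACING = 10;  // pixels between adjacent staff lines
const NOTES = ['F5', 'E5', 'D5', 'C5', 'B4', 'A4', 'G4', 'F4', 'E4'];

// Snap a click to the nearest line or space and return its note name,
// or null if the click falls outside the staff.
function noteFromClick(y) {
  const step = Math.round((y - STAFF_TOP) / (LINE_SPACING / 2));
  if (step < 0 || step >= NOTES.length) return null;
  return NOTES[step];
}
```

Because this logic has no DOM dependencies, you can verify it on its own before wiring it to the real canvas click handler.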


-Greg

Elijah Snyder

Aug 10, 2011, 10:57:29 AM
to chi-html5...@googlegroups.com
Thanks for that link. I'm not a very big audio guy.... and I'm
somewhat tone deaf. :)

I look to other people to help me out with audio when it's necessary. :)

And great advice - that's pretty much exactly what our team did with
the "picture guessing game" thing. I worked on gathering images, the
"randomizer" PHP script, metadata for each image, and finally checking
for matches in the voice recognition. Another teammate did the game
loop and canvas elements with countdown and masking, another kept the
ideas flowing, and our final teammate ensured the wireframes were
created. Keeping it isolated in sections (and/or by technologies) is
without a doubt a priceless practice. :)
