The character at the beginning of the string isn't a corrupt ':'; it is the Unicode non-character '\uFDD0', which ClojureScript emits as an internal tag so it can distinguish keywords from strings.
The ClojureScript compiler writes its JavaScript output as UTF-8.
So technically, everything is fine. But if you are serving your JavaScript from a webserver, you need to ensure that it is served in a way that tells the browser it is encoded as UTF-8. You could use .htaccess to ensure that the Content-Type header is set correctly, e.g.:
Content-Type: text/javascript; charset=utf-8
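For example, an .htaccess fragment along these lines (assuming Apache with mod_mime, which provides the AddType and AddCharset directives) would set both the media type and the charset for .js files:

```apache
# Serve .js files as JavaScript with an explicit UTF-8 charset
AddType text/javascript .js
AddCharset utf-8 .js
```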
Or you could try including the encoding on the script tag (although in theory the charset attribute isn't supposed to override the HTTP headers):
<script type="text/javascript" src="script.js" charset="utf-8"></script>
However... all of that is rather fragile. I think ClojureScript needs fixing to be more robust:
When you compile with optimizations, the Closure Compiler accepts UTF-8 input but outputs everything as US-ASCII, using \uXXXX escapes to represent non-ASCII characters. This is much safer. But when you compile without optimizations, ClojureScript writes the Unicode characters out as raw UTF-8.
I think ClojureScript should be modified to emit US-ASCII with \u escaping, as the Closure Compiler does.
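To illustrate the difference (the keyword name "foo" and the exact internal tagging format are assumptions here; the precise layout is a compiler detail):

```javascript
// A ClojureScript keyword such as :foo compiles to an ordinary JS string
// tagged with the non-character U+FDD0. Written with a \u escape, the
// source file is pure ASCII, so it decodes identically under any
// ASCII-compatible charset the browser might assume. Written as raw
// UTF-8 bytes, the same literal gets mangled if the browser decodes
// the file as, say, ISO-8859-1.
var kw = "\uFDD0foo"; // escaped form, as the Closure Compiler would emit

// However the file's bytes are decoded, this string's first code unit
// is U+FDD0:
console.log(kw.charCodeAt(0).toString(16)); // "fdd0"
console.log(kw.length);                     // 4 (tag character + "foo")
```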
--
Dave