Weekend Hackathon


Leonardo Ludueña

unread,
Jul 31, 2009, 1:28:45 PM
to Hathi Developers, hathi-devel
Hi all, happy Friday!

This weekend we will be online working on some small but important
changes in Hathi.

We will replace the current globalization mechanism with a more standard
approach. The idea is to use Gettext# or the Mono.Unix I18N support:

http://mono-project.com/I18nGettext
http://mono-project.com/I18N_with_Mono.Unix

Behind this change is an effort to provide a better platform for
translators, so we can add more localizations more easily, using the
same tools other open source projects use.

As an extra, I found a tool called "Resource Translator For
Localization" that uses a translation API based on the Google Translate
service.

We will try to add support for .isl, plain XML, and .po files in order
to get some automated translation.
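A .po catalog is just a plain-text list of msgid/msgstr pairs, which is what makes automated translation tooling feasible. A minimal reader sketch (the sample catalog contents are invented for illustration, and real .po parsing also handles headers, comments, plurals, and multi-line strings):

```python
import re

def parse_po(text: str) -> dict[str, str]:
    # Tiny .po reader: pairs each msgid with the msgstr on the next
    # line. Ignores comments, plural forms, and multi-line strings.
    pairs = re.findall(r'msgid "(.*)"\s*\nmsgstr "(.*)"', text)
    return {src: dst for src, dst in pairs if src}

sample = '''msgid "Send file"
msgstr "Enviar archivo"

msgid "Cancel"
msgstr "Cancelar"
'''

print(parse_po(sample))
# {'Send file': 'Enviar archivo', 'Cancel': 'Cancelar'}
```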

You are invited to participate.

The #hathi IRC channel and MSN Messenger will be available all weekend,
starting this afternoon (Americas time).

Regards.

--
-----------------------------------------------------------------
Leonardo Ludueña
msn: elno...@gmail.com
blog: www.leonardoluduena.com.ar
-----------------------------------------------------------------


Dmytro

unread,
Jul 31, 2009, 1:38:22 PM
to hath...@googlegroups.com
Unfortunately I can't take part in this; I am leaving on vacation early
tomorrow morning. But I hope this won't be the last such event :)

Ramone Hamilton

unread,
Jul 31, 2009, 2:07:34 PM
to hath...@googlegroups.com
Same for me: I am taking my kids to Disney World and will be gone from the 1st through the 5th.

Justin Pierce

unread,
Jul 31, 2009, 2:22:50 PM
to Hathi DEV
Haha, that's awesome Ramone, I bet you and your kids will have a blast! I've never been, but I hear it's a neat place. I'll be around all weekend, Leo, because I'm a loser... :P But all that language/localization stuff isn't really my area, so I will continue my work and research on Kad, and maybe somebody could join me? Preferably someone good at programming who can help me implement my research, but any help would be appreciated. I've already researched Kademlia extensively and need help turning that information into code! My programming skills are progressing nicely, but this is some very complicated stuff.

Kademlia is a totally decentralized network, with users and file hashes stored in a very large distributed hash table. It computes 'distances' between users by XORing their respective user ID hashes. A user has to bootstrap into the network, meaning the client app must know the IP address of an existing member of the Kad network to get in; it then fetches that node's stored list of client addresses, and so forth. Most of the time this is done by finding eDonkey clients that are also connected to Kad, because it is very common for eMule clients to be connected to both eDonkey and Kad simultaneously. After the client bootstraps, it keeps collecting more user IDs (nodes) by traversing the network, storing these hashes into buckets, each bucket covering a range of distances from the client: user IDs at distance 5 go in one bucket, IDs at distance 10 in another, and so on. There are also many complex routing schemes throughout the network. File hashes are searched for by traversing the known nodes. Once a file is found, the client requests a connection with the client that has the file, and the transfer begins. So Kad is for searching for files, users, etc.; the actual file transfer happens directly client-to-client.

This is just a basic outline of how things work; if anyone is interested in helping, please notify me or Leo and we'd be glad for your help. I guess I've taken on the main role in the Kad implementation, which is fine, but I shouldn't be the lead programmer. I'm decent at networking and at understanding all of this research, but I will need help turning it into code, and I hope to learn a lot by doing so. My programming ability is improving, but I won't be able to do this on my own :P Also, the main underlying protocol of Kad is UDP. Thanks!
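The XOR metric and bucket idea described above can be sketched in a few lines of Python. Kad derives node IDs from MD4 hashes; MD5 stands in here since it is in Python's standard library, and the node names are made up for illustration:

```python
import hashlib

def node_id(name: str) -> int:
    # 128-bit node ID derived from a hash (MD5 as a stand-in for the
    # MD4 hashes real Kad/eMule clients use).
    return int.from_bytes(hashlib.md5(name.encode()).digest(), "big")

def distance(a: int, b: int) -> int:
    # Kademlia's distance between two IDs is simply their XOR.
    return a ^ b

def bucket_index(a: int, b: int) -> int:
    # The bucket a contact falls into is the index of the highest
    # differing bit, i.e. floor(log2(distance)).
    return distance(a, b).bit_length() - 1

alice, bob = node_id("alice"), node_id("bob")
print(bucket_index(alice, bob))
```

Note that XOR makes the metric symmetric (`distance(a, b) == distance(b, a)`) and gives `distance(a, a) == 0`, which is what lets every node organize its routing table the same way.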

