Home Of The Underdogs (.net)


Acharis

Mar 24, 2009, 8:58:35 AM
to Home of the Underdogs Revival Project
Mission Statement: To continue the spirit, look, feel and
functionality of HotU.

http://HomeOfTheUnderdogs.net

It's a fairly close resurrection of the original website. The most
basic features work (browse by game/publisher, search, forum, etc.);
no downloads yet.


Technical stuff:
It uses 3 servers right now (website, forum, image host); later it
will use additional ones to handle downloads. It was designed so the
website (the database used to browse games) sits on the best machine,
capable of handling 10-20 million impressions a month, so it should be
online all the time. Lower-quality servers (or free mirrors provided
by users) will be used to store other files.

Finances:
Fully free. Only 'normal' advertisements (no popups, and probably a
bit fewer than before), no donations of any kind, no premium
membership, etc. I can also put up money upfront to operate for the
first months. Anyway, there is no problem here, since I have kept
sites of similar size online before.

Legal issues:
No plan yet, and I have no experience with this; I hope to get some
help on this from you...

Community:
The forum is set up the way it was before (more or less).

Future:
First, I want to make it fully operational before going further. But
the general plan is that a group of maintainers would create an XML
file (like the current one) which would be added to the database once
every 3 months. As for files, these can be handled by individuals or
small teams, since it would be a distributed network independent of
the website. Once these are done, a legal team would check the new
entries and enable the download link for those that do not violate
copyright. This is quite convenient, since mirror providers can upload
anything they want without checking and do not need to worry about
copyright, because a file would be marked as downloadable from their
server only after it was checked.
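The batch-import-plus-legal-review flow described above could look something like this. The XML schema and table layout here are my own invention for illustration — the thread doesn't show the real HotU export format — and the key point is the `download_enabled` flag that stays off until an entry is cleared:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical maintainer XML; the real HotU file format is not
# shown in the thread, so these element names are assumptions.
SAMPLE = """
<games>
  <game id="101" title="Jagged Alliance" publisher="Sir-Tech"/>
  <game id="102" title="Betrayal at Krondor" publisher="Dynamix"/>
</games>
"""

def import_batch(xml_text, conn):
    """Stage new entries; downloads stay disabled until a legal check."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS games ("
        " id INTEGER PRIMARY KEY, title TEXT, publisher TEXT,"
        " download_enabled INTEGER DEFAULT 0)"  # 0 = awaiting legal review
    )
    for game in ET.fromstring(xml_text).iter("game"):
        conn.execute(
            "INSERT OR REPLACE INTO games (id, title, publisher)"
            " VALUES (?, ?, ?)",
            (int(game.get("id")), game.get("title"), game.get("publisher")),
        )
    conn.commit()

def approve(conn, game_id):
    """Called by the legal team once an entry is cleared for download."""
    conn.execute(
        "UPDATE games SET download_enabled = 1 WHERE id = ?", (game_id,))
    conn.commit()

conn = sqlite3.connect(":memory:")
import_batch(SAMPLE, conn)
approve(conn, 101)
enabled = [r[0] for r in conn.execute(
    "SELECT id FROM games WHERE download_enabled = 1")]
print(enabled)  # [101]
```

This matches the convenience claimed above: mirrors can hold anything, but an entry only surfaces as downloadable after the review step flips the flag.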


I hope you like it :)

Nacho

Mar 26, 2009, 4:43:32 AM
to hotu-r...@googlegroups.com
Yes! It's loyal to the old site. That's what Dan and Andrew were looking for (a real snapshot of the old site). Once it's rebuilt, we could perform a snapshot copy for the historians.

Afterwards, any mod, improvement, etc. could be done by working on the infrastructure, code and web design.
It's damn fast! (that's why I don't like Joomla ^___^ ).

2009/3/24 Acharis <krzyszt...@gmail.com>

Acharis

Mar 26, 2009, 7:19:14 AM
to Home of the Underdogs Revival Project
Update:
- themes work
- instead of a download there is a web-search link (at least for now)


> It's damn fast! (that's why I don't like Joomla ^___^ ).
Yes, speed and handling high load was the priority :)
- it uses a dual Xeon CPU (4 cores total)
- the database is highly optimised (at most 2 queries per page,
usually 1)
- soon the images will be hosted on 2 machines (browsers can
effectively handle only 2-3 simultaneous connections), which will
result in even faster page display.



Rince

Mar 26, 2009, 7:21:56 AM
to hotu-r...@googlegroups.com
Browsers handle 2-3 simultaneous connections *PER MACHINE*.

- Rich
--

Bilbo's First Law: You cannot count friends that are all packed up in barrels.

Walter

Mar 26, 2009, 8:42:39 PM
to hotu-r...@googlegroups.com
> Browsers handle 2-3 simultaneous connections *PER MACHINE*.

Actually I'm fairly sure that this is 'per domain'.

The implication being that if you create x.yourdomain.com,
y.yourdomain.com, z.yourdomain.com and split resources on a
single page between them, then the browser will agree to open
more connections and download content quicker - even if they
all resolve to the same system.
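The splitting Walter describes is usually done by assigning each asset path to a shard hostname deterministically, so a given image always gets the same URL and stays cacheable across page views. A minimal sketch, using his hypothetical x/y/z subdomains (the hash-based assignment is my illustration, not something from the thread):

```python
import hashlib

# Hypothetical shard hostnames from Walter's example; in practice
# these would be real subdomains all resolving to the same system.
SHARDS = ["x.yourdomain.com", "y.yourdomain.com", "z.yourdomain.com"]

def shard_url(path):
    """Map an asset path to the same shard every time, so each image
    keeps a stable URL and the browser cache stays effective."""
    digest = hashlib.md5(path.encode("utf-8")).digest()
    host = SHARDS[digest[0] % len(SHARDS)]
    return f"http://{host}{path}"

# The same path always lands on the same shard:
print(shard_url("/covers/fallout.jpg") == shard_url("/covers/fallout.jpg"))  # True
```

Because the per-domain connection limit applies to each hostname separately, spreading a page's images across three shard names lets the browser open roughly three times as many parallel downloads, even when all three names point at one box.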

- Walter

Dylan Marsh

Mar 27, 2009, 4:33:10 AM
to hotu-r...@googlegroups.com
I think what you are all talking about here is a CDN, or Content
Delivery Network. CDNs are used to do two things. First and most
important, they distribute the load on both the server's backbone
internet connection and its CPU / system resources across multiple
machines, resulting in faster load times. The second service they
provide is content delivery from servers in closer proximity to the
host receiving the data, an optional and often overlooked aspect of
many CDN implementations. To provide the second service, the
geographical location of the host must first be determined, which is
only done once per session, during the initial request. Obviously,
there must be one "primary" server that handles all the initial
requests, determines the host's location, places a cookie on the host
(so the host's session can be tracked), and routes the request to the
closest "tertiary" server available. For the purposes of HOTU, I would
ignore this second aspect for now. CDN servers do not contain an
entire copy of everything on the site; they only serve STATIC content
-- images, mostly. Static is really the operative word when discussing
CDNs. For example, if you go to ebay.com and look at the properties of
the eBay logo at the top, you'll notice that it was served from a
different domain, "ebaystatic.com". This is a CDN in action.

There is one other thing I want to mention, and I'm not sure if this
has been brought up or even considered, but implementing a physically
separate machine or machines to act as database servers for a web
application is NOT a CDN and has nothing to do with a CDN. Using
separate boxes whose sole purpose is serving database queries is a
great idea; in fact it can reduce the bottleneck on a site more than
anything, but this setup is a "multi-tier" web application, and
generally more difficult to implement. The distinguishing
characteristic is that, with a database server, the communication
occurs on the SERVER side, between the web site's application-logic
server and the database server. This is opposed to the setup of a CDN,
where communication is with the client side, between the server and
the site visitor's browser.

That said, I'm not sure of the traffic numbers on the old or new HOTU
sites, but I suppose implementing such a system is a good idea even if
it's not completely necessary at this point (and I'm not saying it
isn't). If you have additional servers available, my first priority
(as far as a CDN goes) would be implementing a dedicated static image
server (the first aspect I mentioned above; ignore the second for
now). You would do this simply by moving all the images to the new
static CDN server box. You can use a sub-domain, as has already been
discussed, like img.hotud.org -- or you could use a completely
separate domain like hotud-static.org; one is really no better than
the other. In either case, the DNS records will need to point to a
DIFFERENT IP address, which points to a different server, of course.

That's the key to what was being discussed earlier: in order for a
browser to open multiple _sets_ of simultaneous connections while
loading a single website, there must be content on that site being
served from two different IPs. Since it was decided that this site
would run on Joomla, a system I have had many unfortunate experiences
with -- mostly related to its "obese" code base and excessive queries
-- you want to lighten the load on the machine executing the PHP code
(god forbid the same machine be acting as the DB server as well, which
I'd guess it is) as much as possible, from a CPU-time point of view
more than a bandwidth one. I also agree with previous suggestions
about storing the downloadable game files on a different machine -- to
prevent a bandwidth bottleneck. The best way to do this would be to
use a sub-domain whose DNS points to a separate IP as well, like
download.hotud.org.

If you wanted to use a distributed database solution (multiple boxes
with different IPs serving DB queries), you would need a
load-balancing system in place to distribute all the Joomla queries
across the servers. This will NOT happen on its own. You would also
need to correctly configure database replication on all systems, so
that when a record is updated on one DB server it propagates to the
other DB servers -- ensuring they each have an exact copy of the
database.

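The balancing half of that setup is often a simple read/write split: writes go to one primary (so replication can fan them out), reads rotate across replicas. A minimal sketch — the hostnames are hypothetical, and a real deployment would do this in a proxy or DB driver rather than hand-rolled routing:

```python
import itertools

# Hypothetical host names; a real setup would point these at the
# primary and replica database boxes behind the web application.
PRIMARY = "db-primary.hotud.org"
REPLICAS = ["db-replica1.hotud.org", "db-replica2.hotud.org"]
_replica_cycle = itertools.cycle(REPLICAS)

def route_query(sql):
    """Send writes to the primary (replication propagates them to the
    replicas) and spread reads round-robin across the replicas."""
    verb = sql.lstrip().split(None, 1)[0].upper()
    if verb in ("INSERT", "UPDATE", "DELETE", "REPLACE"):
        return PRIMARY
    return next(_replica_cycle)

print(route_query("UPDATE games SET title = 'X' WHERE id = 1"))  # db-primary.hotud.org
print(route_query("SELECT * FROM games"))                        # db-replica1.hotud.org
print(route_query("SELECT * FROM games"))                        # db-replica2.hotud.org
```

As noted above, none of this happens on its own: both the routing layer and the replication between the DB boxes have to be configured explicitly.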
I've been a web developer for many years, so I thought I would offer
some assistance on this issue. While incredibly slow page-load times
do not seem to be a problem on the site yet, if it becomes anywhere
near as popular as the old HOTU was at its peak, it will be a problem
soon. The methods I outlined above are the only true solution to such
an issue. People may tell you that simply upgrading your internet
backbone or the components in the server box will make this go away,
but I've been there and I can tell you beyond a shadow of a doubt that
this is not the solution. While those things may help, they do not
address the real issue causing the poor performance and are band-aid
solutions at best. I hope these are issues that the new HOTU will
someday face -- these things are a good sign, of course -- but left
uncorrected there's nothing that will kill a site faster than poor
page-load times and download speeds. If and when this becomes the
case, I'm always here. I've had experience with such situations on
several sites months after they were deployed, and I've always been
able to get the issues corrected and restore load times to
satisfactory levels.

All the best,
gonzo






Lord_Pall

Mar 27, 2009, 3:38:41 AM
to Home of the Underdogs Revival Project
This is some great advice. I'm already looking into future changes for
the Hotud.org site to keep performance up to par, as you mention. My
goal is to start the efficiency improvements ASAP so that I stay ahead
of the curve on performance.

Basically, the idea is to have a plan of action with individual
thresholds for performance and site growth. Once a threshold is
crossed (bandwidth, response time, page loads, whatever), the site
expands. That should give us enough time to do the prep work so that
the upgrades and growth aren't difficult, but without taking on the
additional overhead or cost before it's necessary.
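That threshold-driven plan can be sketched as a simple check run against current metrics. The metric names and limit values below are illustrative assumptions only — the thread gives no real numbers:

```python
# Illustrative thresholds only; the thread doesn't give real figures,
# so these metric names and limits are assumptions.
THRESHOLDS = {
    "bandwidth_gb_per_day": 50,
    "avg_response_ms": 800,
    "page_loads_per_day": 500_000,
}

def breached(metrics):
    """Return the thresholds the current metrics have crossed --
    the signal to start the next expansion step."""
    return [name for name, limit in THRESHOLDS.items()
            if metrics.get(name, 0) >= limit]

current = {"bandwidth_gb_per_day": 12,
           "avg_response_ms": 950,
           "page_loads_per_day": 120_000}
print(breached(current))  # ['avg_response_ms']
```

Checking this periodically gives exactly the lead time described above: the prep work for an upgrade starts when a limit is crossed, not before the overhead is actually needed.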