I want some information on how to structure the database and
the PHP files so that I can achieve speed and efficiency.
Can someone please give me suggestions and point me to references
where I can find this information?
I don't have any sites offhand, but you can start with the database side
with a Google search for
database normalization
That will help you with the theory of normalization and turn up some
tutorials as well. Most sites cover up to third normal form.
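The normalization idea can be sketched in a few lines. This is a minimal illustration of third-normal-form thinking using PDO with an in-memory SQLite database; the table and column names (customers, orders) are invented for the example, not from any real schema.

```php
<?php
// Sketch: splitting repeated customer data out of an orders table (3NF).
// In-memory SQLite via PDO; schema and data are invented for illustration.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Normalized: each customer is stored once, orders reference them by id,
// instead of repeating the customer's name on every order row.
$db->exec("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL)");
$db->exec("CREATE TABLE orders (id INTEGER PRIMARY KEY,
                                customer_id INTEGER NOT NULL REFERENCES customers(id),
                                total REAL NOT NULL)");

$db->exec("INSERT INTO customers (id, name) VALUES (1, 'Alice')");
$db->exec("INSERT INTO orders (customer_id, total) VALUES (1, 9.99), (1, 24.50)");

// A join reassembles the combined view whenever you need it.
$rows = $db->query("SELECT c.name, o.total
                    FROM orders o JOIN customers c ON c.id = o.customer_id")
           ->fetchAll(PDO::FETCH_ASSOC);
```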
Other than that, it's basically a matter of style. There's really no "right"
way to do it. The main thing is to design your pages ahead of time (don't code
them - just sketch them out on paper if nothing else). This will tell you what
data you need in the database. Then figure out what code you need for each piece.
The main things are that the site works and is easy to maintain.
--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
jstu...@attglobal.net
==================
Lazy computation can speed up things as well (see for instance
http://www.w-p.dds.nl/article/wrtabrec.htm )
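The simplest form of lazy computation in PHP is to defer an expensive value until it's first needed and then cache it. A minimal sketch, with a hypothetical report_summary() function and a placeholder standing in for the expensive work:

```php
<?php
// Sketch of lazy computation: the expensive value is only computed on first
// use, then cached. report_summary() is a hypothetical example function.
function report_summary() {
    static $cache = null;          // persists across calls within the request
    if ($cache === null) {
        // Imagine an expensive aggregation here; a placeholder stands in.
        $cache = array_sum(range(1, 1000));
    }
    return $cache;
}

$a = report_summary();  // computed once
$b = report_summary();  // served from the cache, no recomputation
```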
Best regards
Try to study up on SQL. Usually you can pass a lot of the heavy load to the DB
engine rather than doing it in PHP. For instance, MySQL 5 supports
subselects; just using subselects in my script has improved my PHP run
time by 300%. Also, try to use the native PHP MySQL functions instead of the
DB.php class. In my tests the DB.php class is almost 2000% slower than the
direct MySQL functions. Finally, try to use a persistent connection. That is,
create a DB object, open it at the beginning of the script, and don't close it
until the script has finished.
As far as DB structure goes, the rule of thumb is: never save the same data in
more than one place in the DB.
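The subselect point above can be sketched as follows: one round trip that lets the engine do the filtering, instead of fetching rows and issuing a query per row from PHP. SQLite via PDO stands in for MySQL 5 here, and the schema is invented; the 300% figure is the poster's, not reproduced by this sketch.

```php
<?php
// Sketch: replacing a per-row PHP loop with a single subselect.
// SQLite stands in for MySQL 5; tables and data are invented.
$db = new PDO('sqlite::memory:');
$db->exec("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)");
$db->exec("CREATE TABLE banned (user_id INTEGER)");
$db->exec("INSERT INTO users (id, name) VALUES (1,'Ann'), (2,'Bob'), (3,'Cal')");
$db->exec("INSERT INTO banned (user_id) VALUES (2)");

// One query: the engine filters with a subselect, instead of PHP fetching
// every user and running an extra "is this one banned?" query per row.
$active = $db->query("SELECT name FROM users
                      WHERE id NOT IN (SELECT user_id FROM banned)")
             ->fetchAll(PDO::FETCH_COLUMN);
```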
- Bogdan
x0054 wrote:
> Finally, try to use persistent connection. That is, create a DB
> object, open it in the beginning of the script and don't close it until
> the script has finished.
Well, technically, a persistent connection is never closed - the PHP engine
keeps it open, so the next script to "open" a persistent connection does
not open it - it reuses the connection from a previously run script.
See php.net/mysql_pconnect ...
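The reuse idea behind a persistent connection can be sketched with a cached handle: later "opens" return the same object rather than paying the connection cost again. SQLite via PDO stands in for mysql_pconnect() here so the sketch is self-contained; in real code the PHP engine does this caching for you across scripts.

```php
<?php
// Sketch of connection reuse: the handle is opened once and subsequent
// "opens" return the same object. PDO/SQLite stands in for mysql_pconnect().
function get_connection() {
    static $handle = null;
    if ($handle === null) {
        $handle = new PDO('sqlite::memory:');  // real code: mysql_pconnect(...)
    }
    return $handle;
}

$first  = get_connection();   // opens the connection
$second = get_connection();   // reuses it; no new connection overhead
```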
--
----------------------------------
Iván Sánchez Ortega -i-punto-sanchez--arroba-mirame-punto-net
Head: the apparatus with which we think that we think.
Thanks for the input.
More is welcome.
- PD
I use MySQL 4, so subselects are out.
Also, I'm confused: I don't understand how a persistent connection
can help boost speed. I thought it was the other way around, so I open and
close a connection at every query. Won't persistent connections leave the
DB with a lot of open connections? Please throw some light on
this one.
Thanks again!
In some cases, one may run into a data storage problem before running into a
script/query speed problem, even without application server optimization.
This can be taken care of using a storage area network.
Until then, if you want to squeeze the most out of one box, bear these in
mind:
PHP - Split your files up and only run the code that you need for the action
being carried out
PHP - Limit your use of abstracted methods - make sure they are efficient,
and if not write your own.
PHP - You can often write little scripts to loop over subroutines that
contain code you suspect to be sluggish, and determine exactly what the
overhead is. Some built-in PHP methods are faster than combinations of
others that achieve the same thing, especially in specific contexts and
especially for string parsing tasks.
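The "little scripts to determine the overhead" tip above can be sketched as a micro-benchmark loop. The iteration count and the two functions compared (str_replace vs. preg_replace, a common string-parsing pair) are arbitrary choices for illustration:

```php
<?php
// Sketch of a micro-benchmark: time two equivalent operations and compare.
// The functions and iteration count are arbitrary illustration choices.
function time_calls(callable $fn, $iterations = 10000) {
    $start = microtime(true);
    for ($i = 0; $i < $iterations; $i++) {
        $fn();
    }
    return microtime(true) - $start;
}

$subject = str_repeat('foo bar ', 100);
$t_str  = time_calls(function () use ($subject) {
    return str_replace('bar', 'baz', $subject);
});
$t_preg = time_calls(function () use ($subject) {
    return preg_replace('/bar/', 'baz', $subject);
});

// Both produce the same result; the timings tell you which is cheaper here.
$same = str_replace('bar', 'baz', $subject)
      === preg_replace('/bar/', 'baz', $subject);
```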
MySQL - Indexes. Learn & use them. Keep them as small as possible.
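To see whether an index is actually being used, you can check the query plan. A minimal sketch, with SQLite's EXPLAIN QUERY PLAN standing in for MySQL's EXPLAIN and invented table/index names:

```php
<?php
// Sketch: add an index and confirm the lookup uses it instead of scanning.
// SQLite's EXPLAIN QUERY PLAN stands in for MySQL's EXPLAIN; names invented.
$db = new PDO('sqlite::memory:');
$db->exec("CREATE TABLE members (id INTEGER PRIMARY KEY, email TEXT, city TEXT)");
$db->exec("CREATE INDEX idx_members_email ON members (email)");

// With the index in place, the equality lookup searches the index
// rather than scanning the whole table.
$plan = $db->query("EXPLAIN QUERY PLAN
                    SELECT id FROM members WHERE email = 'a@example.com'")
           ->fetchAll(PDO::FETCH_ASSOC);
$uses_index = strpos($plan[0]['detail'], 'idx_members_email') !== false;
```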
Good luck & happy programming,
ECRIA Dev Team
http://www.ecria.com
Yes and no. Persistent connections will stay open - but your script will run
very quickly compared to real-time. You probably won't have very many
connections in use at one time. OTOH, there is a significant amount of overhead
in connecting to MySQL.
Personally, I do it a little differently. I make the connection at the
beginning of the script (or no later than first use). Then I close the
connection before the script exits.
In most cases it works about as fast as persistent connections (which have their
own overhead associated with them) but doesn't make multiple connections per
script to the database.
As no-one's pointed you to it yet, get Advanced PHP Programming by George
Schlossnagle.
It's an absolutely fantastic book and covers SQL optimisation, caching
within your code, accelerators/caches, and everything you'd need to run a
high-traffic site.
Cheers,
Andy
--
Andy Jeffries MBCS CITP ZCE | gPHPEdit Lead Developer
http://www.gphpedit.org | PHP editor for Gnome 2
http://www.andyjeffries.co.uk | Personal site and photos
Well, I think I can give you some sound advice here, since
we have implemented a couple of those in PHP/MySQL. Of the advice
I've seen so far, definitely get the book on optimization. Just some
quickies:
1. When you lay out your tables, go for FIXED field sizes,
not varchars, for all the basic user info. You may waste a little
space, but you make MySQL's indexing job MUCH easier!
2. If you have stuff like essays or message contents that
have to be large text, put those in tables by themselves and index
into them.
3. The Query cache is your friend, but to make good use of
it you've got to limit writes to the table. Think of keeping stuff
like 'last active time' in another table.
4. Just about FORGET treating each person as some type
of OBJECT so that every time you want to put up info on a person
you have to do an individual query. If you are showing search results
for 50 people, go get all the information you will need on each of
them at once.
5. Session data is your friend. Keep as much as you can
there of the stuff you will need on every page; call it a cache.
If you design it in, it's not too hard to have the cache updated when
you write to the real database.
6. Some things, like enforcing limits on the number of saved
messages, etc., can be a real pain to verify on EVERY query.
Set up a cron job to run off-hours, even just once a day, to
check and remove the extra stuff (users can 'temporarily' be over
a limit).
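Point 5 above can be sketched as follows. The get_display_name() helper is invented for illustration, and a plain array stands in for $_SESSION so the sketch runs from the command line without session_start():

```php
<?php
// Sketch of point 5: treat session data as a per-user cache for values
// needed on every page. A plain array stands in for $_SESSION here;
// get_display_name() is a hypothetical helper.
function get_display_name(array &$session, PDO $db, $user_id) {
    if (!isset($session['display_name'])) {
        $stmt = $db->prepare("SELECT name FROM users WHERE id = ?");
        $stmt->execute([$user_id]);
        $session['display_name'] = $stmt->fetchColumn();  // cache it once
    }
    return $session['display_name'];  // later pages: no query at all
}

$db = new PDO('sqlite::memory:');
$db->exec("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)");
$db->exec("INSERT INTO users (id, name) VALUES (7, 'Dana')");

$session = [];                                  // stands in for $_SESSION
$first  = get_display_name($session, $db, 7);   // hits the database once
$second = get_display_name($session, $db, 7);   // served from the "session"
```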
Hope this helps!
--
John
___________________________________________________________________
John Murtari Software Workshop Inc.
jmurtari@following domain 315.635-1968(x-211) "TheBook.Com" (TM)
http://thebook.com/
I actually disagree: there's nothing wrong with having an object for a
person, but I agree about not doing individual queries for every lookup.
My classes all implement a selectIDs($array) method that you call once;
it then gets all of those IDs and caches them, so when I call my
findID($ID) method it just retrieves it from the cache.
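A minimal sketch of that selectIDs()/findID() pattern, with an invented PersonStore class and schema (the poster's actual class names and tables aren't shown in the thread):

```php
<?php
// Sketch of the selectIDs()/findID() pattern: batch-load once, then serve
// individual lookups from an in-memory cache. Class and schema are invented.
class PersonStore {
    private $db;
    private $cache = [];

    public function __construct(PDO $db) { $this->db = $db; }

    public function selectIDs(array $ids) {
        $placeholders = implode(',', array_fill(0, count($ids), '?'));
        $stmt = $this->db->prepare(
            "SELECT id, name FROM people WHERE id IN ($placeholders)");
        $stmt->execute($ids);
        foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
            $this->cache[$row['id']] = $row;   // one query primes the cache
        }
    }

    public function findID($id) {
        return isset($this->cache[$id]) ? $this->cache[$id] : null; // no query
    }
}

$db = new PDO('sqlite::memory:');
$db->exec("CREATE TABLE people (id INTEGER PRIMARY KEY, name TEXT)");
$db->exec("INSERT INTO people (id, name) VALUES (1,'Ann'), (2,'Bob')");

$store = new PersonStore($db);
$store->selectIDs([1, 2]);       // single round trip for the whole page
$ann = $store->findID(1);        // served from the cache
```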