I'm programming a simple threaded web crawler for the fun of it, which
inserts the fetched data into a MySQL database. But the script keeps
producing errors during database queries, and it continuously brings my
MySQL server down (I assume it's because of the many queries executing at
the same time).
I figured out I need some kind of queue for the function I use to insert
into the database, to make sure it's only called once at a time.
I have looked at the Queue module, but it's more complicated than my
current Python skills can cope with. :)
Would somebody please help me out here?
Thanks.
--
John P.
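[A minimal sketch of the pattern the Queue module is usually used for here: all crawler threads put rows on a queue, and one dedicated writer thread is the only code that ever touches the database. The `insert_row` function below is a hypothetical stand-in for the real MySQL insert, so the sketch is self-contained.]

```python
import queue        # named Queue in Python 2
import threading

results = []

def insert_row(row):
    # Stand-in for the real MySQL INSERT; here we just collect rows.
    results.append(row)

q = queue.Queue()

def writer():
    # The single writer thread: pulls rows off the queue one at a time,
    # so inserts can never run concurrently.
    while True:
        row = q.get()
        if row is None:      # sentinel value: time to shut down
            q.task_done()
            break
        insert_row(row)
        q.task_done()

t = threading.Thread(target=writer)
t.start()

# Crawler threads just call q.put(row) instead of hitting the database:
for i in range(20):
    q.put(("page-%d" % i, "some data"))

q.put(None)   # tell the writer to stop
q.join()      # block until everything queued has been written
t.join()
```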
I wonder if there is a simple way of just queueing the calls to a function,
so it only runs once at a time even when called from multiple threads? :)
On Fri, 12 Mar 2010 00:54:57 -0800, Jonathan Gardner
<jgar...@jonathangardner.net> wrote:
> For lots of transactions running at once, MySQL is a terrible choice.
> Give PostgreSQL a try. It does a much better job with that kind of
> load.
Why? Is it really your best option? Are there other solutions that you
can't use? It would help to know before people give you more suggestions.
--
D'Arcy J.M. Cain <da...@druid.net> | Democracy is three wolves
http://www.druid.net/darcy/ | and a sheep voting on
+1 416 425 1212 (DoD#0082) (eNTP) | what's for dinner.
Chill. I didn't ask for an alternative to my database; it could be writing
to a file as well. What I need is some kind of queue for my function so it
doesn't crack up running 20 times at the same time. I'm working on a remote
server which I share with some friends, meaning I can't install whatever I
want to.
The problem seems to be that my threads are making MySQL queries at the
same time, before the data just requested can be fetched. Also, I said
thanks for his effort trying to help and kept a nice tone; shouldn't that
be enough?
/John
Don't worry guys, I found a solution. My problem was caused by using the
same MySQL connection for all the threads; now it's working perfectly,
with MySQL.
/John.
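[The fix John describes — one connection per thread instead of one shared connection — is commonly implemented with `threading.local`. A minimal sketch; `FakeConnection`/`connect()` are hypothetical stand-ins for the real driver's connect call (e.g. `MySQLdb.connect(...)`), since DB-API connections are generally not safe to share across threads.]

```python
import threading

class FakeConnection:
    """Hypothetical stand-in for a real database connection object."""
    pass

def connect():
    # Stand-in for MySQLdb.connect(...); each call opens a new connection.
    return FakeConnection()

_local = threading.local()

def get_connection():
    # Lazily open one connection per thread and cache it in
    # thread-local storage, so threads never share a connection.
    if not hasattr(_local, "conn"):
        _local.conn = connect()
    return _local.conn

seen = []
lock = threading.Lock()

def worker():
    conn = get_connection()
    with lock:
        seen.append(conn)   # keep objects alive so identities stay distinct

threads = [threading.Thread(target=worker) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# five threads -> five distinct connection objects
```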