
Need help with handling timeouts in curl_multi_exec

Donald Nash

Jul 27, 2010, 1:22:35 PM
Hello,

In order to use curl_multi_exec, I've derived my code from the following
link: http://www.rustyrazorblade.com/2008/02/curl_multi_exec/

I've got a bunch of URLs to execute, usually 10 in number. Sometimes, I
get the error message:
Fatal error: Maximum execution time of 30 seconds exceeded in... This
message points to the curl_multi_exec line in my code.

I need to continue processing in spite of this and also keep a note of
which URLs failed to execute.

Is there a way of handling timeout errors resulting from curl_multi_exec
operations?

Thanks!

--
A

Marious Barrier

Jul 27, 2010, 1:36:22 PM
> Is there a way of handling timeout errors resulting from curl_multi_exec
> operations?

It is not a curl error; it happens because your PHP script ran for too
long before returning anything to the user.

Is that your server?

As a workaround, you could process one or two URLs per request, then
redirect back to the script to start the next batch.
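A minimal sketch of that batch-and-redirect workaround (the URL list, batch size, and `offset` query parameter are all hypothetical, not from the original post):

```php
<?php
// Hypothetical list of URLs to fetch, a couple per request.
$urls = ['http://example.com/a', 'http://example.com/b',
         'http://example.com/c', 'http://example.com/d'];
$batchSize = 2;

// Which batch we are on, passed back to ourselves via the query string.
$offset = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;

foreach (array_slice($urls, $offset, $batchSize) as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5); // cap each fetch well under 30 s
    curl_exec($ch);
    curl_close($ch);
}

// Redirect to ourselves to process the next batch, if any remains.
if ($offset + $batchSize < count($urls)) {
    header('Location: ?offset=' . ($offset + $batchSize));
    exit;
}
```

Each request then stays well under the 30-second limit, at the cost of one HTTP round trip per batch.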

Jerry Stuckle

Jul 27, 2010, 3:11:23 PM

It means you've been processing for over 30 seconds, which is the limit
in most installations (and actually quite a bit of execution time). You
can change that value, if your hosting company allows. See
http://us.php.net/manual/en/function.set-time-limit.php for more info.
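For example (a sketch; the call is a no-op on hosts running PHP in safe mode, and the 120-second value is just illustrative):

```php
<?php
// Raise the script's execution limit to 120 seconds. A value of 0
// removes the limit entirely. Note the call also restarts the timer,
// so the script may run up to 120 seconds from this point onward.
set_time_limit(120);

echo ini_get('max_execution_time'); // reflects the new limit
```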

But remember, if you take too long to respond to the client, it will
time out also (which is what Marious referred to, but is an entirely
different problem).

--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
jstu...@attglobal.net
==================

Donald Nash

Jul 27, 2010, 11:00:06 PM
On Wednesday 28 July 2010 12:41 AM, Jerry Stuckle wrote:
> It means you've been processing for over 30 seconds, which is the limit
> in most installations (and actually quite a bit of execution time). You
> can change that value, if your hosting company allows.
>
> But remember, if you take too long to respond to the client, it will
> time out also (which is what Marious referred to, but is an entirely
> different problem).

Thanks for the replies.

I want to keep the time limit at 30 seconds.

When the time elapses for a given URL, I want to move on to the next
URL and attempt it.

At the moment my script ends abruptly. I need a way to handle each
timeout and note which URL failed.

Thanks.


Jerry Stuckle

Jul 28, 2010, 8:13:32 AM

The 30 second timeout will stop your PHP script. That's the way it's
designed. You can't keep it at 30 seconds and limit execution of one
part of your script to 30 seconds. The script execution time will
always take precedence and terminate your script.

If you want to control the curl time limits, see the various options in
http://us2.php.net/manual/en/function.curl-setopt.php. There are
several to control timeouts, transfer speed, etc.

You'll probably have to handle the URLs via individual calls, however.
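A sketch of how those per-handle timeouts might look with curl_multi (the URLs and the 5-second/3-second values are hypothetical; the point is that CURLOPT_TIMEOUT caps each transfer so the batch finishes inside the script limit, and curl_errno tells you which URLs failed):

```php
<?php
$urls = ['http://example.com/a', 'http://example.com/b'];
$mh = curl_multi_init();
$handles = [];

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);        // give up on any one URL after 5 s
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 3); // and on connecting after 3 s
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// Drive the transfers; curl_multi_exec itself never blocks for long.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // wait for activity instead of busy-looping
    }
} while ($running && $status == CURLM_OK);

// Note which URLs failed (timed out, couldn't connect, etc.).
$failed = [];
foreach ($handles as $url => $ch) {
    if (curl_errno($ch) !== 0) {
        $failed[] = $url;
    }
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);

print_r($failed);
```

A slow or dead URL then only costs its own 5 seconds, the other transfers keep running in parallel, and the script finishes with a list of failures instead of being killed.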
