
How to minimize server load when program is run


Justin C

Jun 13, 2013, 10:50:20 AM
My web-hosts are running perl 5.8.8, other software there is of a
similar age, and some things are missing (I wanted to 'nice' my
program, but there is no 'nice').

I have written a backup program to tar and gzip my entire directory
tree on their site, and also to dump the db and add that to the tar.
The program I have written runs one of my cores at 100% for two
minutes, and uses almost 100MB RAM. If there is a way I'd like to
reduce this load (as I can't 'nice' it).

I haven't tried running the program yet, I don't want to get a
bad name for maxing out the hardware. I've used core modules only,
and I've used them as per documentation for the versions that were
part of 5.8.8. I've pasted the code below, I'd be grateful for
suggestions on how I could do the same while putting as little
load on the server as possible.

~ $ cat bin/wp-backup.pl
#!/usr/bin/perl
use warnings;
use strict;
use Archive::Tar;
use File::Find;

# global vars
chomp (my $now = `date +"%Y-%m-%d-%H%M"`);
my $tar;
my $file = "site.com.$now.tar.gz";
my $backup_dir = '/var/sites/s/site.com/backups';

create_archive();
my $db_file = extract_db_data();

$tar->add_files($db_file);
$tar->write($file, 9);    # write the archive, gzip level 9

sub archive_it {
    # called by File::Find for each file; build the in-archive name
    # from the full path so nested files keep their directory
    # structure under public_html/
    (my $old_name = $File::Find::name) =~ s/^\///;
    (my $new_name = $old_name) =~ s{^var/sites/s/site\.com/}{};
    $tar->add_files($File::Find::name);
    $tar->rename($old_name, $new_name);
}

sub create_archive {
    my $www_dir = '/var/sites/s/site.com/public_html';

    $tar = Archive::Tar->new;       # declared in globals
    find(\&archive_it, $www_dir);   # archive_it adds each file to the tar
}
sub extract_db_data {
    my $db = {
        user => 'name',
        pass => 'password',
        name => 'db',
        file => "site.com.$now.sql",
        host => '1.0.0.0',
    };

    my @args = ('mysqldump', '--add-drop-table', '--complete-insert',
        '--extended-insert', '--hex-blob', "--host=$db->{host}",
        "--user=$db->{user}", "--password=$db->{pass}", $db->{name},
        '>', "$backup_dir/$db->{file}");
    system @args == 0 or die "problem running mysqldump: $!";
    return "$backup_dir/$db->{file}";
}

__END__

Thank you for any help or suggestions.


Justin.

--
Justin C, by the sea.

Dr.Ruud

Jun 13, 2013, 11:17:42 AM
On 13/06/2013 16:50, Justin C wrote:

> The program I have written runs one of my cores at 100% for two
> minutes, and uses almost 100MB RAM. If there is a way I'd like to
> reduce this load (as I can't 'nice' it).

If you want to nice it, see POSIX::nice().
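
For example (untested; 10 is the traditional increment):

    use POSIX ();

    # lower this process's scheduling priority, as nice(1) would
    defined POSIX::nice(10) or warn "nice() failed: $!";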

--
Ruud

Jim Gibson

Jun 13, 2013, 11:45:22 AM
In article <cq2p8a-...@zem.masonsmusic.co.uk>, Justin C
<justi...@purestblue.com> wrote:

> My web-hosts are running perl 5.8.8, other software there is of a
> similar age, and some things are missing (I wanted to 'nice' my
> program, but there is no 'nice').
>
> I have written a backup program to tar and gzip my entire directory
> tree on their site, and also to dump the db and add that to the tar.
> The program I have written runs one of my cores at 100% for two
> minutes, and uses almost 100MB RAM. If there is a way I'd like to
> reduce this load (as I can't 'nice' it).
>
> I haven't tried running the program yet, I don't want to get a
> bad name for maxing out the hardware. I've used core modules only,
> and I've used them as per documentation for the versions that were
> part of 5.8.8. I've pasted the code below, I'd be grateful for
> suggestions on how I could do the same while putting as little
> load on the server as possible.

What about doing a sleep(1) after every n files (or n bytes or n
seconds)? Your program will still max out a CPU while it is active, but
the average usage will be less. If you sleep(1) after each 1 second of
execution, then your program will take 4 minutes to run and use 50% of
a CPU.
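
Something like this in your File::Find callback, say (untested; the
100-file threshold is arbitrary):

    my $seen = 0;    # files handled so far

    sub archive_it {
        # ... add the current file to the tar as before ...
        sleep 1 if ++$seen % 100 == 0;    # pause after every 100 files
    }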

--
Jim Gibson

Ben Morrow

Jun 13, 2013, 12:18:00 PM

Quoth Justin C <justi...@purestblue.com>:
> My web-hosts are running perl 5.8.8, other software there is of a
> similar age, and some things are missing (I wanted to 'nice' my
> program, but there is no 'nice').
>
> I have written a backup program to tar and gzip my entire directory
> tree on their site, and also to dump the db and add that to the tar.
> The program I have written runs one of my cores at 100% for two
> minutes, and uses almost 100MB RAM. If there is a way I'd like to
> reduce this load (as I can't 'nice' it).
>
> I haven't tried running the program yet, I don't want to get a
> bad name for maxing out the hardware. I've used core modules only,
> and I've used them as per documentation for the versions that were
> part of 5.8.8. I've pasted the code below, I'd be grateful for
> suggestions on how I could do the same while putting as little
> load on the server as possible.
>
> ~ $ cat bin/wp-backup.pl
> #!/usr/bin/perl
> use warnings;
> use strict;
> use Archive::Tar;

If you're worried about memory use tar(1) is probably more parsimonious
than Archive::Tar, not least because it doesn't try to build the entire
tarball in memory before writing it out. Of course, you may not have
tar(1)...

You could also use something like Archive::Tar::Streamed, possibly
passing it a pipe to gzip(1) or an IO::Compress::Gzip filehandle. (If
you don't have IO::Compress::Gzip use IO::Zlib instead. You must have
that, Archive::Tar requires it.)
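
With tar(1) the whole job is one (untested) call, and the data streams
to disk instead of sitting in memory (using $backup_dir and $file from
your script):

    # assumes a tar that understands -z (GNU tar does); you would dump
    # the database to a file first and add its path to the member list
    system('tar', '-czf', "$backup_dir/$file",
           '-C', '/var/sites/s/site.com', 'public_html') == 0
        or die "tar failed: $?";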

> use File::Find;
>
> # global vars
> chomp (my $now = `date +"%Y-%m-%d-%H%M"`);

Ouch! POSIX::strftime.

> my @args = ('mysqldump', '--add-drop-table', '--complete-insert',
>     '--extended-insert', '--hex-blob', "--host=$db->{host}",
>     "--user=$db->{user}", "--password=$db->{pass}", $db->{name},
>     '>', "$backup_dir/$db->{file}");
      ^^^

Nope, that won't work. That will pass the '>' as a literal argument to
mysqldump, when you want it interpreted as a redirection. You need to
use 1-arg system (and be careful about your quoting) or do the whole
fork/dup/exec dance by hand.

> system @args == 0 or die "problem running mysqldump: $!";

There is a precedence trap here as well: without parentheses this
parses as system(@args == 0), which is not what you want. You need
system(@args) == 0.
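
Putting the two together, something like this (untested, and assuming
none of the values need shell quoting beyond the output filename):

    # single string, so the shell interprets the '>' redirection
    my $cmd = "mysqldump --add-drop-table --complete-insert"
            . " --extended-insert --hex-blob --host=$db->{host}"
            . " --user=$db->{user} --password=$db->{pass} $db->{name}"
            . " > '$backup_dir/$db->{file}'";
    system($cmd) == 0 or die "problem running mysqldump: $?";

Alternatively, mysqldump can write the file itself if you pass it
--result-file=FILE, which lets you keep the list form of system and
avoid the shell entirely.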

Ben

Willem

Jun 17, 2013, 1:05:16 PM
Justin C wrote:
) My web-hosts are running perl 5.8.8, other software there is of a
) similar age, and some things are missing (I wanted to 'nice' my
) program, but there is no 'nice').
)
) I have written a backup program to tar and gzip my entire directory
) tree on their site, and also to dump the db and add that to the tar.
) The program I have written runs one of my cores at 100% for two
) minutes, and uses almost 100MB RAM. If there is a way I'd like to
) reduce this load (as I can't 'nice' it).

Odd, I would have expected a tar/gzip action to be I/O bound.
That is, use 100% disk read/write capacity and not as much CPU.

Have you checked how long the 'tar' command takes, and how much CPU it
uses? Or perhaps the database dump is the culprit. You should test the
two separately.
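
For instance, to time just the tar half (untested):

    use strict;
    use warnings;

    my $wall = time;
    system('tar', 'czf', '/tmp/test.tar.gz',
           '/var/sites/s/site.com/public_html') == 0
        or die "tar failed: $?";
    my ($cuser, $csys) = (times)[2, 3];    # CPU used by child processes
    printf "tar: %ds wall clock, %.1fs CPU\n", time - $wall, $cuser + $csys;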


SaSW, Willem
--
Disclaimer: I am in no way responsible for any of the statements
made in the above text. For all I know I might be
drugged or something..
No I'm not paranoid. You all think I'm paranoid, don't you !
#EOT

Ben Morrow

Jun 17, 2013, 4:06:51 PM

Quoth Willem <wil...@turtle.stack.nl>:
> Justin C wrote:
> ) My web-hosts are running perl 5.8.8, other software there is of a
> ) similar age, and some things are missing (I wanted to 'nice' my
> ) program, but there is no 'nice').
> )
> ) I have written a backup program to tar and gzip my entire directory
> ) tree on their site, and also to dump the db and add that to the tar.
> ) The program I have written runs one of my cores at 100% for two
> ) minutes, and uses almost 100MB RAM. If there is a way I'd like to
> ) reduce this load (as I can't 'nice' it).
>
> Odd, I would have expected a tar/gzip action to be I/O bound.
> That is, use 100% disk read/write capacity and not as much CPU.

The OP is using Archive::Tar, which builds the tar entirely in memory
(uncompressed, I think). I have recommended he switch to tar(1)...

Ben

Justin C

Aug 9, 2013, 6:02:40 AM
On 2013-06-13, Justin C <justi...@purestblue.com> wrote:
> My web-hosts are running perl 5.8.8, other software there is of a
> similar age, and some things are missing (I wanted to 'nice' my
> program, but there is no 'nice').
>
> I have written a backup program to tar and gzip my entire directory
> tree on their site, and also to dump the db and add that to the tar.
> The program I have written runs one of my cores at 100% for two
> minutes, and uses almost 100MB RAM. If there is a way I'd like to
> reduce this load (as I can't 'nice' it).


[snip]

Apologies for the (very) late follow-up to this. I spent some time
pondering the options and tried Ben's suggestion of
Archive::Tar::Streamed, but it's not installed there (I did fix my bad
date call, though; thank you, Ben). In the end I used bash, and the
program runs in about ten seconds.

I realise that, had I written the program well enough, I might have
got close to that short a time with Perl, but I'm happy with the
bash solution.

Thanks to all who replied, all suggestions were useful.

johannes falcone

Aug 15, 2013, 11:19:44 PM
bzip2 at level -1 is the best combination of speed and compression on
Linux. Consider LVM snapshots for the db backup.
:)
Use rsync, or put the archive on an HTTP server and pull it with wget.