I'm pretty new to shell programming. I need a little shell script that
starts several copy processes in parallel. It should run on both HP-UX
and Sun Solaris (nice to have: Red Hat Linux).
I have to make a cold backup of an Oracle instance. The datafiles to copy
are on
/disk1/oradata/SID/
/disk2/oradata/SID/
....and so on.
So I would like to have:
cp /disk1/oradata/SID/* /disk1/oradata/SID/hotbackup
cp /disk2/oradata/SID/* /disk2/oradata/SID/hotbackup
.....
These copy commands should run in parallel, at the same time. I have several
systems with 2 to 10 CPUs installed, so I would like the script to ask me
first how many parallel cp processes it should run before it starts copying.
BTW: Do you know how to pipe the command to get tar.gz files? This would
slow down the whole hotbackup procedure, but disk space is scarce on my
systems...
thanks in advance for your answers! Casi
>Dear all
>
>I'm pretty new to shell programming. I need a little shell script that
>starts several copy processes in parallel. It should run on both HP-UX
>and Sun Solaris (nice to have: Red Hat Linux).
>
>I have to make a cold backup of an Oracle instance. The datafiles to copy
>are on
>/disk1/oradata/SID/
>/disk2/oradata/SID/
>....and so on.
>
>So I would like to have:
>
>cp /disk1/oradata/SID/* /disk1/oradata/SID/hotbackup
>cp /disk2/oradata/SID/* /disk2/oradata/SID/hotbackup
>.....
>
>
>These copy commands should run in parallel, at the same time. I have several
>systems with 2 to 10 CPUs installed, so I would like the script to ask me
>first how many parallel cp processes it should run before it starts copying.
You can execute multiple 'cp' operations as background processes by putting
an ampersand '&' at the end of each line. This is documented in the
shell man page (man sh/bash/ksh). I would first try just running them all
in one go and letting the system do the scheduling; only *after* that,
if you think you aren't getting enough out of the system resources,
look for ways to tweak it.
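For illustration, a minimal runnable sketch of the '&' + 'wait' idea;
scratch directories made with mktemp stand in for the real
/diskN/oradata/SID paths, and the .dbf filenames are just placeholders:

```shell
#!/bin/sh
# Demo of backgrounding with '&' and synchronizing with 'wait'.
# Scratch dirs stand in for the /diskN/oradata/SID paths in the question.
SRC1=`mktemp -d`; SRC2=`mktemp -d`
mkdir "$SRC1/hotbackup" "$SRC2/hotbackup"
echo data1 > "$SRC1/file1.dbf"
echo data2 > "$SRC2/file2.dbf"

cp "$SRC1"/*.dbf "$SRC1/hotbackup" &   # first copy, backgrounded
cp "$SRC2"/*.dbf "$SRC2/hotbackup" &   # second copy runs at the same time
wait                                   # block until both cp's have finished
echo "both copies done"
```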
>
>BTW: Do you know how to pipe the command to get tar.gz files? This would
>slow down the whole hotbackup procedure, but disk space is scarce on my
>systems...
Maybe it will speed it up; after all, if you compress the data you will
generally have less to write. The CPUs will get more work to do,
I suppose, but with plain 'cp' they have hardly any work to do anyway -
most of the time is spent idle, waiting for IO to finish, so...
To tar/gzip, see "man tar" and "man gzip". Basic usage:
tar cf - /disk1/oradata/SID/* | gzip > /disk1/oradata/SID/hotbackup/backup.tar.gz
(note the output has to be a file inside the hotbackup directory, not the
directory itself).
Hopefully you have 'gzip' present with Sun Solaris. If not then go to;
:-)
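Combining the two ideas, each directory can be tar'ed and gzip'ed by its
own background job. A sketch, using scratch directories so it runs
anywhere; the `*.dbf` pattern and the `backup.tar.gz` name are just
placeholders:

```shell
#!/bin/sh
# tgz_dir DIR: tar DIR's datafiles and gzip them into DIR/hotbackup.
tgz_dir() {
    ( cd "$1" && tar cf - *.dbf | gzip > hotbackup/backup.tar.gz )
}

# scratch directories standing in for /diskN/oradata/SID
D1=`mktemp -d`; D2=`mktemp -d`
mkdir "$D1/hotbackup" "$D2/hotbackup"
echo x > "$D1/a.dbf"; echo y > "$D2/b.dbf"

tgz_dir "$D1" &    # one tar|gzip pipeline per directory,
tgz_dir "$D2" &    # all running in parallel
wait
```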
>
>thanks in advance for your answers! Casi
>
byefornow
laura
--
alt.fan.madonna |news, interviews, discussion, writings
|chat, exchange merchandise, meet fans....
|Get into the groove baby you've got to... check us out!
thanks for your quick answer - it runs very well.
but it would also be nice to specify how many cp processes run at the same
time, because otherwise I have 12 processes running in parallel and the
amount of CPU time spent in sys is (too) high - so 1 process per CPU would
be the fastest way.
do you have any idea how to implement this?
best regards casi
"laura fairhead" <LoveMrs...@madonnaweb.com> wrote in message
news:3e67c2e0...@NEWS.CIS.DFN.DE...
cp <source1> <dest1> &
cp <source2> <dest2> &
cp <source3> <dest3> &
wait
cp <source4> <dest4> &
cp <source5> <dest5> &
cp <source6> <dest6> &
wait
and so on
will do them 3 at a time.
If you want to generalize it to N, you should be able to write a simple
shell script that uses a loop.
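Following Barry's suggestion, a loop that batches the copies N at a time
might look like this. Plain Bourne sh, so it should work on HP-UX and
Solaris too; the function name and the example paths are made up:

```shell
#!/bin/sh
# parallel_cp: copy each given directory's files into its own
# hotbackup/ subdirectory, at most $MAXJOBS directories at a time.
MAXJOBS=${MAXJOBS:-4}       # e.g. set this to the number of CPUs

parallel_cp() {
    i=0
    for d in "$@"; do
        # cp will complain about the hotbackup subdir itself; harmless here
        cp "$d"/* "$d/hotbackup" &
        i=`expr $i + 1`
        if [ "$i" -ge "$MAXJOBS" ]; then
            wait            # let the current batch finish first
            i=0
        fi
    done
    wait                    # wait for the last, possibly partial, batch
}

# usage (paths are examples):
# parallel_cp /disk1/oradata/SID /disk2/oradata/SID
```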
--
Barry Margolin, barry.m...@level3.com
Genuity Managed Services, Woburn, MA
*** DON'T SEND TECHNICAL QUESTIONS DIRECTLY TO ME, post them to newsgroups.
Please DON'T copy followups to me -- I'll assume it wasn't posted to the group.
>dear laura
>
>thanks for your quick answer - it runs very well.
>
>but it would also be nice to specify how many cp processes run at the same
>time, because otherwise I have 12 processes running in parallel and the
>amount of CPU time spent in sys is (too) high - so 1 process per CPU would
>be the fastest way.
>
>do you have any idea how to implement this?
>
Okay. I've worked out a method to do this for another request (which asked
a very similar thing - the post is right after yours! :) You should have
no trouble modifying it for your own purposes - but if you do, just post
here what the problem is and I'll try to help.
My solution is in that thread, almost right after this post (in
comp.unix.shell):
From: "[new]" <n...@reply.to.group>
Newsgroups: comp.unix.shell
Subject: bash script, wget queue script......
Message-ID: <tQO9a.34961$Rc7.4...@news2.e.nsc.no>
Date: Thu, 6 Mar 2003 22:11:38 +0100
Good luck :-)
bestwishesfrom
In de.comp.os.unix.shell Casi Schmid <casimir...@swisscom.com> wrote:
> I need a little shell script that starts copy processes
> for parallel execution.
If a Perl script would also be okay you might try this:
http://www.h.shuttle.de/mitch/backgrounder.en.html
Regards,
Christian,
f'up2poster
--
....Christian.Garbs.....................................http://www.cgarbs.de
Sechsen, setz!
>> >cp /disk1/oradata/SID/* /disk1/oradata/SID/hotbackup
>> >cp /disk2/oradata/SID/* /disk2/oradata/SID/hotbackup
Assuming that your 'cp' is smart enough not to copy 'hotbackup' back
into itself, a parallel copy is
xargs -P 12 -I{} sh -c 'cp {}' <<EOF
/disk1/oradata/SID/* /disk1/oradata/SID/hotbackup
/disk2/oradata/SID/* /disk2/oradata/SID/hotbackup
...
EOF
where '12' can be adjusted to your taste. The 'sh -c' is needed so the
globs get expanded ('eval' is a shell builtin that xargs cannot run
itself). No trailing whitespace in the input.
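Note that '-P' is an extension (GNU and BSD xargs have it; the stock
HP-UX and older Solaris xargs do not), and xargs can only run real
programs, so a shell is needed to expand the globs. A runnable sketch of
the idea, with scratch directories standing in for the real paths:

```shell
#!/bin/sh
# Scratch source dirs standing in for /diskN/oradata/SID
X1=`mktemp -d`; X2=`mktemp -d`
mkdir "$X1/hotbackup" "$X2/hotbackup"
echo a > "$X1/f1"; echo b > "$X2/f2"

# Each input line is substituted for {} in a fresh 'sh -c', which
# expands the glob and runs cp; -P 2 runs up to two copies at once.
xargs -P 2 -I{} sh -c 'cp {}' <<EOF
$X1/f* $X1/hotbackup
$X2/f* $X2/hotbackup
EOF
```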
--
William Park, Open Geometry Consulting, <openge...@yahoo.ca>
Linux solution for data management and processing.