Re: [cloudlab-users] Large file (>100GB) transfer between machines in the same experiment

Mike Hibler

Jan 24, 2022, 12:13:57 AM1/24/22
to cloudla...@googlegroups.com
If you just want it on every machine in the experiment, you can use "scp",
"rsync", whatever from one machine to the others. As long as you have an
experimental LAN with all the machines in it, you can copy over that network
using the non-FQDN names from the /etc/hosts file (i.e., the ones that have
10.x.x.x addresses). If you want to share a single copy of the pcap file
instead, set up an NFS server on one of your nodes and use NFS across the
experiment fabric.
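As a minimal sketch of the copy-to-every-node approach: assuming the short
/etc/hosts names on the experimental LAN are "node1" and "node2" and the file
is /mydata/capture.pcap (all hypothetical names, adjust to your topology), a
push from the node holding the file might look like:

```shell
#!/bin/sh
# Push the pcap to each peer's local disk over the experimental LAN.
# "node1"/"node2" are the non-FQDN names (10.x.x.x) from /etc/hosts;
# both the hostnames and the path are placeholders for this example.
for host in node1 node2; do
    rsync -av --progress /mydata/capture.pcap "${host}:/mydata/"
done
```

rsync is usually preferable to scp here because an interrupted transfer of a
100GB+ file can be resumed (add --partial) instead of restarted from scratch.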

On Sun, Jan 23, 2022 at 04:41:43PM -0800, Yucheng Yin wrote:
> Hi,
>
> I have a very large pcap file (>100GB) in one machine's local disk (/mydata).
>
> Since we will perform I/O intensive tasks on the pcap, we prefer not to move it
> to NFS to overload/crash NFS.
>
> Is there a way to "share"/"download" this file for other machines in the same
> experiment and also put the file in each machine's local disk? (maybe utilize
> the ultra high-speed switch there). All of the machines in the experiment are
> inter-connected (we are using a star topology).
>
> Thanks,
> Yucheng
>
> --
> You received this message because you are subscribed to the Google Groups
> "cloudlab-users" group.
> To unsubscribe from this group and stop receiving emails from it, send an email
> to cloudlab-user...@googlegroups.com.
> To view this discussion on the web visit https://groups.google.com/d/msgid/
> cloudlab-users/bdde809b-05aa-42c9-85a8-9fcaadb41395n%40googlegroups.com.
