On Tue, 15 Jan 2019 18:12:08 +0100
José Valim <jose....@plataformatec.com.br> wrote:
> It is best to leave it up to the caller, as doing it concurrently does not
> necessarily make it faster. You need to be confident you have enough
> content to copy to warrant concurrency. In such cases, you can split up
> the source directory by calling File.ls and then File.cp_r on each of the
> directories.
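The split suggested above could be sketched like this: list the top-level entries of the source and copy each subtree in its own task. The module name, option handling, and `max_concurrency` default are assumptions for illustration, not an established API.

```elixir
defmodule ConcurrentCopy do
  # Sketch: copy each top-level entry of `source` concurrently with
  # Task.async_stream. Errors from individual File.cp_r calls are
  # returned in the result list rather than handled here.
  def cp_r(source, dest, opts \\ []) do
    File.mkdir_p!(dest)
    {:ok, entries} = File.ls(source)

    entries
    |> Task.async_stream(
      fn entry -> File.cp_r(Path.join(source, entry), Path.join(dest, entry)) end,
      max_concurrency: Keyword.get(opts, :max_concurrency, System.schedulers_online()),
      timeout: :infinity
    )
    |> Enum.map(fn {:ok, result} -> result end)
  end
end
```

As the quote notes, this only pays off when the top-level entries hold enough content to amortize the task overhead.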
Then a library is the place for a function like this. I am currently working on something.
The things that are missing, or that could be improved, are:
1. cp_r doesn't clean up after itself in case of an error,
2. the lack of functions to move files between different file systems (not easy to implement).
Regarding the first point, I've been playing around with a library that intends to do concurrent copying
and has three on_error strategies: :quit, :continue, and :clean_after.
:quit is the way cp_r/3 behaves now ("quick and dirty"), :continue copies as many files as
possible, and :clean_after leaves everything as in the initial state, so it's "all or nothing".
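A minimal sketch of the three strategies, assuming a fresh destination directory; the module name, return shapes, and the top-level-only traversal are all assumptions, not the real library:

```elixir
defmodule OnErrorCopy do
  # Sketch: copy each top-level entry, collecting errors.
  # :quit        - stop at the first error (like File.cp_r/3 today)
  # :continue    - copy as many files as possible, then report errors
  # :clean_after - on any error, remove the destination ("all or nothing")
  def cp_r(source, dest, on_error \\ :quit) do
    File.mkdir_p!(dest)
    {:ok, entries} = File.ls(source)

    errors =
      Enum.reduce_while(entries, [], fn entry, acc ->
        case File.cp_r(Path.join(source, entry), Path.join(dest, entry)) do
          {:ok, _copied} -> {:cont, acc}
          {:error, reason, file} when on_error == :quit -> {:halt, [{reason, file} | acc]}
          {:error, reason, file} -> {:cont, [{reason, file} | acc]}
        end
      end)

    case {errors, on_error} do
      {[], _} ->
        :ok

      {errors, :clean_after} ->
        # Roll back to the initial state; assumes `dest` was created fresh.
        File.rm_rf!(dest)
        {:error, errors}

      {errors, _} ->
        {:error, errors}
    end
  end
end
```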
Having an on_error strategy can save a lot of time if the user only cares about getting all files
copied (:clean_after): an initial walk through source and destination could be
done to check for file permissions (which I guess is a common source of errors), and if any error is
found, no file is copied at all.
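The pre-copy walk could look roughly like this: recurse through the source tree and reject the copy up front if any file is unreadable. The module and function names are hypothetical, and a real check would also cover destination writability and unreadable directories.

```elixir
defmodule PreflightCheck do
  # Sketch: walk the source tree and report files we cannot read,
  # so the copy can be refused before touching the destination.
  def check_readable(source) do
    source
    |> walk()
    |> Enum.reject(fn path ->
      case File.stat(path) do
        {:ok, %File.Stat{access: access}} when access in [:read, :read_write] -> true
        _ -> false
      end
    end)
    |> case do
      [] -> :ok
      unreadable -> {:error, {:eacces, unreadable}}
    end
  end

  defp walk(path) do
    if File.dir?(path) do
      {:ok, entries} = File.ls(path)
      Enum.flat_map(entries, &walk(Path.join(path, &1)))
    else
      [path]
    end
  end
end
```

Note the walk is inherently racy: permissions can change between the check and the copy, so :clean_after would still need its rollback path.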
Let me know if an addition like this to File.cp_r/3 would be welcomed by the core team.