Based on our initial research and criteria, for each round of testing we settle on a set of external desktop and portable hard drives to evaluate. We first test them with the benchmarking program HD Tune. For a more real-world measurement, we then time the transfer of a 15 GB folder containing a Blu-ray movie and a 31 GB folder of music. We perform each test six times and average the read and write speeds to rule out performance hiccups.
The Seagate Expansion drive performed fine in our large-file transfer tests but had the slowest speeds of any desktop drive we tested in our small-file transfer tests. It also disconnected itself from our PC without warning in the midst of testing and failed to connect again afterward.
The Toshiba Canvio Flex was a pick in a previous version of this guide. The Canvio Flex and Canvio Advance Plus come with USB-A and USB-C cables, but both drives still have a Micro-B connector on the body. The more durable USB-C port and the higher capacity of our new pick, the Western Digital My Passport Ultra, make it a more attractive option.
Start by understanding what you need, then choose among the available backup options by weighing the pros and cons of each. Which options make sense depends on where the data currently lives, which in this case is an external hard drive.
An external hard drive keeps your data in a separate storage unit that is not permanently attached to your computer, which makes it a form of local backup. The trade-off is that, as a physical device, it remains vulnerable to damage.
The automated approach uses backup software to handle the process for you. One option here is AOMEI Backupper, a backup application with a free edition that can sync all of the contents of one external hard drive to another. The software can take full, incremental, and differential backups of your files and folders.
Copying everything from an external hard drive to a computer is not always a backup, either; it may simply be a way of transferring data to the computer.
A cloud storage service gives you a certain amount of space for online backup: you upload data from your device to the cloud, which lets you manage a copy offsite without fear of local data loss.
Cloud storage removes the risk of losing your data to physical damage. It also has other advantages for backup: you can access the data from any device, and you avoid the cost of maintaining a physical system. These advantages make cloud storage the best option for your backups.
To back up data from your external hard drive to the cloud, the easiest route is through a computer. An external hard drive cannot upload to the cloud by itself, so the data must first be copied to a computer and then moved to the cloud. Once the data is in the cloud, however, you no longer need to worry about losing that copy.
Cloud storage services such as Google Drive, Dropbox, and OneDrive are popular options, but they usually provide limited storage space and require manual uploading. Users who want to back up large amounts of data with an automated process should consider cloud backup services instead.
Unlike cloud storage, cloud backup services are dedicated to backups, so they are built to meet all forms of backup needs, and most provide unlimited storage space. For users with multiple external hard drives, cloud backup services such as Backup Everything, Amazon Web Services, or Microsoft Azure can help.
Of all the alternatives presented, backing up data from an external hard drive to the cloud is the best option: the chances of losing your backup files are extremely low, and you can access the backed-up data from any device.
The methods above cover the primary use cases, but others exist: you can also choose network-attached storage (NAS) or an external USB flash drive. Both come with disadvantages, but for some requirements they can be a good alternative.
The limited capacity of USB flash drives can be a problem if you want to back up large amounts of data. If the files you wish to back up are selective and need little space, however, USB drives cost less and are more portable.
Alternatively, users can opt for network-attached storage (NAS). Connected over Ethernet, a NAS lets you share data without an internet connection, and the data stored on it can be accessed by multiple people. Backing up an external hard drive to a NAS keeps the chance of data loss low and lets you manage your data away from the source machine, though the high cost and complex installation can be an issue.
In the end, your backup plan matters most. The options presented here all provide a viable home for backups, but given the vulnerable nature of physical storage, cloud services look like the best choice thanks to their data security and cost-effectiveness.
We pride ourselves on having a cloud backup solution for everyone, because every business has different requirements. Whether you want to back up servers, virtual machines, or Microsoft 365, we will have something for you. We are not aligned with any vendor or product, only with the best fit for your backup and disaster recovery needs. Contact us anytime for a transparent chat about what we offer and what is on the market today; I am sure we will be able to help you.
From my point of view, I would simply add another path on the PC and then configure the NAS to send-only to that path (the external drive). And as long as the USB configuration stays the same, the path should not change when you plug the device in again a few weeks later.
You are better off using a backup program instead, for example restic. There are many benefits: snapshots that let you go back in time, protection against ransomware under the right circumstances, and more.
Just a quick comment: if you do decide to go this route, I would not rely on the path alone but rather hardcode the device/volume ID in the path. This will differ depending on the OS; e.g., on Windows, instead of a letter path like D:\Folder you would use \\?\Volume{1b3b1146-4076-11e1-84aa-806e6f6e6963}\Folder, etc.
I'm assuming that none of that stuff is necessary as long as I'm just rsyncing from a computer to a FireWire-connected external drive. Am I wrong in assuming that? Are things really going to be more complicated than that innocuous command?
Rsync works fine across local drives. However, if it detects local paths it automatically switches to --whole-file mode, which does not transfer diffs but simply copies each source file over the destination file. Rsync will still skip files that haven't changed at all, though. When bandwidth between the source and destination is high (as with two local disks), this is much faster than reading both files and copying just the changed bits.
However, if one or both drives happen to be NTFS-formatted and are accessed from *nix, or even from within Windows via MobaXterm/Cygwin, then rsync's incremental functionality won't work well with rsync -a (the archive flag).
One thing you might want to consider when using rsync, however, is the --link-dest option. It lets you keep multiple backups but use hard links for any unchanged files, so every backup effectively takes only the space of an incremental. An example use would be:
Your command as written should work. However, you might want to look at a program called rsnapshot, which is built on top of rsync and keeps multiple versions of files so you can go back and see things as they were last week or last month. The configuration is pretty easy, and it is really good at space optimization, so unless you have a lot of churn it doesn't take up much more space than a single backup.
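For a rough idea, the relevant parts of rsnapshot.conf look like this. The paths and retention counts are illustrative, and note that rsnapshot requires real tab characters between fields:

```
config_version	1.2
snapshot_root	/mnt/external/snapshots/
retain	daily	7
retain	weekly	4
backup	/home/user/	localhost/
```

Running `rsnapshot daily` (from cron or by hand) then keeps seven daily and four weekly snapshots, with unchanged files hard-linked between them.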
I have finally ended up with backup2l, a "low-maintenance backup/restore tool", and it's easy. I like the way it manages scheduling and rotation (in levels). I run it from the command line whenever I have my USB external drive attached, but you can also automate it.
Try dirvish to do the backup.
It uses rsync's hard links in so-called vaults. You can keep as many of your older dumps as the USB disk can hold, or set it up in an automated way.
Once you understand the idea behind dirvish, it is more convenient to use than rsync itself with all its options.
I do not use rsync with local drives, but rsync is wonderful for sync, cloning, backup, and restore of data between networked Linux systems. It is a fantastic network-enabled Linux tool worth spending time learning. Learn how to use rsync with hard links (--link-dest=) and life will be good.
In the name of speed, rsync in my experience automatically changes many operational parameters when it believes it has detected two local drives, and this can make matters worse. What counts as "local" from rsync's perspective is sometimes not really local at all: for example, rsync sees a mounted SMB network share as a local drive. One can argue, correctly, that from the program's point of view all of those drives are "local", but that misses the point.
The point is that scripts that behave as expected between a local and a remote drive do not behave the same when rsync sees both data paths as local. Many rsync options seem to change or stop working as expected with all-local paths, and file updates can slow to a crawl when one of the "local" drives is actually a networked SMB share or a large, slower USB drive.