Hi, my wife is a keen amateur photographer (although I am getting into it from a nerd/tech perspective) and she has only just contemplated shooting in DNG rather than JPEG. Currently she uses iCloud to keep her pics, and I am about to download Lightroom. She takes a lot of pics for an amateur (on her Q2): recently about 2,200 over a few days in Venice. Clearly shooting in DNG is going to present storage problems of its own. We also have a desktop Mac, an iPad and an iPhone, and quite slow wifi (despite the advertised speeds). In this scenario, how would members proceed? We have been struggling with data transfer speeds. How do people normally shoot? Straight to memory card, and then memory card into the computer? Sorry for the multi-faceted beginner's question! Thanks, Graham
As @jaapv says, do it old school and download to your own external drive; it will outlast the vagaries of future computers or any glitches in uploading to the cloud. Upload to the cloud as a backup, but even then your genuine and important backups should be to another hard disk run in tandem. I'm not a fan of massive multi-TB external drives, but I don't think their failure risk is as serious as the risk of needing to work on your photos when your internet goes down and everything is locked away in iCloud.
I dig into the archive occasionally to make sure the older drives are OK; I've only ever had one external drive go down. The main caveat about archiving this way is to be disciplined about backing up to both drives regularly.
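One low-effort way to make those occasional archive checks systematic is a checksum manifest: build it once when you write the archive, save it alongside the drive, and re-run verification whenever you plug the drive in. A minimal Python sketch (the function names and workflow here are just an illustration, not anything a poster above actually uses):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large DNGs don't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(root: Path) -> dict:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    return {
        str(p.relative_to(root)): sha256_of(p)
        for p in sorted(root.rglob("*")) if p.is_file()
    }

def verify(root: Path, manifest: dict) -> list:
    """Return the relative paths that are missing or have changed."""
    bad = []
    for rel, digest in manifest.items():
        p = root / rel
        if not p.is_file() or sha256_of(p) != digest:
            bad.append(rel)
    return bad
```

Run `build_manifest` once after archiving, dump the result to a JSON file, and a periodic `verify` will flag any file that has silently rotted or gone missing, rather than relying on spot checks.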
I can't comment on how people normally shoot. But to me, 2K pictures over a few days is something I was doing at an early stage of using digital. Later, I grew into Leica photography with a film M. I observe, wait and take. No retakes.
No reason why you should have a problem with any size of drive. I prefer smaller-capacity drives, mainly because I keep my paid work separate from my personal work. I also archive different types of work on separate pairs of drives, ie weddings, portraits, events etc.
Basic minimum setup is a box with two drives arranged in RAID 1. This means all your data are mirrored across two drives, so that if one drive fails you can continue working with the other. All your devices can connect to your NAS through your home network, as well as remotely. Your main computer and NAS need to be on the wired network, otherwise it will probably be too slow. You still need to make at least two backups of your NAS for data safety; the NAS box has USB ports for connecting external drives.
But what happens if the failure is not in one of the drives, but in the NAS box itself? Will a replacement NAS box, built to a NAS standard that has moved along in the meantime, and perhaps made by a different manufacturer, recognise your existing drives and recreate the array?
Drive system failure, fire, theft, and the drive system manufacturer going out of business (eg Drobo) all have the same impact. So as Jeff said in post #6, you need a duplicate storage system, one copy of which is kept off site.
My QNAP box eventually failed. I bought a new one (a different model), slotted the old drives in, and it all worked perfectly. I had all the data backed up on separate external hard drives, though, so I was covered either way.
If the OP decides to invest in a NAS, remember that RAID is designed primarily for business continuity, not backup. I also have mine set up as RAID 1, which means the second drive mirrors the first, which in turn means that if you delete something, it is deleted off both drives.
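The caveat above is easy to picture in a few lines: in RAID 1, every write lands on both drives, and so does every delete. This is a toy Python illustration of that behaviour only (a real RAID controller mirrors at the block level, not by copying files):

```python
from pathlib import Path

def mirrored_write(filename: str, data: bytes, drives: list) -> None:
    """RAID 1 in miniature: every write goes to every drive."""
    for drive in drives:
        (drive / filename).write_bytes(data)

def mirrored_delete(filename: str, drives: list) -> None:
    """...and so does every delete, which is why RAID is not a backup."""
    for drive in drives:
        (drive / filename).unlink(missing_ok=True)
```

The mirror protects you against one drive dying mid-edit; it does nothing against an accidental delete, which is why the separate backup drives mentioned in earlier posts are still needed.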
To the OP: like others here, I import using an SD card reader with the images going straight onto an external hard drive, which is then backed up several times. The Library, as it is called in Capture One (ie the interface through which you view, modify and organise your images), is held on the computer and backed up online to my Dropbox account.
My image store is now on a 10TB internal HDD in the desktop PC, backed up to an external RAID 1 drive, which is in turn backed up to the cloud (Backblaze). My Lightroom catalogue is on an internal M.2 drive for speed.
Minor disagreement. I prefer to save the instructions needed to re-create the edited file on demand. That might be in a Lightroom catalog, written to the DNG file (possible with Lightroom, not Capture One), or in Capture One's .cos and .comask files. In my case the odds are that I'll tweak the edit before re-creating an image anyway. That's not to say I don't have some edited copies; if I prepare an image for a web page the copy will live on that web page, but it is unlikely to be at full resolution.
I very much agree, and came to that conclusion the hard way (twice!). There is one constant in software: everything changes, very fast. Your DNG should be seen as a film negative. To see the result you need some extra steps, and those steps are performed by your PP software, with a TIFF or JPEG as the final result. What if your future PC or Mac refuses to run your current PP software? All you can do is try to import your project files into the future version... And what if there is no upgrade available, and no migration possible without loss of information?

I found out that all the edits I had made in Apple Aperture were completely lost once I tried to move to Adobe Lightroom. All it could import from my catalogs were keywords and ratings (and not even those in full). This is because every PP package uses a different engine behind each slider: Exposure +1 does not mean the same thing in LR, C1 or Aperture, and Clarity +1 cannot even be translated to some of the PP packages out there...

And the 'future' is sometimes closer than you think. Lots of things happen in 5 years, 10 years is a lot, and 20 years is an eternity.
So it is very important to preserve the time spent editing your DNGs by exporting them as full-resolution JPEGs or TIFFs and keeping a backup of those in a simple, logical file structure. Keep this alongside your original DNGs and you can decide later to use your current result, or redo today's work with more advanced PP software.
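A "simple, logical file structure" can be as plain as year/month folders. A hypothetical Python sketch that files exported JPEGs and TIFFs that way, using each file's modification time as the date (a real workflow would more likely read the EXIF capture date; the paths and extensions are assumptions for illustration):

```python
import shutil
from datetime import datetime
from pathlib import Path

def file_by_date(src: Path, dest: Path) -> list:
    """Move exported images from src into dest/YYYY/MM/ folders."""
    moved = []
    for p in sorted(src.iterdir()):
        # Only touch the exported formats; leave DNGs etc. alone.
        if p.suffix.lower() not in {".jpg", ".jpeg", ".tif", ".tiff"}:
            continue
        taken = datetime.fromtimestamp(p.stat().st_mtime)
        target_dir = dest / f"{taken:%Y}" / f"{taken:%m}"
        target_dir.mkdir(parents=True, exist_ok=True)
        target = target_dir / p.name
        shutil.move(str(p), str(target))
        moved.append(target)
    return moved
```

The point of a structure this dull is exactly the one made above: any file browser on any future system can navigate year/month folders of JPEGs, with no catalog software required.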
As said above, this works for a few years, but not over decades, because even upgrading within one brand will become impossible without loss over time. All the software we now use will become incompatible one day. Even reading current hardware, and even JPEG or TIFF files, will become hard one day. If you do not believe me, try to read a 5 1/4" floppy disk on any computer now, or try to open a WordPerfect 1.0 file or a Lotus Notes file from the 80s.
Exactly. Any backup system that depends on anything more than a basic file standard like JPEG or TIFF, plus a common file system, is doomed to fail through incompatibility rather soon. A tidy backup of organised JPEGs is the best bet for the future.
@Graham H My long-term backup strategy is to keep several copies (at least two of everything important). Each year I use an old hard drive to copy all of it, and give it to a friend or family member to keep safe as offsite storage. As disks grow larger, I consolidate the smaller disks onto a larger one. This way I can go back to about 1990 in files, some of which might now be very hard to read because the original software is no longer around. The original hardware would be impossible to read now, because hardware to do that would be extremely hard to find, and the disk itself would probably not work anymore anyway. But because everything is always copied over to the next computer, the files are intact.
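The yearly "copy everything over, then check it" step described above can be sketched in a few lines of Python. This is an illustration under assumed paths, not anyone's actual script; it mirrors one tree into another and then does a byte-by-byte comparison so the offsite drive is known-good before it leaves the house:

```python
import filecmp
import shutil
from pathlib import Path

def yearly_copy(old_drive: Path, new_drive: Path) -> list:
    """Mirror old_drive into new_drive, then report any mismatches.

    Returns the relative paths that differ or are missing after the
    copy, so an empty list means the copy verified cleanly.
    """
    shutil.copytree(old_drive, new_drive, dirs_exist_ok=True)
    problems = []
    for src in old_drive.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(old_drive)
        dst = new_drive / rel
        # shallow=False forces a byte-by-byte comparison
        if not dst.is_file() or not filecmp.cmp(src, dst, shallow=False):
            problems.append(str(rel))
    return problems
```

Verifying immediately after the copy matters: a drive that is only written, never read back, can hide errors until the year you actually need it.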
Most of my data is copied to the cloud, but my photos are only stored there as reduced-size JPEGs, not as DNGs. Offline storage is only used as a worst-case backup scenario.
The price of storage and the size of disks have improved faster than my need for data has grown over the years. In 1990 my computer had only 100MB of disk storage, and all my important files took up about 10MB of it. Now I can store everything, including my DNGs, on one 2TB disk (50K+ images). I have about 10TB in use now (more if you count every disk in the house), most of it redundant or for backups; my largest disk is 5TB. So I can still keep this up for the foreseeable future. Having a yearly copy of everything, kept off site, is almost completely safe. Over more than 30 years of computer usage (I started in 1987), I can say the hardest issues were human error and software incompatibility rather than hardware failures.