I use the filesystem capabilities (and no SQL). The table holds just the file name and the subdirectory. Be careful: IIRC a single directory under NTFS handles only about 17,000 files well, not to mention how slow access to such a crowded directory gets. So I create a subdirectory for every day, named after the date (DToS( Date() )), which puts the limit at 17,000 documents per day. The file itself is opened by the OS with whatever PDF reader you have as the default viewer. Even in a networked environment the technique shows acceptable access times; in other words, the program is quite efficient.
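
Roughly, the store step looks like this (a minimal sketch, not my actual code; docs.dbf and its SUBDIR/FNAME fields are illustrative names, and wapi_ShellExecute() needs the hbwin contrib on Windows):

   // Store one document in today's subdirectory and record it in the table
   PROCEDURE StoreDoc( cSrcFile )
      LOCAL cSubDir := DToS( Date() )              // e.g. "20240131"
      LOCAL cName   := hb_FNameNameExt( cSrcFile ) // name + extension only
      LOCAL cDest   := cSubDir + hb_ps() + cName

      IF ! hb_DirExists( cSubDir )   // one directory per day
         hb_DirCreate( cSubDir )
      ENDIF

      IF hb_FCopy( cSrcFile, cDest ) == 0   // 0 == success
         USE docs SHARED                    // table keeps name + subdir only
         APPEND BLANK
         REPLACE SUBDIR WITH cSubDir, FNAME WITH cName
         dbCloseArea()
      ENDIF

      // Viewing: hand the file to the OS default viewer (Windows/hbwin)
      wapi_ShellExecute( , "open", cDest )

      RETURN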
I'm wary of BLOBs and similar solutions. With this technique, if
the database goes nuts, you still have all your original files
intact, and you can reconstruct the database starting from the
documents dir/subdirs (a mess, but feasible).
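
A rough sketch of such a reconstruction, using the same illustrative field names as above (and assuming all documents are PDFs; hb_DirScan() returns Directory()-style entries with paths relative to the root):

   #include "directry.ch"   // F_NAME

   // Rebuild docs.dbf by walking the per-day subdirectories
   PROCEDURE RebuildIndex( cRoot )
      LOCAL aFile

      dbCreate( "docs", { { "SUBDIR", "C", 8, 0 }, { "FNAME", "C", 80, 0 } } )
      USE docs EXCLUSIVE

      // relative names start with the 8-char date subdirectory
      FOR EACH aFile IN hb_DirScan( cRoot, "*.pdf" )
         APPEND BLANK
         REPLACE SUBDIR WITH Left( aFile[ F_NAME ], 8 ), ;
                 FNAME  WITH SubStr( aFile[ F_NAME ], 10 )
      NEXT

      dbCloseArea()

      RETURN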
Dan