On 09/08/2013 11:07 PM, benj wrote:
> On Fri, 09 Aug 2013 16:21:21 -0700, Don Kelly wrote:
>
>> On 08/08/2013 12:15 AM, benj wrote:
--------snip of past for brevity---------------
>
> So right now I've got a server with nine 2-terabyte drives, some in
> RAID configuration, plus a second server with a terabyte of important
> things, plus a drawer of thumb drives as redundancy, plus stacks
> of DVDs that are slowly sinking into the sunset, and then I've been
> putting important things on Blu-ray nitride discs, for which I have high
> hopes. Pretty close to carved in stone!
>
> Long term safe storage of lots of data is not as simple as you think. I
> really think that there is a possibility of nanomagnetic core solving
> some of this hassle.
>
> Unlike you I don't go back to '61. In those days I was really anti-
> computer and while everyone else was gaga and signing up to run the
> (tube) Univac, I turned my nose up at the whole idea. Later, my first
> efforts were in SCATRAN, a private version of Fortran that is now totally
> a dead language. So tell me, how am I ever going to get back the space in
> my brain that I used up on the SCATRAN manual? Or, for that matter, how
> do I get back the space used up by the DOS manual?
>
I do not have your storage needs and intend to keep it that way.
I have some photos and other data duplicated on a hard drive, flash
drives, and DVDs (tax stuff is also stored on paper). I recognize the
need to refresh these stores, but my world won't fall apart if I lose
them.
I did not imply that long-term storage of lots of data was simple. What
I intended to say was that more effort should perhaps be made to
improve long-term storage. Nanomagnetic storage is a possibility, but
should it be "core" if there are other approaches that can give equal or
better storage densities with equal or better retention? And note that
as sizes shrink, the chance grows of a random wandering photon
or cosmic particle buggering things up.
Not my area; I am used to watts, volts, and amperes to positive powers.
MAD was related to Fortran but more powerful than the Fortrans available
at the time. Of course, back then, the priests in the air-conditioned
computer room fed cards into the machine, and later in the day a
printout of results was available, usually with an error message
"This is MAD" along with an ASCII picture of Alfred E. Neuman.
I do remember MAD (Michigan Algorithm Decoder), which was better than
the Fortran of its day and introduced concepts later taken up by
Fortran (various versions), Pascal, and C++ (ugh). Turbo Basic
was good for its time (Fortran on steroids), as it was user-friendly.
APL was and is a dream, allowing concentration on the problem rather
than having to make programming decisions that the idiot box could make
for itself, but it is an interpreter. Presently I am dealing spasmodically
with J, which is APL on steroids, as an interesting exercise, and I am
still learning; spasmodic effort and short-term memory problems don't fit
together very well.
However, when one can type (+/ % #) y to get the arithmetic mean of a
list of y values (and can assign this operation to a name), or %. X to
get the inverse of a matrix X (real or complex), the actual operation
runs at the machine-language level, or at a (bloated versus assembly)
compiled-C++ level. My objective in programming is getting the desired
job done, and I do not want piddly details (even though they are
necessary) to get in the way of this.
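For anyone without a J interpreter handy, here is a rough Python sketch of what those two J expressions do. The function names and the 2-by-2 example matrix are my own illustration, not anything from J itself:

```python
# Rough Python analogue of two J expressions mentioned above.
# In J, (+/ % #) y reads "sum divided by count": the arithmetic mean.
def mean(y):
    """Arithmetic mean of a list, like J's (+/ % #) y."""
    return sum(y) / len(y)

# In J, monadic %. X is the matrix inverse. Here is just the 2-by-2
# special case, done by hand via the determinant formula, for flavor.
def inverse_2x2(m):
    """Inverse of [[a, b], [c, d]], like J's %. for a 2x2 matrix."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det],
            [-c / det, a / det]]

print(mean([1, 2, 3, 4]))              # 2.5
print(inverse_2x2([[4, 7], [2, 6]]))   # [[0.6, -0.7], [-0.2, 0.4]]
```

The point of the comparison stands: in J both operations are single primitive verbs, with the looping and numeric details handled below the level the programmer sees.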
Naturally APL and J are not beloved of computer scientists in general.