
Holographic/Quantum Neural Technology (HNeT) !?


news
Nov 1, 1999
I was surfing the web for neural network information
and came across the And Corporation website.
They have a technology called Holographic/Quantum Neural Technology (HNeT).
About which they claimed --

"It is now realized and accepted that a single holographic/quantum neural
neuron cell is capable of learning stimulus-response patterns or "memories"
orders of magnitude faster, and far more accurately than the traditional
back-propagation or genetically based neural networks."

My questions are --

1) Is this a technological breakthrough recognized by the neural network
community, or is this a marketing breakthrough that repackages
the same old sh*t in cool new terminology?

2) Is there any commercial technology out there that is clearly superior
to others?

I'm a comp sci student, with some basic knowledge of neural networks.
Any input would be greatly appreciated.


Peter Davies
Nov 1, 1999

news wrote in message ...

>
>"It is now realized and accepted that a single holographic/quantum neural
>neuron cell is capable of learning stimulus-response patterns or "memories"
>orders of magnitude faster, and far more accurately than the traditional
>back-propagation or genetically based neural networks."
>
>My questions are --
>
>1) Is this a technological breakthrough recognized by the neural network
>community, or is this a marketing breakthrough that repackages
>the same old sh*t in cool new terminology?
>

It is the latter. The community has not "... realized and accepted that a
single holographic/quantum neural neuron cell ....". This is simply
marketing hype.

Peter Davies


bj flanagan
Nov 1, 1999
"Peter Davies" wrote:

> >"It is now realized and accepted that a single holographic/quantum neural
> >neuron cell

> >1) Is this a technological breakthrough recognized by the neural network
> >community, or is this a marketing breakthrough that repackages
> >the same old sh*t in cool new terminology?
> >
>
> It is the latter. The community has not "... realized and accepted that a
> single holographic/quantum neural
> neuron cell ....".


bj: Try to keep in mind that the academic community is traditionally a
generation or so behind the real innovators.

--
Quanta & Consciousness

http://www.freeyellow.com/members/sentek/index.html



Peter Davies
Nov 1, 1999

bj flanagan wrote in message <7vkm01$d1h$1...@nnrp1.deja.com>...

> "Peter Davies" wrote:
>> It is the latter. The community has not "... realized and accepted that a
>> single holographic/quantum neural neuron cell ....".
>
>bj: Try to keep in mind that the academic community is traditionally a
>generation or so behind the real innovators.

It is SOMETIMES true that real innovators are not from the academic
community and are spurned by it. However, it does not follow that EVERY
idea from outside it is automatically a breakthrough.

Peter Davies


E. Weddington
Nov 1, 1999
There is a book that describes in detail the Holographic Neural Network,
including the mathematics involved. Very interesting stuff. I've been wondering
myself why other people haven't taken a look at it.

I don't have the info about the book here at my work. I've got it at home. I'll
post the info about it tomorrow.

Eric Weddington
umg...@attglobal.net


Will Dwinnell
Nov 1, 1999
news wrote:
"It is now realized and accepted that a single holographic/quantum
neural neuron cell...

1) Is this a technological breakthrough recognized by the neural network
community, or is this a marketing breakthrough that repackages the same
old sh*t in cool new terminology?"

"Peter Davies" wrote:


"It is the latter. The community has not "... realized and accepted
that a single holographic/quantum neural neuron cell ....".

bj flanagan wrote:
"Try to keep in mind that the academic community is traditionally a
generation or so behind the real innovators."

We may restate this argument as: "You don't agree with me, therefore
you're wrong." Besides, some of us who are skeptical of holographic
neural networks are not academics.

Will Dwinnell
pred...@compuserve.com

bj flanagan
Nov 2, 1999

"Peter Davies" wrote:

> > bj: Try to keep in mind that the academic community is traditionally a
> > generation or so behind the real innovators.
>
> It is SOMETIMES true that real innovators are not from the academic
> community and are spurned by it. However, it does not follow that EVERY
> idea from outside it is automatically a breakthrough.

bj: To be sure. 'twere mere drollery.

E. Weddington
Nov 2, 1999

news wrote:

> I was surfing the web for neural network information
> and came across the And Corporation website.
> They have a technology called Holographic/Quantum Neural Technology (HNeT).
> About which they claimed --
>

> "It is now realized and accepted that a single holographic/quantum neural

> neuron cell is capable of learning stimulus-response patterns or "memories"
> orders of magnitude faster, and far more accurately than the traditional
> back-propagation or genetically based neural networks."
>
> My questions are --
>

> 1) Is this a technological breakthrough recognized by the neural network
> community, or is this a marketing breakthrough that repackages
> the same old sh*t in cool new terminology?
>

> 2) Is there any commercial technology out there that is clearly superior
> to others?
>
> I'm a comp sci student, with some basic knowledge of neural networks.
> Any input would be greatly appreciated.

If you want details of HNeT, including theory of operation and the
mathematics involved, check out this book:

Fuzzy, Holographic, and Parallel Intelligence
Branko Soucek, Editor
Copyright 1992
Published by John Wiley & Sons
ISBN #0-471-54772-7

The book is one in a series, with each book being a compilation of articles.
This book contains an article "The Holographic Neural Method" written by John
G. Sutherland of the AND Corporation, Ontario, Canada, going from page 7 to
page 92.

Read it and judge for yourself. It seems like there is enough information there
to construct some things yourself (if one had time). I personally haven't
worked out the math to see if it is any better.

I would be interested to hear from anyone who has WORKED with HNeT, or READ the
above book and done a critical analysis, and to learn what, if any, specific
critiques they might have.

Good luck.

Eric Weddington
umg...@attglobal.net

bj flanagan
Nov 3, 1999

"E. Weddington" wrote:

> If you want details of HNeT including theory of operation, mathematics
> involved, and more check out this book:
>
> Fuzzy, Holographic, and Parallel Intelligence
> Branko Soucek, Editor
> Copyright 1992
> Published by John Wiley & Sons
> ISBN #0-471-54772-7

bj: Great. Thanks.

bj flanagan
Nov 3, 1999
Will Dwinnell wrote:


> bj flanagan wrote:
> "Try to keep in mind that the academic community is traditionally a
> generation or so behind the real innovators."
>

> We may restate this argument as: "You don't agree with me, therefore
> you're wrong." Besides, some of us who are skeptical of holographic
> neural networks are not academics.


[This is a curious one.

The force is strong with this one.

We will instruct him.]


bj: But that would be an inductive fallacy. I.e., just because people
who disagree with me are usually mistaken does not therefore imply that
all who disagree with me are mistaken. This is a kind of corollary to
the ancient maxim which tells us: You can't be wrong all the time.

JoshCahoon
Nov 3, 1999
Death is like being a rock.

Ian
Nov 3, 1999
"news" <flq...@hotmail.com> wrote:

>I was surfing the web for neural network information
>and came across the And Corporation website.
>They have a technology called Holographic/Quantum Neural Technology (HNeT).
>About which they claimed --
>
>"It is now realized and accepted that a single holographic/quantum neural
>neuron cell is capable of learning stimulus-response patterns or "memories"
>orders of magnitude faster, and far more accurately than the traditional
>back-propagation or genetically based neural networks."
>
>My questions are --
>
>1) Is this a technological breakthrough recognized by the neural network
>community, or is this a marketing breakthrough that repackages
>the same old sh*t in cool new terminology?

The latter. In fact, it's particularly old and weak sh*t (sort of like
perceptrons, AKA statistical linear regression in a funny hat), with
particularly bullshit marketing.
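Ian's "linear regression in a funny hat" quip can be made concrete. A minimal sketch (data made up for illustration) showing that a single linear neuron trained by the delta rule lands on the same fit that ordinary least squares gives in closed form:

```python
# Hypothetical data with an exact linear relationship y = 2x + 1
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]

# Closed-form least-squares fit for slope and intercept
n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

# The same fit reached by a one-neuron linear "perceptron" (delta rule)
w, b, lr = 0.0, 0.0, 0.05
for _ in range(5000):
    for x, y in zip(xs, ys):
        err = (w * x + b) - y
        w -= lr * err * x
        b -= lr * err

print(slope, intercept)        # 2.0 1.0
print(abs(w - slope) < 1e-6)   # True: both routes give the same line
```

On this data both procedures recover y = 2x + 1; the "funny hat" point is that a single linear cell adds nothing a regression would not.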

I particularly like the reference to them using Quantum Neurodynamics on
Pentium PCs. Quantum Neurodynamics is the idea that the brain uses quantum
processes that couldn't possibly be replicated on a computer such as a PC.
That has to qualify as one of the ultimate pieces of bullshit marketing of
all time.

>2) Is there any commercial technology out there that is clearly superior
>to others?
>
>I'm a comp sci student, with some basic knowledge of neural networks.
>Any input would be greatly appreciated.

For a student's purposes, you might want to check out Matlab. It has a
neural network toolbox and high-level capabilities that make playing around
with neural nets relatively painless.


Ian
Nov 3, 1999
"Peter Davies" <pet...@NOSPAMSPAMSPAMSPAMinterlog.com> wrote:

>bj flanagan wrote in message <7vkm01$d1h$1...@nnrp1.deja.com>...
>
>> "Peter Davies" wrote:
>>>
>>> It is the latter. The community has not "... realized and accepted that a
>>> single holographic/quantum neural neuron cell ....".
>>
>>bj: Try to keep in mind that the academic community is traditionally a
>>generation or so behind the real innovators.
>
>It is SOMETIMES true that real innovators are not from the academic
>community and are spurned by it. However, it does not follow that EVERY
>idea from outside it is automatically a breakthrough.

Indeed, in most cases, commercial innovations introduced are usually about
a generation _behind_ what is known in CS academia.


Mike James
Nov 3, 1999

> I particularly like the reference to them using Quantum Neurodynamics on
> Pentium PCs. Quantum Neurodynamics is the idea that the brain uses quantum
> processes that couldn't possibly be replicated on a computer such as a PC.
> That has to qualify as one of the ultimate pieces of bullshit marketing of
> all time.

Clever, though - making it sound like Quantum Electrodynamics.
But yes, total bullshit - and this isn't the first time this field has had to
put up with it. Read some of the early papers on Neurodynamics (without the
Quantum) and you can see that all they were doing was writing down big sets
of equations that said nothing but the obvious and were impossible to take
any further.

mikej

Mike James
Nov 3, 1999
> > If you want details of HNeT including theory of operation, mathematics
> > involved, and more check out this book:
> >
> > Fuzzy, Holographic, and Parallel Intelligence
> > Branko Soucek, Editor
> > Copyright 1992
> > Published by John Wiley & Sons
> > ISBN #0-471-54772-7
>
> bj: Great. Thanks.
But have you seen the price!
Second hand copy anyone?
mikej

Paul Victor Birke
Nov 3, 1999
Ian wrote:

> Indeed, in most cases, commercial innovations introduced are usually about
> a generation _behind_ what is known in CS academia.

***********************************************
Dear Ian

Strong statement!! It is always a bad idea to assume what other people
have or don't have, know or don't know!

Tidbit from a boss about 10 years ago.

Although from what I have seen of commercial software, I would tend to
agree with you.

Paul Birke, P. Eng. NN Researcher in Guelph ON CANADA

E. Weddington
Nov 3, 1999

Mike James wrote:

You could try going to your local library and doing an "inter-library loan";
some college campus somewhere in the US surely has a copy of it in its library.

Eric Weddington


Mike James
Nov 3, 1999
> > > bj: Great. Thanks.
> > But have you seen the price!
> > Second hand copy anyone?
> > mikej
>
> You could try going to your local library and do an "inter-library loan";
> Some college campus somewhere in the US surely has a copy of it in their
> library.

Not a serious problem - I can get it from the British Library - my real
point (subtle I admit) was that perhaps publishing the book is the main way
anyone is going to make money out of the idea.
:-)
mikej

E. Weddington
Nov 3, 1999

Mike James wrote:

Whoops! Sorry, I didn't notice that you're in the UK!

I agree, perhaps that is the only way they're going to make money out of it
(they have to pay for the patents somehow!).

Except... I mentioned that the book is a collection of papers written by
various authors, and the book is one in a series put together the same
way (promoted as "advanced research", at least at the time). I don't see how
ANY of the authors will get any real money out of the deal. Probably the only
people making money off the books are the publishers, Wiley and Sons. Because
of the limited audience, they have to make money off it somehow too.

Eric Weddington


Ian
Nov 4, 1999
Paul Victor Birke <nonl...@home.com> wrote:

>Ian wrote:
>
>> Indeed, in most cases, commercial innovations introduced are usually about
>> a generation _behind_ what is known in CS academia.
>***********************************************
>Dear Ian
>
>Strong statement!! Always a bad assumption to assume what other people
>have or don't have; know or don't know!

Strong statement yes, but often true. I've seen many cases. It's a lot
easier to create a research system and verify that the concept works, than
it is to create a highly integrated, working commercial system _and_ get
the market to accept it.


Mike James
Nov 4, 1999
> Whoops! Sorry, I didn't notice that you're in the UK!

No need to apologise - it's the nice thing about a newsgroup that we all seem
local :-)


> I don't see how ANY of the authors will get any real money out of the deal.
> Probably the only people making money off of the books is the publishers,
> Wiley and Sons. Because of the limited audience, they have to make money
> off of it somehow too.

I know what the finances are of book publishing (I used to run a small
academic publishing company) and all I can say is the current price of such
books is a scandal - it mainly continues because libraries will pay the
asking price.

cheers
mikej

bj flanagan
Nov 4, 1999

"Mike James" wrote:

> I know what the finances are of book publishing (I used to run a small
> academic publishing company) and all I can say is the current price of such
> books is a scandal - it mainly continues because libraries will pay the
> asking price.


bj: Yes. Publishing is central to the larger, pervasive pattern of
corruption which befouls academia. Happily, there is evidence that real
change is in the wind. Witten et al. have launched an online publication
in the wake of boycott threats against a highly prominent physics
journal.

news
Nov 5, 1999
Thanks to everyone for the comments.

I'm going to get "Fuzzy, Holographic, and Parallel Intelligence."
If I learn something technical, great; if it's all BS, I've learned
something about marketing.

Thanks



Mike James
Nov 5, 1999
> bj: Yes. Publishing is central to the larger, pervasive pattern of
> corruption which befouls academia. Happily, there is evidence that real
> change is in the wind. Witten et al. have launched an online publication
> in the wake of boycott threats against a highly prominent physics
> journal.
>

Tell us more - I think it's just about on topic.
In particular, what online publication?
mikej

Konrad Freeman
Nov 8, 1999
Dear news,

They have an article in the most recent issue of PCAI magazine (Sept/Oct 1999
issue, http://www.pcai.com/pcai). My problem with them is that they keep using
(or abusing?) biological terms such as "neurons", "granule cell",
"Purkinje cell", "neural plasticity", etc., with little mathematical
insight into how "HNet" works. Think about it: if this was such a great
"breakthrough", how come the AI and data analysis communities are so cool to it
after so many years? At least, they have done a poor job convincing people, due
to their nebulous, vague explanations.

K.Freeman

news wrote:

> I was surfing the web for neural network information
> and came across the And Corporation website.
> They have a technology called Holographic/Quantum Neural Technology (HNeT).
> About which they claimed --
>
> "It is now realized and accepted that a single holographic/quantum neural
> neuron cell is capable of learning stimulus-response patterns or "memories"
> orders of magnitude faster, and far more accurately than the traditional
> back-propagation or genetically based neural networks."
>
> My questions are --
>
> 1) Is this a technological breakthrough recognized by the neural network
> community, or is this a marketing breakthrough that repackages
> the same old sh*t in cool new terminology?
>

> 2) Is there any commercial technology out there that is clearly superior
> to others?
>
> I'm a comp sci student, with some basic knowledge of neural networks.
> Any input would be greatly appreciated.

The contents of this message express only the sender's opinion.
This message does not necessarily reflect the policy or views of
my employer, Merck & Co., Inc. All responsibility for the statements
made in this Usenet posting resides solely and completely with the
sender.

Paul Pace
Nov 15, 1999
Gentlemen/Ladies,

I have been reading your discussion concerning "Holographic/Quantum
Technology" with some interest. I work for the Department of National
Defence, Research and Development Branch, in Canada. We have been
involved in the use of HNet for the past year and we have applied the
technology to some very difficult military recognition and
identification problems. In particular, we have been using it for the
automatic identification of radar images of military targets,
identification of sonar signals, as well as a couple of other signal
processing applications. We are also working on a project in the area
of face recognition and the identification of objects in IR and visible
images.

In all cases HNet has performed exceptionally well. It has been able to
exceed the performance of other methodologies we have tried. Over the
coming months, we will be continuing to validate the applicability of
the method to our problems; however, to date the results have been
exceptional.

Dr Paul Pace
Research and Development Branch
Department of National Defence
Ottawa




Konrad Freeman
Nov 16, 1999
Paul,

Very interesting. I guess it's the first time I've heard any positive feedback
about HNet. Now, when you say "exceed the performance", do you mean computation
time or fitness of the model? Have you benchmarked it against classical
statistical regression models? (I heard someone say that HNet is just a linear
model in a funny hat.)

Thank you very much.

K. Freeman

Paul Pace wrote:

> In all cases HNet has performed exceptionally well. It has been able to
> exceed the performance of other methodologies we have tried. Over the
> coming months, we will be continuing to validate the applicability of
> the method to our problems; however, to date the results have been
> exceptional.
>
> Dr Paul Pace
> Research and Development Branch
> Department of National Defence
> Ottawa
>


bj flanagan
Nov 17, 1999

Thanks, Paul. I have been reading John Sutherland's piece in:

> Fuzzy, Holographic, and Parallel Intelligence
> Branko Soucek, Editor
> Copyright 1992
> Published by John Wiley & Sons
> ISBN #0-471-54772-7

For which much thanks to E. Weddington.

It is obviously very solid work and indeed, I think, quite brilliant.

In fact, having seen the moment of my greatness flicker, I have decided
to descend from my ivory tower (my sanctuary, my asylum) and make my way
among mortal men for a time.

Thanks to all for an invigorating discussion.

Our serene splendor,


bj

===

Quanta & Consciousness

http://www.freeyellow.com/members/sentek/index.html


> I have been reading your discussion concerning "Holographic/Quantum
> Technology" with some interest. I work for the Department of National
> Defence, Research and Development Branch, in Canada. We have been

> involved in the use of HNet for the past year ...


>
> In all cases HNet has performed exceptionally well. It has been able to
> exceed the performance of other methodologies we have tried.
>

> Dr Paul Pace
> Research and Development Branch
> Department of National Defence
> Ottawa


--

Paul Pace
Nov 18, 1999
Konrad,

A little additional information about the results we are obtaining with
HNet may be useful. I believe the individual who made the comment
"linear model in a funny hat" is a student from Waterloo, and this
comment appears to have been made without detailed experience with
the technology. To my knowledge the University of Waterloo does not
currently have access to the HNeT technology.

In general, the systems we are using have been developed by
a team of scientists and engineers over a period of many years. They
extend considerably beyond classical regression models. Our
applications are to a large extent classified; however, I can
provide you with typical performance comparisons. For one
important ATR application (identification of targets in SAR imagery) we
have attained a 100X decrease in classification error in conjunction
with a 60,000X increase in operational speed. Benchmarks for the
comparisons were obtained from classified sources and cannot be
discussed in this forum.

I would be happy to continue this discussion.

Paul Pace

Peter Davies
Nov 18, 1999

Paul Pace wrote in message <336ec6b0...@usw-ex0107-050.remarq.com>...

>I believe the individual who made the comment
>"linear model in a funny hat" is a student from Waterloo and this
>comment appears to have been made without detailed experience with
>the technology.

This student seems to be on the ball. His comment is consistent with the
views of leading NN researchers and of many knowledgeable people on this board.

I would be interested to know what methods you have compared with HNET. As
you can tell I am skeptical but I am prepared to change my views if I find
hard evidence to do so.

Peter Davies


Will Dwinnell
Nov 18, 1999
Peter Davies wrote:
"I would be interested to know what methods you have compared with
HNET. As you can tell I am skeptical but I am prepared to change my
views if I find hard evidence to do so."

I agree and second this motion. Can anyone provide any sort of real
tests of this technology on standard test data sets? No statistician
would ever accept these claims without this evidence: it is no sort of
experiment.

Will Dwinnell
pred...@compuserve.com

Ian
Nov 19, 1999
Paul Pace <paul.pac...@crad.dnd.ca.invalid> wrote:

>Konrad,
>
>A little additional information about the results we are obtaining with
>HNet may be useful. I believe the individual who made the comment
>"linear model in a funny hat" is a student from Waterloo and this
>comment appears to have been made without detailed experience with
>the technology.

That was me. I was paraphrasing a comment made by Warren Sarle, who is on
this NG.

>To my knowledge the University of Waterloo does not
>currently have access to the HNeT technology.

No, though I do have access to the HNeT website, home to some of the most
bogus, crappy, and generally blatant-bullshit marketing of all time.


Peter Davies
Nov 19, 1999

Paul Pace <paul.pac...@crad.dnd.ca.invalid> wrote:
>
>To my knowledge the University of Waterloo does not
>currently have access to the HNeT technology.
>

The University of Waterloo has one of the top Computer Science schools in
North America. The fact that it does not have "access to the HNeT
technology" might give one food for thought about this "technology". They
do not usually have any difficulty getting any hardware/software they want,
often for free.

Peter Davies


bj flanagan
Nov 19, 1999

> No, though I do have access to the HNeT website, home to some of the most
> bogus, crappy, and generally blatant-bullshit marketing of all time.


bj: Is there a subtext here?


--
Quanta & Consciousness

http://www.freeyellow.com/members/sentek/index.html


bj flanagan
Nov 19, 1999

"Peter Davies" wrote:

> The University of Waterloo has one of the top Computer Science schools in
> North America. The fact that it does not have "access to the HNeT
> technology" might give one food for thought about this "technology".


bj: Your assessment of the University's stature in turn suggests that
you are a paid agent of certain dark forces whose names inspire dread in
all who hear them.


> They do not usually have any difficulty getting any hardware/software they
> want, usually for free.


bj: Due, no doubt, to fears of swift and horrific reprisal should their
demands not be met.

LZ
Nov 19, 1999
I have used HNeT (Holographic Neural Technology) in my work for many
years in the field of energy R&D. Indeed I have found HNeT has
significant advantages over other neural network products on the
market. One obvious advantage is its speed of learning; if you have
used HNeT and others, the difference is very big. It also gives
indications of the importance of each individual parameter and their
combinations. I have done some comparison of neural network
performance. HNeT comes out on top for its speed of learning, degree of
accuracy, and many other measures.

Warren Sarle
Nov 19, 1999

In article <avk0OHOiUe=MG0YBasOpIcj=K5...@4ax.com>, Ian <iadm...@undergrad.math.uwaterloo.ca> writes:

|> Paul Pace <paul.pac...@crad.dnd.ca.invalid> wrote:
|>
|> >A little additional information about the results we are obtaining with
|> >HNet may be useful.I believe the individual who made the comment
|> >"linear model in a funny hat" is a student from Waterloo and this
|> >comment appears to be have been made without detailed experience with
|> >the technology.
|>
|> That was me. I was paraphrasing a comment made by Warren Sarle, who is on
|> this NG.

A lovely example of memetic mutation. :-)

What I actually said (as I recall, without bothering to look it up) was
that based on a long and detailed presentation by Sutherland at JCIS'98,
the performance characteristics of HNet seemed to be virtually the same
as least-squares polynomial regression trained by conjugate gradients,
which would be called a functional link network or higher order network
in neural net lingo. For example:

* With P inputs, you can fit P+1 cases exactly.
* Learning nonlinear functions requires augmenting the data with
nonlinear functions of the original inputs.
* One training pass over the data will give fairly low training
error, sometimes with good generalization, but exact minimization
of the training error requires several passes.
* The error function has only one minimum, so you automatically
get a global optimum.
* HNet gives you a number for each case that is a measure of how
unfamiliar the inputs are. The corresponding quantity in regression
is leverage or (when the model has an intercept) Mahalanobis
distance.

Of course, the big difference is that HNet takes complex data, so
comparisons with regression may not be straightforward.
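For anyone who wants to poke at the regression side of this comparison, here is a short sketch (synthetic data, plain least squares - not HNet itself) of the construction described above: a "functional link" design matrix with a nonlinear augmentation, a unique least-squares optimum, and leverage as the analogue of the unfamiliarity score:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = x ** 2                      # a nonlinear target function

# Functional-link augmentation: columns [1, x, x^2].
# Learning a nonlinear function requires these added nonlinear inputs.
X = np.column_stack([np.ones_like(x), x, x ** 2])

# The squared-error surface has a single minimum, so this solve
# automatically yields the global optimum.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# Leverage = diagonal of the hat matrix H = X (X'X)^-1 X'; high values
# flag inputs unlike the training bulk, analogous to an "unfamiliarity"
# score in the comparison above.
H = X @ np.linalg.pinv(X.T @ X) @ X.T
leverage = np.diag(H)

print(np.allclose(X @ w, y))            # True: x**2 lies in the model's span
print(np.isclose(leverage.sum(), 3.0))  # True: trace(H) = number of parameters
```

With three free parameters the model can match three cases exactly; here all five cases are fit because the target is in the augmented column span.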

I cannot guarantee that my recollections are completely accurate, so
it would be nice if someone who has read the book would comment.

I have never used HNet. I have no reason to think it is not good,
useful software, but I have no hard evidence one way or the other.

--

Warren S. Sarle, SAS Institute Inc., SAS Campus Drive, Cary, NC 27513, USA
sas...@unx.sas.com, (919) 677-8000
The opinions expressed here are mine and not necessarily those of SAS Institute.

Ian
Nov 20, 1999
LZ <lzhengN...@nrcan.gc.ca.invalid> wrote:

This is a bit suspicious.

We have seen posts from two people who claim to have used the HNeT software
for government work in Canada. They both claim that it is great stuff, but
provide no details. But consider:

- both posts come through RemarQ, which makes it rather difficult to tell
where they actually originated.
- both posters supply email addresses which are spam-blocked in the _exact_
same format (initial,last name,NO,initials,SPAM,address,invalid).


Warren Sarle
Nov 21, 1999

In article <iazZ3.4479$UX1.1...@cac1.rdr.news.psi.ca>, "Peter Davies" <pet...@NOSPAMSPAMSPAMSPAMinterlog.com> writes:
|>
|> A nice bit of sleuthing, Ian. I wondered myself whether these
|> "testimonials" were real or bogus. You have shed some light on the subject.

Somebody could shed some real light by programming the HNet algorithm,
which is only a few lines of complex matrix algebra, and trying it out.
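In that spirit, here is a toy reconstruction. The encoding (real values in [0, 1) mapped to unit phasors) and the one-pass correlation rule below are one reading of the published holographic method, not AND Corporation's actual code:

```python
import cmath
import math

def encode(v):
    # assumed encoding: a real value in [0, 1) becomes a unit-magnitude phasor
    return cmath.exp(2j * math.pi * v)

def learn(stimuli, responses):
    # one-pass "holographic" learning: accumulate the correlation of each
    # conjugated stimulus element with the encoded response
    memory = [0j] * len(stimuli[0])
    for s, r in zip(stimuli, responses):
        r_enc = encode(r)
        for j, v in enumerate(s):
            memory[j] += encode(v).conjugate() * r_enc
    return memory

def recall(memory, s):
    # superpose the stimulus against the memory, then decode the phase
    z = sum(encode(v) * m for v, m in zip(s, memory)) / len(memory)
    return (cmath.phase(z) % (2 * math.pi)) / (2 * math.pi)

# a single stored pattern is recalled exactly; more patterns add crosstalk
mem = learn([[0.1, 0.4, 0.7]], [0.3])
print(round(recall(mem, [0.1, 0.4, 0.7]), 9))   # 0.3
```

Storing several patterns and recalling with a novel stimulus exposes the crosstalk behaviour, which is exactly the kind of test the claims above invite.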

Will Dwinnell
Nov 22, 1999
LZ wrote:
"I have used HNeT (Holographic Neural Technology) in my work for many
years in the field of energy R&D. Indeed I have found HNeT has
significant advantages over other neural network products on the market.
One obvious advantage is its speed of learning; if you have used HNeT
and others the difference is very big. It also gives indications of the
importance of each individual parameter and their combinations. I have
done some comparison of neural network performance. HNeT comes out on
top for its speed of learning, degree of accuracy, and many others."

It has been my experience, after reviewing many such claims about
technologies, that these claims are incomplete. That is, they ignore
potential weaknesses of the technology. I have reviewed and examined
scores of these products, and one of my first questions to vendors is
"What are the disadvantages of this technique?" You'd be surprised at
the candid answers I sometimes receive. Anyway, I have observed that
many technologies can be characterized by a number of behavioral
traits. While people often concentrate on learning speed and model
accuracy, there are other important qualities of a learning algorithm,
such as understandability of the model to humans, automatic selection of
relevant input variables, scalability to large datasets, ability to deal
with multiple outputs, speed of recall, distribution of errors and size
of resulting model, among others. Typically (again, in my experience),
improvements in some of these areas mean degradation in the others.
Many times this newsgroup has witnessed rash claims of the speed of
learning of RBF neural networks versus MLP neural networks. For many
problems, RBF neural networks do learn faster than MLP neural networks,
but recall speed begins to suffer significantly for RBF NNs as the data
set becomes larger. This can be offset, of course, through clustering
or some other preprocessing method, but this diminishes the speed
advantage of RBF NNs.
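To make the recall-speed point concrete: a plain GRNN keeps every training sample and sums a kernel over all of them at query time, so recall is O(N) per prediction. A toy sketch (illustrative only, not any particular vendor's implementation):

```python
import numpy as np

def grnn_predict(X_train, y_train, x, sigma):
    """GRNN recall: kernel-weighted average over ALL stored samples, O(N) per query."""
    d2 = np.sum((X_train - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return np.dot(w, y_train) / np.sum(w)

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(500, 2))     # "training" is just storing the data
y = np.sin(X[:, 0]) + X[:, 1]
pred = grnn_predict(X, y, np.array([0.2, -0.3]), sigma=0.2)
```

Training cost is essentially zero, which is where the "orders of magnitude faster learning" claims usually come from; the bill arrives at recall time as the stored set grows.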

Will Dwinnell
pred...@compuserve.com

Paul Victor Birke

unread,
Nov 23, 1999, 3:00:00 AM11/23/99
to
Will Dwinnell wrote:
>
> LZ wrote:
> "I have used HNeT (Holographic Neural Technology) in my work for many
> years in the field of energy R&D. Indeed I have found HNeT has
> significant advantages over other neural network products on the market.
> One obvious advantage is its speed of learning,

I would rather have accuracy than speed.

> if you have used HNeT
> and others the difference is very big. It also gives indications of the
> importance of each individual parameter and their combinations. I have
> done some comparison of neural network performance. HNeT comes out on
> top for its speed of learning, degree of accuracy, and many others."
>
> It has been my experience, after reviewing many such claims about
> technologies, that these claims are incomplete. That is, they ignore
> potential weaknesses of the technology. I have reviewed and examined
> scores of these products, and one of my first questions to vendors is
> "What are the disadvantages of this technique?" You'd be surprised at
> the candid answers I sometimes receive. Anyway, I have observed that
> many technologies can be characterized by a number of behavioral
> traits. While people often concentrate on learning speed and model
> accuracy,

Yes indeed!!


> there are other important qualities of a learning algorithm,
> such as understandability of the model to humans,

always a tough job for the >>other humans<<, no doubt

> automatic selection of
> relevant input variables,

can be a problem

> scalability to large datasets,

Indeed

> ability to deal
> with multiple outputs,

Yes

> speed of recall,

??

> distribution of errors

Yes

> and size
> of resulting model, among others. Typically (again, in my experience),
> improvements in some of these areas mean degradation in the others.
> Many times this newsgroup has witnessed rash claims of the speed of
> learning of RBF neural networks versus MLP neural networks.

Yes


> For many
> problems, RBF neural networks do learn faster than MLP neural networks,
> but recall speed begins to suffer significantly for RBF NNs as the data
> set becomes larger.

Not clear here; are you referring to O(n^2) for matrix solutions?

> This can be offset, of course, through clustering

Yes

> or some other preprocessing method, but this diminishes the speed
> advantage of RBF NNs.

I cannot accept that last one- a bit too broad. I would say that is not
my experience with RBFNNs.

Remember what Prof. Geoffrey Hinton said:==>

>> Every year, you may get a NEW and BIGGER and FASTER MACHINE. (So don't worry!)<<

all the best,


Paul

Will Dwinnell

unread,
Nov 23, 1999, 3:00:00 AM11/23/99
to
I (Will Dwinnell) wrote:
"For many problems, RBF neural networks do learn faster than MLP neural
networks, but recall speed begins to suffer significantly for RBF NNs as
the data set becomes larger."

Paul Victor Birke wrote:
"Not clear here are you referring to O(n^2) for matrix solutions??"

I meant plain vanilla radial basis function neural networks, like PNN
and GRNN.

I (Will Dwinnell) continued:
"This can be offset, of course, through clustering or some other
preprocessing method, but this diminishes the speed advantage of RBF
NNs."

Paul Victor Birke responded:
"I cannot accept that last one- a bit too broad. I would say that is
not my experience with RBFNNs."

I said 'diminished', not 'eliminated'. I am not offering an analysis of
a specific case: maybe for the kind of problems you have worked on, the
preprocessing does not represent a significant cost, but it is a cost
(which is all I was saying), and it generally will grow with the size of
the data set.

Will Dwinnell
pred...@compuserve.com

Paul Victor Birke

unread,
Nov 24, 1999, 3:00:00 AM11/24/99
to
Will Dwinnell wrote:
>
> I (Will Dwinnell) wrote:
> "For many problems, RBF neural networks do learn faster than MLP neural
> networks, but recall speed begins to suffer significantly for RBF NNs as
> the data set becomes larger."
>
> Paul Victor Birke wrote:
> "Not clear here are you referring to O(n^2) for matrix solutions??"
>
> I meant plain vanilla radial basis function neural networks, like PNN
> and GRNN.

I use an SVD matrix method to solve for the weights in the RBF problem,
so I was referring to the natural buildup of solution time with the
number of variables n. Although the length of the data set, m, is
important, it is not as important as n.
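The SVD route described here amounts to a linear least-squares solve for the RBF output weights; numpy's `lstsq` uses exactly that under the hood. A small sketch on a made-up example problem:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 200, 20                            # data points, RBF centres
X = rng.uniform(-1, 1, size=(m, 1))
y = np.sin(3 * X[:, 0])                   # toy target function
centres = np.linspace(-1, 1, n)

# Design matrix of Gaussian RBF activations, shape (m, n)
Phi = np.exp(-(X - centres) ** 2 / (2 * 0.2 ** 2))

# SVD-based least-squares solve for the output weights; cost grows like O(m * n^2)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
max_resid = np.max(np.abs(Phi @ w - y))
```

This is why n (the number of centres/variables) dominates the solve time: the SVD of the m-by-n design matrix scales with n squared, while m enters only linearly.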



> I (Will Dwinnell) continued:
> "This can be offset, of course, through clustering or some other
> preprocessing method, but this diminishes the speed advantage of RBF
> NNs."
>
> Paul Victor Birke responded:
> "I cannot accept that last one- a bit too broad. I would say that is
> not my experience with RBFNNs."
>
> I said 'diminished', not 'eliminated'. I am not offering an analysis of
> a specific case: maybe for the kind of problems you have worked on, the
> preprocessing does not represent a significant cost, but it is a cost
> (which is all I was saying), and it generally will grow with the size of
> the data set.

Well, you have hit upon the problem that the Unsupervised Learning indeed
takes considerable time. While LMS/Tikhonov solutions can occur in
seconds, the Unsupervised Part may take hours. I have not found an
automatic method yet for this part. It is more Art than Science, at
least for me.

all the best,
Paul

Greg Heath

unread,
Nov 24, 1999, 3:00:00 AM11/24/99
to
Date: Wed, 24 NOV 1999 01:40:39 GMT
From: Paul Victor Birke <nonl...@home.com>

I've obtained extremely fast learning by using a home grown *supervised*
leader clustering (sort of a combination of ART and LVQ). Overfitting is
mitigated by pruning the hidden layer and regularized pseudoinversion to
obtain the output layer.
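The "regularized pseudoinversion" step is just a ridge (Tikhonov) solve for the output weights given the hidden-layer activations. A sketch with made-up data (my rendering, not Greg's actual code):

```python
import numpy as np

rng = np.random.default_rng(3)
H = rng.normal(size=(100, 30))    # hidden-layer activations (one row per sample)
Y = rng.normal(size=(100, 2))     # target outputs
lam = 1e-2                        # Tikhonov regularization strength

# Regularized pseudoinverse: W = (H'H + lam*I)^(-1) H'Y
W = np.linalg.solve(H.T @ H + lam * np.eye(30), H.T @ Y)
```

The lam term trades a little training-set fit for better conditioning and less overfitting, which is the point of regularizing the output layer after pruning.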

Greg

Hope this helps.

Gregory E. Heath he...@ll.mit.edu The views expressed here are
M.I.T. Lincoln Lab (781) 981-2815 not necessarily shared by
Lexington, MA (781) 981-0908(FAX) M.I.T./LL or its sponsors
02420-9185, USA


Paul Victor Birke

unread,
Nov 24, 1999, 3:00:00 AM11/24/99
to
*********************************************************
Dear Greg

I have been considering a Supervised Learning Method for Clustering but
wasn't too clear on how to proceed. What I had in mind:==> I was going
to make a comb linear approach in the output space and then find input
clusters mapped onto them. However, I suspect the optimization must be
channeled or controlled as one has a lot of free variables in the means
and variances and these could end up anywhere they liked so to say. One
can at least set the number of nodes to something and start this
process. Did you constrain anything in your method to kind of make the
problem more defined?

I think Supervised Learning for Clustering must be a good method, it
makes sense.

Again, as you say, standard methods to prevent overfitting (pruning and
regularization techniques) to be sure.

Your method looks and sounds good. Any good references you could
recommend re Supervised Learning of Input Output Clusters?

all the best,

Paul V. Birke, P. Eng. NN Researcher in Guelph ON CANADA

Greg Heath

unread,
Nov 24, 1999, 3:00:00 AM11/24/99
to
Date: Wed, 24 NOV 1999 22:49:46 GMT
From: Paul Victor Birke <nonl...@home.com>

> Greg Heath wrote:

> >-----SNIP

> > I've obtained extremely fast learning by using a home grown *supervised*
> > leader clustering (sort of a combination of ART and LVQ). Overfitting is
> > mitigated by pruning the hidden layer and regularized pseudoinversion to
> > obtain the output layer.
> *********************************************************
> Dear Greg
>
> I have been considering a Supervised Learning Method for Clustering but
> wasn't too clear on how to proceed. What I had in mind:==> I was going
> to make a comb linear approach

What is a comb linear approach?

> in the output space and then find input
> clusters mapped onto them. However, I suspect the optimization must be
> channeled or controlled as one has a lot of free variables in the means
> and variances and these could end up anywhere they liked so to say. One
> can at least set the number of nodes to something and start this
> process. Did you constrain anything in your method to kind of make the
> problem more defined?
>
> I think Supervised Learning for Clustering must be a good method, it
> makes sense.
>
> Again as you say standard methods to prevent overfitting of pruning and
> regularization techniques to be sure.
>
> Your method looks and sounds good. Any good references you could
> recommend re Supervised Learning of Input Output Clusters?

My comments were aimed at classification, not regression. If you are
familiar with k-means, ART and LVQ, it's not hard to throw one together.

Relatively plain vanilla:

1. Assume c classes with Ni samples/class (i=1,c).
2. The presentation frequency for class i can be inversely proportional
to Ni and/or you can weight the updating.
3. Use your favorite metric to assign input vector x to the closest cluster
4. If the cluster is of the correct class, use x to update its membership
count, mean vector and covariance matrix.
5. If not, use x to create a new cluster.
6. Test epoch error rates with a validation set
7. Merge and/or prune clusters
8. Test again
9. If satisfied, stop.
10. If not, go back to step 3

Variations are unlimited.

With regression it seems that you would have to first use unsupervised
clustering to define the "classes".
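Steps 3-5 of the recipe above are easy to sketch. Here is a stripped-down toy rendering (my own, tracking only counts and means, not the covariance matrices or the merge/prune passes):

```python
import numpy as np

def supervised_leader_fit(X, labels):
    """Steps 3-5: the nearest cluster of the correct class is updated;
    a wrong-class nearest cluster spawns a new cluster at x."""
    means, counts, cls = [], [], []
    for x, c in zip(X, labels):
        if means:
            i = int(np.argmin([np.linalg.norm(x - m) for m in means]))
            if cls[i] == c:                          # correct class: update running mean
                counts[i] += 1
                means[i] = means[i] + (x - means[i]) / counts[i]
                continue
        means.append(x.astype(float))                # wrong class (or no clusters yet)
        counts.append(1)
        cls.append(c)
    return np.array(means), np.array(cls)

def classify(x, means, cls):
    """Assign x the class of its nearest cluster mean."""
    return cls[int(np.argmin(np.linalg.norm(means - x, axis=1)))]

rng = np.random.default_rng(4)
X = np.vstack([rng.normal([0, 0], 0.3, size=(50, 2)),
               rng.normal([3, 3], 0.3, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)
means, cls = supervised_leader_fit(X, y)
```

Learning is a single pass over the data, which is where the "extremely fast" part comes from; the validation, merge, and prune steps (6-10) then control the cluster count.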

Greg Heath

unread,
Nov 24, 1999, 3:00:00 AM11/24/99
to
Date: Thu, 25 NOV 1999 01:05:14 GMT
From: Paul Victor Birke <nonl...@home.com>

> Greg Heath wrote:
> >>snippings<<


> > What is a comb linear approach?
>

> Define the output range as Ylow ==> Yhigh and divide it equally into the
> same number of regions as the assumed number of trial Hidden Nodes minus 1.
> Place the Output means at equal intervals starting at Ylow.

Oh. I was thinking in terms of a multivariable output. If you use leader
clustering, you don't have to guess at the number. Instead, you define a
threshold boundary such that if D(x,mi) > Ti, then x does not belong to
cluster i. You then define a new cluster with mean at x.

Paul Victor Birke

unread,
Nov 25, 1999, 3:00:00 AM11/25/99
to
Greg Heath wrote:
>>snippings<<
> What is a comb linear approach?

Define the output range as Ylow ==> Yhigh and divide it equally into the
same number of regions as the assumed number of trial Hidden Nodes minus 1.
Place the Output means at equal intervals starting at Ylow.

Paul

Paul Victor Birke

unread,
Nov 25, 1999, 3:00:00 AM11/25/99
to
Greg Heath wrote:
>>snippings<<

BTW:==> What is >>leader clustering<< ?

Thanks,
Paul

Peter Davies

unread,
Nov 25, 1999, 3:00:00 AM11/25/99
to
Paul Victor Birke wrote in message <383CBB95...@home.com>...

About time this thread was renamed, don'tya think, Paul?

Peter


Greg Heath

unread,
Nov 25, 1999, 3:00:00 AM11/25/99
to
Date: Thu, 25 NOV 1999 12:17:42 GMT

From: Paul Victor Birke <nonl...@home.com>

> BTW:==> What is >>leader clustering<< ?

Plain vanilla:

1. Use your favorite metric, D(x-mi), to assign input vector x to the
closest cluster with membership, mean, covariance matrix and boundary
Ni, mi, Ci, and Dmaxi, respectively.
2. If D(x-mi) <= Dmaxi, the cluster is acceptable and x is used to
update cluster parameters.
3. If not, use x to create a new cluster.
4. Test epoch results using your favorite objective
5. Split, merge and/or prune clusters using your favorite criteria
6. Test again
7. If satisfied, stop.
8. If not, go back to step 1.

Variations are unlimited.

Starting with no clusters, the "lead" input vector becomes the mean of
the first cluster.
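In code, the plain-vanilla version (tracking just counts and means; the covariance updates and split/merge/prune steps are omitted, and a single global dmax stands in for the per-cluster Dmaxi) might look like:

```python
import numpy as np

def leader_cluster(X, dmax):
    """Leader clustering: the first ("lead") point seeds cluster 1; any point
    farther than dmax from every existing mean seeds a new cluster."""
    means, counts = [], []
    for x in X:
        if means:
            d = np.linalg.norm(np.array(means) - x, axis=1)
            i = int(np.argmin(d))
            if d[i] <= dmax:                 # acceptable: update running mean
                counts[i] += 1
                means[i] = means[i] + (x - means[i]) / counts[i]
                continue
        means.append(x.astype(float))        # too far from everything: new cluster
        counts.append(1)
    return np.array(means)

rng = np.random.default_rng(5)
X = np.vstack([rng.normal([0, 0], 0.2, size=(40, 2)),
               rng.normal([5, 5], 0.2, size=(40, 2))])
means = leader_cluster(X, dmax=1.0)
```

One pass, no iteration to convergence; the price is sensitivity to presentation order and to the threshold, which is what the later split/merge/prune steps clean up.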

Peter Davies

unread,
Nov 19, 1999, 3:00:00 AM11/19/99
to
Your responses to my messages and to others on this board leave me
perplexed. Maybe you could help us all out here. Should we:

a) Laugh with you?
b) Laugh at you?
c) Ask what you have been smoking?
d) Suggest that you change your initials to bs?

Peter Davies

bj flanagan wrote in message <813ngu$n3m$1...@nnrp1.deja.com>...

Peter Davies

unread,
Nov 20, 1999, 3:00:00 AM11/20/99
to

Ian wrote in message ...

>
>This is a bit suspicious.
>
>We have seen posts from two people who claim to have used the HNeT software
>for government work in Canada. They both claim that it is great stuff, but
>provide no details. But consider:
>
>- both posts come through RemarQ, which makes it rather difficult to tell
>where they actually originated.
>- both posters supply email addresses which are spam-blocked in the _exact_
>same format (initial,last name,NO,initials,SPAM,address,invalid).
>

A nice bit of sleuthing, Ian. I wondered myself whether these
"testimonials" were real or bogus. You have shed some light on the subject.

Peter Davies




Sean O'Connor

unread,
Sep 19, 2017, 1:48:48 AM9/19/17
to
If you regard random projections as a sort of hologram:
https://randomprojectionai.blogspot.com/
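The analogy holds in one concrete sense: like a hologram, a random projection smears every input dimension across every output dimension while approximately preserving pairwise distances (the Johnson-Lindenstrauss property). A quick numpy check of that claim:

```python
import numpy as np

rng = np.random.default_rng(6)
n_points, d, k = 50, 1000, 200
X = rng.normal(size=(n_points, d))

# Gaussian random projection, scaled so squared lengths are preserved in expectation
R = rng.normal(size=(d, k)) / np.sqrt(k)
Z = X @ R

# One pairwise distance before and after projection: the ratio should be near 1
ratio = np.linalg.norm(Z[0] - Z[1]) / np.linalg.norm(X[0] - X[1])
```

Each projected coordinate mixes all 1000 original coordinates, yet the geometry survives the 5x compression to within a few percent.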

Deep Swindle

unread,
Jun 30, 2021, 8:42:05 PM6/30/21
to
On Friday, November 19, 1999 at 3:00:00 AM UTC-5, Peter Davies wrote:
> Your responses to my messages and to others on this board leave me
> perplexed. Maybe you could help us all out here. Should we:
> a) Laugh with you?
> b) Laugh at you?
> c) Ask what you have been smoking?
> d) Suggest that you change your initials to bs?
> Peter Davies
> bj flanagan wrote in message <813ngu$n3m$1...@nnrp1.deja.com>...
> >
> >
> > "Peter Davies" wrote:
> >
> >> The University of Waterloo has one of the top Computer Science schools in
> >> North America. The fact that it does not have "access to the HNeT
> >> technology" might give one food for thought about this "technology".
> >
> >
> >bj: Your assessment of the University's stature in turn suggests that
> >you are a paid agent of certain dark forces whose names inspire dread in
> >all who hear them.
> >
> >
> >They
> >> do not usually have any difficulty getting any hardware/software they
> want, usually for free.
> >
> >
> >bj: Due, no doubt, to fears of swift and horrific reprisal should their
> >demands not be met.
> >
