
Genetic FPGA


mixed nuts

Mar 3, 2016, 8:15:32 PM
...

"Dr. Adrian Thompson is a researcher operating from the Department of
Informatics at the University of Sussex, and his experimentation in the
mid-1990s represented some of science’s first practical attempts to
penetrate the virgin domain of hardware evolution. The concept is
roughly analogous to Charles Darwin’s elegant principle of natural
selection, which describes how individuals with the most advantageous
traits are more likely to survive and reproduce. This process tends to
preserve favorable characteristics by passing them to the survivors’
descendants, while simultaneously suppressing the spread of less-useful
traits.

Dr. Thompson dabbled with computer circuits in order to determine
whether survival-of-the-fittest principles might provide hints for
improved microchip designs. As a test bed, he procured a special type of
chip called a Field-Programmable Gate Array (FPGA) whose internal logic
can be completely rewritten as opposed to the fixed design of normal
chips. This flexibility results in a circuit whose operation is hot and
slow compared to conventional counterparts, but it allows a single chip
to become a modem, a voice-recognition unit, an audio processor, or just
about any other computer component. All one must do is load the
appropriate configuration."
...
Dr. Thompson peered inside his perfect offspring to gain insight into
its methods, but what he found inside was baffling. The plucky chip was
utilizing only thirty-seven of its one hundred logic gates, and most of
them were arranged in a curious collection of feedback loops. Five
individual logic cells were functionally disconnected from the rest—
with no pathways that would allow them to influence the output— yet when
the researcher disabled any one of them the chip lost its ability to
discriminate the tones. Furthermore, the final program did not work
reliably when it was loaded onto other FPGAs of the same type.

http://www.damninteresting.com/on-the-origin-of-circuits/

--
Grizzly H.

George Herold

Mar 3, 2016, 11:46:35 PM
> individual logic cells were functionally disconnected from the rest--
> with no pathways that would allow them to influence the output-- yet when
> the researcher disabled any one of them the chip lost its ability to
> discriminate the tones. Furthermore, the final program did not work
> reliably when it was loaded onto other FPGAs of the same type.
>
> http://www.damninteresting.com/on-the-origin-of-circuits/
>
> --
> Grizzly H.

Ahh, AFAICT natural selection is fairly chaotic;
breeding... choosing what you want... is much faster.
(I didn't read the whole post...)

George H.

rickman

Mar 4, 2016, 12:40:38 AM
I don't think you understand the idea here. The purpose is to remove
the programmer and let the software develop on its own. Rules are set
up to select variants that are closer to the desired behavior in each
generation and eventually the process generates something that works,
mostly.

What is interesting here is that each generation appears to have been
tested on a real chip rather than in simulation. So the circuit
depended for its operation on specific properties of delay and, it
sounds, on noise coupling, which means it didn't work properly on other
chips. I think some guidance, such as providing a clock and running in
the world of simulation (with useful boundary conditions on what
features are supported), might produce a design that was closer to
being useful.
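
To make that loop concrete, here is a minimal sketch in Python. The
genome size, population size, mutation rate, and the bit-counting
fitness stand-in are illustrative placeholders only; in Thompson's
setup the score came from measuring a real chip's response to the
test tones.

    import random

    GENOME_BITS = 1800       # illustrative size, not the XC6200 format
    POP_SIZE = 50
    MUTATION_RATE = 0.002    # per-bit flip probability

    def fitness(genome):
        # Toy stand-in so the sketch runs: reward genomes with more 1s.
        # In the real experiment each candidate was scored on hardware.
        return sum(genome)

    def mutate(genome):
        # Flip each bit independently with small probability.
        return [bit ^ (random.random() < MUTATION_RATE) for bit in genome]

    population = [[random.randint(0, 1) for _ in range(GENOME_BITS)]
                  for _ in range(POP_SIZE)]

    for generation in range(500):
        ranked = sorted(population, key=fitness, reverse=True)
        survivors = ranked[:POP_SIZE // 5]       # keep the best fifth
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(POP_SIZE - len(survivors))]

    print(fitness(max(population, key=fitness)))  # climbs toward GENOME_BITS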

--

Rick

Allan Herriman

Mar 4, 2016, 4:52:40 AM
That was in a Xilinx 6200 family part. Xilinx bought and then
discontinued that line of parts in the 1990s.
They were unique (at the time) in that the bitstream format was fully
specified, allowing researchers to write tools that manipulated the
configuration directly.

Regards,
Allan

Syd Rumpo

Mar 4, 2016, 6:12:17 AM
On 04/03/2016 05:40, rickman wrote:

<snipped>

> I don't think you understand the idea here. The purpose is to remove
> the programmer and let the software develop on its own. Rules are set
> up to select variants that are closer to the desired behavior in each
> generation and eventually the process generates something that works,
> mostly.

Trouble is, genetic algorithms tend to get 'trapped' in local maxima, so
you end up with a chip which works OK but has a bad back.
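
The trap is easy to demonstrate in one dimension. A greedy climber on
a made-up two-peak landscape settles on whichever peak it starts
nearest to:

    def landscape(x):
        # a local peak of height 3 at x=2, the global peak of 10 at x=8
        return max(3 - (x - 2) ** 2, 10 - (x - 8) ** 2)

    x, step = 1.0, 0.1
    while max(landscape(x - step), landscape(x + step)) > landscape(x):
        x += step if landscape(x + step) > landscape(x) else -step

    print(x, landscape(x))   # stops near x=2, never finds the peak at x=8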

Cheers
--
Syd

Syd Rumpo

Mar 4, 2016, 6:15:06 AM
Or, of course, a "...terrible pain in all the diodes down my left side".

Cheers
--
Syd

bitrex

Mar 4, 2016, 7:08:32 AM
On 03/03/2016 08:15 PM, mixed nuts wrote:
> ...
>
Furthermore, the final program did not work
>reliably when it was loaded onto other FPGAs of the same type.
>
> http://www.damninteresting.com/on-the-origin-of-circuits/
>

Yeah, wasn't the final problem with making any practical use of this
that there was too much variation between the hardware of individual
devices nominally of the same device type? So you could never know
exactly what it would be doing with any specific device, or if it would
work when ported to another.

The only way gates with no connection to anything could break
functionality when they're removed is if the algorithm is exploiting
some nonideality in the silicon.

Genetic programming/algorithms are a practical technique, but they seem
to work well only in software, where the physical layer is abstracted
away. They could work for FPGAs too, but it would make more sense to use
an abstract model of the device when "evolving" the hardware.


John Larkin

Mar 4, 2016, 10:20:06 AM
Yikes, hairball async logic! Of course it wasn't repeatable.


>>
>> --
>> Grizzly H.
>
>Ahh, AFAICT natural selection is fairly chaotic,
>breeding... choosing what you want, is much faster.
>(I didn't read the whole post...)
>
>George H.

Random mutation and selection is a terrible way to design software or
logic. The numbers are not promising.

I suspect that nature doesn't use mutation+selection either; it's too
inefficient.

There are periodic bursts of enthusiasm for automatic design. So far,
none seem to work. Give it another few hundred years and we'll see.




--

John Larkin Highland Technology, Inc

lunatic fringe electronics

rickman

Mar 4, 2016, 11:47:38 AM
Wow, when I read your post I thought you were talking about natural
selection. That runs in bursts too. lol

So what *does* nature use? Why does nature need to be "efficient" with
natural selection? Lots of things in nature are not efficient. Plants
store about 2% of the sunlight that hits them.

--

Rick

bitrex

Mar 4, 2016, 1:11:40 PM
Mutation and selection aren't great ways to design full pieces of
software or hardware, no.

What nature-inspired algorithms _are_ good for is developing solutions
for particular problems, ones which by their nature (NP hard/NP
complete) are intractable to direct computation, and where heuristic
aids to solution are hard to come by.
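
As a concrete instance, a 0/1 knapsack (a classic NP-hard problem)
needs only a fitness function once the encoding is fixed. The item
data here are invented, and the function plugs into a generational
loop like the one sketched earlier in the thread:

    WEIGHTS = [12, 7, 11, 8, 9]
    VALUES  = [24, 13, 23, 15, 16]
    LIMIT   = 26                     # knapsack capacity

    def fitness(picks):
        # picks is a list of 0/1 genes, one per item
        weight = sum(w * p for w, p in zip(WEIGHTS, picks))
        value  = sum(v * p for v, p in zip(VALUES, picks))
        return value if weight <= LIMIT else 0   # infeasible scores zero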

bitrex

Mar 4, 2016, 1:14:36 PM
Yeah, saying that "nature wouldn't do anything that's inefficient"
misunderstands the "nature" of natural selection.

Nature is full of inefficiencies. A trait doesn't have to be actively
beneficial to be selected for in a particular evolutionary niche,
merely not harmful.


John Devereux

Mar 4, 2016, 1:53:28 PM
Oh no don't start him on that again....

--

John Devereux

John Larkin

Mar 4, 2016, 2:06:35 PM
People are defined by the things they refuse to think about.


--

John Larkin Highland Technology, Inc
picosecond timing precision measurement

jlarkin att highlandtechnology dott com
http://www.highlandtechnology.com

rickman

Mar 4, 2016, 3:02:34 PM
You mean like most of electronics? Yes, if a problem is "intractable"
to direct computation it is a hard problem. Why worry about the easy
problems?

http://www.economist.com/node/10202662

https://en.wikipedia.org/wiki/Evolved_antenna

--

Rick

John Larkin

Mar 4, 2016, 3:35:19 PM
On Fri, 4 Mar 2016 07:08:24 -0500, bitrex
<bit...@de.lete.earthlink.net> wrote:

>On 03/03/2016 08:15 PM, mixed nuts wrote:
>> ...
>>
> Furthermore, the final program did not work
>>reliably when it was loaded onto other FPGAs of the same type.
>>
>> http://www.damninteresting.com/on-the-origin-of-circuits/
>>
>
>Yeah, wasn't the final problem with making any practical use of this
>that there was too much variation between the hardware of individual
>devices nominally of the same device type? So you could never know
>exactly what it would be doing with any specific device, or if it would
>work when ported to another.
>
>The only reason that there could be gates not connected to something
>that breaks functionality when they're removed is that the algorithm is
>exploiting some nonideality in the silicon.


Asynchronous logic behavior depends on prop delays, which vary from
chip to chip, and with time/temperature/Vcc. Synchronous logic will
work exactly the same on any number of chips.


>
>Genetic programming/algorithms is a practical technique, but seems to
>only work well in software, where the physical layer is abstracted away.
>Could work for FPGAs too, but it would make more sense to use an
>abstract model of the device when "evolving" the hardware.
>

The numbers are discouraging. Randomly changing one bit in an FPGA
config file, or a program binary, will almost always break it.
Changing several bits is worse. The number of ways to break even a
small design vastly exceeds the number of electrons in the universe.
And every "evolutionary" experiment has to be tested, for both
functionality and for bugs.
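
The scale of that claim is easy to check. Taking a round, made-up
figure of one million configuration bits:

    import math

    CONFIG_BITS = 1_000_000                # round illustrative figure

    # Possible configurations: 2**CONFIG_BITS, a number with about
    # 301,030 decimal digits; the observable universe holds roughly
    # 1e80 electrons.
    print(CONFIG_BITS * math.log10(2))     # ~301029.99 digits

    # Even just the 5-bit mutations of a single known-good file:
    print(math.comb(CONFIG_BITS, 5))       # ~8.3e27 variants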

"Intelligent design" works better than mutation and selection.

Jeroen Belleman

Mar 4, 2016, 4:09:38 PM
On 04/03/16 21:34, John Larkin wrote:
> On Fri, 4 Mar 2016 07:08:24 -0500, bitrex
> <bit...@de.lete.earthlink.net> wrote:
>
>> On 03/03/2016 08:15 PM, mixed nuts wrote:
>>> ...
>>>
>> Furthermore, the final program did not work
>>> reliably when it was loaded onto other FPGAs of the same type.
>>>
>>> http://www.damninteresting.com/on-the-origin-of-circuits/
>>>
>>
>> Yeah, wasn't the final problem with making any practical use of this
>> that there was too much variation between the hardware of individual
>> devices nominally of the same device type? So you could never know
>> exactly what it would be doing with any specific device, or if it would
>> work when ported to another.
>>
>> The only reason that there could be gates not connected to something
>> that breaks functionality when they're removed is that the algorithm is
>> exploiting some nonideality in the silicon.
>
>
> Asynchronous logic behavior depends on prop delays, which vary from
> chip to chip, and with time/temperature/Vcc. [...]

That isn't true. A properly designed asynchronous state machine
will have *one* well-defined action associated with each
combination of state and input change, just like a synchronous
SM. Admittedly, problems may arise if several inputs are allowed
to change 'simultaneously'.

SSMs can have their quirks too. The key point is that either has
to be properly designed.

Jeroen Belleman

rickman

Mar 4, 2016, 4:33:13 PM
On 3/4/2016 3:34 PM, John Larkin wrote:
> On Fri, 4 Mar 2016 07:08:24 -0500, bitrex
> <bit...@de.lete.earthlink.net> wrote:
>
>> On 03/03/2016 08:15 PM, mixed nuts wrote:
>>> ...
>>>
>> Furthermore, the final program did not work
>>> reliably when it was loaded onto other FPGAs of the same type.
>>>
>>> http://www.damninteresting.com/on-the-origin-of-circuits/
>>>
>>
>> Yeah, wasn't the final problem with making any practical use of this
>> that there was too much variation between the hardware of individual
>> devices nominally of the same device type? So you could never know
>> exactly what it would be doing with any specific device, or if it would
>> work when ported to another.
>>
>> The only reason that there could be gates not connected to something
>> that breaks functionality when they're removed is that the algorithm is
>> exploiting some nonideality in the silicon.
>
>
> Asynchronous logic behavior depends on prop delays, which vary from
> chip to chip, and with time/temperature/Vcc. Synchronous logic will
> work exactly the same on any number of chips.

Clearly a man out of his element. Asynchronous logic is designed to
give repeatable results independent of the various logic delays. That
is why it is harder to design, and why synchronous logic is used almost
exclusively. There are companies working on async logic FPGAs, which may
provide large benefits once they tame the design issues through a
combination of hardware and software targeted at the various problems.


>> Genetic programming/algorithms is a practical technique, but seems to
>> only work well in software, where the physical layer is abstracted away.
>> Could work for FPGAs too, but it would make more sense to use an
>> abstract model of the device when "evolving" the hardware.
>>
>
> The numbers are discouraging. Randomly changing one bit in an FPGA
> config file, or a program binary, will almost always break it.
> Changing several bits is worse. The number of ways to break even a
> small design vastly exceeds the number of electrons in the universe.
> And every "evolutionary" experiment has to be tested, for both
> functionality and for bugs.

Twiddling bits in a config file is not the only way to implement
evolving hardware. Your argument is exactly the argument used to refute
natural selection.


> "Intelligent design" works better than mutation and selection.

Some would disagree.

bitrex

Mar 4, 2016, 6:02:07 PM
I don't think anything less than a true AI will really be any good at
designing analog electronics, except where, as in the links you posted,
the algorithms are used for optimizing or finding an optimal solution in
a large but in some sense still fairly restricted domain.

I'm not an expert at engineering, but a lot of it seems to be
understanding the human factor. And just because a problem is
intractable to direct algorithmic computation doesn't mean that there
won't be "obvious" solutions apparent to a human immediately which would
still take an algorithm a long time to grind out.

Machines aren't creative. They don't "understand" anything about humans.
They don't "understand" anything about anything. They're not "really"
good at playing chess.

All current search algorithms, including genetic algorithms, are
essentially a variation on a theme: you can traverse a tree or you can
roll dice, or some combination, and that's about it.

"Find the optimal antenna shape for an antenna of this size in this
application"? Sure, it can do that.

"Design me an FM transmitter that will suit this client's fickle and
changing requirements?" seems like asking an "AI" system "Hey, could you
compose a piece of music that I'll like" or "Hey, could you tell me if
this is a 'good' novel?"

It's probably far easier to have a human do it.

bitrex

Mar 4, 2016, 6:07:36 PM
On 03/04/2016 03:34 PM, John Larkin wrote:
> On Fri, 4 Mar 2016 07:08:24 -0500, bitrex
> <bit...@de.lete.earthlink.net> wrote:
>
>> On 03/03/2016 08:15 PM, mixed nuts wrote:
>>> ...
>>>
>> Furthermore, the final program did not work
>>> reliably when it was loaded onto other FPGAs of the same type.
>>>
>>> http://www.damninteresting.com/on-the-origin-of-circuits/
>>>
>>
>> Yeah, wasn't the final problem with making any practical use of this
>> that there was too much variation between the hardware of individual
>> devices nominally of the same device type? So you could never know
>> exactly what it would be doing with any specific device, or if it would
>> work when ported to another.
>>
>> The only reason that there could be gates not connected to something
>> that breaks functionality when they're removed is that the algorithm is
>> exploiting some nonideality in the silicon.
>
>
> Asynchronous logic behavior depends on prop delays, which vary from
> chip to chip, and with time/temperature/Vcc. Synchronous logic will
> work exactly the same on any number of chips.
>
>
>
>
> The numbers are discouraging. Randomly changing one bit in an FPGA
> config file, or a program binary, will almost always break it.

What I'm saying is that you could certainly write a program that spits
out VHDL or Verilog files by way of genetic algorithms, and if the shit
works in Quartus II or whatever, I see no reason it won't work on the
hardware every time.
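
A sketch of that pipeline, with a toy genome-to-Verilog mapping;
run_testbench below is a placeholder for whatever simulation or
synthesis flow is available, not a real command:

    import random
    import subprocess

    def render_verilog(genome):
        # Toy mapping: two genes pick the operators in a fixed netlist.
        # Emitting only synthesizable constructs is what abstracts the
        # physical layer away.
        ops = ["&", "|", "^"]
        g1, g2 = ops[genome[0] % 3], ops[genome[1] % 3]
        return ("module candidate(input a, b, c, output y);\n"
                "  assign y = (a %s b) %s c;\n"
                "endmodule\n" % (g1, g2))

    def score(genome):
        with open("candidate.v", "w") as f:
            f.write(render_verilog(genome))
        # Placeholder harness: assumed to print a numeric score.
        result = subprocess.run(["run_testbench", "candidate.v"],
                                capture_output=True)
        return float(result.stdout or 0)

    print(render_verilog([random.randrange(3) for _ in range(2)]))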


John Larkin

Mar 4, 2016, 9:02:17 PM
The "genetic fpga" noted here failed when unconnected gates were
deleted, and couldn't be reproduced on another chip. Clearly the
designers (technically, I suppose, an evolutionary design has no
designers) didn't do it right.

The vast majority of async logic designs aren't done right. Even clock
domain crossings in synchronous FPGA designs tend to be hazards.

John Larkin

Mar 4, 2016, 9:08:39 PM
On Fri, 4 Mar 2016 18:07:28 -0500, bitrex
And I don't think that will work.

"it works" is one part of the problem. An hour of compile, followed by
hours of exhaustive test benching, for every one of trillions of
genetically-created permutations, could get ugly.
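
The back-of-envelope arithmetic, with every figure invented purely for
scale:

    hours_per_candidate = 1 + 4      # compile, then test benching, say
    candidates = 1e12                # "trillions"
    years = candidates * hours_per_candidate / (24 * 365)
    print("%.1e years" % years)      # ~5.7e+08 years, run sequentially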

bill....@ieee.org

Mar 4, 2016, 9:43:52 PM
> >Ahh, AFAICT natural selection is fairly chaotic,
> >breeding... choosing what you want, is much faster.
> >(I didn't read the whole post...)
>
> Random mutation and selection is a terrible way to design software or
> logic. The numbers are not promising.
>
> I suspect that nature doesn't use mutation+selection either; it's too
> inefficient.

Pity about the efficiency, but it does get there in the end. Your superior scheme depended on the DNA knowing when it was mutated, which doesn't happen to be possible.

> There are periodic bursts of enthusiasm for automatic design. So far,
> none seem to work. Give it another few hundred years and we'll see.

Something will probably evolve. Biology does seem to have evolved us, and some of us are potentially capable of doing intelligent design - though it does depend on acquiring more background knowledge than you seem to have bothered to pick up.

--
Bill Sloman, Sydney

bill....@ieee.org

Mar 4, 2016, 9:45:40 PM
In John Larkin's case, transformers. If he can't buy one to do a job he won't even try to think about designing one.

--
Bill Sloman, Sydney

Tom Del Rosso

Mar 4, 2016, 10:33:19 PM
Allan Herriman wrote:
>
> That was in a Xilinx 6200 family part. Xilinx bought and then
> discontinued that line of parts in the 1990s.
> They were unique (at the time) in that the bitstream format was fully
> specified, allowing researchers to write tools that manipulated the
> configuration directly.

Unique only at the time? Do you mean modern ones are fully specified
too?

--


rickman

Mar 5, 2016, 1:30:13 AM
Not sure what your point is. I don't think genetic algorithms can run a
pig farm either. So?

BTW, they have algorithms to learn what you like to listen to and send
more like that your way.

--

Rick

rickman

Mar 5, 2016, 1:35:46 AM
I'm not sure what "fully specified" is intended to mean. If FPGAs are
not fully specified how does *anyone* generate a bitstream for them?

I think what is meant is that there was *no* combination of
configuration bits that would cause damage to the part, so the bits
could be twiddled at will and tested in a real chip with no harm.

That's not really the best way to do genetic design, I think, since the
designs can depend on arbitrary and unspecified features of the chip,
sometimes of the individual chip. Better to do the modifications at the
circuit level and the selection in a simulator, where only the
documented features have an effect. But then I haven't done any of this
work, so I don't know many of the details of how it is done. Maybe this
idea just isn't practical.
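
A minimal sketch of that circuit-level approach: the genome is a
netlist of documented primitives, mutation retypes a gate, and fitness
would come from behavioural simulation, so nothing can depend on the
quirks of one physical chip. All names here are invented for
illustration:

    import random

    GATES = {"AND":  lambda a, b: a & b,
             "OR":   lambda a, b: a | b,
             "XOR":  lambda a, b: a ^ b,
             "NAND": lambda a, b: 1 - (a & b)}

    def simulate(netlist, inputs):
        # Behavioural evaluation: only documented gate functions exist.
        signals = list(inputs)
        for gate, i, j in netlist:       # i, j index earlier signals
            signals.append(GATES[gate](signals[i], signals[j]))
        return signals[-1]               # last gate drives the output

    def mutate(netlist):
        # Swap the type of one randomly chosen gate, keep the wiring.
        new = list(netlist)
        k = random.randrange(len(new))
        gate, i, j = new[k]
        new[k] = (random.choice(list(GATES)), i, j)
        return new

    # Example: three inputs, two gates; fitness would count truth-table
    # rows where simulate() matches the desired function.
    net = [("AND", 0, 1), ("XOR", 2, 3)]
    print(simulate(net, [1, 0, 1]))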

--

Rick

rickman

Mar 5, 2016, 1:38:55 AM
Again, you know little about this. If people don't do async designs
right, they are just people doing designs they don't know how to do,
like many at some point in their careers. Async design is completely
practical; it is just detailed work and often of little benefit.

Clock domain crossings in ASICs, FPGAs or any other logic are well
understood and are simple to deal with. Again, you need to actually
know about them to do them right... that's all.

--

Rick

rickman

Mar 5, 2016, 1:42:53 AM
That is exactly the point. Constrain the problem correctly and you can
get past the more trivial problems of previous attempts at genetic
programming. I'm still not sure it will produce great designs, but part
of the reason people are doing this is to study the designs and find
potentially new ideas in how to design circuits.

Human thought is limited and often takes only very small steps away from
anything done before. Genetic design can potentially uncover completely
new aspects of design.

--

Rick

Allan Herriman

Mar 5, 2016, 2:09:40 AM
On Sat, 05 Mar 2016 01:35:41 -0500, rickman wrote:

> On 3/4/2016 10:33 PM, Tom Del Rosso wrote:
>> Allan Herriman wrote:
>>>
>>> That was in a Xilinx 6200 family part. Xilinx bought and then
>>> discontinued that line of parts in the 1990s.
>>> They were unique (at the time) in that the bitstream format was fully
>>> specified, allowing researchers to write tools that manipulated the
>>> configuration directly.
>>
>> Unique only at the time? Do you mean modern ones are fully specified
>> too?
>
> I'm not sure what "fully specified" is intended to mean. If FPGAs are
> not fully specified how does *anyone* generate a bitstream for them?

Fully specified = an end user (not a Xilinx employee) could read a
datasheet that described the function of every bit in the configuration
and every programmable feature of the chip.

This still isn't done for the "big two" companies (Xilinx and Intel/
Altera).

I believe it is done for some Lattice devices. I haven't used them (or
the resulting open source toolchain) though.


> I think what is meant is that there was *no* combination of
> configuration bits that would cause damage to the part, so the bits
>> could be twiddled at will and tested in a real chip with no harm.

Nothing I said could be misconstrued to mean that. In fact, it was
possible to destroy that chip with a bad configuration (by e.g.
connecting VCC to GND via many routes at the same time). The researchers
had to include a pass in their genetic compiler (after the sexing and
mutating) that would fix design rule violations.

Regards,
Allan

Allan Herriman

Mar 5, 2016, 2:22:21 AM
It wasn't portable or practical, but I don't think that detracts from the
wonder of a design that managed to be a frequency discriminator in an
ostensibly digital device without using a clock.

Allan

bitrex

Mar 5, 2016, 2:31:52 AM
Bill Sloman = snarkier than a 29 year old female craft beer aficionado
from Cambridge, Massachusetts with a PhD in comparative literature

John Larkin

Mar 5, 2016, 4:51:58 PM
On 05 Mar 2016 07:22:17 GMT, Allan Herriman
That shouldn't be difficult. What's more difficult would be to make it
reproducible.

rickman

Mar 5, 2016, 7:43:48 PM
On 3/5/2016 2:09 AM, Allan Herriman wrote:
> On Sat, 05 Mar 2016 01:35:41 -0500, rickman wrote:
>
>> On 3/4/2016 10:33 PM, Tom Del Rosso wrote:
>>> Allan Herriman wrote:
>>>>
>>>> That was in a Xilinx 6200 family part. Xilinx bought and then
>>>> discontinued that line of parts in the 1990s.
>>>> They were unique (at the time) in that the bitstream format was fully
>>>> specified, allowing researchers to write tools that manipulated the
>>>> configuration directly.
>>>
>>> Unique only at the time? Do you mean modern ones are fully specified
>>> too?
>>
>> I'm not sure what "fully specified" is intended to mean. If FPGAs are
>> not fully specified how does *anyone* generate a bitstream for them?
>
> Fully specified = an end user (not a Xilinx employee) could read a
> datasheet that described the function of every bit in the configuration
> and every programmable feature of the chip.
>
> This still isn't done for the "big two" companies (Xilinx and Intel/
> Altera).
>
> I believe it is done for some Lattice devices. I haven't used them (or
> the resulting open source toolchain) though.

No, I believe Lattice does not publish specs on the bitstream. But they
make some smaller, simpler devices that allowed a brute force analysis
to figure it out... at least well enough to produce a working bit stream
for most designs.


>> I think what is meant is that there was *no* combination of
>> configuration bits that would cause damage to the part, so the bits
>> could be twiddled at will and tested in a real chip with no harm.
>
> Nothing I said could be misconstrued to mean that. In fact, it was
> possible to destroy that chip with a bad configuration (by e.g.
> connecting VCC to GND via many routes at the same time). The researchers
> had to include a pass in their genetic compiler (after the sexing and
> mutating) that would fix design rule violations.

I don't recall that being the case. I remember this being discussed in
c.a.fpga and that was mentioned as a key feature, no damage possible.
However, it was some 15 or 20 years ago. So I may not be remembering it
properly.

The "right" way to do genetic design is to control documented chip
features through a tool, not by twiddling bits in a configuration file.
That lets you bypass an entire level of rather rude errors that don't
need to be tried, allowing the design to be tested on any FPGA hardware
without damage. I seriously doubt genetic development stopped when they
stopped making the XC6000 devices.

--

Rick

bill....@ieee.org

Mar 5, 2016, 9:26:33 PM
Whoever she is, she sounds like fun. What did you do to her to get her snark-engine running? My wife did a post-doc at MIT, and used to like real beer ...

--
Bill Sloman, Sydney

Allan Herriman

Mar 5, 2016, 10:18:30 PM
I thought the whole point of the original (1990s) article was that there
is no "right" way to do genetic design and that by twiddling
configuration bits directly, the "design" process was able to exploit
undocumented features in a way that would never be dreamed up by a human
designer.

I may not be remembering it properly either.

BTW, this appears to be the original paper from 1996:
<http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.50.9691&rep=rep1&type=pdf>


Regards,
Allan

Tom Del Rosso

Mar 6, 2016, 5:07:46 AM
Allan Herriman wrote:
>
> I thought the whole point of the original (1990s) article was that
> there is no "right" way to do genetic design and that by twiddling
> configuration bits directly, the "design" process was able to exploit
> undocumented features in a way that would never be dreamed up by a
> human designer.

Of course, if you intend to randomize it, then the bitstream doesn't
even need to be documented.

--


rickman

Mar 6, 2016, 9:47:35 AM
Except for the damage thing. It has been indicated clearly that most
FPGAs will fry themselves if configured by a random bit stream. I think
most have some sort of checksum on the header to prevent a random stream
of bits from being loaded. So you would at least need to deal with the
format.

Allan may well be right that the XC6000 series was used because they
could scan bit streams and prevent destruction.

--

Rick

Lasse Langwadt Christensen

Mar 6, 2016, 10:02:47 AM
I believe that series was special in that all bit streams were safe.

-Lasse

rickman

Mar 6, 2016, 10:06:17 AM
That's what I remember, but a quick scan by Google didn't find anything
to support that. It was some 15-20 years ago so I can't recall for sure.

--

Rick