
PSoC or FPGA?


fasf

Mar 20, 2011, 5:10:35 AM
Hi,
I've worked with 8-bit/32-bit microcontrollers for 5 years and now I want
to explore new fields, so I'm interested in these two solutions: PSoC
and FPGA. I'm totally new to both (I know neither VHDL nor Verilog) and
I don't understand the differences between these programmable devices.
For a hobby, which do you think is more useful to study? Can you suggest
a low-cost dev kit?
Any getting-started suggestions would be very appreciated.

Frank Buss

Mar 20, 2011, 7:20:49 AM
fasf wrote:

Depends on what you want to do with it. Maybe an FPGA would be easier for
you, if your background with microcontrollers is mostly digital.

I already have some old FPGA kits, like the Spartan starter kit from
Xilinx, but Terasic has a nice new one, with the new Cyclone IV, some
other useful hardware, and an integrated USB Blaster for programming it,
for $79:

http://www.terasic.com.tw/cgi-bin/page/archive.pl?Language=English&CategoryNo=139&No=593

I've ordered some of these boards; they'll start shipping them next week.

You can use NIOS for it; Altera has changed their pricing policy and the
economy version of NIOS II is free now:

http://www.altera.com/products/ip/processors/nios2/cores/economy/ni2-economy-core.html

Maybe you should start with a simple LED-blinking project in VHDL to
learn the basics first, but then NIOS would be a good starting point
for more complex projects, because you start with a microcontroller and
then you can add your own VHDL or Verilog entities, which can be
controlled by the microcontroller. Quartus Web Pack, including the NIOS
IDE and ModelSim, is free, too, so this is a very inexpensive start with
a powerful system. 22,000 LEs is enough that you can instantiate
multiple NIOS II cores inside the FPGA without problems.
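
For example, a first LED-blinking project in VHDL can be as small as this
(just a sketch; the entity name, port names and the 50 MHz clock are
placeholders, not tied to any particular board):

library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity led_blink is
  generic (
    CLK_HZ : natural := 50_000_000  -- assumed board clock frequency
  );
  port (
    clk : in  std_logic;
    led : out std_logic
  );
end entity;

architecture rtl of led_blink is
  signal counter : unsigned(25 downto 0) := (others => '0');  -- wide enough for a 50 MHz count
  signal led_r   : std_logic := '0';
begin
  led <= led_r;

  process(clk)
  begin
    if rising_edge(clk) then
      if counter = CLK_HZ / 2 - 1 then
        counter <= (others => '0');
        led_r   <= not led_r;  -- toggle twice per second -> 1 Hz blink
      else
        counter <= counter + 1;
      end if;
    end if;
  end process;
end architecture;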

Xilinx's ISE WebPack is another good free IDE, and they have some low-cost
CPLDs, like the XC9572.

The Cypress PSoC is useful if you want to do mixed-signal processing.
The latest PSoC 5 family has many useful components, like a 20-bit ADC,
DACs, integrated opamps, muxes, SPI, I2C, UART, USB etc., all routable to
any pins, with schematic entry in the IDE, and a fast ARM CPU core.
I'm one of the winners of the first round of the Cypress challenge:

http://www.cypress.com/?id=3271

and they gave their latest kit to all the winners for free. But you
can't buy it at the moment. Maybe they are using the contest as a cheap
beta test :-) but it is a really great kit with a powerful IDE:

http://gwdevprojects.blogspot.com/2011/03/cypress-psoc-5-cy8ckit-050-with.html

--
Frank Buss, http://www.frank-buss.de
piano and more: http://www.youtube.com/user/frankbuss

k...@att.bizzzzzzzzzzzz

Mar 20, 2011, 1:19:43 PM
On Sun, 20 Mar 2011 02:10:35 -0700 (PDT), fasf <silusi...@gmail.com> wrote:

>Hi,
>i've worked wit 8bit/32bit microcontrollers for 5 years and now i want
>to explore new fields, so i'm interested in these two solutions: PSoC
>and FPGA. I'm totally new (i don't know neither VHDL nor Verilog) and
>i don't understand the differences between these programmable devices.
>For hobby, what do you think is more useful to study?

FPGA, without a doubt. PSoCs are interesting but have a very limited market.

>Can you suggest a low cost DevKit?

There are hundreds of them around. I would suggest one of the Altera kits,
only because their software is better than Xilinx's and Altera is more
"mainstream" than the third-place vendors. Some put a lot of weight on
experience with a particular vendor.

>Any getting started suggestion will be very appreciated

For FPGAs, I'd suggest VHDL. Verilog seems to be more popular in the ASIC
world and VHDL in the FPGA world. Both would be helpful, but not at once. Do
a typical "stoplight" design, then concentrate on simulation. VHDL for design
is easy; simulation is where it's at. Don't think you can skip simulation
for anything more complex than a blinking light.
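
Something along these lines, as a sketch (state names and timings are
arbitrary, and the one-pulse-per-second "tick" is assumed to come from a
separate prescaler):

library ieee;
use ieee.std_logic_1164.all;

entity stoplight is
  port (
    clk    : in  std_logic;
    reset  : in  std_logic;
    tick   : in  std_logic;  -- one pulse per "second" from a prescaler
    red    : out std_logic;
    yellow : out std_logic;
    green  : out std_logic
  );
end entity;

architecture rtl of stoplight is
  type state_t is (S_RED, S_GREEN, S_YELLOW);
  signal state : state_t := S_RED;
  signal count : integer range 0 to 15 := 0;
begin
  process(clk)
  begin
    if rising_edge(clk) then
      if reset = '1' then
        state <= S_RED;
        count <= 0;
      elsif tick = '1' then
        count <= count + 1;  -- overridden below on a state change
        case state is
          when S_RED    => if count = 9 then state <= S_GREEN;  count <= 0; end if;
          when S_GREEN  => if count = 7 then state <= S_YELLOW; count <= 0; end if;
          when S_YELLOW => if count = 2 then state <= S_RED;    count <= 0; end if;
        end case;
      end if;
    end if;
  end process;

  red    <= '1' when state = S_RED    else '0';
  yellow <= '1' when state = S_YELLOW else '0';
  green  <= '1' when state = S_GREEN  else '0';
end architecture;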

Jan Panteltje

Mar 20, 2011, 2:14:25 PM
On a sunny day (Sun, 20 Mar 2011 12:19:43 -0500) it happened
"k...@att.bizzzzzzzzzzzz" <k...@att.bizzzzzzzzzzzz> wrote in
<oddco697c3t6e9aj6...@4ax.com>:

Just for fun I did a frequency counter, an LCD display driver, and a smart card reader,
all without simulation, on a Spartan 2.
No problem.
I used iverilog to test some crypto, and then used simulation to test it further.
I did the TV up-converter (15625 Hz to VGA) without simulation, and the sync separator / slicer too,
and the video filter.
But I do not use MPLAB for PICs either... I just write the asm directly.
Faster and better that way.

There is too much simulation in this world.
NASA has changed from a manned space agency to one that sells the public simulations of astronauts landing on Mars.
Zillions are spent on supercomputers that do simulations of problems that any sane person can work out with eyes closed in a blink.
Beware of simulations.
Reality always rules; use a scope.
But simulations can sometimes be cheaper and avoid lead poisoning from soldering.
I mean, I am sure they ran simulations for those nuke plants too.

k...@att.bizzzzzzzzzzzz

Mar 20, 2011, 4:16:14 PM
On Sun, 20 Mar 2011 18:14:25 GMT, Jan Panteltje <pNaonSt...@yahoo.com>
wrote:


>
>There is too much simulations in this world.

You needn't say anything else. At least you're consistent; clueless.

Nico Coesel

Mar 20, 2011, 4:15:56 PM
"k...@att.bizzzzzzzzzzzz" <k...@att.bizzzzzzzzzzzz> wrote:

I never used a simulator for FPGA design, and I did quite a few designs
over the past 10 years. I've always used a logic analyzer and a debug
bus which can output internal FPGA signals. The use of a simulator is
limited because you also have to model the outside world accurately.
In my experience most of the problems are in interfacing with the
(asynchronous) outside world.

Having typed that, I must say that I'm interested in learning how to
use a simulator for FPGA design to see if it really can shorten
development time. I just haven't had the time yet. Simulating
electronic circuits is an art in itself. I guess simulating FPGA
designs is no different, given the number of forum postings saying 'it
works in the simulator but not on my board'.

--
Failure does not prove something is impossible, failure simply
indicates you are not using the right tools...
nico@nctdevpuntnl (punt=.)
--------------------------------------------------------------

k...@att.bizzzzzzzzzzzz

Mar 20, 2011, 5:05:35 PM

It's a *lot* easier to simulate, even for simple designs. There is no way you're
going to get anything of any complexity working with just internal debug
ports. A simulator can also test scenarios that are very difficult to force
in the real world.

Most of my problems with the outside world are because whatever I'm interfacing
to is not well documented. Asynchronous interfaces shouldn't be a major
problem if the design is right. Modern FPGAs are quite fast.

>Having typed that I must say that I'm interested in learning how to
>use a simulator for FPGA design to see if it really can shorten
>development time. I just didn't had the time yet.

That's what they all say. ;-) It's common to rush into the design phase.
FPGAs are no different here, except that you really can't put a scope on
internal nodes to see what's happening.

>Simulating
>electronic circuits is an art in itself. I guess simulating FPGA
>designs is no different given the number of forum postings saying 'it
>works in the simulator but not on my board'.

Simulation is an art. I did a *lot* of it: six years in design verification
on PPC processors (Nintendo and Apple processors). I certainly don't go into
that level of detail with my FPGA simulation, but I do a fair bit. VHDL for
simulation is *much* different from that used for synthesis. In some ways it's
easier, but the subset of the language you can use is far larger. Making a
useful testbench is a lot of work, but it pays off in anything but the most
trivial designs.
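
A bare-bones testbench skeleton looks something like this (a self-contained
sketch; the single register here just stands in for whatever DUT you would
actually instantiate):

library ieee;
use ieee.std_logic_1164.all;

entity tb_sketch is
end entity;

architecture sim of tb_sketch is
  signal clk  : std_logic := '0';
  signal din  : std_logic := '0';
  signal dout : std_logic := '0';
begin
  clk <= not clk after 10 ns;  -- free-running 50 MHz clock, simulation only

  -- stand-in for the real DUT: a single register (replace with your own entity)
  dut_standin : process(clk)
  begin
    if rising_edge(clk) then
      dout <= din;
    end if;
  end process;

  stimulus : process
  begin
    din <= '1';
    wait until rising_edge(clk);
    wait for 1 ns;  -- let dout update just past the clock edge
    assert dout = '1'
      report "dout did not follow din" severity error;
    report "testbench finished" severity note;
    wait;  -- stop the stimulus process
  end process;
end architecture;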

Nico Coesel

Mar 20, 2011, 6:45:45 PM
"k...@att.bizzzzzzzzzzzz" <k...@att.bizzzzzzzzzzzz> wrote:

For more complicated signal processing I prototype in C first. That's
much faster than simulating. Verifying the logic afterwards is a piece
of cake if you implement some resources for self tests.

>Most of my problems with the outside world is because whatever I'm interfacing
>to is not well documented. Asynchronous interfaces shouldn't be a major
>problem if the design is right. Modern FPGAs are quite fast.
>
>>Having typed that I must say that I'm interested in learning how to
>>use a simulator for FPGA design to see if it really can shorten
>>development time. I just didn't had the time yet.
>
>That's what they all say. ;-) It's common to rush into the design phase.
>FPGAs are no different, here, except that you really can't tie a scope on
>nodes to see what's happening.

If you create a debug unit, you can output the most important signals.
Software can also be used to read the status of the FPGA logic. This
will also allow for debugging later on.
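
Roughly like this, as a sketch (signal names and widths are just examples):

library ieee;
use ieee.std_logic_1164.all;

entity debug_mux is
  port (
    debug_sel  : in  std_logic_vector(1 downto 0);  -- set via jumpers or a software register
    state_bits : in  std_logic_vector(7 downto 0);  -- internal signals of interest
    fifo_flags : in  std_logic_vector(7 downto 0);
    bus_status : in  std_logic_vector(7 downto 0);
    debug_pins : out std_logic_vector(7 downto 0)   -- routed to spare pins / logic analyzer
  );
end entity;

architecture rtl of debug_mux is
begin
  with debug_sel select
    debug_pins <= state_bits when "00",
                  fifo_flags when "01",
                  bus_status when others;
end architecture;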

A few years ago I had to debug a PCI-based design which sometimes got
stuck when a computer of a specific type/brand rebooted. It turned out
a state machine didn't reset properly. This problem was triggered
because the computer didn't unload the drivers before rebooting. The
internal debug facilities of the design were very useful for tracking
down the problem with a logic analyzer.

You can simulate a lot but creating a set of tests which covers all
the possible mayhem coming from the outside world is next to
impossible.

hrh1818

Mar 20, 2011, 8:31:29 PM

You don't need to make a choice between PSoC and FPGA. Actel sells
SmartFusion. SmartFusion intelligent mixed-signal FPGAs are the only
devices that integrate an FPGA, an ARM® Cortex™-M3, and programmable
analog, offering full customization. For an overview see:
http://www.actel.com/products/SmartFusion/default.aspx

An evaluation kit is available for $99.00. See:
http://www.actel.com/products/hardware/devkits_boards/smartfusion_eval.aspx

Howard

John - KD5YI

Mar 20, 2011, 8:49:59 PM


Well, I will put in a plug for Cypress' PSoC. I've been using them for
years. However, if you need blazing speed, you will need to go the FPGA
route as the PSoC is nothing more than a microcontroller that has some
analog goodies.

I like the PSoC because it is extremely configurable/reconfigurable to
almost anything you need. An example that Cypress came out with was a
soft drink machine that did the usual coin counting/dispensing during
the day; at night it reconfigured into a DTMF dialer to call the home
office, then reconfigured into a 300 bps modem to upload data and then
download instructions. At one company where I worked, the PSoC replaced
three other microcontrollers because of its flexibility.

The scheme uses registers to set up counters, timers, amplifiers,
input/output pins, filters, and lots of other stuff. Most of the setup is
done in the GUI provided for free by Cypress. So, to get the chip to
reconfigure itself, all you need to do is change the register settings.
Even that can be handled by the GUI. Or you can do it yourself.

The learning curve is steep with Cypress' PSoC 1. It is possibly shorter
with the PSoC 3 and 5 due to the schematic-type setup.

I have no financial connection with Cypress. I've just been using their
PSoC chips since about 2002.

Cheers,
John

Muzaffer Kal

Mar 20, 2011, 11:20:25 PM
On Sun, 20 Mar 2011 22:45:45 GMT, ni...@puntnl.niks (Nico Coesel)
wrote:
...

>You can simulate a lot but creating a set of tests which covers all
>the possible mayhem coming from the outside world is next to
>impossible.

This suggests that making an ASIC work right the first time is next to
impossible, but experience shows that it's not. So it's just a matter
of how diligent one is with one's tests.
If you are not simulating, you are leaving a lot on the table with
respect to time (a largish FPGA can take hours to synthesize and place & route),
observability, and controllability of the chip under test.
--
Muzaffer Kal

DSPIA INC.
ASIC/FPGA Design Services

http://www.dspia.com

Jan Panteltje

Mar 21, 2011, 4:41:39 AM
On a sunny day (Sun, 20 Mar 2011 15:16:14 -0500) it happened
"k...@att.bizzzzzzzzzzzz" <k...@att.bizzzzzzzzzzzz> wrote in
<62oco610kr2ullbgc...@4ax.com>:

Take your intelligence, for example.

I do not think we will ever see any code from you;
all you do is boast about how big the things you make are,
insult people, etc.,
while from the time it takes you to do the simplest things I guess you are at the very least incompetent, and at worst a windbag.

Nico Coesel

Mar 21, 2011, 11:44:26 AM
Muzaffer Kal <k...@dspia.com> wrote:

>On Sun, 20 Mar 2011 22:45:45 GMT, ni...@puntnl.niks (Nico Coesel)
>wrote:
>...
>>You can simulate a lot but creating a set of tests which covers all
>>the possible mayhem coming from the outside world is next to
>>impossible.
>
>This suggests that making an ASIC work right the first time is next to
>impossible but experience shows that it's not. So it's just a matter
>of how diligent one is with one's tests.

Well, tell that to the big semiconductor manufacturers :-) Many
microcontrollers and SoCs go through several revisions with bugs. Recently
even Intel got bitten badly.

And there still may be unexpected behaviour coming from the outside
which might not be addressed by the design (incomplete specification).
If you have an ASIC you most probably need to fix the outside world,
if you are using an FPGA (or something else which is programmable) you
most likely end up fixing the FPGA design.

I think FPGA simulation is very useful, but I always got by without it
or used other methods for verification. The maximum size of the
designs I worked on is about 800k to 1M equivalent gates.

Joel Koltner

Mar 21, 2011, 12:58:00 PM
"Jan Panteltje" <pNaonSt...@yahoo.com> wrote in message
news:im5g4j$t06$2...@news.albasani.net...

> Just for fun I did a frequency counter, a LCD display driver, and a smart
> card reader
> all without simulation on a Spartan 2.
> No problem.
> I used iverilog to test some crypto, and then used simulation to test it
> further.
> I did the TV up converter (15625 to VGA) without simulation, and the sync
> separator / slicer too.
> and the video filter.
> But I do not use MPlab for PICs either... just write the asm directly.
> Faster and better that way.
>
> There is too much simulations in this world.

There's also too little simulation in the world.

The projects you described, yeah, they're well within the realm of being
straightforward enough not to bother with any simulation before you just
program up a part and see what happens. But when you get to more complex
systems, even with good designers it rapidly becomes obvious that simulation
more than pays for itself. E.g., some years back I designed a DMA-based
memory controller for 8 slots' worth of SDRAM SIMMs, and since I wanted it to
be reasonably efficient the memory controller could have several memory
accesses in process simultaneously and would also re-order incoming DMA requests
to better utilize SDRAM banks that had already been opened for other
operations. I'm quite certain that without the Micron-provided SDRAM
simulation models I used, I would have spent hundreds of additional hours
trying to get the thing working -- if only due to ending up writing my own
testbenches, since this had a 128-bit-wide data bus running at over 100 MHz,
and I sure didn't have a logic analyzer or anything else that had nearly
enough inputs to try to watch what was going on -- much less make much sense
of it all (since all the accesses were so heavily interleaved).

The reason there's too little simulation in the world is that even among people
who buy into the idea that simulation is a Good Thing, I've seen far too many
cases where the designer (and this applies to those using C/C++ as well as
VHDL/Verilog) writes a testbench that only exercises what the design *should*
do... which is largely worthless; by far the greatest value from testbenches
comes when you purposely try to destroy your design by feeding it an
unrelenting stream of garbage data... or at least a stream designed to abuse
the underlying model as much as possible. :-) (This is how I tested the
SDRAM controller -- feeding it a new randomized request through every single
DMA port on every single clock cycle...)
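
In VHDL the randomized part of such a testbench can be as simple as this
sketch (the request signals are hypothetical, nothing like the real
controller; only the math_real uniform() usage is the standard library call):

library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;
use ieee.math_real.all;

entity random_stim_tb is
end entity;

architecture sim of random_stim_tb is
  signal clk       : std_logic := '0';
  signal req_valid : std_logic := '0';
  signal req_write : std_logic := '0';
  signal req_addr  : unsigned(9 downto 0) := (others => '0');
begin
  clk <= not clk after 5 ns;  -- 100 MHz simulation clock

  -- issue a new randomized request on every clock cycle
  random_requests : process
    variable seed1, seed2 : positive := 1;
    variable rnd          : real;
  begin
    wait until rising_edge(clk);
    uniform(seed1, seed2, rnd);  -- rnd in the open interval (0.0, 1.0)
    req_addr <= to_unsigned(integer(trunc(rnd * 1024.0)), req_addr'length);
    uniform(seed1, seed2, rnd);
    if rnd > 0.5 then
      req_write <= '1';
    else
      req_write <= '0';
    end if;
    req_valid <= '1';
  end process;
end architecture;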

Although one doesn't always have a choice, it's best if someone else writes
the testbench for your design... and that that person's deepest desire is to
make your design fail. :-)

> Reality always rules, use a scope.

For a board level design, sure. For FPGA and (particularly) IC designs, it's
quite expensive.

---Joel

Jan Panteltje

Mar 21, 2011, 4:25:39 PM
On a sunny day (Mon, 21 Mar 2011 09:58:00 -0700) it happened "Joel Koltner"
<zapwireD...@yahoo.com> wrote in
<tsLhp.777261$pX3.4...@en-nntp-11.dc1.easynews.com>:

>> There is too much simulations in this world.
>
>There's also too little simulation in the world.

<snip good reasons given>

>> Reality always rules, use a scope.
>
>For a board level design, sure. For FPGA and (particularly) IC designs, it's
>quite expensive.

Yes, OK, and I use LTspice too...
But what I do not like is where simulations replace what should be reality,
like NASA having simulations of people landing on Mars,
while all they can really do is fly around the block (Earth).
The brain is quite powerful, and it can do most of the simulations that
supercomputers are now used for; those are merely executing some equations
that are always a subset of reality, then tinkered with so they see what they want to see,
and that is then called a new 'theory'.
Or, in electronics, a 'design'.
Simulation is a drug.
What you need is overview, and detailed knowledge of what you are doing.
Bottom-up, modular.
Then nothing is impossible, and the sky is the limit.
Top-down with simulations sucks.

Nico Coesel

Mar 21, 2011, 4:25:49 PM
John - KD5YI <sop...@invalid.org> wrote:

>On 3/20/2011 4:10 AM, fasf wrote:
>> Hi,
>> i've worked wit 8bit/32bit microcontrollers for 5 years and now i want
>> to explore new fields, so i'm interested in these two solutions: PSoC
>> and FPGA. I'm totally new (i don't know neither VHDL nor Verilog) and
>> i don't understand the differences between these programmable devices.
>> For hobby, what do you think is more useful to study? Can you suggest
>> a low cost DevKit?
>> Any getting started suggestion will be very appreciated
>
>
>Well, I will put in a plug for Cypress' PSoC. I've been using them for
>years. However, if you need blazing speed, you will need to go the FPGA
>route as the PSoC is nothing more than a microcontroller that has some
>analog goodies.
>
>I like the PSoC because it is extremely configurable/reconfigurable to
>almost anything you need. An example that Cypress came out with was a
>

>I have no financial connection with Cypress. I've just been using their
>PSoC chips since about 2002.

Looks interesting. I checked the website a bit but couldn't find all I
want to know: how is the analog performance regarding noise and
bandwidth? Can you also use a PSoC as a reconfigurable analog brick
(analog in - analog out)?

Joel Koltner

Mar 21, 2011, 4:31:27 PM
"Jan Panteltje" <pNaonSt...@yahoo.com> wrote in message
news:im8c6e$crk$1...@news.albasani.net...

> Simulations is a drug.
> What you need is overview, and detailed knowledge of what you are doing.
> Bottom up, modular.
> Then nothing is impossible, and the sky the limit.
> Top-down with simulations sucks.

Ever heard this quote, Jan?

--> "Some people, when confronted with a problem, think "I know, I'll use
regular expressions." Now they have two problems."

:-)

Jon Kirwan

Mar 21, 2011, 4:51:36 PM
On Sun, 20 Mar 2011 12:20:49 +0100, Frank Buss
<f...@frank-buss.de> wrote:

>fasf wrote:
>
>> i've worked wit 8bit/32bit microcontrollers for 5 years and now i want
>> to explore new fields, so i'm interested in these two solutions: PSoC
>> and FPGA. I'm totally new (i don't know neither VHDL nor Verilog) and
>> i don't understand the differences between these programmable devices.
>> For hobby, what do you think is more useful to study? Can you suggest
>> a low cost DevKit?
>> Any getting started suggestion will be very appreciated
>
>Depends on what you want to do with it. Maybe a FPGA would be easier for
>you, if you have mostly digital knowlege with microcontrollers.
>
>I already have some old FPGA kits, like the Spartan starterkit from
>Xilinx, but Terasic has a nice new one, with the new Cyclone IV and some
>useful other hardware and integrated USB blaster for programming it, for
>$79:
>
>http://www.terasic.com.tw/cgi-bin/page/archive.pl?Language=English&CategoryNo=139&No=593
>
>I've ordered some of these boards, they'll start shipping it next week.

That looks very interesting and at a very good price,
considering the plexi cover and all the rest.

>You can use NIOS for it, Altera has changed their price policy and the
>economy version of NIOS II is free now:
>
>http://www.altera.com/products/ip/processors/nios2/cores/economy/ni2-economy-core.html

><snip>

That's good. What about the tools needed to load, modify,
and recompile with it? It looks like the EDS is free:

http://www.altera.com/products/ip/processors/nios2/tools/ni2-development_tools.html

But have you gone through the steps? Is the Eclipse IDE also
free? (I'm not sure yet from a quick scan, but need to spend
more time I suppose.)

Jon

Phil Hobbs

Mar 21, 2011, 5:23:34 PM

Taking a simulation and poking it until it sort-of works is a recipe for
mediocre performance, at best. On the other hand, board turns and the
attendant delays are expensive. In analogue, it's best to design stuff
by hand and simulate to verify (and maybe optimize some badly-behaved
stuff that's hard to do by algebra, e.g. parametric effects).

I'm a big fan of debuggers for code, because that makes it faster to
explore all the branches and reduces the number of bugs that make it
into the version control system. I've never done an FPGA, but I suspect
they might be similar in that respect.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal
ElectroOptical Innovations
55 Orchard Rd
Briarcliff Manor NY 10510
845-480-2058

email: hobbs (atsign) electrooptical (period) net
http://electrooptical.net

k...@att.bizzzzzzzzzzzz

Mar 21, 2011, 7:13:19 PM
On Mon, 21 Mar 2011 08:41:39 GMT, Jan Panteltje <pNaonSt...@yahoo.com>
wrote:

>On a sunny day (Sun, 20 Mar 2011 15:16:14 -0500) it happened
>"k...@att.bizzzzzzzzzzzz" <k...@att.bizzzzzzzzzzzz> wrote in
><62oco610kr2ullbgc...@4ax.com>:
>
>>On Sun, 20 Mar 2011 18:14:25 GMT, Jan Panteltje <pNaonSt...@yahoo.com>
>>wrote:
>>
>>
>>>
>>>There is too much simulations in this world.
>>
>>You needn't say anything else. At least you're consistent; clueless.
>
>Take your intelligence for example.
>
>I do not think we will ever see any code from you,

Wow! You got that right. I don't WRITE CODE, moron.

>all you do is boast about how big the things are you make,
>insult people, etc,
>while from the time it takes you to do the simplest things I guess you are at least incompetent, and at most a windbag.

You sure are a useless twit, JP.

k...@att.bizzzzzzzzzzzz

Mar 21, 2011, 7:20:13 PM
On Mon, 21 Mar 2011 15:44:26 GMT, ni...@puntnl.niks (Nico Coesel) wrote:

>Muzaffer Kal <k...@dspia.com> wrote:
>
>>On Sun, 20 Mar 2011 22:45:45 GMT, ni...@puntnl.niks (Nico Coesel)
>>wrote:
>>...
>>>You can simulate a lot but creating a set of tests which covers all
>>>the possible mayhem coming from the outside world is next to
>>>impossible.
>>
>>This suggests that making an ASIC work right the first time is next to
>>impossible but experience shows that it's not. So it's just a matter
>>of how diligent one is with one's tests.
>
>Well, tell that to the big semi manufacturers :-)

Um, I worked for one. I was on the verification team (a department of twelve)
for some rather "big" processors.

>Many
>microcontrollers and SoCs have several versions with bugs. Recently
>even Intel got bitten badly.

Are you saying that you think simulation should make perfection automatic? No,
there are too many things that go unsimulated. That is, too little
simulation, not too much.

>And there still may be unexpected behaviour coming from the outside
>which might not be addressed by the design (incomplete specification).
>If you have an ASIC you most probably need to fix the outside world,
>if you are using an FPGA (or something else which is programmable) you
>most likely end up fixing the FPGA design.

Duh!

>I think FPGA simulation is very usefull but I always got by without it
>or used other methods for verification. The maximum size of the
>designs I worked on is about 800k to 1M equivalent gates.

How many flops? "Equivalent gates" is a meaningless number.

k...@att.bizzzzzzzzzzzz

Mar 21, 2011, 7:23:18 PM
On Sun, 20 Mar 2011 17:31:29 -0700 (PDT), hrh1818 <hr...@att.net> wrote:

>On Mar 20, 4:10 am, fasf <silusilus...@gmail.com> wrote:
>> Hi,
>> i've worked wit 8bit/32bit microcontrollers for 5 years and now i want
>> to explore new fields, so i'm interested in these two solutions: PSoC
>> and FPGA. I'm totally new (i don't know neither VHDL nor Verilog) and
>> i don't understand the differences between these programmable devices.
>> For hobby, what do you think is more useful to study? Can you suggest
>> a low cost DevKit?
>> Any getting started suggestion will be very appreciated
>
>You don't need to make a choice between PSOC and FPGA. Actel sells
>Smartfusion. SmartFusion intelligent mixed signal FPGAs are the only
>devices that integrate an FPGA, ARM® Cortex™-M3, and programmable
>analog, offering full customization, For an overview see:
>http://www.actel.com/products/SmartFusion/default.aspx

SmartFusion is an FPGA. It's really nothing like a PSoC.

SmartFusion is really slick, but I wouldn't recommend it as an introduction
to FPGAs. If he's looking to get into the business, I'd suggest something
more mainstream than Actel. Too many HR types will look for Altera or Xilinx
keywords (work them into the resume somehow ;-).

k...@att.bizzzzzzzzzzzz

Mar 21, 2011, 7:25:16 PM
On Mon, 21 Mar 2011 20:25:49 GMT, ni...@puntnl.niks (Nico Coesel) wrote:

>John - KD5YI <sop...@invalid.org> wrote:
>
>>On 3/20/2011 4:10 AM, fasf wrote:
>>> Hi,
>>> i've worked wit 8bit/32bit microcontrollers for 5 years and now i want
>>> to explore new fields, so i'm interested in these two solutions: PSoC
>>> and FPGA. I'm totally new (i don't know neither VHDL nor Verilog) and
>>> i don't understand the differences between these programmable devices.
>>> For hobby, what do you think is more useful to study? Can you suggest
>>> a low cost DevKit?
>>> Any getting started suggestion will be very appreciated
>>
>>
>>Well, I will put in a plug for Cypress' PSoC. I've been using them for
>>years. However, if you need blazing speed, you will need to go the FPGA
>>route as the PSoC is nothing more than a microcontroller that has some
>>analog goodies.
>>
>>I like the PSoC because it is extremely configurable/reconfigurable to
>>almost anything you need. An example that Cypress came out with was a
>>
>>I have no financial connection with Cypress. I've just been using their
>>PSoC chips since about 2002.
>
>Looks interesting. I checked the website a bit but couldn't find all I
>want to know: How is the analog performance regarding noise and
>bandwidth? Can you also use a Psoc as a reconfigurable analog brick
>(analog in - analog out)?

I looked into them a year ago, or so. The A/D stuff was mediocre, at best. I
didn't think the analog performance was anything to write home about, either.

Nico Coesel

Mar 21, 2011, 7:31:31 PM
"k...@att.bizzzzzzzzzzzz" <k...@att.bizzzzzzzzzzzz> wrote:

Good question. I think somewhere between 1000 and 2000.

k...@att.bizzzzzzzzzzzz

Mar 21, 2011, 8:08:36 PM

That's a moderate size. I'd never attempt anything even that complex without
a pretty extensive simulation suite. I prefer to spend more time in
simulation than in debug.

Frank Buss

Mar 22, 2011, 2:20:06 AM
Jon Kirwan wrote:

> That's good. What about the tools needed to load, modify,
> and recompile with it? It looks like the EDS is free:
>
> http://www.altera.com/products/ip/processors/nios2/tools/ni2-development_tools.html
>
> But have you gone through the steps? Is the Eclipse IDE also
> free? (I'm not sure yet from a quick scan, but need to spend
> more time I suppose.)

Yes, I have tested it with my old T-Rex board and it works. I can create
a NIOS II CPU with the free NIOS II EDS and Quartus II Web Edition.
The economy version is not encrypted anymore, so you can see all the
generated VHDL code for the CPU (it doesn't look good, as usual for
generated code) and the peripherals. If you choose the larger NIOS II
CPU models, they will be encrypted and time-limited and you have to buy a
license.

But you have to spend some days learning the system. I did a project
some years ago, so I know where the pitfalls are, because the NIOS EDS is
not self-explanatory. E.g. you have to create a Quartus project first,
then call the SOPC Builder function from within it, and after you've
designed your CPU system and peripherals, you have to start Eclipse and
use the SOPC file to create the Eclipse project. But when I tried the
hello-world wizard, it created only the BSP, not the project, so I
created it manually, and then the LED blinked on my T-Rex, after
importing the generated VHDL files manually into the Quartus project.
But it is worth learning, because there are lots of free cores in the
SOPC Builder, which can be added to the system with a mouse click, and once
the Eclipse project works, you can debug the system over JTAG. Maybe it
is easier with a tutorial from Altera; I guess they have some.

My main job is not hardware design, so from a programmer's perspective,
the last time I tried to understand how the NIOS EDS worked, the whole
system looked like a duct-taped system, with lots of shell and Tcl
scripts, Perl code generators, and Makefiles, all called from Java, and the
Makefiles called Java programs again. Perl, shell etc. were running under
Cygwin, which doesn't help the speed. But at least it works and you
can create commercial projects with it. Altera support was good, too,
e.g. they helped me when I tried to do unusual things, like combining
the bitstream configuration file and the NIOS ELF file into the
configuration EEPROM, with my own checksum, which was possible if you
write your own Bash scripts.

Jon Kirwan

Mar 22, 2011, 6:09:44 AM
On Tue, 22 Mar 2011 07:20:06 +0100, Frank Buss
<f...@frank-buss.de> wrote:

>Jon Kirwan wrote:
>
>> That's good. What about the tools needed to load, modify,
>> and recompile with it? It looks like the EDS is free:
>>
>> http://www.altera.com/products/ip/processors/nios2/tools/ni2-development_tools.html
>>
>> But have you gone through the steps? Is the Eclipse IDE also
>> free? (I'm not sure yet from a quick scan, but need to spend
>> more time I suppose.)
>
>Yes, I have tested it with my old T-Rex board and it works. I can create
>a NIOS II CPU with the free NIOS II EDS is and Quartus II Web Edition.
>The economy version is not crypted anymore, so you can see all the
>generated VHDL code for the CPU (doesn't look good, as usual for
>generated code) and the peripherals. If you choose the larger NIOS II
>CPU models, it will be crypted and time limited and you have to buy a
>license.

Okay. I'll take a crack at it. I've got an older Xilinx 4000
series board I used to learn VHDL and _some_ floorplanning
skills for fun. Been a while, though. It is worth another
go. (Never did try Verilog, yet.)

I enjoyed struggling to develop a tiny cpu and achieved some
modest success -- succeeded on a reasonably useful ALU and
learned a little about carry forward vs ripple carry and
about booth's divider, for example.

This sounds like still more cheap fun, though I wonder if
each cell in the Altera devices is as fancy as the Xilinx
4000 cells were.

Question is, do I have time right now? Maybe.

>But you have to spend some days learning the system. I've done a project
>some years ago, so I know where the pitfalls were, because NIOS EDS is
>not self-explanatory. E.g. you have to create a Quartus project, first,
>then call the SOPC Builder function from within and after you've
>designed your CPU system and peripheral, you have to start Eclipse and
>use the SOPC file for creating the Eclipse project. But when I tried the
>hello world wizard, it created the BSP, only, not the project, so I
>created it manually, but then the LED blinked on my T-Rex, after
>importing the generated VHDL files manually into the Quartus project.
>But it is worth to learn it, because there are lots of free cores in the
>SOPC Builder, which can be added by mouse click to the system and once
>the Eclipse project works, you can debug the system over JTAG. Maybe it
>is easier with some tutorial from Altera, I guess they have some.

I may have some questions when the time comes. But the above
helps by giving me some things to check up on.

>My main job is not hardware designing, so from a programmer perspective
>last time I tried to understand how the NIOS EDS worked, the whole
>system looked like a duct-taped system, with lots of Shell and TCL
>scripts, Perl code generators, Makefiles, all called from Java and the
>Makefiles called Java programs again. Perl, Shell etc. was running with
>Cygwin, which doesn't help for the speed. But at least it works and you
>can create commercial projects with it. Altera support was good, too,
>e.g. they helped me when I tried to do unusual things, like combining
>the bitstream configuration file and the NIOS ELF file into the
>configuration EEPROM, with my own checksum, which was possible, if you
>write your own Bash scripts.

Sounds positive. And that's a lot, these days.

Jon

Jan Panteltje

Mar 22, 2011, 6:14:37 AM
On a sunny day (Mon, 21 Mar 2011 18:13:19 -0500) it happened
"k...@att.bizzzzzzzzzzzz" <k...@att.bizzzzzzzzzzzz> wrote:
>Wow! You got that right. I don't WRITE CODE,

Yea, that was clear already.


Jan Panteltje

unread,
Mar 22, 2011, 6:14:37 AM3/22/11
to
On a sunny day (Mon, 21 Mar 2011 17:23:34 -0400) it happened Phil Hobbs
<pcdhSpamM...@electrooptical.net> wrote in
<4D87C1D6...@electrooptical.net>:

>Taking a simulation and poking it until it sort-of works is a recipe for
>mediocre performance, at best. On the other hand, board turns and the
>attendant delays are expensive. In analogue, it's best to design stuff

>by hand and simulate to verify (and maybe optimise some badly-behaved

>stuff that's hard to do by algebra, e.g. parametric effects).
>
>I'm a big fan of debuggers for code, because that makes it faster to
>explore all the branches and reduces the number of bugs that make it
>into the version control system.

I stopped using debuggers in the eighties, after reading a paper from a university
that argued against debuggers in higher-level languages.
It suggested using print statements instead.
Even asm is a high-level language for me; usually the first thing I do is add some routines to print decimal and ASCII
via a serial port or pin for debugging.
Else, in C, printf().
But in C I start most functions with a parameter check and parameter report that can be enabled
by a debug flag set on the command line. So if I run
program -v
then it will print all function calls and all variables passed to those functions.
The important thing is that you can have a user / customer run the program like that if it ever crashes
and send you the output text; the last line will be the function with the incorrect code.
It takes some discipline in coding, but it works 100%.

As for FPGAs, I have not done as much with those as some geniuses here.
The main philosophical difference is that things can happen in parallel,
maybe difficult to grasp if one comes from a sequential programming background,
but for somebody from a hardware background it is just a lot of simple hardware blocks connected together.
With wires.
That means you can build your own collection of blocks, just like library functions.
Issues are clock timing and gate delays, just like ordinary digital logic.
FPGA vendors have their own libraries to solve standard problems, usually in an optimised form.
That means, however, that not all things are easily portable from one make of FPGA to another.
It also means one manufacturer's FPGA may be more suitable for solving a specific problem than another one.
I have not used an FPGA in over a year, so maybe I should shut up on that subject.
Some FPGA manufacturers have the worst software in the universe, except for the Silversoft C compiler I once had
for the Sinclair ZX80; maybe created by the same team? Xilinx comes to mind.

Jan Panteltje

Mar 22, 2011, 6:14:38 AM
On a sunny day (Mon, 21 Mar 2011 13:31:27 -0700) it happened "Joel Koltner"
<zapwireD...@yahoo.com> wrote in
<AAOhp.318155$Mg5.1...@en-nntp-06.dc1.easynews.com>:

A couple of years ago there was a Dutch software company that designed the
luggage handling system for the new UK air terminal.
Their simulation sold it, I think.
But the real version had some serious problems that made the news quite a bit.

Jon Kirwan

Mar 22, 2011, 6:21:16 AM
On Tue, 22 Mar 2011 03:09:44 -0700, I wrote:

><snip>


>learned a little about carry forward vs ripple carry and
>about booth's divider, for example.

><snip>

Sorry. I meant Booth's multiplication method. For the
divider, I tried a technique I found in Analog Devices' book
on the ADSP-21xx family, and non-restoring division, too.

Jon

Warren

Mar 22, 2011, 9:26:45 AM
Jan Panteltje expounded in
news:im9soq$e0a$5...@news.albasani.net:

> On a sunny day (Mon, 21 Mar 2011 17:23:34 -0400) it
> happened Phil Hobbs
> <pcdhSpamM...@electrooptical.net> wrote in
> <4D87C1D6...@electrooptical.net>:
>
>>Taking a simulation and poking it until it sort-of works is
>>a recipe for mediocre performance, at best. On the other
>>hand, board turns and the attendant delays are expensive.
>>In analogue, it's best to design stuff by hand and simulate
>>to verify (and maybe optimise some badly-behaved stuff
>>that's hard to do by algebra, e.g. parametric effects).
>>
>>I'm a big fan of debuggers for code, because that makes it
>>faster to explore all the branches and reduces the number
>>of bugs that make it into the version control system.
>
> I stopped using debuggers in the eighties, after reading a
> paper from university that argued against debuggers in
> higher level languages,. It suggested to use print
> statements.

If you work on large systems, then avoiding debuggers is a
huge waste of time. When you have intermittent or unexplained
failure(s), a debugger saves gobs of time since you don't have
to anticipate what to "print".

When your code core dumps, a debugger will trace back to the
precise point of the failure and allow you to look at
anything related to it.

All of this facility comes with a simple compile option. No
need to code special macros.

In short, you're following some bad and antiquated advice.

Warren

Jan Panteltje

unread,
Mar 22, 2011, 1:54:25 PM3/22/11
to
On a sunny day (Tue, 22 Mar 2011 13:26:45 +0000 (UTC)) it happened Warren
<ve3...@gmail.com> wrote in
<Xns9EB0601772EF2W...@188.40.43.213>:

>If you work on large systems, then avoiding debuggers is a
>huge waste of time. When you have intermittant or unexplained
>failure(s) a debugger saves gobs of time since you don't have
>to anticipate what to "print".
>
>When your code core dumps, a debugger will trace back the
>precise point of the failure and allows you to look at
>anything related to it.
>
>All of this facility comes with a simple compile option. No
>need to code special macros.
>
>In short, you're following some bad and antiquated advice.
>
>Warren

That is a matter of opinion.
I have seen too many programmers staring too long at little windows with register values...
while just some sane coding and understanding of WHAT you were doing (in C),
and print statements in the right place, would have saved hours, and
prevented all that 'searching', and segfaults too.

It is a beginner's idea; I started with a 'debugger'.
From a programming POV, at some point you have to trust the compiler writers;
there is no need to go through the generated asm even.

You can take that too far too.
For example, write in Java because you are scared of pointers, and then
make a product that sucks for speed and performance.
'Oh, let's write this in Java, it will run anywhere.'
NOT.
Coding is not for everyone, but many who have no talent for it do it for a job,
like music made by somebody who cannot play.
For those people, the sluggish languages and C++ were created; too bad it only makes their work worse.
And then people start to believe you need many cores and many GHz to do the simplest things, like email.
And that sells hardware, so those manufacturers will not disturb that dream.
The ones who sell you the next C++ compiler, plus debugger, will not disturb your dream either.
The reality, however, is that you should have a clear mind when you code and not make one
error every line, and spend 30 minutes with a debugger for every line of code you write.
The fable of the 'large project' is just that.
Things should be modular, and C and C libraries are a very good way to do that.
Linux is a good example of a huge project with many distributed developers,
written in C, and in Linux I have never used a debugger, ever.
Oh, I know gdb, I have played with it, but I would rather write code than trace gcc output.
Sure, there are bugs in each gcc version; it is just that those never seem to have wrecked my code.
Some sanity is required.
Not everybody is a programmer, not everybody loves that stuff.
I do not claim to be a programmer, I just write what I need, and somehow enjoy that.
Some others benefit, as I make much of my code available, if I can and it is not NDA stuff.
I know my programs are full of bugs, but they get the job done.
That is the point where my quest for perfection in programming usually ends.
Programs are a tool, not a purpose in themselves, and no tool is absolutely perfect;
try looking for the perfect hammer, I have broken some expensive ones...
Anyway, there is no end to what can be said about that subject, and no limit to all the different viewpoints.
This is just my viewpoint.
So I will leave it at that.

Frank Buss

unread,
Mar 22, 2011, 2:31:10 PM3/22/11
to
Jon Kirwan wrote:

> I may have some questions when the time comes. But the above
> helps by giving me some things to check up on.

comp.arch.fpga is a good newsgroup for FPGA questions. And there are far
fewer off-topic posts than in this newsgroup, maybe because of fewer
retirees and more people doing real work :-)

Jon Kirwan

unread,
Mar 22, 2011, 3:43:23 PM3/22/11
to
On Tue, 22 Mar 2011 19:31:10 +0100, Frank Buss
<f...@frank-buss.de> wrote:

>Jon Kirwan wrote:
>
>> I may have some questions when the time comes. But the above
>> helps by giving me some things to check up on.
>
>comp.arch.fpga is a good newsgroup for FPGA questions. And there are far
>less offtopic posts than in this newsgroup, maybe because of less
>retirees and more people doing real work :-)

Thanks.

Jon

Warren

unread,
Mar 22, 2011, 4:01:43 PM3/22/11
to
Jan Panteltje expounded in
news:imanmt$clr$1...@news.albasani.net:

> On a sunny day (Tue, 22 Mar 2011 13:26:45 +0000 (UTC)) it
> happened Warren <ve3...@gmail.com> wrote in
> <Xns9EB0601772EF2W...@188.40.43.213>:
>
>>If you work on large systems, then avoiding debuggers is a
>>huge waste of time. When you have intermittant or
>>unexplained failure(s) a debugger saves gobs of time since
>>you don't have to anticipate what to "print".
>>
>>When your code core dumps, a debugger will trace back the
>>precise point of the failure and allows you to look at
>>anything related to it.
>>
>>All of this facility comes with a simple compile option. No
>>need to code special macros.
>>
>>In short, you're following some bad and antiquated advice.
>>
>>Warren
>
> That is a matter of opinion.

Yours is in the minority. :)

> I have seen to many programmers staring too long at little
> windows with register values... While just some sane coding
> and understanding WHAT you were doing (in C), and print
> statements in the right place, would have saved hours, and
> prevented all that 'searching', and segfaults too.

That speaks volumes about the programmers-- and nothing about
the value of the debugger. Don't get me wrong- a few carefully
crafted prints can augment a difficult debug.

But to write off a debugger is like saying "I can saw it by
hand, so I don't need no stinkin' table saw".

Warren

k...@att.bizzzzzzzzzzzz

unread,
Mar 22, 2011, 7:24:14 PM3/22/11
to
On Tue, 22 Mar 2011 03:09:44 -0700, Jon Kirwan <jo...@infinitefactors.org>
wrote:

>On Tue, 22 Mar 2011 07:20:06 +0100, Frank Buss
><f...@frank-buss.de> wrote:
>
>>Jon Kirwan wrote:
>>
>>> That's good. What about the tools needed to load, modify,
>>> and recompile with it? It looks like the EDS is free:
>>>
>>> http://www.altera.com/products/ip/processors/nios2/tools/ni2-development_tools.html
>>>
>>> But have you gone through the steps? Is the Eclipse IDE also
>>> free? (I'm not sure yet from a quick scan, but need to spend
>>> more time I suppose.)
>>
>>Yes, I have tested it with my old T-Rex board and it works. I can create
>>a NIOS II CPU with the free NIOS II EDS is and Quartus II Web Edition.
>>The economy version is not crypted anymore, so you can see all the
>>generated VHDL code for the CPU (doesn't look good, as usual for
>>generated code) and the peripherals. If you choose the larger NIOS II
>>CPU models, it will be crypted and time limited and you have to buy a
>>license.
>
>Okay. I'll take a crack at it. I've an older xilinx 4000
>series board I used to learn VHDL and _some_ floorplanning
>skills for fun. Been a while, though. It is worth another
>go. (Never did try verilog, yet.)
>
>I enjoyed struggling to develop a tiny cpu and achieved some
>modest success -- succeeded on a reasonably useful ALU and
>learned a little about carry forward vs ripple carry and
>about booth's divider, for example.

That stuff is interesting in itself but not of much use when designing with
FPGAs. FPGA architecture favors certain types of adders. It's rare that
you'll beat synthesis for an ALU. As for more complex functions like
multipliers and dividers, the manufacturer's libraries are also going to be
hard to beat.

>This sounds like still more cheap fun, though I wonder if
>each cell in the Altera devices are as fancy as the xilinx
>4000 cells were.

Modern devices are *way* beyond where the Xilinx 4Ks were. The Altera and
Xilinx FPGA logic elements are pretty similar (their CPLDs are quite
different).

>Question is, do I have time right now? Maybe.

Always a good question. Priorities.

<...>

k...@att.bizzzzzzzzzzzz

unread,
Mar 22, 2011, 7:24:48 PM3/22/11
to
On Tue, 22 Mar 2011 10:14:37 GMT, Jan Panteltje <pNaonSt...@yahoo.com>
wrote:

>On a sunny day (Mon, 21 Mar 2011 18:13:19 -0500) it happened

Moron, I'm a hardware engineer. Software is for dweebs.

Jon Kirwan

unread,
Mar 22, 2011, 8:51:01 PM3/22/11
to

I am doing none of this for professional work -- I just enjoy
learning, a lot. I want to know how to do things simply
because I enjoy the process. Which means that I'd love to
learn from the libraries of others, but I also need to learn
to walk, first, so that I can place what I learn from
professionals into a better contextual frame.

In any case, this kind of thing is pure joy to me.

>>This sounds like still more cheap fun, though I wonder if
>>each cell in the Altera devices are as fancy as the xilinx
>>4000 cells were.
>
>Modern devices are *way* beyond where the Xilinx 4Ks were. The Altera and
>Xilinx FPGA logic elements are pretty similar (their CPLDs are quite
>different).

I need to revisit, obviously.

>>Question is, do I have time right now? Maybe.
>
>Always a good question. Priorities.

Hehe.

Jon

k...@att.bizzzzzzzzzzzz

Mar 22, 2011, 9:48:11 PM
On Tue, 22 Mar 2011 17:51:01 -0700, Jon Kirwan <jo...@infinitefactors.org>
wrote:

Understood. I just wanted to make the point that while this stuff is useful to
know, it doesn't have much application in FPGAs. ASIC design is a whole
'nuther kettle of fish. FPGAs, in this way, are much more restrictive.

>In any case, this kind of thing is pure joy to me.

It's even more fun when you can get someone to pay you to do it. ;-)

>>>This sounds like still more cheap fun, though I wonder if
>>>each cell in the Altera devices are as fancy as the xilinx
>>>4000 cells were.
>>
>>Modern devices are *way* beyond where the Xilinx 4Ks were. The Altera and
>>Xilinx FPGA logic elements are pretty similar (their CPLDs are quite
>>different).
>
>I need to revisit, obviously.

Indeed. The first time I saw carry chains, I said "yech, carry chains?". I
finally had to suck it up and admit that a ripple counter was faster than the
fancier counters, so why knock myself out. Besides, the carry chains are
quite handy for other functions. ;-)

>>>Question is, do I have time right now? Maybe.
>>
>>Always a good question. Priorities.
>
>Hehe.

;-)

Jon Kirwan

unread,
Mar 23, 2011, 4:06:51 AM3/23/11
to
On Tue, 22 Mar 2011 20:48:11 -0500, "k...@att.bizzzzzzzzzzzz"
<k...@att.bizzzzzzzzzzzz> wrote:

><snip>


>It's even more fun when you can get someone to pay you to do it. ;-)

><snip>

I'm getting paid to do stuff I'd do for free, or even pay
others to allow me to do. If I got good enough to get paid
for writing VHDL code and then added that to the list, I'd
probably die from forgetting to eat out of the pure excess
pleasure of it all. I'm far too lucky as it is. ;)

Jon

Jan Panteltje

unread,
Mar 23, 2011, 4:27:03 AM3/23/11
to
On a sunny day (Tue, 22 Mar 2011 20:01:43 +0000 (UTC)) it happened Warren
<ve3...@gmail.com> wrote in
<Xns9EB0A30D9BE11W...@81.169.183.62>:

>Jan Panteltje expounded in
>news:imanmt$clr$1...@news.albasani.net:
>
>> On a sunny day (Tue, 22 Mar 2011 13:26:45 +0000 (UTC)) it
>> happened Warren <ve3...@gmail.com> wrote in
>> <Xns9EB0601772EF2W...@188.40.43.213>:
>>
>>>If you work on large systems, then avoiding debuggers is a
>>>huge waste of time. When you have intermittant or
>>>unexplained failure(s) a debugger saves gobs of time since
>>>you don't have to anticipate what to "print".
>>>
>>>When your code core dumps, a debugger will trace back the
>>>precise point of the failure and allows you to look at
>>>anything related to it.
>>>
>>>All of this facility comes with a simple compile option. No
>>>need to code special macros.
>>>
>>>In short, you're following some bad and antiquated advice.
>>>
>>>Warren
>>
>> That is a matter of opinion.
>
>Your's is in the minority. :)

Better a sane minority in the mad house.


>> I have seen to many programmers staring too long at little
>> windows with register values... While just some sane coding
>> and understanding WHAT you were doing (in C), and print
>> statements in the right place, would have saved hours, and
>> prevented all that 'searching', and segfaults too.
>
>That speaks volumes about the programmers-- and nothing about
>the value of the debugger. Don't get me wrong- a few carefully
>crafted prints can augment a difficult debug.
>
>But to write off a debugger is like saying "I can saw it by
>hand, so I don't need no stinkin' table saw".
>
>Warren

Not sure that analogy is the right one; how about this:
using a debugger is like spell checking by checking the font of each character;
not only does it not help, as fonts are computer generated,
it has no bearing on the spelling either.
:-)

Jan Panteltje

unread,
Mar 23, 2011, 4:27:03 AM3/23/11
to
On a sunny day (Tue, 22 Mar 2011 18:24:48 -0500) it happened
"k...@att.bizzzzzzzzzzzz" <k...@att.bizzzzzzzzzzzz> wrote in
<7tbio69cbrq529nck...@4ax.com>:

>On Tue, 22 Mar 2011 10:14:37 GMT, Jan Panteltje <pNaonSt...@yahoo.com>
>wrote:
>
>>On a sunny day (Mon, 21 Mar 2011 18:13:19 -0500) it happened
>>izzzzzzzzzzzzzzzzzzzzzzzz> wrote:
>>>Wow! You got that right. I don't WRITE CODE,
>>
>>Yea, that was clear already.
>>

>I'm hardware engineer. Software is for dweebs.

Then why do you use it so much with your simulators?
:-)

Warren

unread,
Mar 23, 2011, 9:58:26 AM3/23/11
to
Jan Panteltje expounded in
news:imcaqq$58m$2...@news.albasani.net:

> On a sunny day (Tue, 22 Mar 2011 20:01:43 +0000 (UTC)) it
> happened Warren <ve3...@gmail.com> wrote in
> <Xns9EB0A30D9BE11W...@81.169.183.62>:
>
>>Jan Panteltje expounded in
>>news:imanmt$clr$1...@news.albasani.net:

..


>>>>In short, you're following some bad and antiquated
>>>>advice.
>>>>
>>>>Warren
>>>
>>> That is a matter of opinion.
>>
>>Your's is in the minority. :)
>
> Better a sane minority in the mad house.

That's the usual statement from inmates _in_
an insane asylum.

They believe they are the only sane ones and
that it's everyone else who is not.

>>But to write off a debugger is like saying "I can saw it by
>>hand, so I don't need no stinkin' table saw".

> Not sure that analogy is the right one, how about this:


> Using a debugger is like spell checking by checking the
> font of each character, not only does it not help, as fonts
> are computer generated, it has no bearing on the spelling
> either.
>:-)

You're saying debuggers don't help, which is simply not the
case. It either suggests that you don't know how to use them
or you've avoided them for so long that you don't know what
they're capable of.

A debugger saves an enormous amount of time. Who wouldn't want
to leverage that? I'll bet your employer would.

Some poor developers avoid learning gdb (for example) simply
because they don't want to make the effort to learn how to
use it. It isn't difficult, but some folks are lazy or lousy at
self-education. But that's a problem with the developer-- not
the tool.

With the quality of the debuggers generally today, there is no
reason not to use them. I for one, would not pay a developer
to avoid debuggers so that they can spend time cooking up
personalized macros and manually coding printf's all over the
place. That's a poor use of a developer's time.

Macros BTW, are completely useless in a language like Ada,
which I've been using for AVR stuff. Ada shortcuts software
development because it encourages software _engineering_ that
is often lacking in C and derivatives. Ada is very fussy about
what gets compiled which saves debugging time (but I realize
this is a language religious thing and that sometimes the
language is dictated). Finally, any code added for debugging
purposes clutters the code, which is bad. A debugger makes
most of that completely unnecessary. One compiler option is
all that it costs.

But if you don't want to use a debugger, then that is your own
choice. One can only lead the horse to water.

But saying that debuggers are not useful is simply not
generally accepted. There is a good reason for that!

Warren

Jan Panteltje

unread,
Mar 23, 2011, 11:21:21 AM3/23/11
to
On a sunny day (Wed, 23 Mar 2011 13:58:26 +0000 (UTC)) it happened Warren
<ve3...@gmail.com> wrote in
<Xns9EB165763DBEAW...@81.169.183.62>:

>>>> That is a matter of opinion.
>>>
>>>Your's is in the minority. :)
>>
>> Better a sane minority in the mad house.
>
>That's the usual statement for inmates _in_
>an insane asylum.

This world is one; it just depends on your standard of 'sanity'.


>> Not sure that analogy is the right one, how about this:
>> Using a debugger is like spell checking by checking the
>> font of each character, not only does it not help, as fonts
>> are computer generated, it has no bearing on the spelling
>> either.
>>:-)
>
>You're saying debuggers don't help, which is simply not the
>case.

The only case where a debugger may help is if you want to know why a BINARY
of somebody else's code crashes, when you have no source code.
Not that it helps much to fix it if you know it is accessing an illegal address.
If you have the source, as you SHOULD, look at the code.
A few simple printf() statements will tell you all you want to know.
I have been through piles of other people's source code, used parts of it;
C reads like a novel to me, no matter what style it is written in.


>It either suggests that you don't know how to use them
>or you've avoided them for so long that you don't know what
>they're capable of.

I have to admit I have lost the capability of walking on my hands;
actually, I never was good at it.
To be honest I never tried very hard.
All your insults show how incredibly inexperienced you must be with walking on your feet.
For those of us who walk on our feet, watching you on your hands is amusing,
but not something I would recommend to anyone for 'larger projects' like a trip
to the next city :-)


>A debugger saves an enormous amount of time. Who wouldn't want
>to leverage that? I'll bet your employer would.

Not using it saves all the time spent with a debugger.


>Some poor developers avoid learning gdb (for example) simply
>because they didn't want to take the effort to learn how to
>use it. It isn't difficult but some folks are lazy or lousy at
>self education. But that's a problem with the developer-- not
>the tool.

You are too lazy, too stupid, and too stubborn to take good academic advice.
Kids' stuff.

>With the quality of the debuggers generally today, there is no
>reason not to use them.

The quality of the debugger has nothing to do with it.
If you want to shoot yourself in the head, a good-quality gun does not help
save your life.


>I for one, would not pay a developer
>to avoid debuggers so that they can spend time cooking up
>personalized macros and manually coding printf's all over the
>place. That's a poor use of a developer's time.

I rarely use macros; the printf() at the start of a function is
just a few lines. If your typing skills are so bad that you cannot
put those out in about 8 seconds each, then use your hours f*cking about
with your [de]bugger.
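
(To be concrete, a minimal sketch of the kind of entry print I mean --
illustration only, the function name is made up:)

/* trace.c -- illustration only: a plain printf() at function entry. */
#include <stdio.h>

static int set_frequency(int hz)       /* hypothetical function */
{
    fprintf(stderr, "set_frequency(): hz=%d\n", hz);
    /* ... the real work would go here ... */
    return 0;
}

int main(void)
{
    return set_frequency(1000);
}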


>Macros BTW, are completely useless in a language like Ada,
>which I've been using for AVR stuff.

Well, that says it all: another victim of ADA.
It so happened I threw my ADA book in the garbage a few weeks ago, after not using it
since 1989 or so.
What the world has come to, what a bunch of crap.
Maybe you work for the US DOD, which once required ADA
but allowed 'other' languages for real critical systems (they had to);
ADA would explain why they keep losing wars.


>development because it encourages software _engineering_ that
>is often lacking in C and derivatives.

C has no derivative[1]; it is close to the hardware and requires you to know what you are doing.
ADA forces you into some insane form that does not even prevent you from making basic mistakes,
else you would not need your bugger at all.

[1]
C++ is no language, it is a speech disability, a crime against humanity.


>Ada is very fussy

The word is 'sucks'.

Joel Koltner

unread,
Mar 23, 2011, 11:23:00 AM3/23/11
to
"Warren" <ve3...@gmail.com> wrote in message
news:Xns9EB165763DBEAW...@81.169.183.62...

> I for one, would not pay a developer
> to avoid debuggers so that they can spend time cooking up
> personalized macros and manually coding printf's all over the
> place. That's a poor use of a developer's time.

Yeah, these days they instead spend time cooking up macros and plug-ins for
the debugger so that it can "sanely" display various objects or structures.
:-)

> Ada is very fussy about
> what gets compiled which saves debugging time (but I realize
> this is a language religious thing and that sometimes the
> language is dictated).

The other thing is that there really is a very broad range of programmer skill
sets/proficiencies out there: My opinion is that, while all programmers make
mistakes, the kinds of errors that very strong typing and the other rigors Ada
imposes tend to prevent are errors that a *certain subset* of programmers don't
make anyway -- or at least make only on incredibly rare occasions -- and
instead *for them* it just decreases their average productivity a bit.

Overall it is your programmer's productivity that counts, as measured by "how
long did it take them to get this chunk of code to work in a bug-free
manner?" -- regardless of what tools they choose (or are made) to use. My
opinion is that software development is a field where the answer to this
question varies far more than in most others; a 5:1 ratio is readily seen --
yet hiring the most productive programmers doesn't cost nearly 5x what hiring
the least productive ones does.

My recollection is that Steve Wozniak wrote out all of the Apple I's BASIC in
assembly language and then manually "assembled" it, still on paper, into
op-codes. He did all of his debugging "on paper," and the first time he
entered it into the actual CPU... it worked.

Many people couldn't pull off that feat with Ada, C++, Python or any other
language out there. :-)

On the other hand... Woz and one other guy were off trying to finish DOS 1.0
before some trade show (probably Comdex), hopped onto a plane the day before,
figuring that they'd have it done in a couple of hours after they arrived at
their hotel. Instead they were programming all through the night, finishing
just a couple of hours before the show was set to open... at which point Woz
figured he'd make "one last test" of the read/write sector routines before
getting some shuteye... and inadvertently did so using the disc he was using
to store DOS itself rather than their test disc, thereby wiping out all the
progress they'd made that past night.

Everyone's human? :-)

(The story then continues that he did go to sleep, woke up that afternoon, and
fixed everything still that same day -- having remembered most of the details
of what had been done -- so they only lost the one opening day of the trade
show without a demonstrably working DOS. Excellent recovery there, at least.)

---Joel

Warren

unread,
Mar 23, 2011, 12:46:53 PM3/23/11
to
Joel Koltner expounded in
news:pfoip.771840$iV7.3...@en-nntp-15.dc1.easynews.com:

> "Warren" <ve3...@gmail.com> wrote in message
> news:Xns9EB165763DBEAW...@81.169.183.62...
>> I for one, would not pay a developer
>> to avoid debuggers so that they can spend time cooking up
>> personalized macros and manually coding printf's all over
>> the place. That's a poor use of a developer's time.
>
> Yeah, these days they instead spend time cooking up macros
> and plug-ins for the debugger so that it can "sanely"
> display various objects or structures.
>:-)
>
>> Ada is very fussy about
>> what gets compiled which saves debugging time (but I
>> realize this is a language religious thing and that
>> sometimes the language is dictated).
>
> The other thing is that there really is a very broad range
> of programmer skill sets/proficiencies out there:

No doubt.

> My
> opinion is that, while all programmers make mistakes, the
> kind of bugs that very strong typing and other rigors that
> Ada imposes tend to prevent errors that a *certain subset*
> of programmers don't make anyway -- or at least only make
> on an incredibly rare occasion --, and instead *for them*
> it just decreases their average productivity a bit.

The comp.lang.ada forum would disagree with that on the whole,
but that is another discussion I don't want to pursue here.
They'll suggest that there is more to it than that (and I
agree).

> Overall it is your programmer's productivity that counts,
> as measured by, "how long did it take them to get this
> chunk of code to work in a bug-free manner?" -- regardless
> of what tools they choose (or are made) to use.

Agreed generally.

> My recollection is that Steve Wozniak wrote out all of the
> Apple I's BASIC in assembly language and then manually
> "assembled" it, still on paper, into op-codes.

I also hand-assembled lots of code in the '70s. It's no big
deal, just extra effort. When you have no choice, it can be
done. Now it is rarely needed, with the tools available
today.

> He did all
> of his debugging "on paper," and the first time he entered
> it into the actual CPU... it worked.

Computer time was considered more expensive than programmer
time in those days (speaking generally). Things are
considerably different today. It makes no sense to do a
time-consuming desk check when testing can do it in an
instant. Yes, there are exceptions to that, depending upon
the nature of the project (like critical flight control
systems).

> Many people couldn't pull off that feat with Ada, C++,
> Python or any other language out there. :-)

Ada is used wherever there are safety-critical and
life-threatening situations. There is a reason for that. If
you hang out on comp.lang.ada, you'll see that there are
tools that build upon Ada and take the checking to an even
more rigid extreme. But that is moving OT..

Warren

Warren

unread,
Mar 23, 2011, 1:17:27 PM3/23/11
to
Jan Panteltje expounded in
news:imd33m$s4n$1...@news.albasani.net:

> On a sunny day (Wed, 23 Mar 2011 13:58:26 +0000 (UTC)) it
> happened Warren <ve3...@gmail.com> wrote in
> <Xns9EB165763DBEAW...@81.169.183.62>:

>>You're saying debuggers don't help, which is simply not the
>>case.
>
> The only case where a debugger may help is if you want to
> know why a BINARY of somebody else's code craches, when you
> have no source code.

Hogwash. It can be used in this situation, but it is not the
"only" situation where it is useful.

> A few simple
> printf() statements will tell you all you want to know.

And waiting for a huge system to rebuild just to compile that
added printf statement, which might take you 30 minutes, is a
terrible waste of time. For small projects, you can tolerate
all kinds of ill-advised development practices.

But with a debugger, huge project or not, there are no
recompiles, no relinking, no running of makefiles. You just
invoke the program under the debugger or, heaven forbid,
attach to an already running program and take control.
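
(A minimal sketch of what I mean -- illustration only, assuming gdb on a
POSIX host and a made-up long-running program; build it once with -g, and
later "gdb -p <pid>" attaches to the live process, where "bt" and
"print counter" show where it is and what it holds, with no rebuild and no
added code:)

/* worker.c -- illustration only: a long-running program you can attach
   to later with "gdb -p <pid>", provided it was built with -g. */
#include <unistd.h>

static volatile long counter;      /* something worth inspecting live */

int main(void)
{
    for (;;) {
        counter++;                 /* "print counter" in gdb reads this */
        sleep(1);
    }
}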

> I
> have been through piles of other man's source code, used
> parts of it, C reads like a novel to me, no matter what
> style it is written in.

Your skill level is academic. It needs to read well for others
on your team.

And if your code is being reviewed, as happens in
safety-critical systems, that "added code" (if it stays in)
now needs to be verified as not going to become a
flight-critical error. Less is more in critical software.

>>It either suggests that you don't know how to use them
>>or you've avoided them for so long that you don't know what
>>they're capable of.
>
> I have to admit I have lost the capability of walking on my
> hands, actually never was good at it.
> To be honest I never tried very hard.

So you've never really tried to use it but are telling
everyone to trust you that a debugger is worthless?

> All your insults

Insults?

I'm trying to understand how someone can dis a very useful
tool in the current environment. Debuggers have never been
better, but you say they are useless based upon (IIRC) some
article in the early '80s.

>>A debugger saves an enormous amount of time. Who wouldn't
>>want to leverage that? I'll bet your employer would.
>
> Not using it saves all the time spend with a debugger.

It's your choice man.

> You are too lazy and too stupid and too stubborn to take a
> good academic advice. Kids stuff.

Bad, antiquated advice from decades ago hardly applies to the
tools we have today. That might have been good advice for
some big-iron environments of the time. But I even recall
using CP/M and DOS debuggers that were still plenty useful on
the microcomputer front.

But I leave you to your own fate.

>>With the quality of the debuggers generally today, there is
>>no reason not to use them.
>
> Quality of debugger has nothing to do with it.
> If you want to shoot yourself in the head a good quality
> gun does not help save your life.

A gun is very useful in the hands of a hungry hunter. If you
shoot yourself in the foot, then that says a lot about you as
a hunter.

It ain't the gun's fault.

> I rarely use macros, the printf() at the start of functions
> is just a few lines, if your typing skills are that bad
> that you cannot put those out in about 8 seconds each, then
> use your hours f*cking about with your [de]bugger.

You've left out the make process, which can be huge in a large
system.

But clearly you don't know about debuggers.

>>Macros BTW, are completely useless in a language like Ada,
>>which I've been using for AVR stuff.
>
> Well that says it all, an other victim of ADA.

No, it is Ada (not American Dental Association or some other
acronym).

> It so happened I threw my [Ada] book in the garbage a few
> weeks ago, after not using it since 1989 or so.
> What the world has come to, what a bunch of crap.

..


> Maybe you work for the US DOD who once required ADA,
> but for real critical systems allowed 'other' languages
> (they had to), ADA would explain why they keep losing wars.

Actually, as I understand it, each project still gets reviewed
before they relax the Ada requirement. I also understand that
a lot of contracting firms still use Ada due to the cost
savings in testing and maintenance. Others, however, are
getting my billable hours by using other languages.

> C has no derivate[1], is close to the hardware, and
> requires you to know what you are doing. [Ada] forces you
> into some insane form, that actually does not even prevent
> you from making basic mistakes, else you would not need
> your bugger at all.

You've clearly never mastered Ada, and it has changed a lot
since your '89 text. Ada is not your grandfather's Ada-83 that
you remember. It underwent major revisions in '95 and 2005,
and it is now headed for a new revision of the standard in
2012. As a standard, Ada is one of the best "well defined"
standards.

> C++ is no language, it is a speech disability, a crime
> against humanity.

It's not my favourite, but like C, it has its place in the
world.

>>Ada is very fussy
>
> The word is 'sucks'.

A clearly uninformed opinion.

Warren

Nico Coesel

unread,
Mar 23, 2011, 3:57:58 PM3/23/11
to
Jan Panteltje <pNaonSt...@yahoo.com> wrote:

>On a sunny day (Tue, 22 Mar 2011 20:01:43 +0000 (UTC)) it happened Warren
><ve3...@gmail.com> wrote in
><Xns9EB0A30D9BE11W...@81.169.183.62>:
>
>>Jan Panteltje expounded in
>>news:imanmt$clr$1...@news.albasani.net:
>>
>>> On a sunny day (Tue, 22 Mar 2011 13:26:45 +0000 (UTC)) it
>>> happened Warren <ve3...@gmail.com> wrote in
>>> <Xns9EB0601772EF2W...@188.40.43.213>:
>>>
>>>>If you work on large systems, then avoiding debuggers is a
>>>>huge waste of time. When you have intermittant or
>>>>unexplained failure(s) a debugger saves gobs of time since
>>>>you don't have to anticipate what to "print".
>>>>
>>>>When your code core dumps, a debugger will trace back the
>>>>precise point of the failure and allows you to look at
>>>>anything related to it.
>>>>
>>>>All of this facility comes with a simple compile option. No
>>>>need to code special macros.
>>>>
>>>>In short, you're following some bad and antiquated advice.
>>>>
>>>>Warren
>>>
>>

>>That speaks volumes about the programmers-- and nothing about
>>the value of the debugger. Don't get me wrong- a few carefully
>>crafted prints can augment a difficult debug.
>>
>>But to write off a debugger is like saying "I can saw it by
>>hand, so I don't need no stinkin' table saw".
>>
>>Warren
>
>Not sure that analogy is the right one, how about this:
>Using a debugger is like spell checking by checking the font of each character,
>not only does it not help, as fonts are computer generated,
>it has no bearing on the spelling either.

Sorry but that is nonsense. A debugger is very useful to see if the
code is actually doing what it is supposed to do, not just verifying
that the output is (accidentally) right. In case of an exception it will
lead you right to the offending function. If you're lucky you'll get a
stack trace as well.

Output from print statements is useful when the program is running
at a client site and the client is able to capture it in a log file.
Print statements also help debugging real-time processes which cannot
be stepped.

k...@att.bizzzzzzzzzzzz

unread,
Mar 23, 2011, 6:44:20 PM3/23/11
to
On Wed, 23 Mar 2011 01:06:51 -0700, Jon Kirwan <jo...@infinitefactors.org>
wrote:

>On Tue, 22 Mar 2011 20:48:11 -0500, "k...@att.bizzzzzzzzzzzz"

Well, there you go! Have fun!

k...@att.bizzzzzzzzzzzz

unread,
Mar 23, 2011, 6:47:06 PM3/23/11
to
On Wed, 23 Mar 2011 08:27:03 GMT, Jan Panteltje <pNaonSt...@yahoo.com>
wrote:

>On a sunny day (Tue, 22 Mar 2011 18:24:48 -0500) it happened
>"k...@att.bizzzzzzzzzzzz" <k...@att.bizzzzzzzzzzzz> wrote in
><7tbio69cbrq529nck...@4ax.com>:
>
>>On Tue, 22 Mar 2011 10:14:37 GMT, Jan Panteltje <pNaonSt...@yahoo.com>
>>wrote:
>>
>>>On a sunny day (Mon, 21 Mar 2011 18:13:19 -0500) it happened
>>>"k...@att.bizzzzzzzzzzzz" <k...@att.bizzzzzzzzzzzz> wrote:
>>>>Wow! You got that right. I don't WRITE CODE,
>>>
>>>Yea, that was clear already.
>>>
>>I'm hardware engineer. Software is for dweebs.
>
>Then why do you use it so much with your simulators?
>:-)

_With_ my(?) simulators? Someone has to do the shit work. Even DimBulb is
employed, evidently.

Nico Coesel

unread,
Mar 23, 2011, 6:48:43 PM3/23/11
to
"Joel Koltner" <zapwireD...@yahoo.com> wrote:

>"Warren" <ve3...@gmail.com> wrote in message
>news:Xns9EB165763DBEAW...@81.169.183.62...
>> I for one, would not pay a developer
>> to avoid debuggers so that they can spend time cooking up
>> personalized macros and manually coding printf's all over the
>> place. That's a poor use of a developer's time.
>
>Yeah, these days they instead spend time cooking up macros and plug-ins for
>the debugger so that it can "sanely" display various objects or structures.
>:-)
>
>> Ada is very fussy about
>> what gets compiled which saves debugging time (but I realize
>> this is a language religious thing and that sometimes the
>> language is dictated).
>
>The other thing is that there really is a very broad range of programmer skill
>sets/proficiencies out there: My opinion is that, while all programmers make
>mistakes, the kind of bugs that very strong typing and other rigors that Ada
>imposes tend to prevent errors that a *certain subset* of programmers don't
>make anyway -- or at least only make on an incredibly rare occasion --, and
>instead *for them* it just decreases their average productivity a bit.

I wish Ada were available for ARM. I would really love to try it for a
microcontroller project. If I'm right, it's like programming with the
cruise control on.

If you concentrate on driving a car, you can keep your speed constant.
But why bother if you can have cruise control? Not having to think
about pitfalls is much more efficient than knowing how to avoid them.

>My recollection is that Steve Wozniak wrote out all of the Apple I's BASIC in
>assembly language and then manually "assembled" it, still on paper, into
>op-codes. He did all of his debugging "on paper," and the first time he
>entered it into the actual CPU... it worked.

I know that kind of code. Try to add something new later.

k...@att.bizzzzzzzzzzzz

unread,
Mar 23, 2011, 6:58:04 PM3/23/11
to
On Wed, 23 Mar 2011 08:23:00 -0700, "Joel Koltner"
<zapwireD...@yahoo.com> wrote:

>"Warren" <ve3...@gmail.com> wrote in message
>news:Xns9EB165763DBEAW...@81.169.183.62...
>> I for one, would not pay a developer
>> to avoid debuggers so that they can spend time cooking up
>> personalized macros and manually coding printf's all over the
>> place. That's a poor use of a developer's time.
>
>Yeah, these days they instead spend time cooking up macros and plug-ins for
>the debugger so that it can "sanely" display various objects or structures.
>:-)
>
>> Ada is very fussy about
>> what gets compiled which saves debugging time (but I realize
>> this is a language religious thing and that sometimes the
>> language is dictated).
>
>The other thing is that there really is a very broad range of programmer skill
>sets/proficiencies out there: My opinion is that, while all programmers make
>mistakes, the kind of bugs that very strong typing and other rigors that Ada
>imposes tend to prevent errors that a *certain subset* of programmers don't
>make anyway -- or at least only make on an incredibly rare occasion --, and
>instead *for them* it just decreases their average productivity a bit.

Programmers never make buffer overrun errors? Nah, Windows has none of
them...
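
(The textbook case, as a minimal illustration -- C compiles it without a
peep, and it is exactly the class of error that bounds checking is meant
to catch:)

/* overrun.c -- illustration only: a classic off-by-one buffer overrun.
   C accepts this as written; a bounds-checked language would reject the
   index or raise a run-time error instead of quietly corrupting memory. */
#include <stdio.h>

int main(void)
{
    int samples[8];

    for (int i = 0; i <= 8; i++)    /* <= runs one element too far */
        samples[i] = 0;

    printf("done\n");
    return 0;
}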

>Overall it is your programmer's productivity that counts, as measured by, "how
>long did it take them to get this chunk of code to work in a bug-free
>manner?" -- regardless of what tools they choose (or are made) to use. My
>opinion is that software development is one of the few fields where the answer
>to this question varies far more than in many other fields; a 5:1 ratio is
>readily seen -- yet hiring the most productive programmers doesn't cost nearly
>5x what hiring the least productive ones does.

It's also hard to tell the difference.

>My recollection is that Steve Wozniak wrote out all of the Apple I's BASIC in
>assembly language and then manually "assembled" it, still on paper, into
>op-codes. He did all of his debugging "on paper," and the first time he
>entered it into the actual CPU... it worked.

People can *read* S/370 core dumps. Manual assembly on many antique
processors isn't difficult. Not all processors are the mess that is x86.

>Many people couldn't pull off that feat with Ada, C++, Python or any other
>language out there. :-)

Not sure how to parse that challenge. Manual assembly of a HLL? ;-)

>On the other hand... Woz and one other guy were off trying to finish DOS 1.0
>before some trade show (probably Comdex), hopped onto a plane the day before,
>figuring that they'd had it done in a couple of hours after they arrived at
>their hotel. Instead they were programming all through the night, finishing
>just a couple of hours before the show was set to open... at which point Woz
>figured he'd make "one last test" of the read/write sector routines before
>getting some shuteye... and inadvertently did so using the disc he was using
>to store DOS itself rather than their test disc, thereby wiping out all the
>progress they'd make that past night.
>
>Everyone's human? :-)

Especially the night before a trade show. ;-)

Joel Koltner

unread,
Mar 23, 2011, 7:31:16 PM3/23/11
to
<k...@att.bizzzzzzzzzzzz> wrote in message
news:keuko6ltsm7jfujhn...@4ax.com...

> Programmers never make buffer overrun errors? Nah, Windows has none of
> them...

I think Larkin claims none of his equipment has ever had one. :-)

>> My
>>opinion is that software development is one of the few fields where the
>>answer
>>to this question varies far more than in many other fields; a 5:1 ratio is
>>readily seen -- yet hiring the most productive programmers doesn't cost
>>nearly
>>5x what hiring the least productive ones does.
>
> It's also hard to tell the difference.

For large projects, yes... there's some study I read where they attempted to
determine programmer productivity vs. project size, and while there are very
large deviations for small (e.g., one- or two-man) projects, it largely goes
away once you're hitting, e.g., dozen-plus-person projects.

>>Many people couldn't pull off that feat with Ada, C++, Python or any other
>>language out there. :-)
> Not sure how to parse that challenge. Manual assembly of a HLL? ;-)

Haha... yes, it is ambiguous!

But I meant the other way around: Write a BASIC interpreter in an HLL; any
errors will be punishable by a cat peeing in your bed.

---Joel

Muzaffer Kal

unread,
Mar 23, 2011, 10:32:15 PM3/23/11
to
On Wed, 23 Mar 2011 22:48:43 GMT, ni...@puntnl.niks (Nico Coesel)
wrote:

>I wish Ada where available for ARM.

Why do you think it's not? The GNU compiler chain has a full-blown Ada95
front-end and an ARM back-end.
--
Muzaffer Kal

DSPIA INC.
ASIC/FPGA Design Services

http://www.dspia.com

John - KD5YI

unread,
Mar 23, 2011, 4:49:08 PM3/23/11
to
On 3/21/2011 3:25 PM, Nico Coesel wrote:
> John - KD5YI<sop...@invalid.org> wrote:
>
>> On 3/20/2011 4:10 AM, fasf wrote:
>>> Hi,
>>> i've worked wit 8bit/32bit microcontrollers for 5 years and now i want
>>> to explore new fields, so i'm interested in these two solutions: PSoC
>>> and FPGA. I'm totally new (i don't know neither VHDL nor Verilog) and
>>> i don't understand the differences between these programmable devices.
>>> For hobby, what do you think is more useful to study? Can you suggest
>>> a low cost DevKit?
>>> Any getting started suggestion will be very appreciated
>>
>>
>> Well, I will put in a plug for Cypress' PSoC. I've been using them for
>> years. However, if you need blazing speed, you will need to go the FPGA
>> route as the PSoC is nothing more than a microcontroller that has some
>> analog goodies.
>>
>> I like the PSoC because it is extremely configurable/reconfigurable to
>> almost anything you need. An example that Cypress came out with was a
>>
>> I have no financial connection with Cypress. I've just been using their
>> PSoC chips since about 2002.
>
> Looks interesting. I checked the website a bit but couldn't find all I
> want to know: How is the analog performance regarding noise and
> bandwidth? Can you also use a Psoc as a reconfigurable analog brick
> (analog in - analog out)?
>

You'll have to read the specs or reference manual to get noise and
bandwidth information. I don't remember them, but in the PSoC1 they are
not all that great.

I don't know what you mean by 'brick', but, yes, the input pins are
selectable, the output pins are selectable, and the type of analog
(continuous analog or switched cap) is selectable. The switched-cap
modules make the ADCs and DACs, filters, modulators, demodulators, etc.
The continuous analog makes amplifiers, instrumentation amps, and
comparators. Internally, you can route signals between blocks for
various uses.

Bottom of the line (PSoC1) data sheet: http://www.cypress.com/?rID=3324
PSoC3 (8051 core): http://www.cypress.com/?rID=35178
PSoC5 (ARM core): http://www.cypress.com/?rID=37581

If you want more than the data sheets, the Technical Reference Manuals
(TRM) are available.

If I can help get more info for you, let me know.

Cheers,
John

Frank Buss

unread,
Mar 24, 2011, 1:49:18 AM3/24/11
to
Nico Coesel wrote:

> I wish Ada where available for ARM. I really love to try it for a
> microcontroller project. If I'm right its like programming with the
> cruise control on.

Ada is available for GCC. Looks like some people have problems building
the libraries for it as a cross compiler, but arm-rtems should work:

http://www.rtems.com/wiki/index.php/RTEMSAda

But I didn't try it.

Nico Coesel

unread,
Mar 24, 2011, 2:17:35 AM3/24/11
to
Muzaffer Kal <k...@dspia.com> wrote:

>On Wed, 23 Mar 2011 22:48:43 GMT, ni...@puntnl.niks (Nico Coesel)
>wrote:
>>I wish Ada where available for ARM.
>
>Why do you think it's not? GNU compiler chain has a full blown Ada95
>front-end and an ARM back-end.

The last time I checked there was no 'real' Ada for ARM. Ada is
available for most platforms GCC supports except ARM. It has something
to do with the Ada runtime libraries and ARM changing their ABI too
often.

Warren

unread,
Mar 24, 2011, 9:38:35 AM3/24/11
to
Nico Coesel expounded in
news:4d8ae163....@news.kpnplanet.nl:

> Muzaffer Kal <k...@dspia.com> wrote:
>
>>On Wed, 23 Mar 2011 22:48:43 GMT, ni...@puntnl.niks (Nico
>>Coesel) wrote:
>>>I wish Ada where available for ARM.
>>
>>Why do you think it's not? GNU compiler chain has a full
>>blown Ada95 front-end and an ARM back-end.
>
> The last time I checked there was no 'real' Ada for ARM.
> Ada is available for most platforms GCC supports except
> ARM. It has something to do with the Ada runtime libraries
> and ARM changing their ABI too often.

GNAT Pro seems to support it, though I don't know about cost
etc.:

http://www.adacore.com/home/products/gnatpro/?gclid=CNzE0u-n56cCFYi8KgodLxPXaQ

If you google "gnat arm" you might find instructions to
cross-compile it for free, though it is probably a difficult
build from what I've seen.

If you want to just give Ada a spin, perhaps the easiest thing
to do (for Windows users) is to install Cygwin and have the
gcc-ada compiler installed.

AdaCore also has a native Windows IDE environment (if you like
that sort of thing) called GPS, which is free for download and
use.

Warren

k...@att.bizzzzzzzzzzzz

unread,
Mar 24, 2011, 7:18:59 PM3/24/11
to
On Wed, 23 Mar 2011 16:31:16 -0700, "Joel Koltner"
<zapwireD...@yahoo.com> wrote:

><k...@att.bizzzzzzzzzzzz> wrote in message
>news:keuko6ltsm7jfujhn...@4ax.com...
>> Programmers never make buffer overrun errors? Nah, Windows has none of
>> them...
>
>I think Larkin claims none of his equipment has ever had one. :-)
>
>>> My
>>>opinion is that software development is one of the few fields where the
>>>answer
>>>to this question varies far more than in many other fields; a 5:1 ratio is
>>>readily seen -- yet hiring the most productive programmers doesn't cost
>>>nearly
>>>5x what hiring the least productive ones does.
>>
>> It's also hard to tell the difference.
>
>For large projects, yes... there's some study I read where they attempted to
>determine programmer productivity vs. project size, and while there are very
>large deviations for small (e.g., one or two man) projects, it largely goes
>away once you're hitting, e.g., dozen+ people projects.

I'm not so sure it's even possible on small projects. I've seen some pretty
productive people get ignored. They tend to work quietly.

>>>Many people couldn't pull off that feat with Ada, C++, Python or any other
>>>language out there. :-)
>> Not sure how to parse that challenge. Manual assembly of a HLL? ;-)
>
>Haha... yes, it is ambiguous!
>
>But I meant the other way around: Write a BASIC interpreter in an HLL; any
>errors will be punishable by a cat peeing in your bed.

Why not? You want to write BASIC in machine language? Today?

Joel Koltner

unread,
Mar 24, 2011, 7:26:39 PM3/24/11
to
<k...@att.bizzzzzzzzzzzz> wrote in message
news:b7kno6156l1973crj...@4ax.com...

>>But I meant the other way around: Write a BASIC interpreter in an HLL; any
>>errors will be punishable by a cat peeing in your bed.
> Why not? You want to write BASIC in machine language? Today?

No, I'm saying that while Woz was able to pull off writing BASIC in machine
language and have it work on the first actual test, many people today couldn't
write a comparable BASIC in an HLL of their choosing and have it similarly
work on the first go.

It's the difference that John Larkin likes to point out about "debugging in
your mind" vs. debugging using a software or hardware debugger: While the
latter is (IMO) quite valuable, in some cases the former is actually the much
faster approach.

k...@att.bizzzzzzzzzzzz

unread,
Mar 24, 2011, 7:53:59 PM3/24/11
to
On Thu, 24 Mar 2011 16:26:39 -0700, "Joel Koltner"
<zapwireD...@yahoo.com> wrote:

><k...@att.bizzzzzzzzzzzz> wrote in message
>news:b7kno6156l1973crj...@4ax.com...
>>>But I meant the other way around: Write a BASIC interpreter in an HLL; any
>>>errors will be punishable by a cat peeing in your bed.
>> Why not? You want to write BASIC in machine language? Today?
>
>No, I'm saying that while Woz was able to pull off writing BASIC in machine
>language and have it work on the first actual test, many people today couldn't
>write a comparable BASIC in an HLL of their choosing and have it similarly
>work on the first go.

Well... I taught at a reasonably well known college. Only about 5% of the
seniors could write a program to convert bases without being told how to do
it. 75% couldn't do it even after being told.
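
(For the curious, the whole exercise is about a dozen lines -- a minimal
sketch in C, illustration only, assuming non-negative input and bases 2
through 36:)

/* tobase.c -- illustration only: print a number in any base from 2 to 36
   using the usual divide-and-take-the-remainder loop. */
#include <stdio.h>

static void print_in_base(unsigned long n, unsigned base)
{
    const char digits[] = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ";
    char buf[64];                      /* enough for 64-bit binary */
    int i = 0;

    if (base < 2 || base > 36)
        return;
    do {
        buf[i++] = digits[n % base];   /* least significant digit first */
        n /= base;
    } while (n != 0);
    while (i > 0)                      /* emit in the right order */
        putchar(buf[--i]);
    putchar('\n');
}

int main(void)
{
    print_in_base(1984, 16);           /* prints 7C0 */
    print_in_base(1984, 2);            /* prints 11111000000 */
    return 0;
}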

>It's the difference that John Larkin likes to point out about "debugging in
>your mind" vs. debugging using a software or hardware debugger: While the
>later is (IMO) quite valuable, in some cases the former is actually the much
>faster approach.

As has been pointed out by others here, a debugger (or a simulator for
hardware types) doesn't only help squash bugs; it also helps verify that the
code is working properly. The latter takes a *lot* more work.

Joel Koltner

unread,
Mar 24, 2011, 8:49:22 PM3/24/11
to
<k...@att.bizzzzzzzzzzzz> wrote in message
news:75mno6t6djmo95813...@4ax.com...

> Well... I taught at a reasonably well known college. Only about 5% of the
> seniors could write a program to convert bases without being told how to do
> it. 75% couldn't do it even after being told.

Why, were you teaching them COBOL or Ada?

Just kidding. :-) (...running for cover from all the COBOL and Ada
proponents...)

I take it the language was something like C or Java or BASIC?

I'm rather grateful that they taught us how to convert to and from base 5 back
in elementary school -- made it that much easier when I started learning
useful bases like 2 and 16!

k...@att.bizzzzzzzzzzzz

unread,
Mar 24, 2011, 9:13:58 PM3/24/11
to
On Thu, 24 Mar 2011 17:49:22 -0700, "Joel Koltner"
<zapwireD...@yahoo.com> wrote:

><k...@att.bizzzzzzzzzzzz> wrote in message
>news:75mno6t6djmo95813...@4ax.com...
>> Well... I taught at a reasonably well known college. Only about 5% of the
>> seniors could write a program to convert bases without being told how to do
>> it. 75% couldn't do it even after being told.
>
>Why, were you teaching them COBOL or Ada?

>Just kidding. :-) (...running for cover from all the COBOL and Ada
>proponents...)

>I take it the language was something like C or Java or BASIC?

x86 assembler, but the language was irrelevant. They couldn't do it
themselves, much less write pseudo-code to do it.

>I'm rather grateful that they taught us how to convert to and from base 5 back
>in elementary school -- made it that much easier when I started learning
>useful bases like 2 and 16!

We were doing arithmetic and converting between bases (directly, not through
base-10) in every base up to 32 (characters got tough after that) when we were
in fifth and sixth grades. Jr. High was a total waste, though.

Joel Koltner

unread,
Mar 25, 2011, 11:17:58 AM3/25/11
to
<k...@att.bizzzzzzzzzzzz> wrote in message
news:btqno6lncbucjvqap...@4ax.com...

> x86 assembler, but the language was irrelevant. They couldn't do it
> themselves, much less write pseudo-code to do it.

A certain percentage of them might have been able to pull it off in an HLL if
not assembly (of the ones who could at least create pseudo-code).

> We were doing arithmetic and converting between bases (directly, not through
> base-10) in every base up to 32 (characters got tough after that) when we
> were
> in fifth and sixth grades.

That's cool... in retrospect, it does seem a little odd that in my class it
was just base 5 rather than covering arbitrary bases.

> Jr. High was a total waste, though.

That's where I was first taught algebra, which was of course quite useful...
although I do also recall spending far more time than seemed warranted on
"directly proportional to" and "inversely proportional to"-type problems.
(But then again, in our "social studies" class in middle school, there was a
section on how to use a map, and I had a hard time imagining how it was that
anybody didn't already know that -- staring at the map was one of the
diversions of last resort on long car trips, after all. :-) )

Warren

unread,
Mar 25, 2011, 12:13:13 PM3/25/11
to
Joel Koltner expounded in
news:QqQip.769947$Yn5.3...@en-nntp-14.dc1.easynews.com:

> No, I'm saying that while Woz was able to pull off writing
> BASIC in machine language and have it work on the first
> actual test, many people today couldn't write a comparable
> BASIC in an HLL of their choosing and have it similarly
> work on the first go.

I personally don't buy that folklore. What one considers as
"working" varies widely. Perfection is rare in software --
even rarer in machine/assembly language.

Where is this folklore published? I'm a skeptic.

Warren

Joel Koltner

unread,
Mar 25, 2011, 12:56:50 PM3/25/11
to
"Warren" <ve3...@gmail.com> wrote in message
news:Xns9EB37C4FC2713W...@81.169.183.62...

> I personally don't buy that folklore. What one considers as
> "working" varies widely. Perfection is rare in software- even
> rarer in machine/assembly language.
>
> Where is this folklore published? I'm a skeptic.

Woz claims so himself in his book:
http://www.amazon.com/iWoz-Computer-Invented-Personal-Co-Founded/dp/0393330435

Granted, he's obviously not an unbiased source, so you have to decide for
yourself if you believe him or not.

Personally I find the idea that he took Atari's original Breakout video game
from nothing more than a description related to him via Steve Jobs to a
working prototype in four days straight even more fantastical. This claim does
have a lot of references listed in Wikipedia --
http://en.wikipedia.org/wiki/Breakout_%28video_game%29 :

"Jobs noticed his friend Steve Wozniak-employee of Hewlett-Packard-was capable
of producing designs with a small number of chips, and invited him to work on
the hardware design with the prospect of splitting the $750 wage. Wozniak had
no sketches and instead interpreted the game from its description. To save
parts, he had "tricky little designs" difficult to understand for most
engineers. Near the end of development, Wozniak considered moving the high
score to the screen's top, but Jobs claimed Bushnell wanted it at the bottom;
Wozniak was unaware of any truth to his claims. The original deadline was met
after Wozniak did not sleep for four days straight. In the end 50 chips were
removed from Jobs' original design. This equated to a US$5,000 bonus, which
Jobs kept secret from Wozniak, instead only paying him
$375.[1][2][3][4][5][6]"

Although Atari apparently wasn't as impressed:

"Atari was unable to use Steve Wozniak's design. By designing the board with
as few chips as possible, he also cut down the amount of TTL
(transistor-transistor logic) chips to 42. This made the design difficult to
manufacture-it was too compact and complicated to be feasible with Atari's
manufacturing methods. However, Wozniak claims Atari could not understand the
design, and speculates "maybe some engineer there was trying to make some kind
of modification to it". Atari ended up designing their own version for
production, which contained about 100 TTL chips. Wozniak found the gameplay to
be the same as his original creation, and could not find any
differences.[2][3][4][5][7][8]"

I'd be interested to learn how, exactly, it was that cutting down the number
of TTL chips made the design difficult to manufacture... weird...

---Joel

Frank Buss

unread,
Mar 25, 2011, 1:22:16 PM3/25/11
to
Joel Koltner wrote:

> I'd be interested to learn how, exactly, it was that cutting down the
> number of TTL chips made the design difficult to manufacture... weird...

I don't know, but maybe he designed combinational logic paths that were too
long, without intermediate latches, so the gate delay was too long, or the
design was too complicated to create a PCB from (I guess at that time he
created a prototype with a wire-wrap board).

Joel Koltner

unread,
Mar 25, 2011, 1:29:54 PM3/25/11
to
"Frank Buss" <f...@frank-buss.de> wrote in message
news:imij0a$dhj$1...@newsreader5.netcologne.de...

> I don't know, but maybe he designed too long combinational logic without
> tactical latches, so gate delay time was to long, or too complicated to
> create a PCB from it (I guess at this time he created a prototype with a
> wire wrap board).

Hmm, could be. Seymour Cray, as I recall, was fond of pushing combinatorial
logic as far as he could so as to avoid another clock's worth of latency, and
the story goes he occasionally pushed a bit too far and his timing margins
became, um, marginal. :-)

You're correct about the prototype -- the story goes that Woz worked his day
job at HP, spent all night working on Breakout's design, and then Jobs would
spend the day wire-wrapping Woz's design to be tested that evening.

---Joel

Warren

unread,
Mar 25, 2011, 3:52:33 PM3/25/11
to
Joel Koltner expounded in
news:oP3jp.781129$pX3.4...@en-nntp-11.dc1.easynews.com:

> "Warren" <ve3...@gmail.com> wrote in message
> news:Xns9EB37C4FC2713W...@81.169.183.62...
>> I personally don't buy that folklore. What one considers
>> as "working" varies widely. Perfection is rare in
>> software- even rarer in machine/assembly language.
>>
>> Where is this folklore published? I'm a skeptic.
>
> Woz claims so himselef in his book:
> http://www.amazon.com/iWoz-Computer-Invented-Personal-Co-Fou
> nded/dp/0393330435
>
> Granted, he's obviously not an unbiased source, so you have
> to decide for yourself if you believe him or not.

Ah-ha -- a self-made claim. :) At best, he may have had a more
flexible concept of "working". Programmers are not immune to
bragging.

> Personally I find the idea he designed Atari's original
> Breakout video game from nothing more than an idea related
> to him via Steve Jobs to a working prototype in four days
> straight even more fantastical. This claim does have a lot
> of references listed in Wikipedia --
> http://en.wikipedia.org/wiki/Breakout_%28video_game%29 :

..


> I'd be interested to learn how, exactly, it was that
> cutting down the number of TTL chips made the design
> difficult to manufacture... weird...
>
> ---Joel

I don't know much about that, but on the surface it seems
possible. I can imagine designs that have fewer parts but
depend upon behavioural quirks or tricks to get the job done.
Hence the "hard to understand" part.

I also suspect that you can make something appear to work on
paper (or on one breadboard) that would otherwise be a
production nightmare. Things like tolerances on available
parts for timing etc. may make the production yield
infeasible.

But that isn't my field-- only my opinion.

Warren

Dennis

unread,
Mar 25, 2011, 6:23:05 PM3/25/11
to
Frank Buss wrote:
> Joel Koltner wrote:
>
>> I'd be interested to learn how, exactly, it was that cutting down the
>> number of TTL chips made the design difficult to manufacture... weird...
>
> I don't know, but maybe he designed too long combinational logic without
> tactical latches, so gate delay time was to long, or too complicated to
> create a PCB from it (I guess at this time he created a prototype with a
> wire wrap board).
>
Like in the design of the Apple, where he used TTL gate delays to
generate the phase shift for the color? When there were process changes
in the chip manufacture, the delays were less and it no longer worked. In
the end you needed certain manufacturers and date codes to get it to
work. I think they even applied for a patent.

Or maybe the single-shot (rather than counter) timing chains that
changed as components aged? He did some things I wouldn't do even in a
one-off hobby project. But then I'm not rich and famous.

k...@att.bizzzzzzzzzzzz

unread,
Mar 25, 2011, 7:10:38 PM3/25/11
to
On Fri, 25 Mar 2011 08:17:58 -0700, "Joel Koltner"
<zapwireD...@yahoo.com> wrote:

><k...@att.bizzzzzzzzzzzz> wrote in message
>news:btqno6lncbucjvqap...@4ax.com...
>> x86 assembler, but the language was irrelevant. They couldn't do it
>> themselves, much less write pseudo-code to do it.
>
>A certain percentage of them might have been able to pull it off in a HLL if
>not assembly (of the ones who could at least create pseudo-code).

Nope. They couldn't do it themselves or even start to write a flow chart to
do it.

>> We were doing arithmetic and converting between bases (directly, not through
>> base-10) in every base up to 32 (characters got tough after that) when we
>> were
>> in fifth and sixth grades.
>
>That's cool... in retrospect, it does seem a little odd that in my class it
>was just base 5 rather than covering arbitrary bases.

It was kinda funny. My father didn't have a clue what we were doing (couldn't
help with homework) but knew it was important. He was an EE prof (power).

>> Jr. High was a total waste, though.
>
>That's where I was first taught algebra, which was of course quite useful...

First semester was taught in 9th grade. The rest, all subjects, wasn't even
at the level we had in grade school.



>although I do also recall spending far more time than seemed warranted on
>"directional proportional to" and "inversely proportional to"-type problems.
>(But then again, in our "social studies" class in middle school, there was a
>section on how to use a map, and I had a hard time imagining how it was that
>anybody didn't already know that -- staring at the map was one of the
>diversions of last resort on long car trips, after all. :-) )

My wife cannot get the simplest information (left/right) off a map and is
*totally* lost without a GPS. Her father was a high school history teacher
and one time president of the local NEA union (during a strike year, no less).
I turned his daughter into a Republican, quite a bit to the right of me. ;-)
