
Can OO be successful in real-time embedded systems?


Henning Rietz

Apr 10, 1996
To whom it may concern:

For the last couple of weeks I have been involved in a major survey on
the use of object oriented techniques in the area of telecommunications
(mainly in the German speaking region).
I can say "everybody" is using OO in some areas (mainly network
management, switch provisioning, customer care), BUT there are (almost)
no examples in the area of (small) embedded systems, main reasons for
that being:

- "OO systems are too slow"
- "OO systems eat up too much memory"

I believe that this is not necessarily true but heavily related to
experience. Now I'm asking you:

How far "down" does the application of OO really go?
How far will it go in the future?
Who develops commercial(!) embedded real-time systems
using OO methods and languages?
Will OO ever be of major importance in that area?

If you have an opinion concerning these questions please share it with
me! Even those who think they'll "never use that OO-stuff".

Regards,

Henning
--

Henning Rietz c/o
Condat GmbH Telephone: +49.30.39094-179
Alt-Moabit 91D Fax: +49.30.39094-300
10559 Berlin, Germany E-Mail: ri...@condat.de

Dave Baldwin

Apr 10, 1996
Henning Rietz (ri...@condat.de) wrote:

: I can say "everybody" is using OO in some areas (mainly network
: management, switch provisioning, customer care), BUT there are (almost)
: no examples in the area of (small) embedded systems, main reasons for
: that being:

: - "OO systems are too slow"
: - "OO systems eat up too much memory"

Object-dis-oriented programming is (like some others) intended to hide
the hardware from the programmer. How useful can this possibly be when
small embedded systems are expressly for dealing with the hardware? Some
of the techniques can be useful, but the overhead and 'hiding' is exactly
what you don't need in hardware control.

There is no universal programming method. Even the examples you cite are
misleading because they're the 'desk-top / paperwork' end of the software.
I'd bet that the software that operates the networks and switches isn't
done in 'OO' for the same reasons. Last time I looked at the cards in a
telephone network bay, I saw thousands of 8032's doing the hardware
control. There were one or two of them on each interface card in a
network terminal that had tens-of-thousands of telephone lines passing
thru it.

--
-=-=-=-=-=-=-=-=-=-=-=- Check out 'alt.tcj' -=-=-=-=-=-=-=-=-=-=-=-=-=-
Dave Baldwin: dib...@netcom.com | The Computer Journal 1(800)424-8825
DIBs Electronic Design | Home page "http://www.psyber.com/~tcj/"
Voice : (916) 722-3877 | Hands-on hardware and software
TCJ/DIBs BBS: (916) 722-5799 | TCJ/DIBs FAX: (916) 722-7480
-=-=-=-=-=- @#$%^&* I can't even quote myself! Oh,well. -=-=-=-=-=-

Barry Kauler

Apr 11, 1996, to Henning Rietz
Henning Rietz wrote:
>
> For the last couple of weeks I have been involved in a major survey on
> the use of object oriented techniques in the area of telecommunications
> (mainly in the German speaking region).
> I can say "everybody" is using OO in some areas (mainly network
> management, switch provisioning, customer care), BUT there are (almost)
> no examples in the area of (small) embedded systems, main reasons for
> that being:
>
> - "OO systems are too slow"
> - "OO systems eat up too much memory"
>
> I believe, that this is not necessarily true but heavily related to
> experience. Now I'm asking you:

>
> How far "down" does the application of OO really go?
> How far will it go in the future?
> Who develops commercial(!) embedded real-time systems
> using OO methods and languages?
> Will OO ever be of major importance in that area?

Henning,
I think part of the problem is a lack of software tools down at
this end, such as C++.
This is just a wild thought, but I noticed that what is considered
to be the "best" OO language, Eiffel, produces plain old C as output
-- reason is to make it as cross-platform as possible
-- I wonder if that cross-platform capability will extend down
to microcontrollers?
One problem is that C compilers at this level tend to have non-standard
features.
Anyway, it's a thought. I am tempted to buy Eiffel just for
checking it out, as the "Personal Eiffel for Windows" is just
US$69.95 .... BUT, only the full professional version gives
the C output, and I can't remember what that costs.
The address is:
http://www.eiffel.com
Anyone had any Eiffel experience? (unfortunately, that
rhymes with "awful"!)
regards,
Barry Kauler

Steven Perryman

Apr 11, 1996
In article <316BF0...@condat.de> Henning Rietz <ri...@condat.de> writes:

> For the last couple of weeks I have been involved in a major survey on
> the use of object oriented techniques in the area of telecommunications
> (mainly in the German speaking region).

> I can say "everybody" is using OO in some areas (mainly network
> management, switch provisioning, customer care), BUT there are (almost)
> no examples in the area of (small) embedded systems, main reasons for
> that being:

Yeah, everyone is using OO to build TMN management systems, the Alcatel SEL
and Siemens notably in Germany. But Network Elements are being built with OO
on embedded systems. Nokia have done so, and have deployed products. I believe
Siemens in Belgium are building SDH muxes with OO. Alcatel are bound to be
doing so too. I think GPT in the UK also use OO/C++ for their muxes.

> - "OO systems are too slow"
> - "OO systems eat up too much memory"

I'm not sure about that. These issues are valid for all embedded systems IMHO,
and not just those developed using OO techniques. I guess it would depend on
your target platform, RTOS facilities (memory mgmt, IPC etc) , and the vendor
compilers you are using.

OOD for embedded systems seems to be the challenge.
You can come up with a HW platform-free OOA for your system quite easily, but
then taking that thru an OOD without losing any of the OO expressiveness and
also addressing the particular constraints/issues of the target embedded sys,
that is the challenge.


Regards,
Steven Perryman
perr...@cpd.ntc.nokia.com


ra...@ix.netcom.com

Apr 11, 1996
In <dibaldDp...@netcom.com>, dib...@netcom.com (Dave Baldwin) writes:

>Object-dis-oriented programming is (like some others) intended to hide
>the hardware from the programmer. How useful can this possibly be when
>small embedded systems are expressly for dealing with the hardware?

Tremendously. It makes it much easier to add new features, delete unused
ones, and recycle old firmware modules when the hardware changes.
Example: I'm working on a special-purpose PROM for some equipment my
employer manufactures. It has oodles of features (both optional and non-)
and lots of hardware that won't be used in this specific application. I
was able to strip the code I wouldn't be using and reduce the object
file from well over 100K to about 24K in less than 8 hours. There
were a few low-level routines that called other low-level routines for
performance reasons, but more than 90% of that code came out with no more
work than dropping the module's file name from the make file, and removing
its entries in the inter-object message dispatcher and/or software timer
dispatcher tables.
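The table-driven structure described above can be sketched roughly like this; the modules, message IDs, and handler names are all invented for illustration, but the point is the same: removing a module means deleting one table row and one make-file entry.

```cpp
#include <cstdint>

// One handler per firmware module, registered in a single dispatch table.
using Handler = int (*)(std::uint8_t payload);

static int uart_handler(std::uint8_t p)  { return p + 1; }  // stand-in module
static int motor_handler(std::uint8_t p) { return p * 2; }  // stand-in module

struct DispatchEntry { std::uint8_t msg_id; Handler fn; };

// To strip the motor module from a build, delete its row here and drop
// its file from the make file -- no other code changes needed.
static const DispatchEntry dispatch_table[] = {
    { 0x01, uart_handler },
    { 0x02, motor_handler },
};

// Route an inter-object message to whichever module registered for it.
int dispatch(std::uint8_t msg_id, std::uint8_t payload) {
    for (const DispatchEntry& e : dispatch_table)
        if (e.msg_id == msg_id) return e.fn(payload);
    return -1;  // unknown message: that module isn't in this build
}
```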

>I'd bet that the software that operates the networks and switches isn't
>done in 'OO' for the same reasons. Last time I looked at the cards in a
>telephone network bay, I saw thousands of 8032's doing the hardware
>control.

Or, in other words, they used modules with a clearly-defined interface
to hide the low-level work from the high-level code. That's pretty much
the essence of "object-oriented design". If it'd been a snake, it woulda
bit ya ;-)

Ran


Larry Baker

Apr 11, 1996, to Henning Rietz
Henning Rietz wrote:
> - "OO systems are too slow"
> - "OO systems eat up too much memory"

Based on what my Telecom friends have been telling me back in
the US, C++ (and object-oriented techniques) are alive and well
in the switching industry. I know of one company that's implemented
an ATM switch using a complete C++ development environment.

IMHO the biggest impediment to using OO techniques for "hard" RT
work is an understanding of how to apply them in a resource-
intensive environment. The straight "party line" answers don't
always work.

In particular, many people that have been disappointed with
C++ performance seem to lack an understanding of the implications
of implicit calls to constructors/copy-constructors/destructors,
memory fragmentation, or inline vs. non-inline procedure calls.

Then they turn around and blame the language, rather than their
use of it.
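The implicit-copy point is easy to demonstrate; this is a minimal sketch (the `Buffer` type is invented) counting how often the copy constructor runs:

```cpp
// A type that counts its own copies; the copies happen implicitly,
// which is exactly the hidden cost described above.
struct Buffer {
    static inline int copies = 0;           // copy counter (C++17 inline)
    Buffer() {}
    Buffer(const Buffer&) { ++copies; }     // runs on every by-value pass
};

void by_value(Buffer b) { (void)b; }        // invokes the copy constructor
void by_ref(const Buffer& b) { (void)b; }   // no copy, no hidden cost
```

Passing by const reference where a copy isn't needed is the usual cure, and it costs nothing to the caller.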

Cheers,

Larry Baker
l...@sdt.com

John Hunnell P840

Apr 11, 1996

> I'd bet that the software that operates the networks and switches isn't
> done in 'OO' for the same reasons. Last time I looked at the cards in a
> telephone network bay, I saw thousands of 8032's doing the hardware
> control. There were one or two of them on each interface card in a
> network terminal that had tens-of-thousands of telephone lines passing
> thru it.

How much would you like to bet? I am working on my second large
switch product at Nortel using OO with C++. Our product handles
hundreds of lines or trunks. The first product I worked on is
adding OO to a non OO environment while the second is being
designed from the bottom up and using OO extensively. It
is using a PPC603 (and other 32 bit processors), not an 8 bit
processor. I might add that it is using a third party multitasking
OS with many different tasks. Some of the tasks are designed
using various OO tools while others are non-OO. They can coexist
in the same system.

A couple quick impressions of OO in embedded systems:
- Yes the tools are lacking.
- If you think OO slows down your product, then you are not
experienced enough in OO and are poorly designing your product.
If you try to instantiate 300 objects every time you try to
place a phone call, control a motor, etc. something is drastically
wrong with your design. Real-time requirements sometimes require
you to modify your OO design from the "ideal" design, just as they
do with procedural design. Even virtual function calls only
require an extra table lookup to make the method call (can you
say pointer-to-function call in C?).
- I would agree that very small embedded systems may not benefit
from OO. OO is just a tool. The larger the design, the more your
design can benefit from OO. If the design is small enough, OO
may be overcomplicating things that are not very complex but if
you have a large design, it can help manage the complexity.
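The "extra table lookup" point above can be made concrete; class names here are invented, but the mechanism is exactly what C++ virtual calls compile down to -- one vtable lookup plus an indirect call, i.e. a pointer-to-function call in C terms:

```cpp
// An abstract interface and two concrete card types.
struct Device {
    virtual ~Device() {}
    virtual int status() const = 0;  // resolved through the vtable
};

struct Line  : Device { int status() const override { return 1; } };
struct Trunk : Device { int status() const override { return 2; } };

// High-level code dispatches without knowing the concrete card type.
int poll(const Device& d) { return d.status(); }
```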

Eddie Hunnell
--
Eddie Hunnell
Bell Northern Research
hun...@bnr.ca


Ron M. Cole

Apr 11, 1996
Henning Rietz (ri...@condat.de) wrote:
: To whom it may concern:

: For the last couple of weeks I have been involved in a major survey on
: the use of object oriented techniques in the area of telecommunications
: (mainly in the German speaking region).
: I can say "everybody" is using OO in some areas (mainly network
: management, switch provisioning, customer care), BUT there are (almost)
: no examples in the area of (small) embedded systems, main reasons for
: that being:

: - "OO systems are too slow"
: - "OO systems eat up too much memory"

: I believe, that this is not necessarily true but heavily related to
: experience. Now I'm asking you:

: How far "down" does the application of OO really go?
: How far will it go in the future?
: Who develops commercial(!) embedded real-time systems
: using OO methods and languages?
: Will OO ever be of major importance in that area?

There was a good talk/paper at OOPSLA last fall about a real-time telecom
switch system built in Smalltalk on top of the pSOS+ real-time OS. The
speaker said that it was by far his most productive project yet, with the
lowest level of defects (his early designs had been done in C and C++). It
was a VME-based system running on something like up to 100 68040-based
switch cards. And yes, they did have to spend some time timing the code and
being careful of what they did, but it sounded like a reasonable amount of
effort given the result.

The paper is "Implementing a Real-Time, Embedded, Telecommunication
Switching System in Smalltalk" by John Radford.


--
Ron Cole e-mail: co...@spk.hp.com
Hewlett Packard
Spokane Division Bell: 509-921-3839
24001 E Mission Ave
Liberty Lake, WA 99019-9599

Robert C. Martin

Apr 11, 1996
In article <316BF0...@condat.de> Henning Rietz <ri...@condat.de> writes:

> For the last couple of weeks I have been involved in a major survey on
> the use of object oriented techniques in the area of telecommunications
> (mainly in the German speaking region).
> I can say "everybody" is using OO in some areas (mainly network
> management, switch provisioning, customer care), BUT there are (almost)
> no examples in the area of (small) embedded systems, main reasons for
> that being:
>
> - "OO systems are too slow"
> - "OO systems eat up too much memory"

More than one of my clients are using OO and C++ in embedded,
multi-threaded, real time systems, and they are getting along quite
nicely. The systems I am talking about are extremely constrained.
There are dozens of real time tasks with millisecond response times.
There is a very limited amount of memory, etc. They are concerned
for every wasted microsecond. Yet they are finding that C++ and OOD
are more than adequate to the task.

What is it that would make OO slow? Some people contend that it is
the time required for polymorphic dispatch (i.e. figuring out which
method to call when a message is received). In C++ this is very, very
fast. Indeed, I recently benchmarked a 486-33 using a popular
compiler and found that the polymorphic dispatch time was 140ns.

Moreover, polymorphic dispatch in OO applications replaces
if-else or switch statements in equivalent procedural applications.
So the comparison is probably moot. Neither is faster or slower than
the other given a decent compiler.

As to memory, C++ requires a bit more memory for managing the virtual
tables. This amounts to one pointer per object, and one virtual table
per class. Each virtual table contains one pointer per virtual
function and probably a few other bytes for miscellaneous stuff.
These tables can be placed in ROM.

However, these tables and the virtual pointers replace switch/case
tables or if/else code that would exist in the procedural counterpart.
So the difference is probably moot.
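The per-object cost described above is visible with `sizeof`; this sketch (type names invented) shows that the only difference between a plain type and a polymorphic one is the hidden vtable pointer each polymorphic object carries, while the per-class table itself is shared:

```cpp
#include <cstddef>

// Identical payload; WithVtbl additionally carries one vtable pointer
// per object. The class's table is shared and can often live in ROM.
struct Plain    { int x; };
struct WithVtbl { int x; virtual ~WithVtbl() {} };
```

On a typical 32-bit target the per-object overhead is one pointer (plus any alignment padding), exactly as the post states.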

> How far "down" does the application of OO really go?

As far down as you like. C++ code in interrupt heads is not out of
the question.

> How far will it go in the future?

I anticipate no lower bound.

> Who develops commercial(!) embedded real-time systems
> using OO methods and languages?

If you would like to contact me, I will ask my clients if they would
be willing to share their experiences with you.

> Will OO ever be of major importance in that area?

It already is.
--
Robert Martin | Design Consulting | Training courses offered:
Object Mentor Assoc.| rma...@oma.com | OOA/D, C++, Advanced OO
14619 N. Somerset Cr| Tel: (847) 918-1004 | Mgt. Overview of OOT
Green Oaks IL 60048 | Fax: (847) 918-1023 | http://www.oma.com


Robert C. Martin

Apr 11, 1996
In article <316D1D...@cowan.edu.au> Barry Kauler <b.ka...@cowan.edu.au> writes:

> This is just a wild thought, but I noticed that what is considered
> to be the "best" OO language, Eiffel, produces plain old C as output

I can't let that one slip. Eiffel is considered "by some" to be the
"best" OOPL. Others rather like Java. Still others are sworn to
uphold Objective-C. And then there are those of us who think C++ is
somewhat usable.

Robert C. Martin

Apr 11, 1996
In article <dibaldDp...@netcom.com> dib...@netcom.com (Dave Baldwin) writes:

> Object-dis-oriented programming is (like some others) intended to hide
> the hardware from the programmer.

This is not quite correct. The intention of OO is not to hide the
hardware from the programmer. The intention of OO is to provide tools
to the programmer whereby he can manipulate the hardware at varying
levels of abstraction. If he wants to twiddle the bits, he can
go right ahead and do so, even in OO. If he would rather deal at a
higher level of abstraction, he can use OO to create that level.

OO is a tool, not a religion, and not a philosophy.

> How useful can this possibly be when
> small embedded systems are expressly for dealing with the hardware? Some
> of the techniques can be useful, but the overhead and 'hiding' is exactly
> what you don't need in hardware control.

Incorrect. The overhead is minimal (arguably zero), and if the
engineer chooses to hide something, he must feel there is something to
hide. Example: When controlling a modem in order to dial a phone
number, one could twiddle the bits every time you need to dial, or one
can hide the bit twiddling in a function, and call the function
whenever you need to dial.

If you have two different kinds of modems, you could always check a
flag to make sure you are calling the right function, or you could
create an OO interface so that you don't care which type of modem you
are controlling.
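That modem interface might be sketched as follows; the class names and command strings here are invented for illustration (only `ATDT` is a real Hayes dial command), but the shape matches the argument: the caller never checks a flag.

```cpp
#include <string>

// One abstract interface; each modem type hides its own bit twiddling.
struct Modem {
    virtual ~Modem() {}
    virtual std::string dial(const std::string& number) = 0;
};

struct HayesModem : Modem {
    std::string dial(const std::string& n) override { return "ATDT" + n; }
};

struct OtherModem : Modem {
    std::string dial(const std::string& n) override { return "DIAL " + n; }
};

// Written once against the interface; works for any modem type.
std::string place_call(Modem& m, const std::string& number) {
    return m.dial(number);
}
```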

> There is no universal programming method.

Granted.

> I'd bet that the software that operates the networks and switches isn't
> done in 'OO' for the same reasons.

Actually, a lot of it is. I have clients in the telecom industry who
are using OO/C++ in their switches and network managers, etc.

> Last time I looked at the cards in a
> telephone network bay, I saw thousands of 8032's doing the hardware
> control. There were one or two of them on each interface card in a
> network terminal that had tens-of-thousands of telephone lines passing
> thru it.

I don't know about 8032's. However, some of my clients are using
C++/OOD in motorola based microcontrollers (68000 based)....

Tim Dugan

Apr 11, 1996
In article <RMARTIN.96...@rcm.oma.com>,
Robert C. Martin <rma...@oma.com> wrote:
>In article <316BF0...@condat.de> Henning Rietz <ri...@condat.de> writes:
>[...]

> - "OO systems are too slow"
> - "OO systems eat up too much memory"
>[...]
>
>What is it that would make OO slow? [...]

>
>As to memory, C++ requires a bit more memory for managing the virtual
>tables. [...]

Although I have no figures or measurements, I suspect that the one
area where C++ is slower is memory management: something about C++
encourages programmers to perform a great deal more allocation and
de-allocation of memory, causing memory fragmentation and slowing
the allocation/deallocation process.

This is partially a problem of style, using pointers and
performing "new" when a non-pointer will work. As classes
are constructed of classes which are constructed...etc...
a constructor call can cause numerous allocations of small
bits of memory.

I know that some groups try to restrict real time software in
Ada to not allocating heap space but only stack space. I suppose
you could do this in C++, too. But, generally, that doesn't
seem necessary.
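The stack-only restriction mentioned above looks like this in C++; the `Packet` type is invented, and the two functions do the same work with and without the allocator involved:

```cpp
#include <memory>

struct Packet { int len; };

int process_with_heap() {
    std::unique_ptr<Packet> p(new Packet{64});  // heap: can fragment
    return p->len;
}

int process_on_stack() {
    Packet p{64};  // automatic storage: no allocation, no fragmentation
    return p.len;
}
```

In a hard real-time loop the stack version also has constant cost, whereas the heap version's cost depends on allocator state.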

-t
--
Tim Dugan
mailto:ti...@starbase.neosoft.com
http://starbase.neosoft.com/~timd

Robert C. Martin

Apr 11, 1996

> OOD for embedded systems seems to be the challenge. You can come
> up with a HW platform-free OOA for your system quite easily, but
> then taking that thru an OOD without losing any of the OO
> expressiveness and also addressing the particular
> constraints/issues of the target embedded sys, that is the
> challenge.

But not a particularly difficult challenge. Probably the hardest part
is finding the appropriate cross tools. These tools *do* exist for
C++, but they are not plentiful yet.

However, once you have an acceptable cross environment, creating an OO
solution for an embedded real-time problem is no more challenging than
creating a procedural solution to the same problem. And it comes with
the traditional benefits of OO: maintainability and reusability.

Dave Baldwin

Apr 11, 1996
ra...@ix.netcom.com wrote:

: Tremendously. It makes it much easier to add new features, delete unused
: ones, and recycle old firmware modules when the hardware changes.
: Example: I'm working on a special-purpose PROM for some equipment my
: employer manufactures. It has oodles of features (both optional and non-)
: and lots of hardware that won't be used in this specific application. I
: was able to strip the code I wouldn't be using, and reduce the object
: file from well over 100K, to about 24K, in less than 8 hours. There
: were a few low-level routines that called other low-level routines for
: performance reasons, but more than 90% of that code came out with no more
: work than dropping the module's file name from the make file, and removing
: its entries in the inter-object message dispatcher and/or software timer
: dispatcher tables.

: Or, in other words, they used modules with a clearly-defined interface
: to hide the low-level work from the high-level code. That's pretty much
: the essence of "object-oriented design". If it'd been a snake, it woulda
: bit ya ;-)

That's very nice, but it sounds to me like you're confusing modularity
with 'object-oriented'. They are not the same. I can do (and have done)
what you're describing with the assembly language libraries I've
written. Comment out a few macros and includes and I have a 'new'
program. No sense in writing everything from scratch every time.

You would be hard-pressed to put a 100k binary into an 8032 application.
It would require special bank-switching hardware external to the 8032
since it only has a code space of 64k. At that point, most would go to a
different CPU. Also, the 8032 is limited to about 128 bytes of stack
because you have to use the chip's internal memory which is 256 bytes
total. This memory space also includes the chip's registers and on-chip
I/O control.

Jon S Anthony

Apr 11, 1996
In article <316BF0...@condat.de> Henning Rietz <ri...@condat.de> writes:

I would believe that Ada folk have more to say on this than most. So,
I am crossing it over to c.l.a too...

/Jon


> To whom it may concern:
>

> For the last couple of weeks I have been involved in a major survey on
> the use of object oriented techniques in the area of telecommunications
> (mainly in the German speaking region).
> I can say "everybody" is using OO in some areas (mainly network
> management, switch provisioning, customer care), BUT there are (almost)
> no examples in the area of (small) embedded systems, main reasons for
> that being:
>

> - "OO systems are too slow"
> - "OO systems eat up too much memory"
>

> I believe, that this is not necessarily true but heavily related to
> experience. Now I'm asking you:
>

> How far "down" does the application of OO really go?

> How far will it go in the future?

> Who develops commercial(!) embedded real-time systems
> using OO methods and languages?

> Will OO ever be of major importance in that area?
>

> If you have an opinion concerning these questions please share it with
> me! Even those who think they'll "never use that OO-stuff".
>
> Regards,
>
> Henning
> --
>
> Henning Rietz c/o
> Condat GmbH Telephone: +49.30.39094-179
> Alt-Moabit 91D Fax: +49.30.39094-300
> 10559 Berlin, Germany E-Mail: ri...@condat.de

--
Jon Anthony
Organon Motives, Inc.
1 Williston Road, Suite 4
Belmont, MA 02178

617.484.3383
j...@organon.com


Ell

Apr 12, 1996
Robert C. Martin (rma...@oma.com) wrote:
: dib...@netcom.com (Dave Baldwin) writes:
: This is not quite correct. The intention of OO is not to hide the
: hardware from the programmer. The intention of OO is to provide tools
: to the programmer whereby he can manipulate the hardware at varying
: levels of abstraction. If he wants to twiddle the bits, he can
: go right ahead and do so, even in OO. If he would rather deal at a
: higher level of abstraction, he can use OO to create that level.

: OO is a tool, not a religion, and not a philosophy.

OO"T" is a tool, OO should NOT be a religion, but there IS a "philosophy"
(or more accurately a philosophical viewpoint) beneath the most efficient
and "intuitive" OO, in my opinion. I have spoken to this philosophy on
the Usenet since 1990 (comp.object, and comp.lang.c++), Booch does so in
the early chapters of OOA&D, and Whitmire has done so recently here in
comp.object.

Elliott

Ell

Apr 12, 1996
Robert C. Martin (rma...@oma.com) wrote:
: dib...@netcom.com (Dave Baldwin) writes:

: Object-dis-oriented programming is (like some others) intended to hide
: the hardware from the programmer.

: This is not quite correct. The intention of OO is not to hide the
: hardware from the programmer. The intention of OO is to provide tools
: to the programmer whereby he can manipulate the hardware at varying
: levels of abstraction.

This is possible using Structured Analysis, Design, and Programming (SADP),
though not generally polymorphically, as with OO. The intention of
Simula, the historically acknowledged first OOPL (as I understand it), was
to simulate, or model, a fleet ship distribution network. And in doing
so, Simula processed information which was useful to the ship fleet
clients, and owners.

Elliott

Jens Coldewey

Apr 12, 1996
Steven Perryman wrote:

> ...Yeah, everyone is using OO to build TMN management systems, the Alcatel
> SEL and Siemens notably in Germany...

Well, Siemens uses C++, but in a classical C/S architecture. Clients and
database servers are HP-UX (at least they were two years ago when I was
involved). The switches are connected as 'legacy' systems using either the
standard MML language or a 'Q3 interface' that is defined by Deutsche
Telekom. The database is a relational database. I think that limits the
OO statement to a certain extent.

As far as I know Alcatel does it the same way. To my knowledge, both
still use CHILL to program the switches.

BTW most of the TMN software was written by Siemens Austria in Vienna.

Jens

--
Jens Coldewey |s |d &|m | software design & management gmbh&CoKG
| | | | Thomas-Dehler-Str. 27
jens.c...@sdm.de | | | | 81737 Munich, Germany.

Roger Barnett

Apr 12, 1996

We have customers using our Object Request Broker on networked
embedded systems running pSOS and OS/9 (amongst others), so I
suspect the answer to the question in the thread title is yes.

--
Roger Barnett
Object Oriented Technologies Ltd, Leamington Spa, England
email: ro...@oot.co.uk OR ro...@natron.demon.co.uk
Web: http://www.octacon.co.uk/onyx/external/oot.co.uk

Steven Perryman

Apr 12, 1996
In article <RMARTIN.96...@rcm.oma.com> rma...@oma.com (Robert C. Martin) writes:

>> OOD for embedded systems seems to be the challenge.

> But not a particularly difficult challenge.

I wouldn't make such a sweeping statement as that. :-)

> Probably the hardest part is finding the appropriate cross tools. These
> tools *do* exist for C++, but they are not plentiful yet.

> However, once you have an acceptable cross environment, creating an OO
> solution for an embedded real-time problem is no more challenging than
> creating a procedural solution to the same problem

I think in more general terms than just C++ tools. An environment, IMHO,
encompasses more than mere compilers.

For example :

What is your distribution mechanism ?? Your persistence mech ??
And so on.

I could for example use CORBA and/or ODMG ODL etc to completely abstract these
issues. But then, are they supported by vendor products on the target
platforms ??

Could you write your own if needed ?? Maybe.
Can it even be done on the target platform ?? Maybe.

These issues seem to become more pressing on embedded systems than on, say,
UNIX platforms (especially wrt getting off-the-shelf vendor products).


Regards,
Steven Perryman
perr...@cpd.ntc.nokia.com

Bhargav P. Upender

Apr 12, 1996
> Anyone had any Eiffel experience? (unfortunately, that
> rhymes with "awful"!)

I have "Personal Eiffel for Windows". I am not too impressed with the
maturity of the product.
* It is a memory hog. You need at least 16M to run it on Windows.
* The application that I have developed runs slowly (the personal version
does not have an optimizer).
* I wouldn't use it for embedded systems yet!

The professional version might be better, but it's more money.

I like the language: especially the pre/post conditions, which can help in
reducing SW errors. These enable the "programming by contract" method.
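Eiffel's require/ensure clauses can be roughly approximated in C++ with assertions (a crude stand-in: Eiffel checks contracts natively and can compile them out of production builds); the withdraw example and its conditions are invented for illustration:

```cpp
#include <cassert>

// Contract-style checks: callers must satisfy the precondition, and
// the function promises the postcondition to its callers.
int withdraw(int balance, int amount) {
    assert(amount > 0 && amount <= balance);  // precondition  ("require")
    int result = balance - amount;
    assert(result >= 0 && result < balance);  // postcondition ("ensure")
    return result;
}
```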

I'd like to hear other opinions.

Bhargav Upender
My opinions only!

Steve Lee

Apr 12, 1996, to Dave Baldwin
>
> That's very nice, but it sounds to me like you're confusing modularity
> with 'object-oriented'. They are not the same. I can do (and have done)
> what you're describing with the assembly language libraries I've
> written. Comment out a few macros and includes and I have a 'new'
> program. No sense in writing everything from scratch every time.
>

I think there is a little difference between "comment out a few macros
and includes" and the power of polymorphism and abstraction.

> You would be hard-pressed to put a 100k binary into a 8032 application.
> It would require special bank-switching hardware external to the 8032
> since it only has a code space of 64k. At that point, most would go to a
> different CPU. Also, the 8032 is limited to about 128 bytes of stack
> because you have to use the chip's internal memory which is 256 bytes
> total. This memory space also includes the chip's registers and on-chip
> I/O control.
>

Doesn't the title of this thread hint at real-time embedded systems,
and not just the 8032?

--
Steve Lee
Computer Engineering/Computer Science
Iowa State University
email -> sj...@iastate.edu
WWW -> http://www.cs.iastate.edu/~sjlee/homepage.html

Steve Lee

Apr 12, 1996, to Larry Baker
>
> In particular, many people that have been disappointed with
> C++ performance seem to lack an understanding of the implications
> of implicit calls to constructors/copy-constructors/destructors,
> memory framentation, or inline vs. non-inline procedure calls.
>
> Then they turn around and blame the language, rather than their
> use of it.
>

Exactly.

Lee Webber

Apr 12, 1996
(I have quoted more than usual of the previous postings because I
am adding comp.lang.eiffel to the newsgroups, while deleting
comp.lang.c++.)

Barry Kauler <b.ka...@cowan.edu.au> wrote:


> Henning Rietz wrote:
> >
> > For the last couple of weeks I have been involved in a major survey on
> > the use of object oriented techniques in the area of telecommunications
> > (mainly in the German speaking region).
> > I can say "everybody" is using OO in some areas (mainly network
> > management, switch provisioning, customer care), BUT there are (almost)
> > no examples in the area of (small) embedded systems, main reasons for
> > that being:
> >
> > - "OO systems are too slow"
> > - "OO systems eat up too much memory"
> >
> > I believe, that this is not necessarily true but heavily related to
> > experience. Now I'm asking you:
> >
> > How far "down" does the application of OO really go?
> > How far will it go in the future?
> > Who develops commercial(!) embedded real-time systems
> > using OO methods and languages?
> > Will OO ever be of major importance in that area?

> > Henning,
> I think part of the problem is a lack of software tools down at
> this end, such as C++.

> This is just a wild thought, but I noticed that what is considered
> to be the "best" OO language, Eiffel,

Agreed.

> produces plain old C as output

> -- reason is to make it as cross-platform as possible
> -- I wonder if that cross-platform capability will extend down
> to microcontrollers?
> One problem is that C compilers at this level tend to have non
> standard features.
> Anyway, it's a thought. I am tempted to buy Eiffel just for
> checking it out, as the "Personal Eiffel for Windows" is just
> US$69.95 .... BUT, only the full professional version gives
> the C output, and I can't remember what that costs.
> The address is:
> http://www.eiffel.com

> Anyone had any Eiffel experience? (unfortunately, that
> rhymes with "awful"!)

> regards,
> Barry Kauler

Thank you for pressing my hot button. :-)

Yes, Eiffel should be ideal for embedded systems, for the reasons
given above. Unfortunately, the big 3 Eiffel vendors, while all
emitting C from their compilers, all interface to proprietary run-
time systems. Furthermore, their run-time systems all seem to be
platform-dependent.

You can get an Eiffel release from someone for just about any personal
computer or workstation and every major operating system you can name,
from DOS (Eon, SIG) to Windows (just about everyone) to various forms
of Unix (universal), to Mac, OS/2 and even (I believe) Intel/Next.
But that's as far as it goes; you can't use Eiffel for any environment
that doesn't have a user interface (this is a heuristic, not a causal
relationship), or that doesn't have a really large user base. To the
best of my knowledge, none of the Eiffel implementations has a garbage
collector that has even soft real-time characteristics on a slow
processor.

I have had on my wish list for some time an Eiffel compiler that would
emit C code that interfaced to an RTS in a *publicly defined* way;
this would allow the RTS to be implemented for any platform. I have
made moves toward writing such a compiler, but it's just too big a
job and I haven't the time (or possibly the skills).

It's my opinion that, if available, such an Eiffel would eat Ada alive
in the real-time arena. But why dream...

Sid Johnson

Apr 12, 1996, 3:00:00 AM4/12/96
to
Henning Rietz wrote:
>

>
> - "OO systems are too slow"
> - "OO systems eat up too much memory"
>
> I believe, that this is not necessarily true but heavily related to
> experience. Now I'm asking you:
>
> How far "down" does the application of OO really go?
> How far will it go in the future?
> Who develops commercial(!) embedded real-time systems
> using OO methods and languages?
> Will OO ever be of major importance in that area?
>
>
>

The trick is to remember that C++ is just doing what you would like to do in C, but
doing it with better-looking source code. It provides modularity and extensibility without
cluttering source code with pointers, and it avoids global data without the call overhead of "get"
functions (inline fns). These are great maintainability advantages.

As for being a resource hog, that is a matter of self-discipline. If you avoid exceptions
and templates, are reasonable with inheritance, and hit a happy medium on what you call an
object, there is very little overhead. In addition, for "single-instance" objects, you can make
the data and fns static and avoid the "this" pointer. Even with virtual fns, if the objects are
named at compile time instead of run-time, the virtual table is avoided altogether.
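A small sketch of the two points above (all class names are hypothetical, not from the original post): static members give a "single-instance object" with no `this` pointer, and a virtual function called through a concrete object, rather than through a base pointer, can be bound at compile time.

```cpp
// Single-instance "object": all members static, so calls carry no `this`
// pointer and the compiler resolves them directly.
class Uart {
public:
    static void init()       { count_ = 0; }
    static void put(char)    { ++count_; /* a real driver would touch HW here */ }
    static int  bytes_sent() { return count_; }
private:
    static int count_;
};
int Uart::count_ = 0;

// A virtual interface costs a vtable lookup only when the call goes through
// a base pointer or reference; through a concrete object the compiler may
// bind the call statically.
struct Sink {
    virtual void write(char) = 0;
    virtual ~Sink() {}
};
struct CountingSink : Sink {
    int n;
    CountingSink() : n(0) {}
    void write(char) override { ++n; }  // called via `CountingSink s;` -> static binding
};
```

Whether the compiler actually devirtualizes the call depends on the implementation, but the language permits it whenever the dynamic type is known.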

We are currently reengineering two existing products (similar but different) into one product
line using C++. These products are characterized by several choices of optional peripherals, and
very complex interactions of internal features. We have found C++ to be just the tool for
creating an architecture which simplifies the view of this software and gives the flexibility to
mix and match peripherals.

As for tools, there seems to be nothing below 32-bit except in the x86 family. This is because
the Microsoft and Borland compilers support it. Several other vendors provide linkers, locaters,
and debuggers to support these compilers. With this approach, you can go as low as the x86
family goes. As for OOA/D, the Rose tool supports these compilers.

I think the industry is really missing an opportunity to turn out much better software without
serious performance degradation -- even in the 8/16-bit market. :-)


--
__________________________________________________________________
Sid sjoh...@vantek.net

Come visit @ the Philosopher's Corner - http://www.vantek.net/pages/sjohnson/

In thinking, keep to the simple. In conflict, be fair and generous.
In governing, don't try to control. In work do what you enjoy.
In family life, be completely present.
Lao-Tzu
___________________________________________________________________

Brad Rodriguez

Apr 14, 1996, 3:00:00 AM4/14/96
to
Dave Baldwin wrote:
> Henning Rietz (ri...@condat.de) wrote:[snip]
>
> : - "OO systems are too slow"
> : - "OO systems eat up too much memory"[snip]

> Object-dis-oriented programming is (like some others) intended to hide
> the hardware from the programmer. How useful can this possibly be when

> small embedded systems are expressly for dealing with the hardware?

OK, Dave, I now know what my next article for TCJ will be. I've started
adding object-oriented extensions to Forth for my current project, a
distributed control system using relatively small embedded controllers
(68HC16s). It is neither a memory nor a CPU hog; dynamic binding takes
something like five machine instructions. (A similar scheme was
described in a recent ACM SIGPLAN Notices.) It's also far from mature;
e.g., I've neglected encapsulation because I can work without it for the
time being. But I'm sure it will fit in an 8051. :-)
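For readers who don't speak Forth, here is a rough sketch in C++ (all names hypothetical) of what such lightweight dynamic binding can compile down to: each object carries a class index into a table of function pointers, so dispatch is one load plus an indirect call, a handful of machine instructions on most CPUs.

```cpp
// Hand-rolled late binding, roughly what a small OO Forth might generate.
struct Obj { int cls; int value; };        // first field selects the class

typedef int (*Method)(Obj*);               // one "message" per table slot

static int twice(Obj* o)  { return 2 * o->value; }
static int square(Obj* o) { return o->value * o->value; }

// One method table; a fuller system would have one table per message.
static Method method_table[2] = { twice, square };

// Dispatch: index the table by class, call indirectly.
inline int send(Obj* o) { return method_table[o->cls](o); }
```

This trades the generality of full vtables for a dispatch path that is easy to audit on a small controller.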

(You can hear about the entire project at the Rochester Forth Conference
this June. Advt.)

On the philosophical side...I've adopted OOP because it was the right
tool to solve the particular problems I'm facing. It's not always the
right tool. "If all you have is a hammer, everything looks like a nail."

(This message cross-posted to comp.lang.forth, and some inappropriate
cross-postings deleted.)
--
Brad Rodriguez b...@headwaters.com Computers on the Small Scale
Contributing Editor, The Computer Journal... http://www.psyber.com/~tcj
Director, Forth Interest Group........... http://www.forth.org/fig.html
1996 Rochester Forth Conference: June 19-22 in Toronto, Ontario
http://maccs.dcss.mcmaster.ca/~ns/96roch.html

Zsoter Andras

Apr 15, 1996, 3:00:00 AM4/15/96
to
>Dave Baldwin wrote:
>> Henning Rietz (ri...@condat.de) wrote:[snip]
>>
>> : - "OO systems are too slow"
>> : - "OO systems eat up too much memory"[snip]
>> Object-dis-oriented programming is (like some others) intended to hide
>> the hardware from the programmer. How useful can this possibly be when
>> small embedded systems are expressly for dealing with the hardware?

Well, I am not doing embedded systems, but my OOF (and DOOF) is built
on a VERY FAST OOP implementation.
My paper about it is in the coming(?) issue of FD.
In my system a late-bound call or a field access takes the same time
as an ordinary call or a global variable access (at least on a 486
CPU -- of course caches can mess things up).
The only thing that takes extra time is changing the object-in-use, but
even that one is not too costly.

On CPUs with more restricted addressing modes it is not 100% true,
but the penalty should be quite low.
If you can afford to use Forth instead of assembly, you can afford
OOP.

Andras


Matt Kennel

Apr 15, 1996, 3:00:00 AM4/15/96
to
Larry Baker (l...@sdt.com) wrote:

: Henning Rietz wrote:
: > - "OO systems are too slow"
: > - "OO systems eat up too much memory"

: Based on what my Telecom friends have been telling me back in


: the US, C++ (and object-oriented techniques) are alive and well
: in the switching industry. I know of one company that's implemented
: an ATM switch using a complete C++ development environment.

: IMHO the biggest impediment to using OO techniques for "hard" RT
: work is an understanding of how to apply them in a resource-
: intensive environment. The straight "party line" answers don't
: always work.

: In particular, many people that have been disappointed with


: C++ performance seem to lack an understanding of the implications
: of implicit calls to constructors/copy-constructors/destructors,
: memory fragmentation, or inline vs. non-inline procedure calls.

: Then they turn around and blame the language, rather than their
: use of it.

I think it's fair to assign some of the blame to the language when
alternative languages of equal capability do not have such tricky
implicit semantic issues.

More mature fields of engineering do not blame humans for being human.

They strive to create technology and creative, clever, and profound
design which adapts to humans and serves their needs. (Have you ever used
those new guillotine-style bagel slicers? Safe, quick and easy. Would
you blame people for being incompetent at evenly slicing bagels with a knife?)

Would you blame people for being stupid for not knowing the complex and
subtle implicit rules in the tax code? Or would you consider the tax
code ill-designed?

Unlike tax law, you should not need an act of Congress to change your
programming language.

(happy April 15th!)


: Cheers,

: Larry Baker
: l...@sdt.com

Matt Kennel

Apr 15, 1996, 3:00:00 AM4/15/96
to
Bhargav P. Upender (ba...@utrc.utc.com) wrote:
: > Anyone had any Eiffel experience? (unfortunately, that
: > rhymes with "awful"!)

: I have "Personal Eiffel for Windows". I am not too impressed with the

: maturity of the product.
: * It is a memory hog. You need at least 16M to run it on Windows.
: * The application that I have developed runs slow (personal version does
: not have optimizer).
: * I wouldn't use it for embedded systems yet!

: The professional version might be better, but its more money.

Personal Eiffel for Windows is an interpreter. It will be
much slower than the professional version.

16 MB of memory is not very much.

: I like the language: especially the pre/post conditions that can help in

Dave Baldwin

Apr 15, 1996, 3:00:00 AM4/15/96
to
Brad Rodriguez (b...@headwaters.com) wrote:
: OK, Dave, I now know what my next article for TCJ will be. I've started
: adding object-oriented extensions to Forth for my current project, a
: distributed control system using relatively small embedded controllers
: (68HC16s). It is neither a memory nor a CPU hog; dynamic binding takes
: something like five machine instructions. (A similar scheme was
: described in a recent ACM SIGPlan Notices.) It's also far from mature;
: e.g., I've neglected encapsulation because I can work without it for the
: time being. But I'm sure it will fit in an 8051. :-)

It will be interesting to hear about 'OO' at a lower or smaller level
than C++ or other 'big' machine languages.

Peter Hermann

Apr 16, 1996, 3:00:00 AM4/16/96
to
Zsoter Andras (h929...@hkuxa.hku.hk) wrote:
: Well, I am not doing embedded systems, but my OOF (and DOOF) is built

BTW, the German word "DOOF" means "stupid". ;-)

--
Peter Hermann Tel:+49-711-685-3611 Fax:3758 p...@csv.ica.uni-stuttgart.de
Pfaffenwaldring 27, 70569 Stuttgart Uni Computeranwendungen
Team Ada: "C'mon people let the world begin" (Paul McCartney)

Jack Campin

Apr 16, 1996, 3:00:00 AM4/16/96
to

Brad Rodriguez <b...@headwaters.com> writes:
Dave Baldwin wrote:
> Henning Rietz (ri...@condat.de) wrote:[snip]
>> - "OO systems are too slow"
>> - "OO systems eat up too much memory"[snip]
> Object-dis-oriented programming is (like some others) intended to hide
> the hardware from the programmer. How useful can this possibly be when
> small embedded systems are expressly for dealing with the hardware?

A non-Forth example from several years back: the system software for one
of the more successful deep-space probes was done by Chorus in C++ (this
somewhat before C++ took over the universe); this stuff was somehow related
to their semi-OO microkernel. The dynamic linking meant they could download
modules and install them into the running system from tens of millions of
miles away, and this couldn't have been a large-memory system. I'd have
thought an OO Forth would have been a lot easier, but I don't think there
was a mature one back then.

Which makes me wonder: has Forth made it into space yet?

-----------------------------------------------------------------------------
Jack Campin ja...@purr.demon.co.uk
T/L, 2 Haddington Place, Edinburgh EH7 4AE, Scotland (+44) 131 556 5272
-------------------- FERMAN PADiSAHIN, DAGLAR BiZiMDiR --------------------


Ralph Hibbs

Apr 16, 1996, 3:00:00 AM4/16/96
to
Hello All Shlaer-Mellor Method Enthusiasts and OO Novices,

The report titled "Shlaer-Mellor Method: The OOA96 Report" is available
for free downloading at the Project Technology web site
(http://www.projtech.com).

This report is an extension the Shlaer-Mellor OOA Method, based on an
additional 5 years of real-world experience by Sally Shlaer, Stephen J.
Mellor and co-collaborator Neil Lang. Over the past 5 years the method
has been successfully applied to thousands of projects. This widespread
application surfaced some minor clarification and method enhancements.
These have been captured in the OOA96 Report.

Project Technology, home of methodologists Sally Shlaer and Stephen J.
Mellor, is proud to offer this exciting report via the internet. We
want to make our method available to the widest possible audience in the
most efficient manner.


----------------- Home of the Shlaer-Mellor Method --------------------
Ralph Hibbs Tel: (510) 845-1484
Director of Marketing Fax: (510) 845-1075
Project Technology, Inc. URL: http://www.projtech.com
Berkeley, CA 94710


Elizabeth Rather

Apr 16, 1996, 3:00:00 AM4/16/96
to
ja...@purr.demon.co.uk (Jack Campin) wrote:

>
>Which makes me wonder: has Forth made it into space yet?
>

Yes! A list compiled by folks at NASA/GSFC a few years ago listed over 40
space projects (including Shuttle experiments and satellites) coded in Forth.
There have presumably been more since. Some are described in our web site
(addr. below). NASA is presently developing a whole generation of systems for
interfacing "guest" payloads to the shuttle computing systems based on the
RTX2000 and programmed in Forth.

Elizabeth D. Rather
FORTH, Inc. Products and services for
111 N. Sepulveda Blvd. professional Forth programmers
Manhattan Beach, CA 90266 since 1973. See us at:
310-372-8493/fax 318-7130 http://home.earthlink.net/~forth/

Robert C. Martin

Apr 16, 1996, 3:00:00 AM4/16/96
to
In article <4kjfrh$2...@Starbase.NeoSoft.COM> ti...@Starbase.NeoSoft.COM (Tim Dugan) writes:

   Although I have no figures or measurements, I would have to say
   that I suspect that the one area where C++ is slower is that
   there is something about C++ that encourages programmers to
   perform a great deal more allocation and de-allocation of
   memory, causing memory fragmentation and slowing the allocation/
   deallocation process.

There is nothing about C++ that encourages programmers to perform
a great deal more allocation and de-allocation of memory. Some
popular styles advocate this, but they advocate it in C++ as well as
other languages.

Also, if you use a programming style that is heavily weighted towards
dynamic memory allocation, you can prevent fragmentation and heap
delays by using deterministic heaps.
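One common way to get deterministic heap behaviour in C++ is a class-specific fixed-block pool: allocation pops a node off a free list and deallocation pushes it back, so there is no fragmentation and both operations take constant time. This is only a sketch; the `Message` class, pool size, and layout are hypothetical, not anything from the posts above.

```cpp
#include <cstddef>
#include <new>

class Message {
public:
    // Class-specific allocation: pop a node from the free list.
    void* operator new(std::size_t) {
        Node* n = free_list_;
        if (!n) throw std::bad_alloc();   // an embedded build might halt instead
        free_list_ = n->next;
        return n;
    }
    // Deallocation: push the node back. Constant time, no fragmentation.
    void operator delete(void* p) {
        Node* n = static_cast<Node*>(p);
        n->next = free_list_;
        free_list_ = n;
    }
    static std::size_t available();       // free blocks remaining
    int payload[8];
private:
    union Node {                          // storage must be >= sizeof(Message)
        Node* next;
        unsigned char storage[sizeof(int) * 8];
    };
    static Node pool_[16];                // statically reserved, no real heap
    static Node* free_list_;
    static Node* init_pool();
};

Message::Node Message::pool_[16];
Message::Node* Message::free_list_ = Message::init_pool();

Message::Node* Message::init_pool() {
    for (int i = 0; i < 15; ++i) pool_[i].next = &pool_[i + 1];
    pool_[15].next = 0;
    return &pool_[0];
}

std::size_t Message::available() {
    std::size_t n = 0;
    for (Node* p = free_list_; p; p = p->next) ++n;
    return n;
}
```

`new Message` and `delete` now have fixed, analyzable cost, which is the property hard real-time code needs from an allocator.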

Roman Fietze

Apr 17, 1996, 3:00:00 AM4/17/96
to
Robert C. Martin wrote:
> There is nothing about C++ that encourages programmers to perform
> a great deal more allocation and de-allocation of memory. Some
> popular styles advocate this, but they advocate it in C++ as well as
> other languages.
>
> Also. If you use a programming style that is heavily weighted towards
> dynamic memory allocation, you can prefent fragmentation and heap
> delays by using non-deterministic heaps.

If you really want to use the powers of C++ instead of using C++ as a
better C, you have more memory allocation and deallocation. In C you very
often handle pointers and care about when to free the memory associated
with them. In C++ you are always copying memory to other objects with
constructors or operators, because you use class objects like other
integral types without thinking too much (you could, but you don't want
to, because the program is more readable then :). You also get a lot of
memory allocation and deallocation from arguments passed by value or
from the compiler using temporaries. I was astonished when I checked the
code produced by my Microtec C++ 4.x, although when I thought about it,
there was no other way to handle it.

In my special case I built a menu system based on our own curses
implementation on pSOS. With the old C version I passed pointers to some
structures to the menu library functions. In the C++ version I build a
menu by adding menu items to a menu object, which causes many memory
copy, allocation and deallocation calls (and, not to give the CPU any
chance, I even used a String class instead of char *'s). The other
drawback is that with the old system I could hold the menu text in ROM
(by declaring it const), but with C++ I have to copy it using e.g.
operator+= or some constructor, and even the type specifier const
isn't a guarantee of being allocated in read-only memory (ROM on
embedded systems, read-only sections e.g. on UNIX); it just says the
variable can only be initialized, not changed by an assignment
operator.
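The copy costs described above are easy to make visible with a small sketch (the `MenuItem` class and its copy counter are hypothetical, purely for illustration): passing a heavyweight object by value invokes the copy constructor, passing by const reference does not, and a plain `const char*` would let the text itself stay in a read-only section.

```cpp
#include <cstddef>
#include <cstring>

struct MenuItem {
    char text[32];
    static int copies;                        // counts copy-constructions
    explicit MenuItem(const char* s) {
        std::strncpy(text, s, 31);
        text[31] = '\0';
    }
    MenuItem(const MenuItem& other) {         // every by-value pass lands here
        ++copies;
        std::strcpy(text, other.text);
    }
};
int MenuItem::copies = 0;

// By value: the argument is copy-constructed on every call.
std::size_t by_value(MenuItem m)       { return std::strlen(m.text); }

// By const reference: no copy, same readability at the call site.
std::size_t by_cref(const MenuItem& m) { return std::strlen(m.text); }
```

The same interface, but one signature change removes a hidden constructor call per invocation, which is exactly the kind of implicit cost the earlier posts warn about.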

Roman

--
Roman Fietze (Mail Code 5023) Kodak AG Stuttgart/Germany
http://www.kodak.com fie...@kodak.COM

Marc de Groot

Apr 17, 1996, 3:00:00 AM4/17/96
to
Elizabeth Rather wrote:
>
> ja...@purr.demon.co.uk (Jack Campin) wrote:
>
> >
> >Which makes me wonder: has Forth made it into space yet?
> >
>
> Yes! A list compiled by folks at NASA/GSFC a few years ago listed over 40
> space projects (including Shuttle experiments and satellites) coded in Forth.
> There have presumably been more since. Some are described in our web site
> (addr. below). NASA is presently developing a whole generation of systems for
> interfacing "guest" payloads to the shuttle computing systems based on the
> RTX2000 and programmed in Forth.

AMSAT, the ham radio satellite organization, also uses Forth for the software on
its satellites. The language they use has an acronym for a name. The acronym stands
for the German equivalent of "language for satellites" or something like that. I can't
remember exactly what it is anymore...
--
----
Marc de Groot | Immersive Systems, Inc.
<ma...@immersive.com> | http://www.immersive.com
| Real VR for the net!
"Under the most strictly controlled conditions of temperature,
lighting and pH, the organism does as it damn well pleases."

Michael Furman

Apr 17, 1996, 3:00:00 AM4/17/96
to
In article <RMARTIN.96...@rcm.oma.com>, rma...@oma.com says...

>
>In article <4kjfrh$2...@Starbase.NeoSoft.COM> ti...@Starbase.NeoSoft.COM (Tim Dugan) writes:
>
> Although I have no figures or measurements, I would have to say
> that I suspect that the one area where C++ is slower is that
> there is something about C++ that encourages programmers to

> perform a great deal more allocation and de-allocation of
> memory, causing memory fragmentation and slowing the allocation/
> deallocation process.
>
>There is nothing about C++ that encourages programmers to perform
>a great deal more allocation and de-allocation of memory. Some
>popular styles advocate this, but they advocate it in C++ as well as
>other languages.

I think there is a little bit of that (as in any other higher-level
language). For example, it is very convenient to define a "string" class with
overloaded operators and use almost intuitive expressions to work with
strings, which was impossible in C. But if you do that, be aware that the
compiler will use some temporaries.
I think this is really nothing about C++ in particular. Of course it is
about programming styles. But the higher-level the language you use, the more
attractive styles become available. And if you do not know them well,
you will face some tradeoffs.

--
<<< If you received it by E-mail: it is a copy of post to the newsgroup >>>
---------------------------------------------------------------
Michael Furman, (603)893-1109
Geophysical Survey Systems, Inc. fax:(603)889-3984
13 Klein Drive - P.O. Box 97 en...@gssi.mv.com
North Salem, NH 03073-0097 71543...@compuserve.com
---------------------------------------------------------------


Everett M. Greene

Apr 17, 1996, 3:00:00 AM4/17/96
to
In article <4ku5h0$f...@gaia.ns.utk.edu> m...@caffeine.engr.utk.edu (Matt Kennel) writes:
> Bhargav P. Upender (ba...@utrc.utc.com) wrote:
> : > Anyone had any Eiffel experience? (unfortunately, that
> : > rhymes with "awful"!)
> : I have "Personal Eiffel for Windows". I am not too impressed with the
> : maturity of the product.
> : * It is a memory hog. You need at least 16M to run it on Windows.
> : * The application that I have developed runs slow (personal version does
> : not have optimizer).
> : * I wouldn't use it for embedded systems yet!
>
> : The professional version might be better, but its more money.
> The personal eiffel for windows is an interpreter. It will be
> much slower than the professional version.
>
> 16 MB of memory is not very much.

Anything much beyond one Mbyte is a prime candidate for code-bloat
champion. 16 Mbytes is ludicrous.

-----------------------------------------------------------------------
Everett M. Greene (The Mojave Greene, crotalus scutulatus scutulatus)
Ridgecrest, Ca. 93555 Path: moj...@ridgecrest.ca.us

Matt Kennel

Apr 17, 1996, 3:00:00 AM4/17/96
to
Everett M. Greene (moj...@ridgecrest.ca.us) wrote:

: In article <4ku5h0$f...@gaia.ns.utk.edu> m...@caffeine.engr.utk.edu (Matt Kennel) writes:
: > Bhargav P. Upender (ba...@utrc.utc.com) wrote:
: > : > Anyone had any Eiffel experience? (unfortunately, that
: > : > rhymes with "awful"!)
: > : I have "Personal Eiffel for Windows". I am not too impressed with the
: > : maturity of the product.
: > : * It is a memory hog. You need at least 16M to run it on Windows.
: > : * The application that I have developed runs slow (personal version does
: > : not have optimizer).
: > : * I wouldn't use it for embedded systems yet!
: >
: > : The professional version might be better, but its more money.
: > The personal eiffel for windows is an interpreter. It will be
: > much slower than the professional version.
: >
: > 16 MB of memory is not very much.

: Anything much beyond one Mbyte is a prime candidate for code-bloat
: champion. 16 Mbytes is ludicrous.

Why? Sun 3/60's in 86 or 87 or so typically came with 8 to 16 MB of
RAM.

Why deny yourself 16 MB with processor speeds 50 to 100 times faster?

: -----------------------------------------------------------------------

John Joseph Newbigin

Apr 18, 1996, 3:00:00 AM4/18/96
to
> > In particular, many people that have been disappointed with
> > C++ performance seem to lack an understanding of the implications
> > of implicit calls to constructors/copy-constructors/destructors,
> > memory fragmentation, or inline vs. non-inline procedure calls.
> >
> > Then they turn around and blame the language, rather than their
> > use of it.
> >
>
> Exactly.
>

But if you take into account these implications and try to avoid them,
you become a hacker.

Newbs.

Russell R. Nell

Apr 18, 1996, 3:00:00 AM4/18/96
to
>Dave Baldwin wrote:
>> Henning Rietz (ri...@condat.de) wrote:[snip]
>>
>> : - "OO systems are too slow"
>> : - "OO systems eat up too much memory"[snip]
>> Object-dis-oriented programming is (like some others) intended to hide
>> the hardware from the programmer. How useful can this possibly be when
>> small embedded systems are expressly for dealing with the hardware?

Why should any but the lowest-level code know about the hardware?
Even without OO, would you not put an abstraction layer around
the hardware? Or is that abstraction layer just treating the
hardware like the object it is???
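A minimal sketch of that abstraction layer (the register name is hypothetical, and the address is injected so the class can be exercised off-target): a thin class over a memory-mapped register whose inline accessors compile down to the same loads and stores the raw pointer code would produce, so the "hiding" costs nothing.

```cpp
#include <cstdint>

class StatusRegister {
public:
    // On target this would be constructed with the real register address,
    // e.g. StatusRegister sr(reinterpret_cast<volatile std::uint8_t*>(0x40001000)).
    explicit StatusRegister(volatile std::uint8_t* addr) : reg_(addr) {}

    void set(std::uint8_t bit)   { *reg_ = static_cast<std::uint8_t>(*reg_ | (1u << bit)); }
    void clear(std::uint8_t bit) { *reg_ = static_cast<std::uint8_t>(*reg_ & ~(1u << bit)); }
    bool test(std::uint8_t bit) const { return ((*reg_ >> bit) & 1u) != 0; }

private:
    volatile std::uint8_t* reg_;   // volatile: every access really touches the register
};
```

Off target you can point it at an ordinary byte and unit-test the bit logic, which is one concrete payoff of treating the hardware "like the object it is".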


Hello, by the way: long-time listener, first-time poster.
I will be starting an OO re-design of a 10-year-old product
soon. When I have the time to devote, I will be posting
a description of the project and a lot of questions. OO
is a new (and overdue) idea for this company!!!

--
Rusty Nail (Russell R. Nell) +============================+ /`-_
ne...@norland.icdnet.com || Wisconsin Cheese-Head || { }/
(414) 563-8456 ext. 214 || and Damn Proud of it!!! || \ * /
Norland Corporation +============================+ |___|
W6340 Hackbarth Road, Fort Atkinson, WI 53538-8999

Paul E. Bennett

Apr 18, 1996, 3:00:00 AM4/18/96
to
In article <9...@purr.demon.co.uk> ja...@purr.demon.co.uk "Jack Campin" writes:

>
> Brad Rodriguez <b...@headwaters.com> writes:
> Dave Baldwin wrote:
> > Henning Rietz (ri...@condat.de) wrote:[snip]
> >> - "OO systems are too slow"
> >> - "OO systems eat up too much memory"[snip]
> > Object-dis-oriented programming is (like some others) intended to hide
> > the hardware from the programmer. How useful can this possibly be when
> > small embedded systems are expressly for dealing with the hardware?
>

> A non-Forth example from several years back:.....Chorus in C++ ........
> ........................... The dynamic linking meant they could download


> modules and install them into the running system from tens of millions of
> miles away, and this couldn't have been a large-memory system. I'd have
> thought an OO Forth would have been a lot easier, but I don't think there
> was a mature one back then.
>

> Which makes me wonder: has Forth made it into space yet?

Forth, according to an item I have read somewhere (I think it was MPE's
catalogue), is apparently on three out of four systems on the Shuttle (perhaps
our NASA guys could confirm this one) and is often the base programming level
for the embedded systems on a number of satellites (mainly amateur satellites).
The benefit is an interactive operating system and programming environment that
enables new programming from remote resources. Forth has been doing this for
quite a long time. There has always been something that seems "Object Oriented"
about Forth without being a full OO system. I prefer to think of Forth as more
"Function Oriented". Is this perhaps a better strategy for control systems?

--
Paul E. Bennett <p...@transcontech.co.uk>
Transport Control Technology Ltd.
Tel: +44 (0)117-9499861
Going Forth Safely

David L. Shang

Apr 18, 1996, 3:00:00 AM4/18/96
to
In article <RMARTIN.96...@rcm.oma.com> rma...@oma.com (Robert C. Martin) writes:
>
> Although I have no figures or measurements, I would have to say
> that I suspect that the one area where C++ is slower is that
> there is something about C++ that encourages programmers to
> perform a great deal more allocation and de-allocation of
> memory, causing memory fragmentation and slowing the allocation/
> deallocation process.
>
> There is nothing about C++ that encourages programmers to perform
> a great deal more allocation and de-allocation of memory. Some
> popular styles advocate this, but they advocate it in C++ as well as
> other languages.
>

Agreed. C++ is not the only language that advocates dynamic allocation
and deallocation.

Polymorphism is one of the major characteristics of object-oriented
programming. By declaring a variable

with name "x" of class "C"

we can expect that "x" takes a value of a subclass of "C". To get
this polymorphism, you have to use dynamic memory allocation for
"x". In C++, you use pointers. In Java or Eiffel, you use smart
references.

Transframe is a language originally designed for, but not limited to,
embedded/real-time systems. The language does not advocate using
dynamic references when they are not necessary. Even with static
allocation, you can still get polymorphism, as long as the maximum
size of the subclass value is known. For example, you can statically
allocate the storage for a polymorphic character variable which can
take an ANSI character, a Unicode character, or a variable-length
character.

Sometimes you might be required to dynamically allocate an object
in large grain, but within the large piece of storage, you may not
want to fragment the memory into many small pieces.

Back to Roman Fietze's example:

> In my special case I built a menu system based on an own curses
> implementation on pSOS. With the old C version I passed pointers to some
> structures to the menu library functions. In the C++ version I build a
> menu by adding menu items to a menu object, which causes many memory
> copy, allocation and deallocation calls (not to give the CPU any chance
> I even used a String class instead of char *'s). The other drawback is
> that with the old system I could hold the text for the menu text in ROM
> only (by declaring it const), but with C++ I have to copy it using e.g.
> the operator+= or some constructor, and even the type specifier const
> isn't a guarantee for beeing allocated in a readonly memory (ROM on
> embedded systems, readonly sections e.g. on UNIX), it just says, the
> variable cann only be initialized, but not changed by an assignement
> operator.

If your system wants to create new windows dynamically, you might need
to allocate window structures dynamically. But within a window, if
you do not want the window to support dynamic configuration, e.g.
adding/deleting menu items and other child windows, then you can allocate
everything statically by writing the following code:

object myWindow is FramedToplevelWindow
{
    object myMenu is Menu
    {
        object fileItem is PullDownMenu
        {
            string = "File";
            object openItem is MenuItemString
            {
                string = "Open...\tCtrl+O";
            };
            object saveItem is MenuItemString
            {
                style = (Disabled);
                string = "Save...\tCtrl+S";
            };
        };
        object editItem is PullDownMenu
        {
            object copyItem is MenuItemIcon
            {
                style = (Disabled);
                icon = icon_Copy;
            };
            object pasteItem is MenuItemIcon
            {
                style = (Disabled);
                icon = icon_Paste;
            };
            object sp1 is MenuItemSeparator;
        };
        ...
    };
};

However, if you do wish to configure the menu items dynamically
(for example, a "recall" pull-down menu under the "file" menu item
that lists all the files opened previously), then you need to design
your windows as a dynamic structure in which children are allocated
dynamically.

For many small handheld devices, I believe that the user interface
is fixed, and dynamic configuration in the final released product
is not necessary. In the development environment (for virtual
products), though, Transframe enables dynamic configuration for rapid
prototyping.

For large and complex desktop applications, the window structure
should be dynamic.

It is the application's decision, not the language's, whether a
dynamic structure should be used. Therefore, a language should
provide options that an application can choose from.

David Shang

Robert Dewar

Apr 19, 1996, 3:00:00 AM4/19/96
to
Newbs said

"But if you take into account these implications and try to avoid them,
you become a hacker."

(these implications = overhead of destructors etc.)

Probably there was a :-) missing, but if the above was serious, I
strongly disagree. All programmers should understand the consequences
of the code they write. I certainly agree that both in Ada 95 and
in C++, programmers use finalization (destructors) with great abandon
without the least idea of the overhead being introduced.


Harry V. Bims

Apr 19, 1996, 3:00:00 AM4/19/96
to
In article <dibaldDp...@netcom.com>,

Dave Baldwin <dib...@netcom.com> wrote:
>Henning Rietz (ri...@condat.de) wrote:
>
>: I can say "everybody" is using OO in some areas (mainly network

>: management, switch provisioning, customer care), BUT there are (almost)
>: no examples in the area of (small) embedded systems, main reasons for
>: that being:
>
>: - "OO systems are too slow"

>: - "OO systems eat up too much memory"

Quite the contrary. Here at Wireless Access, I have designed and built
a real-time, object-oriented system running on a PC. The system
creates a local area two-way paging environment. It schedules and
manages multiple channels and users simultaneously. A comparable system
was implemented without OO technology, and it requires a 250MHz Alpha to
do the same job. In addition, code changes can occur much more quickly
when you have an OO framework to start with. The problem is that
very few people know how to create good OO designs. It takes quite
a long time to climb the learning curve.

As far as memory is concerned, there is some overhead associated with
C++ compilation versus C; however, once your program reaches in excess
of 15K lines, that is no longer an issue. In fact, because of code
and data reuse, my code is more efficient from a memory requirements
perspective than the non-OO counterpart.

>
>Object-dis-oriented programming is (like some others) intended to hide
>the hardware from the programmer. How useful can this possibly be when

>small embedded systems are expressly for dealing with the hardware? Some
>of the techniques can be useful, but the overhead and 'hiding' is exactly
>what you don't need in hardware control.

Hardware designs are subject to change. When that happens, it is much
more difficult to adapt your code when you DON'T use OO. Again, overhead can
be minimized through careful design.

>There is no universal programming method. Even the examples you cite are

You are right. However, if you are building a program that is longer than
about 10K lines, OO is generally the best methodology.

Harry Bims
Senior MTS
Wireless Access, Inc.
408-653-2288
disc...@waccess.com


Marc de Groot

unread,
Apr 19, 1996, 3:00:00 AM4/19/96
to
Paul E. Bennett wrote:
>
> Forth, according to an item I have read somewhere (I think it was MPE's
> catalogue), is apparently on three out of four systems on the Shuttle (perhaps
> our NASA guys could confirm this one) and is often the base programming level
> for the embedded systems on a number of satellites (mainly amateur satellites).
> The benefit is an interactive operating system and programming environment that
> enables new programming from remote resources. Forth has been doing this for
> quite a long time. There has always been something that seems "Object Oriented"
> about Forth without being a full OO system. I prefer to think of Forth as more
> "Function Oriented". Is this perhaps a better strategy for control systems?

IMO, the essence of object-oriented programming is the explicit declaration
of the relationship between a data structure and the algorithms that operate
on it--which is what a class declaration is.

Forth's CREATE...DOES>... construct embodies this essence neatly and simply.
The code following CREATE allocates the data structure, and the code
following DOES> operates on it. A Forth defining word is both
a class declaration and an implementation of its methods. Executing the
defining word constructs an object of that class.

One powerful and unique aspect of Forth is that all words are members
of a single metaclass. That class has a uniform structure, consisting
(in an indirect-threaded system) of a pointer to native code followed
by an arbitrary data structure. This structure allows very high
efficiency at run time. The uniformity facilitates the implementation
of such tools as debuggers and decompilers. It also makes LISP-like
programming techniques more tractable, such as code creating and
modifying other code.

Dave Baldwin

unread,
Apr 20, 1996, 3:00:00 AM4/20/96
to
After reading all of this, I wonder what you people think small, medium,
and large embedded systems are. Maybe the things I was referring to are
just considered 'tiny' to you.

Robert A Duff

unread,
Apr 20, 1996, 3:00:00 AM4/20/96
to
In article <31783C...@immersive.com>,

Marc de Groot <ma...@immersive.com> wrote:
>IMO, the essence of object-oriented programming is the explicit declaration
>of the relationship between a data structure and the algorithms that operate
>on it--which is what a class declaration is.

Nah. That's just plain old abstract data types. Modula-2 modules can
do this. Ada 83 packages can do this. Neither are "object oriented",
in the usual sense. OO requires polymorphism, in addition to that other
good stuff.
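A minimal C++ illustration of the distinction (types invented for the example): the abstract-data-type style just bundles data with its operations, while the object-oriented version adds polymorphism via run-time dispatch:

```cpp
#include <string>

// ADT style: one concrete type, operations bound at compile time.
struct Counter {
    int value = 0;
    void bump() { ++value; }
};

// OO style: callers hold a base reference; the method actually run is
// chosen at run time, so new kinds can be added without touching callers.
struct Shape {
    virtual ~Shape() = default;
    virtual std::string name() const = 0;
};
struct Circle : Shape {
    std::string name() const override { return "circle"; }
};
struct Square : Shape {
    std::string name() const override { return "square"; }
};

// Works for any Shape, present or future, via virtual dispatch.
inline std::string describe(const Shape& s) { return s.name(); }
```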

- Bob

Paul E. Bennett

unread,
Apr 20, 1996, 3:00:00 AM4/20/96
to
In article <31783C...@immersive.com>

ma...@immersive.com "Marc de Groot" writes:

> Forth's CREATE...DOES>... construct embodies this essence neatly and simply.
> The code following CREATE allocates the data structure, and the code
> following DOES> operates on it. A Forth defining word is both
> a class declaration and an implementation of its methods. Executing the
> defining word constructs an object of that class.
>
> One powerful and unique aspect of Forth is that all words are members
> of a single metaclass. That class has a uniform structure, consisting
> (in an indirect-threaded system) of a pointer to native code followed
> by an arbitrary data structure. This structure allows very high
> efficiency at run time. The uniformity facilitates the implementation
> of such tools as debuggers and decompilers. It also makes LISP-like
> programming techniques more tractable, such as code creating and
> modifying other code.

With all that you state above, do you mean to say I have been doing "Object
Oriented Design and Programming" all this time without realising it? Wow!

You realise of course that it will now be just about impossible for me to
consider programming systems with anything other than Forth. :)

Chris Savage

unread,
Apr 21, 1996, 3:00:00 AM4/21/96
to
dib...@netcom.com (Dave Baldwin) wrote:

|After reading all of this, I wonder what you people think small, medium,
|and large embedded systems are. Maybe the things I was referring to are
|just considered 'tiny' to you.
|--

I'm with you there. How many people use / would think of using OO in a
highly cost-constrained _small_ microcontroller application (e.g.
engine management, vcr, toaster, fork lift truck ?)

|-=-=-=-=-=-=-=-=-=-=-=- Check out 'alt.tcj' -=-=-=-=-=-=-=-=-=-=-=-=-=-
|Dave Baldwin: dib...@netcom.com | The Computer Journal 1(800)424-8825
|DIBs Electronic Design | Home page "http://www.psyber.com/~tcj/"
|Voice : (916) 722-3877 | Hands-on hardware and software
|TCJ/DIBs BBS: (916) 722-5799 | TCJ/DIBs FAX: (916) 722-7480
|-=-=-=-=-=- @#$%^&* I can't even quote myself! Oh,well. -=-=-=-=-=-

=================================================
Chris Savage MSc Applications Software Engineer
Sevcon Ltd. Kingsway Gateshead NE11 0QA UK
Tel: +44 191 487 8516 Fax: +44 191 482 4223
=================================================

Piercarlo Grandi

unread,
Apr 21, 1996, 3:00:00 AM4/21/96
to
>>> On Sat, 20 Apr 1996 23:34:12 GMT, bob...@world.std.com (Robert A
>>> Duff) said:

bobduff> In article <31783C...@immersive.com>,
bobduff> Marc de Groot <ma...@immersive.com> wrote:

marc> IMO, the essence of object-oriented programming is the explicit
marc> declaration of the relationship between a data structure and the
marc> algorithms that operate on it--which is what a class declaration
marc> is.

bobduff> Nah. That's just plain old abstract data types. Modula-2
bobduff> modules can do this. Ada 83 packages can do this. Neither are
bobduff> "object oriented", in the usual sense. OO requires
^^^^^^^^
bobduff> polymorphism, in addition to that other good stuff.

Therefore a program which just does not happen to use polymorphism
cannot be called an OO program :-/.

David Taylor

unread,
Apr 21, 1996, 3:00:00 AM4/21/96
to
In article <317a4454...@news.demon.co.uk>,
ch...@nihilist.demon.co.uk (Chris Savage) wrote:

> dib...@netcom.com (Dave Baldwin) wrote:
>
> |After reading all of this, I wonder what you people think small, medium,
> |and large embedded systems are. Maybe the things I was referring to are
> |just considered 'tiny' to you.
> |--
> I'm with you there. How many people use / would think of using OO in a
> highly cost-constrained _small_ microcontroller application (e.g.
> engine management, vcr, toaster, fork lift truck ?)


The engine management systems I know of run 300K+ in size and are
highly cost constrained--production in the millions. Is this small??

--


-- Dave

Zsoter Andras

unread,
Apr 22, 1996, 3:00:00 AM4/22/96
to
"Paul E. Bennett" (p...@transcontech.co.uk) wrote:

>> One powerful and unique aspect of Forth is that all words are members
>> of a single metaclass. That class has a uniform structure, consisting
>> (in an indirect-threaded system) of a pointer to native code followed
>> by an arbitrary data structure. This structure allows very high
>> efficiency at run time. The uniformity facilitates the implementation
>> of such tools as debuggers and decompilers. It also makes LISP-like
>> programming techniques more tractable, such as code creating and
>> modifying other code.
>
>With all that you state above, do you mean to say I have been doing "Object
>Oriented Design and Programming" all this time without realising it?. Wow!.

Whether you believe it or not some programmers actually think so. :-(
Some even claim that OOP is unnecessary because "we had it all the
time as CREATE DOES>". :-(((

Andras


Lee Webber

unread,
Apr 22, 1996, 3:00:00 AM4/22/96
to
ch...@nihilist.demon.co.uk (Chris Savage) wrote:
>
> dib...@netcom.com (Dave Baldwin) wrote:
>
> |After reading all of this, I wonder what you people think small, medium,
> |and large embedded systems are. Maybe the things I was referring to are
> |just considered 'tiny' to you.
> |--
> I'm with you there. How many people use / would think of using OO in a
> highly cost-constrained _small_ microcontroller application (e.g.
> engine management, vcr, toaster, fork lift truck ?)

Hardly anyone. What's amazing to me is how few people would use OO in
a somewhat larger system, say 128K to 1 Meg of memory, 16-bit processor
-- and how few OO language vendors think such targets are worth
supporting.

Matt Kennel

unread,
Apr 22, 1996, 3:00:00 AM4/22/96
to
Roman Fietze (fie...@ag01.kodak.COM) wrote:
: If you really want to use the powers of C++ instead of using C++ as a

: better C you have more memory allocation and deallocation. In C you very
: often handle pointers and care about when to free the memory associated
: with that memory. In C++ you always copy memory to other objects with
: constructors or operators, because you use class objects like other
: integral types without thinking too much (you could, but you don't want
: to because the program is more readable then :). You even have a lot of
: memory allocation and deallocation by using arguments passed by value or
: by the compiler using temporaries. I was astonished when I checked the
: code produced by my Microtec C++ 4.x, although when I thought about it,
: there was no other way to handle that.

This is a case where a GC would be beneficial. If you frequently
copied only references you would save on extra allocation and deallocation,
at the cost of having to deal with more complex memory reference paths.

If you have a GC, it's *much* easier to get such a thing to work.
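A small C++ sketch of copying references instead of payloads, with reference counting standing in for the garbage collection the post describes (names are illustrative):

```cpp
#include <memory>
#include <vector>

// One large allocation, shared by reference instead of deep-copied on
// every "copy". Reference counting reclaims it when the last user goes.
using Buffer = std::vector<int>;

struct Message {
    std::shared_ptr<const Buffer> payload;  // copied by reference only
};

inline long payload_refs(const Message& m) { return m.payload.use_count(); }

inline long demo_share() {
    auto big = std::make_shared<const Buffer>(1024, 0);  // one allocation
    Message a{big}, b{big}, c{big};                      // no further copies
    return payload_refs(c);  // big + a + b + c all share the same buffer
}
```

The cost, as the post notes, is more complex reference paths: with shared ownership it is no longer obvious who frees what, which is exactly the bookkeeping a GC takes over.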


: Roman

larry kollar

unread,
Apr 22, 1996, 3:00:00 AM4/22/96
to
Thus spake Dave Baldwin:

>After reading all of this, I wonder what you people think small, medium,
>and large embedded systems are. Maybe the things I was referring to are
>just considered 'tiny' to you.

A company I *used* to work for made a terminal server by embedding a
stripped-down UNIX, plus TCP/IP, in ROM. To me, that would be a large
embedded system.

I once hung a Rat Shack motion detector off the joystick port of my
Amiga and used JForth to monitor (and react to) someone tripping the
detector. That would have been a medium system, if I'd taken it beyond
that point and actually made it into something useful. :-)

As for small... well, what would a small system have been in the days
when 16K of RAM was a lot of memory in a computer? Times change, and so
does the capability of the hardware.
--
Larry Kollar, Dawsonville GA | *** Hatred is murder *** (1 Jn 3:15)
leko...@nyx.net | http://www.nyx.net/~lekollar/
"So don't try to turn my head away
Flirtin' with disaster every day"

Bruce R. McFarling

unread,
Apr 23, 1996, 3:00:00 AM4/23/96
to
bob...@world.std.com (Robert A Duff) wrote:

>In article <31783C...@immersive.com>,


>Marc de Groot <ma...@immersive.com> wrote:

>> IMO, the essence of object-oriented programming is the explicit
>> declaration of the relationship between a data structure and the
>> algorithms that operate on it--which is what a class declaration is.
>
> Nah. That's just plain old abstract data types. Modula-2 modules
> can do this. Ada 83 packages can do this. Neither are "object
> oriented", in the usual sense. OO requires polymorphism, in
> addition to that other good stuff.

I'm always amazed to see how a semantic quibble can be raised
even when the point raised is *explicitly* allowed for in the original
comment. Marc de Groot offers his opinion of the 'essence' of object
oriented programming. Robert Duff offers a checklist to qualify as
object oriented 'in the usual sense'.
But if something covers all the bases, it's not an 'essence', is
it? We can see that it is not a substantial disagreement but only a
semantic quibble, since the two points can be combined in one statement
without any conflict whatsoever:

"Although they lack some of the features normally expected in
object oriented programming, notably polymorphism, Forth, Modula-2 and
Ada-83 capture the essence of object oriented programming, which is the
declaration of a specific relationship between a data structure and the
algorithms that operate on it. Forth does this with CREATE DOES>,
Modula-2 with modules, and Ada-83 with packages."

Perfectly coherent combination of all the information contained
in both statements. That doesn't imply that both or either author
agrees, but it does imply that these specific statements of theirs are
not fundamentally contradictory.

Virtually,

Bruce R. McFarling, Newcastle, NSW
ec...@cc.newcastle.edu.au

Zsoter Andras

unread,
Apr 23, 1996, 3:00:00 AM4/23/96
to
Lee Webber (le...@micrologic.com) wrote:
>ch...@nihilist.demon.co.uk (Chris Savage) wrote:
>>
>> dib...@netcom.com (Dave Baldwin) wrote:
>>
>> |After reading all of this, I wonder what you people think small, medium,
>> |and large embedded systems are. Maybe the things I was referring to are
>> |just considered 'tiny' to you.
>> |--
>> I'm with you there. How many people use / would think of using OO in a
>> highly cost-constrained _small_ microcontroller application (e.g.
>> engine management, vcr, toaster, fork lift truck ?)

>Hardly anyone. What's amazing to me is how few people would use OO in

^^^^^^^^^^^^^^^^^^ ?


>a somewhat larger system, say 128K to 1 Meg of memory, 16-bit processor
>-- and how few OO language vendors think such targets are worth
>supporting.

I guess with OOP you can actually SAVE space because of the improved
code reusability. Well, if your whole application is 256 bytes long,
then OOP or any other fancy stuff is out of the question.
But if you have at least two kilobytes, you should consider OOP.

Andras


Paul E. Bennett

unread,
Apr 23, 1996, 3:00:00 AM4/23/96
to
In article <4lggmt$k...@nyx10.cs.du.edu>
leko...@nyx10.cs.du.edu "larry kollar" writes:

> Thus spake Dave Baldwin:


>
> >After reading all of this, I wonder what you people think small, medium,
> >and large embedded systems are. Maybe the things I was referring to are
> >just considered 'tiny' to you.
>

> A company I *used* to work for made a terminal server by embedding a
> stripped-down UNIX, plus TCP/IP, in ROM. To me, that would be a large
> embedded system.

Perhaps we could agree the following classification:

Micro Embedded - Less than 4KB total memory requirement
Small Embedded - Greater than 4KB and less than 64KB total memory requirement
Medium Embedded - Greater than 64KB and less than 1MB total memory requirement
Large Embedded - Greater than 1MB total memory requirement

Almost all of my systems are in the small category.

Bob Kitzberger

unread,
Apr 23, 1996, 3:00:00 AM4/23/96
to
Dave Baldwin (dib...@netcom.com) wrote:
: After reading all of this, I wonder what you people think small, medium,
: and large embedded systems are. Maybe the things I was referring to are
: just considered 'tiny' to you.

tiny 0 -- 1k lines
small 1k -- 10k lines
medium 10k -- 100k lines
large 100k -- 1M lines
enormous 1M++

Just my humble opinion.

--
Bob Kitzberger Rational Software Corporation r...@rational.com

Scott Wheeler

unread,
Apr 23, 1996, 3:00:00 AM4/23/96
to
>The engine management systems I know of run 300K+ in size and are
>highly cost constrained--production in the millions. Is this small??

Isn't that almost entirely data (maps) rather than executable?

Scott

C. T. Nadovich

unread,
Apr 23, 1996, 3:00:00 AM4/23/96
to
ch...@nihilist.demon.co.uk (Chris Savage) writes:

>dib...@netcom.com (Dave Baldwin) wrote:

>|After reading all of this, I wonder what you people think small, medium,
>|and large embedded systems are. Maybe the things I was referring to are
>|just considered 'tiny' to you.

>|--
>I'm with you there. How many people use / would think of using OO in a
>highly cost-constrained _small_ microcontroller application (e.g.
>engine management, vcr, toaster, fork lift truck ?)

Today, or in the future?

Even in highly cost-constrained commercial applications, HARDWARE design
is usually OO. Often the economics of build vs. buy favor "buy",
especially for commodity items, like resistors and bolts. Companies
simply can't vertically integrate every technology.

Why won't software go that way? Why won't managers eventually see the
advantage of OO? A manager can pay a co-op to click and drag together a
toaster's embedded control system in a few minutes using third-party OO
widgets. Sure, a high priced engineer can code the same thing in less RAM,
bumming instructions or hacking FORTH, but is bumming instructions a
useful skill with megabit DRAMS selling for less than $2. If the toaster
costs $0.10 more with OO, but I can get it to market 6 months earlier, my
bottom line tells me what to do.

Get rid of the software tailor. That's one less high-priced craftsman I
need on staff --- not to mention one less schedule that slips because of
endless delays as the software is hand coded and hand debugged. Sure,
hand made stuff can be better than "from the rack", but one look at the
price tag on a custom suit and most of us head to JC Penney.

I know that point of view may rub some embedded SW gurus the wrong way, and
I'm not saying it's true today, but the dumbing down of software design is
coming at all levels IMHO. Fortunately, those high priced engineers can
all get jobs designing objects --- although they may have to move to
Russia or India.

--
73 de KD3BJ SK .. http://www.kd3bj.ampr.org
+1 215 257 0635 (voice) +1 215 257 2744 (data/fax)

Jason Rumney

unread,
Apr 24, 1996, 3:00:00 AM4/24/96
to
In article <31749A...@ag01.kodak.COM>, Roman Fietze <fie...@ag01.kodak.COM> writes:

> If you really want to use the powers of C++ instead of using C++ as a
> better C you have more memory allocation and deallocation. In C you very
> often handle pointers and care about when to free the memory associated
> with that memory. In C++ you always copy memory to other objects with
> constructors or operators, because you use class objects like other
> integral types without thinking too much (you could, but you don't want
> to because the program is more readable then :).

You could avoid copying, and still keep your readability, by using such
features as passing by reference. The key to making an efficient C++
program is in recognising what goes on behind the scenes, and making
allowances for it. There are sufficient features in the language to
make these allowances - you just have to know when to use them.
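For instance, a hypothetical copy-counting type makes the difference between pass-by-value and pass-by-reference visible (the counter is illustrative instrumentation, not production code):

```cpp
// A type that counts how often it is copy-constructed.
struct Tracked {
    inline static int copies = 0;           // C++17 inline static
    Tracked() = default;
    Tracked(const Tracked&) { ++copies; }
};

inline void by_value(Tracked t) { (void)t; }       // copies its argument
inline void by_ref(const Tracked& t) { (void)t; }  // no copy made

inline int demo_copies() {
    Tracked::copies = 0;
    Tracked t;
    by_ref(t);    // copies stays 0: only a reference is passed
    by_value(t);  // copies becomes 1: the argument is copy-constructed
    return Tracked::copies;
}
```

The call sites read identically, which is the point: the reference version keeps the readable style while skipping the hidden allocation and copy.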

> You even have a lot of memory allocation and deallocation by using
> arguments passed by value or by the compiler using temporaries.

Why are you passing by value?
If the compiler is using temporaries, it is doing so as an
optimisation. If you have a compiler that thinks making temporary
copies of large objects is an optimisation, then you need to change
compilers (if this is indeed what is happening).

> The other drawback is that with the old system I could hold the text
> for the menu text in ROM only (by declaring it const), but with C++
> I have to copy it using e.g. the operator+= or some constructor,
> and even the type specifier const isn't a guarantee of being
> allocated in a readonly memory (ROM on embedded systems, readonly
> sections e.g. on UNIX); it just says the variable can only be
> initialized, but not changed by an assignment operator.

This depends on your compiler. Better compilers will have an option
which allows you to specify that const variables are kept in code
space (this applies equally to C compilers as well as C++).
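A minimal sketch of the idea (whether the data actually lands in ROM still depends on the toolchain and linker script, as noted above; the names are illustrative):

```cpp
// Fixed string data declared so a cross-compiler *can* place it in a
// read-only section (.rodata, mapped to ROM on an embedded target).
// const alone, as the post says, is a promise about mutation, not a
// guarantee of placement; the toolchain option makes the placement.
static const char kMenuLabel[] = "File";

inline char first_char() { return kMenuLabel[0]; }
```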

--
---------------------------------------------------------------------------
There's a girl lives next door. She's a Swedish American hippie,
at a bus station in Northern Holland sort of a person.
- Jah Wobble
------------------------- Jason Rumney (jas...@pec.co.nz) ----------------

Zsoter Andras

unread,
Apr 24, 1996, 3:00:00 AM4/24/96
to
Paul E. Bennett (p...@tcontec.demon.co.uk) wrote:

>Perhaps we could agree the following classification:

>Micro Embedded - Less than 4Kb total memory requirements
>Small Embedded - Greater than 4kb and Less than 64kb total memory requirement
>Medium Embedded - Greater than 64kb and Less than 1Mb total memory requirement
>Large embedded - Greater than 1Mb total memory requirement

>Almost all of my systems are in the small category.

You mean memory size between 4kb and 64 kb.
Looks like an ideal size to fit my OOF model into.

Andras


Jeffrey Newmiller

unread,
Apr 24, 1996, 3:00:00 AM4/24/96
to
My $0.02:

a) "Object-Based" systems lack polymorphism. I think polymorphism is
only one way to skin the cat. Ada-83 got along pretty well with
generics, and C++ now has templates, too. When function call tables are
appropriate, use polymorphism. When not, don't. Object-orientation is
really a function of the design phase of the development process, and
supporting features in the implementation language may or may not be
available or appropriate. That doesn't make OO any less appropriate for
any application.
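A small C++ sketch of that alternative: a template binds the operation at compile time, with no function-call table, where a virtual call would bind it at run time (types invented for the example):

```cpp
// Two unrelated device types that happen to share an interface shape.
struct Motor  { int speed() const { return 10; } };
struct Heater { int speed() const { return 0;  } };

// Generic code over both: the call is resolved statically, once per
// instantiated type, so no dispatch table or indirect call is needed.
template <typename Device>
int report_speed(const Device& d) {
    return d.speed();
}
```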

b) I examine the domain in which the problem is stated, and try to adapt
my implementation language to that. In the absence of a macro assembler,
this information must be encoded entirely in the comments. With macro
capability, I try to create macros that apply to the domain(s)
appropriate to the function being written. The better the language, the
fewer comments I have to write, because I can adapt it to the languages of
the problem and solution.

Some of you might consider me handicapped: I have a very hard time
understanding functional designs, and I effectively cannot come up with
one that is more than a few lines long, because I know how much
easier it is for me to create and modify OO/OB code. The complaint I
usually get from old-timers is that they "cannot see what is going on";
that is, I write in a variant of the implementation language that is
adapted to the problem.

However, I think that practically any embedded program can be stated in
problem domain language so that solution actions are explicit in the
code. Naturally, I prefer at least a macro assembler, though 100 bytes
of RAM and 2k of code is plenty for some problem domains, and no, I would
be unlikely to use polymorphism for such a small scope.

BTW: I regularly re-use routines from programs that apply to similar
problem domains, but I haven't found much purchasable OO/OB code that was
worth putting in my programs. I view the primary advantage of OO/OB
style as adaptability to changing requirements (especially during the
development process), and it regularly pays off there.

--
---------------------------------------------------------------------------
Jeff Newmiller The ..... ..... Go Live...
DCN:<jdne...@dcn.davis.ca.us> Basics: ##.#. ##.#. Live Go...
Live: OO#.. Dead: OO#.. Playing
Research Engineer (Solar/Batteries O.O#. #.O#. with
/Software/Embedded Controllers) .OO#. .OO#. rocks...5k
---------------------------------------------------------------------------

Bruce R. McFarling

unread,
Apr 24, 1996, 3:00:00 AM4/24/96
to
leko...@nyx10.cs.du.edu (larry kollar) wrote:

>As for small... well, what would a small system have been in the days
>when 16K of RAM was a lot of memory in a computer? Times change, and so
>does the capability of the hardware.

Would 192 bytes of RAM and 16K of ROM qualify as a small system? That's
the 6502-derived micro-controller I've been looking at, if they would
only get around to making it a 16K EPROM, to free up the three 8-bit ports
that have to go to data and address lines when the ROM is external.
And would the OO techniques we've been talking about be useful?
And which aspect? Data type <-> method association? Polymorphism?

Zsoter Andras

unread,
Apr 24, 1996, 3:00:00 AM4/24/96
to
Bob Kitzberger (r...@rational.com) wrote:

>Dave Baldwin (dib...@netcom.com) wrote:
>: After reading all of this, I wonder what you people think small, medium,
>: and large embedded systems are. Maybe the things I was referring to are
>: just considered 'tiny' to you.

> tiny 0 -- 1k lines


> small 1k -- 10k lines

^^^^^^^^^^

Are we still living in the FORTRAN age?

Andras


Todd Hoff

unread,
Apr 24, 1996, 3:00:00 AM4/24/96
to
C. T. Nadovich wrote:

> A manager can pay a co-op to click and drag together a
> toaster's embedded control system in a few minutes using third-party OO
> widgets.

The reason I don't see this happening is that
every embedded target, for cost and other reasons,
is very custom. Devices are swapped in at the last
minute, a lot of ASICs and FPGAs are used, PIOs and
serial protocols vary hugely. Not to mention switching
OSs to get a smaller cost per copy. It's a very difficult
environment to automate for.

> useful skill with megabit DRAMS selling for less than $2. If the toaster

It's not $2, not even close. And yes, people will kill
over $2. And don't forget PROM will probably have to
scale up as RAM increases.

> costs $0.10 more with OO, but I can get it to market 6 months earlier, my
> bottom line tells me what to do.

There's a lot more to making a system than generating
a few classes, but yes, it would be nice to automate
more of it. You seem to be confusing OO with automation.
OO may or may not cut the per unit cost, but automation
would. And if you are automating the style of generated code
is not critical.

> I know that point of view may rub some embedded SW gurus the wrong way,

No, not at all; many of us aggressively push for automation.
But automation implies a degree of standardization.
And standardization doesn't exist in the embedded world.

Lawrence M. Gearhart

unread,
Apr 24, 1996, 3:00:00 AM4/24/96
to
Piercarlo Grandi wrote:
> bobduff> Nah. That's just plain old abstract data types. Modula-2
> bobduff> modules can do this. Ada 83 packages can do this. Neither are
> bobduff> "object oriented", in the usual sense. OO requires
> ^^^^^^^^
> bobduff> polymorphism, in addition to that other good stuff.
>
> Therefore a program which just does not happen to use polymorphism
> cannot be called an OO program :-/.

I disagree. The essence of object-oriented programming is that it
extends the notion of abstract data types in two ways:

1) It introduces inheritance, allowing you to create a new class that
extends the capabilities of an existing class. Polymorphism is simply an
additional feature that adds flexibility to inheritance.
2) It enlarges the concept of abstract data type to include classes whose
resources include more than just data, but reach out into the environment
beyond the computer in a fundamental way.
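Point (1) can be sketched in a few lines of C++ (illustrative types only): the derived class reuses the base class's members and adds capabilities of its own.

```cpp
// Base class: a minimal byte sink.
struct Stream {
    int written = 0;
    void put(char) { ++written; }
};

// Extension by inheritance: CountingStream reuses put() and written,
// and adds line accounting on top.
struct CountingStream : Stream {
    int lines = 0;
    void put_line() { put('\n'); ++lines; }
};
```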

It is really the 2nd characteristic of object-oriented programming which
has made the greatest impact in software architectures, whether real-time
or not. The greatest impact of the 1st characteristic has been in code
reuse, and because of that, indirectly a profound effect on software
architectures. Microsoft invented OLE based on the 2nd characteristic,
but its own software product line is based upon reusable and extensible
classes, which is based on the 1st characteristic.

Mike Albaugh

unread,
Apr 24, 1996, 3:00:00 AM4/24/96
to
C. T. Nadovich (ch...@kd3bj.ampr.org) wrote:
: ch...@nihilist.demon.co.uk (Chris Savage) writes:

: >I'm with you there. How many people use / would think of using OO in a


: >highly cost-constrained _small_ microcontroller application (e.g.
: >engine management, vcr, toaster, fork lift truck ?)

: Today, or in the future?

: Even in highly cost-constrained commercial applications, HARDWARE design
: is usually OO. Often the economics of build vs. buy favor "buy",
: especially for commodity items, like resistors and bolts. Companies
: simply can't vertically integrate every technology.

Strange, I had exactly the opposite notion. The hardware I see,
reading things like MicroProcessor Report, is less "OO" than the stuff I
was doing 20 years ago. How's That? 20 years ago one built a design out
of 8-bit latch chips, 4-bit counters, ALUs with clearly-defined
functionality, etc. Nowadays hardware seems to be heading down the same
two paths as software: quick-hack stuff done in FPGAs or their ilk,
using software-like tools like VHDL in a development environment that
would cause Fred Brooks to shudder with flashbacks, and mega-designs
like the P6 with nasty little sections hand-tweaked by layout gurus.
Sure, in both cases one is "putting the blocks together", but the blocks
are significantly larger, less tested, and less "reusable", in the sense
of "applicable to other uses."

: Why won't software go that way? Why won't managers eventually see the
: advantage of OO? A manager can pay a co-op to click and drag together a


: toaster's embedded control system in a few minutes using third-party OO
: widgets.

I can confidently buy resistors and bolts because they are
commodity items with well-understood specifications and (in the case of
bolts) criminal penalties for mis-specification. In the software world,
the folks most likely to be providing the "objects" can't even get
things like strtol() and memcmp() right, and have legions of lawyers
making sure that if they slag down your machine the most they are liable
for is a replacement copy of the CD-ROM you bought their class-library on.
Your manager-type may _love_ to get the software process dumbed-down,
but the head of production wants no surprises with 5K toasters DOA and
the legal department wants no class-action suits from people injured by
flaming toast ejected at Mach 3 when they tried to toast during the
changeover from Standard to daylight-savings time :-)

: Sure, a high priced engineer can code the same thing in less RAM,


: bumming instructions or hacking FORTH, but is bumming instructions a

: useful skill with megabit DRAMS selling for less than $2. If the toaster
: costs $0.10 more with OO, but I can get it to market 6 months earlier, my


: bottom line tells me what to do.

Just be sure your bottom line includes adequate reserves for
returns and lawsuits. I am _not_ saying that the average assembly-hacker
is a better programmer than the average C++ hacker, but I'm saying that
he/she typically needs to design, rather than hack, if the project is to
work at all. The barriers to commercial use of OO are commercial ones,
and will require all of the following to be true before they fall:

1) a _huge_ increase in the reliability of purchased code.

2) Traceable responsibility, with legal teeth, for the mistakes that
will still occur.

3) Standards for software objects, and a method by which a purchaser
can verify that the purchased objects meet these standards.

Please note that just watching the rise of NetScape, following the
MicroSoft business model of "Dazzle them with glitz and they won't notice
the bugs", seems to argue against holding your breath for #1. Note also that
the Nuclear Energy industry, with _much_ less money than the software
industry, managed to get itself legally exempted from responsibility
for even _death_; so much for #2. And techniques like statistical
sampling which work fine for detecting, say, counterfeit grade-8 bolts
are meaningless with software. It takes only _one_ error in the wrong
place to moot most of the function of a software package. One need only
look at the stream of security bugs in Java, an OO language supposedly
designed with security in mind, for an illustration of this effect.

We will also need convincing proof that Fisher's Fundamental
Theorem ("The better adapted an organism is to its environment, the less
able to adapt to changes in environment, and vice versa") somehow does
not apply to software, alone in the universe of complex objects :-) This
theorem is roughly paraphrased: "Generality and efficiency are
inherently in opposition" in Gerald Weinberg's "The Psychology of
Computer Programming" which I heartily recommend. The refusal of average
programmers to believe this is what, IMHO, blinds them to the gratuitous
complexity they introduce, and the bugs which _inherently_ follow.
That's what makes them average programmers :-) The point where this
becomes a problem for OO is that large agglomerations of really general
objects give rise to "epiphenomena", aka "emergent behavior". Verifying
such complex interactions, to the point where one could achieve #3, is
going to take quite a while. That leaves the question of liability when
un-verified objects interact badly to be resolved by #2 (lawyers and
courts) or #1 (buy only from reputable vendors). But reputable vendors
are already taking a beating at the hands of "get it done now", else
quick-fix software development "silver-bullets" would not be so popular.
"Math is hard": Barbie :-)

: Get rid of the software tailor. That's one less high-priced craftsman I
: need on staff --- not to mention one less schedule that slips because of
: endless delays as the software is hand coded and hand debugged. Sure,
: hand made stuff can be better than "from the rack", but one look at the
: price tag on a custom suit and most of us head to JC Penny.

I own no suits at all, except for a tuxedo. It's a long story,
but the principle is: "If you don't need it, or it doesn't work, it's
not a bargain no matter how cheap it is." The point where real "software
tailors" (I'm not talking about the legions of wannabes who _should_ be
writing Doom-clones in Visual C++ to keep them out of trouble) earn
their keep is in understanding what the problem is, and picking the best
way to solve it. That way may very well be some sort of OO. It will
almost certainly involve decomposition into tractable sub-problems. It
will also involve taking things like product-life-cycle costs into account.

: I know that point of view may rub some embedded SW gurus the wrong way, and
: I'm not saying it's true today, but the dumbing down of software design is
: coming at all levels IMHO.

Has already come. The price is ever-increasing complexity resting
on ever-shakier foundations. I hear those apocalyptic woodpeckers coming :-)

: Fortunately, those high priced engineers can
: all get jobs designing objects --- although they may have to move to
: Russia or India.

Why? If I'm going to sit hunched over a workstation all day
hacking the One True String Class, why should my customer care where
I am?

Ending rant: The thing that distinguishes embedded programming
from the rest is not size or complexity, but the potential for harm,
either physical or fiscal :-) It shares these considerations with
"boring" things like airline reservation systems. By assuming away
the need for robustness, many folks free themselves to pursue really
attractive shiny toys :-)

Mike

| Mike Albaugh (alb...@agames.com) Atari Games (now owned by Williams)
| (No connection to any company owned by the Tramiel family)
| 675 Sycamore Dr. Milpitas, CA 95035 voice: (408)434-1709
| The opinions expressed are my own (Boy, are they ever)


Piercarlo Grandi

unread,
Apr 24, 1996, 3:00:00 AM4/24/96
to
>>> On 23 Apr 1996 11:07:13 GMT, "Bruce R. McFarling"
>>> <ec...@cc.newcastle.edu.au> said:

ecbm> bob...@world.std.com (Robert A Duff) wrote:

>> In article <31783C...@immersive.com>,
>> Marc de Groot <ma...@immersive.com> wrote:

marc> IMO, the essence of object-oriented programming is the explicit
marc> declaration of the relationship between a data structure and the
marc> algorithms that operate on it--which is what a class declaration is.

Entirely correct, if worded a bit clumsily.

bobduff> Nah. That's just plain old abstract data types.

No. Plain old ADTs only imply this relationship; they don't make it
explicit, as a class declaration does.

bobduff> Modula-2 modules can do this. Ada 83 packages can do this.

No, they cannot make that relationship *explicit*, because while in a
class declaration it is explicit that the functions in the class have
something to do with the type of the class, modules and packages have no
such explicit relationship, it is implicit if one follows it as a matter
of convention.

Unfortunately, as I have already remarked in this group, most proper
OO languages support but do not enforce the OO decomposition paradigm.
While they provide a notion of module that is explicitly and directly
oriented at making clear the relationship between a data type
representation and the operators defined over it (*all* such operators
must be in the class), they don't enforce it, for it is usually
perfectly legal to put in a class entities totally unrelated to the
type representation it defines.
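
The distinction being argued can be sketched in C++ (the class and module names here are hypothetical, chosen only for illustration): the class declaration makes the type/operator relationship part of the language, while a module merely groups things by convention and is happy to hold unrelated entities.

```cpp
#include <cassert>

// Explicit association: the language itself ties push/pop to Stack's
// representation; the class declaration *is* the relationship.
class Stack {
    int data[16];
    int top = 0;
public:
    void push(int v) { data[top++] = v; }
    int  pop()       { return data[--top]; }
};

// Conventional association (module/package style): nothing here says these
// functions belong to stack_rep; the grouping is only a naming convention.
namespace stack_mod {
    struct stack_rep { int data[16]; int top; };
    void push(stack_rep& s, int v) { s.data[s.top++] = v; }
    int  pop (stack_rep& s)        { return s.data[--s.top]; }
    int  unrelated()               { return 42; }  // legal: module enforces nothing
}
```

As Grandi notes, even the class doesn't fully enforce the paradigm; `unrelated()` could just as legally appear inside `Stack`.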

bobduff> Neither is "object oriented", in the usual sense.

Let's agree on this... :-)

bobduff> OO requires polymorphism, in addition to that other good stuff.

OO does not require any specific language feature apart from direct and
explicit support and enforcement of its decomposition paradigm. Too bad
that no existing OO language enforces it...

ecbm> I'm always amazed to see how a semantic quibble can be raised
ecbm> even when the point raised is *explicitly* allowed for in the
ecbm> original comment. Marc de Groot offers his opinion of the
ecbm> 'essense' of object oriented programming. Robert Duff offers a
ecbm> checklist to qualify as object oriented 'in the usual sense'.

And his checklist is not a good way to capture the essence.

ecbm> But if something covers all the bases, it's not an 'essence', is
ecbm> it? We can see that it is not a substantial disagreement but only
ecbm> a semantic quibble since the two points can be combined in one
ecbm> statement without any conflict whatsoever:

ecbm> "Although they lack some of the features normally expected in
ecbm> object oriented programming, notably polymorphism, Forth, Modula-2 and
ecbm> Ada-83 capture the essence of object oriented programming, which is the
ecbm> declaration of a specific relationship between a data structure and the
ecbm> algorithms that operate on it. Forth does this with CREATE DOES>,
ecbm> Modula-2 with modules, and Ada-83 with packages."

But I disagree here too. For Forth, Modula-2 and Ada-83 do not make that
relationship explicit, but only as a matter of implicit convention. OO
languages are those languages that _directly_ and _explicitly_ support
the OO decomposition paradigm. One can use modules as if they were
classes, but this is just an implicit convention, if adopted at all.

Lee Webber

unread,
Apr 24, 1996, 3:00:00 AM4/24/96
to
h929...@hkuxa.hku.hk (Zsoter Andras) wrote:
>
> Lee Webber (le...@micrologic.com) wrote:
> >ch...@nihilist.demon.co.uk (Chris Savage) wrote:
[snip] (lw)

> >> highly cost-constrained _small_ microcontroller application (e.g.
> >> engine management, vcr, toaster, fork lift truck ?)
>
> >Hardly anyone. What's amazing to me is how few people would use OO in
> ^^^^^^^^^^^^^^^^^^ ?
> >a somewhat larger system, say 128K to 1 Meg of memory, 16-bit processor
> >-- and how few OO language vendors think such targets are worth
> >supporting.
>
> I guess with OOP you can actually SAVE space because of the improved
> code reusability. Well, if your whole application is 256 bytes long
> then OOP or any other fancy stuff is out of the question.
> But if you have at least two kilobytes you should consider OOP.
>
> Andras
>

And I thought I was an OOP partisan! To me, OOP means polymorphism
(i.e., computed dispatch) and code/data encapsulation. And you're
going to get more than a toy system into 2K+?? I just finished a
12K (code size) system, written in C and assembler, and I needed
(literally!) every byte. Any reuse I got (and I got a lot), I
accomplished by hand-tuning.

For a realistic system, I wouldn't even try to put true OO into
anything less than 64K -- and double that if I were using a general-
purpose commercial development environment.

Zsoter Andras

unread,
Apr 25, 1996, 3:00:00 AM4/25/96
to
Lee Webber (le...@micrologic.com) wrote:
>>
>> I guess with OOP you can actually SAVE space because of the improved
>> code reusability. Well, if your whole application is 256 bytes long
>> then OOP or any other fancy stuff is out of the question.
>> But if you have at least two kilobytes you should consider OOP.
>>
>> Andras
>>

>And I thought I was an OOP partisan! To me, OOP means polymorphism

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^


>(i.e., computed dispatch) and code/data encapsulation. And you're

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

For me OOP means exactly the same.

>going to get more than a toy system into 2K+?? I just finished a

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Well, 2K might be a bit tight (although I have never thought about
the minimal size of a system; I usually use a 486 machine with
megabytes of memory). It also depends on whether the compiler has to be
on the target system or only the executable code is to be put into
that 2K.
For the executable code even 1K would do (well, you need a VMT table
for each class, but they can be small if you don't have too many
methods) and almost nothing more. For an embedded system which
turns a toaster on and off you will not have a huge class hierarchy
anyway.
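
A minimal sketch of what such a dispatch table costs, hand-rolled in C++ (all names here are hypothetical, invented for illustration): per class, one small table of function pointers; per object, one pointer to that table. For a toaster-sized hierarchy this is a handful of bytes.

```cpp
#include <cassert>

// The "VMT": two methods, so two function pointers. That is the whole
// per-class cost of dispatch in this scheme.
struct HeaterOps {
    void (*on)(void* self);
    void (*off)(void* self);
};

// Each object carries a single pointer to its class's table.
struct Heater {
    const HeaterOps* ops;
    bool lit;
};

static void coil_on(void* self)  { static_cast<Heater*>(self)->lit = true;  }
static void coil_off(void* self) { static_cast<Heater*>(self)->lit = false; }

// One table per class, shared by every instance.
static const HeaterOps coil_vmt = { coil_on, coil_off };
```

Calling `h.ops->on(&h)` is the computed dispatch; a compiler's built-in `virtual` mechanism amounts to roughly the same layout.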

>12K (code size) system, written in C and assembler, and I needed

^^^^^^
C compiler output usually occupies a lot of space (and C++ even more).
E.g. my OOF (written in assembly) is around 32K while DOOF (in C++)
is around 120K (+ uses shared libraries).

>(literally!) every byte. Any reuse I got (and I got a lot), I
>accomplished by hand-tuning.

>For a realistic system, I wouldn't even try to put true OO into
>anything less than 64K -- and double that if I were using a general-
>purpose commercial development environment.

For a general purpose development environment even 64K can be tight.
But for a system written for ONLY ONE task the insides of an OOP
implementation can be fairly small.

Andras


Piercarlo Grandi

unread,
Apr 25, 1996, 3:00:00 AM4/25/96
to
>>> On Wed, 24 Apr 1996 08:52:39 -0400, "Lawrence M. Gearhart"
>>> <larry.g...@trw.com> said:

larry.gearhart> Piercarlo Grandi wrote:

bobduff> Nah. That's just plain old abstract data types. Modula-2
bobduff> modules can do this. Ada 83 packages can do this. Neither are
bobduff> "object oriented", in the usual sense. OO requires
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ^^^^^^^^
bobduff> polymorphism, in addition to that other good stuff.

pcg> Therefore a program which just does not happen to use polymorphism
pcg> cannot be called an OO program :-/.

larry.gearhart> I disagree.

I disagree too :-).

larry.gearhart> The essence of object-oriented programming is that it
larry.gearhart> extends the notion of abstract data types in two ways:

larry.gearhart> 1) It introduces inheritance, allowing you to create a
larry.gearhart> new class that extends the capabilities of an existing
larry.gearhart> class.

Therefore Self, and all prototype based systems, and all actor based
systems, cannot be OO :-/, for they have no classes, and some no
inheritance to speak of.

larry.gearhart> Polymorphism is simply an additional feature that adds
larry.gearhart> flexibility to inheritance.

Uhmmmm. So for example polymorphism in non "OO" languages is useless,
because there are no classes and no inheritance :-/.

larry.gearhart> 2) It enlarges the concept of abstract data type to
larry.gearhart> include classes whose resources include more than just
larry.gearhart> data,

Well, the concept of ADT as I have read it described includes data and
operations. Perhaps, given the following paragraph, instead of writing
"more than just data" you meant "more than just computer representation
of data and operations".

larry.gearhart> but reach out into the environment beyond the computer
larry.gearhart> in a fundamental way.

This on the face of it sounds a bit like those sci-fi "virtual reality"
films/novels that have become popular lately. Perhaps you would like to
define more precisely what is the "fundamental way" and how OO can
"reach into the environment".

larry.gearhart> It is really the 2nd characteristic of object-oriented
larry.gearhart> programming which has made the greatest impact in
larry.gearhart> software architectures, whether real-time or not. The
larry.gearhart> greatest impact of the 1st characteristic has been in
larry.gearhart> code reuse, and because of that, indirectly a profound
larry.gearhart> effect on software architectures. Microsoft invented
larry.gearhart> OLE based on the 2nd characteristic, but its own
larry.gearhart> software product line is based upon reusable and
larry.gearhart> extensible classes, which is based on the 1st
larry.gearhart> characteristic.

Not much clearer: how does OLE "reach into the environment beyond the
computer", and what is the "fundamental way" it does so? As far as I can
see OLE incorporates twenty-year-old technology, being optimistic, and
it is just computer technology.

Paul E. Bennett

unread,
Apr 25, 1996, 3:00:00 AM4/25/96
to
In article <4lllme$m...@void.agames.com> alb...@agames.com "Mike Albaugh" writes:

> C. T. Nadovich (ch...@kd3bj.ampr.org) wrote:
>
> : Why won't software go that way? Why won't managers eventually see the
> : advantage of OO? A manager can pay a co-op to click and drag together a
> : toaster's embedded control system in a few minutes using third-party OO
> : widgets.
>
> I can confidently buy resistors and bolts because they are
> commodity items with well-understood specifications and (in the case of
> bolts) criminal penalties for mis-specification. In the software world,
> the folks most likely to be providing the "objects" can't even get
> things like strtol() and memcmp() right, and have legions of lawyers
> making sure that if they slag down your machine the most they are liable
> for is a replacment copy of the CDROM you bought their class-library on.
> Your manager-type may _love_ to get the software process dumbed-down,
> but the head of production wants no surprises with 5K toasters DOA and
> the legal department wants no class-action suits from people injured by
> flaming toast ejected at Mach 3 when they tried to toast during the
> changeover from Standard to daylight-savings time :-)

This is, for the foreseeable future, going to remain the case. Whilst I am
already into re-use of code, this is for code I have written on one project and
can re-use on another because it fits the requirements. I also know the quality
factors of the code because the certification process I use declares that for
me.



> Just be sure your bottom line includes adequate reserves for
> returns and lawsuits.

This should always be the case. No-one is perfect yet and that is what
insurance is for.

> ................I am _not_ saying that the average assembly-hacker
> is a better programmer than the average C++ hacker, but I'm saying that
> he/she typically needs to design, rather than hack, if the project is to
> work at all.

I should hope all programmers spend some of their effort in design.

> ...........The barriers to commercial use of OO are commercial ones,
> and will require all of the following to be true before they fall:
>
> 1) a _huge_ increase in the reliability of purchased code.
>
> 2) Traceable responsibility, with legal teeth, for the mistakes that
> will still occur.
>
> 3) Standards for software objects, and a method by which a purchaser
> can verify that the purchased objects meet these standards.

These same barriers are the ones against re-use generally.



>
> Ending rant: The thing that distinguishes embedded programming
> from the rest is not size or complexity, but the potential for harm,
> either physical or fiscal :-) It shares these considerations with
> "boring" things like airline reservation systems. By assuming away
> the need for robustness, many folks free themselves to pursue really
> attractive shiny toys :-)

Consider also that a robust system (one that does not crash at the first
opportunity) is likely to last in service a very long time. This rests on the
usual premise that if it ain't bust, don't fix it. Such a system will therefore
endure until there is a real change in the requirements.

Bruce R. McFarling

unread,
Apr 26, 1996, 3:00:00 AM4/26/96
to p...@aber.ac.uk

p...@aber.ac.uk (Piercarlo Grandi) wrote:

>But I disagree here too. For Forth, Modula-2 and Ada-83 do not make that
>relationship explicit, but only as a matter of implicit convention. OO
>languages are those languages that _directly_ and _explicitly_ support
>the OO decomposition paradigm. One can use modules as if they were
>classes, but this is just an implicit convention, if adopted at all.

I wasn't setting out any claim for my own part, I was just
pointing out that the objection to the original claim was a semantic
quibble. Just because the two comments can be combined without conflict
does *not* prove that one, the other, or both are right!

Bob Kitzberger

unread,
Apr 26, 1996, 3:00:00 AM4/26/96
to

Zsoter Andras (h929...@hkuxa.hku.hk) wrote:
: Bob Kitzberger (r...@rational.com) wrote:

: > tiny 0 -- 1k lines
: > small 1k -- 10k lines
: ^^^^^^^^^^

: Are we still living in the FORTRAN age?

No doubt you object to "lines" of code metric. I agree that SLOC
is a poor metric for most uses, but for our purposes (a very rough
feel for what small/medium/large systems are) I think it is acceptable.


--
Bob Kitzberger Rational Software Corporation r...@rational.com

http://www.rational.com http://www.rational.com/pst/products/testmate.html

Bob Kitzberger

unread,
Apr 26, 1996, 3:00:00 AM4/26/96
to

Lee Webber (le...@micrologic.com) wrote:

: And I thought I was an OOP partisan! To me, OOP means polymorphism
: (i.e., computed dispatch) and code/data encapsulation. And you're
: going to get more than a toy system into 2K+?? I just finished a
: 12K (code size) system, written in C and assembler, and I needed
: (literally!) every byte. Any reuse I got (and I got a lot), I
: accomplished by hand-tuning.

If you limit your view of OO to only those systems that include
computed/dynamic dispatch, then I think you are needlessly limiting
the number of embedded systems that can benefit from OO. IMHO, you
get a very large payback for merely using object-based programming
techniques, and an OO design. OO-based features such as encapsulation
and a contract-based programming model shouldn't necessarily bloat
your code nor slow your system's speed, but can bring great benefits
(the "ilities": maintainability, testability, etc.).
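
A minimal sketch of that object-based style, assuming C++ with nothing virtual (the class and method names are invented for illustration): encapsulated state plus preconditions as a contract, and every call compiles to a direct call with no dispatch table or indirection.

```cpp
#include <cassert>

// Object-based, not object-oriented: no inheritance, no computed dispatch.
// Encapsulation and contract checks here cost no vtable and no RAM beyond
// the object's own fields.
class RingBuffer {
    unsigned char buf[8];
    unsigned head = 0, count = 0;
public:
    bool full()  const { return count == sizeof buf; }
    bool empty() const { return count == 0; }
    void put(unsigned char b) {        // contract: caller ensures !full()
        assert(!full());
        buf[(head + count++) % sizeof buf] = b;
    }
    unsigned char get() {              // contract: caller ensures !empty()
        assert(!empty());
        unsigned char b = buf[head];
        head = (head + 1) % sizeof buf;
        --count;
        return b;
    }
};
```

The maintainability and testability benefits come from the enforced interface, not from any run-time machinery.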

Dwight Elvey

unread,
Apr 26, 1996, 3:00:00 AM4/26/96
to

In article <chris.8...@kd3bj.ampr.org>, ch...@kd3bj.ampr.org (C. T. Nadovich) writes:
..... much stuff .....

|>
|> Why won't software go that way? Why won't managers eventually see the
|> advantage of OO? A manager can pay a co-op to click and drag together a
|> toaster's embedded control system in a few minutes using third-party OO
|> widgets. Sure, a high priced engineer can code the same thing in less RAM,

|> bumming instructions or hacking FORTH, but is bumming instructions a
|> useful skill with megabit DRAMS selling for less than $2. If the toaster
|> costs $0.10 more with OO, but I can get it to market 6 months earlier, my
|> bottom line tells me what to do.
|>
|> Get rid of the software tailor. That's one less high-priced craftsman I
|> need on staff --- not to mention one less schedule that slips because of
|> endless delays as the software is hand coded and hand debugged. Sure,
|> hand made stuff can be better than "from the rack", but one look at the
|> price tag on a custom suit and most of us head to JC Penny.
|>
..... more stuff .....

I find it interesting that I saw and worked with an example of
the "pay a co-op to click and drag together" concept once. I'll
pass on what happened. Two identical projects were started to meet
a design specification for a product. I and another fellow at another
company started the project, not realizing that we would eventually
meet in a business sense. Anyhow, the other
company hired a sharp kid who quickly hacked together a preliminary
software demonstration using one of the popular OO tools. I started
my design with the klunky old procedural Forth. After 3 weeks,
we both did a quick demonstration for the customer of the basic
operations. Mine wasn't real pretty but showed that I had a good
understanding of the problem at hand. The other fellow's was impressive
from the user interface standpoint but missed the specifications
in many aspects. Neither of us realized that we were competing against
a third party. I spent the next 3 weeks polishing up the user interface
and making general improvements in speed. I verified that it met the
specifications and, having a good understanding of the problem,
made it so I could make quick changes to fix problems with the
customer's specification ( that I knew wouldn't work as desired )
and made the next trip to the customer. We both got there only
to find that the customer had decided to go with an off-the-shelf
product that only met their spec in a few regards. Oh well!!
I checked with the other fellow to see how he was coming along.
He hadn't proceeded much beyond the original. It seemed
that many of the tools he had used didn't have the proper
modifiable properties to make them work quite as required.
He was in the process of writing classes from scratch ( with
some borrowing ) to solve the incompatibility problems.
What I learned from this was that:
1. Never count your chickens before they hatch.
2. One-size-fits-all rarely fits well.
3. There is no replacement for understanding the problem.
One of the things that Forth touted as a feature years
ago was the potential for reusable code. We found that really
what was needed was reusable simple tools and not complex
tools. OO hasn't solved any of the problems that the
Forth community came up against when looking at the problems
of reusability. When one sees things like Smalltalk and how
well things seem to be structured, one misses the fact that
Smalltalk is a language and not the application. Applications
have complexities that can't have been understood by the
language writer or the writer of the library. The more one
depends on the language to handle the complexities of the
application, the more likely the application will fail in
unexpected ways. This comes back to point number 3: understanding
the problem is your best way to go. The biggest problem with
these kinds of bugs is that they are the most costly to find.
All this isn't to say that reusability isn't important. I reuse
a lot of my code. I never reuse it blindly. Remember,
" A good engineer never redesigns the same wheel, he designs a
proper wheel."
IMHO
Dwight

David Taylor

unread,
Apr 27, 1996, 3:00:00 AM4/27/96
to

Scott Wheeler (sco...@bmtech.demon.co.uk) wrote:

No, it isn't. Two (typical, I think) samples break down as

A B

Executable code 79% 75%
Tables (your maps?) 14% 18%
Program constants 5% 5%
Data and stack 3% 2%

Example A is production-ready and totals 366,000 bytes, while example B
totalled 412,000 bytes at two weeks into development (when these stats
were taken).

--


-- Dave

Zsoter Andras

unread,
Apr 27, 1996, 3:00:00 AM4/27/96
to

Piercarlo Grandi (p...@aber.ac.uk) wrote:

>Therefore Self, and all prototype based systems, and all actor based

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^


>systems, cannot be OO :-/, for they have no classes, and some no
>inheritance to speak of.

I do not know about Self but I know about Dreams, which is prototype-based.
I do not know -- however -- why it is called OOP.
I like the idea but it seems to be something else, not OOP as we know it.
There should be another name for it.

Andras


Chris Savage

unread,
Apr 27, 1996, 3:00:00 AM4/27/96
to

deta...@holli.com (David Taylor) wrote:

|
|The engine management systems I know of run 300K+ in size and are
|highly cost constrained--production in the millions. Is this small??
|

|--
|
|
| -- Dave
Ok so I chose a bad example out of thin air early in the morning. Now
please address the point I was making.

=================================================
Chris Savage MSc Applications Software Engineer
Sevcon Ltd. Kingsway Gateshead NE11 0QA UK
Tel: +44 191 487 8516 Fax: +44 191 482 4223
=================================================

Chris Savage

unread,
Apr 27, 1996, 3:00:00 AM4/27/96
to

ch...@kd3bj.ampr.org (C. T. Nadovich) wrote:

|
|Even in highly cost-constrained commercial applications, HARDWARE design
|is usually OO. Often the economics of build vs. buy favor "buy",
|especially for commodity items, like resistors and bolts. Companies
|simply can't vertically integrate every technology.
|

|Why won't software go that way? Why won't managers eventually see the
|advantage of OO? A manager can pay a co-op to click and drag together a
|toaster's embedded control system in a few minutes using third-party OO
|widgets. Sure, a high priced engineer can code the same thing in less RAM,
|bumming instructions or hacking FORTH, but is bumming instructions a
|useful skill with megabit DRAMS selling for less than $2. If the toaster
|costs $0.10 more with OO, but I can get it to market 6 months earlier, my
|bottom line tells me what to do.
|
|Get rid of the software tailor. That's one less high-priced craftsman I
|need on staff --- not to mention one less schedule that slips because of
|endless delays as the software is hand coded and hand debugged. Sure,
|hand made stuff can be better than "from the rack", but one look at the
|price tag on a custom suit and most of us head to JC Penny.
|

|I know that point of view may rub some embedded SW gurus the wrong way, and
|I'm not saying it's true today, but the dumbing down of software design is
|coming at all levels IMHO. Fortunately, those high priced engineers can
|all get jobs designing objects --- although they may have to move to
|Russia or India.
|

|--
|73 de KD3BJ SK .. http://www.kd3bj.ampr.org
|+1 215 257 0635 (voice) +1 215 257 2744 (data/fax)

Your bottom line will get you in court with a personal injury suit
within six weeks of your first fork lift controller hitting the
market. By neatly avoiding the endless delays those terrible engineers
introduce, your toaster will happily burn someone's kitchen down
because of something the guy two steps removed who sold you your
software widgets didn't foresee.
Please tell me what company you work for and what they make. I want to
be sure of never buying any.

Bob Kitzberger

unread,
Apr 27, 1996, 3:00:00 AM4/27/96
to

ch...@kd3bj.ampr.org (C. T. Nadovich) wrote:

|Why won't software go that way? Why won't managers eventually see the
|advantage of OO? A manager can pay a co-op to click and drag together a
|toaster's embedded control system in a few minutes using third-party OO
|widgets. Sure, a high priced engineer can code the same thing in less RAM,
|bumming instructions or hacking FORTH, but is bumming instructions a
|useful skill with megabit DRAMS selling for less than $2. If the toaster
|costs $0.10 more with OO, but I can get it to market 6 months earlier, my
|bottom line tells me what to do.

The "widgets" of which you speak -- where can I get some? ;-)
I ask that with a bit of sarcasm; generally reusable components
are not designed for real-time applications (your high-priced
engineers would know that, but alas, you've fired them :-)

As I've worked in and around the real-time embedded market these past
12 years, I've seen its resistance to change, and in general there is
_good reason_ for this resistance. The stakes are often very high in
embedded systems development, and jettisoning proven development
methods is high-risk. Switching from time-sliced periodic scheduling
to event-driven scheduling approaches for avionics applications (for
example) is something people take very seriously. The risks
range from the expensive (cutting a new release of engine controller
software, burning it into PROMs, shipping it to all car service
departments, recalling the cars, etc.) to the life-critical
risks of avionics and medical applications.

Switching from assembler to C took a long time. Switching from
in-house kernels to third-party kernels is still happening at many
sites. Switching to Ada or C++ is slowly happening (admittedly the
pitfalls of C++ are still tripping up people). Switching from
structural to OO methods is slowly happening. And despite all the
hype, I would expect there to be a _long_ lag before Java is embraced,
if ever.

Are embedded systems developers missing out on not embracing OO
design and development methods? Perhaps, but they're the ones who
know what the risks and rewards are for their particular domain.

|I know that point of view may rub some embedded SW gurus the wrong way, and
|I'm not saying it's true today, but the dumbing down of software design is
|coming at all levels IMHO. Fortunately, those high priced engineers can
|all get jobs designing objects --- although they may have to move to
|Russia or India.

At one level, the software development world is dividing between
those that make frameworks, class libraries, domain-specific
architectures, etc. and those that implement real systems using
those components. This is a good thing, IMHO, and there is nothing
inherently "better" about framework designers vs. application
designers.

Even as these components start becoming available, this does not
mean that the competence level of those using the components
can decrease.

And I certainly see no evidence that framework development is
moving to India or Russia in droves.

Zsoter Andras

unread,
Apr 28, 1996, 3:00:00 AM4/28/96
to

Bob Kitzberger (r...@rational.com) wrote:
>Zsoter Andras (h929...@hkuxa.hku.hk) wrote:
>: Bob Kitzberger (r...@rational.com) wrote:

>: > tiny 0 -- 1k lines
>: > small 1k -- 10k lines
>: ^^^^^^^^^^

>: Are we still living in the FORTRAN age?

>No doubt you object to "lines" of code metric. I agree that SLOC
>is a poor metric for most uses, but for our purposes (a very rough
>feel for what small/medium/large systems are) I think it is acceptable.

Well, so which one of the following is one LINE:

OVER + SWAP DUP *

or

OVER
+
SWAP
DUP
*

The functionality is the same!

Andras


Bob Kitzberger

unread,
Apr 28, 1996, 3:00:00 AM4/28/96
to

Zsoter Andras (h929...@hkuxa.hku.hk) wrote:
: Bob Kitzberger (r...@rational.com) wrote:

: >No doubt you object to "lines" of code metric. I agree that SLOC


: >is a poor metric for most uses, but for our purposes (a very rough
: >feel for what small/medium/large systems are) I think it is acceptable.

: Well, so which one of the following is one LINE:

[...]

Can we drop this now? Every reasonably experienced developer
understands that SLOC is usually a poor metric.

Paul Long

Apr 28, 1996

Lawrence M. Gearhart wrote:
[snip]
> I disagree. The essence of object-oriented programming is that it
> extends the notion of abstract data types in two ways:[snip]

I agree with you that object-oriented programming is based on the ADT;
however, many people use the definition of "object-oriented" that was
established by Dr. Peter Wegner several years ago in a paper published in
the ACM's _SIGPLAN Notices_ called something like "Dimensions of
Object-Oriented Languages." He subsequently wrote a more pedestrian version
as an article for _Byte_ magazine.

In the paper and article, Dr. Wegner says that an "object-based" language
supports the unification of data and code. Examples are Ada 83 and
Modula-II. A "class-based" language allows a programmer to incorporate
common behavior (and state variables) for a _set_ of entities into a
"class." The only class-based languages are academic (e.g., CLU?). An
"object-oriented" language allows the programmer to define a new class based
on another class--inheritance. Examples are, of course, Smalltalk, C++, and
Java. In other words:
object-based = data + code
class-based = object-based + classes
object-oriented = class-based + inheritance

Since it is obvious from this thread that there is no consensus on what, in
particular, "object-oriented" means, and since not one of these definitions
is any more correct than another, how about adopting Dr. Wegner's
definitions and hierarchy? His have a history in the literature and the
industry, so let's all just accept them and move on to something more
interesting, such as what comes after OO?

BTW, although his paper and article are about programming languages, I have
found them useful for describing other things such as coding style and
development environments. For example, I have written an object-based symbol
table for a compiler, a class-based file-transfer stack over X.25, and a
real-time object-oriented performance monitor for a DS3 repeater. All were
written in C, not C++. I also published a table in _Object Magazine_ that
categorizes development environments using Dr. Wegner's terminology.

--
Paul Long mailto:pl...@perf.com Smith Micro Video Products
http://www.teleport.com/~pciwww/ http://www.smithmicro.com/
"I hate quotations." - Ralph Waldo Emerson

Bruce R. McFarling

Apr 29, 1996

ch...@nihilist.demon.co.uk (Chris Savage) wrote:
>ch...@kd3bj.ampr.org (C. T. Nadovich) wrote:
>
>|
>|Even in highly cost-constrained commercial applications, HARDWARE
>|design is usually OO. Often the economics of build vs. buy favor
>|"buy", especially for commodity items, like resistors and bolts.
>|Companies simply can't vertically integrate every technology.

>|Why won't software go that way? Why won't managers eventually see the
>|advantage of OO? ...

>Your bottom line will get you in court with a personal injury suit
>within six weeks of your first fork lift controller hitting the
>market. By neatly avoiding the endless delays those terrible
>engineers introduce your toaster will happily burn someone's
>kitchen down because of something the guy two steps removed
>who sold you your software widgets didn't foresee.

If there is going to be an appeal to conventional producer
commodities, it might be useful to consider their history. Firms
did not start at hand crafting each nut to fit each bolt, and jump
immediately in the next step to buying nuts and bolts as commodities.
Instead, after the stage of hand crafting fasteners came the
standardization of the production process of individual firms.
And sometimes there was a third step where firms that were skilled
in making the machinery to make standardized parts sold the
machinery. But in any event, the initial customers were using
internally standardized parts before the parts started to
become available as commodities. And in each step in the
process, the lessons of the previous stage were relied upon,
and in each stage unexpected problems arose that had to be
solved.
Is there any reason to expect this to be any different?
When firms find that there are some standardized software
components that they have developed that can be sold without
giving away any of their core competence, and other firms find
that the reliability is as good or better as they can do on their
own, commodification of software components will proceed. But
internal standardization and breakdown into components may well
be crucial first steps toward commodity software components, since
monolithic hand crafted applications may have too much of the core
competences embedded inside to sell without selling the store.

xian the desk lisard

Apr 29, 1996

thus spake Zsoter Andras in comp.lang.forth...
. Piercarlo Grandi (p...@aber.ac.uk) wrote:

. >Therefore Self, and all prototype based systems, and all actor based
. ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
. >systems, cannot be OO :-/, for they have no classes, and some no
. >inheritance to speak of.

. I do not know about Self but I know about Dreams which is prototype-based.
. I do not know -- however -- why it is called OOP.
. I like the idea but it seems to be something else, not OOP as we know that.
. There should be another name for it.

it's called OOP because it's about objects, not classes. objects
don't need to belong to classes to be objects. besides, the work on
prototype-based object oriented systems dates back at least as far as
smalltalk.
--
xian the desk lisard -- cdah...@comp.brad.ac.uk
[ red, pink and blue, but mainly purple ribbons ]
thirty years ago how the words would flow with passion and precision
--- you had them all on your side, didn't you?

Emil P. Rojas

Apr 29, 1996

stop

Piercarlo Grandi

Apr 29, 1996

>>> On 29 Apr 1996 16:04:41 GMT, cdah...@comp.brad.ac.uk (xian the desk
>>> lisard) said:

cdahello> thus spake Zsoter Andras in comp.lang.forth...
cdahello> . Piercarlo Grandi (p...@aber.ac.uk) wrote:

pcg> Therefore Self, and all prototype based systems, and all actor based
pcg> ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pcg> systems, cannot be OO :-/, for they have no classes, and some no
pcg> inheritance to speak of.

Zsoter> I do not know about Self but I know about Dreams which is
Zsoter> prototype-based. I do not know -- however -- why it is called
Zsoter> OOP. I like the idea but it seems to be something else, not OOP
Zsoter> as we know that. There should be another name for it.

cdahello> it's called OOP because it's about objects, not classes.

Well, actually "Object Oriented" is a terrible misnomer, for OO is
really about encapsulation of ADTs (which some, for some mysterious
reason, call "object based" programming), not objects or
classes as such.

cdahello> objects don't need to belong to classes to be objects.

Rather: encapsulation of ADTs can be done in (prototype) objects as well
as in classes.

cdahello> besides, the work on prototype-based obejct oriented systems
cdahello> dates back at least as far as smalltalk.

Even worse :->, work on actor systems by Hewitt directly inspired the
design of early PARC Smalltalk (in particular Smalltalk-72), and that
inspiration is vestigial in the terminology of Smalltalk-80, which
rather inappropriately uses the same terms as actor systems. This has
proven rather unfortunate, as a lot of people tend to take labels
("Object Oriented", "Message Sending") rather too literally.

Zsoter Andras

Apr 30, 1996

Bruce R. McFarling (ec...@cc.newcastle.edu.au) wrote:

> If there is going to be an appeal to conventional producer
>commodities, it might be useful to consider their history. Firms
>did not start at hand crafting each nut to fit each bolt, and jump
>immediately in the next step to buying nuts and bolts as commodities.
>Instead, after the stage of hand crafting fasteners came the
>standardization of the production process of individual firms.

^^^^^^^^^^^^^^^^^^^^^


>And sometimes there was a third step where firms that were skilled
>in making the machinary to make standardized parts sold the
>machinery. But in any event, the initial customers were using
>internally standardized parts before the parts started to
>become available as commodities. And in each step in the
>process, the lessons of the previous stage were relied upon,
>and in each stage unexpected problems arose that had to be
>solved.
> Is there any reason to expect this to be any different?
>When firms find that there is are some standardized software

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^


>components that they have developed that can be sold without
>giving away any of their core competence, and other firms find
>that the reliability is as good or better as they can do on their
>own, commodification of software components will proceed. But
>internal standardization and breakdown into components may well
>be crucial first steps toward commodity software components, since
>monolithic hand crafted applications may have too much of the core
>competences embedded inside to sell without selling the store.

Well, the keyword is STANDARDIZATION and the software world lacks it.
How else could such a thing happen that Win95 sells millions of copies?
To what standard does it conform? (I mean, in the OS world there is such
a thing as POSIX, which is definitely something else than Win95!)
As long as customers are HAPPY to buy such products you have
no choice but to hand craft.
If even your hardware is custom made (as in embedded controllers)
you really have no choice.
One day we might have the bolts and nuts of SW but first there must
be a demand for them in the market.
Today there are no reliable standard components out there. :-(
[IMHO STANDARD means that the specification is available
to anyone and the parts from different vendors are interchangeable.
"X.Y. Corp's products are very reliable" has nothing to do with
a standard.]

Andras

Marc Furguson

Apr 30, 1996

Harry V. Bims (bi...@isl.Stanford.EDU) wrote:
: >Henning Rietz (ri...@condat.de) wrote:
: >
: >: I can say "everybody" is using OO in some areas (mainly network
: >: management, switch provisioning, customer care), BUT there are (almost)
: >: no examples in the area of (small) embedded systems, main reasons for
: >: that being:
: >
: >: - "OO systems are too slow"
: >: - "OO systems eat up too much memory"

: Quite the contrary. Here at Wireless Access, I have designed and built
: a real-time, objected-oriented system running on a PC. The system
: creates a local area two-way paging environment. It schedules and
: manages multiple channels and users simultaneously. A comparable system
: was implemented without OO technology, and it requires a 250MHz Alpha to
: do the same job. In addition, code changes can occur much more quickly

I don't think anyone can draw any conclusions about these "comparable"
systems. Perhaps you're just a better designer or coder than the person,
or persons, unknown who implemented the other system.

: of 15K lines, that no longer becomes an issue. In fact, because of code
: and data reuse, my code is more efficient from a memory requirements
: perspective than the non-OO counterpart.

I am an OOPs convert and I think I can see the many benefits of the OOPs
paradigm, but I think it is pointless trying to generalise statements
about speed, efficiency and space requirements. As above, anyone can just
recite the gospel according to <insert your bandwagon here> and avoid
producing hard facts. I would propose two axioms about OOA/OOD/OOP:

1) Software engineering and the software lifecycle benefits at all
stages from object-oriented techniques in terms of time and cost
to implement.

2) The higher the level of the software components being produced, the
greater the processing and space requirements needed to handle
them.

Assuming that this is the generally accepted situation, the burden of
proof falls on those who would disagree with the two points above.

On the other hand you may not agree with these axioms and you would
be at liberty to use whatever axioms you think work. The problem
is that the true facts are hidden in the noise when you try to
argue across different languages, projects, implementation domains,
staff profiles etc. etc. If I wanted to 'defend' my axioms I think
it would require a book for me to feel that I had made my case
clearly. I don't actually care to defend them; I simply state them
and leave it at that. On the other hand you are free to not accept
them, and leave it at that too. If you care to attack them I would
expect about a book's worth of arguments and facts before it would
stand a chance of changing my present opinion.
e.g. "Eiffel For Video Games Programmers" ISBN ?????????

Afterthought:
"God, what if there is such a book, I'll look so stooopid"- worry, worry.

--
Mark Ferguson

Elliott N Hughes

Apr 30, 1996

Zsoter Andras wrote:
>
> I do not know about Self but I know about Dreams which is prototype-based.
> I do not know -- however -- why it is called OOP.

Self's probably called object-oriented because everything's an object. In
my opinion, the prototype-based languages are *more* OO than the class-based
oddities. What is a class? It's not an object, is it?

> I like the idea but it seems to be something else, not OOP as we know that.


> There should be another name for it.

Nah, "OOP as [you] know it" should change its name.


- enh

Matt Kennel

May 1, 1996

Elliott N Hughes (en...@minster.york.ac.uk) wrote:

: Zsoter Andras wrote:
: >
: > I do not know about Self but I know about Dreams which is prototype-based.
: > I do not know -- however -- why it is called OOP.

: Self's probably called object-oriented because everything's an object. In
: my opinion, the prototype-based languages are *more* OO than the class-based
: oddities. What is a class? It's not an object, is it?

What do you mean "every" "thing" is an object?

Is every character of program source "an object"?

Is the current "state" of the memory 'an object'? Is the current
stack and execution point 'an object'?


Bruce R. McFarling

May 1, 1996

Paul Long <pl...@computek.net> wrote:
>... In other words:


> object-based = data + code
> class-based = object-based + classes
> object-oriented = class-based + inheritance

Speaking as a layman on the topic, this is an excellent
way to get beyond semantic quibbling and look at the issue involved.
It makes it far easier to follow whether someone is disagreeing about
substance or expression of the substance.

And the most important point is this:

>BTW, although his paper and article are about programming languages,
>I have found them useful for describing other things such as coding

>style and development environments. For example, ... [examples]


>All were written in C, not C++.

Whether or not a problem can be solved efficiently using
object-oriented *programming*, and whether or not it can be solved
efficiently using a particular object-oriented *language*, or even
more specifically a particular object-oriented language implementation
are not identical questions. A language can be not-entirely object
oriented (as defined above), but still support object oriented
programming. The original argument that attracted my attention
to this thread was that Forth's CREATE ... DOES> provides the
'essence' of object oriented programming, and on the definitions
above, we can see that this is literally true. CREATE ... DOES>
is an object-based programming construct, and object-based programming
is the common core of the three levels above. And OOP
requires more than just this base, which someone programming in
FORTH will have to provide in the normal way, by defining words
that provide these capabilities.

Zsoter Andras

May 1, 1996

Elliott N Hughes (en...@minster.york.ac.uk) wrote:
>Zsoter Andras wrote:
>>
>> I do not know about Self but I know about Dreams which is prototype-based.
>> I do not know -- however -- why it is called OOP.

>Self's probably called object-oriented because everything's an object. In
>my opinion, the prototype-based languages are *more* OO than the class-based
>oddities. What is a class? It's not an object, is it?

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

In my DOOF it is. ;-) (Although not EVERYTHING is an object in it.)


>Nah, "OOP as [you] know it" should change its name.

Or OOP as I, C++-ists, Turbo Pascal programmers, etc. know it
should change its name. ;-)

Andras


Ell

May 1, 1996

Matt Kennel (m...@caffeine.engr.utk.edu) wrote:
: Elliott N Hughes (en...@minster.york.ac.uk) wrote:
::Self's probably called object-oriented because everything's an object. In

::my opinion, the prototype-based languages are *more* OO than the class-based
::oddities. What is a class? It's not an object, is it?

: What do you mean "every" "thing" is an object?

:
: Is every character of program source "an object"?
:
: Is the current "state" of the memory 'an object'? Is the current
: stack and execution point 'an object'?

I do not know if Self considers them so, but from a larger OO perspective, yes
on all counts.

Elliott
