
Nov 18, 1992, 8:53:00 PM

Regarding the issue of the relative "efficiency" of programming languages

available in different mathematical/symbolic computation systems, I would

note that years ago I started off playing with SMP (and eventually did some

real work with it) on a VAX-780 which typically had 20 users; now

I can run Mathematica on a DECstation 3100 or 5000 all by myself.

The ratio of the hardware power available to me now to that then is

something like 50-100; thus differences in the *runtime* "efficiency" of

programming languages of factors of 4 -- or even more -- in what are

*supposed to be Very-High Level (VHL) programming languages* (following

in the lineage of APL, the first widely-used VHL language) --

leave me quite cold. What is important to me is the

total time spent formulating and solving a problem.


[I note that the same relative timing ratio may be significant

or not depending on context -- a difference between 2 and 20 seconds

for a single operation is important to me when working away on

the interactive simplification of a large expression, because

psychologically you're twiddling your thumbs waiting for the result,

but the difference between 1 hour and 10 hours is not important --

they're both batch jobs where I can go and think about something

else.]

These "efficiency" arguments have all appeared before in the history

of computing when the first FORTRAN compiler appeared to challenge

assembly language; "those who do not know history are condemned to

repeat it". (See the comments of John Backus, the "father

of FORTRAN" in his Turing Award Lecture: "Can Programming Be

Liberated from the von Neumann Style? A Functional Style and

Its Algebra of Programs." I believe Backus also discusses

these issues in the "History of Programming Languages" conference

proceedings edited by Richard Wexelblat (1978)).

To me, it seems that Maple is certainly a VHL programming system in

terms of the facilities available, but that it does not quite have a

VHL programming language as linguistic glue to weld together its different

facilities -- that its lineage is basically from what Backus calls

"Von Neumann" (ALGOL/FORTRAN/Pascal) languages. (In a somewhat

similar vein, the IMSL scientific subroutine library offers individual

VHL numerical problem-solving facilities, but you still have to use

"low-level" FORTRAN to weld together various IMSL routines in a

stand-alone program.) In contrast SMP/Mathematica's (spiritual,

at least) programming lineage comes more from APL.

Finally, it seems to me disingenuous to attack the Mathematica

user-level programming language on the grounds of "if it is so good,

why don't the developers of Mathematica use it to bootstrap their own

system?" (To me, a bit like asking, "if the set-theoretic query

languages of relational database management systems are so good, why

aren't they written in SETL?") The whole point of VHL languages is to

insulate the *user* from details which reflect the details of the

computer hardware architecture. But given that you're actually going

to run the system on a `Von Neumann' architecture, the *developers* of

the system can hardly afford to be ignorant of these details.

If the developers at WRI sweat away and write in C to get

factors of 4 difference in runtime, then good for them -- their

effort is amortized over the whole Mathematica user community.

But after all, we're paying them for their sweat. What I care

about is my effort in using their system, not their effort

in creating it.

Ron Balden

Nov 19, 1992, 12:43:49 PM

In my view one of the most powerful abstractions for programming

mathematical algorithms is recursion. If a user writes a recursive

program in Mathematica, it appears that there is an enormous

efficiency penalty, at least compared to using built-in Map-like

commands. This is quite unfortunate. It leads to intellectual

inefficiency. If recursion and pattern matching were so good, perhaps

we wouldn't see so much claptrap in "fast Mathematica" programs

with # and &, which (I suspect) are merely ways to AVOID pattern matching

and the usual function call overhead.

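The penalty described here is easy to demonstrate outside Mathematica as well. A minimal Python sketch (illustrative only, not the Mathematica internals being discussed): an explicitly recursive "map" pays a function-call cost, and list-rebuilding cost, at every element, where the built-in does the loop internally.

```python
# Illustrative sketch: an explicitly recursive map versus the built-in.
# The recursive version makes one function call (and one list copy)
# per element; the built-in map iterates internally.
def recursive_map(f, lst):
    if not lst:                                  # base case: empty list
        return []
    return [f(lst[0])] + recursive_map(f, lst[1:])

squares_rec = recursive_map(lambda x: x * x, list(range(10)))
squares_builtin = list(map(lambda x: x * x, range(10)))
assert squares_rec == squares_builtin            # same result, different cost
```

The results agree; the difference is purely in the per-element overhead, which is the point of the complaint above.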

digression ...

When IBM came out with the IBM-360 design in the mid 1960s, they

missed the boat on recursion and subroutine calling generally.

The architects seemed to not know about stacks:

They left out instructions to increment a stack pointer, and indeed

even screwed up by not having more than 16 bit relative addresses

(limiting the size of a stack). A subroutine call in PL/I was initially

INCREDIBLY expensive (for C fans, consider calling malloc at every

subroutine call for space to save registers).

Some people think the IBM 360 set back the progress of programming

languages by 10 years or more.

end of digression...

Tools -->and their efficiency <-- affect the way you think.

Mathematica is, in my view, poor in this regard. It encourages you

to think inefficiently because, in spite of providing so many tools,

it does not promote the right ones, or provide efficient versions of

them. (Recursion being one; information hiding a second prominent one.)

Stephen Wolfram's ambition (according to an interview in an article in

The Chronicle of Higher Education some months ago) appears to be

to have the Mathematica language replace all others as a first

programming language (i.e. instead of Pascal, Modula II, Scheme,

Ada, ... etc). I am not enthusiastic about this prospect.

RJF

--

Richard J. Fateman

fat...@cs.berkeley.edu 510 642-1879

Nov 19, 1992, 12:13:34 PM

In article <18NOV199...@reg.triumf.ca> orw...@reg.triumf.ca (BALDEN,

RON) writes:

> Regarding the issue of the relative "efficiency" of programming languages

> available in different mathematical/symbolic computation systems I would

> note that years ago I started off playing with SMP (and eventually did some

> real work with it) on a VAX-780 which typically had 20 users; now

> I can run Mathematica on a DECstation 3100 or 5000 all by myself.

> The ratio of the hardware power available to me now to that then is

> something like 50-100; thus differences in the *runtime* "efficiency" of

> programming languages of factors of 4 -- or even more -- in what are

> *supposed to be Very-High Level (VHL) programming languages* (following

> in the lineage of APL, the first widely-used VHL language) --

> leave me quite cold. What is important to me is the

> total time spent formulating and solving a problem.

>

(More sensible remarks deleted)

I agree with you completely. I find it easier to quickly generate correct

programs in Mathematica than in MAPLE. The fact that these programs then

do not execute as quickly as in MAPLE is *usually* of little significance.

This is because the programs I write are generally one-shot affairs---I

run them once and then they are history. In most other cases when the

program *is* used repeatedly it executes quickly enough that I don't

really care whether it can be made to run even faster. In any case, the

time I spend programming is typically much greater than the time the

computer spends executing my code.

This thinking is just the obvious extension of standard programming

notions. If you want to speed up the execution of a program what do you

do? You profile the program to find out where it is spending most of its

time and then you spend *your* time on improving the execution speed of

*those* parts. Well, if I "profile" the time I spend on a CAS problem I

generally find that the overwhelming majority of time is spent

programming, not executing.
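The profile-then-optimize workflow described above can be sketched with any profiler; here is a minimal, hypothetical example using Python's cProfile (names like `slow_part` are invented for illustration):

```python
# Sketch of "profile first, then spend *your* time on the hot spots".
import cProfile
import io
import pstats

def slow_part():
    # Deliberately dominates the runtime.
    return sum(i * i for i in range(100_000))

def fast_part():
    return 42

def job():
    slow_part()
    fast_part()

profiler = cProfile.Profile()
profiler.enable()
job()
profiler.disable()

buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats()
report = buf.getvalue()

# The report names the functions where the time went -- those are the
# only places worth hand-optimizing.
assert "slow_part" in report
```

The analogy in the text is exact: "profiling" your own effort on a CAS problem usually shows the programming, not the execution, as the hot spot.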

Of course, if you are writing a program or package which will be run many

many times then the balance may shift to where the speed of execution is

more important than the speed of development. And as Ron pointed out, this

is precisely why Mathematica itself has so many lines of C code.

And below, my $.02 worth on a related topic:

IMHO we should steer away from discussion that occasionally verges on "My

CAS is better than yours (Nyah, Nyah, Nyah!)". Certainly both Mathematica

and MAPLE are serious programming systems. Equally certainly, each one is

better at certain tasks than the other. Therefore it is *impossible* to

make a blanket statement that one is better than the other (unless you are

simply firing off flame-bait or releasing pent-up programming

frustration). Let us, instead, educate each other on the respective

benefits to be obtained from each system so that we may all choose the

system which is best for our own needs.

To that end, let me note that the pattern matching abilities of

Mathematica have been invaluable in my development of Operations Research

related notebooks.

--

--------------------------------------------------------------------------

Dan Scott m...@ienext.unl.edu NeXT mail welcome

--------------------------------------------------------------------------

Nov 19, 1992, 3:32:02 PM

In article <1eghvu...@crcnis1.unl.edu> m...@ienext.unl.edu writes:

>In article <18NOV199...@reg.triumf.ca> orw...@reg.triumf.ca (BALDEN,

>RON) writes:

>> What is important to me is the

>> total time spent formulating and solving a problem.

>>

>(More sensible remarks deleted)

>

>I agree with you completely.


>This is because the programs I write are generally one-shot affairs---I

>run them once and then they are history. In most other cases when the

>program *is* used repeatedly it executes quickly enough that I don't

>really care whether it can be made to run even faster. In any case, the

>time I spend programming in typically much greater than the time the

>computer spends executing my code.


I usually stay out of religious discussions, but the above reminded me

of a time several years ago when I was bragging to a friend about a

trick I had found that saved several microseconds in execution of a

Z80 assembly language program. My friend's response: "How many

microseconds did it take you to come up with this idea?"

Dave Withoff

Nov 20, 1992, 12:13:51 AM

Others have rightly pointed out that a concept of time efficiency that

does not include the programmer's time is inadequate in a modern

research environment. Unfortunately, almost everyone seems to have

ignored space efficiency in designing their symbolic algebra libraries. I

spent several frustrating hours yesterday trying to do a tricky

calculation with Maple on a busy machine, all because the size of the

computation kept growing beyond available resources. (We have 128MB of

memory, of which about 30MB was available during this session. I did

eventually manage to run my job by changing my approach to the problem,

but the relatively trivial computation I was engaged in should have fit

in 30MB. The library routines involved were obviously written by someone from

the "RAM is cheap" school of thought.) What I and, I'm sure, many

other scientists could really use is the ability to tell a symbolic

algebra program whether to trade time for space or vice-versa. Quite

often, making some sort of compromise is the only way to get the

calculation through the machine. I'm perfectly willing to compromise on

turn-around time if it's the difference between successfully completing

a calculation and staring at yet another "Object too large" message.

I do know about FORM. One of these days, I'll have to take a good

hard look at it. My preliminary impression (based on inspecting the

manual) is that the type of computations which I most commonly carry

out are not naturally expressed in FORM's language.

I would be interested to hear whether the major players (the Symbolic

Computation Group, Wolfram Research, the vendors of various Macsyma variants,

etc.) worry much about space efficiency when they are creating new

library code. I get the impression that the answer is no, at least for

Maple and Mathematica.


Marc R. Roussel

mrou...@alchemy.chem.utoronto.ca

Nov 20, 1992, 9:15:02 AM

Subject: Re: The Real Meaning of Efficiency? (Re: Serious Programming, etc.)

From: Richard Fateman, fat...@peoplesparc.Berkeley.EDU

Date: 19 Nov 1992 17:43:49 GMT

In article <1egjol...@agate.berkeley.edu> Richard Fateman,

fat...@peoplesparc.Berkeley.EDU writes:


>In my view one of the most powerful abstractions for programming

>mathematical algorithms is recursion.

>If recursion and pattern matching were so good, perhaps

>we wouldn't see so much claptrap in "fast Mathematica" programs

>with # and &, which (I suspect) are merely ways to AVOID pattern matching

>and the usual function call overhead.

>Stephen Wolfram's ambition (according to an interview in an article in

>The Chronicle of Higher Education some months ago) appears to be

>to have the Mathematica language replace all others as a first

>programming language (i.e. instead of Pascal, Modula II, Scheme,

>Ada, ... etc). I am not enthusiastic about this prospect.

>

===========================

i hear comments like this quite often from computer science professors

and it explains why the teaching of programming by cs departments to

engineers and scientists and mathematicians is almost a total failure.

recursion is said to be a powerful tool but it is almost never used. why?

one reason, of course, is that the old FORTRAN didn't support recursion.

another reason is that recursive thinking is in fact difficult to master

and many people are uncomfortable with it.

yet another reason is that recursion is not always very efficient.

pascal, modula II, scheme or ada are terrible choices as first languages

for engineers or scientists since these languages are in fact not used in

the technical fields. it at least makes a little sense to teach

programming in FORTRAN since this language is actually used in scientific

computing and it makes the most sense to teach programming in one of the

cas languages because the vast majority of professionals will be doing

their computing on desktops with cas's that support graphics, algebraic

manipulation, etc.

the cs departments have managed with their control of the teaching of

programming to scientists and engineers and mathematicians to turn those

people off from programming at the very time that computing has become an

indispensable tool in these fields. i know they say that its the eng.

depts. that make them teach FORTRAN (sounds like george bush blaming

the congress) but its totally clear that most cs departments don't have

the faintest idea what non-cs people want or need to know about

programming. this can be seen from the recent move at many places from

FORTRAN to C as the first language. if they thought FORTRAN was unpopular

amongst engineering students wait till they see the student reaction to C.

it has been said that you shouldn't teach programming with mathematica

because its a bizarre and inconsistent language and if you learn

mathematica programming it will ruin your ability to learn how to program

properly.

this is COMPLETE nonsense.

one can ignore all of the things one thinks are 'weirdly idiosyncratic'

to mathematica [to some of us they are neat features not bugs] and just

use parts of the language to explain the general principles of

programming and algorithm development [eg., recursion, functional,

procedural and rule-based programming].

This is being done at the university of illinois in the intro cs course

that is required of all engineering students and the students are

responding to the course much more positively than they did in the past.

the best thing that could happen to the teaching of programming to tech

people would be to use mathematica to do it.

finally, the comment about the use of the 'claptrap' of anonymous

functions to avoid pattern matching is total claptrap (which according to

the dictionary means pretentious nonsense). i write almost all of my

programs using anonymous functions because i use a functional programming

style which is very natural to me. others prefer to program using

pattern matching (which is far more sophisticated in mathematica than in

most other languages). the ability to choose the style most appropriate

to the problem and to the way in which the programmer thinks is one of

the most important features of mathematica.

hopefully, this thread will have ended by the time i return from my

thanksgiving holiday.

Nov 20, 1992, 12:05:03 PM

One of the potential advantages of using a language like Common Lisp

is that it is possible to use tools developed outside the CAS

community to good advantage. CL compilers have settings to "optimize"

space, time, debuggability, and/or some combinations. It is also

possible to use tools like "memoization" (like Maple's option

remember, or a trick available in Mathematica [and Macsyma]) for

remembering certain kinds of function/value pairs. This tends to

trade lots of space for saving time, at least when it works.

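The memoization trick mentioned above (the same idea as Maple's option remember) is simple to sketch; here is an illustrative Python version using the standard library's cache decorator, where the cache table is exactly the space being traded for time:

```python
# Sketch of memoization: remember function/value pairs so repeated
# calls are table lookups. This trades space (the cache) for time.
from functools import lru_cache

@lru_cache(maxsize=None)     # unbounded cache: all pairs are remembered
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

# With the cache this is linear in n; without it, the naive recursion
# would take exponentially many calls.
assert fib(60) == 1548008755920
```

As the text notes, this only helps "when it works" -- when the same arguments actually recur -- and it is no substitute for changing the algorithm when someone with a higher perspective can see a better one.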

But these are often ineffective when the algorithm must really be

changed by someone with a higher perspective on what is getting done.

One area in which space certainly was a consideration (and which

helped speed things up) was in the Poisson Series package in

Macsyma. Subexpressions like 3*u+4*v+5*w-3*x+5*y+7*z are stored

in one machine word. There are undoubtedly trade-offs in time v. space

in systems that map from abstractions to representations in some

controlled fashion (Axiom should do this).

I think that Maple, at least in its early days, as well as Macsyma

in its early days (on a 1.2 megabyte time-shared PDP-6) were concerned

with "space" generally, though they may have lost sight of that.

So, people have thought about space in these systems, at least in

some respects.

Nov 20, 1992, 12:48:47 PM

In article <By0q9...@news.cso.uiuc.edu> Richard J. Gaylord <gay...@ux1.cso.uiuc.edu> writes:

>recursion is said to be a powerful tool but it is almost never used. why?

>one reason, of course, is that the old FORTRAN didn't support recursion.

Another reason is that most people are unable to think about it

without some training. That's why it is taught in CS classes.

Mathematical Induction is said to be a powerful tool but it is almost

never used. That's why it is taught in mathematics (and CS) classes.

digression: The split program was an example of induction that translated

into recursion. You couldn't write it that way in Mathematica without

losing big on efficiency. It would look something like this:

Split[{}, n_] := {}  (* base case *)

Split[lis_, n_] :=
  MakeANewListOf[FirstNElementsOf[lis, First[n]],
                 Split[ElementsAfterFirstN[lis, First[n]], Rest[n]]]

undigression.
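The induction-into-recursion Split above can be made concrete in a runnable sketch (Python here, purely for illustration; the helper names in the pseudocode are spelled out as slicing, and the base case is phrased on the list of piece lengths):

```python
# Runnable sketch of the recursive Split: break a list into consecutive
# pieces whose lengths are given by ns. The inductive step peels off the
# first ns[0] elements and recurses on the rest.
def split(lst, ns):
    if not ns:                       # base case: no more piece lengths
        return []
    head, rest = lst[:ns[0]], lst[ns[0]:]
    return [head] + split(rest, ns[1:])

assert split([1, 2, 3, 4, 5, 6], [2, 1, 3]) == [[1, 2], [3], [4, 5, 6]]
```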

At Berkeley, we haven't taught Fortran in CS for many (20) years. There

is a Fortran intro course in engineering, taught by engineering faculty.

But the main upper division scientific computing for engineers course

is taught using Matlab, although there is some (I think) fortran. Maybe

sometime they will use a CAS too. Working engineers may have to make

many sacrifices to get the job done. We try to show them better

approaches that might help them during the 40+ years of their career.

We are not a vocational school. They can learn fortran in 1 week, and

they can become expert at it in 3-6 months. This is not important CS stuff.

The fact of the matter is that many non-CS scientists think that CS courses

"teach programming languages". This is (or should be) false. We should

be teaching ideas about algorithms, data structures, organization of

programs (correctness, etc). The nature of the artifacts (e.g. current

instruction sets, dialects of programming languages) are incidental, but

we teach some of that too.

I have heard it said at a major US gov't laboratory that the difference

between physicists and computer scientists is that the computer

scientists know no physics. Although this is initially amusing,

it is actually a truthful indication of the sad state of computing in

physics. Do you really want your students to get jobs where they

can worry about someone else's COMMON statements,

and submit their revisions of programs as "card deck images" --- and

not know that there is something better???

>Its totally clear that most cs departments don't have

>the faintest idea what non-cs people want or need to know about

>programming.

I agree on this.

>This can be seen from the recent move at many places from

>FORTRAN to C as the first language.

False, at least for my department. We don't teach C first, and we certainly

don't push C as a language for scientific computing. We introduce Scheme

first as a way of conveying important ideas in COMPUTING.

>it has been said that you shouldn't teach programming with mathematica

>because its a bizarre and inconsistent language...

I'll agree with whoever said that.

... and if you learn

>mathematica programming it will ruin your ability to learn how to program

>properly.

I doubt that. But I've heard this said about BASIC, and FORTRAN too. The

human mind is sufficiently flexible that bad habits can be unlearned.

>the best thing that could happen to the teaching of programming to tech

>people would be to use mathematica to do it.

I disagree strongly. You may think this is a religious argument, but

here are my justifications, in brief:

Try to explain the concept of a linked list in Mathematica. Try to

explain why O(n log n) sorting algorithms take O(n^3). Try to explain that

a/b and 1/2 are different data structures.

Try to explain why x:=(x+x)/2 increases the accuracy of x, but

that x:= 2*x-x decreases it (but only in versions < 2.0).

If you want your students to believe that computers are magic,

although sometimes unreliable, then Mathematica is your language!

>

>finally, the comment about the use ofthe 'claptrap' of anonymous

>functions to avoid pattern matching is total claptrap (which according to

>the dictionary means pretentious nonsense). i write almost all of my

>programs using anonymous functions because i use a functional programming

>style which is very natural to me. others prefer to program using

>pattern matching (which is far more sophisticated in mathematica than in

>most other languages). the ability to choose the style most appropriate

>to the problem and to the way in which the programmer thinks is one of

>the most important features of mathematica.

Thanks for looking up claptrap.

> hopefully, this thread will have ended by the time i return from my

>thanksgiving hoiday.

I doubt it. Enjoy your turkey.

Nov 20, 1992, 1:36:48 PM

In article <1ej8dv...@agate.berkeley.edu> fat...@peoplesparc.Berkeley.EDU

(Richard Fateman) writes:

>At Berkeley, we haven't taught Fortran in CS for many (20) years. There

>is a Fortran intro course in engineering, taught by engineering faculty.

>But the main upper division scientific computing for engineers course

>is taught using Matlab, although there is some (I think) fortran.


There are several constraints in designing a CS/programming curriculum

for scientists and engineers. One of them is that most of your students

are unlikely to take more than two half-courses because that is all that

their departments mandate. A worthy approach might be to start out with a

half-course in algorithms and basic theory, perhaps even without assigning any

programming exercises. (In other words, don't make them learn ANY

programming language. Implementation details too often detract from learning

in introductory CS courses.) Let the students learn about algorithms by

using their own brains. In a second half course, introduce a handful

of programming environments and design programming exercises that will

require interaction of these environments. For instance, you might

choose Fortran, Maple and Matlab. Maple does a symbolic computation,

generates Fortran code which is compiled and produces some output which

is passed to Matlab for visualization. (The whole process could

obviously and naturally be spread over several assignments.) That's the

way scientists really work and there is no sense pretending that any one

system can do everything.

>In article <By0q9...@news.cso.uiuc.edu> Richard J. Gaylord

><gay...@ux1.cso.uiuc.edu> writes:

>>Its totally clear that most cs departments don't have

>>the faintest idea what non-cs people want or need to know about

>>programming.

>>This can be seen from the recent move at many places from

>>FORTRAN to C as the first language.

>

>False, at least for my department. We don't teach C first, and we certainly

>don't push C as a language for scientific computing.

Perhaps, but some CS departments are doing this, even to their own

students. The truth is that neither Fortran nor C is a wonderful

teaching language. (Fortran is too restrictive and C is too

permissive.) Fortran in combination with other tools might not make a

bad combination however. The students can always learn C later.

>>the best thing that could happen to the teaching of programming to tech

>>people would be to use mathematica to do it.

Why Mathematica in particular? More importantly, why Mathematica

alone? Scientific computation almost never involves only one language

or tool anymore. I have no real objection to Mathematica being a part

of the curriculum, but it seems silly to not teach the students to be

flexible and use whatever tools seem most appropriate.

Marc R. Roussel

mrou...@alchemy.chem.utoronto.ca

Nov 20, 1992, 2:46:01 PM

This is not to argue with any of the other points made in this

discussion so far, just to add a factual observation, for whatever

it's worth:


At least in my observation, in the past few years working engineers in

industry have been taking to Mathematica as an everyday working tool

in very much the same way that business/financial types took to

VisiCalc and its Lotus/Excel successors when they first came out.

I'm a university professor type myself, but I deal with working level

engineers in classes and consulting relations; and these days, when

they have a numerical calculation to do or some simple (or not so

simple) analysis to work out, they just slap the equations into mma,

see what comes out, and pass along the printout, or paste it into

their notebook or monthly report.

They may do this intelligently, or efficiently, or with some

sophistication, or they may not; but it's what they do; the dramatic

increase in their routine reliance on this approach is very evident.

Nov 20, 1992, 3:11:53 PM11/20/92

to

Subject: Re: The Real Meaning of Efficiency? (Re: Serious Programming,

etc.)

From: Richard Fateman, fat...@peoplesparc.Berkeley.EDU

Date: 20 Nov 1992 17:48:47 GMT

In article <1ej8dv...@agate.berkeley.edu> Richard Fateman,

fat...@peoplesparc.Berkeley.EDU writes:

>We are not a vocational school.

=======================

you can't possibly believe that, richard.

berkeley, illinois, mit and all the rest of the so-called 'institutions

of higher education' are absolutely, definitely vocational schools.

=================

>>it has been said that you shouldn't teach programming with mathematica

>>because its a bizarre and inconsistent language...

>

>I'll agree with whoever said that.

======

actually, you said some time ago.

======

>If you want your students to believe that computers are magic,

>although sometimes unreliable, then Mathematica is your language!

===============

using a macintosh will also make you believe this.

aside - someone once said that any technology that is sufficiently

advanced is equivalent to magic.

have a nice thanksgiving, richard.

Nov 20, 1992, 3:15:47 PM11/20/92

to

i didn't mean to teach mathematica. i meant teaching programming using

mathematica because you can teach different programming styles with the

language.

Nov 20, 1992, 1:02:21 PM11/20/92

to

In article <1egjol...@agate.berkeley.edu>, fat...@peoplesparc.Berkeley.EDU (Richard Fateman) writes:

> In my view one of the most powerful abstractions for programming

> mathematical algorithms is recursion. If a user writes a recursive

> program in Mathematica, it appears that there is an enormous

> efficiency penalty, at least compared to using built-in Map-like

> commands. This is quite unfortunate.

Well, the C language implementations available pretty much have

the same overhead problem: recursion is a lot more expensive

than iteration.

So it is truly amusing that something like Mathematica has the same problem.

Especially considering how so many people think of it as

being so "high-level".

Maybe there is some deep philosophical reason for this?
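The overhead being discussed is easy to demonstrate in any language with both explicit recursion and a built-in mapping primitive. The sketch below uses Python rather than Mathematica (so it can be run anywhere); the function names are illustrative, and the recursive version deliberately mirrors the "operate on the first element, recurse on the rest" style a Mathematica user might write with patterns.

```python
from timeit import timeit

def square_rec(xs):
    """Square every element by explicit recursion on {first, rest...}."""
    if not xs:
        return []
    return [xs[0] * xs[0]] + square_rec(xs[1:])

def square_map(xs):
    """Square every element with the built-in map."""
    return list(map(lambda x: x * x, xs))

data = list(range(200))
assert square_rec(data) == square_map(data)  # same result either way

# The recursive form pays per-call overhead (plus list copying here);
# the built-in does one tight internal loop.
t_rec = timeit(lambda: square_rec(data), number=100)
t_map = timeit(lambda: square_map(data), number=100)
print(f"recursive: {t_rec:.4f}s  built-in map: {t_map:.4f}s")
```

The gap says nothing about which style is clearer to write, which is the original poster's point: the abstraction is good, the penalty for using it is the unfortunate part.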

> digression ...

> When IBM came out with the IBM-360 design in the mid 1960s, they

> missed the boat on recursion and subroutine calling generally.

> Some people think the IBM 360 set back the progress of programming

> languages by 10 years or more.

>

Do you write tight, modular code, with lots of subroutines and data-driven

or object-oriented techniques?

You are going to get killed on the latest RISC machines, compared

with how fast the straight-line-coded standard benchmark fare

will run.

-gjc

Nov 23, 1992, 6:39:01 AM11/23/92

to

We have been experimenting with various ways of teaching Computational

Physics to our undergraduates for a couple of years; our guinea pigs

have been III and IV Physics specialists (in the US == "majors").

Since we intend to offer "production" courses beginning Sept/93 I

have been following this thread with interest. Perhaps I can

contribute constructively. In most of what follows, the word 'physics'

is generic and others such as 'engineering' or 'physical chemistry'

can be substituted I think. Similarly, 'Mathematica' is generic

and other symbolic math programs can be substituted.

First, I see a strong parallel between the traditional mathematician/

physicist tension and the emerging comp_sci/physicist one. Which

means that the solutions may be similar: an amalgam of Dept of Comp

Sci courses, Physics Dept "CS for Physicists" ones, and building the

needed CS into the regular Physics courses where it is used.

Second, our students range from cyber-phobes to hackers. This does

not seem to correlate strongly with their overall ability in Physics,

although it will clearly influence the kinds of Physics they are likely

to do in the future. Thus, if we wish to cast our net broadly, and

maybe even help our cyber-phobes become more comfortable with this tool,

we need to be extremely careful about getting too far down into the

computer-esque details of things.

Third, physicists tend to be people who are particularly interested in

CS (and math) only when it is needed to solve a particular Physics

problem.

We offer a course in "Microcomputer Interfacing" at the IV Year

level. For the students who choose this course, saying "you better

know or be willing to learn Pascal or C" is sufficient. These

students are, however, a subset of our student body.

An example: we wrote a package to investigate the physical pendulum.

The package has been used for two years in our III Year Classical

Mechanics course (level == Goldstein for the Physicists reading this),

which is a required course for all students. For reasons of speed of

execution we wrote a 4th order Runge-Kutta package in C talking to

Mathematica via the Mathlink protocols instead of using a Mathematica

RK package. The students produced phase plots, etc. and found this part

of the package a great success. In the process they learned a fair

amount of Mathematica to process the lists produced by the RK code.

Last year we then asked the students to take the existing 4th order C

code and produce 2nd and 3rd order Runge-Kuttas to investigate the

differences in the results of various algorithms. Disaster! Actually

writing (or in this case removing) C code was too much for a significant

fraction of these students. And they hated it.

This year we coded 2nd, 3rd, 4th and 5th order Runge-Kuttas and a

symplectic integrator, and the students could choose the algorithm by

editing a define used by the C pre-processor. This worked. The hackers

could get into the guts of how the integrators were coded, the interested

could see what such code looks like, and the cyberphobes at least had

the code in front of them in the editor while they looked for the cpp

define to change.
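The "one switch selects the integrator" scheme described above was done with a C preprocessor #define; the same idea can be sketched in a few lines of Python, with the switch as an ordinary parameter. The coefficients below are the standard midpoint (RK2) and classical RK4 formulas; the test problem and step counts are my own choices, not from the course package.

```python
import math

def rk_step(f, t, y, h, order=4):
    """One explicit Runge-Kutta step of the requested order (2 or 4)."""
    k1 = f(t, y)
    if order == 2:                       # midpoint method
        k2 = f(t + h / 2, y + h / 2 * k1)
        return y + h * k2
    elif order == 4:                     # classical RK4
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    raise ValueError("order must be 2 or 4 in this sketch")

def integrate(f, y0, t0, t1, n, order):
    """March from t0 to t1 in n fixed steps with the chosen method."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y = rk_step(f, t, y, h, order)
        t += h
    return y

# A problem with a known solution: y' = -y, y(0) = 1, so y(1) = 1/e.
f = lambda t, y: -y
exact = math.exp(-1.0)
for order in (2, 4):
    err = abs(integrate(f, 1.0, 0.0, 1.0, 100, order) - exact)
    print(f"RK{order}: error = {err:.2e}")
```

Flipping `order` and watching the error drop is exactly the "compare the algorithms" exercise the students were meant to do, without asking them to edit C.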

I agree when in article <1992Nov20....@alchemy.chem.utoronto.ca>

mrou...@alchemy.chem.utoronto.ca (Marc Roussel) writes:

> There are several constraints in designing a CS/programming curriculum

>for scientists and engineers. One of them is that most of your students

>are unlikely to take more than two half-courses because that is all that

>their departments mandate.

We aren't even going to mandate this much.

However I don't agree when he suggests:

>A worthy approach might be to start out with a half-course in algorithms

>and basic theory ...

We are designing our courses to be labs, just like our other Physics

labs except they are computer-based instead of equipment-based. In fact

if we can't illustrate a computational method at an appropriate level

with a real Physics problem, that method is de-emphasised or maybe even

ignored by these courses.

Since our Computational Physics courses will be getting our students 'up

to speed' in Mathematica, I think that our first Runge-Kutta experiment

above could work in those courses if we used RK's coded in Mathematica. I

am much less hopeful about C or FORTRAN unless we based the whole course

on one of those languages; doing that would necessarily mean we would end

up teaching a lot more Computer Science than we need to by using a more

natural interpreted environment such as Mathematica.

Finally, we believe we are beginning to see a critical mass phenomenon

with our students. Some of them begin using Mathematica to do their

problem sets in their regular courses. Their classmates observe them

doing this and decide to try it too. Finally, students who are still

solving their differential equations by hand realize they are at a

disadvantage. [The other phenomenon is the student who spends 3 hours

solving a task with a computer when she/he could have done it by hand

in 37 seconds! That is learning for the student too.] When our

undergrads go on to graduate school they are now demanding that their

supervisor provide them with Mathematica; I guess those educational

discounts sometimes do work for the vendors.

--

David Harrison | "For us believing physicists the

Dept. of Physics, Univ. of Toronto | distinction between past present

Inet: harr...@faraday.physics.utoronto.ca | and future is illusion, however

| persistent." -- Einstein

Nov 23, 1992, 1:07:16 PM11/23/92

to

In article <By631...@helios.physics.utoronto.ca> harr...@faraday.physics.utoronto.ca (David Harrison) writes:

> .... (stuff omitted)

>Finally, we believe we are beginning to see a critical mass phenomenon

>with our students. Some of them begin using Mathematica to do their

>problem sets in their regular courses. Their classmates observe them

>doing this and decide to try it too. Finally, students who are still

>solving their differential equations by hand realize they are at a

>disadvantage.

I also absolutely believe in having students use the best tools at

hand, and today that's Mathematica or something like it. But consider

the following: last week I gave a problem in a midterm exam for a

beginning lasers class which involved solving for just one of the

steady-state level populations in a simple three-level rate equation

example (in other words set up and solve for one of the variables in

a set of three coupled linear _algebraic_ equations).

If you looked at the physical problem itself you could immediately

pick out the two levels which had the fewest transition terms

connecting them; write the rate equations for just those two levels,

plus the "conservation of atoms" equation; make a few quick algebraic

manipulations in an intelligent order; and solve for the desired level

population in just a few lines.
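The "few lines" recipe sketched above (two rate equations plus conservation of atoms, then substitution in an intelligent order) can be made concrete. The rate values below are invented for illustration, not the exam's numbers: pump rate P from level 1 into level 3, decays A31 and A32 out of level 3, decay A21 out of level 2.

```python
from fractions import Fraction as F

# Invented rates for a three-level system (all per unit time), total N.
P, A31, A32, A21, N = F(1), F(2), F(3), F(5), F(1)

# Steady state: dN3/dt = P*N1 - (A31 + A32)*N3 = 0
#               dN2/dt = A32*N3 - A21*N2 = 0
#               N1 + N2 + N3 = N        (conservation of atoms)
N3_per_N1 = P / (A31 + A32)            # from the first equation
N2_per_N1 = A32 * N3_per_N1 / A21      # from the second
N1 = N / (1 + N2_per_N1 + N3_per_N1)   # from conservation
N2, N3 = N2_per_N1 * N1, N3_per_N1 * N1

print(N1, N2, N3)           # exact rational populations
assert N1 + N2 + N3 == N    # sanity check: atoms are conserved
```

Done by hand this is three substitutions; done blindly, it is a 3x3 linear solve. The exam tested the former, the homework had exercised only the latter.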

One of the better students in the class blew the problem entirely;

and we talked about it afterwards. He clearly understood the material

perfectly; we'd done a number of even more complex rate equation

problems as homework assignments, and he'd done them all correctly;

but he'd done all of them with Mathematica, and as a result hadn't

practiced the little tricks of algebraic manipulation by hand that

made it possible to solve the exam problem quickly, with minimal

tedious algebraic manipulation.

This leads me to two questions:

1) If all students have and use Mathematica for this kind of

calculations -- as I agree they should -- how are we going to teach

(or _should_ we even teach?) the kinds of clever little tricks we've

learned from experience for the hand manipulation of algebraic

equations, or the simpler differential equations?

[Note that most of these "tricks" actually have little or no value

in doing a Mathematica solution, since the basic approach with

Mathematica is to plug in the equations straightforwardly and

accurately, and let Mathematica do the solving.]

2) How can we give midterms and final exams? Does every student

have to have a laptop? Does every exam have to be a "take-home", so

the student can use his or her computer?

Coping with these problems in the coming years will be interesting.

Nov 23, 1992, 5:30:49 PM11/23/92

to Anthony E. Siegman

When I started college, calculators were forbidden from tests because

it was unfair that only a few students had them. By the time I was

done, almost everybody had one and they were universally allowed. Now

students are not learning tricks to simplify arithmetic and the

calculation of transcendental functions (like logarithms) because it

is much easier to enter the raw numbers into their calculators.

The analogous thing is in the process of happening with symbolic

algebra systems and algebra. Is it a bad thing? Probably not. While

the transition is being made there will be a problem of fairness and

convenience, but it will pass.

-- Ethan

Nov 23, 1992, 7:06:59 PM11/23/92

to

In article <1992Nov23....@EE.Stanford.EDU> sie...@EE.Stanford.EDU (Anthony E. Siegman) writes:

>

> I also absolutely believe in having students use the best tools at

>hand, and today that's Mathematica or something like it. But consider

>the following: last week I gave a problem in a midterm exam for a

>beginning lasers class which involved solving for just one of the

> [story about how good student messed up a problem because the

> student understood the material well, the student could not

> perform algebraic manipulations by hand quickly enough to finish

> on time]

> This leads me to two questions:

>

> 1) If all students have and use Mathematica for this kind of

>calculations -- as I agree they should -- how are we going to teach

>(or _should_ we even teach?) the kinds of clever little tricks we've

>learned from experience for the hand manipulation of algebraic

>equations, or the simpler differential equations?

I don't know how much weight this carries, but I am currently an

undergraduate senior majoring in computer engineering. I am presently

trying to survive Electrical Engineering 314, which calls itself

"Linear Circuits II," but is really a signal analysis course with

a large math component.

I have been using Maple throughout the semester to assist me with my

homework by grunting through very nasty systems of equations and

verifying my own work in transforming and inverse transforming

circuit equations. I was warned early on by an electrical

engineer friend of mine that I needed to stay proficient at doing

these sorts of operations by hand. I heeded that advice, and

I still work many problems by hand just to keep my skills sharp.

However, it seems to me that the EE department could do SO much more

with the class if the time-consuming drudgery of computation could

be minimized. It is not unusual for me to spend 20 minutes wrestling

with a nasty integral -- yet, I demonstrated my understanding of the

concept at the point where I wrote down the integral in the first

place. I can easily spend four hours working on four homework

problems.

Regarding tricks, I assume that there used to be some handy tricks

for making slide rule computations faster and simpler, yet these

tricks aren't very useful to us any more due to the decline of

the slide rule. The same will happen for any other sort of

manual computation as faster, more capable devices come along.

(Imagine something along the lines of a tricorder on Star Trek!)

> 2) How can we give midterms and final exams? Does every student

>have to have a laptop? Does every exam have to be a "take-home", so

>the student can use his or her computer?

To answer the first question: Give midterms and final exams that test

concepts, what the concepts MEAN (e.g. what does the convolution

integral *really* mean?), and how to apply them, not computational

skills and mathematical tricks.

As for every student having a laptop, when laptops fall to around

$250 or thereabouts, why *not* require them? I suppose that

calculators that offered a tenth of what my HP-42S can do were scarce

at one time (I was a toddler at that time), but not any more. I know

scores of people who own and use miniature powerhouses like HP-48SXs,

HP-28Ss, TI-85s, graphing Casios, and lowly HP-42Ss. The students

have demonstrated their willingness to purchase these types of

calculators to make their computational lives simpler (I'm also certain

that many of these calculators are purchased to make a status

statement of sorts). Students should consider the cost to be

an investment in the tools of the trade, much in the same way that

art students and such purchase their supplies.

My laptop, a Mac PowerBook 100 (4 MB, 20 MB disk), is quite sufficient

to run Maple at speeds I consider acceptable (let's not discuss

this particular point) -- I've tried it. In fact I plan on buying

Maple soon so I can run it at home. With current technology

developments, I fully expect a "calculator" with similar capabilities

to my PowerBook to weigh under 1 pound (~.5 kg) within another 3 to 4

years and cost under $300.

Daryl

--

Daryl Biberdorf N5GJM d-bib...@tamu.edu or dlb...@tamsun.tamu.edu

Nov 23, 1992, 8:03:18 PM11/23/92

to

I was taught to use a slide-rule in chemistry. Slide-rules differ

from calculators and lap-tops in not telling you where the decimal

point is. If you have no idea whether the answer is 3*10^6 or 3*10^7 or

even 3*10^(-7), you can't make much use of the slide-rule.

The intellectually correct posture that allows one to promote

the use by elementary school students of four-function

calculators for doing arithmetic, is that what should be taught

is Approximation. That is, the student should realize that,

regardless of the result of his/her machine computation, the

weight of a golf ball is not a kilogram, and the height of

the Sears Tower is not 2 miles. Unfortunately, most elementary

school teachers are innumerate (illiterate wrt numbers).

I don't know what exactly the equivalent skill would be for a computational

physicist (not being one) but I suspect it includes

.. the answer must have the right dimensions

.. the answer must have the right behavior at special points (say t-> infinity)

.. the computational method, when applied to (simple) problems for which there

are known solutions, must produce those known solutions.
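The last habit on that list is mechanical enough to show in code: before trusting a numerical method on a hard problem, run it on an easy one whose answer is known exactly. The integrator and the test integrals below are my own illustrative choices.

```python
def trapezoid(f, a, b, n):
    """Composite trapezoid rule on [a, b] with n panels."""
    h = (b - a) / n
    total = (f(a) + f(b)) / 2
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# Known solution: the integral of x^2 on [0, 1] is exactly 1/3.
approx = trapezoid(lambda x: x * x, 0.0, 1.0, 1000)
err = abs(approx - 1.0 / 3.0)
print(f"error vs known answer: {err:.2e}")
assert err < 1e-5   # if this fails, distrust the method everywhere
```

The same habit generalizes: check dimensions symbolically, check limiting cases (t -> infinity), and only then believe the machine on the problem you actually care about.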

Unfortunately, what most of us learned was

.. (in calculus) Every problem has a solution in closed form in terms of

elementary functions, or you wouldn't be asked to do it.

.. the solution method can be found in the just previous chapter.

.. If the answer seems to require a lot of algebra you must have missed

some simplifying trick.

Too bad. The computer algebra system in a laptop will not know any

of these heuristics (maybe because they are false in general).

Perhaps the point is to make sure that the CAS are used for extending

the reach of computational science/education, and engineering

applications, and not merely for dulling our senses by making short

work of trivial problems.

Students trained by Sesame Street already think that all problems can

be solved in 2 or 3 minutes.

Unfortunately, most college math/science teachers are not really

grounded in computation of the symbolic sort. Once you get beyond graphing

(something done by some hand-held calculators), computation becomes

magic, albeit unreliable magic. Some CAS confirm this impression.

Nov 24, 1992, 5:14:11 AM11/24/92

to

The computer is no substitute for learning to work the problem by hand.

In the undergraduate physics labs at Caltech, in my year we

fit and plotted data by hand (with a hand calculator). In the years

following me a set of programs were written for all to use in data

manipulation.

I worked with people who let the computer do all the work as

lab partners in 3rd and fourth year lab courses.

They did not know the motivations for the data manipulations, nor

did they have a physical feel for error propagation.

Teach the hand way. In High School the proliferation of calculators

is preventing students from learning the basics. My kid is too

quick to go to the calculator and consequently is not drilling

on the fundamentals (example: Knowing that a trig function relates

parts of the right triangle is more important than finding the numerical

value of sin(45 deg)).

Mathematica is no substitute for knowing how to work a problem.

If you don't know how to work the problem yourself you will either

get stuck or not know what confidence to put in what Mathematica has

come up with.

-Ted

Nov 24, 1992, 8:26:59 AM11/24/92

to

In article <1992Nov20.0...@alchemy.chem.utoronto.ca> mrou...@alchemy.chem.utoronto.ca (Marc Roussel) writes:

> I would be interested to hear whether the major players (the Symbolic

>Computation Group, Wolfram Research, the vendors of various Macsyma variants,

>etc.) worry much about space efficiency when they are creating new

>library code. I get the impression that the answer is no, at least for

>Maple and Mathematica.

>

> Marc R. Roussel

================================================================

In considering various variants of algorithms we (Maple) try to

optimize S^2*T (space squared times time). So to accept an

increase of a factor of two in space we would have to have a

decrease by more than a factor of 4 in the time.
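Gonnet's S^2*T rule is just arithmetic, but it is worth seeing the threshold it implies. The sample factors below are invented; the rule itself is as he states it.

```python
def cost(space, time):
    """Gonnet's figure of merit: space squared times time."""
    return space * space * time

base = cost(space=1.0, time=1.0)

# A variant that doubles space must cut time by MORE than 4x to win:
assert cost(2.0, 0.26) > base   # only ~3.8x faster: rejected
assert cost(2.0, 0.20) < base   # 5x faster: accepted
print("a factor-2 space increase needs a >4x time decrease to pay off")
```

Note how strongly the squared term punishes memory hunger: it encodes the judgment that space, not time, is usually what kills a symbolic computation.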

We (I in particular) continue to be extremely serious about the

efficiency (time,space) of the entire system.

Gaston H. Gonnet.

Nov 24, 1992, 10:53:52 AM11/24/92

to

In article <1erv0m...@agate.berkeley.edu> fat...@peoplesparc.Berkeley.EDU

(Richard Fateman) writes:

> The intellectually correct posture that allows one to promote

> the use by elementary school students of four-function

> calculators for doing arithmetic, is that what should be taught

> is Approximation. That is, the student should realize that,

> regardless of the result of his/her machine computation, the

> weight of a golf ball is not a kilogram, and the height of

> the Sears Tower is not 2 miles. Unfortunately, most elementary

> school teachers are innumerate (illiterate wrt numbers).

>

> I don't know what exactly the equivalent skill would be for a computational

> physicist (not being one) but I suspect it is includes

> ... the answer must have the right dimensions

> ... the answer must have the right behavior at special points (say t-> infinity)

> ... the computational method, when applied to (simple) problems for which there are known solutions, must produce those known solutions.

>

I agree completely. In my freshman physics class at MIT in 1965, our first

assignment was to estimate the number of blades of grass on the athletic field.

Do they still do that, I wonder? I hope so---that kind of skill is even more

valuable now than it was 25 years ago.

Lately I've been using Mathematica to make up simple undergraduate problem sets

for economics majors. It's very convenient for this purpose since the graphs

come out nicely scaled, I can play with the numbers to get simple solutions,

etc. After doing this for a few weeks, it struck me that it was remarkably

inefficient to develop a problem on Mathematica and then tell the student to do

it by hand. Why not give the student access to the same tools that I have

access to? That's what they will use in the future, might as well start now.

I'm not using anything special about Mathematica for these undergrad

courses---any symbolic/math/numeric system could do the calculations I'm

talking about.

--

Hal.V...@umich.edu Hal Varian

voice: 313-764-2364 Dept of Economics

fax: 313-764-2364 Univ of Michigan

Ann Arbor, MI 48109-1220

Nov 24, 1992, 12:35:35 PM11/24/92

to

Hal.V...@umich.edu writes:

>Lately I've been using Mathematica to make up simple undergraduate problem sets

>for economics majors. It's very convenient for this purpose since the graphs

>come out nicely scaled, I can play with the numbers to get simple solutions,

>etc. After doing this for a few weeks, it struck me that it was remarkably

>inefficient to develop a problem on Mathematica and then tell the student to do

>it by hand. Why not give the student access to the same tools that I have

>access to? That's what they will use in the future, might as well start now.

It's a real interesting question. When we teach home economics, we don't

teach it using an open hearth, we use ovens and microwaves. When we teach

industrial arts, we don't teach it using entirely hand tools, we use power

tools, the kind that the professionals use in the "real world".

So when it comes to Math or Science, why aren't more students exposed to the

tools that the professionals would use? Granted, the technology is changing

faster (i.e., ovens have been around a lot longer than Mathematica) but

by not exposing students to the technology now, won't they be further

behind when technology advances even further and the students enter the

"real world"?

Just some food for thought from an undergrad just dying to go home and get

some real food for himself. Happy Thanksgiving.

--

Justin Gallivan Calculus&Mathematica Development Team

gall...@after.math.uiuc.edu University of Illinois at Urbana-Champaign

-------------------------------------------------------------------------

"This is not a revolution, I did not do the revolution, thank you." -R.E.M.

Nov 24, 1992, 1:19:28 PM11/24/92

to

The argument for "power tools" is plausible if you can continue to

achieve the desired objectives. I've already argued that computer labs

for calculus can have a use if they merely boost enrollment in your

sections (leading to your tenure in the Math department of a great

metropolitan university). But I'm not in a math department, and

I already have tenure.

Another analysis:

Either we are teaching stupid things in calculus (certainly a possibility)

or not.

If we are teaching stupid things, then we should do something else.

E.g. we could teach Fortran, the secret language of the real world. :)

Another is that we could teach the blind use of Mathematica for

graphing and doing integrals. :(

Or if we truly wanted to be "real world" we could teach stuff like

"Why the state lottery is a bad bet" or "Which is better, a 7.5%

mortgage with 2 points or an 8.0% with 0 points." How to lie with

statistics. etc.
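The mortgage question above is itself a nice little calculation. The loan amount and term below are my assumptions (a point is 1% of principal paid up front), and I ignore the time value of money for simplicity; the standard amortization formula does the rest.

```python
def monthly_payment(principal, annual_rate, years):
    """Standard fixed-rate amortization: M = P*r / (1 - (1+r)^-n)."""
    r, n = annual_rate / 12, years * 12
    return principal * r / (1 - (1 + r) ** -n)

P = 100_000                                # assumed loan, 30-year term
pay_low = monthly_payment(P, 0.075, 30)    # 7.5% with 2 points
pay_high = monthly_payment(P, 0.080, 30)   # 8.0% with 0 points
points = 0.02 * P                          # $2000 paid up front

saving = pay_high - pay_low                # monthly saving at the lower rate
breakeven = points / saving                # months to recoup the points
print(f"payments: {pay_low:.2f} vs {pay_high:.2f}; "
      f"points repaid after about {breakeven:.0f} months")
```

On these assumptions the points pay for themselves in roughly five years, so the 7.5% loan wins for anyone keeping the mortgage longer than that -- exactly the kind of "real world" exercise the post is asking for.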

We could even teach touch-typing.

If we are already teaching things we want students to learn, like solving

calculus word problems or how to correctly solve algebraic equations

WHEN THEY ARE AWAY FROM A COMPUTER,

then it only makes sense to use Mathematica if it helps students to

solve problems in such situations. (by understanding better).

If we want to teach them how to do things WITH A COMPUTER, that is

obviously a different set of skills.

As an analogy,

1. You don't teach someone to walk by driving him around city streets

in a car to look at the crosswalks.

2. You don't teach someone to walk especially well by asking him to

cross freeways on foot at night.

Nov 25, 1992, 12:37:49 PM11/25/92

to

fat...@peoplesparc.Berkeley.EDU (Richard Fateman) writes:

>The argument for "power tools" is plausible if you can continue to

>achieve the desired objectives.

O.K., let's say that the objective is to teach calculus, I should have

been more clear.

>I've already argued that computer labs

>for calculus can have a use if they merely boost enrollment in your

>sections (leading to your tenure in the Math department of a great

>metropolitan university).

This has no bearing on the teaching of calculus.

>But I'm not in a math department, and

Neither am I.

>I already have tenure.

I suppose I should finish my undergraduate chemistry degree before any

discussions of tenure come up. :)

>Another analysis:

>Either we are teaching stupid things in calculus (certainly a possibility)

>or not.

Stupid may not be the best word, although a case could be made for it

I'm sure. If anything can be said it is probably that the material is

presented wrong, this is not an argument for or against books or computers

or anything, but if a student's time in a calculus course is spent learning

how to do <blank> (insert procedure here) and not what <blank> really

means or when to use <blank>, it would seem to me that the student's time

is not being well spent, and neither is the teacher's. If a student leaves

the class with an arsenal of memorized procedures, but no insight as to

when or why to use them, that student is lost. Furthermore, 6 months or

so down the road, that student may forget some of the procedures, leaving

him or her further behind. Now, there is a student with little or no

insight into problem solving, and somewhat questionable computational

skills. (Sounds almost like a few CAS systems to me, but that is another

day on Oprah :) ) But, if even a small fraction of the student's time in the

course could be spent learning the why and when of <blank>, at least the

student would walk away from the course with some insight into how calculus

is used and for what reasons. If the computing time (meaning the repetition

of procedures already known by the student, say algebra...) can be reduced

using a CAS (which one is of no consequence really), not only would the

student have more time to gain insight, but the student would also have

the ability to use that insight on a computer or on paper. This is to

say nothing of the advantages of knowing how to use a CAS for other

classes, or the ability to experiment with mathematics with "instant" results.

I have seen countless students play "what if?" in front of a computer, but

I have yet to see a student see just how much better a 50 term series

converges than a 5 term series using pen and paper.
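That "what if?" experiment takes seconds at a keyboard. As a stand-in example (my choice, not the poster's), here are the 5-term and 50-term partial sums of the alternating harmonic series, which converges to ln 2:

```python
import math

def partial_sum(n):
    """First n terms of 1 - 1/2 + 1/3 - 1/4 + ..."""
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

target = math.log(2)                 # the series' limit, ln 2
err5 = abs(partial_sum(5) - target)
err50 = abs(partial_sum(50) - target)
print(f"5 terms: off by {err5:.4f}; 50 terms: off by {err50:.4f}")
assert err50 < err5   # more terms, visibly better -- instantly, on screen
```

Watching the error shrink on screen, or plotting it against n, teaches the meaning of "converges" in a way five hand-computed terms never will.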

>If we are teaching stupid things, then we should do something else.

>E.g. we could teach Fortran, the secret language of the real world. :)

Since the objective is to teach calculus, I say we improve its content/

presentation first. Now calculus with fortran, there's an idea :)

>Another is that we could teach the blind use of Mathematica for

>graphing and doing integrals. :(

I don't think anyone would recommend that :) But a student with a CAS

may just decide to plot a complicated function before integrating it

because it is very easy to do. At least they have a better idea if the

answer makes sense when they do integrate it.

>Or if we truly wanted to be "real world" we could teach stuff like

>"Why the state lottery is a bad bet" or "Which is better, a 7.5%

>mortgage with 2 points or an 8.0% with 0 points." How to lie with

>statistics. etc.

If a calculus problem can be made out of it, why not? It has a little

more bearing on a student's life than proving the mean value theorem

or something like that.

>We could even teach touch-typing.

This doesn't really teach calculus, and its value as a tool in calculus

is probably quite low.

>If we are already teaching things we want students to learn, like solving

>calculus word problems or how to correctly solve algebraic equations

>WHEN THEY ARE AWAY FROM A COMPUTER,

>then it only makes sense to use Mathematica if it helps students to

>solve problems in such situations. (by understanding better).

If you mean better understanding through experimentation, I am all for

it. Students will probably investigate more if they have a powerful

tool to help them. Continuing with that theme: a power saw makes

carpentry easier, but it doesn't tell you how to build a house. Still, we

continue to use them because they take the drudgery out of building

houses, they give us more time to learn about building houses, and they

allow us to build bigger and better houses.

Just as a practical carpenter who wants to build big houses will turn

to the power tools, so will the scientist, and so will the potential

scientist. Just show the students the concepts, and how they are

used, and let them discover from there.

As an analogy,

"You don't teach someone sex by showing them pornography." :)

>If we want to teach them how to do things WITH A COMPUTER, that is

>obviously a different set of skills.

>As an analogy,

>1. You don't teach someone to walk by driving him around city streets

>in a car to look at the crosswalks.

>2. You don't teach someone to walk especially well by asking him to

>cross freeways on foot at night.

>--

>Richard J. Fateman

>fat...@cs.berkeley.edu 510 642-1879

Nov 25, 1992, 4:18:02 PM11/25/92

to

In article <ByA8z...@news.cso.uiuc.edu> gallivan@after (Justin Gallivan) writes:

>

>>We could even teach touch-typing.

>This doesn't really teach calculus, and its value as a tool in calculus

>is probably quite low.

>


But since your project seems to take as a given that students should

use Mathematica to help understand calculus, then I must conclude that

touch typing becomes helpful to calculus.

See how your effectiveness is altered by typing commands with, say,

the 4th finger of your left hand, only. Or even your WHOLE left hand.

Once you realize the near-impossibility of that (try

Plot[Sin[x],{x,0,2 Pi}]), then try it with the 4th fingers of both

hands.

Steve Wolfram is a very fast typist. Anyone using Mathematica

in a "live" demonstration has to be quite fast and accurate.

Anyone who types slowly and inaccurately will get very frustrated,

and might rightfully wonder what this has to do with understanding

math.

Now back to work.. typitytypitytypity

Cheers.

Nov 25, 1992, 8:40:42 AM11/25/92

to

In article <1992Nov23....@EE.Stanford.EDU>, sie...@EE.Stanford.EDU (Anthony E. Siegman) writes:

> One of the better students in the class blew the problem entirely;

> and we talked about it afterwards.


> [Note that most of these "tricks" actually have little or no value

> in doing a Mathematica solution, since the basic approach with

> Mathematica is to plug in the equations straightforwardly and

> accurately, and let Mathematica do the solving.]

The solution is to give the "better students" even MORE DIFFICULT problems

to solve, which would require both "the tricks" and a computer algebra system.

Or perhaps see the better students get totally turned off by

the field of physics.

Of course "The tricks" do have value, in general, because they

simplify otherwise huge problems so that they can be actually solved on

a computer.

Now, then again, it all depends on the audience you are looking toward.

Why are people taking physics? Are you trying to reach that student who

is a potential Richard Feynman?

-gjc

Nov 25, 1992, 7:41:44 PM11/25/92

to

In article <1992Nov20.0...@alchemy.chem.utoronto.ca> mrou...@alchemy.chem.utoronto.ca (Marc Roussel) writes:

> Others have rightly pointed out that a concept of time efficiency that

>does not include the programmer's time is inadequate in a modern

>research environment. Unfortunately, almost everyone seems to have

[ ...stuff deleted... ]

> I would be interested to hear whether the major players (the Symbolic

>Computation Group, Wolfram Research, the vendors of various Macsyma variants,

>etc.) worry much about space efficiency when they are creating new

>library code. I get the impression that the answer is no, at least for

>Maple and Mathematica.

Perhaps this posting by Gaston Gonnet to this newsgroup yesterday has not

reached your site yet:

From: gon...@inf.ethz.ch (Gaston Gonnet)

Subject: Re: Space efficiency (Was: The Real Meaning of Efficiency?)

Message-ID: <1992Nov24.1...@neptune.inf.ethz.ch>

Date: Tue, 24 Nov 1992 13:26:59 GMT

Lines: 20

In article <1992Nov20.0...@alchemy.chem.utoronto.ca> mrou...@alchemy.chem.utoronto.ca (Marc Roussel) writes:

> I would be interested to hear whether the major players (the Symbolic

>Computation Group, Wolfram Research, the vendors of various Macsyma variants,

>etc.) worry much about space efficiency when they are creating new

>library code. I get the impression that the answer is no, at least for

>Maple and Mathematica.

>

> Marc R. Roussel

================================================================

Nov 25, 1992, 8:46:00 PM11/25/92

to

Richard J. Fateman:

>But since your project seems to take as a given that students should

>use Mathematica to help understand calculus, then I must conclude that

>touch typing becomes helpful to calculus.

>

>... Anyone who types slowly and inaccurately will get very frustrated,

>and might rightfully wonder what this has to do with understanding

>math.

>


Not only is touch typing indeed helpful to understanding calculus in the sense

identified by Richard Fateman, touch typing -- coupled with a computer text

editor -- is helpful to clear thinking generally. How, you ask?

Well, the very attempt to present your own understanding of a subject in

clear written English is a powerful aid to organizing your thinking and

study. (On this point see William Zinsser's book,

"Writing to Learn".) The essential prerequisite for producing

clear written English is a willingness to *rewrite and revise*.

And touch typing, coupled with a computer text editor, can enormously

reduce the *physical labor* of rewriting and revising -- and hence

enormously enhance your willingness to do so. As a university

student in the mid-seventies, I typed up (I learned

touch typing in junior high school) my essays and lab reports

on erasable bond paper on a mechanical typewriter. On re-reading, I

was perhaps willing to erase a particularly infelicitous sentence --

or a paragraph at most -- and rewrite, but if a paragraph on page 3

was now seen to naturally belong to page 7, too bad. I wasn't going

to type the whole thing over. But now, with my trusty EDT/LaTeX

combo, no problem.

Touch typing is an essential skill of the computer age and should be

taught as such, preferably by grade 8. But it is never too late

for typing to be offered as a non-credit course. I understand

there are personal computer programs which teach touch typing

quite effectively.

But we needn't stop there. Computer science departments should

consider teaching juggling as an example of algorithmic thinking

in action. I'm not being facetious. I learned how to juggle several

years ago after reading the algorithmic description presented in

Seymour Papert's "Mindstorms" (and practicing for a while, to be

sure), and it has been a source of continued enjoyment.

The importance of the kinesthetic sense in mathematical thought

(and thinking generally) has been grossly undervalued or totally

ignored in conventional education. You can get a real understanding

of the non-commutativity of finite rotations in a swimming pool.
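(That non-commutativity is also easy to see numerically. A minimal sketch, in Python and purely illustrative on my part: composing 90-degree rotations about the x- and z-axes in the two possible orders gives different matrices.)

```python
def matmul(a, b):
    """Multiply two 3x3 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

Rx = [[1, 0, 0], [0, 0, -1], [0, 1, 0]]   # 90 degrees about x
Rz = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]   # 90 degrees about z
print(matmul(Rx, Rz) == matmul(Rz, Rx))   # False: order matters
```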

Ron Balden

Nov 29, 1992, 11:12:57 PM11/29/92

to

fat...@peoplesparc.Berkeley.EDU (Richard Fateman) writes:

>In article <ByA8z...@news.cso.uiuc.edu> gallivan@after (Justin Gallivan) writes:

>>

>>>We could even teach touch-typing.

>>This doesn't really teach calculus, and its value as a tool in calculus

>>is probably quite low.

>But since your project seems to take as a given that students should

>use Mathematica to help understand calculus, then I must conclude that

>touch typing becomes helpful to calculus.

I indicated in the previous article that no preference to a particular

CAS should be given. Yes, we use Mathematica. Why? Because it serves

our purposes here, it has a workable notebook interface and has for

some time now. If Maple, or Macsyma, or any other CAS had an interface

that would suit our needs, we would use that. Again, we teach calculus,

we don't make a living out of bashing other systems. Heck, if you

find it in your heart to write an effective mouse-driven or voice-driven

front end, we would be forever indebted to you :).

As an aside, I can't say I remember a student ever claiming that

our course was just too hard because of all the typing.

>See how your effectiveness is altered by typing commands with, say,

>the 4th finger of your left hand, only. Or even your WHOLE left hand.

>Once you realize the near-impossibility of that, (try

>Plot[Sin[x],{x,0,2 Pi}] ) then try it with the 4th fingers of both

>hands.

Even better, curl your pencil up in that same finger and try to write

using only the 4th finger of your left hand. :)

<drivel about Wolfram deleted>

>Now back to work.. typitytypitytypity

>

>Cheers.

>--

>Richard J. Fateman

>fat...@cs.berkeley.edu 510 642-1879
