
Category Theory of Algorithms


the.theorist

Apr 8, 2007, 7:20:47 PM

I'm not myself very well versed in the study of algorithms (my
training is bachelor-level physics and applied math), but I found
myself pondering the computational road ahead. Especially the
transition into parallel computing.

It seems to me that most of the literature about the study of
algorithms details aspects such as bounds on time, number of
operations, amount of memory required, etc... All focus has been
primarily on efficiency. As we transition to a parallel architecture I
foresee a need for a different sort of algorithmic study, one that
focuses more on the mathematical properties of algorithms. Properties
such as closure, concatenation, OR-ing, AND-ing, etc. (I don't fully
know what those terms might mean when applied to algorithms, which is
why I think there's research to be done here.) What are the natural
operations that can be performed on algorithms?

Category Theory has been really nice for the study of type systems,
and for providing a lens with which to view mathematics, and the
relationship between mathematical objects. Has it been applied to the
study of the mathematical properties of algorithms (as a mathematical
object)? I wasn't able to find a strong link between the two
disciplines using Google (maybe I'm barking up the wrong tree?)

Can anyone recommend some textbooks in this area? And what are the
current thoughts on what is needed for the transition to the parallel
paradigm?

Jym

Apr 9, 2007, 2:15:10 PM

On Mon, 09 Apr 2007 01:20:47 +0200, the.theorist <the.th...@gmail.com>
wrote:

> Category Theory has been really nice for the study of type systems,
> and for providing a lens with which to view mathematics, and the
> relationship between mathematical objects. Has it been applied to the
> study of the mathematical properties of algorithms (as a mathematical
> object)? I wasn't able to find a strong link between the two
> disciplines using Google (maybe I'm barking up the wrong tree?)
>
> Can anyone recommend some textbooks in this area? And what are the
> current thoughts on what is needed for the transition to the parallel
> paradigm?

I think there is a "Category theory for computer science" (or similar
title) by Barr & Wells, but I'm not quite sure this deals with the ideas
you're explaining.

--
Hypocoristically,
Jym.

Pummelo

Apr 9, 2007, 3:11:30 PM

"Jym" <Jean-Yves....@ens-lyon.org> wrote:

> > Category Theory has been really nice for the study of type systems,
> > and for providing a lens with which to view mathematics, and the
> > relationship between mathematical objects. Has it been applied to the
> > study of the mathematical properties of algorithms (as a mathematical
> > object)?

No.

> > Can anyone recommend some textbooks in this area?

In which area?

> I think there is a "Category theory for computer science" (or similar
> title) by Barr & Wells,

This book is incredibly boring :-) And rather hard for a newbie.

BTW, William Lawvere wrote an excellent introductory book - "Conceptual
Mathematics: A First Introduction to Categories".


mp

the.theorist

Apr 10, 2007, 12:32:31 AM

> > > relationship between mathematical objects. Has it been applied to the
> > > study of the mathematical properties of algorithms (as a mathematical
> > > object)?
>
> No.
>
Well, that was simple and concise. It raises further questions,
though. I'll get to that at the end.

> > > Can anyone recommend some textbooks in this area?
>
> In which area?

The area of studying algorithms as mathematical objects, with closure
under some set of operations.

See, what I'm after is an algebra of algorithms. I'd like to know if
it's possible to take some small set of simple algorithms, and by
repeated application of some operators build up to more complex
algorithms. An Algebra or Calculus of Algorithms, if you will.

Now, since nobody has applied Category Theory to the study of
algorithms in the manner I was talking about, perhaps there have been
difficulties, and it's been too hard. If that's the case, then: What,
specifically, are those difficulties?

the.theorist

Apr 10, 2007, 12:35:34 AM

> Now, since nobody has applied Category Theory to the study of
> algorithms in the manner I was talking about, perhaps there have been
> difficulties, and it's been too hard. If that's the case, then: What,
> specifically, are those difficulties?

Actually, that's probably jumping the gun. I should first ask:
Has anybody even attempted this before?

un.st...@gmail.com

Apr 10, 2007, 2:13:03 AM

On Apr 10, 7:32 am, "the.theorist" <the.theor...@gmail.com> wrote:
> See, what I'm after is an algebra of algorithms. I'd like to know if
> it's possible to take some small set of simple algorithms, and by
> repeated application of some operators build up to more complex
> algorithms. An Algebra or Calculus of Algorithms, if you will.

How would one define "complexity" of an algorithm?


sjr...@gmail.com

Apr 10, 2007, 8:45:51 AM

For category theory applied to algorithmics, look at the very nice
book "Algebra of Programming" by Richard Bird and Oege de Moor.

sjr

Patricia Shanahan

Apr 10, 2007, 10:49:29 AM

the.theorist wrote:
...

> See, what I'm after is an algebra of algorithms. I'd like to know if
> it's possible to take some small set of simple algorithms, and by
> repeated application of some operators build up to more complex
> algorithms. An Algebra or Calculus of Algorithms, if you will.
...

A computer's machine code represents a small set of simple algorithms.
Computer programming is the process of building more complex behavior by
applying operators such as concatenation, repetition, and alternation to
simpler algorithms.

In modern programming languages, with extensive application programming
interfaces, the raw material may itself be substantial algorithms.
However, if you peel away enough layers you get back down to the machine
code.

These issues have been studied for quite a long time. For example, see:

Böhm, Corrado, and Giuseppe Jacopini. Flow diagrams, Turing machines and
languages with only two formation rules. Comm. ACM 9 (May 1966), 366-371.

"Formation rules" in the paper title are effectively rules for combining
simpler steps.

[I'm aware of that particular paper because it is referenced in
Dijkstra's "Go To Statement Considered Harmful" letter.]

If we could not build up more complex algorithms from simple ones, there
would be little point in studying standard algorithms. Very few people
are directly interested e.g. in finding the shortest path between two
vertices in a graph. The algorithm for doing that is important because
we can use it as a unit in programs that solve real-world problems.
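
To make the "formation rules" point concrete, here is a toy Haskell
sketch (only an illustration with made-up combinator names, not the
paper's formalism): model an algorithm as a state transformer, and
build bigger algorithms from smaller ones using only concatenation,
alternation, and repetition.

-- A toy model: an "algorithm" is a state transformer.
type Algo s = s -> s

-- The three formation rules of structured programming.
seqA :: Algo s -> Algo s -> Algo s                  -- concatenation
seqA f g = g . f

ifA :: (s -> Bool) -> Algo s -> Algo s -> Algo s    -- alternation
ifA p f g s = if p s then f s else g s

whileA :: (s -> Bool) -> Algo s -> Algo s           -- repetition
whileA p f s = if p s then whileA p f (f s) else s

-- Example: Euclid's gcd, built only from these combinators.
gcdAlgo :: Algo (Int, Int)
gcdAlgo = whileA (\(_, b) -> b /= 0)
                 (\(a, b) -> (b, a `mod` b))

-- A slightly bigger algorithm by concatenation: normalize, then gcd.
gcdAbs :: Algo (Int, Int)
gcdAbs = seqA (\(a, b) -> (abs a, abs b)) gcdAlgo

Here gcdAbs (-12, 18) evaluates to (6, 0); the whole thing is assembled
purely by applying the three rules to smaller pieces.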

Patricia

Mitch

Apr 10, 2007, 11:03:17 AM

On Apr 10, 12:32 am, "the.theorist" <the.theor...@gmail.com> wrote:
>
> > In which area?
>
> The area of studying algorithms as mathematical objects, with closure
> under some set of operations.

We're not sure how to answer, probably because of the cultural baggage
that goes with the terms you are using.

'Algorithms' (or anything with algorithm in it) usually refers to the
study of design and analysis of specific problems: e.g. shortest
paths, string matching, matrix multiplication, satisfiability of
logical formulas. Take a particular problem, solve it.

With that in mind, there really isn't much idea of studying operations
on those algorithms; one might study a set of operations that are used
-in- an algorithm, but not operations between algorithms (er... except
maybe reduction).

The closest I can imagine to what you are looking for is really in -
programming language theory-, where you -do- compose separate pieces
of (arbitrary) code with different operations, and there is a well
studied (and continuing study) application of category theory to
programming languages. The text that is most familiar to me is

Carl Gunter, Semantics of Programming Languages: Structures and
Techniques (Foundations of Computing)

The key phrase to look for in other texts/references is "programming
language semantics".

> See, what I'm after is an algebra of algorithms. I'd like to know if
> it's possible to take some small set of simple algorithms, and by
> repeated application of some operators build up to more complex
> algorithms. An Algebra or Calculus of Algorithms, if you will.
>
> Now, since nobody has applied Category Theory to the study of
> algorithms in the manner I was talking about, perhaps there have been
> difficulties, and it's been too hard. If that's the case, then: What,
> specifically, are those difficulties?

1) if you really meant what other people mean by programming language
semantics then many people have worked on it.

2) if you really do mean -algorithms-... my imagination is too small
to think of anything other than this line of thought leading back
towards programming language semantics. Maybe it would help if you gave
a more specific example of what you're looking for.

Mitch

Chris F Clark

Apr 10, 2007, 12:31:59 PM

the.theorist wrote:
> ...
>> See, what I'm after is an algebra of algorithms. I'd like to know if
>> it's possible to take some small set of simple algorithms, and by
>> repeated application of some operators build up to more complex
>> algorithms. An Algebra or Calculus of Algorithms, if you will.
> ...

Patricia Shanahan <pa...@acm.org> wrote a good reply including:
...


> If we could not build up more complex algorithms from simple ones, there
> would be little point in studying standard algorithms. Very few people
> are directly interested e.g. in finding the shortest path between two
> vertices in a graph. The algorithm for doing that is important because
> we can use it as a unit in programs that solve real-world problems.

So, the answer to your question is both "yes" and "no" (and the "no"
is actually a "yes" also, as I'll get to).

For the first yes, we do have a "calculus" of combining algorithms
(actually numerous ones). In one version, its operations are
sequencing, alternation, and looping. In another, we have recursion,
composition, and alternation. Note that the adherents of functional
programming (roughly) believe that one can live by the lambda calculus
and algebras built on top of it. Gries, in _The_Science_of_Programming_,
did a pretty good exposition of a calculus and algebra for imperative
programming based on the work of Dijkstra and Hoare.

And, that brings up the first "no" answer. The calculus isn't usually
defined in terms of ands and ors. That being said, people have done
algorithm calculi based on "and", "or", and composition. I recently
read an article on building complex visitors, where they used those
operators. However, "and" and "or" aren't always the best operators
for combining algorithms. So, you could probably build the calculus
that you originally proposed. It just isn't likely to be
revolutionary work, since we already have calculi at roughly that
level.
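
For what it's worth, here is one shape such an "and"/"or" calculus can
take (my own toy Haskell sketch, not the article's actual code): model
an algorithm as a step that may fail on some inputs, let "or" mean
try-the-first-then-fall-back, and let "and" mean both-in-sequence.

-- An algorithm modelled as a step that may fail on some inputs.
type Step a = a -> Maybe a

-- "or": try the first algorithm; if it fails, try the second.
orElse :: Step a -> Step a -> Step a
orElse f g x = case f x of
                 Just y  -> Just y
                 Nothing -> g x

-- "and": both must succeed, one after the other.
andThen :: Step a -> Step a -> Step a
andThen f g x = f x >>= g

-- Two tiny "algorithms" on integers.
halveIfEven, decrementIfPositive :: Step Int
halveIfEven n         = if even n then Just (n `div` 2) else Nothing
decrementIfPositive n = if n > 0 then Just (n - 1) else Nothing

-- A combined algorithm: halve when possible, otherwise decrement.
shrink :: Step Int
shrink = halveIfEven `orElse` decrementIfPositive

This is essentially the flavour of rewriting-strategy combinators; it
works, it just isn't likely to be revolutionary.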

Note that the "problems" aren't at that level either. People can
write provably correct programs using simple algorithms and
compositional methods today. However, most people don't do that.

Jym

Apr 10, 2007, 2:50:31 PM

On Tue, 10 Apr 2007 17:03:17 +0200, Mitch <mah...@gmail.com> wrote:

> On Apr 10, 12:32 am, "the.theorist" <the.theor...@gmail.com> wrote:
>>
>> > In which area?
>>
>> The area of studying algorithms as mathematical objects, with closure
>> under some set of operations.
>
> We're not sure how to answer, probably because of the cultural baggage
> that goes with the terms you are using.
>
> 'Algorithms' (or anything with algorithm in it) usually refers to the
> study of design and analysis of specific problems: e.g. shortest
> paths, string matching, matrix multiplication, satisfiability of
> logical formulas. Take a particular problem, solve it.
>
> With that in mind, there really isn't much idea of studying operations
> on those algorithms; one might study a set of operations that are used
> -in- an algorithm, but not operations between algorithms (er... except
> maybe reduction).

Well, there is also some sort of combination-as-closure that can be seen,
typically, in the definition of primitive recursion (primitive recursive
algorithms being those built from certain basic structures and closed
under certain operations).
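
Spelled out, that closure reads like a tiny combinator library: a
couple of base functions plus one combining operation, and the
primitive recursive functions fall out. A rough Haskell sketch
(naturals faked with Integer, first argument assumed non-negative,
names mine):

-- Base functions over the naturals (faked with Integer).
zero :: Integer -> Integer
zero _ = 0

succN :: Integer -> Integer
succN n = n + 1

-- Closure operation: primitive recursion on the first argument.
-- primRec g h defines f with  f 0 y = g y  and  f (n+1) y = h n (f n y) y.
primRec :: (Integer -> Integer)
        -> (Integer -> Integer -> Integer -> Integer)
        -> Integer -> Integer -> Integer
primRec g _ 0 y = g y
primRec g h n y = h (n - 1) (primRec g h (n - 1) y) y

-- Addition and multiplication, built only from the pieces above.
add :: Integer -> Integer -> Integer
add = primRec id (\_ rec _ -> succN rec)

mul :: Integer -> Integer -> Integer
mul = primRec zero (\_ rec y -> add rec y)

-- add 2 3 == 5,  mul 2 3 == 6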

This work later led to similar characterisations of complexity classes
(Implicit Computational Complexity), such as the bounded recursion of
Cobham or the safe recursion of Bellantoni and Cook.
Again, they show that the set of functions computed by a given algorithmic
schema (some basic operations and closure under some other operations) is
exactly FPtime. Is this really studying operations between algorithms? I
don't know...
These works, and later ones, however, deal with the study of algorithms
and not functions, in the sense that for a given function some algorithms
will be caught and some others won't. Typically, almost none of the ICC
models I'm aware of is able to show that quicksort is Ptime (or even simply
terminates), while insertion sort is usually not a problem.

Later work includes people such as N. Jones, A. Ben-Amram, C.S. Lee (Size
Change Termination), M. Hofmann (Non Size Increasing programs), K.H. Niggl
(the \mu measure), D. Leivant (tiering), J.Y. Marion (my PhD supervisor,
tiering also), myself (Quasi-interpretations), L. Kristiansen (mwp
bounds), P. Baillot, K. Terui (DLAL), R. Amadio, G. Bonfante, and lots of
other people (including numerous PhD and master's students whose names I do
not all remember).

However, we're not currently really studying algorithms as mathematical
objects in an "algebra of algorithms" sense. I guess such an algebra could
be built with the right insight, but the theory of algorithms is quite
nonexistent (it is not even easy to agree on what an algorithm is; cf. also
the works of Gurevich or Moschovakis) compared to the theory of functions,
so it will probably take some time before such a "structured" achievement
can be made.

>> Now, since nobody has applied Category Theory to the study of
>> algorithms in the manner I was talking about, perhaps there have been
>> difficulties, and it's been too hard. If that's the case, then: What,
>> specifically, are those difficulties?

> 2) if you really do mean -algorithms-... my imagination is too small
> to think of anything other than this line of thought leading back
> towards programming language semantics. Maybe it would help if you gave
> a more specific example of what you're looking for.

I guess one of the main difficulties is to get a precise definition of what
an algorithm is, one on which everyone can agree.
Then, I'm not sure category theory will be the best-suited tool for the
job... I'm currently looking at a way to use a tensor algebra to study
algorithms. It's only a wild idea at the moment, even if some examples seem
to work quite well...

--
Hypocoristically,
Jym.

Jamie Andrews; real address @ bottom of message

Apr 10, 2007, 4:01:43 PM

the.theorist <the.th...@gmail.com> wrote:
> See, what I'm after is an algebra of algorithms. I'd like to know if
> it's possible to take some small set of simple algorithms, and by
> repeated application of some operators build up to more complex
> algorithms. An Algebra or Calculus of Algorithms, if you will.

Others have answered this in ways that I agree with.
Here's another view on the same opinions.

In order to create a calculus of algorithms, you need to
have mathematical objects standing for algorithms. In order to
do this, you have to formalize algorithms in some way. I can't
see any way of formalizing them that would not end up as some
kind of simple programming language. You would therefore end up
in the well-studied area of category-theoretic foundations of
programming languages.

Put another way, the thing that distinguishes an algorithm
from a program in a programming language is that there are some
more or less vague, unformalized aspects to it, such as English
sentence fragments that we assume the reader can convert to the
more formal text required by a programming language. So, the
nature of an algorithm (as opposed to a program) is antithetical
to formalism.
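
Indeed, try writing down a datatype of "algorithms" and what comes out
is immediately a small imperative language. A sketch in Haskell (the
constructor names are invented for illustration):

-- Any attempt to formalize "algorithm" tends to end up looking like
-- this: the abstract syntax of a minimal While-language.
data Expr = Lit Integer
          | Var String
          | Add Expr Expr
          | Less Expr Expr

data Algorithm
  = Skip
  | Assign String Expr
  | Seq Algorithm Algorithm          -- concatenation
  | If Expr Algorithm Algorithm      -- alternation
  | While Expr Algorithm             -- repetition

-- Example: "sum the numbers 1..n" as a term of this little language.
sumTo :: Algorithm
sumTo = Seq (Assign "s" (Lit 0))
            (While (Less (Lit 0) (Var "n"))
                   (Seq (Assign "s" (Add (Var "s") (Var "n")))
                        (Assign "n" (Add (Var "n") (Lit (-1))))))

At that point one is doing programming language theory, whether one
wanted to or not.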

As far as developing a system that will give you
information about how fast a particular program will run, Martin
Hofmann (Univ. of Munich) and others have worked on type systems
that do this. See

Programming languages capturing complexity classes
ACM SIGACT News 31(1), 31-42, March 2000

for a survey.

--Jamie. (efil4dreN)
andrews .uwo } Merge these two lines to obtain my e-mail address.
@csd .ca } (Unsolicited "bulk" e-mail costs everyone.)

the.theorist

Apr 11, 2007, 3:22:56 PM

Everyone has given me some really great references, thank you all!

>Jym wrote:
>I guess one of the main difficulties is to get a precise definition of what
>an algorithm is, one on which everyone can agree.
>Then, I'm not sure category theory will be the best-suited tool for the
>job...

>Mitch wrote:
>The closest I can imagine to what you are looking for is really in -
>programming language theory-, where you -do- compose separate pieces
>of (arbitrary) code with different operations, and there is a well
>studied (and continuing study) application of category theory to
>programming languages.

It's quite fascinating that the semantics of programming languages was
touched on. One of the reasons that I wanted an algebra of algorithms
is to study the relationship between the algorithm (which hasn't been
formally defined) and its expression in a particular language. I
thought that by attempting a formalization of the algorithm, as a
mathematical object, this relationship could be made a little more
clear. This is why I thought Category Theory might help, as a unifying
lens.
( The Sapir-Whorf style relationship between what we think
(algorithms) and what we say (programs) is what I'm aiming to go to
grad school to study. Also how that relationship could help guide the
creation of computer languages. )

>Mitch wrote:
>'Algorithms' (or anything with algorithm in it) usually refers to the
>study of design and analysis of specific problems: e.g. shortest
>paths, string matching, matrix multiplication, satisfiability of
>logical formulas. Take a particular problem, solve it.

It really is unfortunate that I don't have any particular examples of
what I mean by 'algebra of algorithms'.
It was the particularity of a solution to a problem that I wanted to
get rid of. I was hoping that a study of algorithms could be made that
would free each algorithm from its problem domain. Algorithms found in
graph theory are useful in the practical world, but it's human insight
and experience that makes such a connection. I mean, there isn't any
theory that would tell us where a particular algorithm might be
applicable. My aim was to abstract the notion of an algorithm so that
it could be given clear, specific ties to all the rest of mathematics,
connections that would be formally established rather than ad hoc. I'd
really find it nice if there were a system that would aid our insight
into the connection between an algorithm and its various uses in the
real world.

>Jym wrote:
>However, we're not currently really studying algorithms as mathematical
>objects in an "algebra of algorithms" sense. I guess such an algebra could
>be built with the right insight, but the theory of algorithms is quite
>nonexistent (it is not even easy to agree on what an algorithm is; cf. also
>the works of Gurevich or Moschovakis) compared to the theory of functions,
>so it will probably take some time before such a "structured" achievement
>can be made.

>....


>I'm currently looking at a way to use a tensor algebra to study
>algorithms. It's only a wild idea at the moment, even if some examples seem
>to work quite well...

Interesting that you mention Moschovakis; he was my set theory teacher
while I was an undergrad at UCLA.
I'd really like to see some of those examples; it might assist in
clearing the fog.

>Chris F Clark wrote:
>Note that the "problems" aren't at that level either. People can
>write provably correct programs using simple algorithms and
>compositional methods today. However, most people don't do that.

Very good point. To get back to my originally stated intent: to find a
means of moving us to a parallel paradigm.
I've got a half-baked thought that such a transition could be made if
a language were invented in which expressing oneself in a non-parallel
fashion would be impossible, and the language would enforce this in
its grammar. Such a language would prohibit operations that weren't
idempotent, and would probably end up quite functional in nature. That
way you don't have to work for parallelization; it would come
naturally in that language.


>Jamie Andrews wrote:
> In order to create a calculus of algorithms, you need to
>have mathematical objects standing for algorithms. In order to
>do this, you have to formalize algorithms in some way. I can't
>see any way of formalizing them that would not end up as some
>kind of simple programming language. You would therefore end up
>in the well-studied area of category-theoretic foundations of
>programming languages.

At this point, I quite agree: it's not possible to create an algebra
of algorithms, as I'd originally intended. Through this discussion
(again, thank you all), it's become increasingly evident that
algorithms themselves are simply too powerful to be reduced to some
sort of algebra. In attempting to do so, you'd end up creating a mini-
programming language, and wind up precisely where you started.

However, I haven't seen a formal proof that this is in fact the
situation; such a proof would be a nice result to have. Perhaps an
isomorphism could be made between algebras and some class in the
complexity hierarchy, followed by a demonstration that algorithms form
a higher class in the hierarchy? Or perhaps it's been proven already,
and I'm simply unaware of it.

Chris F Clark

Apr 11, 2007, 4:56:26 PM

"the.theorist" <the.th...@gmail.com> writes:

> To get back to my originally stated intent: to find a
> means of moving us to a parallel paradigm.

Actually, there is an interesting point there. I was recently
listening to a many-core researcher who was making "algorithms" for
Google-like huge arrays of many machines. He developed a model that
was "physics"-like, and the "algorithms" assembled themselves,
learning the topology of the network and determining which part of the
problem they were supposed to solve.

His algorithms were in some sense both very simple and very complex
and he combined them with and/or type logic. However, it is hard to
describe his work without having you "see" his demonstrations. The
image that kept coming to my mind is that he was programming by taking
eyedroppers of the algorithmic parts and letting them diffuse through
the "solution" in the network.

There is definitely a functional programming connection here, which is
not surprising since the lack of side-effects in functional
programming tends to make it particularly suitable for parallel
replication.
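
As a small, concrete illustration of that last point (a sketch assuming
GHC and its 'parallel' package; Control.Parallel.Strategies and parMap
are real, the surrounding example is a toy): because the function below
has no side effects, the runtime is free to evaluate the calls on as
many cores as it likes, and the program text never mentions threads.

import Control.Parallel.Strategies (parMap, rdeepseq)

-- A pure (side-effect-free) function: nothing about it depends on
-- evaluation order, so evaluating many calls at once is always safe.
expensive :: Integer -> Integer
expensive n = sum [1 .. n * n]

main :: IO ()
main = print (sum (parMap rdeepseq expensive [1 .. 200]))
-- parMap sparks one evaluation per list element; compiled with
-- -threaded and run with +RTS -N, they are spread over however many
-- cores are available.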
