The automorphism discussion


Russ

Sep 12, 2010, 2:34:53 PM
to kappa-developers
Eric's first post:

Jean,

I am very excited about the new implementation and the new
conventions. Having arbitrary algebraic expressions for rates and
plots on top of a faster simulator is just excellent!!

One quick point, however. Would there be any interest in having a
"mass action" automorphism convention that users could use? I know I
have been harping on this for a while, but it might get cumbersome for
users to always have to notice and correct for automorphisms in their
rules, and it might lead to some nasty "bugs" that non-specialized
users have trouble finding.

Just in case anyone is unfamiliar with what I mean, in the 'mass
action' convention we divide by "within-component" automorphisms, but
NOT "between component" automorphisms. So the rule A(x!1),A(x!1) ->
A(x),A(x) would have its rate divided by 2, but the reverse rule
A(x),A(x) -> A(x!1),A(x!1) would not have its rate modified. This is
meant to make the specified rates in the kappa model match with the
rates one would generally define if one were making an ODE model of
these kinds of reactions.
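
To make the arithmetic concrete, a toy Python sketch of the activities
this convention would assign (copy numbers and rate constants are
made-up placeholders):

koff, kon = 1.0, 1.0
n_dimer = 100    # copies of A(x!1),A(x!1) in the mixture
n_free  = 200    # copies of A(x)
# dissociation: 2 embeddings per dimer, divided by the
# within-component automorphism (2): each dimer counts once
act_diss = koff * n_dimer
# association: all ordered pairs, with NO division by the
# between-component automorphism, matching the kon*[A]^2 term
# one would write in the corresponding ODE model
act_assoc = kon * n_free * (n_free - 1)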

I realize that the Danos convention is the most flexible and general,
but honestly 99.9% of people making models with kappa will be
implicitly assuming the mass action convention that is universally
used in ODE modeling (and I think it is used in BNGL, although I am
not sure about that). This conflict between user assumptions and
kappa reality could lead to a lot of grief for new users. I know
there has been some resistance to this idea in the past, and some
exasperation that we cannot decide on a convention, but I really think
it would make sense to give users the 'mass action' option.

Anyway, people can feel free to vent about my stance on
automorphisms. Regardless, I am really excited about the new
implementation!!

Cheers,
Eric

Russ

Sep 12, 2010, 2:36:12 PM
to kappa-developers
Vincent's reply to Eric:

Eric> Just in case anyone is unfamiliar with what I mean, in the 'mass
action' convention we divide by "within-component" automorphisms, but
NOT "between component" automorphisms. So the rule A(x!1),A(x!1) ->
A(x),A(x) would have its rate divided by 2, but the reverse rule
A(x),A(x) -> A(x!1),A(x!1) would not have its rate modified. This is
meant to make the specified rates in the kappa model match with the
rates one would generally define if one were making an ODE model of
these kinds of reactions.

perhaps one way to reconcile everyone is to write
A(x!1),A(x!1) -> A(x),A(x) @ koff
A(x),A(x) -> A(x!1),A(x!1) @ kon

and later in another file
koff = k/2
kon = k'
(if you want the mass action gameplay)

one advantage is that we can manipulate a file of parameters
independently?

Russ

Sep 12, 2010, 2:37:14 PM
to kappa-developers
Russ's reply to Vincent:

Surely the best thing is for the simulator to have run-time options
that allow it to interpret rate constants however the user wishes.

"simplx --ssa ... " or "simplx --eric ... " or whatever

This is a never-ending conversation otherwise and nobody agrees on
which convention is best: my preferred convention is different again
from SSA's, Vincent's, or Eric's!

r.

Russ

Sep 12, 2010, 2:37:57 PM
to kappa-developers
Nicolas' reply to Eric:

Eric> Anyway, people can feel free to vent about my stance on
automorphisms.
Regardless, I am really excited about the new implementation!!

I like the Danos convention, because I find it clearer in corner
cases.

What happens in rules where there is not always an automorphism but
some instances of
the rule have an automorphism?

A(s~p), A() has its own rate
whereas
A(s~p), A(s~p) has a rate / 2

I find that surprising, especially because people could expect the
first rate to become
rate/2 dynamically in the case where s~p holds on the second agent
(why not? it is an
automorphism, at that point), which does not seem technically
feasible.

It seems to me the other convention is feasible for Petri nets but
lacks consistency
for rules in a richer language.

Best regards,

Nicolas.

Russ

Sep 12, 2010, 2:38:51 PM
to kappa-developers
Russ's reply to Nicolas:

hi Nicolas,

The real issue is not the absolute automorphisms of the LHS but which
of them are preserved by the action of the rule. Such an automorphism,
by definition, cannot be detected by the mechanism of interaction and
so most definitely should be used to divide the rule's activity. OTOH,
an automorphism that is not preserved by the action is detectable by
the mechanism and should not be used to divide activity.

This is my preferred convention: events = embeddings / preserved
symmetries.
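
To illustrate with a toy sketch (the per-agent "edit" encoding below
is invented for illustration; real Kappa actions are richer):

# what each rule does to its two LHS agents, crudely encoded
sym  = [("bind x", "keep z"), ("bind x", "keep z")]      # A(x,z~0),A(x,z~0) -> A(x!1,z~0),A(x!1,z~0)
asym = [("bind x", "keep z"), ("bind x", "flip z to 1")] # A(x,z~0),A(x,z~0) -> A(x!1,z~0),A(x!1,z~1)

def swap_preserved(action):
    # the swap automorphism of a two-agent LHS is preserved by the
    # action iff both agents undergo the same edit
    return action[0] == action[1]

print(swap_preserved(sym))   # True  -> divide activity by 2
print(swap_preserved(asym))  # False -> two distinct events, no division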

Cheers
russ

Russ

Sep 12, 2010, 2:39:49 PM
to kappa-developers
Jerome's message:

Hi,

Since we are discussing a convention,
I prefer backward compatibility.

Do we have a clear enumeration of the desired conventions?

(It is quite important to know which ones we want to be supported
before building
the engines, since a lot of features are related to each other.)

Cheers.
Jerome.

Russ

Sep 12, 2010, 2:40:35 PM
to kappa-developers
Russ's reply to Jerome:

hi Jerome,

You mean you want a list of all the conventions we would like to
support?
I would say:

- Vincent's convention: event = embedding of LHS
- SSA convention: event = embedding of LHS / all automorphisms of LHS
- Eric's convention: event = embedding of LHS / product of
automorphisms of each connected component (i.e. divide by intra-
symmetries only)
- Russ's convention: event = embedding of LHS / all automorphisms of
LHS preserved by the action

I don't care which convention is the "native" convention of the
simulator as long as run-time options allow a user to choose his
preferred convention, i.e. transparently specify what he means by his
rate constants without having to do any numerical conversion by hand.
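
To make the four options concrete, here is a toy tabulation of the
divisor each convention applies to the embedding count (a sketch, not
simulator output; rule T is the asymmetric variant discussed elsewhere
in this thread):

# rule S: A(x),A(x) -> A(x!1),A(x!1)                  (symmetric action)
# rule T: A(x,z~0),A(x,z~0) -> A(x!1,z~0),A(x!1,z~1)  (asymmetric action)
# both LHSs have one non-trivial automorphism (swapping the two As),
# and it lives *between* the two connected components
divisors = {
    #                             rule S  rule T
    "Vincent (embeddings)":        (1,     1),
    "SSA (all LHS autos)":         (2,     2),
    "Eric (intra-component only)": (1,     1),
    "Russ (preserved autos)":      (2,     1),  # swap not preserved by T
}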

While we are having this conversation, what do you all think about a
kappa file specifying a volume and rate constants being deterministic?
Appropriate stochastic (sic) rate constants can then be calculated
trivially by the engine. At the moment, this is done entirely at the
level of the GUI in rulebase; is that good enough or should it be
built in to the engine?
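
For reference, the trivial calculation in question, for a bimolecular
rule, would be the standard conversion (numbers below are
placeholders):

AVOGADRO = 6.02214076e23               # molecules per mole
volume   = 1e-15                       # litres; roughly an E. coli cell
k_det    = 1e6                         # deterministic constant, /M/s
k_stoch  = k_det / (AVOGADRO * volume) # per-pair stochastic rate, /s
# unimolecular rate constants pass through unchanged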

Cheers
russ

Russ

Sep 12, 2010, 2:41:57 PM
to kappa-developers
Vincent's reply to Russ:

Russ> - Vincent's convention: event = embedding of LHS
Russ> - SSA convention: event = embedding of LHS / all automorphisms
of LHS
Russ> - Eric's convention: event = embedding of LHS / product of
automorphisms of each connected component (i.e. divide by intra-
symmetries only)
Russ> - Russ's convention: event = embedding of LHS / all
automorphisms of LHS preserved by the action

so according to Russ's convention:
1. an event for A(x),A(x) -> A(x!1),A(x!1) is an unordered pair of As,
whereas
2. an event for A(x,z~0),A(x,z~0) -> A(x!1,z~0),A(x!1,z~1) is an
ordered pair of As.

Independently of the convention, I like the way Russ phrases it, in
terms of what an event "is".

because of 2., it is unreasonable to use SSA, as different
representatives of an event
can sometimes be distinguished; here is the usual example:
x = A(x,z~0,t~0),A(x,z~0,t~1) goes to either of
y1= A(x!1,z~0,t~0),A(x!1,z~1,t~1) or
y2= A(x!1,z~1,t~0),A(x!1,z~0,t~1)
depending on which match one chooses
because of the "tags" t~1 and t~0 breaking the symmetry,
y1 and y2 are not isomorphic
(note that this will never happen in Petri nets, so SSA in PN is ok)

for the same reason, Eric's convention fails (needs another example
though, left to the reader)

there remain mine and Russ's, which are both compatible with the
"one event - one transition" principle
(it connects to conversations we had with
Eric, Russ and Nicolas on the simplicity of the underlying TS)

Russ's is the maximal quotient that is compatible with the principle;
it relies on the notion of "mechanism", the idea being that if two
LHS-symmetric matches never incur any possible distinctions
between the results, then the mechanism itself is symmetric,
and the rate is the rate of that mechanism, not of the syntactic
ways in which we describe its occurrences.

It is quite likeable I think and
very consistent - and as such it has some educational value
(and perhaps some heuristic value for later extensions, e.g. with
truly geometric constraints). (Does the refinement formula remain
exactly the same as in the plain convention?)

it might even be the "right one" on some abstract footing?

Russ> what do you all think about a kappa file specifying a volume and
rate constants being deterministic? Appropriate stochastic (sic) rate
constants can then be calculated trivially by the engine. At the
moment, this is done entirely at the level of the GUI in rulebase; is
that good enough or should it be built in to the engine?

I think it could be a nice option; eg if the volume varies during the
simulation,
or if you have multiple compartments (as in the rule-diffusion systems
Donal
has implemented, of which more later)

Russ

Sep 12, 2010, 2:43:53 PM
to kappa-developers
Walter's message:

I had some discussions with Russ on this.

First off, there are many possible ways of slicing this pie - the four
listed by Russ each have some reasonable narrative. Some of these
narratives are formal, some physical. I think we need to be
pluralistic on this and leave it up to the user by enabling several
options, as Russ argues.

But first let me suggest a renaming of these scenarios (though not
necessarily names for the actual command line option). It makes no
sense to let private language slip into option names. Thus, --eric
makes no sense to my mother (though it may make sense to Eric's).

event-based convention (Vincent's convention): event = embedding of
LHS
physical convention (SSA convention): event = embedding of LHS / all
automorphisms of LHS
empirical convention (Eric's convention): event = embedding of LHS /
product of automorphisms of each connected component (i.e. divide by
intra-symmetries only)
mechanism-based convention (Russ's convention): event = embedding of
LHS / all automorphisms of LHS preserved by the action

The scenarios differ in how much structure your preferred view of
the world sees in collisions (events) between complexes. Since Kappa
sees events as site graphs induced by left hand sides of rules (but
see item 4 below), a translation may have to be made from your
preferred view into Kappa's and vice versa. In all of this, the
principal subtlety is to remain fully aware of the meaning of a rule
(to which Russ contributed great hygiene). The translations that were
begun by Vincent (via neutral refinement), then continued by Russ and
Eric by squaring off physical/empirical and formal views, helped
us understand the "meaning of a rule" better.

Here's my poor man's take on this thread, which may need considerable
cleaning. I'm sure you all know this backwards and forwards, so excuse
the repetitiousness.

(0)

"Physical Events" come in two flavors:

- reactions between objects whose structure is "fully resolved" with
regard to the level of description set by site graphs.

- reactions between proper names, in which objects are black boxes;
i.e. the way ODEs or simple Petri Nets represent the world.

"Formal Events":

- embeddings of site graphs into a mixture, which, in Kappa, is an
object that cannot be further refined given a particular contact map.
(Such objects also appear on the LHS and RHS of what we call "reaction
rules".)

(1)

The Physical View knows about the internal structure of objects and
"fully resolves" events and objects up to symmetry. Thus the activity
of *reaction rules* (the only type of rules considered by the Physical
View) is divided by autos of objects *and* autos between objects
(symmetries of Physical-Event configurations). That's the standard
case and assumes that, conceptually, rate constants are "first-
principles rate constants".

The translation from the Kappa viewpoint is subtle, however, because
it already involves a take on the "meaning of a rule". That take was
clarified by Vincent by appeal to the notion of a neutral refinement.
In essence, given a rule, the set of all reaction rules derived from
it must be a neutral refinement. For a reaction rule to be a (member
of a) neutral refinement, the reaction rule must *ignore* any
symmetries it breaks or makes compared to the original rule; otherwise
it would contradict the mechanism asserted in the rule. For example,

r: A(x!1), B(x!1) -> A(x), B(x) @ k

might refine into a set of reaction rules of which one might read like

s: A(x!1,y!2), B(x!1), A(x!3,y!2), B(x!3) -> A(x!1,y!2), B(x!1),
A(x,y!2), B(x) @ j

The rule is asymmetric, but the reaction rule is symmetric. The point
is that if the rule asserts what you believe, the mechanism implied
must be blind to the symmetry of the reaction rule. To translate that
into a rate constant for s requires knowing the convention adopted by
the simulator implementation. Assume that convention is "divide the
number of embeddings by all autos of the LHS". I want the firing
probability of each Formal Event (embedding) of s to be exactly the
same (i.e. neutral refinement) as the firing probability of each
Formal Event of r. Thus, under the implementation convention, I must
set j = 2k, to compensate for what the simulator does. In so doing,
each Formal Event of s will be firing at the same rate as each Formal
Event of r, namely k.
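
(A quick check of the counting, under the stated convention: the LHS
of r has no non-trivial automorphisms, so each embedding of r fires at
rate k; the LHS of s has the swap automorphism, so the simulator fires
each embedding of s at rate j/2; neutrality then forces j/2 = k, i.e.
j = 2k.)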

Same thing in case r is symmetric and the refined reaction rule breaks
the symmetry.

r: A(x!1), A(x!1) -> A(x), A(x) @ k

s: A(x!1,y!2), A(x!1), B(y!2) -> A(x,y!2), A(x), B(y!2) @ j

Rule r asserts that the mechanism it describes cannot detect
asymmetries, even if the larger context in which it might be embedded
may contain such asymmetries; the mechanism is blind to anything
beyond A(x!1), A(x!1). For this to hold in s, j must be set to k/2. In
that case, as before, every Formal Event of r and every Formal Event
of s fire with the same likelihood (hence neutral refinement), namely
k/2 (assuming the implementation convention above). The blemish that
Vincent dislikes is that we don't notate the reaction rate constant
per "Formal Event" (k/2) but per Physical Event (k). But it only
shows up when you think through a refinement. If you do, you are
likely to be a user who understands the framework and there should be
no problem. If you want to think in terms of Physical Events at the
level of rules, as most do, there is no blemish (as you wouldn't
reason in terms of neutral refinements). The Physical Event convention
makes pragmatic sense.
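
(The counting, mirrored: the LHS of r now carries the swap
automorphism, so each of its embeddings fires at k/2 under the stated
convention; the LHS of s has no non-trivial automorphisms, so each of
its embeddings fires at j; neutrality gives j = k/2.)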

The Formal Event view is important for theorizing about a system,
because the combinatorial factors are made visible in the translation
to the Physical Event view. These factors are key in understanding the
relative weights introduced by a non-neutral refinement (when you
choose whatever rate constants you wish) - these biases being relative
to the contribution of each refinement at the neutral point.

This story iterates further when going from reaction rules to
reactions, i.e.

r: A(x!1), A(x!1) -> A(x), A(x) @ k

refines into the reaction rule

s: A(x!1,y!2), A(x!1), B(y!2) -> A(x,y!2), A(x), B(y!2) @ j=k/2

which translates into the reaction:

t: Eric() -> Jean(),Russ() @ m

There is no change in the rate constant vis a vis s (both lack
symmetry and that was accounted for with j=k/2). Thus m = k/2. The
comparison between the rate constants of r and t makes the Physical
View explicit.

(2)

The Empirical View works with empirical rate constants, which absorb
all combinatorial factors. This is the case that Eric is concerned
with. However, read carefully how I portray it, because I'm not sure
this is really what he is saying (but it is the only way I could make
sense of it). The Empirical View inherits from the Physical View the
stance that Physical-Event configurations are up to symmetry, but it
asserts that that symmetry has already been accounted for by the
*measurement* of the rate constant.

Thus, let the k in

r: A(x!1),A(x!1) -> A(x),A(x) @ k

come from experiment. Since experiment does not resolve "A(x!1),A(x!1)"
but only "Jean()", the k is "per Physical Event". Hence it is OK to
have it divided by 2 in the simulator, since the simulator does not
see "Jean()" but "A(x!1),A(x!1)" and counts two Formal Events,
overcounting relative to the physical measurement.
Likewise, in

r: A(x),A(x) -> A(x!1),A(x!1) @ j

Eric argues that the measurement sees "A(x),A(x)" as "Bob(),Bob()" and
implicitly makes the division by 2 already in delivering the
macroscopic rate constant, as the measurement cannot resolve the
ordering of the Bob()'s. Thus the empirical j is already adjusted
and *no* division by 2 should be made *by the simulator*
(alternatively, Eric would multiply j by 2 to offset the simulator
action).

In summary, Eric argues that the Empirical View cannot see any
structure *internal* to the objects - it only sees proper names (thus
his reference to ODEs). As a consequence, any reference to internal
structure by Kappa must be undone - either the simulator already does
it by the SSA convention as in the case of the dissociation, or you
have to do it, as in the case of the association. However, Eric wants
an option that does it automatically.

(3)

Vincent's view is the pure Formal Event view. I think its meaning is
clear from the discussion in (1). It is theoretically very clean and
valuable if you do theory with Kappa, but too
"disruptive" if one simply wants to do simulations using the
traditional Physical or Empirical world views.

(4)

Now, finally, the Mechanism-based view of Russ. That view clarifies
one thing about the meaning of rules that was not yet clarified by the
neutral refinement take: the fact that in A(x,y), A(x,y) you may have
*two* bona fide mechanistic paths for a reaction. If the reaction is:

r: A(x,y), A(x,y) -> A(x!1,y), A(x,y!1)

then there are two events, since there are two bonding opportunities,
as there are two (x,y) pairs. This is the analogue of the BNGL case of

b: A(s~0,s~0) -> A(s~1,s~0)

where agent interfaces are multisets.

The issue arises because now "symmetry" reaches into what the
interface "can do", which you can't know unless you look at the RHS of
the rule. Either we redefine the notion of "Formal Event" (and
accordingly all translations in (1) and (2), and add a translation to
(3)) or we treat this as another view (my preference), which I would
call the Mechanism-Based view, because you have to look at the RHS to
assess the full mechanism and discover that the LHS has two "Sites of
Action". I can see the physicality of it.

In any case, options are really very much needed, along with the
Russetta Stone.

w

(By the way, not sure whether you knew that BNGL makes its reaction
rate dependent also on the molecularity of the products, which is
biophysically not entirely unwarranted, because of entropic effects -
think explosion...)

Russ Harmer

Sep 12, 2010, 3:55:38 PM
to kappa-developers
1.

Vincent> because of 2. it is unreasonable to use SSA as different
Vincent> representatives of an event
Vincent> can be sometimes distinguished;
Vincent> (note that this will never happen in Petri nets, so SSA in PN is ok)

In the context of a contact map where

A(x,y), A(x,y) -> A(x!0,y), A(x,y!0) @k

cannot be further refined, the SSA convention collapses the two possible events. To get the expected kinetics would require a manual tweaking of the rate constant to 2k. This is the "structured version" of the issue in the world of structureless reactions/PNs where there is only one syntactic expression of homo-dimerization:

A + A -> A_2 @k

so how do you tell the difference between a symmetric and an asymmetric mechanism? Well, depending on your simulator's convention:
- SSA simulator => asymmetric mechanism requires rate constant 2k
- non-SSA simulator => symmetric mechanism requires rate constant k/2

So, while Vincent is correct to say that SSA "is OK" for PNs, this is only because certain aspects of mechanism are encoded in the rate constants. That this is necessarily the case is, to my mind, an irrefutable argument against the use of structureless reactions/PNs, period.
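
In Gillespie terms, the point is just this (a numeric restatement with
a made-up copy number):

n, k = 1000, 1.0
pairs = n * (n - 1) // 2       # unordered pairs of As
a_sym  = k * pairs             # SSA propensity for A + A -> A_2 @ k
a_asym = (2 * k) * pairs       # asymmetric mechanism: n*(n-1) ordered
assert a_asym == k * n * (n - 1)  # events, recovered by passing 2k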

Walter> The Physical View knows about the internal structure of objects and
Walter> "fully resolves" events and objects up to symmetry. Thus the activity
Walter> of *reaction rules* (the only type of rules considered by the Physical
Walter> View) is divided by autos of objects *and* autos between objects
Walter> (symmetries of Physical-Event configurations). That's the standard
Walter> case and assumes that, conceptually, rate constants are "first-
Walter> principles rate constants".

Maybe I misunderstand what Walter is saying here, but it seems to me he is wrong; for the same reason as above.

In any case, our reference point in discussing automorphisms and rules should not be reactions of any kind. A rule is a formal mechanistic hypothesis that states all and only what you hypothesize to be relevant to the interaction. Walter's claim that the physics actually "knows" more than that seems to me tantamount to saying that rules are just a convenient fiction, and not really mechanistic hypotheses. But if a rule is correct [admittedly, hard to ever verify...] then it really does state all and only those aspects of the physics that actually matter. So what more can the physics know? This is not splitting hairs. I find it bizarre that we are still occasionally sliding back into this extensional physicist's way of thinking where "fully resolved" objects actually exist -- in the sense of "esse est percipi". If we really truly take rules seriously, these fully resolved wotsits do not exist. And the SSA convention -- that relies on them existing -- makes absolutely no sense.

2. I am uncomfortable with [Walter's take on] Eric's empirical point of view simply because I am deeply suspicious of structureless reactions, period. Packaging aspects of mechanism into rate constants [which is what this convention is actively advocating] just seems wrong-headed to me. That these are the rate constants that can be measured in the lab is not a reason to short-cut our description of the actual underlying mechanism, IMHO. Nor a reason to ascribe the "same" mechanism to two actually quite different processes.

3. My personal feeling -- and I see that Vincent seems to agree with me -- is that only the Vincent and Russ conventions are kosher w.r.t. rules as mechanistic hypotheses. Vincent's is agnostic to mechanism whereas mine takes it as seriously as possible. I am currently working out the mathematical side of all this, including a fully semantic treatment of "actions", and notably how these two conventions differ w.r.t. refinement. Clearly, my convention leads to a more subtle calculation of neutral refinement since the preserved symmetries of a refined LHS can in general be quite different to those of the original rule. But this is precisely what we would expect, given that a refined rule is hypothesizing a different mechanism to the original rule.

Cheers
russ

PS -- None of the above should be taken to mean that we should not have run-time options for the SSA or Eric conventions; we certainly should. My arguments are just about the ultimate conceptual status [or apparent lack thereof] of these conventions.

Vincent Danos

Sep 12, 2010, 6:07:03 PM
to Russ Harmer, kappa-developers


On Sun, Sep 12, 2010 at 8:55 PM, Russ Harmer <russ....@gmail.com> wrote:
I find it bizarre that we are still occasionally sliding back into this extensional physicist's way of thinking where "fully resolved" objects actually exist -- in the sense of "esse est percipi". If we really truly take rules seriously, these fully resolved wotsits do not exist. And the SSA convention -- that relies on them existing -- makes absolutely no sense.

to second that, there is no fully resolved description of say "transcription" 
of a gene at the level of a reaction;
that would mean that one transcription step is described as A -> B;
I would be more cautious than Walter in giving reactions an ontological premium;
maybe biochemical network realities are best captured by rules after all;
at the end of the day it is a matter of who invents the best info/statistical mechanics
of the aforementioned biochemical networks.

PS: total respect for the "esse est percipi"


Walter

Sep 12, 2010, 8:22:49 PM
to kappa-developers
Ho ho ho! Hold the horses, Russ and Vincent.

I go out for some innocent shopping and come back to see the group
taken over by the religious left.

Read what I write. Never ever have I said or claimed anything about
what "physics actually knows". What do I know about what physics
actually knows? What do you, for that matter? My exposition was simply
a review of the NARRATIVES (the informal stories) that go along with
VIEWS, i.e. stances, assumptions; hence my term the "Physical View"
etc. Notice even the capitalization, to separate the term from
whatever is "actually going on", which neither you nor I can know for
sure.

Indeed, this is not splitting hairs. But for crying out loud, stop
pretending to be a fundamentalist. If you assert at every turn that a
rule is forever subject to change and hard to verify etc etc, then I
find it bizarre to complain that I'm not taking rules seriously enough
only because I'm taking seriously that there are other readings of the
world - readings that do NOT claim any ontological substance, priority
or whatever, but exist entirely for PRAGMATIC reasons. Gosh, you guys
have some hubris on occasion, don't you?

All I was summarizing were views and their rationale for treating rate
constants. We all are in this group because we believe that Kappa adds
a useful point of view. We even believe it to render the world in a
more meaningful way than any other view. Is that good enough? But this
does not diminish the set of people who may be less interested in
fundamentals than in using Kappa as a plain tool. We can explain to
them why they should adopt the Kappa View, but they should be able to
choose otherwise, if for no other reason than to be compatible with
whatever they did before.

I agree - and I think I did not say otherwise - that Russ's
interpretation of rules as MECHANISMS is kosher. I demur about
Vincent's, but I see the utility of the Formal View. I just don't
connect it with mechanism, but I think it is important.

@ Vincent: the "transcription" example is nonsense. The Physical View
or the Empirical View, as I summarized them, were subordinate to the
Kappa View. In other words, my narrative was: GIVEN RULES, what would
the Physical View amount to? This is why a Kappa rule always preceded
a reaction rule or a reaction. If a Kappa rule is allowed to assert
"-> PI3K()", then the Physical View is allowed to consider translation
as a one-step process, no? These VIEWS are all interpretations of
whatever level of granularity you wish to assert in Kappa or Swahili.
What constitutes the right level is not easy to answer and whatever
the answer, there is nothing in Kappa that would provide it, for Kappa
rules are descriptive and nothing but descriptive. I have NEVER taken
an ontological stance - and the least about reactions or reaction
rules. I think it should be easy to agree that this whole discussion
is about points of view (Physical, Empirical, Formal, Mechanistic -
for want of better labels, although I think these are not bad)
relative to one another and their respective merits. Kappa is a better
rendition of the epistemological situation - that much seems certain.
But when we get to ontology, I suggest we ALL had better be more cautious.
Amen.

w


Russ Harmer

Sep 12, 2010, 11:02:57 PM
to Walter, kappa-developers
This is a discussion about symmetries/automorphisms in rule-based modeling. Four conventions have been suggested; Vincent and I are pointing out that two of them are mathematically completely compatible with the framework of rules whereas the other two require contortions to fit in. This is not hubris nor is it fundamentalism; it is mathematics and that is what I mean by "taking rules seriously". Translating this to an informal discussion of "Views" loses a lot of the clarity that mathematics brings, particularly since it seems to put all four conventions back on the same footing. But this does not mean that those other two conventions should be banished; they probably have pragmatic use -- I even said so explicitly -- and some people will quite justifiably be more comfortable using them.

However, the "Physical View" is confusing as stated because it seems to be inconsistent:
- it cannot possibly accurately describe physics -- even in the unattainable paradise where we know we have written the correct reaction rules -- because (in general) it over-quotients the space of events;
- it therefore must be encoding certain aspects of the physics in the numerical values of rate constants -- which therefore cannot possibly be "first-principles" as you suggest. Witness the need to fiddle the rate constant to obtain asymmetric homo-dimerization under the SSA convention; exactly, of course, what the mathematics already told us.

r.

Walter

Sep 13, 2010, 1:28:39 AM
to kappa-developers
Russ,

We are mostly in agreement. But I do believe that you have a distorted
view of the Physical View. The question here is not whether Kappa
makes more sense than the Physical View. (It does, but for reasons
that are foreign to the Physical View.) The question is simply: what
does the Physical View assert? The answer to this question cannot come
from Kappa mathematics, since the Physical View was forged
independently of it. The Physical View does not know about embeddings.
But why is that the fault of the Physical View (PV)?

The PV refers to the structure of objects. It refers to their
structure, because in its gory details it does quantum mechanics with
these things. In a bimolecular scenario, it takes two objects and
calculates the collision cross-section of a reaction between them.
That cross-section involves some QM calculations on energy surfaces
along "reaction coordinates" and then an integration over all paths.
This is called Molecular Reaction Dynamics and is an active field of
chemical physics. The end result of these calculations is a rate
constant. Obviously, this can be carried out in only very simple
cases, but the principles are spelled out in general. As far as I'm
concerned, this is as first-principles as a reaction rate constant can
get. (We all agree that such rate constants are likely to be a fiction
in actual cell biology, but that is a different issue.)

When a symmetric molecule reacts with something, the PV reaction
constant can be specified "per site" or "per molecule", since you are
doing this calculation on the full structure. A benzene ring with
sixfold symmetry will offer 6 equivalent carbons for a reaction with a
chloride. Nowhere does the PV divide by 6. In fact the PV never
divides, except for inter-symmetries, because it adopts the quite
reasonable stance that these are indistinguishable. So, rate constants
get combinatorial factors that come from true multisets of sites (such
as the carbons in the benzene) and from identical interaction
partners, based on the stance that a collision is an unordered pair.
Now what is so inconsistent about that? In fact, the PV exactly
reproduces your case. It would immediately take into account that
there are two sites of action in A(x,y),A(x,y) -> A(x!1,y),A(x,y!1)
[where I'm abusing Kappa to write a chemical reaction equation...].
However, it would divide by 2 because of identical reaction partners.
The net effect is that the divison by 2 cancels the factor 2 from the
identical action sites. It comes out exactly as you assert in your
Mechanism View.

Seen from this angle, oddities, if any, come from Kappa, which
introduces embeddings, a non-physical concept made necessary by the
representation of patterns as site graphs and the semantics of rules.
The translation from PV into Kappa simply reflects two different
representations. The clarity that your mathematics brings, it brings
to Kappa, because it is Kappa-specific. The PV makes sense on its own;
it doesn't need Kappa to give it meaning.

As an aside, the SSA is a poor man's PV that deals, originally, only
with reactions; that is, with proper names; no structure. Thus the
consistent division by autos, which by definition can only be inter-
autos. Your situation of a double action site cannot even arise in the
SSA.

Maybe what you are reacting to is the application of these conventions
to Kappa, that is, treating Kappa rules as if they were about SSA or
PV objects. That I do understand. I am not advocating at all that
Kappa-the-formalism should use any of these conventions - it clearly
should use the mathematics appropriate to Kappa! I'm only advocating
that these conventions be available in the simulator so that people
can use Kappa for purposes in which these conventions make sense.

In sum, I believe that the PV is actually exactly what you have
arrived at as the Mechanism View. The SSA is not the PV; it is the
coarsening of the PV in which the structure of objects is not
accessible. It is OK, because your case does not arise; but of course
it is a stretch to apply it to Kappa, in which objects do have a
structure.

I could imagine a PV in which the objects are "reaction centers". The
question then may arise how to embed these reaction centers into
actual molecules for the purpose of getting the factors to convert
"per reaction center" rate constants into reaction rate constants.
That would generate an embedding concept like in Kappa. There is no
question in my mind that your case would be treated as two reaction
centers. The PV always looks at the left and the right of a reaction,
or else it could not determine reaction coordinates! I thus venture to
claim that the PV is Russ's MV. Can't get more consistent than that,
no?

w



Vincent Danos

Sep 13, 2010, 9:44:11 AM
to Walter, kappa-developers


On Mon, Sep 13, 2010 at 1:22 AM, Walter <walter....@gmail.com> wrote:
@ Vincent: the "transcription" example is nonsense. The Physical View
or the Empirical View, as I summarized them, were subordinate to the
Kappa View.

Hi Walter

I must say at the moment I understand little about your long message which is very rich and probably deserves some slow-paced studying.  So I was only responding to you saying that reactions are the only type of rules considered by the PV. This bit of your message to be precise:
    "Thus the activity of *reaction rules* (the only type of rules considered by the Physical View) ..." [my italicizing]
What did you mean? 
Essentially, I oppose the above statement if it means that any situation ultimately (in this view or another) can be made sense of in terms of reactions - like the gold standard (reaction) backing the US dollar (rule).

In the case of transcription (and any significantly solid state polymer chemistry) there are no reasonable reactions, just rules, and I am saying this ontologically, in any view of the world (that does purport to say something about transcription obviously). So I don't understand your reaction to my remark.

I am not sure how this is connected to anything concrete though, so perhaps this is matter that we don't have to agree on.

Vincent

Eric Deeds

Sep 13, 2010, 4:13:13 PM
to kappa-developers
So, first off, thanks to Russ for reposting this thread here on
kappa-developers so that I could read it!

I have to concur with Vincent that there is a huge amount of stuff to
digest in this thread, and I have not had enough time to do so. I
have only just come out from under this avalanche of text, so what
follows is just a few brief reactions and impressions.

For one, we have to not get too far ahead of ourselves. Cells contain
molecules, and those molecules undergo chemical reactions. The
process of transcription, for instance, involves thousands of chemical
reactions--if we choose to coarse-grain them in some model, fine, but
that does not mean that it is generally improper to posit or consider
their existence. I mean, people have done many, many experiments on
the transcription reactions: they have watched them with spectroscopy
and in iterative crystal structures, they have characterized them in
living cells, they have probed the various transition states that
occur as each step in transcription happens. I have no idea what
Vincent is saying when he says that we cannot make sense of this type
of thing as a set of chemical reactions. That is its very nature. I
am probably misunderstanding something here.

I am also not sure why it does not make sense to view things
extensionally when thinking about modeling physical systems using
kappa--after all, we are talking about counting embeddings, which
seems an inherently extensional endeavor to me. In a simple example,
say we have a rule A(x),A(x) -> A(x!1),A(x!1). We take the A's on the
LHS and call them A1 and A2. Say we have two A(x) patterns in the
soup. The agents in each of these patterns is labeled; say A234 and
A555. We have four embeddings: (1->234,2->555),(1->555,2->234),(1-
>234,2->234),(1->555,2->555).

Now, which of these extensional embeddings do we want to keep?
Obviously, we don't want the last 2; they violate our physical
intuition to such a degree that we don't like to consider them valid
embeddings at all; we treat them as a mistake and introduce "clashes"
to deal with them. What about the others? Well, the SSA person (call
him Gibbs) thinks there is something fishy going on here, as he thinks
of these "A" thingies as being fundamentally indistinguishable. In
his view, we introduced an extensional fiction in order to do this
simulation--that is, we imagined we could come up with a single
consistent labeling scheme for A's in the mixture. What Gibbs wants
to do is consider embeddings only up to a permutation of labels, since
the labels were arbitrary in the first place. In essence, he says
that the activity of this rule should be defined by some rate constant
times the number of equivalence classes of embeddings under
permutation.
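
Eric's counting, spelled out in a few lines of Python (labels are just
the toy ones from his example):

from itertools import product

soup = ["A234", "A555"]                         # the two A(x) agents
maps = list(product(soup, repeat=2))            # A1,A2 -> soup: 4 maps
embeddings = [m for m in maps if m[0] != m[1]]  # drop the 2 clashes
gibbs = {frozenset(m) for m in embeddings}      # quotient by label swap
print(len(maps), len(embeddings), len(gibbs))   # prints: 4 2 1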

Now, Vincent and Russ seem to be arguing that Gibbs is more than just
stubborn--his view is not self-consistent within the mathematical
framework of Kappa. One example that is floating around is
A(x,y),A(x,y) -> A(x!1,y),A(x,y!1). In this case, I think Gibbs is
still cool--he does not want Kappa to take care of site-based reaction
constant things. Gibbs is a chemist so he already thought of all that
and he has a good rate constant in his hand. He just wants to make
sure you aren't permuting labels on him. I'm not sure what Walter
wants, but my SSA-loving Gibbs is happy with the way the simulator
would deal with this rule under our normal SSA convention.

I apologize if the above is too tongue-in-cheek, and I honestly do not
want to be inflammatory in any way. I would just absolutely love to
understand why Vincent and Russ have the only mathematically
consistent conventions in this discussion. Any help (with examples
not left to the reader :) ) would be appreciated!

Eric


Vincent Danos

Sep 13, 2010, 6:54:10 PM
to Eric Deeds, kappa-developers
Hi Eric

Eric> I have no idea what Vincent is saying when he says that we
Eric> cannot make sense of this type of thing as a set of chemical
Eric> reactions. That is its very nature. I am probably
Eric> misunderstanding something here.

I am just saying that no description of transcription takes into account the 
*entire* DNA molecule the transcription is taking place on - transcription is
explained in terms of an unspecified context - technically we don't call this a 
reaction. 

I am saying even that it is
a feature of transcription not to be a reaction but a rule (one that Darwinian
forces presumably had to fight hard for). I have heard a lecture about the
fine-grained modeling of transcription, base-by-base with aborts, fall-offs,
backtracks etc - but none of this model posits a definite DNA sequence;
they have Kappa contexts, as opposed to the weaker Petri net contexts
of reactions (stricto sensu).

unless you use reaction in a different way, eg as in a polymerisation reaction

actually I would say that all the "information carriers" theme is a good illustration
of the possible radical inefficiency of the concept of reactions (stricto sensu) in biology 
[a subject on which Walter has written a beautiful article circa 1996]

V.

Vincent Danos

Sep 13, 2010, 6:55:31 PM
to Eric Deeds, kappa-developers
Hi Eric (again!)

I am replying to the technical part now:
 
I am also not sure why it does not make sense to view things
extensionally when thinking about modeling physical systems using
kappa--after all, we are talking about counting embeddings, which
seems an inherently extensional endeavor to me. In a simple example,
say we have a rule A(x),A(x) -> A(x!1),A(x!1).  We take the A's on the
LHS and call them A1 and A2.  Say we have two A(x) patterns in the
soup.  The agents in each of these patterns are labeled; say A234 and
A555.  We have four embeddings: (1->234,2->555), (1->555,2->234),
(1->234,2->234), (1->555,2->555).

Now, which of these extensional embeddings do we want to keep?
Obviously, we don't want the last 2; they violate our physical
intuition to such a degree that we don't like to consider them valid
embeddings at all; we treat them as a mistake and introduce "clashes"
to deal with them.  What about the others?  Well, the SSA person (call
him Gibbs) thinks there is something fishy going on here, as he thinks
of these "A" thingies as being fundamentally indistinguishable.  

in this case they are indeed indistinguishable in the sense of Russ, because
the lhs automorphism that exchanges the As preserves the rule action

but in some other cases an lhs automorphism will not have this property

the example I was proposing is
A(x,z~0),A(x,z~0) -> A(x!1,z~0),A(x!1,z~1)
equating event 1= (1->234,2->555),
and event 2 = (1->555,2->234) violates the Russ convention
since they lead to different flipping actions
flip(234@z) for event 2 and flip(555@z) for event 1

this means we can find contexts that will distinguish these two flips
(though some will not, which means that the transition graph is not
simple - which is a problem for describing equilibria with energies btw);
we can make one up by assuming that A has an additional site t which is 
- free in A234
- and bound to some B(x) in A555

so that on the ground we get either
B(x!1),A555(t!1,x,z~0),A234(t,x,z~0) -> B(x!1),A555(t!1,x!2,z~0),A234(t,x!2,z~1) = y1
B(x!1),A555(t!1,x,z~0),A234(t,x,z~0) -> B(x!1),A555(t!1,x!2,z~1),A234(t,x!2,z~0) = y2
y1 and y2 are not isomorphic, they are different states of the world;
so equating event 1 and event 2 violates the one-event-one-transition principle

I think that Russ's quotient is maximal such that the principle is safe if one allows oneself to increase interfaces at will

NB: note that the two rules 
A(x,z~0),A(x,z~0) -> A(x!1,z~0),A(x!1,z~1)
A(x,z~0),A(x,z~0) -> A(x!1,z~1),A(x!1,z~0)
are isomorphic
here we are disputing something different, namely how
the spaces of matches of these rules should be quotiented

Jérôme

Sep 14, 2010, 9:04:39 AM
to kappa-developers
Wow! Things are going fast.

To reply to Russ's question (the first one :)).
The answer is yes, I would like to have an exhaustive list of the
conventions we would like to support,
and I am happy with the answer.

I would like to brainstorm a little bit about how to make it work
(from a software engineering / syntax point of view).

My point is that if we want people to share Kappa modules, we should
allow the sharing of modules using different conventions.

How do we do that?

- Just using a command-line option is not reliable, because it would
only work if all modules use the same conventions.

- We could use directives such as #ssa, meaning that from there and
until the next directive we use the ssa convention. Or we could
use different symbols for the rates:

lhs -> rhs @ssa k
lhs' -> rhs' @method' k'.
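
(To make the directive style concrete as well, a hypothetical fragment
- none of this syntax exists yet:

#ssa
A(x),A(x) -> A(x!1),A(x!1) @ k1
#mechanism
A(x!1),A(x!1) -> A(x),A(x) @ k2

where each directive sets the convention for the rules that follow.)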

I do not have any preference, I just want to raise this issue, and see
what you think.

Jérôme

Sep 14, 2010, 9:08:27 AM
to kappa-developers
***
With respect to the conventions, and the arguments for them:

I am quite skeptical about the arguments for and against each of
the conventions.
This must be due to my lack of background in physics and/or biology.

- With respect to experiments, I do not really care if we measure k,
k/2, or k/4, as long as we have the conversion formula.
(Isn't it the same for any kind of convention?)

- With respect to the symmetries (intra/inter/gained/lost), I do not
really understand why symmetries which would be lost in the rhs are
important if they are not lost in a given application of a rule.
Although I understand that, from a CS point of view, we want to
stick to the locality principle, since it would be a nightmare to
define any semantic notion if the weight of an event depends on the
context of application; but that is another kind of consideration,
isn't it?

The only thing which matters to me, as a CS guy, is that the formula
is local (does not depend on the context of application of a rule),
gives nice structure to semantic notions (such as the notion of
test), and allows the design of efficient abstractions for the
quantitative semantics.

Walter

Sep 14, 2010, 1:28:48 PM
to kappa-developers
I'd like to tidy up the loose end I left hanging on the "ontology of
Kappa"... Nothing really serious, but perhaps nonetheless useful to
mull over on occasion. I have the water up to my chin right now, as
I'm soon leaving on a little trip. I'll try to get to it in the next
days - unless you implore me to stop wasting your time. Apologies for
the length of my previous messages - that happens when I'm in a rush.
Compression requires thinking, which requires time...

w

Eric Deeds

Sep 14, 2010, 5:26:49 PM
to kappa-developers
Vincent,

In reply to the thread about transcription, I now see what you are
saying. I agree with you, mostly, although I might put it in slightly
different terms.

In the cell, or a test tube, or wherever, the transcription event that
is happening is happening as a reaction in the strict sense. We can
measure or control the sequence of the DNA molecules present, and in
so doing perceive and understand the entirety of the context in which
transcription takes place. The fact that we don't need to include,
say, the entire sequence of DNA in order to have a good model of
transcription is great, but it does not derive from the fact that
transcription is "rule-based" a priori. It has to do with the fact
that long DNA polymers floating in salt buffers inhabit a very
dissipative environment, and so ignoring these details results in a
very good approximation.

I really like your observation that, in order to evolve information-
carrying objects, living systems had to come up with systems for which
this type of approximation will work well. It is a very good point,
but it does not mean that representing transcription systems using
rules is not relying on an approximation of reality, however good that
approximation may be.

Eric

Russ Harmer

Sep 14, 2010, 6:01:13 PM
to Eric Deeds, kappa-developers
Eric> We can ... perceive and understand the entirety of the context in which transcription takes place.
 
But the enzyme doing the transcription can't. So the rule is only "approximating" reality because it doesn't mention the stuff that the mechanism doesn't depend on?

As a Harvard professor once wrote, "understanding systems requires resisting the temptation of adopting the view of an outside observer".

r:)

Eric Deeds

Sep 17, 2010, 10:09:29 PM
to kappa-developers
Russ,

I don't think that my statements require adopting the view of an
outside observer. I was just pointing out that the entire sequence is
indeed there, in the test tube or cell or nucleus or whatever.

It is of course true that the polymerase cannot "see" very far...the
screening length in salt water is so short, and the motions of these
molecules so over-damped, that there is absolutely no chance you could
ever get long-scale electrostatic or mechanical coupling. We should
not, however, interpret this fact as implying that the polymerase
cannot possibly ever know about sequences distal from it.

That is due to the fact that there are (rare) configurations in which
these distal sequences are close enough to the polymerase in space
that they could exert an influence. There is an entropic barrier
here, sure, but that does not mean that the influence in question is
not important. In fact, this is precisely the mechanism underlying
enhancer elements, which are sequences that influence transcription
despite being hundreds of thousands of base pairs up- or downstream of
the gene in question.

By taking a rule-based approach here, we are essentially adopting a
mean-field approximation. I agree that it is a good approximation;
aside from things like enhancers, most DNA elements that happen to
wander near the polymerase aren't going to have much influence, so we
can (implicitly) average them out in producing our rules and still
have a useful description. This is different, however, from saying
that the polymerase "can't know" about the existence of these
sequences a priori; it can.

Eric


Russ Harmer

Mar 3, 2011, 10:48:22 AM
to Jérôme, kappa-developers
I don't think we ever came to a conclusion about this?

I would still like the command-line option -- it would often be sufficient. But Jerome is right that it is not general enough; I don't like the idea of annotating every rule individually, but am perfectly OK with directives.

@Jean: what does KaSim currently do in this regard?

Cheers
russ

Jean Krivine

Mar 3, 2011, 11:14:55 AM
to Russ Harmer, Jérôme, kappa-developers
Hi

KaSim for the moment simply adopts the least committed posture: no division whatsoever.

Russ Harmer

Mar 3, 2011, 11:18:43 AM
to Jean Krivine, Jérôme, kappa-developers
I mean, is there any facility for changing convention or is it currently all up to the user?


Jean Krivine

Mar 3, 2011, 11:19:21 AM
to Russ Harmer, Jérôme, kappa-developers
No, everything is up to the user.

Russ Harmer

Mar 3, 2011, 12:30:30 PM
to Jean Krivine, Jérôme, kappa-developers
OK. So what do you think about Jerome's ideas? It would be nicely 'user-friendly', especially now that KaSim uses a unique, never-seen-before convention :)


jean.k...@gmail.com

Mar 3, 2011, 12:39:54 PM
to Russ Harmer, Jean Krivine, Jérôme, kappa-developers
Which one? Something like
%set: ssa
L -> R @ k
for the ssa convention after the instruction?

If you have some time, could you write a description of what you would like on the issue page?


Russ Harmer

Mar 3, 2011, 12:49:06 PM
to jean.k...@gmail.com, Jérôme, kappa-developers
Well, I was wondering if you had any ideas or preferences? We can discuss these things before filing GitHub issues!

