Walter's message:
I had some discussions with Russ on this.
First off, there are many possible ways of slicing this pie - the four
listed by Russ each have a reasonable narrative. Some of these
narratives are formal, some physical. I think we need to be
pluralistic on this and leave it up to the user by enabling several
options, as Russ argues.
But first let me suggest a renaming of these scenarios (though not
necessarily names for the actual command line option). It makes no
sense to let private language slip into option names. Thus, --eric
makes no sense to my mother (though it may make sense to Eric's).
- event-based convention (Vincent's convention): event = embedding of LHS
- physical convention (SSA convention): event = embedding of LHS / all
automorphisms of LHS
- empirical convention (Eric's convention): event = embedding of LHS /
product of automorphisms of each connected component (i.e. divide by
intra-symmetries only)
- mechanism-based convention (Russ's convention): event = embedding of
LHS / all automorphisms of LHS preserved by the action
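To see the four conventions side by side on one concrete case, here is a
small Python sketch. It is purely illustrative (not a feature of any Kappa
tool), and the embedding and automorphism counts are written down by hand
rather than computed from the site graphs:

def event_counts(n_embeddings, aut_lhs, aut_components_product,
                 aut_preserved_by_action):
    # All counts are supplied by the modeler for the rule at hand.
    return {
        "event-based (Vincent)":    n_embeddings,
        "physical (SSA)":           n_embeddings / aut_lhs,
        "empirical (Eric)":         n_embeddings / aut_components_product,
        "mechanism-based (Russ)":   n_embeddings / aut_preserved_by_action,
    }

# Example: r: A(x!1), A(x!1) -> A(x), A(x) applied to a mixture holding one
# A-A dimer: 2 embeddings, |Aut(LHS)| = 2, a single connected component (so
# the intra-symmetry product is also 2), and the swap of the two A's is
# preserved by the unbinding action.
print(event_counts(2, aut_lhs=2, aut_components_product=2,
                   aut_preserved_by_action=2))
# every convention except the event-based one sees a single event here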
The scenarios divide into how much structure your preferred view of
the world sees in collisions (events) between complexes. Since Kappa
sees events as site graphs induced by left hand sides of rules (but
see item 4 below), a translation may have to be made from your
preferred view into Kappa's and vice versa. In all of this, the
principal subtlety is to remain fully aware of the meaning of a rule
(to which Russ contributed great hygiene). The translations that were
begun by Vincent (via neutral refinement), then continued by Russ and
Eric by squaring off physical/empirical and formal views, helped us
understand the "meaning of a rule" better.
Here's my poor man's take on this thread, which may need considerable
cleaning. I'm sure you all know this backwards and forwards, so excuse
the repetitiousness.
(0)
"Physical Events" come in two flavors:
- reactions between objects whose structure is "fully resolved" with
regard to the level of description set by site graphs.
- reactions between proper names, in which objects are black boxes;
i.e. the way ODEs or simple Petri Nets represent the world.
"Formal Events":
- embeddings of site graphs into a mixture, which, in Kappa, is an
object that cannot be further refined given a particular contact map.
(Such objects also appear on the LHS and RHS of what we call "reaction
rules".)
(1)
The Physical View knows about the internal structure of objects and
"fully resolves" events and objects up to symmetry. Thus the activity
of *reaction rules* (the only type of rules considered by the Physical
View) is divided by autos of objects *and* autos between objects
(symmetries of Physical-Event configurations). That's the standard
case and assumes that, conceptually, rate constants are "first-
principles rate constants".
The translation from the Kappa viewpoint is subtle, however, because
it already involves a take on the "meaning of a rule". That take was
clarified by Vincent by appeal to the notion of a neutral refinement.
In essence, given a rule, the set of all reaction rules derived from
it must be a neutral refinement. For a reaction rule to be a (member
of a) neutral refinement, the reaction rule must *ignore* any
symmetries it breaks or makes compared to the original rule; otherwise
it would contradict the mechanism asserted in the rule. For example,
r: A(x!1), B(x!1) -> A(x), B(x) @ k
might refine into a set of reaction rules of which one might read like
s: A(x!1,y!2), B(x!1), A(x!3,y!2), B(x!3) -> A(x!1,y!2), B(x!1),
A(x,y!2), B(x) @ j
The rule is asymmetric, but the reaction rule is symmetric. The point
is that if the rule asserts what you believe, the mechanism implied
must be blind to the symmetry of the reaction rule. To translate that
into a rate constant for s requires knowing the convention adopted by
the simulator implementation. Assume that convention is "divide the
number of embeddings by all autos of the LHS". I want the firing
probability of each Formal Event (embedding) of s to be exactly the
same (i.e. neutral refinement) as the firing probability of each
Formal Event of r. Thus, under the implementation convention, I must
set j = 2k, to compensate for what the simulator does. In so doing,
each Formal Event of s will be firing at the same rate as each Formal
Event of r, namely k.
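As a sanity check on that arithmetic, a tiny Python sketch (illustrative
only; the convention assumed is the one above, i.e. activity = rate constant
times number of embeddings divided by all autos of the LHS, so each embedding
fires at rate / |Aut(LHS)|):

def per_embedding_rate(rate_constant, aut_lhs):
    # firing rate of one Formal Event under the divide-by-all-autos convention
    return rate_constant / aut_lhs

k = 1.0       # rate constant of r (arbitrary value)
j = 2 * k     # rate constant chosen for the refinement s

# r: A(x!1), B(x!1) -> ...                       |Aut(LHS)| = 1
# s: the refinement in the A-A dimer context     |Aut(LHS)| = 2
assert per_embedding_rate(k, aut_lhs=1) == per_embedding_rate(j, aut_lhs=2) == k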
Same thing in case r is symmetric and the refined reaction rule breaks
the symmetry.
r: A(x!1), A(x!1) -> A(x), A(x) @ k
s: A(x!1,y!2), A(x!1), B(y!2) -> A(x,y!2), A(x), B(y!2) @ j
Rule r asserts that the mechanism it describes cannot detect
asymmetries, even if the larger context in which it might be embedded
may contain such asymmetries; the mechanism is blind to anything
beyond A(x!1), A(x!1). For this to hold in s, j must be set to k/2. In
that case, as before, every Formal Event of r and every Formal Event
of s fire with the same likelihood (hence neutral refinement), namely
k/2 (assuming the implementation convention above). The blemish that
Vincent dislikes is that we don't notate the reaction rate constant
per "Formal Event" (k/2) but per Physical Event (k). But it only
shows up when you think through a refinement. If you do, you are
likely to be a user who understands the framework and there should be
no problem. If you want to think in terms of Physical Events at the
level of rules, as most do, there is no blemish (as you wouldn't
reason in terms of neutral refinements). The Physical Event convention
makes pragmatic sense.
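Same bookkeeping, in miniature, for the symmetric rule and its
symmetry-breaking refinement (illustrative only, same assumed convention):

k = 1.0
j = k / 2
rate_per_formal_event_r = k / 2   # |Aut(LHS of r)| = 2: the two A's can be swapped
rate_per_formal_event_s = j / 1   # |Aut(LHS of s)| = 1: the B breaks the symmetry
assert rate_per_formal_event_r == rate_per_formal_event_s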
The Formal Event view is important for theorizing about a system,
because the combinatorial factors are made visible in the translation
to the Physical Event view. These factors are key in understanding the
relative weights introduced by a non-neutral refinement (when you
choose whatever rate constants you wish) - these biases being measured
relative to the contribution of each refinement at the neutral point.
This story iterates further when going from reaction rules to
reactions, i.e.
r: A(x!1), A(x!1) -> A(x), A(x) @ k
refines into the reaction rule
s: A(x!1,y!2), A(x!1), B(y!2) -> A(x,y!2), A(x), B(y!2) @ j=k/2
which translates into the reaction:
t: Eric() -> Jean(),Russ() @ m
There is no change in the rate constant vis-à-vis s (both lack
symmetry and that was accounted for with j=k/2). Thus m = k/2. The
comparison between the rate constants of r and t makes the Physical
View explicit.
(2)
The Empirical View works with empirical rate constants, which absorb
all combinatorial factors. This is the case that Eric is concerned
with. However, read carefully how I portray it, because I'm not sure
this is really what he is saying (but it is the only way I could make
sense of it). The Empirical View inherits from the Physical View the
stance that Physical-Event configurations are up to symmetry, but it
asserts that that symmetry has already been accounted for by the
*measurement* of the rate constant.
Thus, let the k in
r: A(x!1),A(x!1) -> A(x),A(x) @ k
come from experiment. Since experiment does not resolve "A(x!1),A(x!1)"
but only "Jean()", the k is "per Physical Event". Hence it is OK to
have it divided by 2 in the simulator, since the simulator does not
see "Jean()" but "A(x1),A(x!1)" and counts two Formal Events,
overcounting relative to the physical measurement.
Likewise, in
r: A(x),A(x) -> A(x!1),A(x!1) @ j
Eric argues that the measurement sees "A(x),A(x)" as "Bob(),Bob()" and
implicitly makes the division by 2 already in delivering the
macroscopic rate constant, as the measurement cannot resolve the
ordering of the Bob()'s. Thus the empirical j is already adjusted
and *no* division by 2 should be made *by the simulator*
(alternatively, Eric would multiply j by 2 to offset the simulator
action).
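To make that prescription concrete, here is a hypothetical little converter
(not a feature of any existing simulator; it just encodes Eric's reading as
I portrayed it): multiply the empirical constant by |Aut(LHS)| divided by the
product of the autos of the connected components, so that the simulator's
division by all autos nets out to a division by intra-symmetries only.

def ssa_constant_from_empirical(k_empirical, aut_lhs, aut_components_product):
    # aut_lhs: all automorphisms of the LHS
    # aut_components_product: product of automorphisms of each connected
    # component of the LHS (intra-symmetries only)
    return k_empirical * aut_lhs / aut_components_product

# Dissociation  A(x!1),A(x!1) -> A(x),A(x):  one component, |Aut| = 2
print(ssa_constant_from_empirical(1.0, aut_lhs=2, aut_components_product=2))  # 1.0 - leave k alone
# Association   A(x),A(x) -> A(x!1),A(x!1):  two components, each with |Aut| = 1
print(ssa_constant_from_empirical(1.0, aut_lhs=2, aut_components_product=1))  # 2.0 - multiply j by 2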
In summary, Eric argues that the Empirical View cannot see any
structure *internal* to the objects - it only sees proper names (thus
his reference to ODEs). As a consequence, any reference to internal
structure by Kappa must be undone - either the simulator already does
it by the SSA convention as in the case of the dissociation, or you
have to do it, as in the case of the association. However, Eric wants
an option that does it automatically.
(3)
Vincent's view is the pure Formal Event view. I think its meaning is
clear from the discussion in (1). It is theoretically very clean. Its
value shows when you do theory with Kappa, but it is too "disruptive"
if one simply wants to do simulations using the traditional Physical
or Empirical world views.
(4)
Now, finally, the Mechanism-based view of Russ. That view clarifies
one thing about the meaning of rules that was not yet clarified by the
neutral refinement take: the fact that in A(x,y), A(x,y) you may have
*two* bona fide mechanistic paths for a reaction. If the reaction is:
r: A(x,y), A(x,y) -> A(x!1,y), A(x,y!1)
then there are two events, since there are two bonding opportunities,
as there are two (x,y) pairs. This is the analogue of the BNGL case of
b: A(s~0,s~0) -> A(s~1,s~0)
where agent interfaces are multisets.
The issue arises because now "symmetry" reaches into what the
interface "can do", which you can't know unless you look at the RHS of
the rule. Either we redefine the notion of "Formal Event" (and
accordingly adjust all translations in (1) and (2), and add a translation to
(3)) or we treat this as another view (my preference), which I would
call the Mechanism-Based view, because you have to look at the RHS to
assess the full mechanism and discover that the LHS has two "Sites of
Action". I can see the physicality of it.
In any case, options are really very much needed, along with the
Russetta Stone.
w
(By the way, not sure whether you knew that BNGL makes its reaction
rate dependent also on the molecularity of the products, which is
biophysically not entirely unwarranted, because of entropic effects -
think explosion...)