Linear Types with Conditionals?


David Barbour

Oct 7, 2013, 11:53:12 PM
to reactiv...@googlegroups.com
In Sirea, I modeled conditional operations using a variation on ArrowChoice. A decision expression would return an (a :+: b) type, which corresponds to `Either a b`. Programmers could then operate on just one branch or another, using:

     left :: (a ~> a') -> ((a + b) ~> (a' + b))
     mirror :: (a + b) ~> (b + a)
     assocls :: (a + (b + c)) ~> ((a + b) + c)
     -- and other plumbing ops

This design has some very nice properties. It keeps the choice open for composition and extension. It separates the responsibilities of expressing the condition and expressing how it is handled. When compiled, the tags and data plumbing are eliminated entirely.
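As a concrete reference point, these plumbing ops can be sketched as ordinary Haskell functions over `Either` (a simplification: plain `->` stands in for the behavior arrow `~>`, and none of the spatial-temporal attributes are modeled):

```haskell
-- Sum-type plumbing as plain Haskell functions over Either.
-- (a + b) is modeled as Either a b.
left :: (a -> a') -> Either a b -> Either a' b
left f (Left a)  = Left (f a)
left _ (Right b) = Right b

mirror :: Either a b -> Either b a
mirror (Left a)  = Right a
mirror (Right b) = Left b

assocls :: Either a (Either b c) -> Either (Either a b) c
assocls (Left a)          = Left (Left a)
assocls (Right (Left b))  = Left (Right b)
assocls (Right (Right c)) = Right c
```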

More recently, I moved to a tacit concatenative model. Initially, the same conditional model applied, in minor variations. The argument to `left` is now part of the environment.

    left' :: (a -> a')*(a+b) -> (a'+b)

This has some nice properties compared to Hughes' arrows model. Haskell arrows require binding the behavior at the call site unless we use a separate stage of computation (such as construction in the lambda calculus) for abstraction. While I like staged computation, the distribution of parameters seems to entangle subprograms, hindering decomposition and complicating the program. 

In practice, 'left' is generally applied in conjunction with 'first'. Optimizing for the common case, I actually want a sort of `firstleft` operation as the primitive, focusing on just one choice in a product-structured environment:

    left'' :: (a -> a')*((a+b)*e) -> ((a'+b)*e)
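
In the same Haskell sketch notation (pairs modeling the product-structured environment; the concatenative stack discipline itself is not captured), the two tacit variants look like this:

```haskell
-- Tacit-style conditional: the function to apply is part of the
-- environment (the first slot of a pair), not bound at the call site.
left' :: (a -> a', Either a b) -> Either a' b
left' (f, Left a)  = Left (f a)
left' (_, Right b) = Right b

-- The fused `firstleft` primitive: focus on the left branch of a sum
-- sitting in the first slot of a product-structured environment.
firstLeft :: (a -> a', (Either a b, e)) -> (Either a' b, e)
firstLeft (f, (s, e)) = (left' (f, s), e)
```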

Anyhow, I was pretty happy with this design until recently. Two factors I find worthy of concern:

First, I'm trying to tighten up for developing a bytecode. Operations for a sum type do not seem minimal. For example, I can model a similar effect using ("left" * x) vs. ("right" * y), plus some form of conditional application. This reasoning doesn't seem convincing to me, though I haven't been able to put my finger on why not; I only have an intuition that there is much value in a dedicated sum type.
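To make the comparison concrete, here is a rough sketch of the tagged-pair alternative (names hypothetical). Note that in a typed host language both payloads must share a type, which is one thing a dedicated sum type buys:

```haskell
-- Hypothetical tagged-pair encoding of a choice: ("left", x) vs ("right", y).
type Tagged v = (String, v)

-- Conditional application: apply f only when the tag matches.
applyIf :: String -> (v -> v) -> Tagged v -> Tagged v
applyIf tag f (t, v)
  | t == tag  = (t, f v)
  | otherwise = (t, v)
```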

Second, use of `left` seems problematic for linear types, since it will consume the `(a -> a')` behavior even if `b` is active. A linear typed function must be used on both branches of sum, which suggests the behavior instead be:

    left''' :: ((((a->a')*a)+b)*e) -> ((a'+b)*e)

This variation forces distribution of a potentially linear function prior to applying it. Of course, if I do this then I need some way to deep-manipulate the structure - e.g. to model the earlier left, I need to drop after distribution, and it would be counterproductive to distribute more functions to drop the first ones I distributed.
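Continuing the Haskell sketch, `left'''` and the distribution it forces look like this (`distrib` is the extra step a caller must perform first; recovering the earlier `left'` then means dropping the copy that landed on the right branch):

```haskell
-- left''': the function must already be distributed into the sum, so a
-- linear function is only present on the branch where it gets used.
left''' :: (Either (a -> a', a) b, e) -> (Either a' b, e)
left''' (Left (f, a), e) = (Left (f a), e)
left''' (Right b, e)     = (Right b, e)

-- Distribution: push a value (here, the function) into both branches
-- of the sum. A linear value then demands consumption on both sides.
distrib :: (f, (Either a b, e)) -> (Either (f, a) (f, b), e)
distrib (f, (Left a, e))  = (Left (f, a), e)
distrib (f, (Right b, e)) = (Right (f, b), e)
```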

One possibility that has amused me involves rebuilding symmetry. We can operate on products as part of a sum and vice versa.

    first :: ((((a->a')*a)*b)+e) -> ((a'*b)+e)
    left  :: ((((a->a')*a)+b)*e) -> ((a'+b)*e)
    swap ::  ((a*b)+e) -> ((b*a)+e)
    mirror :: ((a+b)*e) -> ((b+a)*e)
    assoclp :: ((a*(b*c))+e) -> (((a*b)*c)+e)
    assocls :: ((a+(b+c))*e) -> (((a+b)+c)*e)
    ...

This design has the potential to simplify deep operations and reduce reliance on (a->a') blocks. But it also seems complicated: difficult to extend if I ever want a new structural type (e.g. `&` for additive conjunction), and difficult to document and motivate. So at the moment I don't take this possibility very seriously.

Getting back to the first point, modeling conditional data using a product (tagged values) isn't anything new, and in some ways seems simpler than dealing with a dedicated sum type. I believe that dependently typed pairs can often be inferred from a program, at least for common cases when the tags are simple and have predictable structure.

Yet I am left with a question: how could I model conditional application in the presence of linear typed functions, while keeping it simple, open, extensible, and compositional like the earlier versions of `left`? A related question is whether all applications should be conditional, with the regular variation of 'first' simply declaring a true condition.

At the moment I don't have answers for this, just thinking out loud a bit and documenting the concern. But if anyone wants to chime in, I'm open to ideas.

Best,

Dave


Ross Angle

Oct 8, 2013, 12:30:30 AM
to reactiv...@googlegroups.com
Hi Dave,

I'm thinking you don't actually need to worry about consuming the function most of the time. If the function must be used once, then you have to distribute it to the branch you want to use it on--but you must also consume it on the other branch. :) And if you want to consume it zero or one times, then you could just model it as a ticket that can be redeemed for either nothing or a function. The ticket would be an additive conjunction (1 & the function).
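
A loose Haskell analogy for that ticket (an assumption on my part: this captures only the *shape* of additive conjunction, since Haskell's type system does not enforce the choose-exactly-one-side discipline):

```haskell
-- Additive conjunction (1 & (a -> b)) sketched as an offered choice:
-- the consumer eliminates the ticket by picking exactly one projection.
-- Linearity itself is not enforced here.
data Ticket a b = Ticket
  { discard :: ()        -- choose the 1 side: use the function zero times
  , redeem  :: a -> b    -- choose the function side: use it once
  }

mkTicket :: (a -> b) -> Ticket a b
mkTicket f = Ticket { discard = (), redeem = f }
```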

You seem to be wondering how you can distribute the "drop" axiom into the other part of the computation without again having to worry about dropping the drop axiom. Even if we're using &, this concern is relevant because we're using an axiom to eliminate &, and we may need to drop that axiom itself. Fortunately, I think this isn't really an issue, because the axioms themselves can be used any number of times. You can just build a system that works for repeatable functions, and then later you can use that system to do linear stuff. :)

Well, I think so anyway. :-p And let me know if I'm not making sense, lol.

Warm Regards,
Ross




Matt McLelland

Oct 8, 2013, 11:22:54 AM
to reactiv...@googlegroups.com
You know, there are a number of similarities between your combinators here and my process combinators which I use to connect ports in a linear 1-1 way with other processes/ports (a notable difference being that I use named ports rather than structural position).   And I could add a way to sum protocols similar to your sum types here.  If I did that, I can see how to make sense of (a + b) as a port satisfying either a or b.   And I could implement something similar to your original left.  I'm not sure how to read your left'.  Which of those types are time-varying?   Does the type (a -> a') -> ... not refer to a time-varying time-varying function? Or are those real function arrows?  Anyway, I don't really know what's going on, but I wonder if Ross wasn't saying that lifting (a -> a') into a sum type should be considered consuming it, even if that branch of the sum type isn't taken.  That feels like the correct behavior to me, even though, again, I don't know what's going on.

David Barbour

Oct 8, 2013, 10:04:26 AM
to reactiv...@googlegroups.com
If I follow correctly, you're suggesting to use the earlier `left' :: (a -> a') * (a + b) -> (a' + b)`, but require the `(a -> a')` behavior be droppable. Then just deal with linear or relevant behaviors as a special case (i.e. requiring developers distribute them first, then using '[apply] left' or similar for deep application). Do I understand correctly? 

Hmm. This is a viable possibility. And I suppose the opposing choice is to distribute-then-drop in the common case, which hardly seems any better. I guess I'll go with the droppable `(a -> a')` approach for now.

I'm still trying to understand why I favor `(a + b)` types more than dependently typed pairs. Maybe something to do with why I favor composition and communication vs. constructors and pattern-matching. (cf http://lambda-the-ultimate.org/node/4659#comment-73675).

Warm Regards,

Dave

David Barbour

Oct 8, 2013, 1:19:00 PM
to reactiv...@googlegroups.com

In all forms of 'left' the (a -> a') is generally a static function. In the programming model I'm developing, every value has implicit spatial-temporal attributes; one might understand `static` as a special space-time location meaning "wherever and whenever this program fragment is".

In Haskell, the `left :: (a ~> a') -> ((a + b) ~> (a' + b))` uses two different arrow types, `->` for pure Haskell functions and `~>` for the arrow being modeled (which generally is not pure). In this case, it is clear that the argument `(a ~> a')` is not only static, but also cannot be distributed through the arrow itself - i.e. it is syntactically bound to the point of application (modulo staged metaprogramming).

The `left' :: ((a -> a')*(a + b))->(a'+b)` is different from `left` because it distributes the `(a -> a')` operation within the arrow. Consequently, I only need one arrow type (->) for all behaviors. (I could just as easily use `~>` for all behaviors; perhaps this would have been more consistent in presence of the Haskell arrows.) The (a -> a') behavior is typically still static. It could potentially be dynamic, though.

...

Though 'static' code is time-varying. It just happens in a higher layer. Live programming, for example.

I'm not sure I understood what you meant by "a port satisfying either a or b". The (a + b) type is selected by the provider, not the receiver. So a port satisfying (a + b) must be capable of handling both option a and option b (though only one at a time). If we had an (a & b) type, additive conjunction, that would be selected by the receiver, in which case the port 'satisfying a' would also satisfy (a & b). Can you help me understand what you meant?

I think Ross was just saying that I should worry about no-drop (linear or relevant) types when they come up, as a special case, and everything will work out. At least, that's what I took away from it. He's probably right. :)

Could you explain a bit more about your process combinators?

Warm Regards,

Dave


Matt McLelland

Oct 8, 2013, 3:48:53 PM
to reactiv...@googlegroups.com
I understood what you meant by (a + b), but my wording was sloppy.  Here's a brief synopsis.

As we've discussed before, the high level view is that a Process is just:

    type Process = Signal -> (Process, Signal)  -- Very coarsely typed

You can sort of see in that simple definition the presence of both an object system (responding to a signal/message and getting a new state) and an effect system (emitting a signal/message and getting a new state).   I think this interplay between objects and effects as understood by just hooking up processes turned out to be pretty powerful.  Because these processes only communicate by sending / receiving signals, they are pure values and it's easy to capture or duplicate or resume them.
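
A minimal runnable version of that coarse type, with `Signal` reduced to a placeholder (the example process and driver below are my own illustration, not Matt's):

```haskell
type Signal = String

-- A Process responds to a signal with a successor state and an output signal.
newtype Process = Process { step :: Signal -> (Process, Signal) }

-- Example: an echo process that tracks how many signals it has handled,
-- carrying its state purely between steps.
counter :: Int -> Process
counter n = Process $ \sig -> (counter (n + 1), show n ++ ":" ++ sig)

-- Drive a process with a list of signals, collecting the outputs.
run :: Process -> [Signal] -> [Signal]
run _ []       = []
run p (s : ss) = let (p', out) = step p s in out : run p' ss
```

Because `counter n` is a pure value, capturing, duplicating, or resuming it at any intermediate state is trivial, which is the property highlighted above.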

The next important thing to say is how they're connected.  They're connected via named ports.  A port is conceptually just an address (an abstract identifier) prepended to a signal.  It's important, though, for security reasons, not to let well behaved processes read mail that's intended for another recipient as it's passing through.  I originally intended that this would be accomplished by polymorphism to hide the types of messages, but after considerations that arose from our discussion on LtU, I realized that this scheme still allowed a man-in-the-middle to count the messages as they went back and forth, which still leaks information. So I fixed this by making the message forwarding mechanism opaque.  The only way to build a Process that dispatches on mail intended for a particular port is to know the protocol of that port, because the combinators that build Processes only allow you to hook up ports that you know are on matching protocols.

A protocol is just the type of a port.  It just keeps track of which signals are allowed at any given time, and how the observed signal changes that state going forward.  Think of it as a state machine.  This is also the only place where I distinguish between male (send-first) / female (receive-first) ports.  There's much more to say here, but I won't get into it now.

So here's simple example code for a very simple case (in pseudo-syntax):

    handle io with object { write msg = ... }
    -- comment
    print 'hello'  --  communicates on port io
    print 'goodbye'

The line 'handle ...' sets up a process listening on port 'io' and connects it to the process that follows.  The process that follows (all of the stuff after 'comment') sends message (io.write 'hello') and then changes to a state where it's listening for a return () and then when it receives that return from the handler, signals (io.write 'goodbye').  Here the dot in io.write is port composition, supporting nested ports. 

I still need to finalize the notation around my port combinators, but they build up processes by:  adding a handler for a port and exposing it in the environment of a child process, connecting two ports in the environment and leaving the remaining environment to a child process, transitioning from one process to another, installing a process to be captured, splitting a port into subports...   and then at the bottom you can have pure processes built by functions.  I think that's all of them that I have now, but I expect to add a few more for various reasons.   Like, semantically, I want to be able to use impolite processes that read mail they're not supposed to, for debugging purposes.

I still need to write this stuff up.

Best,
Matt





Ross Angle

Oct 8, 2013, 5:21:40 PM
to reactiv...@googlegroups.com

Hi all,

David got the point I was making. :) With explicit linear types, here's what I'm imagining for the left operator:

    !((((!(a -o b)) * (a + c)) * s) -o ((b + c) * s))

The two occurrences of ! show which resources are usable any number of times. That is, the entire left operator itself is reusable, and the function we're applying must be reusable. (Technically we only need to use the function zero or one times, and we could express that using additive conjunction as (1 & (a -o b)), but then we'd need operators to manipulate &.)

The types ((!a) -o b) or !((!a) -o b) would almost correspond to RDP's usual (a -> b) behaviors, but there's a caveat. A behavior of type ((!a) -o b) must be used at a specific time offset. RDP's usual behaviors would be more like some kind of (ForAllNonnegativeDurations d. addDelay d !((!a) -o b)), where "addDelay d <type>" does a deep substitution of static time offsets throughout the type. Not to mention, RDP behaviors can have side effects, and David's been talking about adding ripening and expiration times, so these type interpretations might get more complicated depending on what we choose to reason about.


On Oct 8, 2013 8:22 AM, "Matt McLelland" <mclella...@gmail.com> wrote:

> Which of those types are time-varying?   Does the type (a -> a') -> ... not refer to a time-varying time-varying function? Or are those real function arrows?

David's been using the notation (a ~> b) for behaviors while working in Haskell, but I keep wanting to say (a -> b) instead because I think we only need one arrow type for this. Pure function types are a special case of RDP behavior types, and RDP side effects can only occur when certain *input or output* types are involved (namely, signals located at the partition where side effects occur), so purity is still apparent in the behavior's type.

(Actually, I just realized that's not true.... If we support inlined closures like what I've been doing in Underreact, then invoking a closure may enable side effects fueled by those captured signals, even if the closure's own input and output are completely disconnected from that phenomenon. Interesting. I think David saw this coming. ^_^; )


> Anyway, I don't really know what's going on, but I wonder if Ross wasn't saying that lifting (a -> a') into a sum type should be considered consuming it, even if that branch of the sum type isn't taken.

Sorry, the first paragraph of my email was pretty muddled. I started out by saying David wouldn't need to worry about consuming functions, and then I jumped into a quick absurdity argument about how, if we were consuming a function, we would need to make sure we consumed it on both branches of the sum. Then I talked about viable ways to model that. I think I forgot I was aiming for absurdity. XD

Let's try that again: If we have a resource we need to consume exactly once, and if we have a sum we want to distribute that resource over, then we're going to consume that resource on both branches of the sum. So if David wants to have a "left" operator which applies a function on just one branch, then there's no use fretting over the idea that the function could be a linear resource. If it's a linear resource, we'll already need to use it on both branches.

-Ross

David Barbour

Oct 8, 2013, 9:44:39 PM
to reactiv...@googlegroups.com
Thanks for this overview. 

I certainly like managing protocols in a fully typed manner. Linear types are pretty useful for ensuring protocols are completed. I'm not so fond of functions folding or accumulating state over time, but that's a discussion we've had elsewhere. :)

You use the phrase 'child process' a couple times. Is sequential composition (as with 'print') implicitly forming a child process? And 'transitioning from one process to another' - what do you mean by transition? Do you have types for these combinators?

I'll be interested in seeing you write up your model at some point.

Warm Regards,

Dave 

Matt McLelland

Oct 8, 2013, 10:52:46 PM
to reactiv...@googlegroups.com
> Linear types are pretty useful for ensuring protocols are completed.

I wonder if they can express anything that my system can't.  Any ideas for a counterexample?

> I'm not so fond of functions folding or accumulating state over time, but that's a discussion we've had elsewhere. :)

I'm in favor of minimizing state, but I can see some of the allure of pushing state to the edges.  I'd like to see a reactive approach whenever possible, but with these processes driving the updates.

>You use the phrase 'child process' a couple times.

I just meant that many of the combinators act on one or two processes to produce a new process.  I was calling the former 'child processes', but 'combined process' might have been better.  Transition is just a specific combinator that captures a certain signal and uses it to abandon the current process state and move to another one.  So I might write the above example as [ {write: ... }  <=>io  (print 'hello' >>k print 'goodbye') ].   The >>k combinator says "transition on k" where k is the continuation signal.   If you fix a continuation signal, processes form a monad under transition (sequential composition) on that signal.   The X <=>io Y says handle port io of Y with process X.  Finally, the { port: process, ... } is a branch on subports.   

> Do you have types for these combinators?

Yea.  It's mostly straightforward. 

BTW, another goal for these is to use them as a compilation target, after eliminating the more abstract processes and parameter references, etc.  I'm not too far down that road yet.



David Barbour

Oct 8, 2013, 11:41:16 PM
to reactiv...@googlegroups.com
On Tue, Oct 8, 2013 at 9:52 PM, Matt McLelland <mclella...@gmail.com> wrote:
> Linear types are pretty useful for ensuring protocols are completed.

> I wonder if they can express anything that my system can't.  Any ideas for a counterexample?

I can't speak with certainty. I know some interesting things linear types can express/enforce, but I do not know as much about your system. 

My understanding is that you use dependent types and state, which together form a useful 'typestate'-like model where processes are essentially typed by a step-varying set of signals they expect. This enables you to express multi-step protocols (handshakes, lock-unlock, grammars), but I do not see an effective way to enforce that those protocols are completed - as opposed to leaving a protocol hanging indefinitely in an intermediate state. Linear types could help in that role.

Linear types can similarly enforce connectivity relationships between processes, i.e. that an environment process is invoked at least once, or that ports are one-to-one vs. one-to-many vs. many-to-one, or even precise numbers like 1:N or at-most-N:1. (Modeling precise limits on connectivity - fan-in and fan-out - is useful if compiling to hardware or FPGA.)

If you have continuations, linear types can enforce that continuations are followed rather than abandoned (modulo divergence). Structured programming can enforce the same thing, but linear types can do it without the structure... which can be very nice if trying to compose multiple ad-hoc frameworks or similar.
 

> Transition is just a specific combinator that captures a certain signal and uses it to abandon the current process state and move to another one.

That sounds similar to the 'become' operation common to smalltalk and actors model.


 
> another goal for these [combinators] is to use them as a compilation target, after eliminating the more abstract processes and parameter references, etc.

I think you'll like it. I've found combinators make a very nice intermediate language. :) 

Best,

Matt McLelland

Oct 9, 2013, 9:06:25 AM
to reactiv...@googlegroups.com
> but I do not see an effective way to enforce that those protocols are completed

Well, you have to distinguish two cases.  You're right that I don't have a way to declare "this process must be resumed until its protocol is completed," which would require substructural types or something.  But I see this as a feature, not a bug.  If I have a process as a value, I can copy/suspend/resume/drop it as I please. 

On the other hand, my protocol types implicitly enforce that a process should complete its protocol.  If the protocol says "receive A => send B", then when that process gets an A it will put a B.   But consider the situation at top level, where the type of 'main' is a process that interacts with the environment in a certain way.   That 'main' process has no way of suspending the environment, so its only choice is to complete protocols.   And we can repeat this construction at any depth by requiring a process that communicates with that layer as "the environment."  But an arbitrary process as a value only gets the guarantee that *if* it's resumed, it will be in accordance with proper protocol. 

This is what I meant by avoiding substructural types by modeling things structurally.  It is possible, though, that my model misses other useful things that substructural types can capture, which is why I was asking.

Also, it's probably useful to consider capabilities again: we can model a capability as a constructor for a process that communicates with the environment through an opaque port.   Because the port is opaque, the only way you can use that capability is, because of the type system, to hook it up to the process you're building so that it may communicate with the environment.  

> That sounds similar to the 'become' operation common to smalltalk and actors model.

Yep, it's the same thing.  I currently have a 'become' keyword in the language level presentation of objects. 

> I think you'll like it. I've found combinators make a very nice intermediate language. :)

Thanks, I hope so!

Best,
Matt

David Barbour

Oct 9, 2013, 10:16:24 AM
to reactiv...@googlegroups.com

> If I have a process as a value, I can copy/suspend/resume/drop it as I please. 

Wouldn't similar words also serve as justification for dynamic types, breaking encapsulation, etc?

But types are for expressing and enforcing properties that we believe are important. I find value in enforcing completion of protocols within programs, even between pure subprograms.

Also, if I understand correctly this argument wouldn't apply to processes connected indirectly through a combinator or named port. The processes involved don't really "have" those as values, only an ability to signal them.

> That 'main' process has no way of suspending the environment, so its only choice is to complete protocols. 

I think this "only choice" you're assuming might be in error, unless you also introduce types asserting on the state of the environment (Hoare logic). If your environment is robust or multi-agent at all, it can unfortunately be polluted and littered upon - e.g. starting many 'new' sessions with a service but not completing any of them.

Matt McLelland

Oct 9, 2013, 1:55:35 PM
to reactiv...@googlegroups.com
> Wouldn't similar words also serve as justification for dynamic types, breaking encapsulation, etc?

You're right, I phrased that poorly.  My point isn't that it's my value so I should be able to ignore the types.  Rather it's that it's just a value, so its types should be describing what it can do and not what I'm allowed/required to do with it.   That probably sounds a bit like "types should be structural because types should be structural," and it is to some extent, but phrased another way, the environment as named values is an abstraction I want to keep.  If names in the environment come with types that place obligations/constraints on other parts of the environment, that abstraction breaks. 

> Also, if I understand correctly this argument wouldn't apply to processes connected indirectly through a combinator or named port.  The processes involved don't really "have" those as values, only an ability to signal them.

That's correct, which is how I enforce protocols be completed with structural types.

> I think this "only choice" you're assuming might be in error, unless you also introduce types asserting on the state of the environment (Hoare logic). If your environment is robust or multi-agent at all, it can unfortunately be polluted and littered upon - e.g. starting many 'new' sessions with a service but not completing any of them.

I can do something like Hoare logic, but I don't think it's required for this.  The requirement here is that protocols aren't starved, so we need to know that an object doesn't get stuck working on other ports forever.  I'm thinking most types should include this claim by default, so e.g. when you write something like { IO } Nat, meaning a computation that interacts on IO before yielding a Nat, that would imply that the interaction on IO is supposed to be finite.   Unfortunately, this is a non-local property (like non-termination in general) that's not easily checked in an automatic way by a type system.   My preference in such situations is to allow correct-by-construction approaches, but not require them, leaving the possibility that you might have unfulfilled proof obligations in your code... where it's supposed to terminate, but you haven't gotten around to proving that yet. 

Linear types will not be able to prove that ports aren't starved, either, but they could at least prove that there is starvation in the obvious cases where you switch to a state that drops the port entirely.   This might let me change the error message from "unfulfilled proof obligation" to "type error" in those cases, but wouldn't ever establish a lack of type error.






David Barbour

Oct 9, 2013, 3:22:03 PM
to reactiv...@googlegroups.com
On Wed, Oct 9, 2013 at 12:55 PM, Matt McLelland <mclella...@gmail.com> wrote:
> it's just a value, so its types should be describing what it can do and not what I'm allowed/required to do with it

Heh. When a value *represents* an obligation, contract, debt, or resource, it isn't "just a value" any more than `750.82` is just a number when it's on a check or invoice. Substructural types can model not only values but some of the intermediate transitional stages between values (useful for certain decompositions), and economies for obtaining or exchanging values, and resources. 

 
> the environment as named values is an abstraction I want to keep.  If names in the environment come with types that place obligations/constraints on other parts of the environment, that abstraction breaks. 

I believe these concerns are orthogonal. It isn't as though one part of the environment can directly obligate another; there is always a 'buy-in' protocol - i.e. some communication that must occur before any obligations are formed. Can you explain to me how this breaks abstractions?
 

> The requirement here is that protocols aren't starved

Interesting property. I can see how this would serve a similar role to substructural types, if you could easily enforce it.
 

> Linear types will not be able to prove that ports aren't starved, either

Not by themselves, no. But in combination with expiration types, they can.

Warm Regards,

Dave

Matt McLelland

Oct 9, 2013, 5:23:29 PM
to reactiv...@googlegroups.com
> When a value *represents* an obligation, contract, debt, or resource, it isn't "just a value"

In my world view it is.  Representations are external to values.  When I talk about named values, those values are in a specific mathematical domain of discourse that you can reason about in a certain logic.   Anyway, I'm not criticizing your approach, but this setup of pure environments with structural types makes sense to me.

> Can you explain to me how this breaks abstractions?

I wasn't saying anything deep there -- just that obligations aren't values in my sense.   Again, I'm saying it doesn't fit into the mold I'm working with, not that it's wrong.

> if you could easily enforce it.

I could enforce it with techniques a la total functional programming or with dependent types, but as I said, I'd rather not enforce it and rather have the system note when it wasn't provable.  

> expiration types

I'm guessing you mean a type that says "after X seconds, the resource cannot be used".   Since establishing time bounds is even more difficult than establishing termination, you can guess how I feel about it given my previously stated preference on proving termination.  But I understand that you're working on approaches that try to keep track of time bounds bottom up in a low pain way.  I'll be interested in seeing what you come up with.


David Barbour

Oct 9, 2013, 9:31:21 PM
to reactiv...@googlegroups.com
On Wed, Oct 9, 2013 at 4:23 PM, Matt McLelland <mclella...@gmail.com> wrote:
> When a value *represents* an obligation, contract, debt, or resource, it isn't "just a value"

> In my world view it is.  Representations are external to values.  When I talk about named values, those values are in a specific mathematical domain of discourse that you can reason about in a certain logic.   Anyway, I'm not criticizing your approach, but this setup of pure environments with structural types makes sense to me.

I once comprehended values as part of a platonic mathematical domain. 

It took me a long time to get on board with linear types; I understood what they were, but not their utility, even as recently as 2011. I think it was Ross's influence that led me to pondering them on a regular basis; Ross Angle and Chris Martens both pursue linear types for interactive fiction and other interesting problems. The tipping point was when I was contemplating how to limit fan-in and fan-out on resources in order to compile directly to hardware. Linear types filled that void, and further enabled expressing many other concepts I've since found useful but hadn't considered at the time (such as obligations).

As I grew comfortable with linear types (and related fields: reversible computation, constructive logics), I found ever increasing value in treating computation as a 'mechanical' mathematics. I began to focus on information - values with history, future, and context - rather than values as a mathematical concept. I learned that information truly is physical (cf. Landauer's principle). I began to understand that representation is fundamental, that 'value' is not an entity but rather a meaningful relationship between representation and environment. 

So I think I know where you're coming from, though I find it terribly difficult to recall my old world view. 

I believe that you, too, would find much value in linear types if you were to pursue them - most likely more than you expect.
  

Since establishing time bounds is even more difficult than establishing termination

Termination requires analyzing an *unbounded* future, which means you don't know in advance how deeply you need to analyze for termination, and you must often resort to indirect inductive mechanisms. Time bounds, OTOH, provide a bounded limit on how much analysis is necessary. In context of metaprogramming of a reactive model, the analysis of time bounds can actually become one basis for ensuring typecheck terminates.

Best,

Matt McLelland

Oct 10, 2013, 12:21:39 PM
to reactiv...@googlegroups.com
> I once comprehended values as part of a platonic mathematical domain.

Yes, one explanation is that I haven't yet been enlightened.   I'm not yet convinced it's me, though :).   Do you have plans for a formal system for reasoning about correctness in your system?  To me, a simple platonic domain seems like a requirement here, and I don't think linear logic belongs in it.   We can always encode it if needed (see http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.13.8671, that I just found through search, where someone encodes linear logic in Coq).   The point being that I don't doubt that you are right that linear logic is a good source of ideas (Wadler's paper on the correspondence to session types is somewhere in my queue), but I'll be surprised if I ever decide to incorporate them in a fundamental way that requires abandoning pure environments.   I like the cold platonic viewpoint too much.

David Barbour

Oct 10, 2013, 12:59:08 PM
to reactiv...@googlegroups.com

Formalization? Yes, I'm doing that in Agda. (Haskell wasn't expressive enough for rational latency and location types.) And I do have a formal type system.

Linear logic can be part of the formalization, too. Axioms can be introduced or dropped at will, but linear types can still be useful to enforce some resource constraints or obligations even on proofs.

Best,

Dave
