[Haskell-cafe] Ord for partially ordered sets


Ivan Lazar Miljenovic

Apr 24, 2015, 9:06:28 AM
to Haskell Cafe, Haskell Libraries
What is the validity of defining an Ord instance for types for which
mathematically the `compare` function is partially ordered?

Specifically, I have a pull request for fgl [1] to add Ord instances
for the graph types (based upon the Ord instances for Data.Map and
Data.IntMap, which I believe are themselves partially ordered), and
I'm torn as to the soundness of adding these instances. It might be
useful in Haskell code (the example given is to use graphs as keys in
a Map) but mathematically-speaking it is not possible to compare two
arbitrary graphs.

What are people's thoughts on this? What's more important: potential
usefulness/practicality or mathematical correctness?

(Of course, the correct answer is to have a function of type a -> a ->
Maybe Ordering :p)

[1]: https://github.com/haskell/fgl/pull/11
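For concreteness, a rough sketch of what such a partial comparison could look like (hypothetical class and method names, not part of fgl or containers; the Map instance below uses submap containment as the partial order):

import           Data.Map (Map)
import qualified Data.Map as Map

-- Hypothetical class: a partial order reports Nothing for incomparable values.
class PartialOrd a where
  comparePartial :: a -> a -> Maybe Ordering

-- One sensible partial order on maps: containment (submap with equal values).
instance (Ord k, Eq v) => PartialOrd (Map k v) where
  comparePartial m1 m2
    | m1 == m2               = Just EQ
    | m1 `Map.isSubmapOf` m2 = Just LT
    | m2 `Map.isSubmapOf` m1 = Just GT
    | otherwise              = Nothing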

--
Ivan Lazar Miljenovic
Ivan.Mi...@gmail.com
http://IvanMiljenovic.wordpress.com

Tom Ellis

Apr 24, 2015, 9:17:14 AM
to haskel...@haskell.org, Haskell Libraries
On Fri, Apr 24, 2015 at 11:06:07PM +1000, Ivan Lazar Miljenovic wrote:
> What is the validity of defining an Ord instance for types for which
> mathematically the `compare` function is partially ordered?

I'm confused. What is supposed to be the result of `g1 <= g2` when `g1` and
`g2` are not comparable according to the partial order?

Ivan Lazar Miljenovic

Apr 24, 2015, 9:27:54 AM
to Tom Ellis, Haskell Cafe, Haskell Libraries
On 24 April 2015 at 23:17, Tom Ellis
<tom-lists-has...@jaguarpaw.co.uk> wrote:
> On Fri, Apr 24, 2015 at 11:06:07PM +1000, Ivan Lazar Miljenovic wrote:
>> What is the validity of defining an Ord instance for types for which
>> mathematically the `compare` function is partially ordered?
>
> I'm confused. What is supposed to be the result of `g1 <= g2` when `g1` and
> `g2` are not comparable according to the partial order?

With the proposed patch, it's the result of <= on the underlying [Int]Maps.

Does the definition of Ord on Data.Map make sense? e.g. what should
be the result of (fromList [(1,'a'), (2,'b'), (3, 'c')]) <= (fromList
[(1,'a'), (4,'d')])? What about (fromList [(1,'a'), (2,'b'), (3,
'c')]) <= (fromList [(1,'a'), (2,'e')])?
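For reference, assuming the lexicographic Ord that containers defines via toAscList, both of these come out True; a small sketch:

import qualified Data.Map as Map

-- Both are True: the comparison is lexicographic on the sorted association
-- lists, so (2,'b') < (4,'d') and (2,'b') < (2,'e') decide the answers,
-- even though neither map "contains" the other.
example1, example2 :: Bool
example1 = Map.fromList [(1,'a'), (2,'b'), (3,'c')] <= Map.fromList [(1,'a'), (4,'d')]
example2 = Map.fromList [(1,'a'), (2,'b'), (3,'c')] <= Map.fromList [(1,'a'), (2,'e')]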

Tom Ellis

Apr 24, 2015, 9:59:47 AM
to haskel...@haskell.org, Haskell Libraries
On Fri, Apr 24, 2015 at 11:27:46PM +1000, Ivan Lazar Miljenovic wrote:
> On 24 April 2015 at 23:17, Tom Ellis
> <tom-lists-has...@jaguarpaw.co.uk> wrote:
> > On Fri, Apr 24, 2015 at 11:06:07PM +1000, Ivan Lazar Miljenovic wrote:
> >> What is the validity of defining an Ord instance for types for which
> >> mathematically the `compare` function is partially ordered?
> >
> > I'm confused. What is supposed to be the result of `g1 <= g2` when `g1` and
> > `g2` are not comparable according to the partial order?
>
> With the proposed patch, it's the result of <= on the underlying [Int]Maps.

Ah, so it's a case of adding a valid Ord instance that isn't a natural one
for the particular datatype. If you really need something like that, for
example to add your graphs to a Data.Set, then I would suggest a newtype
might be appropriate.
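A minimal sketch of such a newtype (hypothetical name; it goes through the public Graph API and sorts, so it assumes nothing about the representation):

import           Data.Graph.Inductive.Graph (Graph, labEdges, labNodes)
import           Data.List (sort)
import qualified Data.Set as Set

-- The Ord here is an arbitrary-but-lawful key order for Set/Map use,
-- not a claim that graphs are meaningfully ordered.
newtype OrdGr gr a b = OrdGr (gr a b)

canonical :: (Graph gr, Ord a, Ord b) => gr a b -> ([(Int, a)], [(Int, Int, b)])
canonical g = (sort (labNodes g), sort (labEdges g))

instance (Graph gr, Ord a, Ord b) => Eq (OrdGr gr a b) where
  OrdGr g1 == OrdGr g2 = canonical g1 == canonical g2

instance (Graph gr, Ord a, Ord b) => Ord (OrdGr gr a b) where
  compare (OrdGr g1) (OrdGr g2) = compare (canonical g1) (canonical g2)

graphSet :: (Graph gr, Ord a, Ord b) => [gr a b] -> Set.Set (OrdGr gr a b)
graphSet = Set.fromList . map OrdGr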

Tom

Henning Thielemann

Apr 24, 2015, 10:02:11 AM
to Ivan Lazar Miljenovic, Haskell Libraries, Haskell Cafe

On Fri, 24 Apr 2015, Ivan Lazar Miljenovic wrote:

> Specifically, I have a pull request for fgl [1] to add Ord instances for
> the graph types (based upon the Ord instances for Data.Map and
> Data.IntMap, which I believe are themselves partially ordered), and I'm
> torn as to the soundness of adding these instances.

In an application we needed to do some combinatorics of graphs and thus
needed Set Graph.

Nonetheless, I think that graph0 < graph1 should be a type error. We can
still have a set of Graphs using a newtype.

Ivan Lazar Miljenovic

Apr 24, 2015, 10:23:54 AM
to Henning Thielemann, Haskell Libraries, Haskell Cafe
On 25 April 2015 at 00:01, Henning Thielemann
<lem...@henning-thielemann.de> wrote:
>
> On Fri, 24 Apr 2015, Ivan Lazar Miljenovic wrote:
>
>> Specifically, I have a pull request for fgl [1] to add Ord instances for
>> the graph types (based upon the Ord instances for Data.Map and Data.IntMap,
>> which I believe are themselves partially ordered), and I'm torn as to the
>> soundness of adding these instances.
>
>
> In an application we needed to do some combinatorics of graphs and thus
> needed Set Graph.
>
> Nonetheless, I think that graph0 < graph1 should be a type error. We can
> still have a set of Graphs using a newtype.

This could work; the possible problem is one of efficiency: if it's
done directly on the graph datatypes, they can use the underlying
(ordered) data structure; going purely by the Graph API, there are no
ordering guarantees, so a sort would be needed even though in practice
it's redundant.
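That is, going through the class API the comparison has to look roughly like this sketch (comparing is from Data.Ord), with the sort being the step a representation-level instance could skip:

import Data.Graph.Inductive.Graph (Graph, labEdges, labNodes)
import Data.List (sort)
import Data.Ord (comparing)

-- Portable, but pays for a sort on every comparison; an instance defined
-- against the concrete IntMap-backed representation could delegate to the
-- underlying maps' Ord and avoid this pass.
compareViaAPI :: (Graph gr, Ord a, Ord b) => gr a b -> gr a b -> Ordering
compareViaAPI = comparing (\g -> (sort (labNodes g), sort (labEdges g)))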

Andreas Abel

Apr 24, 2015, 10:48:28 AM
to Ivan Lazar Miljenovic, Haskell Cafe, Haskell Libraries
On 04/24/2015 03:06 PM, Ivan Lazar Miljenovic wrote:
> What is the validity of defining an Ord instance for types for which
> mathematically the `compare` function is partially ordered?

I'd say this is harmful, as functions like min and max (and others) rely
on the totality of the ordering.

Partial orderings are useful in themselves; I implemented my own library


https://hackage.haskell.org/package/Agda-2.4.2/docs/Agda-Utils-PartialOrd.html

mainly to use it for maintaining sets of incomparable elements:


https://hackage.haskell.org/package/Agda-2.4.2/docs/Agda-Utils-Favorites.html
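The "set of incomparable elements" idea, roughly sketched (just the shape of it, not the Agda-Utils API):

-- Keep only maximal, pairwise-incomparable elements under a partial order
-- given as a -> a -> Maybe Ordering (names are illustrative).
insertMaximal :: (a -> a -> Maybe Ordering) -> a -> [a] -> [a]
insertMaximal cmp x ys
  | any belowExisting ys = ys                      -- x is dominated, drop it
  | otherwise            = x : filter (not . aboveNew) ys
  where
    belowExisting y = cmp x y `elem` [Just LT, Just EQ]  -- x <= some kept y
    aboveNew      y = cmp x y == Just GT                 -- kept y < x, now dominated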

> Specifically, I have a pull request for fgl [1] to add Ord instances
> for the graph types (based upon the Ord instances for Data.Map and
> Data.IntMap, which I believe are themselves partially ordered), and
> I'm torn as to the soundness of adding these instances. It might be
> useful in Haskell code (the example given is to use graphs as keys in
> a Map) but mathematically-speaking it is not possible to compare two
> arbitrary graphs.
>
> What are people's thoughts on this? What's more important: potential
> usefulness/practicality or mathematical correctness?
>
> (Of course, the correct answer is to have a function of type a -> a ->
> Maybe Ordering :p)
>
> [1]: https://github.com/haskell/fgl/pull/11
>


--
Andreas Abel <>< Du bist der geliebte Mensch.

Department of Computer Science and Engineering
Chalmers and Gothenburg University, Sweden

andrea...@gu.se
http://www2.tcs.ifi.lmu.de/~abel/

Tikhon Jelvis

Apr 24, 2015, 1:26:50 PM
to Andreas Abel, Haskell Libraries, Haskell Cafe
I would be hesitant about adding an Ord instance normally, because there's no clear semantics for it. If we just pass it through to the underlying data structure, it might behave differently depending on how you implement the graph, which is something fgl should ideally abstract over.

Maybe you could provide them in a newtype yourself, in the library? You could call it something like GrKey to make it clear that it has an Ord instance for practical reasons rather than because graphs are meaningfully orderable. This just forces people who need the capability to be a bit more explicit about it.



Alexander Solla

Apr 24, 2015, 1:47:24 PM
to Ivan Lazar Miljenovic, Haskell Libraries, Haskell Cafe
I see it as "morally wrong". It's like a Monad instance that doesn't obey the monad laws. It's the kind of ticking time bomb strong typing is supposed to protect us against. But that only works if we do our part and don't write nonsense instances.

Can your consumer get away with using a Hashable instance?  (I.e., for use in unordered-containers).  This would be morally correct -- the graph could presumably have a valid Hashable instance.
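A hedged sketch of what that could look like via fgl's labNodes/labEdges (the wrapper name is hypothetical, and an Eq instance on the wrapper is also needed for HashSet/HashMap keys):

import Data.Graph.Inductive.Graph (Graph, labEdges, labNodes)
import Data.Hashable (Hashable (..))
import Data.List (sort)

newtype HashGr gr a b = HashGr (gr a b)

-- Hash the sorted node and edge lists so that equal graphs hash equally
-- regardless of construction order.
instance (Graph gr, Ord a, Ord b, Hashable a, Hashable b)
      => Hashable (HashGr gr a b) where
  hashWithSalt s (HashGr g) =
    s `hashWithSalt` sort (labNodes g) `hashWithSalt` sort (labEdges g)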

Jay Sulzberger

Apr 24, 2015, 3:19:17 PM
to Haskell Cafe, Haskell Libraries



On Fri, 24 Apr 2015, Ivan Lazar Miljenovic <ivan.mi...@gmail.com> wrote:

> What is the validity of defining an Ord instance for types for which
> mathematically the `compare` function is partially ordered?
>
> Specifically, I have a pull request for fgl [1] to add Ord instances
> for the graph types (based upon the Ord instances for Data.Map and
> Data.IntMap, which I believe are themselves partially ordered), and
> I'm torn as to the soundness of adding these instances. It might be
> useful in Haskell code (the example given is to use graphs as keys in
> a Map) but mathematically-speaking it is not possible to compare two
> arbitrary graphs.
>
> What are people's thoughts on this? What's more important: potential
> usefulness/practicality or mathematical correctness?
>
> (Of course, the correct answer is to have a function of type a -> a ->
> Maybe Ordering :p)
>
> [1]: https://github.com/haskell/fgl/pull/11
>
> --
> Ivan Lazar Miljenovic

Of course these type-classes (I hope I am using the
word correctly) should be standard:

1. Ord, which is the class of all totally ordered set-like things

2. PoSet, which is the class of all partially ordered set-like things

3. NonStrictPoSet, which is the class of all partially ordered
set-like things, but without the requirement that a <= b and b <= a implies
a == b.

4. Things like above, but with the requirement of a Zero, with
the requirement of a One, and the requirement of both a Zero and a One.
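A rough sketch of how such a hierarchy might look as Haskell classes (names hypothetical; the existing Ord is left alone here):

-- Item 3: reflexive + transitive, comparability optional.
class Preorder a where
  pcompare :: a -> a -> Maybe Ordering

-- Item 2: additionally antisymmetric
-- (law: pcompare x y == Just EQ should coincide with equality).
class Preorder a => PoSet a

-- Item 1: every pair comparable, so a total compare exists.
class PoSet a => TotalOrder a where
  tcompare :: a -> a -> Ordering

-- Item 4: distinguished least ("Zero") and/or greatest ("One") elements.
class PoSet a => HasZero a where
  zero :: a

class PoSet a => HasOne a where
  one :: a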

oo--JS.

Barak A. Pearlmutter

Apr 24, 2015, 4:22:46 PM
to haskel...@haskell.org
> I'm confused. What is supposed to be the result of `g1 <= g2` when `g1` and
> `g2` are not comparable according to the partial order?

Surely it would make things less confusing to simply follow existing
precedent already in the standard prelude?

$ ghci
Prelude> let nan=0/0::Double
Prelude> nan==nan
False
Prelude> compare 0 nan
GT
Prelude> compare nan 0
GT
Prelude> compare nan nan
GT
Prelude> 0<=nan
False
Prelude> nan<=0
False
Prelude> nan<=nan
False
Prelude> let infty=1/0::Double
Prelude> infty <= nan
False
Prelude> nan <= infty
False

Perhaps not.

Ertugrul Söylemez

Apr 24, 2015, 4:26:29 PM
to Jay Sulzberger, Haskell Cafe, Haskell Libraries
> 3. NonStrictPoSet, which is the class of all partially ordered
> set-like things, but without the requirement that a <= b and b <= a
> implies a == b.

Those are preorders. An antisymmetric preorder is a non-strict poset.

Also it's difficult to capture all of those various order types in
Haskell's class system. A type can have many orders and many underlying
equivalence relations in the case of partial and total orders, and there
are different ways to combine them. For example equality is a partial
order, modular equivalence is a preorder, etc. Those denote bags and
groups more than sequences or trees.

Perhaps it's time to add a type class-like system to Haskell, but for
explicitly passed arguments:

record Relation a where
  related :: a -> a -> Bool

  unrelated :: a -> a -> Bool
  unrelated x y = not (related x y)

func1 :: Relation A -> A -> A -> A
func1 _ x y = ... related x y ...

func2 :: Relation A -> Relation A -> A -> A -> A
func2 r1 r2 x y = ... r1.related x y ... r2.unrelated x y ...

In a lot of cases this is much more appropriate than a type class, and
it would turn many things that are currently types into regular
functions, thus making them a lot more composable:

down :: Ord a -> Ord a
down o =
  Ord { compare x y = o.compare y x }
  -- The remaining Ord functions are defaulted.

Perhaps all we need is to generalise default definitions to data types
and add module-like dot syntax for records (mainly to preserve infix
operators). Formally speaking there is also little that prevents us
from having associated types in those records that can be used on the
type level.

For actual record types (i.e. single-constructor types) we could even
have specialisation and get a nice performance boost that way, if we ask
for it:

{-# SPECIALISE down someOrder :: Ord SomeType #-}

This would be extremely useful.


> 4. Things like above, but with the requirement of a Zero, with the
> requirement of a One, and the requirement fo both a Zero and a One.

Zero and one as in minBound and maxBound or rather as in Monoid and a
hypothetical Semiring? In the latter case I believe they don't really
belong in an additional class, unless you have some ordering-related
laws for the zeroes and ones. If not, you can always simply use an
Ord+Semiring constraint.

There may be some motivation to make Bounded a subclass of Ord though.


Greets,
Ertugrul

Ertugrul Söylemez

Apr 24, 2015, 4:32:09 PM
to Tom Ellis, haskel...@haskell.org, Haskell Libraries
> I'm confused. What is supposed to be the result of `g1 <= g2` when `g1` and
> `g2` are not comparable according to the partial order?

False.


Greets,
Ertugrul

Mike Izbicki

Apr 24, 2015, 7:39:33 PM
to Ertugrul Söylemez, Tom Ellis, Haskell Libraries, Haskell Cafe
>> I'm confused. What is supposed to be the result of `g1 <= g2` when `g1` and
>> `g2` are not comparable according to the partial order?
>
>False.

The operators aren't a problem for this reason. The real problem is
what does `compare` return?
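Put as properties, these are the laws an Ord instance is on the hook for, and they leave no room for "incomparable" (a sketch; run them with QuickCheck at a concrete type if you like):

-- Totality: some answer must be given for every pair.
prop_total :: Ord a => a -> a -> Bool
prop_total x y = x <= y || y <= x

-- compare must agree with (<=); for incomparable values there is simply
-- no lawful result left for it to return.
prop_consistent :: Ord a => a -> a -> Bool
prop_consistent x y = case compare x y of
  LT -> x <= y && not (y <= x)
  EQ -> x <= y && y <= x
  GT -> y <= x && not (x <= y)

(The Double instance already fails prop_total whenever a NaN shows up, which is the precedent Barak points at above.)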

Carter Schonwald

Apr 24, 2015, 9:57:17 PM
to Ertugrul Söylemez, Haskell Libraries, Haskell Cafe
hrm, wouldn't your proposed extension be largely accomplished by using Record pun and Record WildCards?

e.g.

{-# LANGUAGE RecordWildCards #-}
{-# LANGUAGE RecordPuns #-}
module Foo where

data Relation a = Rel { related :: a -> a -> Bool, unrelated :: a -> a -> Bool }

foo :: Relation A -> A -> A -> Bool
foo Rel{..} x y = related x y

------
or am I overlooking something?
I do realize this may not quite be what you're suggesting, and if so, could you help me understand better? :)


Ertugrul Söylemez

Apr 25, 2015, 8:51:27 AM
to Carter Schonwald, Haskell Libraries, Haskell Cafe
> hrm, wouldn't your proposed extension be largely accomplished by using
> Record pun and Record WildCards?

A part of it would, but it wouldn't preserve operators. For example
instead of `x r.<> y` you would have to write `(<>) r x y`. Also
defaults are not available. Other class features are not accessible,
most notably type-level features like associated types.

The idea is that a record would be completely equivalent to a class with
the only difference being that you define values instead of instances,
that there are no constraints on which values can exist and that those
values must be passed explicitly to functions as regular arguments.

Roman Cheplyaka

Apr 25, 2015, 10:24:48 AM
to Ertugrul Söylemez, Carter Schonwald, Haskell Libraries, Haskell Cafe
On 25/04/15 15:51, Ertugrul Söylemez wrote:
>> hrm, wouldn't your proposed extension be largely accomplished by using
>> Record pun and Record WildCards?
>
> A part of it would, but it wouldn't preserve operators. For example
> instead of `x r.<> y` you would have to write `(<>) r x y`.

Not at all.

{-# LANGUAGE RecordWildCards #-}

import Prelude hiding (sum, Monoid, (<>))

data Monoid a = Monoid { empty :: a, (<>) :: a -> a -> a }

sum :: Num a => Monoid a
sum = Monoid 0 (+)

three :: Integer
three =
  let Monoid{..} = sum in
  1 <> 2

> Other class features are not accessible,
> most notably type-level features like associated types.

Associated types become additional type variables of the record type.

A class

class C a where
  type T a

is essentially equivalent to

class C a t | a -> t

But the functional dependency is not enforceable on the value level
(isn't the whole point of this discussion not to restrict which
"instances" can be defined?), so you end up with

class C a t,

a simple MPTC.

> Also defaults are not available.

Now this is a good point.

> The idea is that a record would be completely equivalent to a class with
> the only difference being that you define values instead of instances,
> that there are no constraints on which values can exist and that those
> values must be passed explicitly to functions as regular arguments.

Except we already have regular records (aka data types) which satisfy
90% of the requirements, and adding another language construct to
satisfy those remaining 10% feels wrong to me. I'd rather improve the
existing construct.

Roman



Ertugrul Söylemez

Apr 25, 2015, 11:12:59 AM
to Roman Cheplyaka, Carter Schonwald, Haskell Libraries, Haskell Cafe
>>> hrm, wouldn't your proposed extension be largely accomplished by
>>> using Record pun and Record WildCards?
>>
>> A part of it would, but it wouldn't preserve operators. For example
>> instead of `x r.<> y` you would have to write `(<>) r x y`.
>
> Not at all.
>
> three :: Integer
> three =
>   let Monoid{..} = sum in
>   1 <> 2

Puns become tedious and error-prone as soon as you need to refer to
multiple records, especially when operators are involved. But it's not
that important actually; I can live with the current record syntax.

The most useful features would be defaults, a more suitable syntax for
defining record types and potentially the following:


>> Other class features are not accessible, most notably type-level
>> features like associated types.
>
> Associated types become additional type variables of the record type.

Indeed. However, when the type follows from other type arguments, it
would often be convenient not to spell it out and instead bring an
associated type constructor into scope. This is especially true when
the AT refers to a type that isn't used very often.

record Extractor a where
  type Elem a
  extract :: a -> Maybe (Elem a, a)

extractTwo
  :: (e1 : Extractor a)
  -> (e2 : Extractor a)
  -> a
  -> Maybe (e1.Elem a, e2.Elem a, a)
extractTwo e1 e2 xs0 = do
  (x1, xs1) <- e1.extract xs0
  (x2, xs2) <- e2.extract xs1
  return (x1, x2, xs2)


> But the functional dependency is not enforceable on the value level
> (isn't the whole point of this discussion not to restrict what
> "instances" can be defined), so you end up with
>
> class C a t,
>
> a simple MPTC.

I don't see a reason to enforce a dependency, since there is no
equivalent to instance resolution. Regular unification should cover any
ambiguities, and if it doesn't you need ScopedTypeVariables.


>> The idea is that a record would be completely equivalent to a class
>> with the only difference being that you define values instead of
>> instances, that there are no constraints on which values can exist
>> and that those values must be passed explicitly to functions as
>> regular arguments.
>
> Except we already have regular records (aka data types) which satisfy
> 90% of the requirements, and adding another language construct to
> satisfy those remaining 10% feels wrong to me. I'd rather improve the
> existing construct.

That's actually what I'm proposing. The record syntax would simply be
syntactic sugar for single-constructor data types that is more suitable
for records, especially when defaults and other class-like features are
involved. Most notably it would support layout. There is no reason why
you shouldn't be able to use `data` to achieve the same thing, except
with a clumsier syntax and the option to have multiple constructors.


Greets,
Ertugrul

Carter Schonwald

Apr 25, 2015, 11:32:50 AM
to Ertugrul Söylemez, Haskell Libraries, Haskell Cafe
Isn't your associated type here more like a dependent record field / existential that we can kinda expose?

This does seem to veer into first-class module territory, especially w.r.t. needing first-class types in a fashion.

Have you had a chance to peruse the Andreas Rossberg 1ML paper on embedding first-class modules into F-omega that has been circulating? Perhaps there are ideas there that could be adapted, especially since Core is an augmented F-omega.

Ertugrul Söylemez

Apr 25, 2015, 11:43:12 AM
to Carter Schonwald, Haskell Cafe
> Isn't your associated type here more like a dependent record field/
> existential that we can kinda expose?

Not quite. There is still a clear distinction between type and value
level. You cannot refer to an AT on the value level or to a member
value on the type level.


> This does seem to veer into first class module territory. Especially
> wrt needing first class types in a fashion.

I think formally there is little difference between a powerful record
system and a first-class module system. However, even in a
non-dependent language a first-class module could still expect a value
argument. A record type couldn't do this without essentially making the
language dependently typed along the way.


> Have you had a chance to peruse the Andreas Rossberg 1ml paper on
> embedding first class modules into f omega that has been circulating?
> Perhaps there are ideas There that could be adapted. Especially since
> core is an augmented f omega

I haven't read it, sorry. My proposal should conform to the current
core language, as it's mostly just a syntax transformation. The only
new semantics would be defaults.

Carter Schonwald

Apr 25, 2015, 12:39:19 PM
to Ertugrul Söylemez, Haskell Cafe
You misapprehend. I'm saying that the first-class modules encoding in that paper is expressible using a subset of GHC Core.

There IS the subtle issue that type class dictionaries / type classes have different sharing and strictness semantics than normal userland records.

There's a space of designs that could provide what you're asking for, and I do agree that it's worth exploring. I guess the main question is whether or not it turns into its own research engineering project.

Likewise, to what extent, aside from syntactic nicety, do implicit parameters and the punning extensions not suffice?



Ertugrul Söylemez

Apr 25, 2015, 2:12:31 PM
to Carter Schonwald, Haskell Cafe
> Likewise, to what extent aside from syntactic nicety does implicit
> parameters and the punning extensions not suffice?

At the definition and instantiation sites I mostly miss defaults. At
the application sites I would love to have specialisation for certain
arguments. For example I would like to be able to tell GHC that I would
like to have a version of my function `f` with a certain argument
inlined. Note that I don't want to inline `f` itself. Rather I'd like
to preapply certain arguments:

f :: X -> Y -> Z

{-# SPECIALISE f SomeX #-}
{-# SPECIALISE f SomeOtherX #-}

This would generate two specialised versions of `f` with exactly the
given arguments inlined. That way I can get a very efficient `f`
without having to inline it at the application sites. And as long as
`f` is INLINABLE I can put those pragmas pretty much everywhere. I
believe this is exactly what happens for type class dictionaries.

This can (and probably should) be a separate feature though. For some
of my applications I need to inline a huge chunk of code multiple times
to compensate for the lack of this feature.

Carter Schonwald

Apr 25, 2015, 2:21:29 PM
to Ertugrul Söylemez, Haskell Cafe
Could that specialization be accomplished today using Ed's reflection pkg? I guess not quite, in terms of that pre-apply pattern.

This is interesting. And it's a good example of a larger problem: not enough support for composable specialization with good sharing across the use sites that doesn't require egregious inlining. At least for code that isn't type-class driven.



Ertugrul Söylemez

Apr 25, 2015, 2:43:33 PM
to Carter Schonwald, Haskell Cafe
> Could that specialization be accomplished today using eds reflection
> pkg? I guess not quite in terms of that pre apply pattern.

I'm not familiar with that package and couldn't find it on Hackage by a
quick search. But I believe that it can only be done with compiler
support, although with enough hackery you can probably get an ugly
version of it using TH.


> This is interesting. And it's a good example of a larger problem of
> not enough support for composable specialization with good sharing
> across the use sites that doesn't require egregious Inlining. At
> least for code that isn't Type class driven.

Indeed. Specialisation is a really good way to get very fast code
without making your executable size explode. I believe that support for
more fine-grained specialisation should and will improve. I'm not sure
how to make it more composable though.

Carter Schonwald

Apr 25, 2015, 3:23:11 PM
to Ertugrul Söylemez, Haskell Cafe
Here yah go https://hackage.haskell.org/package/reflection
It exploits how dictionary passing works in a pretty robust way that ghc is likely to at some point codify officially. 



Ertugrul Söylemez

Apr 25, 2015, 4:06:19 PM
to Carter Schonwald, Haskell Cafe
> Here yah go https://hackage.haskell.org/package/reflection
> It exploits how dictionary passing works in a pretty robust way that
> ghc is likely to at some point codify officially.

Oh, that one. Of course I'm familiar with it, but it's less an
optimisation than a clever way to implement implicit configurations.

Carter Schonwald

Apr 25, 2015, 4:13:23 PM
to Ertugrul Söylemez, Haskell Cafe
are you sure you've evaluated how it interacts with this sort of optimization? I think it actually gets you pretty far!

Ertugrul Söylemez

Apr 25, 2015, 5:58:27 PM
to Carter Schonwald, Haskell Cafe
> are you sure you've evaluated how it interacts with this sort of
> optimization? I think it actually gets you pretty far!

Most of the magic occurs in reification. The trouble is that it expects
a polymorphic function of rank 2 with a nontrivial context (Reifies), a
function that by construction cannot be specialised, unless you
specialise the receiving function (reify) for every application case.
The beauty of reflection is that it's free by virtue of sharing.
Unfortunately sharing is the exact opposite of inlining.

According to a benchmark I've done a few months ago it behaves exactly
as if the reflected value was just a shared argument with no inlining
performed; an expected and reasonable result.

What I'm really after is a sort of controlled inlining. That's pretty
much what instance-based specialisation currently does for dictionaries.
Technically a dictionary is just another argument, so there is no
fundamental reason why we shouldn't have a more general specialiser.

Carter Schonwald

Apr 25, 2015, 6:17:51 PM
to Ertugrul Söylemez, Haskell Cafe
Ok cool. Sounds like we agree. 
Luite and I talked about something similar a few months ago, I'll try to dig up my notes and maybe we can turn this into a ghc feature request or patch



wren romano

Apr 25, 2015, 9:10:26 PM
to Haskell Cafe, Haskell Libraries
On Fri, Apr 24, 2015 at 9:06 AM, Ivan Lazar Miljenovic
<ivan.mi...@gmail.com> wrote:
> What is the validity of defining an Ord instance for types for which
> mathematically the `compare` function is partially ordered?

Defining Ord instances for types which are not totally ordered is *wrong*.

For example, due to the existence of NaN values, Double/Float are not
totally ordered and therefore their Ord instances are buggy. In my
logfloat package I have to explicitly add checks to work around the
issues introduced by the buggy Ord Double instance. This is why I
introduced the PartialOrd class, and I'm not the first one to create
such a class. We really ought to have an official PartialOrd class as
part of base/Prelude. The only question is whether to use Maybe
Ordering or a specially defined PartialOrdering type (the latter
optimizing for space and pointer indirection; the former optimizing
for reducing code duplication for manipulating the
Ordering/PartialOrdering types).
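For concreteness, the "specially defined" variant might look roughly like this (illustrative names, not the logfloat API):

data PartialOrdering = PLT | PEQ | PGT | PIncomparable
  deriving (Eq, Show)

toMaybeOrdering :: PartialOrdering -> Maybe Ordering
toMaybeOrdering PLT           = Just LT
toMaybeOrdering PEQ           = Just EQ
toMaybeOrdering PGT           = Just GT
toMaybeOrdering PIncomparable = Nothing

class PartialOrd a where
  partialCompare :: a -> a -> PartialOrdering

-- NaN is exactly what makes Double only partially ordered:
instance PartialOrd Double where
  partialCompare x y
    | isNaN x || isNaN y = PIncomparable
    | otherwise          = case compare x y of
                             LT -> PLT
                             EQ -> PEQ
                             GT -> PGT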

--
Live well,
~wren

Ertugrul Söylemez

Apr 25, 2015, 11:38:10 PM
to Carter Schonwald, Haskell Cafe
> Ok cool. Sounds like we agree.
> Luite and I talked about something similar a few months ago, I'll try
> to dig up my notes and maybe we can turn this into a ghc feature
> request or patch

Great, I would definitely help with that!

Kostiantyn Rybnikov

Apr 27, 2015, 4:30:45 AM
to wren romano, Haskell Libraries, haskell-cafe

While this is a bit off-topic, I'd like to add my five cents: adding instances for common type classes can be "bad" even when they are totally defined for all values; one example is the Monoid instance for HashMap.

So I'd say that if you are in doubt, it's better not to add the instance at all, since your users have no way to remove it from their projects (or redefine it).
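For the record, the HashMap objection presumably boils down to the Monoid/Semigroup instance being left-biased union, so values at colliding keys are silently dropped rather than combined; a small sketch:

import           Data.Monoid ((<>))
import qualified Data.HashMap.Strict as HM

-- (<>)/mappend on HashMap is left-biased union: the 2 is dropped, not added.
collided :: HM.HashMap String Int
collided = HM.fromList [("a", 1)] <> HM.fromList [("a", 2)]
-- fromList [("a",1)]

-- If "combine the values monoidally" is what was meant, it has to be explicit:
combined :: HM.HashMap String Int
combined = HM.unionWith (+) (HM.fromList [("a", 1)]) (HM.fromList [("a", 2)])
-- fromList [("a",3)]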
