Re: [Haskell-cafe] Monad of no `return` Proposal (MRP): Moving `return` out of `Monad`


Simon Thompson

Oct 5, 2015, 5:59:56 AM
to Michał J Gajda, Graham Hutton, Haskell Libraries, haskell cafe, haskell-prime@haskell.org List
Hello all. I write this to be a little provocative, but …

It’s really interesting to have this discussion, which pulls in all sorts of well-made points about orthogonality, teaching, the evolution of the language and so on, but it simply goes to show that the process of evolving Haskell is profoundly broken. 

Other languages do evolve, but in a managed and reflective way. Simply throwing in changes that would have a profound impact on systems that are commercially and potentially safety critical in an à la carte, offhand, way seems like a breakdown of the collective responsibility of the Haskell community to its users and, indirectly, to its future.

If we make claims - I believe rightly - that Haskell is hitting the mainstream, then we need to think about all changes in terms of the costs and benefits of each of them in the widest possible sense. There’s an old fashioned maxim that sums this up in a pithy way: “if it ain’t broke, don’t fix it”.

Simon Thompson



On 5 Oct 2015, at 10:47, Michał J Gajda <mjg...@gmail.com> wrote:

Hi,

As a person who has used Haskell in all three capacities (for scientific research, for commercial purposes, and to introduce others to the benefits of pure and strongly typed programming), I must voice support for this change:
1. Orthogonal type classes are easier to explain.
2. Gradual improvements help us to generalize further, and this in turn makes education easier.
3. Gradual changes that break only a little help to prevent both stagnation (FORTRAN) and big breakage (py3k). That keeps us excited.

That would also call for splitting type classes into their orthogonal elements: return, ap, and bind each getting a basic type class of its own.

So:
+1, but only if it is possible to have a compatibility mode. I believe that rebindable syntax should otherwise allow us to make our own prelude, if we want such a split. Then we could test it well before it is used by the base library.

That said, I would appreciate a Haskell2010 option, just like there was a Haskell98 one, so that we can compile old programs without changes, even by using some Compat version of the standard library. Would that satisfy the need for stability?

PS And since all experts were beginners some time ago, I beg that we do not call them "peripheral".
--
  Best regards
    Michał

On Monday, 5 October 2015, Malcolm Wallace <malcolm...@me.com> wrote:
On other social media forums, I am seeing educators who use Haskell as a vehicle for their main work, but who would not consider themselves Haskell researchers, and who certainly do not have the time to follow Haskell mailing lists. They are beginning to say that these kinds of annoying breakages to the language, affecting their research and teaching materials, are disinclining them to continue using Haskell. They are feeling like they would be
(...) 


--
  Pozdrawiam
    Michał
_______________________________________________
Haskell-prime mailing list
Haskel...@haskell.org
http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-prime

Simon Thompson | Professor of Logic and Computation 
School of Computing | University of Kent | Canterbury, CT2 7NF, UK
s.j.th...@kent.ac.uk | M +44 7986 085754 | W www.cs.kent.ac.uk/~sjt


Mario Blažević

Oct 5, 2015, 9:11:13 AM
to haskel...@haskell.org
On 15-10-05 05:59 AM, Simon Thompson wrote:
> Hello all. I write this to be a little provocative, but …
>
> ...
> Other languages do evolve, but in a managed and reflective way.

Citation needed.

_______________________________________________
Haskell-Cafe mailing list
Haskel...@haskell.org
http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe

Sven Panne

Oct 5, 2015, 9:28:00 AM
to Simon Thompson, Michał J Gajda, haskell-prime@haskell.org List, Haskell Libraries, haskell cafe, Graham Hutton
2015-10-05 11:59 GMT+02:00 Simon Thompson <s.j.th...@kent.ac.uk>:
[...] It’s really interesting to have this discussion, which pulls in all sorts of well-made points about orthogonality, teaching, the evolution of the language and so on, but it simply goes to show that the process of evolving Haskell is profoundly broken. [...]

I wouldn't necessarily call the process "broken", but it's a bit annoying: Because of the constant flux of minor changes in the language and the libraries, I've reached the stage where I'm totally unable to tell if my code will work for the whole GHC 7.x series. The only way I see is doing heavy testing on Travis CI and littering the code with #ifdefs after compilation failures. (BTW: Fun exercise: Try using (<>) and/or (<$>) in conjunction with -Wall. Bonus points for keeping the #ifdefs centralized. No clue how to do that...)

This is less than satisfactory IMHO, and I would really prefer some other mode for introducing such changes: Perhaps these should be bundled and released e.g. every 2 years as Haskell2016, Haskell2018, etc. This way some stuff which belongs together (AMP, FTP, kicking out return, etc.) comes in slightly larger, but more sensible chunks.
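To illustrate the kind of #ifdef this forces (a sketch; MIN_VERSION_base is the macro Cabal generates, and the import guard shown is the usual workaround, not anything from this thread):

```haskell
{-# LANGUAGE CPP #-}
module Example where

-- On base >= 4.8 (GHC 7.10), (<$>) and (<*>) are exported by the
-- Prelude, so importing them again triggers a redundant-import
-- warning under -Wall; on older bases the import is required.
#if !MIN_VERSION_base(4,8,0)
import Control.Applicative ((<$>), (<*>))
#endif

addPair :: Maybe Int -> Maybe Int -> Maybe Int
addPair mx my = (+) <$> mx <*> my
```

Without some central compat module, this guard ends up repeated in every module that uses the operators.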

Don't get me wrong: Most of the proposed changes are OK in themselves and should be done; it's only the way they are introduced that should be improved...

mant...@gsd.uwaterloo.ca

Oct 5, 2015, 9:44:07 AM
to Sven Panne, Simon Thompson, Michał J Gajda, Graham Hutton, Haskell Libraries, haskell cafe, haskell-prime@haskell.org List
Well, there are the *compat packages:

base-compat
transformers-compat
mtl-compat

Etc. They do centralize the ifdefs and give you compatibility with ‎GHC 7.*. I recently adopted the last two and they work like a charm. I have yet to adopt base-compat, so I don't know what the experience is with it.
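For illustration, the way these packages centralize the ifdefs is roughly this, assuming base-compat's Prelude.Compat module (which re-exports a uniform Prelude across GHC versions):

```haskell
{-# LANGUAGE NoImplicitPrelude #-}
module Example where

-- Prelude.Compat (from base-compat) exports (<$>), (<*>), Monoid,
-- and friends uniformly, so this module compiles without CPP on
-- both pre- and post-AMP versions of base.
import Prelude.Compat

total :: [Int] -> Int
total xs = sum ((+ 1) <$> xs)
```

The version-specific #ifdefs then live once, inside the compat package, instead of in every client module.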

Michał

Herbert Valerio Riedel

Oct 5, 2015, 10:33:04 AM
to Sven Panne, Haskell Libraries, Graham Hutton, haskell cafe, haskell-prime@haskell.org List

I think that part of the reason we have seen these changes occur in a
"constant flux" rather than in bigger coordinated chunks is that faith
in the Haskell Report process was (understandably) abandoned. And
without the Haskell Report as some kind of "clock generator" with which
to align/bundle related changes into logical units, changes occur
whenever they're proposed and agreed upon (which may take several
attempts as we've seen with the AMP and others).

I hope that the current attempt to revive the Haskell Prime process will
give us a chance to clean up the unfinished intermediate `base-4.8`
situation we're left with now after AMP, FTP et al, as the next Haskell
Report revision provides us with a milestone to work towards.

That being said, there's also the desire to have changes field-tested by
a wide audience on a wide range of code before integrating them into a
Haskell Report. Also I'm not sure there would be fewer complaints if
AMP/FTP/MFP/MRP/etc., as part of a new Haskell Report, were switched on
all at once in e.g. `base-5.0`, breaking almost *every* single package
out there at once.

For language changes we have a great way to field-test new extensions
before integrating them into the Report via `{-# LANGUAGE #-}` pragmas
in a nicely modular and composable way (i.e. a package enabling a
certain pragma doesn't require other packages to use it as well) which
have proven to be quite popular.

However, for the library side we lack a comparable mechanism at this
point. The closest we have, for instance, to support an isolated
Haskell2010 legacy environment is to use RebindableSyntax which IMO
isn't good enough in its current form[1]. And then there's the question
whether we want a Haskell2010 legacy environment that's isolated or
rather shares the types & typeclasses w/ `base`. If we require sharing
types and classes, then we may need some facility to implicitly
instantiate new superclasses (e.g. implicitly define Functor and
Applicative if only a Monad instance is defined). If we don't want to
share types & classes, we run into the problem that we can't easily mix
packages which depend on different revisions of the standard-library
(e.g. one using `base-4.8` and others which depend on a legacy
`haskell2010` base-API). One way to solve this could be to mutually
exclude depending on both `base-4.8` and `haskell2010` in the same
install-plan (assuming `haskell2010` doesn't itself depend on
`base-4.8`).
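A facility like that might take a legacy Monad instance and generate its superclass instances mechanically. This hand-written sketch (the Identity type is just an illustration) shows what the generated instances would amount to, since fmap and (<*>) are definable from any Monad:

```haskell
import Control.Monad (ap, liftM)

newtype Identity a = Identity { runIdentity :: a }

-- A legacy-style definition provides only return and (>>=) ...
instance Monad Identity where
  return = Identity
  Identity x >>= f = f x

-- ... and the superclass instances the facility would derive are
-- fully determined by the Monad instance:
instance Functor Identity where
  fmap = liftM

instance Applicative Identity where
  pure = Identity
  (<*>) = ap
```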

In any case, I think we will have to think hard how to address
language/library change management in the future, especially if the
Haskell code-base continues to grow. Even just migrating the code base
between Haskell Report revisions is a problem. An extreme example
is the Python 2->3 transition which the Python ecosystem is still
suffering from today (afaik). Ideas welcome!

[1]: IMO, we need something to be used at the definition site providing
desugaring rules, rather than requiring the use-site to enable a
generalised desugaring mechanism; I've been told that Agda has an
interesting solution to this in its base libraries via
{-# LANGUAGE BUILTIN ... #-} pragmas.


Regards,
H.V.Riedel

Gershom B

Oct 5, 2015, 10:33:45 AM
to Graham Hutton, Simon Thompson, Michał J Gajda, Haskell Libraries, haskell cafe, haskell-prime@haskell.org List
On October 5, 2015 at 6:00:00 AM, Simon Thompson (s.j.th...@kent.ac.uk) wrote:
> Hello all. I write this to be a little provocative, but …
>
> It’s really interesting to have this discussion, which pulls in all sorts of well-made
> points about orthogonality, teaching, the evolution of the language and so on, but it
> simply goes to show that the process of evolving Haskell is profoundly broken.
>
> Other languages do evolve, but in a managed and reflective way. Simply throwing in changes
> that would have a profound impact on systems that are commercially and potentially safety
> critical in an à la carte, offhand, way seems like a breakdown of the collective responsibility
> of the Haskell community to its users and, indirectly, to its future.

Hi Simon. I do in fact think this is provocative :-P

I want to object here to your characterization of what has been going on as “simply throwing in changes”. The proposal seems very well and carefully worked through to provide a good migration strategy, even planning to alter the source of GHC to ensure that adequate hints are given for the indefinite transition period.

I also want to object to the idea that these changes would have “a profound impact on systems”. As it stands, and I think this is an important criterion in any change, when “phase 2” goes into effect, code that has compiled before may cease to compile until a minor change is made. However, code that continues to compile will continue to compile with the same behavior.

Now as to process itself, this is a change to core libraries. It has been proposed on the libraries list, which seems appropriate, and a vigorous discussion has ensued. This seems like a pretty decent process to me thus far. Do you have a better one in mind?

—Gershom

P.S. as a general point, I sympathize with concerns about breakage resulting from this, but I also think that the migration strategy proposed is good, and if people are concerned about breakage I think it would be useful if they could explain where they feel the migration strategy is insufficient to allay their concerns.

Bryan O'Sullivan

Oct 5, 2015, 10:59:42 AM
to Gershom B, haskell-prime@haskell.org List, Graham Hutton, Michał J Gajda, Haskell Libraries, haskell cafe
I would like to suggest that the bar for breaking all existing libraries, books, papers, and lecture notes should be very high; and that the benefit associated with such a breaking change should be correspondingly huge.

This proposal falls far short of both bars, to the extent that I am astonished and disappointed it is being seriously discussed – and to general approval, no less – on a date other than April 1. Surely some design flaws have consequences so small that they are not worth fixing.

I'll survive if it goes through, obviously, but it will commit me to a bunch of pointless make-work and compatibility ifdefs. I've previously expressed my sense that cross-version compatibility is a big tax on library maintainers. This proposal does not give me confidence that this cost is being taken seriously.

Thanks,
Bryan.

Gershom B

Oct 5, 2015, 11:10:41 AM
to Bryan O'Sullivan, haskell-prime@haskell.org List, Graham Hutton, Michał J Gajda, Haskell Libraries, haskell cafe
On October 5, 2015 at 10:59:35 AM, Bryan O'Sullivan (b...@serpentine.com) wrote:
> I would like to suggest that the bar for breaking all existing libraries, books, papers,
> and lecture notes should be very high; and that the benefit associated with such a breaking
> change should be correspondingly huge.
>

My understanding of the argument here, which seems to make sense to me, is that the AMP already introduced a significant breaking change with regards to monads. Books and lecture notes have already not caught up to this, by and large. Hence, if we introduce a further change, which _completes_ the general AMP project, then by the time books and lecture notes are all updated, they will be able to tell a much nicer story than the current one?

As for libraries, it has been pointed out, I believe, that without CPP one can write instances compatible with AMP, and also with AMP + MRP. One can also write code, sans CPP, compatible with pre- and post- AMP.
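A sketch of the sort of CPP-free instance I take this to mean, for a made-up Box type:

```haskell
import Control.Applicative (Applicative(..)) -- needed on GHC < 7.10;
                                             -- redundant on >= 7.10
import Control.Monad (ap)

newtype Box a = Box a

instance Functor Box where
  fmap f (Box a) = Box (f a)

instance Applicative Box where
  pure = Box
  (<*>) = ap

-- Since return is defined as pure, this instance is valid pre-AMP,
-- post-AMP, and would stay valid if MRP removes return from Monad.
instance Monad Box where
  return = pure
  Box a >>= f = f a
```

(The explicit Control.Applicative import is what Sven's -Wall point elsewhere in the thread is about: it is required on old GHCs and flagged as redundant on new ones.)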

So the reason for choosing not to do MRP simultaneously with AMP was precisely to allow a gradual migration path where, sans CPP, people could write code compatible with the last three versions of GHC, as the general criterion has been.

So without arguing the necessity or not, I just want to weigh in with a technical opinion that if this goes through, my _estimation_ is that there will be a smooth and relatively painless migration period, the sky will not fall, good teaching material will remain good, those libraries that bitrot will tend to do so for a variety of reasons more significant than this, etc.

It is totally reasonable to have a discussion on whether this change is worth it at all. But let’s not overestimate the cost of it just to further tip the scales :-)

—gershom

Johan Tibell

Oct 5, 2015, 11:13:09 AM
to haskell cafe, Haskell Libraries
Perhaps we should weigh the +1 and -1s in this thread with the number of lines of Haskell written by the voter? ;)

_______________________________________________
Libraries mailing list
Libr...@haskell.org
http://mail.haskell.org/cgi-bin/mailman/listinfo/libraries

Henning Thielemann

Oct 5, 2015, 1:11:36 PM
to Johan Tibell, Haskell Libraries, haskell cafe

On Mon, 5 Oct 2015, Johan Tibell wrote:

> Perhaps we should weigh the +1 and -1s in this thread with the number of
> lines of Haskell written by the voter? ;)

My preferred measure would be the number of Haskell packages hosted at
hub.darcs.net. :-)

Dimitri DeFigueiredo

Oct 5, 2015, 1:28:08 PM
to haskel...@haskell.org
+1

I think this idea is good and should not be taken lightly. I'm a newcomer to the community and currently hold a grand total of *zero* open source contributions. Obviously, I would like to change this soon, but I think it is very *unfair* and makes absolutely no sense to have the standard one person one vote rule for decisions involving the libraries.

Let the code produced vote. Maybe weight them by downloads?

Dimitri

John Lato

Oct 5, 2015, 2:33:59 PM
to Dimitri DeFigueiredo, haskel...@haskell.org

I think code-based weighting should not be considered at all. There's a non-trivial amount of proprietary code that wouldn't be counted, and likely can't be accounted for accurately. I have some unknown amount of code at my previous employer (which does include Monad instances) that now has to be maintained by others. I may have said +1 (and I'm reconsidering), but the current maintainers probably don't want the extra make-work this involves.

There's an argument that this proposal involves very little up-front breakage. This is true but disingenuous, because the breakage will still happen in the future unless changes are made and committed. The entire maintenance load should be considered, not just the immediate load.

Bryan, nothing I've seen since I started using Haskell makes me think that the libraries committee or the GHC devs value backward compatibility much at all. My favorite example is RecursiveDo, which was deprecated in favor of DoRec, only for that to be reversed in the next GHC release. Of course that was more irritating than breaking, but it's indicative of the general value placed on maintenance programming.

Gregory Collins

Oct 5, 2015, 2:34:42 PM
to Gershom B, haskell-prime@haskell.org List, Graham Hutton, Haskell Libraries, Michał J Gajda, haskell cafe

On Mon, Oct 5, 2015 at 8:09 AM, Gershom B <gers...@gmail.com> wrote:
My understanding of the argument here, which seems to make sense to me, is that the AMP already introduced a significant breaking change with regards to monads. Books and lecture notes have already not caught up to this, by and large. Hence, by introducing a further change, which _completes_ the general AMP project, then by the time books and lecture notes are all updated, they will be able to tell a much nicer story than the current one?

This is a multi-year, "boil the ocean"-style project, affecting literally every Haskell user, and the end result after all of this labor is going to be... a slightly spiffier bike shed?

Strongly -1 from me also. My experience over the last couple of years is that every GHC release breaks my libraries in annoying ways that require CPP to fix:

~/personal/src/snap λ  find . -name '*.hs' | xargs egrep '#if.*(MIN_VERSION)|(GLASGOW_HASKELL)' | wc -l
64

As a user this is another bikeshedding change that is not going to benefit me at all. Maintaining a Haskell library can be an exasperating exercise of running on a treadmill to keep up with busywork caused by changes to the core language and libraries. My feeling is starting to become that the libraries committee is doing as much (if not more) to cause problems and work for me than it is doing to improve the common infrastructure.

G
--
Gregory Collins <gr...@gregorycollins.net>

Nathan Bouscal

Oct 5, 2015, 2:50:54 PM
to haskel...@haskell.org
I'm a strong +1 on this (and on Edward's addendum to it), and I object to the characterization of this change as having only trivial benefits. Simplicity and correctness are never trivial. In my view, disagreement on how this change is made (migration strategy, etc.) is completely reasonable, but disagreement on whether to make it is not.

There have been a lot of objections based on the idea that learners will consult books that are out of date, but the number of learners affected by this is dwarfed by the number of learners who will use updated materials and be confused by this strange historical artifact. Permanently-enshrined historical artifacts accrete forever and cause linear confusion, whereas outdated materials are inevitably replaced such that the amount of confusion remains constant.

There was also an argument that Haskell has a “window of opportunity”, implying that breakages are more likely to cost us future valued community members than historical artifacts are. I couldn't disagree more. If it weren't for Haskell's past willingness to make changes when we learned better ways of doing things, I doubt I would presently be using the language. I would much rather add a marginal community member with a strong preference for cleanliness, simplicity, and correctness than one with a strong preference against making occasional small changes to their code.

Sven Panne

Oct 5, 2015, 2:59:12 PM
to Gershom B, haskell-prime@haskell.org List, Graham Hutton, Haskell Libraries, Michał J Gajda, haskell cafe
2015-10-05 17:09 GMT+02:00 Gershom B <gers...@gmail.com>:
On October 5, 2015 at 10:59:35 AM, Bryan O'Sullivan (b...@serpentine.com) wrote:
[...] As for libraries, it has been pointed out, I believe, that without CPP one can write instances compatible with AMP, and also with AMP + MRP. One can also write code, sans CPP, compatible with pre- and post- AMP. [...]

Nope, at least not if you care about -Wall: If you take e.g. (<$>), which is now part of the Prelude, you can't simply import some compatibility module, because GHC might tell you (rightfully) that the import is redundant, since (<$>) is already visible through the Prelude. So you'll have to use CPP to avoid that import on base >= 4.8, be it from Data.Functor, Control.Applicative or some compat-* module. And you'll have to use CPP in each and every module using (<$>) then, unless I'm missing something obvious. AFAICT all transitioning guides ignore -Wall and friends...

Johan Tibell

Oct 5, 2015, 3:01:54 PM
to Gregory Collins, Gershom B, haskell-prime@haskell.org List, Graham Hutton, Michał J Gajda, Haskell Libraries, haskell cafe
On the libraries I maintain and have a copy of on my computer right now: 329

Erik Hesselink

Oct 5, 2015, 3:02:58 PM
to Sven Panne, Gershom B, haskell-prime@haskell.org List, Graham Hutton, Michał J Gajda, Haskell Libraries, haskell cafe
Does the hack mentioned on the GHC trac [1] work for this? It seems a
bit fragile but that page says it works and it avoids CPP.

Erik

[1] https://ghc.haskell.org/trac/ghc/wiki/Migration/7.10#GHCsaysTheimportof...isredundant
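For reference, the workaround on that page is (as I understand it) to import the compatibility module open and then import Prelude explicitly afterwards:

```haskell
module Example where

-- Importing Control.Applicative with no explicit import list, and
-- then importing Prelude explicitly after it, stops GHC from
-- flagging the first import as redundant under -Wall: the
-- overlapping names are attributed to the later Prelude import.
import Control.Applicative
import Prelude

doubled :: Maybe Int -> Maybe Int
doubled mx = (* 2) <$> mx
```

As noted in the thread, this depends on keeping the import list open, which is the fragility being discussed.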

Johan Tibell

Oct 5, 2015, 3:06:05 PM
to Erik Hesselink, Gershom B, Haskell Libraries, haskell-prime@haskell.org List, Graham Hutton, Michał J Gajda, haskell cafe
On Mon, Oct 5, 2015 at 9:02 PM, Erik Hesselink <hess...@gmail.com> wrote:
On 5 October 2015 at 20:58, Sven Panne <sven...@gmail.com> wrote:
> 2015-10-05 17:09 GMT+02:00 Gershom B <gers...@gmail.com>:
>>
>> [...] As for libraries, it has been pointed out, I believe, that without
>> CPP one can write instances compatible with AMP, and also with AMP + MRP.
>> One can also write code, sans CPP, compatible with pre- and post- AMP. [...]
>
> Nope, at least not if you care about -Wall: If you take e.g. (<$>) which is
> now part of the Prelude, you can't simply import some compatibility module,
> because GHC might tell you (rightfully) that that import is redundant,
> because (<$>) is already visible through the Prelude. So you'll have to use
> CPP to avoid that import on base >= 4.8, be it from it Data.Functor,
> Control.Applicative or some compat-* module. And you'll have to use CPP in
> each and every module using <$> then, unless I miss something obvious.
> AFAICT all transitioning guides ignore -Wall and friends...

Does the hack mentioned on the GHC trac [1] work for this? It seems a
bit fragile but that page says it works and it avoids CPP.

No it doesn't, if you also don't want closed import lists (which you should).
 

Christopher Allen

unread,
Oct 5, 2015, 3:08:15 PM10/5/15
to Johan Tibell, Gershom B, Haskell Libraries, haskell-prime@haskell.org List, Graham Hutton, Michał J Gajda, haskell cafe
I'm writing a book (http://haskellbook.com/) with my coauthor. It is up to date with GHC 7.10. AMP made things better, not harder, with respect to teaching Haskell. BBP required some explanation of "ignore this type, we're asserting a different one", but the positives are still better than the negatives.

Please don't use existing or forthcoming books as an excuse to do or not-do things. Do what's right for the users of the language.





--
Chris Allen
Currently working on http://haskellbook.com

Marcin Mrotek

Oct 5, 2015, 3:21:58 PM
to John Lato, Haskell-Cafe
> There's an argument that this proposal involves very little up-front
> breakage. This is true but disingenuous, because the breakage will still
> happen in the future unless changes are made and committed. The entire
> maintenance load should be considered, not just the immediate load.

I don't have much to say about breakage (and thus I've refrained from
voting), as a maintainer of just 7 packages, of which exactly zero are
compatible with anything other than the latest GHC (well, no one has
complained, I'm rather lazy, and in my defense I can't even run
GHC 7.10.1 on my Arch Linux box because of inappropriate versions of
dynamically linked libraries). But an honest question: does this
proposal entail anything other than replacing every "return" with
"pure" and conditionally compiling "return = pure" in Monad
instances?

Best regards,
Marcin Mrotek

mant...@gsd.uwaterloo.ca

Oct 5, 2015, 5:21:21 PM
to haskell cafe, Haskell Libraries, haskell-prime@haskell.org List
I loved the statement made by Ryan Trinkle in his recent talk [1]: "since we cannot predict all the changes needed in the future, we must have an architecture that will support any change" (it's roughly like that, not an exact quote, about 4 minutes in).

Having to use IFDEFs is a sign that the language itself is not expressive enough to deal with these kinds of changes, and therefore doesn't have the right "architecture". Is Backpack going to provide the necessary support (e.g., the ability to mix in package fragments with extensions instead of making breaking changes)? What kind of language features are still missing?

Anyway, mistakes were made in the past, because had we known any better we would have done better. Nothing should prevent us from fixing mistakes.

Regarding IFDEFs, all of them disappeared from my code once I used transformers-compat and mtl-compat. ‎Also the trick with import Prelude worked: you just have to list what you hide instead of what you import (which is weird...).

--
Michał

[1] http://www.infoq.com/presentations/functional-techniques-complexity

Adam Foltzer

Oct 5, 2015, 5:23:18 PM
to Herbert Valerio Riedel, haskell-prime@haskell.org List, Graham Hutton, Haskell Libraries, haskell cafe
Thanks for splitting this off, as it really deserves its own conversation.

% find cryptol -name '*.hs' | xargs egrep '#if.*(MIN_VERSION)|(GLASGOW_HASKELL)' | wc -l
      49
% find saw-script -name '*.hs' | xargs egrep '#if.*(MIN_VERSION)|(GLASGOW_HASKELL)' | wc -l
     242

I introduced most of these in order to accommodate AMP, and now I learn that there is another proposal that is considered part-and-parcel with AMP, where I will have to make yet more changes to the same code and presumably introduce another layer of #ifdefs. As proposed, I will spend 2*n hours implementing, testing, and releasing these changes. Had both changes been bundled, it would have been more like n+ε.

 Also I'm not sure if there would be less complaints if AMP/FTP/MFP/MRP/etc as part of a new Haskell Report would be switched on all at once in e.g. `base-5.0`, breaking almost *every* single package out there at once.

I doubt the number of complaints-per-change would be fewer, but I'm strongly in favor of moving away from what feels like a treadmill that doesn't value the time of developers and that doesn't account for the more-than-sum-of-parts cost of the "constant flux".

Thanks,
Adam

Bryan Richter

Oct 5, 2015, 6:19:04 PM
to mant...@gsd.uwaterloo.ca, Haskell Libraries, haskell-prime@haskell.org List, Graham Hutton, Michał J Gajda, haskell cafe
On Mon, Oct 5, 2015 at 06:43-0700, mantkiew wrote:
>
> Well, there are the *compat packages:
>
> Base-compat
> Transformers-compat
> Mtl-compat 
>
> Etc. They do centralize the ifdefs and give you compatibility with
> GHC 7.*. I recently adopted the last two ones and they work like
> a charm. I am yet to adopt base-compat, so I don't know what the
> experience is with it.

Hang on a moment, are you saying that all the people writing to argue
that these changes would require them to write dozens more #ifdef's
actually don't have to write any at all? I never knew what the *-compat
packages were all about. If that's what they're designed to do, I have a
feeling they have not gotten *nearly* enough exposure.

[Apologies for possible cross-posting; this thread jumped into my inbox
from I-know-not-where and already has half a dozen CCs attached.]

Alexander Berntsen

Oct 5, 2015, 6:29:43 PM
to haskel...@haskell.org

On 05/10/15 20:50, Nathan Bouscal wrote:
> There have been a lot of objections based on the idea that
> learners will consult books that are out of date, but the number of
> learners affected by this is dwarfed by the number of learners who
> will use updated materials and be confused by this strange
> historical artifact. Permanently-enshrined historical artifacts
> accrete forever and cause linear confusion, whereas outdated
> materials are inevitably replaced such that the amount of confusion
> remains constant.
Thank you for making this point.

I would be very saddened if the appeal to history (i.e. technical
debt) were to halt our momentum. That's what happens to most things both
in and out of computer science. And it's honestly depressing.
--
Alexander
alex...@plaimi.net
https://secure.plaimi.net/~alexander

Tony Morris

Oct 5, 2015, 6:31:35 PM
to haskel...@haskell.org

+1

On 06/10/15 08:29, Alexander Berntsen wrote:
> On 05/10/15 20:50, Nathan Bouscal wrote:
>> There have been a lot of objections based on the idea that
>> learners will consult books that are out of date, but the number
>> of learners affected by this is dwarfed by the number of
>> learners who will use updated materials and be confused by this
>> strange historical artifact. Permanently-enshrined historical
>> artifacts accrete forever and cause linear confusion, whereas
>> outdated materials are inevitably replaced such that the amount
>> of confusion remains constant.
> Thank you for making this point
>
> I would be very saddened if the appeal to history (i.e. technical
> debt) would halt our momentum. That's what happens to most things
> both in and out of computer science. And it's honestly depressing.
>

Gregory Collins

Oct 5, 2015, 8:40:55 PM
to Bryan Richter, haskell-prime@haskell.org List, Graham Hutton, Michał J Gajda, Haskell Libraries, haskell cafe

On Mon, Oct 5, 2015 at 3:18 PM, Bryan Richter <b...@chreekat.net> wrote:
Hang on a moment, are you saying that all the people writing to argue
that these changes would require them to write dozens more #ifdef's
actually don't have to write any at all?

Um, no, it usually isn't anything like that. Here's a sampling of some of the things I've used CPP for in the past few years:
  • After GHC 7.4, when using a newtype in FFI imports you need to import the constructor, i.e. "import Foreign.C.Types(CInt(..))" --- afaik CPP is the only way to shut up warnings everywhere
  • defaultTimeLocale moved from System.Locale to Data.Time.Format in time-1.5 (no compat package for this, afaik)
  • one of many various changes to Typeable in the GHC 7.* series (deriving works better now, mkTyCon vs mkTyCon3, etc)
  • Do I have to hide "catch" from Prelude, or not? It got moved, and "hiding" gives an error if the symbol you're trying to hide is missing. Time to break out the CPP (and curse myself for not just using the qualified import in the first place)
  • Do I get monoid functions from Prelude or from Data.Monoid? Same w/ Applicative, Foldable, Word. I don't know where anything is supposed to live anymore, or which sequence of imports will shut up spurious warnings on all four versions of GHC I support, so the lowest-friction fix is: break out the #ifdef spackle
  • ==# and friends return Int# instead of Bool after GHC 7.8.1
  • To use functions like "tryReadMVar", "unsafeShiftR", and "atomicModifyIORef'" that are in recent base versions but not older ones (this is a place where CPP use is actually justified)
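A representative sketch of the resulting CPP spackle (a hypothetical module; the guard and names are illustrative, not drawn from any particular library above):

```haskell
{-# LANGUAGE CPP #-}
-- Hypothetical example of the version-guarded imports described above.
module Compat.Example (parseCount) where

#if !MIN_VERSION_base(4,8,0)
-- Before base-4.8 (GHC 7.10), (<$>) lived only in Control.Applicative;
-- afterwards, importing it from there triggers a redundant-import warning.
import Control.Applicative ((<$>))
#endif

parseCount :: String -> Maybe Int
parseCount s = length <$> Just s
```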
--
Gregory Collins <gr...@gregorycollins.net>

wren romano

unread,
Oct 5, 2015, 8:49:28 PM10/5/15
to haskell-prime@haskell.org List, Haskell Libraries, haskell cafe
On Mon, Oct 5, 2015 at 5:23 PM, Adam Foltzer <acfo...@gmail.com> wrote:
>> Also I'm not sure if there would be less complaints if
>> AMP/FTP/MFP/MRP/etc as part of a new Haskell Report would be switched on all
>> at once in e.g. `base-5.0`, breaking almost *every* single package out there
>> at once.
>
> I doubt the number of complaints-per-change would be fewer, but I'm strongly
> in favor of moving away from what feels like a treadmill that doesn't value
> the time of developers and that doesn't account for the
> more-than-sum-of-parts cost of the "constant flux".

Broadly speaking, I'm a "fix it now rather than later" sort of person
in Haskell because I've seen how long things can linger before finally
getting fixed (even when everyone agrees on what the fix should be and
agrees that it should be done). However, as I mentioned in the
originating thread, I think that —at this point— when it comes to
AMP/FTP/MFP/MRP/etc we should really aim for the haskell' committee to
work out a comprehensive solution (as soon as possible), and then
enact all the changes at once when switching to
Haskell201X/base-5.0/whatevs. I understand the motivations for wanting
things to be field-tested before making it into the report, but I
don't think having a series of rapid incremental changes is the
correct approach here. Because we're dealing with the Prelude and the
core classes, the amount of breakage (and CPP used to paper over it)
here is much higher than our usual treadmill of changes; so we should
take that into account when planning how to roll the changes out.

--
Live well,
~wren

Michał Antkiewicz

unread,
Oct 5, 2015, 8:49:52 PM10/5/15
to Bryan Richter, haskell cafe
All I am saying is that these various *compat packages manage to
encapsulate the ifdefs to some degree.

Here's the record of my struggle [1]. Then came Edward K. and told me
to use transformers-compat [2]. Then Adam B. chimed in suggesting
mtl-compat [3]. That's it. All my IFDEFs related to this topic were
gone.

I looked at base-compat [4] and it's very interesting. The big question is:

**Why can't normal base be written the same way as base-compat?**
It would then automatically provide compatibility across all GHC versions.
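For context, the usage pattern base-compat enables is a single uniform import (a minimal sketch; it assumes base-compat is a dependency of the package):

```haskell
{-# LANGUAGE NoImplicitPrelude #-}
-- Prelude.Compat re-exports a post-AMP/FTP Prelude regardless of the
-- underlying base version, so the module itself needs no CPP.
module MyModule where

import Prelude.Compat

doubleAll :: (Functor f, Num a) => f a -> f a
doubleAll = fmap (* 2)
```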

Michał

[1] https://www.reddit.com/r/haskell/comments/3gqqu8/depending_on_both_mtl2131_for_stackage_lts2_and/
[2] https://www.reddit.com/r/haskell/comments/3gqqu8/depending_on_both_mtl2131_for_stackage_lts2_and/cu0l6nf
[3] https://www.reddit.com/r/haskell/comments/3gqqu8/depending_on_both_mtl2131_for_stackage_lts2_and/cu5g73q
[4] https://hackage.haskell.org/package/base-compat

Phil Ruffwind

unread,
Oct 5, 2015, 9:00:47 PM10/5/15
to Adam Foltzer, haskell-prime@haskell.org List, Graham Hutton, Herbert Valerio Riedel, Haskell Libraries, haskell cafe
Having so many #ifdefs isn't by itself a major problem. Yes, it does
introduce a small increase in compilation time and the size of the
codebase. The real cost is the developer time: every developer has to
come up with these #ifdef clauses from scratch for every change
that gets made, tailored to their specific code. As more and more get
added, it becomes more and more of a confusing mess.

It makes me wonder if this can be automated somehow. It would be nice
to have a mechanism to alleviate this cost so that most developers
downstream (provided that the code was written in a reasonable manner)
only need to make a minimal effort to keep up, while still being able
to write code that works for a reasonably large range of GHC versions.
The burden of breaking changes right now is on the downstream
developers, but perhaps there could be a way to shift most of that
upstream to avoid this large duplication of effort.

Haskell should be allowed to evolve, but there also needs to be a
counterweight mechanism that provides stability in the face of
constant changes. It would be something similar in spirit to
base-compat, but I don't think a library package alone is powerful
enough to solve the problem: a missing 'return' for example is not
something a library can just patch in.

I don't have any preference for "lots of small changes" vs "one big
change": in the former, there is a lot of overhead needed to keep
track of and fix these small changes; in the latter, there is a risk
of introducing a rift that fragments the community (cf Python 2 vs 3).
Maybe something in-between would be the best.

ami...@gmail.com

unread,
Oct 5, 2015, 9:20:23 PM10/5/15
to Alexander Berntsen, haskel...@haskell.org
IMO, the "tech debt" you're talking about feels very small. We've already made the change that return = pure by default. The historical baggage that this proposal cleans up is just the fact that legacy code is able to define its own "return" without breaking (which must be the same as the definition of pure anyway).
I am also moving from +0.5 to +0 on this.

Tom

ami...@gmail.com

unread,
Oct 5, 2015, 9:24:11 PM10/5/15
to Phil Ruffwind, Herbert Valerio Riedel, Graham Hutton, haskell-prime@haskell.org List, Haskell Libraries, haskell cafe
Another problem with #ifdefs (especially machine-generated ones) is that it makes code much harder to read. One of the things I love about Haskell is the ability to read code and literally see an author describe how they're thinking about the domain. #ifdefs make life less fun :)

Tom

Tony Morris

unread,
Oct 5, 2015, 9:52:34 PM10/5/15
to haskel...@haskell.org

-- I am going to do some logging, yay!
data Logs a =
  Logs a [a]
  deriving (Eq, Show)

-- one log message
singlelog ::
  a
  -> Logs a
singlelog a =
  Logs a []

-- two log messages
twologs ::
  a
  -> a
  -> Logs a
twologs a1 a2 =
  Logs a1 [a2]

class Semigroup a where
  (<>) ::
    a
    -> a
    -> a

-- I can append logs
instance Semigroup (Logs a) where
  Logs h1 t1 <> Logs h2 t2 =
    Logs h1 (t1 ++ h2 : t2)

-- I can map on Logs
instance Functor Logs where
  fmap f (Logs h t) =
    Logs (f h) (fmap f t)

-- I will collect logs with a value
data WriteLogs l a =
  WriteLogs (Logs l) a
  deriving (Eq, Show)

-- I can map the pair of logs and a value
instance Functor (WriteLogs l) where
  fmap f (WriteLogs l a) =
    WriteLogs l (f a)

singlewritelog ::
  l
  -> a
  -> WriteLogs l a
singlewritelog l a =
  WriteLogs (singlelog l) a

-- Monad without return
class Bind f where
  (-<<) ::
    (a -> f b)
    -> f a
    -> f b

-- Can I Applicativate WriteLogs? Let's see.
instance Applicative (WriteLogs l) where
  -- Well that was easy.
  WriteLogs l1 f <*> WriteLogs l2 a =
    WriteLogs (l1 <> l2) (f a)
  pure a =
    WriteLogs (error "wait, what goes here?") a
  -- Oh I guess I cannot Applicativate WriteLogs, but I can Apply them!

-- Well there goes that idea.
-- instance Monad (WriteLogs l) where

-- Wait a minute, can I bind WriteLogs?
instance Bind (WriteLogs l) where
  -- Of course I can!
  f -<< WriteLogs l1 a =
    let WriteLogs l2 b = f a
    in WriteLogs (l1 <> l2) b

-- OK here goes ...
myprogram ::
  WriteLogs String Int
myprogram =
  -- No instance for (Monad (WriteLogs String))
  -- RAR!, why does do-notation require extraneous constraints?!
  -- Oh that's right, Haskell is broken.
  -- Oh well, I guess I need to leave Prelude turned off and rewrite
  -- the base libraries.
  do a <- singlewritelog "message" 18
     b <- WriteLogs (twologs "hi" "bye") 73
     WriteLogs (singlelog "exit") (a * b)

-- One day, one day soon, I can move on.


On 06/10/15 11:20, ami...@gmail.com wrote:
> IMO, the "tech debt" you're talking about feels very small. We've
> already made the change that return = pure by default. The
> historical baggage that this proposal cleans up is just the fact
> that legacy code is able to define its own "return" without
> breaking (which must be the same as the definition of pure
> anyway). I am also moving from +0.5 to +0 on this.
>
> Tom
>
>
>> El 5 oct 2015, a las 18:29, Alexander Berntsen
>> <alex...@plaimi.net> escribió:
>>

>>>> On 05/10/15 20:50, Nathan Bouscal wrote: There have been a
>>>> lot of objections based on the idea that learners will
>>>> consult books that are out of date, but the number of
>>>> learners affected by this is dwarfed by the number of
>>>> learners who will use updated materials and be confused by
>>>> this strange historical artifact. Permanently-enshrined
>>>> historical artifacts accrete forever and cause linear
>>>> confusion, whereas outdated materials are inevitably replaced
>>>> such that the amount of confusion remains constant.
> Thank you for making this point
>
> I would be very saddened if the appeal to history (i.e. technical
> debt) would halt our momentum. That's what happens to most things
> both in and out of computer science. And it's honestly depressing.

>> _______________________________________________ Haskell-Cafe
>> mailing list Haskel...@haskell.org
>> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe
> _______________________________________________ Haskell-Cafe
> mailing list Haskel...@haskell.org
> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe
>


Rustom Mody

unread,
Oct 5, 2015, 11:44:09 PM10/5/15
to Haskell Cafe
On Tue, Oct 6, 2015 at 4:29 AM, Doug McIlroy <do...@cs.dartmouth.edu> wrote:
> littering the code with #ifdefs

>  Bonus points for keeping the #ifdefs centralized

Littering is the right word. "Bonus" merely means fewer negative points.

It is ironic that the beautiful Haskell should progress by
adopting the worst feature of C. #ifdef is a sticking-plaster
for non-portable code. It is inserted at global level, usually
to effect changes at the bottom of the code hierarchy. (Maybe
in Haskell it should be a monad, in competition with IO for
the outermost layer.)

Plan 9 showed the way out of ifdef, by putting (non-ifdef-ed)
inescapably nonportable code, such as endianity, in compilation
units and #include files in a distinct part of the file system
tree selected by the shell.

Another trick is to use plain if rather than #if or #ifdef,
and let the compiler optimize away the unwanted side of the
branch.
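In Haskell, that trick might look like this (a hedged sketch; the names are illustrative):

```haskell
import Data.Bits (finiteBitSize)

-- An ordinary constant instead of an #ifdef on word size; GHC's
-- simplifier can eliminate the branch that is statically dead.
is64Bit :: Bool
is64Bit = finiteBitSize (0 :: Int) == 64

hashWidthLabel :: String
hashWidthLabel = if is64Bit then "64-bit hashing" else "32-bit hashing"
```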


In any event, #ifdef is, as Simon evidently agrees, telling
evidence of non-portability. It is hard to imagine an uglier
"solution" to keeping up with the drip-drip-drip of Haskell
evolution.

 
The python 2→3 transition may have caused some pain to some. However it would have been far more difficult if they had not provided tools like 2to3 and six:
https://docs.python.org/2/library/2to3.html
http://pythonhosted.org/six/

Since the change here is trivial and pervasive, why is such a tool not being considered?
[Or have I missed the discussion?]

In particular we can envisage a tool say 8to10 that has these modes (command-line flags)
--portable
--readable
--info

with the idea that
- portable is most unreadable (ifdef litter)
- readable is not portable -- generate a copy of the whole source tree that will have the changes and not be compatible for ghc < 7.8
  This is for those library authors that prefer to maintain a clean code base and hell with earlier ghc versions
- info summarizes which files would change and how so that a suitable reorganization of the files/directories can be done prior to the switch
  This is for those library authors that would like to take the trouble to have a compatibility layer and have code as readable as possible overall

[Whether pure is really preferable to return is another matter --
For me one of the strongest criticisms of the python 2→3 switch is this that if they were going to break things anyhow why was the cleanup not more far-reaching?

Lets keep this note in these parentheses :-)

]

Ivan Lazar Miljenovic

unread,
Oct 6, 2015, 3:04:14 AM10/6/15
to Gregory Collins, Haskell Libraries, haskell cafe
On 6 October 2015 at 11:40, Gregory Collins <gr...@gregorycollins.net> wrote:
>
> defaultTimeLocale moved from System.Locale to Data.Time.Format in time-1.5
> (no compat package for this, afaik)

http://hackage.haskell.org/package/time-locale-compat

--
Ivan Lazar Miljenovic
Ivan.Mi...@gmail.com
http://IvanMiljenovic.wordpress.com

Herbert Valerio Riedel

unread,
Oct 6, 2015, 3:10:29 AM10/6/15
to Johan Tibell, Gershom B, Haskell Libraries, haskell-prime@haskell.org List, Graham Hutton, Michał J Gajda, haskell cafe
On 2015-10-05 at 21:01:16 +0200, Johan Tibell wrote:
> On Mon, Oct 5, 2015 at 8:34 PM, Gregory Collins <gr...@gregorycollins.net>
[...]

>> Strongly -1 from me also. My experience over the last couple of years is
>> that every GHC release breaks my libraries in annoying ways that require
>> CPP to fix:
>>
>> ~/personal/src/snap λ find . -name '*.hs' | xargs egrep
>> '#if.*(MIN_VERSION)|(GLASGOW_HASKELL)' | wc -l
>> 64

[...]

> On the libraries I maintain and have a copy of on my computer right now: 329


Although this was already pointed out to you in a response to a Tweet of
yours, I'd like to expand on this here to clarify:


You say that you stick to the 3-major-ghc-release support-window
convention for your libraries. This is good, because then you don't need
any CPP at all! Here's why:

So when GHC 8.2 is released, your support-window requires you to support
GHC 7.10 and GHC 8.0 in addition to GHC 8.2.

At this point you'll be happy that you can start dropping those #ifdefs
you added for GHC 7.10 in your code in order to adapt to FTP & AMP.

And when you do *that*, you can also drop all your `return = pure`
methods overrides. (Because we prepared for MRP already in GHC 7.10 by
introducing the default implementation for `return`!)

This way, you don't need to introduce any CPP whatsoever due to MRP!
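Spelled out as code, the before/after migration described here looks roughly like this (a sketch with a placeholder type `M`):

```haskell
-- A placeholder monad for illustration.
newtype M a = M a

instance Functor M where
  fmap f (M a) = M (f a)

instance Applicative M where
  pure = M
  M f <*> M a = M (f a)

-- GHC < 7.10 required an explicit 'return' here; since base-4.8 it
-- defaults to 'pure', so post-AMP code can simply omit it:
instance Monad M where
  M a >>= k = k a
```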


Finally, we're not gonna remove `return` in GHC 8.2 anyway; GHC 8.2
was just the *earliest theoretically possible* GHC release in which this
*could* have happened. Realistically, this would happen at a much later
point, say GHC 8.6 or even later! Therefore, the scheme above would
actually work for 5-year time-windows! And there's even an idea on the
table to have a lawful `return = pure` method override be tolerated by
GHC even when `return` has already moved out of `Monad`!


PS: I'm a bit disappointed you seem to dismiss this proposal right away
categorically without giving us a chance to address your
concerns. The proposal is not a rigid all-or-nothing thing that
can't be tweaked and revised. That's why we're having these
proposal-discussions in the first place (rather than doing blind
+1/-1 polls), so we can hear everyone out and try to maximise the
agreement (even if we will never reach 100% consensus on any
proposal).

So please, keep on discussing!

Mike Meyer

unread,
Oct 6, 2015, 3:58:45 AM10/6/15
to Bardur Arantsson, Haskell-Cafe
On Tue, Oct 6, 2015 at 2:40 AM Bardur Arantsson <sp...@scientician.net> wrote:
>    - To use functions like "tryReadMVar", "unsafeShiftR", and

>    "atomicModifyIORef'" that are in recent base versions but not older ones
>    (this is a place where CPP use is actually justified)
Well, yeah, new functions don't magically appear in old versions. I
don't think anybody expects that :).

Right - they have to be backported, and imply maintenance of multiple branches.

Python has been mentioned a couple of times. They did that, backporting functions that mimicked Python 3 functions to later Python 2 versions, which made the tools mentioned possible, or at least more capable.

I'll refer those wanting fewer breaking changes to Python. Getting language changes accepted by the Python community is hard. You need a use case for your change, you need to show that the case can't be handled cleanly by the language as it exists today, that it's common enough to justify the change, and that it won't tempt people to write uglier code than the current solution. If you want to break old code, all those bars get raised - a lot.

Or did, anyway. The result of following that was the Python 2/Python 3 split, which I don't think anybody thought was a good thing. Unavoidable, maybe - Python 3 had to break backwards compatibility, so they cleaned up everything along the way, and created tools to migrate code, and committed to maintaining both versions for an extended period of time. And the community was pretty badly splintered by this.

The one thing I think they got right is documenting the language change process with the entire PEP process. We seem to have wiki pages, with no editorial oversight, and pages show up in google searches after they've been incorporated into the language in a different form. Not quite as good a thing.

Herbert Valerio Riedel

unread,
Oct 6, 2015, 4:31:41 AM10/6/15
to Rustom Mody, Haskell Cafe
On 2015-10-06 at 05:43:43 +0200, Rustom Mody wrote:

[...]

> The python 2→3 transition may have caused some pain to some. However it
> would have been far more difficult if they had not provided tools like
> 2to3 and six:
> https://docs.python.org/2/library/2to3.html
> http://pythonhosted.org/six/
>
> Since the change here is trivial and pervasive why is such a tool not being
> considered?
> [Or have I missed the discussion?]

It is being considered! :-)

I'm in the process of collecting rewrite recipes and samples on

https://github.com/hvr/Hs2010To201x

Alan is looking into how HaRe can be leveraged for this. I've refrained
from bringing this up, as I'm still waiting to reach a proof-of-concept
stage where we can confidently say that this actually works.


> In particular we can envisage a tool say 8to10 that has these modes
> (command-line flags)
> --portable
> --readable
> --info
>
> with the idea that
> - portable is most unreadable (ifdef litter)
> - readable is not portable -- generate a copy of the whole source tree that
> will have the changes and not be compatible for ghc < 7.8
> This is for those library authors that prefer to maintain a clean code
> base and hell with earlier ghc versions
> - info summarizes which files would change and how so that a suitable
> reorganization of the files/directories can be done prior to the switch
> This is for those library authors that would like to take the trouble to
> have a compatibility layer and have code as readable as possible overall
>
> [Whether pure is really preferable to return is another matter --
> For me one of the strongest criticisms of the python 2→3 switch is this
> that if they were going to break things anyhow why was the cleanup not more
> far-reaching?
>
> Lets keep this note in these parentheses :-)
>
> ]

> _______________________________________________
> Haskell-Cafe mailing list
> Haskel...@haskell.org
> http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe

--
"Elegance is not optional" -- Richard O'Keefe

Ben Gamari

unread,
Oct 6, 2015, 4:44:47 AM10/6/15
to Sven Panne, Gershom B, haskell-prime@haskell.org List, Graham Hutton, Michał J Gajda, Haskell Libraries, haskell cafe
This is a fair point that comes up fairly often. The fact that CPP is
required to silence redundant import warnings is quite unfortunate.
Others languages have better stories in this area. One example is Rust,
which has a quite flexible `#[allow(...)]` pragma which can be used to
acknowledge and silence a wide variety of warnings and lints [1].

I can think of a few ways (some better than others) how we might
introduce a similar idea for import redundancy checks in Haskell,

1. Attach a `{-# ALLOW redundant_import #-}` pragma to a definition,

-- in Control.Applicative
{-# ALLOW redundant_import (<$>) #-}
(<$>) :: (a -> b) -> f a -> f b
(<$>) = fmap

asking the compiler to pretend that any import of the symbol did not
exist when looking for redundant imports. This would allow library
authors to appropriately mark definitions when they are moved,
saving downstream users from having to make any change whatsoever.

2. Or alternatively we could make this idea a bit more precise,

-- in Control.Applicative
{-# ALLOW redundant_import Prelude.(<$>) #-}
(<$>) :: (a -> b) -> f a -> f b
(<$>) = fmap

Which would ignore imports of `Control.Applicative.(<$>)` only if
`Prelude.(<$>)` were also in scope.

3. Attach a `{-# ALLOW redundancy_import #-}` pragma to an import,

import {-# ALLOW redundant_import #-} Control.Applicative

-- or perhaps
import Control.Applicative
{-# ALLOW redundant_import Control.Applicative #-}

allowing the user to explicitly state that they are aware that this
import may be redundant.

4. Attach a `{-# ALLOW redundancy_import #-}` pragma to a name in an
import list,

import Control.Applicative ((<$>) {-# ALLOW redundant_import #-})

allowing the user to explicitly state that they are aware that this
imported function may be redundant.

In general I'd like to reiterate that many of the comments in this
thread describe genuine sharp edges in our language which have presented
a real cost in developer time during the AMP and FTP transitions. I
think it is worth thinking of ways to soften these edges; we may be
surprised how easy it is to fix some of them.

- Ben


[1] https://doc.rust-lang.org/stable/reference.html#lint-check-attributes

Kosyrev Serge

unread,
Oct 6, 2015, 5:30:23 AM10/6/15
to Ben Gamari, Gershom B, haskell-prime@haskell.org List, Graham Hutton, Michał J Gajda, Haskell Libraries, haskell cafe
Ben Gamari <b...@smart-cactus.org> writes:
> This is a fair point that comes up fairly often. The fact that CPP is
> required to silence redundant import warnings is quite unfortunate.
> Others languages have better stories in this area. One example is Rust,
> which has a quite flexible `#[allow(...)]` pragma which can be used to
> acknowledge and silence a wide variety of warnings and lints [1].
>
> I can think of a few ways (some better than others) how we might
> introduce a similar idea for import redundancy checks in Haskell,
>
> 1. Attach a `{-# ALLOW redundant_import #-}` pragma to a definition,
...

> 2. Or alternatively we could make this a idea a bit more precise,
> {-# ALLOW redundant_import Prelude.(<$>) #-}
...

> 3. Attach a `{-# ALLOW redundancy_import #-}` pragma to an import,
> import {-# ALLOW redundant_import #-} Control.Applicative
...

> 4. Attach a `{-# ALLOW redundancy_import #-}` pragma to a name in an
> import list,
> import Control.Applicative ((<$>) {-# ALLOW redundant_import #-})

What I don't like about this solution is how specific it is -- the gut
instinct that it can't be the last such extension, if we were to start
replacing CPP piecemeal.

And after a while, we'd accommodate a number of such extensions..
and they would keep coming.. until it converges to a trainwreck.

I think that what instead needs to be done, is this:

1. a concerted effort to summarize *all* uses of CPP in Haskell code
2. a bit of forward thinking, to include desirables that would be easy
to get with a more generic solution

Personally, I think that any such effort, approached with generality
that would be truly satisfying, should inevitably converge to an
AST-level mechanism.

..but then I remember how CPP can be used to paper over incompatible
syntax changes..

Hmm..

--
с уважениeм / respectfully,
Косырев Серёга

Henrik Nilsson

unread,
Oct 6, 2015, 7:32:46 AM10/6/15
to haskell-prime@haskell.org List, Haskell Libraries, haskell cafe
Dear all,

Executive Summary: Please let us defer further discussion
and ultimate decision on MRP to the resurrected HaskellPrime
committee

While we can discuss the extent of additional breakage
MRP would cause, the fact remains it is a further
breaking change. A survey of breakage to books as
Herbert did is certainly valuable (thanks!), but
much breakage will (effectively) remain unquantifiable.

It is also clear from the discussions over the last
couple of weeks, on the Haskell libraries list as well
as various other forums and social media, that MRP is
highly contentious.

This begs two questions:

1. Is the Haskell Libraries list and informal voting process
really an appropriate, or even acceptable, way to adopt
such far-reaching changes to what effectively amounts to
Haskell itself?

2. Why the hurry to push MRP through?

As to question 1, to Graham Hutton's and my knowledge,
the libraries list and its voting process was originally
set up for 3rd-party libraries in fptools. It seems to
have experienced some form of "mission creep" since.
Maybe that is understandable given that there was no
obvious alternative as HaskellPrime has been defunct
for a fair few years. But, as has been pointed out in a
number of postings, a lot of people with very valuable
perspectives are also very busy, and thus likely to
miss a short discussion period (as has happened in the
past in relation to the Burning the Bridges proposal)
and also have very little time for engaging in long and
complicated e-mail discussions that, from their
perspective, happen at a completely random point in
time and for which they thus have not had a chance to
set aside time even if they wanted to participate.

Just as one data point, AMP etc. mostly passed Graham
and me by simply because a) we were too busy to notice
and b) we simply didn't think there was a mandate for
such massive overhauls outside of a process like
HaskellPrime. And we are demonstrably not alone.

This brings us to question 2. Now that HaskellPrime is
being resurrected, why the hurry to push MRP through?
Surely HaskellPrime is the forum where breaking
changes like MRP should be discussed, allowing as much
time as is necessary and allowing for an as wide range
of perspectives as possible to properly be taken into
account?

The need to "field test" MRP prior to discussing
it in HaskellPrime has been mentioned. Graham and I
are very sceptical. In the past, at least in the
past leading up to Haskell 2010 or so, the community
at large was not roped in as involuntary field testers.

If MRP is pushed through now, with a resurrection of
HaskellPrime being imminent, Graham and I strongly believe
that risks coming across to a very large part of the
Haskell community as preempting proper process by facing
the new HaskellPrime committee with (yet another) fait
accompli.

Therefore, please let us defer further discussion and
ultimate decision on MRP to the resurrected
HaskellPrime committee, which is where it properly
belongs. Otherwise, the Haskell community itself might
be one of the things that MRP breaks.

Best regards,

/Henrik

--
Henrik Nilsson
School of Computer Science
The University of Nottingham
n...@cs.nott.ac.uk





Augustsson, Lennart

unread,
Oct 6, 2015, 7:57:41 AM10/6/15
to Henrik Nilsson, haskell-prime@haskell.org List, Haskell Libraries, haskell cafe
To question 1 my answer is NO! I think voting to decide these kind of issues a terrible idea; we might as well throw dice.

Erik Hesselink

unread,
Oct 6, 2015, 8:06:44 AM10/6/15
to Henrik Nilsson, Haskell Libraries, haskell cafe, haskell-prime@haskell.org List
I was always under the impression that +1/-1 was just a quick
indicator of opinion, not a vote, and that it was the core libraries
committee that would make the final call if enough consensus was
reached to enact the change.

Erik

On 6 October 2015 at 13:32, Henrik Nilsson

Jan-Willem Maessen

unread,
Oct 6, 2015, 8:25:44 AM10/6/15
to Ben Gamari, Gershom B, Haskell Libraries, haskell-prime@haskell.org List, Graham Hutton, Michał J Gajda, haskell cafe
One obvious solution I haven't seen mentioned is the ability to add a nonexistent identifier to a hiding clause (these identifiers might presumably exist in some other version of the import):

import Prelude hiding ((<$>))

I can see the argument for marking such imports with a pragma, though it gets a bit ugly.

-Jan-Willem Maessen

Herbert Valerio Riedel

unread,
Oct 6, 2015, 9:01:02 AM10/6/15
to Erik Hesselink, Haskell Libraries, Henrik Nilsson, haskell cafe, haskell-prime@haskell.org List
On 2015-10-06 at 14:06:11 +0200, Erik Hesselink wrote:
> I was always under the impression that +1/-1 was just a quick
> indicator of opinion, not a vote, and that it was the core libraries
> committee that would make the final call if enough consensus was
> reached to enact the change.

I'd like to point out that the core libraries committee ought to
continue to do so (as hinted at in [1]) in its function as a Haskell
Prime sub-committee (c.f. sub-teams in the Rust community[2]).

While there will be surely overlap of interests, contributions,
cross-reviewing and discussion, the principal task and responsibility of
the new sought members is to concentrate on the language part of the
Haskell Report where quite a bit of work is awaiting them.

Cheers,
hvr

[1]: https://mail.haskell.org/pipermail/haskell-prime/2015-September/003936.html
[2]: https://www.rust-lang.org/team.html

Bardur Arantsson

unread,
Oct 6, 2015, 12:25:14 PM10/6/15
to haskel...@haskell.org, libr...@haskell.org, haskel...@haskell.org
On 10/06/2015 11:11 AM, Johan Tibell wrote:
> It might be enough to just add a NOWARN <warning type> pragma that acts on
> a single line/expression. I've seen it in both C++ and Python linters and
> it works reasonably well and it's quite general.

+1. Simple is good and can hopefully also be backported to older GHC
releases. (Provided someone's willing to do said releases, obviously.)

Herbert Valerio Riedel

unread,
Oct 6, 2015, 12:48:16 PM10/6/15
to Johan Tibell, libr...@haskell.org, haskel...@haskell.org, haskel...@haskell.org
On 2015-10-06 at 10:10:01 +0200, Johan Tibell wrote:
[...]
>> You say that you stick to the 3-major-ghc-release support-window
>> convention for your libraries. This is good, because then you don't need
>> any CPP at all! Here's why:
>>
>> [...]
>>
>
> So what do I have to write today to have my Monad instances be:
>
> * Warning free - Warnings are useful. Turning them off or having spurious
> warnings both contribute to bugs.

Depends on the warnings. Some warnings are more of an advisory kind
(hlint-ish). I wouldn't consider redundant imports a source of
bugs. Leaving off a top-level signature shouldn't be a source of
correctness bugs either. And warnings about upcoming changes
(i.e. deprecation warnings) are not necessarily about a bug, but rather a
way to raise awareness of API changes.

At the other end of the spectrum are more serious warnings I'd almost
consider errors, such as failing to define a non-defaulting method or
violating a MINIMAL pragma specification, as that can lead to bottoms
either in the form of runtime errors or even worse, hanging computations
(in case of cyclic definitions).

IMO, GHC should classify its warnings into severities/categories or
introduce some compromise between -Wall and not-Wall.

> * Use imports that either are qualified or have explicit import lists -
> Unqualified imports makes code more likely to break when dependencies add
> exports.
> * Don't use CPP.

That being said, as to how to write your Monad instances today with GHC
7.10 w/o CPP, while supporting at least GHC 7.4/7.6/7.8/7.10: this
*does* work (admittedly for an easy example, but it can be
generalised):


--8<---------------cut here---------------start------------->8---
module MyMaybe where

import Control.Applicative (Applicative(..))
import Prelude (Functor(..), Monad(..), (.))
-- or alternatively: `import qualified Prelude as P`

data Maybe' a = Nothing' | Just' a

instance Functor Maybe' where
  fmap f (Just' v) = Just' (f v)
  fmap _ Nothing'  = Nothing'

instance Applicative Maybe' where
  pure = Just'
  f1 <*> f2 = f1 >>= \v1 -> f2 >>= (pure . v1)

instance Monad Maybe' where
  Nothing' >>= _ = Nothing'
  Just' x  >>= f = f x

  return = pure -- "deprecated" since GHC 7.10
--8<---------------cut here---------------end--------------->8---

The example above compiles -Wall-clean and satisfies all your 3 stated
requirements afaics. I do admit this is probably not what you had in mind.

But to be consistent, if you want to avoid unqualified imports at all
costs in order to have full control of what gets imported into your
namespace, you shouldn't tolerate an implicit unqualified wildcard
`Prelude` import either. `Prelude` can, even if seldom, gain new
exports as well (or even lose them -- `interact` *could* be a candidate
for removal at some point).


> Neither AMP or MRP includes a recipe for this in their proposal.

That's because -Wall-hygiene (w/o opting out of harmless warnings)
across multiple GHC versions is not considered a show-stopper.

In the specific case of MRP, I can offer you a Wall-perfect transition
scheme by either using `ghc-options: -fno-mrp-warnings` in your
cabal-file, or if that doesn't satisfy you, we can delay phase1
(i.e. redundant return-def warnings) to GHC 8.2:

Now you can continue to write `return = pure` w/o GHC warnings bothering
you until GHC 8.2, at which point the 3-year-window will reach back to
GHC 7.10.

Then starting with GHC 8.2 you can drop the `return` definition, and
keep your


More generally though, we need more language-features and/or
modifications to the way GHC triggers warnings to make such
refactorings/changes to the libraries -Wall-perfect as well.

Beyond what Ben already suggested in another post, there was also the
more general suggestion to implicitly suppress warnings when you
explicitly name an import. E.g.

import Control.Applicative (Applicative(..))

would suppress the redundant-import warning for Applicative via Prelude,
because we specifically requested Applicative, so we don't mind that
Prelude re-exports the same symbol.


> AMP got one post-facto on the Wiki. It turns out that the workaround
> there didn't work (we tried it in Cabal and it conflicted with one of
> the above requirements.)

Yes, that unqualified `import Prelude`-last trick mentioned on the Wiki
breaks down for more complex imports with (redundant) explicit import
lists.

However, the Maybe-example above works at the cost of a wordy
Prelude-import, but it's more robust, as you pin down exactly which
symbol you expect to get from each module.



> The problem with discussions is that they are done between two groups with
> quite a difference in experience. On one hand you have people like Bryan,
> who have considerable contributions to the Haskell ecosystem and much
> experience in large scale software development (e.g. from Facebook). On the
> other hand you have people who don't. That's okay. We've all been in the
> latter group at some point of our career.
[...]

At the risk of stating the obvious: I don't think it matters from which
group a given argument comes from as its validity doesn't depend on the
messenger. Neither does it matter whether an argument is repeated
several times or stated only once. Also, every argument deserves to be
considered regardless of its origin or frequency.


-- hvr

Herbert Valerio Riedel

unread,
Oct 6, 2015, 1:26:24 PM10/6/15
to Johan Tibell, libr...@haskell.org, haskel...@haskell.org, haskel...@haskell.org

I hit "send" too early, so here's the incomplete section completed:

On 2015-10-06 at 18:47:08 +0200, Herbert Valerio Riedel wrote:

[...]

> In the specific case of MRP, I can offer you a Wall-perfect transition
> scheme by either using `ghc-options: -fno-mrp-warnings` in your
> cabal-file, or if that doesn't satisfy you, we can delay phase1
> (i.e. redundant return-def warnings) to GHC 8.2:
>
> Now you can continue to write `return = pure` w/o GHC warnings bothering
> you until GHC 8.2, at which point the 3-year-window will reach back to
> GHC 7.10.
>
> Then starting with GHC 8.2 you can drop the `return` definition, and

...keep supporting a 3-year-window back till GHC 7.10 (which
incorporates AMP and doesn't need `return` explicitly defined anymore)
without CPP.

And since you don't define `return` anymore, you don't get hit by the
MRP warning either, which would start with GHC 8.2.

GHC can keep providing `return` as long as we want it to, and consider
`return` being an extra method of `Monad` simply a GHC-ism.

Future Haskell books and learning materials will hopefully be based on
the next Haskell Report incorporating the AMP and stop referring to the
historical `return` accident (which I consider badly named anyway from a
pedagogical perspective).

Code written unaware of `return` being a method of Monad will work
anyway just fine.

Do you see any problems with this scheme?
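To make the end state concrete, the following sketch (with a hypothetical `Maybe''` type) shows what an instance written for this scheme looks like on GHC 7.10 and later, with no `return` definition at all:

```haskell
module Main where

data Maybe'' a = Nothing'' | Just'' a
  deriving Show

instance Functor Maybe'' where
  fmap f (Just'' v) = Just'' (f v)
  fmap _ Nothing''  = Nothing''

instance Applicative Maybe'' where
  pure = Just''
  Just'' f  <*> m = fmap f m
  Nothing'' <*> _ = Nothing''

-- No `return` definition: since the AMP it defaults to `pure`,
-- so this instance is warning-free today and already MRP-ready.
instance Monad Maybe'' where
  Nothing'' >>= _ = Nothing''
  Just'' x  >>= f = f x

main :: IO ()
main = print (Just'' 1 >>= \x -> pure (x + 1))
```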

Sven Panne

unread,
Oct 6, 2015, 1:42:01 PM10/6/15
to Herbert Valerio Riedel, Johan Tibell, Libraries, Haskell Cafe, haskell-prime@haskell.org Prime
2015-10-06 18:47 GMT+02:00 Herbert Valerio Riedel <h...@gnu.org>:
[...] That being said, as to how to write your Monad instances today with GHC

7.10 w/o CPP, while supporting at least GHC 7.4/7.6/7.8/7.10: This
*does* work (admittedly for an easy example, but this can be
generalised):


--8<---------------cut here---------------start------------->8---
module MyMaybe where

import Control.Applicative (Applicative(..))
import Prelude (Functor(..), Monad(..), (.))
-- or alternatively: `import qualified Prelude as P`
[...]

--8<---------------cut here---------------end--------------->8---

This example above compiles -Wall-clean and satisfies all your 3 stated
requirements afaics. I do admit this is probably not what you had in mind.

OK, so the trick is that you're effectively hiding Applicative from the Prelude (which might be a no-op). This "works" somehow, but is not satisfactory IMHO for several reasons:

   * If you explicitly import all entities from Prelude, your import list will typically get *very* long and unreadable. Furthermore, if that's the suggested technique, what's the point of having a Prelude at all?

   * Some people see qualified imports as the holy grail, but having to prefix tons of things with "P." is IMHO very ugly. Things are even worse for operators: The whole notion of operators in itself is totally useless and superfluous *except* for a single reason: Readability. And exactly that gets destroyed when you have to qualify them, so I would (sadly) prefer some #ifdef hell, if that gives me readable code elsewhere.

   * With the current trend of moving things to the Prelude, I can envision a not-so-distant future where the whole Control.Applicative module will be deprecated. As it is now, it's mostly superfluous and/or contains only stuff which might better live somewhere else.
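The readability cost of qualified operators mentioned above is easy to demonstrate (a sketch using `NoImplicitPrelude` to force full qualification):

```haskell
{-# LANGUAGE NoImplicitPrelude #-}
module Main where

import qualified Prelude as P

-- With only a qualified Prelude in scope, even basic arithmetic
-- must be written with qualified infix operators:
total :: P.Int
total = 1 P.+ 2 P.* 3  -- versus the plain  1 + 2 * 3

main :: P.IO ()
main = P.print total
```

Qualified operators keep their fixity, so the expression still parses as `1 + (2 * 3)`, but the notation defeats the whole point of having operators.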
 
[...] That's because -Wall-hygiene (w/o opting out of harmless) warnings

across multiple GHC versions is not considered a show-stopper.

That's your personal POV; I'm more leaning towards "-Wall -Werror". I've seen too many projects where neglecting warnings over an extended period of time made fixing them basically impossible at the end. Anyway, I think that a sane ecosystem should allow *both* POVs, the sloppy one and the strict one.
 
[...] Beyond what Ben already suggested in another post, there was also the

more general suggestion to implicitly suppress warnings when you
explicitly name an import. E.g.

  import Control.Applicative (Applicative(..))

would suppress the redundant-import warning for Applicative via Prelude,
because we specifically requested Applicative, so we don't mind that
Prelude re-exports the same symbol. [...]

Uh, oh... That would be bad, because one normally wants to see redundant imports. Without the compiler telling me, how should I find out which are redundant? Manually trying to remove them step by step? :-/

Cheers,
   S.

David Feuer

unread,
Oct 6, 2015, 2:12:52 PM10/6/15
to Henrik Nilsson, Haskell Libraries, haskel...@haskell.org, haskell-prime@haskell.org List


On Oct 6, 2015 7:32 AM, "Henrik Nilsson" <Henrik....@nottingham.ac.uk> wrote:
> Executive Summary: Please let us defer further discussion
> and ultimate decision on MRP to the resurrected HaskellPrime
> committee

Many more people are on this mailing list than will be chosen for the committee. Those who are not chosen have useful perspectives as well.

>   1. Is the Haskell Libraries list and informal voting process
>      really an appropriate, or even acceptable, way to adopt
>      such far-reaching changes to what effectively amounts to
>      Haskell itself?

As others have said, no one wants that.

> But, as has been pointed out in a
> number of postings, a lot of people with very valuable
> perspectives are also very busy, and thus likely to
> miss a short discussion period (as has happened in the
> past in relation to the Burning the Bridges proposal)

The Foldable/Traversable BBP indeed was not as well discussed as it should have been. AMP, on the other hand, was discussed extensively and publicly for months. I understand that some people need months of notice to prepare to participate in a discussion. Unfortunately, I don't think those people can always be included. Life moves too quickly for that. I do think it might be valuable to set up a moderated, extremely low volume mailing list for discussion of only the most important changes, with its messages forwarded to the general list.

> The need to "field test" MRP prior to discussing
> it in HaskellPrime has been mentioned. Graham and I
> are very sceptical. In the past, at least in the
> past leading up to Haskell 2010 or so, the community
> at large was not roped in as involuntary field testers.

No, and Haskell 2010 was, by most measures, a failure. It introduced no new language features (as far as I can recall) and only a few of the most conservative library changes imaginable. Standard Haskell has stagnated since 1998, 17 years ago. Haskell 2010 did not reflect the Haskell people used in their research or practical work then, and I think people are justified in their concern that the next standard may be similarly disappointing. One of the major problems is the (understandable, and in many ways productive) concentration of development effort in a single compiler. When there is only one modern Haskell implementation that is commonly used, it's hard to know how changes to the standard will affect other important implementation techniques, and therefore hard to justify any substantial changes. That was true in 2010, and it is, if anything, more true now.

> Therefore, please let us defer further discussion and
> ultimate decision on MRP to the resurrected
> HaskellPrime committee, which is where it properly
> belongs. Otherwise, the Haskell community itself might
> be one of the things that MRP breaks.

I hope not. Haskell has gained an awful lot from the researchers, teachers, and developers who create it and use it. I hope we can work out an appropriate balance of inclusion, caution, and speed.

Malcolm Wallace

unread,
Oct 6, 2015, 3:02:37 PM10/6/15
to Herbert Valerio Riedel, Johan Tibell, libr...@haskell.org, haskel...@haskell.org, haskel...@haskell.org

On 6 Oct 2015, at 17:47, Herbert Valerio Riedel wrote:

>
>> The problem by discussions is that they are done between two groups with
>> quite a difference in experience. On one hand you have people like Bryan,
>> who have considerable contributions to the Haskell ecosystem and much
>> experience in large scale software development (e.g. from Facebook). On the
>> other hand you have people who don't. That's okay. We've all been at the
>> latter group at some point of our career.
> [...]
>
> At the risk of stating the obvious: I don't think it matters from which
> group a given argument comes from as its validity doesn't depend on the
> messenger.

In that case, I think you are misunderstanding the relevance of Johan's argument here. Let me try to phrase it differently. Some people who can reasonably claim to have experience with million-line plus codebases are warning that this change is too disruptive, and makes maintenance harder than it ought to be. On the other hand, of the people who say the change is not really disruptive, none of them have (yet?) made claims to have experience of the maintenance of extremely large-scale codebases. The authority of the speaker does matter in technical arguments of this nature: people without the relevant experience are simply unqualified to make guesses about the impact.

Regards,
Malcolm

Adam Bergmark

unread,
Oct 6, 2015, 3:16:32 PM10/6/15
to Bardur Arantsson, libraries, haskell-cafe, haskel...@haskell.org
On Tue, Oct 6, 2015 at 6:16 PM, Bardur Arantsson <sp...@scientician.net> wrote:
On 10/06/2015 11:11 AM, Johan Tibell wrote:
> It might be enough to just add a NOWARN <warning type> pragma that acts on
> a single line/expression. I've seen it in both C++ and Python linters and
> it works reasonably well and it's quite general.

+1. Simple is good and can hopefully also be backported to older GHC
releases. (Provided someone's willing to do said releases, obviously.)

I've thought of this in the past and it would be great to have. Another possible use case is to allow this for extensions as well, e.g. "Only allow UndecidableInstances for this declaration"

Mikhail Glushenkov

unread,
Oct 6, 2015, 3:32:30 PM10/6/15
to Herbert Valerio Riedel, Johan Tibell, Haskell Libraries, haskell-cafe, haskel...@haskell.org
Hi,

On 6 October 2015 at 19:03, Herbert Valerio Riedel <h...@gnu.org> wrote:
>> In the specific case of MRP, I can offer you a Wall-perfect transition
>> scheme by either using `ghc-options: -fno-mrp-warnings` in your
>> cabal-file, [...]

Apropos, is there a similar option for AMP warnings? I would rather
use that than CPP.

Christopher Allen

unread,
Oct 6, 2015, 4:16:47 PM10/6/15
to Malcolm Wallace, haskell-prime@haskell.org List, Haskell Libraries, Haskell Cafe, Herbert Valerio Riedel
Do those participating in this thread think sentiments like this are constructive or inclusive? Is this how we encourage participation from newer members of the community?

Framing this debate in terms of a programming pecking order is unprofessional. Many times, those higher in the ranks will prefer a more conservative approach, as experienced surgeons once resisted the introduction of the autoclave. 

The problem isn't the change; it's what the change costs you. Provide data and make your case. Talk about what it _costs_ you, show evidence for that cost, and describe what would make the change acceptable. Do it without talking down to a constructed "other" of the people who've neglected to make the same status display you've injected into this conversation. That could be valuable input to the discussion, so we could weigh costs and benefits as a community. 

There _are_ costs associated with going ahead with MRP, especially for those with large 1MM-LOC industrial codebases. This is partly why I'm lukewarm on the change, but I believe it needs to happen sooner or later, and waiting for more 1MM-LOC codebases to be born isn't going to make it any better. The suggestions that we consider the example of 2to3 have, I believe, been more constructive, particularly since we have this lovely language which lends itself so nicely to static analysis anyway.





--
Chris Allen
Currently working on http://haskellbook.com

Herbert Valerio Riedel

unread,
Oct 6, 2015, 4:18:01 PM10/6/15
to Mikhail Glushenkov, Johan Tibell, Haskell Libraries, haskell-cafe, haskel...@haskell.org
Hi,

On 2015-10-06 at 21:32:19 +0200, Mikhail Glushenkov wrote:
> On 6 October 2015 at 19:03, Herbert Valerio Riedel <h...@gnu.org> wrote:
>>> In the specific case of MRP, I can offer you a Wall-perfect transition
>>> scheme by either using `ghc-options: -fno-mrp-warnings` in your
>>> cabal-file, [...]
>
> Apropos, is there a similar option for AMP warnings? I would rather
> use that than CPP.

Sure, added in GHC 7.8:

| In GHC 7.10, Applicative will become a superclass of Monad,
| potentially breaking a lot of user code. To ease this transition, GHC
| now generates warnings when definitions conflict with the
| Applicative-Monad Proposal (AMP).
|
| A warning is emitted if a type is an instance of Monad but not of
| Applicative, MonadPlus but not Alternative, and when a local function
| named join, <*> or pure is defined.
|
| The warnings are enabled by default, and can be controlled using the
| new flag -f[no-]warn-amp.


However, if you use that now with GHC 7.10 you get a warning:

| $ ghc-7.10.2 -fno-warn-amp
|
| on the commandline: Warning:
| -fno-warn-amp is deprecated: it has no effect, and will be removed in GHC 7.12

so you'll need to guard that with something like

if impl(ghc == 7.8.*) ghc-options: -fno-warn-amp

Tom Ellis

unread,
Oct 6, 2015, 4:36:54 PM10/6/15
to haskel...@haskell.org, Haskell Libraries
On Mon, Oct 05, 2015 at 05:40:43PM -0700, Gregory Collins wrote:
> - defaultTimeLocale moved from System.Locale to Data.Time.Format in
> time-1.5 (no compat package for this, afaik)

http://hackage.haskell.org/package/time-compat

Tom Ellis

unread,
Oct 6, 2015, 4:39:08 PM10/6/15
to haskel...@haskell.org, Haskell Libraries
On Mon, Oct 05, 2015 at 05:40:43PM -0700, Gregory Collins wrote:
> On Mon, Oct 5, 2015 at 3:18 PM, Bryan Richter <b...@chreekat.net> wrote:
> > Hang on a moment, are you saying that all the people writing to argue
> > that these changes would require them to write dozens more #ifdef's
> > actually don't have to write any at all?
>
> Um, no, it usually isn't anything like that. Here's a sampling of some of
> the things I've used CPP for in the past few years:
>
> - After GHC 7.4, when using a newtype in FFI imports you need to import
> the constructor, i.e. "import Foreign.C.Types(CInt(..))" --- afaik CPP is
> the only way to shut up warnings everywhere
> - defaultTimeLocale moved from System.Locale to Data.Time.Format in
> time-1.5 (no compat package for this, afaik)
> - one of many various changes to Typeable in the GHC 7.* series
> (deriving works better now, mkTyCon vs mkTyCon3, etc)
> - Do I have to hide "catch" from Prelude, or not? It got moved, and
> "hiding" gives an error if the symbol you're trying to hide is missing.
> Time to break out the CPP (and curse myself for not just using the
> qualified import in the first place)
> - Do I get monoid functions from Prelude or from Data.Monoid? Same w/
> Applicative, Foldable, Word. I don't know where anything is supposed to
> live anymore, or which sequence of imports will shut up spurious warnings
> on all four versions of GHC I support, so the lowest-friction fix is: break
> out the #ifdef spackle
> - ==# and friends return Int# instead of Bool after GHC 7.8.1
> - To use functions like "tryReadMVar", "unsafeShiftR", and
> "atomicModifyIORef'" that are in recent base versions but not older ones
> (this is a place where CPP use is actually justified)

In fact I think all of these apart from the FFI one could be solved with a
-compat package, could they not?
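For comparison, the `catch` item in Gregory's list is the classic case where CPP (rather than a -compat package) ends up being used; a sketch:

```haskell
{-# LANGUAGE CPP #-}
module Main where

-- Prelude.catch was removed in base 4.6 (GHC 7.6). On older GHCs it
-- must be hidden to avoid a clash with Control.Exception.catch; on
-- newer GHCs that `hiding` clause itself draws a complaint, since
-- the name no longer exists -- hence the guard.
#if __GLASGOW_HASKELL__ < 706
import Prelude hiding (catch)
#endif
import Control.Exception (SomeException, catch)

main :: IO ()
main = (readFile "/nonexistent/file" >>= putStr)
         `catch` \e -> putStrLn ("caught: " ++ show (e :: SomeException))
```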

Mark Lentczner

unread,
Oct 6, 2015, 5:15:41 PM10/6/15
to Johan Tibell, Haskell Libraries, haskell cafe, haskell-prime@haskell.org List

On Thu, Sep 24, 2015 at 2:43 PM, Herbert Valerio Riedel <h...@gnu.org> wrote:
TLDR: To complete the AMP, turn `Monad(return)` method into a
      top-level binding aliasing `Applicative(pure)`.

Sure... if we had a language that no one uses and could keep reforming like putty until it is perfect. But we don't.

A modest proposal:

We can't keep tinkering with a language and its libraries like this AND have a growing ecosystem that serves an ever-widening base, including the range from newcomer to commercial deployment. SO - let's do all the language tinkering in GHC 8; there can be as many prereleases of that as needed until it is just right. ...and leave GHC 7 (7.10? roll back to 7.8.4?) for all of us doing essential and dependable libraries, commercial work, projects in Haskell that we don't want to have to go back and add #ifdefs to twice a year just to keep running, educators, people writing books. We can keep improving GHC 7 as needed, and focus on bugs, security issues, patches, cross compatibility, etc.

Think of it as Perl 6 or Python 3 for Haskell.

- Mark

On Tue, Oct 6, 2015 at 1:12 AM, Johan Tibell <johan....@gmail.com> wrote:
(Resending with smaller recipient list to avoid getting stuck in the moderator queue.)

On Tue, Oct 6, 2015 at 9:10 AM, Herbert Valerio Riedel <h...@gnu.org> wrote:
On 2015-10-05 at 21:01:16 +0200, Johan Tibell wrote:
> On the libraries I maintain and have a copy of on my computer right now: 329


Although this was already pointed out to you in a response to a Tweet of
yours, I'd like to expand on this here to clarify:


You say that you stick to the 3-major-ghc-release support-window
convention for your libraries. This is good, because then you don't need
any CPP at all! Here's why:

[...]

So what do I have to write today to have my Monad instances be:

 * Warning free - Warnings are useful. Turning them off or having spurious warnings both contribute to bugs.
 * Use imports that either are qualified or have explicit import lists - Unqualified imports make code more likely to break when dependencies add exports.
 * Don't use CPP.

Neither AMP or MRP includes a recipe for this in their proposal. AMP got one post-facto on the Wiki. It turns out that the workaround there didn't work (we tried it in Cabal and it conflicted with one of the above requirements.)

PS: I'm a bit disappointed you seem to dismiss this proposal right away
    categorically without giving us a chance to address your
    concerns. The proposal is not a rigid all-or-nothing thing that
    can't be tweaked and revised.  That's why we're having these
    proposal-discussions in the first place (rather than doing blind
    +1/-1 polls), so we can hear everyone out and try to maximise the
    agreement (even if we will never reach 100% consensus on any
    proposal).

    So please, keep on discussing!
The problem with discussions is that they are done between two groups with quite a difference in experience. On one hand you have people like Bryan, who have considerable contributions to the Haskell ecosystem and much experience in large scale software development (e.g. from Facebook). On the other hand you have people who don't. That's okay. We've all been in the latter group at some point of our career.

What's frustrating is that people don't take a step back and realize that they might be in the latter group and should perhaps listen to those in the former. This doesn't happen; instead we get lots of "C++ and Java are so bad and we don't want to be like them." Haskell is not at risk of becoming C++ or Java (which are a large improvement compared to the languages that came before them). We're at risk of missing our window of opportunity. I think that would be a shame, as I think Haskell is a step forward compared to those languages, and I would like to see more software written in Haskell.

We've been through this many times before on the libraries list. I'm not going to win an argument on this mailing list. Between maintaining libraries you all use and managing a largish team at Google, I don't have much time for a discussion which approaches a hundred emails and is won by virtue of having lots of time to write emails.

-- Johan

Gregory Collins

unread,
Oct 6, 2015, 5:30:04 PM10/6/15
to Tom Ellis, Haskell Cafe, Haskell Libraries

On Tue, Oct 6, 2015 at 1:39 PM, Tom Ellis <tom-lists-has...@jaguarpaw.co.uk> wrote:
In fact I think all of these apart from the FFI one could be solved with a
-compat package, could they not?

Who cares? In practice, the programs break and I have to fix them. Most of the time, CPP is the lowest-friction solution -- if I rely on a -compat package, first I have to know it exists and that I should use it to fix my compile error, and then I've added an additional non-platform dependency that I'm going to have to go back and clean up in 18 months. To be honest, the usual procedure is that the new RC comes out and I get GitHub pull requests from hvr@ :-) :-)

In response to the other person who asked "why do you want to support so many GHC versions anyways?" --- because I don't hate my users, and don't want to force them to run on the upgrade treadmill if they don't have to? Our policy is to support the last 4 major GHC versions (or 2 years, whichever is shorter). And if we support a version of GHC, I want our libraries to compile on it without warnings, I don't think that should mystify anyone.

--
Gregory Collins <gr...@gregorycollins.net>

Andrey Chudnov

unread,
Oct 6, 2015, 5:53:06 PM10/6/15
to haskell-cafe
Herbert, unfortunately your logic would break if there is another
invasive library change somewhere between 7.10 and 8.2, as that would
require introducing another whole set of #ifdefs. I've been using GHC
since 6.2. Since then new versions of GHC have been breaking builds of
foundational packages every 1-2 releases. I'm actually all for decisive
and unapologetic language evolution, but there should be a safety net so
there's less risk of breakage. And, the main sentiment in the discussion
(which, I admit, I have followed very loosely) seems to be that #ifdef's
are a poor choice for such a net.

So, forgive me if that has been discussed, but what has happened to the
`haskell2010` package that is supposed to provide a compatibility layer
for the standard library? Are people using it? Why hasn't it been
updated since March 2014? Is it really impossible to provide a legacy
Haskell2010 base library compatibility layer with AMP in play?

Perhaps it's my ignorance speaking, but I really think if packages like
`haskell2010` and `haskell98` were actively maintained and used, we
wouldn't have issues like that. Then you could say: "if you depend on
the GHC `base` package directly, your portability troubles are well
deserved".

On 10/06/2015 03:10 AM, Herbert Valerio Riedel wrote:
> So when GHC 8.2 is released, your support-window requires you to support
> GHC 7.10 and GHC 8.0 in addition to GHC 8.2.
>
> At this point you'll be happy that you can start dropping those #ifdefs
> you added for GHC 7.10 in your code in order to adapt to FTP & AMP.

José Manuel Calderón Trilla

unread,
Oct 6, 2015, 6:57:01 PM10/6/15
to Haskell Cafe, Haskell Libraries, haskel...@haskell.org
Hello all,

I agree with Henrik, I'm very keen on giving the new Haskell committee a shot.

While some may not think that Haskell2010 was a success, I think it would be difficult to argue that Haskell98 was anything but a resounding success (even if you don't think the language was what it could have been!). Haskell98 stabilized the constant changes of the preceding 7 years. The stability brought with it books and courses, and the agreed-upon base of the language allowed _research_ to flourish as well. Having an agreed base allowed the multiple implementations to experiment with different methods of implementing what the standard laid out.

Many of us here learned from those texts or those courses. It's easy online to say that materials being out of date isn't a big deal, but it can turn people off the language when the code they paste into ghci doesn't work. We use Haskell for the compilers course at York; Haskell is the means, not the end, so having to update the materials frequently is a significant cost. It can be difficult to defend the choice of using Haskell when so much time is spent on something that 'isn't the point' of the course.

Does that mean that we should never change the language? Of course not, but this constant flux within Haskell is very frustrating. Maybe Haskell2010 wasn't what everyone wanted it to be, but that does not mean the _idea_ of a committee is without merit. Having controlled, periodic changes that are grouped together and thought through as a coherent whole is a very useful thing. One of the insights of the original committee was that there would always be one chair at any point in time. The chair of the committee had final say on any issue. This helped keep the revisions coherent and ensured that Haskell made sense as a whole.

Lastly, I'd like to quote Prof. Runciman from almost exactly 22 years ago when the issue of incompatible changes
came up. His thoughts were similar to Johan's:

On 1993-10-19 at 14:12:30 +0100, Colin Runciman wrote:
> As a practical suggestion, if any changes for version 1.3 could make
> some revision of a 1.2 programs necessary, let's have a precise
> stand-alone specification of these revisions and how to make them.
> It had better be short and simple.  Many would prefer it to be empty.
> Perhaps it should be implemented in Haskell compilers?


Overall I don't see the rush for these changes; let's try putting our faith in a new Haskell committee, however it is composed.

Best wishes,

José Manuel

P.S. A year ago Prof. Hinze sent me some Miranda code of his from 1995 as I was studying his thesis. I was able to run the code without issue, allowing me to be more productive in my research ;-)

Mike Meyer

unread,
Oct 6, 2015, 7:24:31 PM10/6/15
to Mark Lentczner, Johan Tibell, Haskell Libraries, haskell cafe, haskell-prime@haskell.org List
On Tue, Oct 6, 2015 at 4:15 PM Mark Lentczner <mark.le...@gmail.com> wrote:

On Thu, Sep 24, 2015 at 2:43 PM, Herbert Valerio Riedel <h...@gnu.org> wrote:
TLDR: To complete the AMP, turn `Monad(return)` method into a
      top-level binding aliasing `Applicative(pure)`.

Sure... if we had a language that no one uses and could keep reforming like putty until it is perfect. But we don't.

A modest proposal:

We can't keep tinkering with a language and its libraries like this AND have a growing ecosystem that serves an ever widening base, including the range from newcomer to commercial deployment. So why not do all the language tinkering in GHC 8? There can be as many prereleases of that as needed until it is just right. ...and leave GHC 7 (7.10? roll back to 7.8.4?) for all of us doing essential and dependable libraries, commercial work, projects in Haskell that we don't want to have to keep adding #ifdefs to twice a year just to keep running, educators, and people writing books. We can keep improving GHC 7 as needed, and focus on bugs, security issues, patches, cross compatibility, etc.

I'm just curious how much you think this would help, assuming that your solution would imply not upgrading to 8 until you're ready to. After all, you can already simply not upgrade now, and create (and distribute) fixes for bugs, security issues, cross-compatibility for 7 as you see fit.

While that's a popular thing to do in lots of systems (though if we do it, for gnu's sake let's not adopt the inane parity-implies-stability numbering convention), it leaves two major issues unaddressed.

#1, developer time. You need to get the people doing the work now to divide their efforts into the two branches. I don't know what percentage of that work is volunteer time, but I expect the answer is "most of it". If they aren't interested in doing that now, what do you expect to change their mind?

#2, everything else in the ecosystem. If you need updates to a library that require the branch you're not using, where does that leave you?

Gershom B

unread,
Oct 6, 2015, 9:19:20 PM10/6/15
to Johan Tibell, Herbert Valerio Riedel, libr...@haskell.org, haskel...@haskell.org, haskel...@haskell.org
Dear all,

I think this discussion has gotten quite heated for reasons not related to the concrete MRP proposal, which, to be honest, I considered quite modest in terms of both scope and impact.

Instead, I think it is a proxy for lots of remaining frustration and anxiety over the poor handling over the Foldable Traversable Proposal. I would like to remind everyone that due to the broad discussions and concerns over the proposal, a very rare, careful poll of Haskell users was taken, announced broadly in many channels. [1] The poll, overwhelmingly, revealed a mandate for the FTP. The breakdown of that mandate was 87% in favor among hobbyists and 79% in favor among non-hobbyists (who constituted a majority of those polled). 

I. Generalities

That said, even the _best_ poll was not a substitute for a better earlier discussion. The handling of the AMP and FTP, which I think was heroic in terms of minimizing breakage while accomplishing long-desired change, also still could have been better. As a whole, the work accomplished the mandate of allowing code to be written backwards-compatibly without requiring CPP. However, it did not also seek to prevent warnings. This in itself was an enormous step forward from changes in the past, which have _not_ even managed to prevent the need for CPP. At the time, I think it was not recognized how much desire there would be for changes that were _both_ CPP-free and _also_ warning-free for 3 releases.

I think one of the great elements of progress in the current discussion is that there is now a proposal on the table which recognizes this, and seeks to accomplish this change in accordance with this desire. It is not the world’s most important change, but the recognition that change should seek to be both CPP _and_ warning free is a good recognition, and I’m sure it will be taken into account in future proposals as well.

I don’t think it is useful to continue to have abstract discussions on the conflict between desire for incremental improvement versus the need to minimize pain on maintainers. We might as well continue to argue about the need for purely functional programming versus the need to print “hello world” to the console. Rather, we should put our collective minds together as collaborators and colleagues to accomplish _both_, and to come up with solutions that should work for everyone. To the extent this discussion has been about that, I think it has been useful and positive. However, to the extent this discussion insists, on either side, on the shallow idea that we must treat “improvement” versus “stability” as irreconcilable factions in necessary conflict, then I fear it will be a missed opportunity.

II. Particulars

With that in mind, I think the _concrete_ voices of concern have been the most useful. Gregory Collins’ list of issues requiring CPP should be very sobering. Of note, I think they point to areas where the core libraries committee has not paid _enough_ attention (or perhaps has not been sufficiently empowered: recall that not all core libraries fall under its maintenance [2]). Things like the newtype FFI issue, the changes to prim functions, the splitup of old-time and the changes to exception code were _not_ vetted as closely as the AMP and FTP were, or as the MRP is currently being. I don’t know all the reasons for this, but I suspect they just somewhat slipped under the radar. In any case, if all those changes were as carefully engineered as the MRP proposal has been, then imho things would have been much smoother. So, while this discussion may be frustrating, it nonetheless in some ways provides a model of how people have sought to do better and be more proactive with careful discussion of changes. This is much appreciated.

Personally, since the big switch to extensible exceptions back in 6.10, and the split-base nonsense prior to that, very few changes to the core libraries have really caused too much disruption in my code. Since then, the old-time cleanup was the worst, and the big sin there was that time-locale-compat was only written some time after the fact by a helpful third-party contributor and not engineered from the start. (I will note that the time library is one of the core libraries that is _not_ maintained by the core libraries committee.)

Outside of that, the most disruptive changes to my code that I can recall have been from changes to the aeson library over the years — particularly but not only regarding its handling of doubles. I don’t begrudge these changes — they iteratively arrived at a _much_ better library than had they not been made. [3] After that, I made a few changes for Happstack and Snap API changes, if I recall. Additionally, the addition of “die” to System.Exit caused a few name clashes. My point is simply that there are many packages outside of base that also move, and “real” users with “real” code will these days often have quite a chain of dependencies, and will encounter movement and change from across many of them. So if we say “base never changes” that does not mean “packages will never break” — it just means that base will not have the same opportunity to improve that other packages do, which will eventually lead to frustration, just as it did in the past and in the leadup to the BBP.

III. Discussions

Further, since there has been much discussion of a window of opportunity, I would like to offer a counterpoint to the (sound) advice that we take into consideration voices with long experience in Haskell. The window of opportunity is, by definition, regarding takeup of Haskell by new users. And so if newer users favor certain changes, then it is good evidence that those changes will help with uptake among other new users. So, if they are good changes on their own, then the fact that they are appealing to newer users should be seen as a point in their favor, rather than a reason to dismiss those opinions. But if we are in a situation where we see generations of adopters pitted against one another, then we already have deeper problems that need to be sorted out.

Regarding where and how to have these discussions — the decision was made some time ago (I believe at the start of the initial Haskell Prime process if not sooner, so circa 2009?) that the prime committee would focus on language extensions and not library changes, and that those changes would be delegated to the libraries@ list. The lack of structure to the libraries@ list is what prompted the creation of the libraries committee, whose ultimate responsibility it is to decide on and shepherd through these changes, in consultation with others and ideally driven by broad consensus. Prior to this structure, things broke even more, imho, and simultaneously the things that were widely desired were still not implemented. So I thank the libraries committee for their good work so far.

So, it may be that the process of community discussion on core libraries changes is not best suited for the libraries@ list. But if not there, where? I worry that the proliferation of lists will not improve things here. Those involved with Haskell have multiplied (this is good). The voices to take into account have multiplied (this is good). Necessarily, this means that there will just be _more_ stuff, and making sure that everyone can filter to just the part they want is difficult. Here, perhaps, occasional libraries-related summary addenda to the ghc newsletter could be appropriate? Or is there another venue we should look towards? “Chair’s reports” to the Haskell Weekly News maybe?

IV. Summing up

We should bear in mind after all that this is just about cleaning up a redundant typeclass method (albeit one in a very prominent place) and hardly the hill anyone would want to die on [4]. Nonetheless, I think it would be a good sign of progress and collaboration if we can find a way to implement a modest change like this in a way that everyone finds acceptable vis a vis a sufficiently slow pace, the lack of a need for CPP and the lack of any induced warnings. On the other hand, other opportunities will doubtless present themselves in the future.

Best,
Gershom

[1] https://mail.haskell.org/pipermail/libraries/2015-February/025009.html
[2] https://wiki.haskell.org/Library_submissions#The_Core_Libraries
[3] and in any case I am sure Bryan would be the last to want us to treat him as some sort of “guru” on these matters. 
[4] for those in search of better hills to die on, this is a list of some good ones: http://www.theawl.com/2015/07/hills-to-die-on-ranked 

P.S. In case there is any question, this email, as all emails I write that do not state otherwise, is not being written in any particular capacity regarding the various infra-related hats I wear, but is just an expression of my own personal views.

Mike Meyer

unread,
Oct 6, 2015, 10:25:19 PM10/6/15
to Gershom B, Johan Tibell, Herbert Valerio Riedel, haskell-...@projects.haskell.org, libr...@haskell.org, haskel...@haskell.org, haskel...@haskell.org
There have been a lot of complaints about the way things have worked in the past, with things going too fast, or too slow, or the right people not being heard, or notices not going to the right places, etc.

As far as I can tell, the current process is that someone makes a proposal on some mailing list, which gets debated by those who find out about it, maybe a wiki page gets set up and announced to those who already know about the proposal, and then it either happens or not.

There's actually quite a bit of experience with dealing with such processes. I've dealt with the IETF RFC process and the Python PEP process, and both of them worked better than that. So I think we can do better.

I'd like to suggest we adopt something a bit more formal, which any change to a developer-visible API would have to go through. Note that bug fixes, most security fixes, and other things that don't change the API wouldn't be subject to this requirement. However, any change in an API, even one that's 100% backward compatible and couldn't possibly break any code, would be. My initial thought on scope is anything in the Haskell Platform, as that seems to be a minimal definition of "Haskell ecosystem". Further, anything on Hackage should be able to avail itself of the process.

My concrete, though very broad proposal, with lots of details to be filled in yet, is that we need:

1) A wiki page that lists all proposals being considered, along with their current status and other relevant information.

2) A set of requirements a proposal must meet in order to be listed on that page.

3) An announcements list that only has announcements of things being added to the list. Anybody who has time to vote on a proposal should have time to be on this list.

4) An editorial group responsible for maintaining the list and providing guidance on meeting the requirements to get on it.

The first three are easy. The fourth one is the killer. Somebody to do the work is the stumbling block for most proposals. This doesn't require deep technical knowledge of Haskell or the current ecosystem, but the ability to implement a process and judge things based on form and not content. Since it's my proposal, I'll volunteer as the first editor. Hopefully, others with better reputations will also be available.

If adopted, the first two things on the list need to be a description of the process, followed shortly by a description of the requirements to be met.

Mark Lentczner

unread,
Oct 7, 2015, 2:37:06 AM10/7/15
to Mike Meyer, Johan Tibell, Haskell Libraries, haskell cafe, haskell-prime@haskell.org List
Mike -

You might look up the phrase "A modest proposal".

... for gnu's sake let's not adopt the inane parity-implies-stability numbering convention ...

6.10 .. 6.12 .. 7.0 .. 7.2 .. 7.4 .. 7.6 .. 7.8 .. 7.10

- Mark 

Mark Lentczner

unread,
Oct 7, 2015, 2:45:41 AM10/7/15
to Mike Meyer, haskell-prime@haskell.org List, Gershom B, haskell-...@projects.haskell.org, Herbert Valerio Riedel, Johan Tibell, Haskell Libraries, haskell

On Tue, Oct 6, 2015 at 7:24 PM, Mike Meyer <m...@mired.org> wrote:
I've dealt with the IETF RFC process and the Python PEP process, and both of them worked better than that.

While both those are good examples of mostly working organizations shepherding foundational technical standard(s) along... there is one thing more important than their processes: Their stance. Both organizations have a very strong engineering discipline of keeping deployed things working without change. I don't think it is enough to simply model their process.

Until about three years ago, the Haskell community also had such a discipline. It has been steadily eroding over the last few years.

- Mark


Mike Meyer

unread,
Oct 7, 2015, 3:22:10 AM10/7/15
to Mark Lentczner, haskell-prime@haskell.org List, Gershom B, haskell-...@projects.haskell.org, Herbert Valerio Riedel, Johan Tibell, Haskell Libraries, haskell
On Wed, Oct 7, 2015 at 1:45 AM Mark Lentczner <mark.le...@gmail.com> wrote:
> On Tue, Oct 6, 2015 at 7:24 PM, Mike Meyer <m...@mired.org> wrote:
>> I've dealt with the IETF RFC process and the Python PEP process, and both of them worked better than that.
>
> While both those are good examples of mostly working organizations shepherding foundational technical standard(s) along... there is one thing more important than their processes: Their stance. Both organizations have a very strong engineering discipline of keeping deployed things working without change. I don't think it is enough to simply model their process.

Well, until Python 3, anyway.

My goal isn't to recreate the engineering discipline that keeps deployed things working without change as you upgrade the ecosystem; it's to provide a mechanism so the community can more easily engage with the evolution of the ecosystem. Hopefully this will make it easier for the community to move things forward in a desirable manner. But it's a process, and it leaves the question of whether the desire is for more stability or a less stagnant language up to the users of the process.

I don't necessarily want to model the IETF or PEP processes. Those are a starting point. I tried to abstract the initial points out enough that the final result could be either one of them, or something totally unrelated that's a better fit for the Haskell community.

Herbert Valerio Riedel

unread,
Oct 7, 2015, 3:35:33 AM10/7/15
to Sven Panne, Johan Tibell, Libraries, Haskell Cafe, haskell-prime@haskell.org Prime
On 2015-10-06 at 19:41:51 +0200, Sven Panne wrote:
> 2015-10-06 18:47 GMT+02:00 Herbert Valerio Riedel <h...@gnu.org>:
>
>> [...] That being said, as how to write your Monad instances today with GHC
>> 7.10 w/o CPP, while supporting at least GHC 7.4/7.6/7.8/7.10: This
>> *does* work (admittedly for an easy example, but this can be
>> generalised):
>>
>>
>> --8<---------------cut here---------------start------------->8---
>> module MyMaybe where
>>
>> import Control.Applicative (Applicative(..))
>> import Prelude (Functor(..), Monad(..), (.))
>> -- or alternatively: `import qualified Prelude as P`
>> [...]
>> --8<---------------cut here---------------end--------------->8---
>>
>> This example above compiles -Wall-clean and satisfies all your 3 stated
>> requirements afaics. I do admit this probably not what you had in mind.
>>
>
> OK, so the trick is that you're effectively hiding Applicative from the
> Prelude (which might be a no-op). This "works" somehow, but is not
> satisfactory IMHO for several reasons:

[...]

Btw, I've also seen the trick below, in which you use the aliased `A.`
prefix just once so GHC considers the import non-redundant, and you
don't have to suffer prefixed operators in the style of `A.<*>`.

Is this any better?

--8<---------------cut here---------------start------------->8---
import Control.Applicative as A (Applicative(..))

data Maybe' a = Nothing' | Just' a

instance Functor Maybe' where
  fmap f (Just' v) = Just' (f v)
  fmap _ Nothing'  = Nothing'

instance A.Applicative Maybe' where
  pure = Just'
  f1 <*> f2 = f1 >>= \v1 -> f2 >>= (pure . v1)

instance Monad Maybe' where
  Nothing' >>= _ = Nothing'
  Just' x  >>= f = f x

  return = pure -- "deprecated" since GHC 7.10
--8<---------------cut here---------------end--------------->8---

Sven Panne

unread,
Oct 7, 2015, 4:07:46 AM10/7/15
to Herbert Valerio Riedel, Johan Tibell, Libraries, Haskell Cafe, haskell-prime@haskell.org Prime
2015-10-07 9:35 GMT+02:00 Herbert Valerio Riedel <h...@gnu.org>:
Btw, I've also seen the trick below, in which you use the aliased `A.`
prefix just once so GHC considers the import non-redundant, and don't
have to suffer from prefixed operators in the style of `A.<*>`.

Is this any better? [...]

While not perfect, it's much better than having to fiddle around with Prelude imports. Although there's the slight danger that somebody else (or the author a year later) looks at the code and has a WTF-moment... ;-) To be honest, while it's somewhat obvious how it works when you read it, I've never seen that trick before. Perhaps stuff like this belongs in some general "Porting Guide", along with its alternatives. It's general enough that it should not be buried in some AMP/FTP/return/... transitioning guide.

Cheers,
   S.

Lorenzo Gatti

unread,
Oct 7, 2015, 4:49:36 AM10/7/15
to Haskell-cafe, mant...@gsd.uwaterloo.ca, libr...@haskell.org, haskel...@haskell.org, graham...@nottingham.ac.uk, mjg...@gmail.com, haskel...@haskell.org, b...@chreekat.net


On Tuesday, October 6, 2015 at 12:19:04 AM UTC+2, Bryan Richter wrote:
Hang on a moment, are you saying that all the people writing to argue
that these changes would require them to write dozens more #ifdef's
actually don't have to write any at all? I never knew what the *-compat
packages were all about. If that's what they're designed to do, I have a
feeling they have not gotten *nearly* enough exposure.
 

This is a significant elephant in the room: evidently nobody in charge cares about migration, or nobody is actually in charge at all. For extra humiliation compare the introduction of Python 3, with the accompanying comprehensive source code conversion tools ("2to3" and "3to2") and the well-designed built-in compatibility libraries.
Not promoting and cementing the use of compatibility libraries is only a facet of a deeper problem: there seems to be nobody seriously caring for a good "standard library" for practical use. There are many other symptoms:
  • The Prelude is small (some would say elegantly minimalist, but I say "batteries excluded"), vaguely defined (a bunch of traditional basic FP stuff, but what is included? What *should be* included?) and largely replaced by more general ways to perform the same task. It might be right for an interactive shell like GHCi, but for writing programs much more is needed.
  • Built in data types are rudimentary: cons lists are low performance but they get all the good syntax, optimistic conversion from characters to bytes assuming they are some vaguely defined extended ASCII is still encouraged, records are ridiculously limited (e.g. distinct field names within a module: finding a decent Data syntax workaround should have higher priority than clever lens libraries), efficient array and mapping types are relegated into very optional libraries.
  • Hackage should be a repository for specialized libraries, like PyPI or CPAN, but it also contains basic functionality that should belong in a standard library, typically split into many competing, incompatible and possibly obsolete modules and module clans. For example, do you want to affiliate your large file parsing project with the Pipes clan or with the Conduit clan? Choose wisely, because while using two similar libraries in object oriented languages like Python or C# or Java typically implies a simple and boring conversion between similar and straightforward data structures (unless, as is often the case, the libraries were designed to respect standard interfaces or to emulate each other), in Haskell combining libraries tends to involve transformers that do not exist between subtly different type classes you don't care about.
  • I've yet to see any adequately documented Haskell library: a toy example of the easiest case or a brief statement of the general concept appear to be the best one can hope for. I consider allergy to documentation and practical examples evidence that such libraries are academy-grade proofs of concept, only working decently (for some purposes, maybe) because they are made by smart and competent people after all, but not practical, useful and reliable software that I should be comfortable using. In most other programming language communities, documentation is more important than code because bad code can be improved but bad documentation turns away potential users forever.
Until Haskell gets a serious standard library with the accompanying serious documentation, like e.g. Python, and Haskell luminaries redirect their hacking efforts from making cool experiments to maintaining practical libraries, worrying about GHC evolution is more than bikeshedding: it's psychotic denial.

Simon Peyton Jones

unread,
Oct 7, 2015, 11:37:06 AM10/7/15
to Mike Meyer, Mark Lentczner, Johan Tibell, Haskell Libraries, haskell cafe, haskell-prime@haskell.org List

I think there are several different conversations going on at once in this thread.  I think it’s worth keeping them separate.

 

·         Haskell Prime.  The intention there is to take a set of language features that are already in wide use in GHC (i.e. have demonstrably proved valuable), work out any dark corners, formalise them, and embody the result in a new Haskell Report.   That puts a useful stake in the ground, empowers alternative implementations of Haskell, gives confidence all round.

 

I think it’d be unusual for the Haskell Prime committee to specify a new feature of the language, one that had not been field tested.

·         Libraries.  AMP, BBP, and Monad(return) are small but fundamental changes to the core libraries.  I think there was a consensus (not universal in the case of BBP) that the change was good.  Yet AMP and BBP (esp) were controversial.  The issues were mostly around how to make the transition; and, given that the transition is expensive, whether the cost/benefit tradeoff justifies the change.  The question of moving ‘return’ out of Monad is in this category.

The Core Libraries Committee was formed explicitly to handle this stuff. So my prior assumption was that the CLC would handle the Monad(return) question, not Haskell Prime.

 

Mark’s suggestion of a “stable” GHC 7.10 and a new GHC 8.0 is a reasonable one. But I for one would find it hard to promise to back-port every bug fix, especially as the two code bases diverge (which they will). 

 

Here is another idea.  GHC knows very little about Monad.  It would take work, but it probably wouldn’t be impossible to make the same GHC work with two different ‘base’ libraries, each with a different definition of the Monad class.  That would not solve the problem: you still could not use one library that used old Monad with another library that used new Monad.   But at least it’d decouple it from which version of GHC you were using.  I stress: it would take some work to make this go, and I’d prefer not to do this.

 

Returning to ‘return’, my instinct is that when these pervasive breaking library changes come up, we should batch them into larger clumps.  The “treadmill” complaint is real: small change A made me re-release my library; then small change B came along; and so on.  Perhaps if we saved them up this would be less of an issue, for two reasons.  First, the work happens once rather than many times.  Second, the benefit of the change is the sum of the benefits of the little component changes, and so is more attractive to library authors and library clients.

 

That line of thinking would suggest that the Core Libraries Committee might want to maintain a list of well-worked out agreed changes that are being “saved up” for execution at some later date.

 

Simon

Gregory Collins

unread,
Oct 7, 2015, 12:09:23 PM10/7/15
to Gershom B, Haskell Libraries, haskell-prime@haskell.org List, Haskell Cafe, Herbert Valerio Riedel
On Tue, Oct 6, 2015 at 6:18 PM, Gershom B <gers...@gmail.com> wrote:
 

> With that in mind, I think the _concrete_ voices of concern have been the most useful. Gregory Collins’ list of issues requiring CPP should be very sobering. Of note, I think they point to areas where the core libraries committee has not paid _enough_ attention (or perhaps has not been sufficiently empowered: recall that not all core libraries fall under its maintenance [2]). Things like the newtype FFI issue, the changes to prim functions, the splitup of old-time and the changes to exception code were _not_ vetted as closely as the AMP and FTP were, or as the MRP is currently being. I don’t know all the reasons for this, but I suspect they just somewhat slipped under the radar.

In fact, more often than I would like, I can recall arguing against a particular change on the grounds that it would break user code, and in Haskell land this is a battle I usually lose. Usually the argument on the other side boils down to expediency or hygiene/aesthetics -- it's difficult to engineer a change to some core infra in a way that minimizes impact on people downstream, and it takes a long time. Often "this change is going to cause a small amount of work for all of my users" is something that seems to not be taken into consideration at all.

For this particular proposal, every user will have some small amount of work w to do (to read the change notes, understand why 'return' is going away, train yourself to use "pure" now instead of "return" like you've been using for 15 years, etc). It might feel like w is small and so the change isn't burdensome, but n is literally everyone who uses the language, so the total work w*n is going to amount to quite a few person-hours. I just want to make sure that everyone is keeping that in mind and weighing that effort against the benefits.

> Outside of that, the most disruptive changes to my code that I can recall have been from changes to the aeson library over the years — particularly but not only regarding its handling of doubles. I don’t begrudge these changes — they iteratively arrived at a _much_ better library than had they not been made. [3] After that, I made a few changes for Happstack and Snap API changes, if I recall. Additionally, the addition of “die” to System.Exit caused a few name clashes. My point is simply that there are many packages outside of base that also move, and “real” users with “real” code will these days often have quite a chain of dependencies, and will encounter movement and change from across many of them. So if we say “base never changes” that does not mean “packages will never break” — it just means that base will not have the same opportunity to improve that other packages do, which will eventually lead to frustration, just as it did in the past and in the leadup to the BBP.

Culturally, we have a problem with library authors of all stripes being too cavalier about breaking user programs: we definitely lean towards "move fast and break things" rather than "stay stable and don't make work for users". As you write more and more Haskell code, you depend on more and more of these libraries, and this means that once you go beyond a certain threshold you will be spending a significant amount of your time just running to keep up with the treadmill. Personally, I don't have as much time for writing Haskell code as I used to (or as I would like), so I would say that for me the treadmill tax now probably exceeds 50% of my total hours invested.

Greg

Erik Hesselink

unread,
Oct 7, 2015, 12:38:27 PM10/7/15
to Gregory Collins, Herbert Valerio Riedel, Haskell Libraries, Haskell Cafe, Gershom B, haskell-prime@haskell.org List
On 7 October 2015 at 18:09, Gregory Collins <gr...@gregorycollins.net> wrote:
> For this particular proposal, every user will have some small amount of work
> w to do (to read the change notes, understand why 'return' is going away,
> train yourself to use "pure" now instead of "return" like you've been using
> for 15 years, etc).

While I don't think it detracts from your argument, it seems you
misread the original proposal. At no point will it remove `return`
completely. It would be moved out of the `Monad` class and be made
into a top-level definition instead, so you would still be able to use
it.
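Concretely, the post-MRP arrangement would look something like the following sketch (the primed names are used here only so the file compiles alongside the real Prelude; in the proposal itself the class is Prelude's `Monad`):

```haskell
-- After MRP, Monad no longer carries `return` as a method;
-- it only adds bind on top of Applicative.
class Applicative m => Monad' m where
  bind' :: m a -> (a -> m b) -> m b

-- `return` survives as an ordinary top-level binding aliasing `pure`,
-- so existing call sites keep compiling unchanged:
return' :: Applicative m => a -> m a
return' = pure
```

Only instances that *define* `return` would need attention; code that merely *uses* it would resolve to the top-level binding.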

Erik

Michael Orlitzky

unread,
Oct 7, 2015, 1:34:28 PM10/7/15
to haskel...@haskell.org
(replying to no one in particular)

This problem isn't specific to Haskell. In every other language, you
have projects that support major versions of toolkits, compilers,
libraries and whatnot. And there's already a tool for it: git.

Instead of using #ifdef to handle four different compilers, keep a
branch for each. Git is designed to make this easy, and it's usually
trivial to merge changes from the master branch back into e.g. the
ghc-7.8 branch. That way the code in your master branch stays clean.
When you want to stop supporting an old GHC, delete that branch.
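As a sketch of that workflow (branch names illustrative):

```shell
# One branch per incompatible compiler series; master tracks the newest.
git checkout -b ghc-7.8 master

# Day-to-day work happens on master; fold it back periodically:
git checkout ghc-7.8
git merge master        # conflicts arise only where an #ifdef would have been

# Dropping support for an old GHC is a one-liner:
git branch -D ghc-7.8
```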

Francesco Ariis

unread,
Oct 7, 2015, 1:50:05 PM10/7/15
to haskel...@haskell.org
On Wed, Oct 07, 2015 at 01:34:21PM -0400, Michael Orlitzky wrote:
> Instead of using #ifdef to handle four different compilers, keep a
> branch for each. Git is designed to make this easy, and it's usually
> trivial to merge changes from the master branch back into e.g. the
> ghc-7.8 branch. That way the code in your master branch stays clean.
> When you want to stop supporting an old GHC, delete that branch.

I suspect `cabal install your-library` without CPP would explode in
the face of (some of) your users though.

Taru Karttunen

unread,
Oct 7, 2015, 2:02:49 PM10/7/15
to Michael Orlitzky, haskel...@haskell.org
On 07.10 13:34, Michael Orlitzky wrote:
> (replying to no one in particular)
>
> This problem isn't specific to Haskell. In every other language, you
> have projects that support major versions of toolkits, compilers,
> libraries and whatnot. And there's already a tool for it: git.
>
> Instead of using #ifdef to handle four different compilers, keep a
> branch for each. Git is designed to make this easy, and it's usually
> trivial to merge changes from the master branch back into e.g. the
> ghc-7.8 branch. That way the code in your master branch stays clean.
> When you want to stop supporting an old GHC, delete that branch.

Isn't this hard to do correctly for libraries with Hackage
and Cabal and narrowly versioned dependencies and deep
import graphs?

E.g. when adding a new feature to the library and merging
it back to the ghc-7.8 branch the versions needed for
features vs compiler support could end up causing
complex dependency clauses for users of such libraries.

- Taru Karttunen

Michael Orlitzky

unread,
Oct 7, 2015, 2:06:35 PM10/7/15
to haskel...@haskell.org
On 10/07/2015 01:48 PM, Francesco Ariis wrote:
> On Wed, Oct 07, 2015 at 01:34:21PM -0400, Michael Orlitzky wrote:
>> Instead of using #ifdef to handle four different compilers, keep a
>> branch for each. Git is designed to make this easy, and it's usually
>> trivial to merge changes from the master branch back into e.g. the
>> ghc-7.8 branch. That way the code in your master branch stays clean.
>> When you want to stop supporting an old GHC, delete that branch.
>
> I suspect `cabal install your-library` without CPP would explode in
> the face of (some of) your users though.

The different branches would have different cabal files saying with
which version of GHC they work. If the user has ghc-7.8 installed, then
cabal-install (or at least, any decent package manager) should pick the
latest version supporting ghc-7.8 to install.
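For example (package name and bounds illustrative), the `.cabal` file on a ghc-7.8 branch might pin `base` to the version bundled with that compiler, letting the solver select the right release series automatically:

```
name:          my-library
version:       1.2.0.1
build-type:    Simple
cabal-version: >= 1.10

library
  exposed-modules:  My.Library
  -- base 4.7.x is the version that ships with GHC 7.8.x
  build-depends:    base >= 4.7 && < 4.8
  default-language: Haskell2010
```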

Michael Orlitzky

unread,
Oct 7, 2015, 2:12:54 PM10/7/15
to haskel...@haskell.org
On 10/07/2015 02:02 PM, Taru Karttunen wrote:
>>
>> Instead of using #ifdef to handle four different compilers, keep a
>> branch for each. Git is designed to make this easy, and it's usually
>> trivial to merge changes from the master branch back into e.g. the
>> ghc-7.8 branch. That way the code in your master branch stays clean.
>> When you want to stop supporting an old GHC, delete that branch.
>
> Isn't this hard to do correctly for libraries with Hackage
> and Cabal and narrowly versioned dependencies and deep
> import graphs?

It can be...

> E.g. when adding a new feature to the library and merging
> it back to the ghc-7.8 branch the versions needed for
> features vs compiler support could end up causing
> complex dependency clauses for users of such libraries.
>

but can you think of an example where you don't have the same problem
with #ifdef?

If I need to use a new version of libfoo and the new libfoo only
supports ghc-7.10, then I'm screwed either way right?

David Thomas

unread,
Oct 7, 2015, 3:50:50 PM10/7/15
to Michael Orlitzky, Haskell Cafe
This sounds like the right approach. To whatever degree existing
tooling makes this difficult, maybe we can fix existing tooling?

Mark Lentczner

unread,
Oct 7, 2015, 5:13:32 PM10/7/15
to Erik Hesselink, Gershom B, haskell-prime@haskell.org List, Herbert Valerio Riedel, Haskell Libraries, Haskell Cafe

On Wed, Oct 7, 2015 at 9:38 AM, Erik Hesselink <hess...@gmail.com> wrote:
While I don't think it detracts from your argument, it seems you
misread the original proposal. At no point will it remove `return`
completely. It would be moved out of the `Monad` class and be made
into a top-level definition instead, so you would still be able to use
it.

Then why bother?
If you don't intend to regard code that uses "return" as old, out-dated, in need of updating, etc....
If you don't intend to correct people on #haskell to use pure instead of return...
If you don't tsk tsk all mentions of it in books.... 
If you don't intend to actually deprecate it.
Why bother?

But seriously, why do you think that "you would still be able to use it"? That is true for only the simplest of code - and untrue for anyone who has a library that defines a Monad - or anyone who has a library that they want to keep "up to date". Do you really want to have a library where all your "how to use this" code has return in the examples? Shouldn't it now be pure? Do I now need -XCPP just for Haddock? and my wiki page? And what gets shown in Hackage? This is just a nightmare for a huge number of libraries, and especially many commonly used ones.

Why bother!

Bardur Arantsson

unread,
Oct 7, 2015, 5:35:16 PM10/7/15
to haskel...@haskell.org, libr...@haskell.org, haskel...@haskell.org
On 10/07/2015 08:36 AM, Mark Lentczner wrote:
> Mike -
>
> You might look up the phrase "A modest proposal".
>

While I personally appreciated it, I don't think irony/sarcasm/satire is
particularly appropriate given the state of discussion at this point ;).

Regards,

Manuel Gómez

unread,
Oct 7, 2015, 6:43:00 PM10/7/15
to Mark Lentczner, Haskell Libraries, Haskell Cafe, haskell-prime@haskell.org List
On Wed, Oct 7, 2015 at 4:43 PM, Mark Lentczner <mark.le...@gmail.com> wrote:
> If you don't intend to actually deprecate it.
> Why bother?
>
> But seriously, why do you think that "you would still be able to use it"?
> That is true for only the simplest of code - and untrue for anyone who has a
> library that defines a Monad - or anyone who has a library that they want to
> keep "up to date". Do you really want to have a library where all your "how
> to use this" code has return in the examples? Shouldn't it now be pure? Do I
> now need -XCPP just for Haddock? and my wiki page? And what gets shown in
> Hackage? This is just a nightmare for a huge number of libraries, and
> especially many commonly used ones.
>
> Why bother!

This is explained in the original proposal. In particular, it
eliminates opportunities for errors and simplifies ApplicativeDo. I
don’t believe anyone has proposed removing return from base. The only
proposed change is turning return into a stand-alone function instead
of a method in Monad. There is no proposal for removing return.

Brandon Allbery

unread,
Oct 7, 2015, 7:05:51 PM10/7/15
to Bardur Arantsson, Haskell Libraries, haskell-cafe, Haskell-prime Mailing List

On Wed, Oct 7, 2015 at 4:54 PM, Bardur Arantsson <sp...@scientician.net> wrote:
Please consider that the way practical development really happens[2]

...among web developers, who of course are the only real developers?

Have you considered that there are developers who are not web developers?
The past day has convinced me that the web devs have relegated everyone else to fake-non-programmer status and actively want them out of the community because fake programmers don't benefit you real programmers.

I had heard that the financial users generally refused to have anything to do with the Haskell community.
Now I know why.

I wonder how many of them, if any indeed are left after past breaking changes, are in the process of switching to OCaml. I'm sure you consider that a good thing, because they're obviously just holding back "real programmers".

--
brandon s allbery kf8nh                               sine nomine associates
allb...@gmail.com                                  ball...@sinenomine.net
unix, openafs, kerberos, infrastructure, xmonad        http://sinenomine.net

Bardur Arantsson

unread,
Oct 7, 2015, 7:39:14 PM10/7/15
to haskel...@haskell.org, libr...@haskell.org, haskel...@haskell.org
On 10/08/2015 01:05 AM, Brandon Allbery wrote:
> On Wed, Oct 7, 2015 at 4:54 PM, Bardur Arantsson <sp...@scientician.net>
> wrote:
>
>> Please consider that the way practical development really happens[2]
>
>
> ....among web developers, who of course are the only real developers?
>

Nononono, I only provided that as an example. I'm well aware that there
are whole other ecosystems. (I, for example, am currently doing full-stack.)

> Have you considered that there are developers who are not web developers?
> The past day has convinced me that the web devs have relegated everyone
> else to fake-non-programmer status and actively want them out of the
> community because fake programmers don't benefit you real programmers.
>

Re-read your own words. You're doing exactly the same, just in reverse.

> I had heard that the financial users generally refused to have anything to
> do with the Haskell community.
> Now I know why.

I think Standard Chartered might disagree, but... whatevs.

>
> I wonder how many of them, if any indeed are left after past breaking
> changes, are in the process of switching to OCaml. I'm sure you consider
> that a good thing, because they're obviously just holding back "real
> programmers".
>

O'calm down! :)

The world isn't going to end over this.

(And... if you still think it is: Provide quantifiable data: Tell us how
many of *your* MLoC will be affected. Tell us... whatever you can
without violating NDAs, etc. "We" *will* appreciate and take this into
account.)

Regards,

Mike Meyer

unread,
Oct 7, 2015, 7:39:39 PM10/7/15
to Brandon Allbery, Bardur Arantsson, Haskell Libraries, haskell-cafe, Haskell-prime Mailing List
On Wed, Oct 7, 2015 at 6:05 PM Brandon Allbery <allb...@gmail.com> wrote:
On Wed, Oct 7, 2015 at 4:54 PM, Bardur Arantsson <sp...@scientician.net> wrote:
Please consider that the the way practical development really happens[2]
...among web developers, who of course are the only real developers?
[...]
I had heard that the financial users generally refused to have anything to do with the Haskell community.
Now I know why.

I'm curious - do "practical" developers really feel like they have to rush out and update their tool chain whenever a new version of part of it comes out? Most of the projects I've worked on considered the language version as a fixed part of the technology stack, and almost never updated it. Even when using Python, which valued not breaking working code more than its own zen. But changing anything that potentially affected all the code in a working project was pretty much never done, and always involved a lot of effort.

So the worst headache I got from language evolution was from trying to remember which set of features I had available for each project. No, that's second - the biggest one was from arguments about when we should adopt a new version. But breaking working code pretty much didn't happen.

Richard A. O'Keefe

unread,
Oct 7, 2015, 11:58:28 PM10/7/15
to Henrik Nilsson, Haskell Libraries, haskell cafe, haskell-prime@haskell.org List
The change to make Monad a special case of Applicative has been
a long time coming, and it has been a long time coming precisely
because it's going to break things.

I feel ambivalent about this. As soon as the question was
raised, it was clear that it *would have been* nicer had
Haskell been this way originally. I'm not looking forward to
the consequences of the change.


On 7/10/2015, at 12:32 am, Henrik Nilsson <Henrik....@nottingham.ac.uk> wrote:
> While we can discuss the extent of additional breakage
> MRP would cause, the fact remains it is a further
> breaking change. A survey of breakage to books as
> Herbert did is certainly valuable (thanks!), but
> much breakage will (effectively) remain unquantifiable.

This suggests one way in which one of the costs of the change
could be reduced. Not just to survey the breakage, but to
actually put revised material up on the Haskell.org web site,
with a prominent link from the home page.

Richard A. O'Keefe

unread,
Oct 8, 2015, 1:04:08 AM10/8/15
to Mike Meyer, Bardur Arantsson, Haskell Libraries, haskell-cafe, Haskell-prime Mailing List

On 8/10/2015, at 12:39 pm, Mike Meyer <m...@mired.org> wrote:
>
> I'm curious - do "practical" developers really feel like they have to rush out and update their tool chain whenever a new version of part of it comes out? Most of the projects I've worked on considered the language version as a fixed part of the technology stack, and almost never updated it. Even when using Python, which valued not breaking working code more than it's own zen. But changing anything that potentially affected all the code in a working project was pretty much never done, and always involved a lot of effort.

My computer spends more of its time installing new versions of Java 1.8
than it does running Java. An application I found very useful stopped
working when Java 1.6 went away, but Java 1.6 had to go away because of
(in)security concerns. How anyone managed to write a program in Java
1.6 that would not run in 1.8 I hope never to know.

(No, rebuilding from sources didn't work either. Tried that.)

Tony Morris

unread,
Oct 8, 2015, 1:26:06 AM10/8/15
to haskel...@haskell.org
Java is not backward compatible, even between minor releases.

Rustom Mody

unread,
Oct 8, 2015, 3:01:43 AM10/8/15
to Simon Peyton Jones, haskell-prime@haskell.org List, Haskell Libraries, haskell cafe
 

 

From: Haskell-Cafe [mailto:haskell-ca...@haskell.org] On Behalf Of Mike Meyer
Sent: 07 October 2015 00:24
To: Mark Lentczner; Johan Tibell
Cc: Haskell Libraries; haskell cafe; haskel...@haskell.org List
Subject: Re: [Haskell-cafe] MRP, 3-year-support-window, and the non-requirement of CPP (was: Monad of no `return` Proposal (MRP): Moving `return` out of `Monad`)

 

On Tue, Oct 6, 2015 at 4:15 PM Mark Lentczner <mark.le...@gmail.com> wrote:


On Thu, Sep 24, 2015 at 2:43 PM, Herbert Valerio Riedel <h...@gnu.org> wrote:

TLDR: To complete the AMP, turn `Monad(return)` method into a
      top-level binding aliasing `Applicative(pure)`.

 

Sure... if we had a language that no one uses and could keep reforming like putty until it is perfect. But we don't.

 

A modest proposal:

 

We can't keep tinkering with a language and its libraries like this AND have a growing ecosystem that serves an ever widening base, including the range from newcomer to commercial deployment. SO - let's do all the language tinkering in GHC 8; there can be as many prereleases of that as needed until it is just right. ...and leave GHC 7 (7.10? roll back to 7.8.4?) for all of us doing essential and dependable libraries, commercial work, projects on Haskell that we don't want to have to go back and add #ifdefs to twice a year just to keep running, educators, people writing books. We can keep improving GHC 7 as needed, and focus on bugs, security issues, patches, cross compatibility, etc.

 

On Wed, Oct 7, 2015 at 9:06 PM, Simon Peyton Jones <sim...@microsoft.com> wrote:

I think there are several different conversations going on at once in this thread.  I think it’s worth keeping them separate.

<snip>

Returning to ‘return’, my instinct is that when these pervasive breaking library changes come up, we should batch them into larger clumps.  The “treadmill” complaint is real: small change A made me re-release my library; then small change B came along; and so on.  Perhaps if we saved them up this would be less of an issue, for two reasons.  First, the work happens once rather than many times.  Second, the benefits of the change is the sum of the benefits of the little component changes, and so is more attractive to library authors and library clients.

 

That line of thinking would suggest that the Core Libraries Committee might want to maintain a list of well-worked out agreed changes that are being “saved up” for execution at some later date.

 

Simon

 

This whole discussion is tilted towards the software engineering side.
Of course, from this side, if someone keeps changing the underbelly in small but significant ways that have multiplicative effects on higher levels, the people who have to manage those affected levels will protest.

However I'd like to respectfully submit that the software engineering angle is not the only one; there's also the teaching angle.

When the Haskell report first came out in 1990 this dual purpose was very clear in the first paras.
Over the years, as Haskell has progressed from being an academic language to a 'Real World' one, this balance has been lost.
Whether monads are really rocket science, or just ill-designed, misnamed stuff that looks like a mess because no heavy-duty vacuum cleaner has been applied, no one knows for sure, because there is little pedagogic experience with any alternative 'cleaned out Haskell'.

However this is just to point out that that possibility may be there.

And from the pedagogic angle Haskell may be neat but functional programming is probably more significant and programming in general even more so.

In this respect here are two posts:
and its sequel: http://blog.languager.org/2015/06/functional-programming-moving-target.html
which is a more in-depth analysis of FP in the ACM curriculum 2013, and of how pedagogic realities that were generally understood some decades ago have been increasingly forgotten today.

For the specific case at hand -- return vs pure -- I've no specific opinion.
However, on a wider front, the confusion and frustration of the thousands of beginners who have to see the forbidding, meaningless 'Functor' and 'Monad' instead of something more accessible like Mappable/Computational is IMHO a factor in Haskell not having as much success as it should.

Hopefully the community will take SPJ's call for significant batched-up changes seriously.

Mario Lang

unread,
Oct 8, 2015, 4:21:34 AM10/8/15
to Haskell Cafe, Haskell Libraries, haskel...@haskell.org
José Manuel Calderón Trilla <jm...@jmct.cc> writes:

> Many of us here learned from those texts or those courses. It's easy online
> to say that materials being out of date isn't a big deal, but it can turn
> people off the language when the code they paste into ghci doesn't
> work.

FWIW, I am a newbie, and currently learning a lot from the web.
I've had a different feeling. Whenever I read something like "This is
historical, and should have been fixed, but isn't yet", it were these
sentences that almost turned me off the language, because I had a
feeling that there is some stagnation going on.

From the POV of a newbie, I am all for fixing historical hiccups.
However, I realize that I have different priorities compared to
long-time users and people needing to maintain a lot of existing code.

Just my 0.01€

--
CYa,
⡍⠁⠗⠊⠕

Tony Morris

unread,
Oct 8, 2015, 4:52:30 AM10/8/15
to haskel...@haskell.org
Many of my students are distracted by this too. This is why it is turned
off and a practical type-class hierarchy is used to demonstrate concepts.

As for use in production, it's been used for coming up to 10 years. I
will wait.

On 08/10/15 18:21, Mario Lang wrote:
> José Manuel Calderón Trilla <jm...@jmct.cc> writes:
>
>> Many of us here learned from those texts or those courses. It's easy online
>> to say that materials being out of date isn't a big deal, but it can turn
>> people off the language when the code they paste into ghci doesn't
>> work.
>
> FWIW, I am a newbie, and currently learning a lot from the web.
> I've had a different feeling. Whenever I read something like "This is
> historical, and should have been fixed, but isn't yet", it were these
> sentences that almost turned me off the language, because I had a
> feeling that there is some stagnation going on.
>
> From the POV of a newbie, I am all for fixing historical hiccups.
> However, I realize that I have different priorities compared to
> long-time users and people needing to maintain a lot of existing code.
>
> Just my 0.01€
>

Kosyrev Serge

unread,
Oct 8, 2015, 5:39:50 AM10/8/15
to Tony Morris, haskel...@haskell.org
Tony Morris <tonym...@gmail.com> writes:
> Java is not backward compatible, even between minor releases.

Is there an understanding of how they manage to qualify as an
industrially-acceptable language in spite of this?

--
с уважениeм / respectfully,
Косырев Серёга

Tony Morris

unread,
Oct 8, 2015, 5:41:57 AM10/8/15
to Kosyrev Serge, haskel...@haskell.org
Successful marketing.

On 08/10/15 19:39, Kosyrev Serge wrote:
> Tony Morris <tonym...@gmail.com> writes:
>> Java is not backward compatible, even between minor releases.
>
> Is there an understanding of how they manage to qualify as an
> industrially-acceptable language in spite of this?
>

Ben Gamari

unread,
Oct 8, 2015, 9:55:22 AM10/8/15
to Michael Orlitzky, haskel...@haskell.org
Michael Orlitzky <mic...@orlitzky.com> writes:

> (replying to no one in particular)
>
> This problem isn't specific to Haskell. In every other language, you
> have projects that support major versions of toolkits, compilers,
> libraries and whatnot. And there's already a tool for it: git.
>
> Instead of using #ifdef to handle four different compilers, keep a
> branch for each. Git is designed to make this easy, and it's usually
> trivial to merge changes from the master branch back into e.g. the
> ghc-7.8 branch. That way the code in your master branch stays clean.
> When you want to stop supporting an old GHC, delete that branch.
>
I don't find this option terribly appealing. As a maintainer I would far
prefer maintaining a bit of CPP than a proliferation of branches, with
all of the cherry-picking and potential code divergence that they
bring.

Really though, the point I've been trying to make throughout is that I
think that much of the CPP that we are currently forced to endure isn't
even strictly necessary. With a few changes to our treatment of warnings
and some new pragmas (aimed at library authors), we can greatly reduce
the impact that library interface changes have on users.
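For reference, the kind of CPP in question typically looks like the following (an illustrative sketch; the `MIN_VERSION_base` macro is defined by Cabal when it builds the package):

```haskell
{-# LANGUAGE CPP #-}
module Compat (whenM) where

-- Before base 4.8 (GHC 7.10), Applicative was not re-exported from
-- the Prelude, so older compilers need an explicit import.
#if !MIN_VERSION_base(4,8,0)
import Control.Applicative (Applicative(..))
#endif

-- A tiny combinator written to compile on both sides of the AMP boundary.
whenM :: (Applicative m, Monad m) => m Bool -> m () -> m ()
whenM cond act = cond >>= \b -> if b then act else pure ()
```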

Cheers,

- Ben

Richard Eisenberg

unread,
Oct 8, 2015, 1:05:30 PM10/8/15
to Ben Gamari, haskell-cafe@haskell.org Cafe

On Oct 8, 2015, at 9:55 AM, Ben Gamari <b...@smart-cactus.org> wrote:

> With a few changes to our treatment of warnings
> and some new pragmas (aimed at library authors), we can greatly reduce
> the impact that library interface changes have on users.

My loose following of the interweaved threads has led me to this same conclusion. Have you paid close enough attention to list exactly what these changes should be? I have not. But I'd love to find a general solution to the migration problem so that we can continue to tinker with our beloved language without fear of flames burning down the house.

Richard

Joachim Breitner

unread,
Oct 8, 2015, 2:05:55 PM10/8/15
to haskel...@haskell.org
Hi,

Am Donnerstag, den 08.10.2015, 13:05 -0400 schrieb Richard Eisenberg:
> On Oct 8, 2015, at 9:55 AM, Ben Gamari <b...@smart-cactus.org> wrote:
> > With a few changes to our treatment of warnings
> > and some new pragmas (aimed at library authors), we can greatly
> > reduce
> > the impact that library interface changes have on users.
>
> My loose following of the interweaved threads has led me to this same
> conclusion. Have you paid close enough attention to list exactly what
> these changes should be? I have not. But I'd love to find a general
> solution to the migration problem so that we can continue to tinker
> with our beloved language without fear of flames burning down the
> house.

how willing are we to make the compiler smarter and more complicated to
make sure old code does what it originally meant to do, as long as it
is in the (large number close to 100)% common case?

For example, in this case (as was suggested in some trac ticket, I
believe), ignore a method definition for a method that
* is no longer in the class and
* where a normal function is in scope and
* these are obvious equivalent
where obvious may mean various things, depending on our needs.

One could think this further: If the compiler sees a set of Applicative
and Monad instances for the same type in the same module, and it has
"return = something useful" and "pure = return", can’t it just do what
we expect everyone to do manually and change that to "pure = something
useful" and remove return?

In other words, can we replace pain and hassle for a lot of people by
implementation work (and future maintenance cost) for one or a few
people?

Or will that lead to another hell where code does no longer mean what
it says?

Greetings,
Joachim

--
Joachim “nomeata” Breitner
ma...@joachim-breitner.de • http://www.joachim-breitner.de/
Jabber: nom...@joachim-breitner.de • GPG-Key: 0xF0FBF51F
Debian Developer: nom...@debian.org


Michael Orlitzky

unread,
Oct 8, 2015, 2:53:05 PM10/8/15
to haskel...@haskell.org

On 10/08/2015 09:55 AM, Ben Gamari wrote:
> Michael Orlitzky <mic...@orlitzky.com> writes:
>
>> (replying to no one in particular)
>>
>> This problem isn't specific to Haskell. In every other language,
>> you have projects that support major versions of toolkits,
>> compilers, libraries and whatnot. And there's already a tool for
>> it: git.
>>
>> Instead of using #ifdef to handle four different compilers, keep
>> a branch for each. Git is designed to make this easy, and it's
>> usually trivial to merge changes from the master branch back into
>> e.g. the ghc-7.8 branch. That way the code in your master branch
>> stays clean. When you want to stop supporting an old GHC, delete
>> that branch.
>>
> I don't find this option terribly appealing. As a maintainer I
> would far prefer maintaining a bit of CPP than a proliferation of
> branches, with all of the cherry-picking and potential code
> divergence that they bring.

It's really not that bad. You only need a separate branch when things
are actually incompatible, so while you may support four versions of
GHC, you might only need two branches.

And the difference between the branches should be exactly what you
have wrapped in an #ifdef right now:

a) If you're modifying something in an #ifdef, then the change only
goes in one branch, and you don't have to cherry-pick anything.

b) If the change wouldn't fall within an #ifdef, then it affects code
common to all your branches, and the merge will be trivial.

It's annoying to have to do that for every change, so don't. Keep a
pile of common changes in your master branch, and then rebase/merge
the other branches right before you make a release.


David Thomas

unread,
Oct 8, 2015, 5:12:48 PM10/8/15
to Michael Orlitzky, Haskell Cafe
Right. With a nest of #ifdefs, you still have the same number of
branches (at least - there may be some combinations of options that
you don't support and it won't be obvious); they're just implicit and
harder to work with.

Taru Karttunen

unread,
Oct 8, 2015, 6:05:43 PM10/8/15
to David Thomas, Haskell Cafe
On 08.10 14:12, David Thomas wrote:
> Right. With a nest of #ifdefs, you still have the same number of
> branches (at least - there may be some combinations of options that
> you don't support and it won't be obvious), they're just implicit and
> harder to work with.

Have you got an example of a library that works with branches
for GHC-versions while maintaining feature-parity across
branches with a versioning scheme that works in practice with
Hackage?

- Taru Karttunen

Ben

unread,
Oct 8, 2015, 6:32:46 PM10/8/15
to haskel...@haskell.org
I like the idea of a separate translator that understands how to make the obvious changes required by such minor improvements (remove/add definitions of return, remove/add imports of Applicative, etc), and having that applied by cabal when the code is built.

Then library authors could write normal code conforming to *one* release, instead of figuring out the clever import gymnastics or CPP required to code simultaneously against several almost-compatible interfaces (or forking).

Some metadata in a cabal file could hopefully be all you need to allow your code to be adjusted forward or backward a few dialects.