On 5 Oct 2015, at 10:47, Michał J Gajda <mjg...@gmail.com> wrote:

Hi,

As a person who has used Haskell in all three capacities (for scientific research, for commercial purposes, and to introduce others to the benefits of pure and strongly typed programming), I must add a supportive voice for this change:

1. Orthogonal type classes are easier to explain.
2. Gradual improvements help us to generalize further, and this in turn makes education easier.
3. Gradual changes that break only a little help to prevent both stagnation (FORTRAN) and big breakage (py3k). That keeps us excited.

That would also call for splitting type classes into their orthogonal elements: return, ap, and bind each getting a basic type class of its own.
So: +1, but only if it is possible to have a compatibility mode. I believe that rebindable syntax should allow us to make our own prelude otherwise, if we want such a split. Then we could test it well before it is used by the base library.

That said, I would appreciate a Haskell2010 option just like Haskell98 was, so that we can compile old programs without changes, even by using some Compat version of the standard library. Would that satisfy the need for stability?

PS And since all experts were beginners some time ago, I beg that we do not call them "peripheral".

--
Best regards
Michał
On Monday, 5 October 2015, Malcolm Wallace <malcolm...@me.com> wrote:On other social media forums, I am seeing educators who use Haskell as a vehicle for their main work, but would not consider themselves Haskell researchers, and certainly do not have the time to follow Haskell mailing lists, who are beginning to say that these kinds of annoying breakages to the language, affecting their research and teaching materials, are beginning to disincline them to continue using Haskell. They are feeling like they would be
(...)
--
Pozdrawiam
Michał
_______________________________________________
Haskell-prime mailing list
Haskel...@haskell.org
http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-prime
[...] It’s really interesting to have this discussion, which pulls in all sorts of well-made points about orthogonality, teaching, the evolution of the language and so on, but it simply goes to show that the process of evolving Haskell is profoundly broken. [...]
I think that part of the reason we have seen these changes occur in a
"constant flux" rather than in bigger coordinated chunks is that faith
in the Haskell Report process was (understandably) abandoned. And
without the Haskell Report as some kind of "clock generator" with which
to align/bundle related changes into logical units, changes occur
whenever they're proposed and agreed upon (which may take several
attempts as we've seen with the AMP and others).
I hope that the current attempt to revive the Haskell Prime process will
give us a chance to clean up the unfinished intermediate `base-4.8`
situation we're left with now after AMP, FTP et al, as the next Haskell
Report revision provides us with a milestone to work towards.
That being said, there's also the desire to have changes field-tested by
a wide audience on a wide range before integrating them into a Haskell
Report. Also I'm not sure if there would be fewer complaints if
AMP/FTP/MFP/MRP/etc as part of a new Haskell Report would be switched on
all at once in e.g. `base-5.0`, breaking almost *every* single package
out there at once.
For language changes we have a great way to field-test new extensions
before integrating them into the Report via `{-# LANGUAGE #-}` pragmas
in a nicely modular and composable way (i.e. a package enabling a
certain pragma doesn't require other packages to use it as well) which
have proven to be quite popular.
However, for the library side we lack a comparable mechanism at this
point. The closest we have, for instance, to support an isolated
Haskell2010 legacy environment is to use RebindableSyntax which IMO
isn't good enough in its current form[1]. And then there's the question
whether we want a Haskell2010 legacy environment that's isolated or
rather shares the types & typeclasses w/ `base`. If we require sharing
types and classes, then we may need some facility to implicitly
instantiate new superclasses (e.g. implicitly define Functor and
Applicative if only a Monad instance is defined). If we don't want to
share types & classes, we run into the problem that we can't easily mix
packages which depend on different revisions of the standard-library
(e.g. one using `base-4.8` and others which depend on a legacy
`haskell2010` base-API). One way to solve this could be to mutually
exclude depending on both `base-4.8` and `haskell2010` in the same
install-plan (assuming `haskell2010` doesn't itself depend on
`base-4.8`).
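If sharing types and classes were required, the implicit superclass instantiation mentioned above could amount to the compiler generating the canonical instances mechanically. A sketch of what that completion would produce (this is not an existing GHC feature, and the `Identity` type here is an invented example):

```haskell
-- Sketch only: the canonical completion of a Monad-only definition.
import Control.Monad (ap, liftM)

newtype Identity a = Identity { runIdentity :: a }

-- A legacy definition supplies only the Monad operations ...
instance Monad Identity where
  return = Identity
  Identity a >>= f = f a

-- ... and the superclass instances follow canonically from them:
instance Functor Identity where
  fmap = liftM            -- fmap in terms of (>>=) and return

instance Applicative Identity where
  pure  = Identity
  (<*>) = ap              -- (<*>) in terms of (>>=) and return
```

Whether such instances should be generated implicitly, or merely suggested, is exactly the design question raised above.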
In any case, I think we will have to think hard how to address
language/library change management in the future, especially if the
Haskell code-base continues to grow. Even just migrating the code base
between Haskell Report revisions is a problem. An extreme example
is the Python 2->3 transition which the Python ecosystem is still
suffering from today (afaik). Ideas welcome!
[1]: IMO, we need something to be used at the definition site providing
desugaring rules, rather than requiring the use-site to enable a
generalised desugaring mechanism; I've been told that Agda has an
interesting solution to this in its base libraries via
{-# BUILTIN ... #-} pragmas.
Regards,
H.V.Riedel
Hi Simon. I do in fact think this is provocative :-P
I want to object here to your characterization of what has been going on as “simply throwing in changes”. The proposal seems very well and carefully worked through to provide a good migration strategy, even planning to alter the source of GHC to ensure that adequate hints are given for the indefinite transition period.
I also want to object to the idea that these changes would have "a profound impact on systems". As it stands, and I think this is an important criterion in any change, when "phase 2" goes into effect, code that has compiled before may cease to compile until a minor change is made. However, code that continues to compile will continue to compile with the same behavior.
Now as to process itself, this is a change to core libraries. It has been proposed on the libraries list, which seems appropriate, and a vigorous discussion has ensued. This seems like a pretty decent process to me thus far. Do you have a better one in mind?
—Gershom
P.S. as a general point, I sympathize with concerns about breakage resulting from this, but I also think that the migration strategy proposed is good, and if people are concerned about breakage I think it would be useful if they could explain where they feel the migration strategy is insufficient to allay their concerns.
This proposal falls far short of both bars, to the extent that I am astonished and disappointed it is being seriously discussed – and to general approval, no less – on a date other than April 1. Surely some design flaws have consequences so small that they are not worth fixing.
I'll survive if it goes through, obviously, but it will commit me to a bunch of pointless make-work and compatibility ifdefs. I've previously expressed my sense that cross-version compatibility is a big tax on library maintainers. This proposal does not give me confidence that this cost is being taken seriously.
Thanks,
Bryan.
My understanding of the argument here, which seems to make sense to me, is that the AMP already introduced a significant breaking change with regards to monads. Books and lecture notes have, by and large, not yet caught up to this. Hence, if we introduce a further change, which _completes_ the general AMP project, then by the time books and lecture notes are all updated, won't they be able to tell a much nicer story than the current one?
As for libraries, it has been pointed out, I believe, that without CPP one can write instances compatible with AMP, and also with AMP + MRP. One can also write code, sans CPP, compatible with pre- and post- AMP.
So the reason for choosing to not do MRP simultaneously with AMP was precisely to allow a gradual migration path where, sans CPP, people could write code compatible with the last three versions of GHC, as the general criterion has been.
So without arguing the necessity or not, I just want to weigh in with a technical opinion that if this goes through, my _estimation_ is that there will be a smooth and relatively painless migration period, the sky will not fall, good teaching material will remain good, those libraries that bitrot will tend to do so for a variety of reasons more significant than this, etc.
It is totally reasonable to have a discussion on whether this change is worth it at all. But let’s not overestimate the cost of it just to further tip the scales :-)
—gershom
_______________________________________________
Libraries mailing list
Libr...@haskell.org
http://mail.haskell.org/cgi-bin/mailman/listinfo/libraries
I think code-based weighting should not be considered at all. There's a non-trivial amount of proprietary code that wouldn't be counted, and likely can't be accounted for accurately. I have some unknown amount of code at my previous employer (which does include Monad instances) that now has to be maintained by others. I may have said +1 (and I'm reconsidering), but the current maintainers probably don't want the extra make-work this involves.
There's an argument that this proposal involves very little up-front breakage. This is true but disingenuous, because the breakage will still happen in the future unless changes are made and committed. The entire maintenance load should be considered, not just the immediate load.
Bryan, nothing I've seen since I started using Haskell makes me think that the libraries committee or ghc devs value back compatibility much at all. My favorite example is RecursiveDo, which was deprecated in favor of DoRec, only for that to be reversed in the next ghc release. Of course that was more irritating than breaking, but it's indicative of the general value placed on maintenance programming.
My understanding of the argument here, which seems to make sense to me, is that the AMP already introduced a significant breaking change with regards to monads. Books and lecture notes have, by and large, not yet caught up to this. Hence, if we introduce a further change, which _completes_ the general AMP project, then by the time books and lecture notes are all updated, won't they be able to tell a much nicer story than the current one?
~/personal/src/snap λ find . -name '*.hs' | xargs egrep '#if.*(MIN_VERSION)|(GLASGOW_HASKELL)' | wc -l
64
On October 5, 2015 at 10:59:35 AM, Bryan O'Sullivan (b...@serpentine.com) wrote:
[...] As for libraries, it has been pointed out, I believe, that without CPP one can write instances compatible with AMP, and also with AMP + MRP. One can also write code, sans CPP, compatible with pre- and post- AMP. [...]
On 5 October 2015 at 20:58, Sven Panne <sven...@gmail.com> wrote:
> 2015-10-05 17:09 GMT+02:00 Gershom B <gers...@gmail.com>:
>>
>> [...] As for libraries, it has been pointed out, I believe, that without
>> CPP one can write instances compatible with AMP, and also with AMP + MRP.
>> One can also write code, sans CPP, compatible with pre- and post- AMP. [...]
>
> Nope, at least not if you care about -Wall: If you take e.g. (<$>) which is
> now part of the Prelude, you can't simply import some compatibility module,
> because GHC might tell you (rightfully) that that import is redundant,
> because (<$>) is already visible through the Prelude. So you'll have to use
> CPP to avoid that import on base >= 4.8, be it from Data.Functor,
> Control.Applicative or some compat-* module. And you'll have to use CPP in
> each and every module using <$> then, unless I miss something obvious.
> AFAICT all transitioning guides ignore -Wall and friends...
Does the hack mentioned on the GHC trac [1] work for this? It seems a
bit fragile but that page says it works and it avoids CPP.
Having to use IFDEFs is a sign that the language itself is not expressive enough to deal with these kinds of changes and therefore lacks the right "architecture". Is Backpack going to provide the necessary support (e.g., the ability to mix in package fragments with extensions instead of making breaking changes)? What kind of language features are still missing?
Anyway, mistakes were made in the past, because had we known any better we would have done better. Nothing should prevent us from fixing mistakes.
Regarding IFDEFs, all of them disappeared from my code once I used transformers-compat and mtl-compat. Also, the trick with `import Prelude` worked: you just have to list what you hide instead of what you import (which is weird...).
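One way to read the hiding trick (the function name here is invented for illustration): an explicit `import Prelude` replaces the implicit one, and hiding a name that Prelude doesn't export is legal, so the same module is warning-free both on GHCs where `(<$>)` is in the Prelude and on those where it isn't:

```haskell
-- Explicitly import Prelude, hiding the name we want from elsewhere.
-- On pre-AMP GHCs, Prelude doesn't export (<$>) and the hiding is a
-- no-op; on post-AMP GHCs, the hiding prevents a redundant-import
-- warning on the Data.Functor import below.
import Prelude hiding ((<$>))
import Data.Functor ((<$>))

-- hypothetical example use
increment :: Maybe Int -> Maybe Int
increment m = (+ 1) <$> m
```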
--
Michał
[1] http://www.infoq.com/presentations/functional-techniques-complexity
Also I'm not sure if there would be less complaints if AMP/FTP/MFP/MRP/etc as part of a new Haskell Report would be switched on all at once in e.g. `base-5.0`, breaking almost *every* single package out there at once.
Hang on a moment, are you saying that all the people writing to argue that these changes would require them to write dozens more #ifdefs actually don't have to write any at all?
Broadly speaking, I'm a "fix it now rather than later" sort of person
in Haskell because I've seen how long things can linger before finally
getting fixed (even when everyone agrees on what the fix should be and
agrees that it should be done). However, as I mentioned in the
originating thread, I think that —at this point— when it comes to
AMP/FTP/MFP/MRP/etc we should really aim for the haskell' committee to
work out a comprehensive solution (as soon as possible), and then
enact all the changes at once when switching to
Haskell201X/base-5.0/whatevs. I understand the motivations for wanting
things to be field-tested before making it into the report, but I
don't think having a series of rapid incremental changes is the
correct approach here. Because we're dealing with the Prelude and the
core classes, the amount of breakage (and CPP used to paper over it)
here is much higher than our usual treadmill of changes; so we should
take that into account when planning how to roll the changes out.
--
Live well,
~wren
Here's the record of my struggle [1]. Then came Edward K. and told me
to use transformers-compat [2]. Then Adam B. chimed in suggesting
mtl-compat [3]. That's it. All my IFDEFs related to this topic were
gone.
I looked at base-compat [4] and it's very interesting. The big question is:
**Why can't normal base be written the same way as base-compat?**
It would then automatically provide compatibility across all GHC versions.
Michał
[1] https://www.reddit.com/r/haskell/comments/3gqqu8/depending_on_both_mtl2131_for_stackage_lts2_and/
[2] https://www.reddit.com/r/haskell/comments/3gqqu8/depending_on_both_mtl2131_for_stackage_lts2_and/cu0l6nf
[3] https://www.reddit.com/r/haskell/comments/3gqqu8/depending_on_both_mtl2131_for_stackage_lts2_and/cu5g73q
[4] https://hackage.haskell.org/package/base-compat
Tom
-- I am going to do some logging, yay!
data Logs a =
  Logs a [a]
  deriving (Eq, Show)

-- one log message
singlelog ::
  a
  -> Logs a
singlelog a =
  Logs a []

-- two log messages
twologs ::
  a
  -> a
  -> Logs a
twologs a1 a2 =
  Logs a1 [a2]

class Semigroup a where
  (<>) ::
    a
    -> a
    -> a

-- I can append logs
instance Semigroup (Logs a) where
  Logs h1 t1 <> Logs h2 t2 =
    Logs h1 (t1 ++ h2 : t2)

-- I can map on Logs
instance Functor Logs where
  fmap f (Logs h t) =
    Logs (f h) (fmap f t)

-- I will collect logs with a value
data WriteLogs l a =
  WriteLogs (Logs l) a
  deriving (Eq, Show)

-- I can map the pair of logs and a value
instance Functor (WriteLogs l) where
  fmap f (WriteLogs l a) =
    WriteLogs l (f a)

singlewritelog ::
  l
  -> a
  -> WriteLogs l a
singlewritelog l a =
  WriteLogs (singlelog l) a

-- Monad without return
class Bind f where
  (-<<) ::
    (a -> f b)
    -> f a
    -> f b

-- Can I Applicativate WriteLogs? Let's see.
instance Applicative (WriteLogs l) where
  -- Well that was easy.
  WriteLogs l1 f <*> WriteLogs l2 a =
    WriteLogs (l1 <> l2) (f a)
  pure a =
    WriteLogs (error "wait, what goes here?") a
  -- Oh I guess I cannot Applicativate WriteLogs, but I can Apply them!

-- Well there goes that idea.
-- instance Monad (WriteLogs l) where

-- Wait a minute, can I bind WriteLogs?
instance Bind (WriteLogs l) where
  -- Of course I can!
  f -<< WriteLogs l1 a =
    let WriteLogs l2 b = f a
    in WriteLogs (l1 <> l2) b

-- OK here goes ...
myprogram ::
  WriteLogs String Int
myprogram =
  -- No instance for (Monad (WriteLogs String))
  -- RAR!, why does do-notation require extraneous constraints?!
  -- Oh that's right, Haskell is broken.
  -- Oh well, I guess I need to leave Prelude turned off and rewrite
  -- the base libraries.
  do a <- singlewritelog "message" 18
     b <- WriteLogs (twologs "hi" "bye") 73
     WriteLogs (singlelog "exit") (a * b)

-- One day, one day soon, I can move on.
On 06/10/15 11:20, ami...@gmail.com wrote:
> IMO, the "tech debt" you're talking about feels very small. We've
> already made the change that return = pure by default. The
> historical baggage that this proposal cleans up is just the fact
> that legacy code is able to define its own "return" without
> breaking (which must be the same as the definition of pure
> anyway). I am also moving from +0.5 to +0 on this.
>
> Tom
>
>
>> El 5 oct 2015, a las 18:29, Alexander Berntsen
>> <alex...@plaimi.net> escribió:
>>
>>> On 05/10/15 20:50, Nathan Bouscal wrote:
>>>> There have been a lot of objections based on the idea that
>>>> learners will consult books that are out of date, but the number
>>>> of learners affected by this is dwarfed by the number of learners
>>>> who will use updated materials and be confused by this strange
>>>> historical artifact. Permanently-enshrined historical artifacts
>>>> accrete forever and cause linear confusion, whereas outdated
>>>> materials are inevitably replaced such that the amount of
>>>> confusion remains constant.
> Thank you for making this point
>
> I would be very saddened if the appeal to history (i.e. technical
> debt) were to halt our momentum. That's what happens to most things
> both in and out of computer science. And it's honestly depressing.
> littering the code with #ifdefs
> Bonus points for keeping the #ifdefs centralized
Littering is the right word. "Bonus" is merely less negative points.
It is ironic that the beautiful Haskell should progress by
adopting the worst feature of C. #ifdef is a sticking-plaster
for non-portable code. It is inserted at global level, usually
to effect changes at the bottom of the code hierarchy. (Maybe
in Haskell it should be a monad, in competition with IO for
the outermost layer.)
Plan 9 showed the way out of ifdef, by putting (non-ifdef-ed)
inescapably nonportable code, such as endianity, in compilation
units and #include files in a distinct part of the file system
tree selected by the shell.
Another trick is to use plain if rather than #if or #ifdef,
and let the compiler optimize away the unwanted side of the
branch.
In any event, #ifdef is, as Simon evidently agrees, telling
evidence of non-portability. It is hard to imagine an uglier
"solution" to keeping up with the drip-drip-drip of Haskell
evolution.
>> Strongly -1 from me also. My experience over the last couple of years is
>> that every GHC release breaks my libraries in annoying ways that require
>> CPP to fix:
>>
>> ~/personal/src/snap λ find . -name '*.hs' | xargs egrep
>> '#if.*(MIN_VERSION)|(GLASGOW_HASKELL)' | wc -l
>> 64
[...]
> On the libraries I maintain and have a copy of on my computer right now: 329
Although this was already pointed out to you in a response to a Tweet of
yours, I'd like to expand on this here to clarify:
You say that you stick to the 3-major-ghc-release support-window
convention for your libraries. This is good, because then you don't need
any CPP at all! Here's why:
So when GHC 8.2 is released, your support-window requires you to support
GHC 7.10 and GHC 8.0 in addition to GHC 8.2.
At this point you'll be happy that you can start dropping those #ifdefs
you added for GHC 7.10 in your code in order to adapt to FTP & AMP.
And when you do *that*, you can also drop all your `return = pure`
methods overrides. (Because we prepared for MRP already in GHC 7.10 by
introducing the default implementation for `return`!)
This way, you don't need to introduce any CPP whatsoever due to MRP!
Finally, we're not gonna remove `return` in GHC 8.2 anyway; GHC
8.2 was just the *earliest theoretically possible* GHC in which this
*could* have happened. Realistically, this would happen at a much later
point, say GHC 8.6 or even later! Therefore, the scheme above would
actually work for 5-year time-windows! And there's even an idea on the
table to have a lawful `return = pure` method override be tolerated by
GHC even when `return` has already moved out of `Monad`!
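A minimal sketch of the CPP-free instance style described above (the `Box` type is invented for illustration): define `pure`, omit `return`, and the default `return = pure` introduced in GHC 7.10 keeps the code valid both before and after `return` leaves `Monad`:

```haskell
import Control.Monad (ap)

-- hypothetical example type
newtype Box a = Box a

instance Functor Box where
  fmap f (Box a) = Box (f a)

instance Applicative Box where
  pure  = Box
  (<*>) = ap

instance Monad Box where
  Box a >>= f = f a
  -- no `return` definition: the default `return = pure`
  -- (available since GHC 7.10) fills it in
```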
PS: I'm a bit disappointed you seem to dismiss this proposal right away
categorically without giving us a chance to address your
concerns. The proposal is not a rigid all-or-nothing thing that
can't be tweaked and revised. That's why we're having these
proposal-discussions in the first place (rather than doing blind
+1/-1 polls), so we can hear everyone out and try to maximise the
agreement (even if we will never reach 100% consensus on any
proposal).
So please, keep on discussing!
> - To use functions like "tryReadMVar", "unsafeShiftR", and
> "atomicModifyIORef'" that are in recent base versions but not older ones
> (this is a place where CPP use is actually justified)
Well, yeah, new functions don't magically appear in old versions. I
don't think anybody expects that :).
[...]
> The python 2→3 transition may have caused some pain to some. However it
> would have been far more difficult if they had not provided tools like
> 2to3 and six:
> https://docs.python.org/2/library/2to3.html
> http://pythonhosted.org/six/
>
> Since the change here is trivial and pervasive why is such a tool not being
> considered?
> [Or have I missed the discussion?]
It is being considered! :-)
I'm in the process of collecting rewrite recipes and samples on
https://github.com/hvr/Hs2010To201x
Alan is looking into how HaRe can be leveraged for this. I've refrained
from bringing this up, as I'm still waiting to reach a proof-of-concept
stage where we can confidently say that this actually works.
> In particular we can envisage a tool say 8to10 that has these modes
> (command-line flags)
> --portable
> --readable
> --info
>
> with the idea that
> - portable is most unreadable (ifdef litter)
> - readable is not portable -- generate a copy of the whole source tree that
> will have the changes and not be compatible for ghc < 7.8
> This is for those library authors that prefer to maintain a clean code
> base and hell with earlier ghc versions
> - info summarizes which files would change and how so that a suitable
> reorganization of the files/directories can be done prior to the switch
> This is for those library authors that would like to take the trouble to
> have a compatibility layer and have code as readable as possible overall
>
> [Whether pure is really preferable to return is another matter --
> For me one of the strongest criticisms of the python 2→3 switch is this
> that if they were going to break things anyhow why was the cleanup not more
> far-reaching?
>
> Lets keep this note in these parentheses :-)
>
> ]
--
"Elegance is not optional" -- Richard O'Keefe
What I don't like about this solution is how specific it is -- the gut
instinct that it can't be the last such extension, if we were to start
replacing CPP piecemeal.
And after a while, we'd accommodate a number of such extensions...
and they would keep coming... until it converges to a trainwreck.
I think that what instead needs to be done, is this:
1. a concerted effort to summarize *all* uses of CPP in Haskell code
2. a bit of forward thinking, to include desirables that would be easy
to get with a more generic solution
Personally, I think that any such effort, approached with generality
that would be truly satisfying, should inevitably converge to an
AST-level mechanism.
..but then I remember how CPP can be used to paper over incompatible
syntax changes..
Hmm..
--
с уважениeм / respectfully,
Косырев Серёга
[...] That being said, as how to write your Monad instances today with GHC
7.10 w/o CPP, while supporting at least GHC 7.4/7.6/7.8/7.10: This
*does* work (admittedly for an easy example, but this can be
generalised):
--8<---------------cut here---------------start------------->8---
module MyMaybe where
import Control.Applicative (Applicative(..))
import Prelude (Functor(..), Monad(..), (.))
-- or alternatively: `import qualified Prelude as P`
[...]
--8<---------------cut here---------------end--------------->8---
This example above compiles -Wall-clean and satisfies all your 3 stated
requirements afaics. I do admit this is probably not what you had in mind.
[...] That's because -Wall-hygiene (w/o opting out of harmless warnings)
across multiple GHC versions is not considered a show-stopper.
[...] Beyond what Ben already suggested in another post, there was also the
more general suggestion to implicitly suppress warnings when you
explicitly name an import. E.g.
import Control.Applicative (Applicative(..))
would suppress the redundant-import warning for Applicative via Prelude,
because we specifically requested Applicative, so we don't mind that
Prelude re-exports the same symbol. [...]
On Oct 6, 2015 7:32 AM, "Henrik Nilsson" <Henrik....@nottingham.ac.uk> wrote:
> Executive Summary: Please let us defer further discussion
> and ultimate decision on MRP to the resurrected HaskellPrime
> committee
Many more people are on this mailing list than will be chosen for the committee. Those who are not chosen have useful perspectives as well.
> 1. Is the Haskell Libraries list and informal voting process
> really an appropriate, or even acceptable, way to adopt
> such far-reaching changes to what effectively amounts to
> Haskell itself?
As others have said, no one wants that.
> But, as has been pointed out in a
> number of postings, a lot of people with very valuable
> perspectives are also very busy, and thus likely to
> miss a short discussion period (as has happened in the
> past in relation to the Burning the Bridges proposal)
The Foldable/Traversable BBP indeed was not as well discussed as it should have been. AMP, on the other hand, was discussed extensively and publicly for months. I understand that some people need months of notice to prepare to participate in a discussion. Unfortunately, I don't think those people can always be included. Life moves too quickly for that. I do think it might be valuable to set up a moderated, extremely low volume mailing list for discussion of only the most important changes, with its messages forwarded to the general list.
> The need to "field test" MRP prior to discussing
> it in HaskellPrime has been mentioned. Graham and I
> are very sceptical. In the past, at least in the
> past leading up to Haskell 2010 or so, the community
> at large was not roped in as involuntary field testers.
No, and Haskell 2010 was, by most measures, a failure. It introduced no new language features (as far as I can recall) and only a few of the most conservative library changes imaginable. Standard Haskell has stagnated since 1998, 17 years ago. Haskell 2010 did not reflect the Haskell people used in their research or practical work then, and I think people are justified in their concern that the next standard may be similarly disappointing.

One of the major problems is the (understandable, and in many ways productive) concentration of development effort in a single compiler. When there is only one modern Haskell implementation in common use, it's hard to know how changes to the standard will affect other important implementation techniques, and therefore hard to justify any substantial changes. That was true in 2010, and it is, if anything, more true now.
> Therefore, please let us defer further discussion and
> ultimate decision on MRP to the resurrected
> HaskellPrime committee, which is where it properly
> belongs. Otherwise, the Haskell community itself might
> be one of the things that MRP breaks.
I hope not. Haskell has gained an awful lot from the researchers, teachers, and developers who create it and use it. I hope we can work out an appropriate balance of inclusion, caution, and speed.
On 10/06/2015 11:11 AM, Johan Tibell wrote:
> It might be enough to just add a NOWARN <warning type> pragma that acts on
> a single line/expression. I've seen it in both C++ and Python linters and
> it works reasonably well and it's quite general.
+1. Simple is good and can hopefully also be backported to older GHC
releases. (Provided someone's willing to do said releases, obviously.)
(Resending with smaller recipient list to avoid getting stuck in the moderator queue.)
On Tue, Oct 6, 2015 at 9:10 AM, Herbert Valerio Riedel <h...@gnu.org> wrote:
On 2015-10-05 at 21:01:16 +0200, Johan Tibell wrote:
> On the libraries I maintain and have a copy of on my computer right now: 329
Although this was already pointed out to you in a response to a Tweet of
yours, I'd like to expand on this here to clarify:
You say that you stick to the 3-major-ghc-release support-window
convention for your libraries. This is good, because then you don't need
any CPP at all! Here's why:
[...]
So what do I have to write today to have my Monad instances be:

* Warning free - Warnings are useful. Turning them off or having spurious warnings both contribute to bugs.
* Using imports that are either qualified or have explicit import lists - Unqualified imports make code more likely to break when dependencies add exports.
* Free of CPP.

Neither AMP nor MRP includes a recipe for this in their proposals. AMP got one post-facto on the Wiki. It turns out that the workaround there didn't work (we tried it in Cabal and it conflicted with one of the above requirements).
PS: I'm a bit disappointed you seem to dismiss this proposal right away
categorically without giving us a chance to address your
concerns. The proposal is not a rigid all-or-nothing thing that
can't be tweaked and revised. That's why we're having these
proposal-discussions in the first place (rather than doing blind
+1/-1 polls), so we can hear everyone out and try to maximise the
agreement (even if we will never reach 100% consensus on any
proposal).
So please, keep on discussing!
The problem with discussions is that they are held between two groups with quite a difference in experience. On the one hand you have people like Bryan, who have made considerable contributions to the Haskell ecosystem and have much experience in large-scale software development (e.g. from Facebook). On the other hand you have people who don't. That's okay. We've all been in the latter group at some point of our career.
What's frustrating is that people don't take a step back and realize that they might be in the latter group and should perhaps listen to those in the former. This doesn't happen; instead we get lots of "C++ and Java are so bad and we don't want to be like them." Haskell is not at risk of becoming C++ or Java (which were a large improvement over the languages that came before them). We're at risk of missing our window of opportunity. I think that would be a shame, as Haskell is a step forward compared to those languages, and I would like to see more of the software currently written in them written in Haskell instead.

We've been through this many times before on the libraries list. I'm not going to win an argument on this mailing list. Between maintaining libraries you all use and managing a largish team at Google, I don't have much time for a discussion that approaches a hundred emails and is won by virtue of having lots of time to write emails.

-- Johan
> As a practical suggestion: if any changes for version 1.3 could make
> some revision of 1.2 programs necessary, let's have a precise
> stand-alone specification of these revisions and how to make them.
> It had better be short and simple. Many would prefer it to be empty.
> Perhaps it should be implemented in Haskell compilers?

Overall I don't see the rush for these changes; let's try putting our faith in a new Haskell committee, whoever it ends up comprising.
On Thu, Sep 24, 2015 at 2:43 PM, Herbert Valerio Riedel <h...@gnu.org> wrote:

Sure... if we had a language that no one uses and could keep reforming like putty until it is perfect. But we don't.

A modest proposal:

We can't keep tinkering with a language and its libraries like this AND have a growing ecosystem that serves an ever-widening base, from newcomer to commercial deployment. SO - why not do all the language tinkering in GHC 8? There can be as many prereleases of that as needed until it is just right. ...and leave GHC 7 (7.10? roll back to 7.8.4?) for all of us doing essential and dependable libraries, commercial work, and projects in Haskell that we don't want to have to go back and add #ifdefs to twice a year just to keep running - educators, people writing books. We can keep improving GHC 7 as needed, focusing on bugs, security issues, patches, cross-compatibility, etc.
I think this discussion has gotten quite heated for reasons not related to the concrete MRP proposal, which, to be honest, I considered quite modest in terms of both scope and impact.
Instead, I think it is a proxy for lots of remaining frustration and anxiety over the poor handling over the Foldable Traversable Proposal. I would like to remind everyone that due to the broad discussions and concerns over the proposal, a very rare, careful poll of Haskell users was taken, announced broadly in many channels. [1] The poll, overwhelmingly, revealed a mandate for the FTP. The breakdown of that mandate was 87% in favor among hobbyists and 79% in favor among non-hobbyists (who constituted a majority of those polled).
I. Generalities
That said, even the _best_ poll is not a substitute for a better, earlier discussion. The handling of the AMP and FTP, which I think was heroic in terms of minimizing breakage while accomplishing long-desired change, could still have been better. As a whole, the work accomplished the mandate of allowing code to be written backwards-compatibly without requiring CPP. However, it did not also seek to prevent warnings. This was in itself an enormous step forward from changes in the past, which have _not_ even managed to prevent the need for CPP. At the time, I think it was not recognized how much desire there would be for changes that were _both_ CPP-free _and_ warning-free for 3 releases.
I think one of the great elements of progress in the current discussion is that there is now a proposal on the table which recognizes this, and seeks to accomplish this change in accordance with this desire. It is not the world’s most important change, but the recognition that change should seek to be both CPP _and_ warning free is a good recognition, and I’m sure it will be taken into account in future proposals as well.
I don’t think it is useful to continue to have abstract discussions on the conflict between desire for incremental improvement versus the need to minimize pain on maintainers. We might as well continue to argue about the need for purely functional programming versus the need to print “hello world” to the console. Rather, we should put our collective minds together as collaborators and colleagues to accomplish _both_, and to come up with solutions that should work for everyone. To the extent this discussion has been about that, I think it has been useful and positive. However, to the extent this discussion insists, on either side, on the shallow idea that we must treat “improvement” versus “stability” as irreconcilable factions in necessary conflict, then I fear it will be a missed opportunity.
II. Particulars
With that in mind, I think the _concrete_ voices of concern have been the most useful. Gregory Collins’ list of issues requiring CPP should be very sobering. Of note, I think they point to areas where the core libraries committee has not paid _enough_ attention (or perhaps has not been sufficiently empowered: recall that not all core libraries fall under its maintenance [2]). Things like the newtype FFI issue, the changes to prim functions, the splitup of old-time and the changes to exception code were _not_ vetted as closely as the AMP and FTP were, or as the MRP is currently being. I don’t know all the reasons for this, but I suspect they just somewhat slipped under the radar. In any case, if all those changes were as carefully engineered as the MRP proposal has been, then imho things would have been much smoother. So, while this discussion may be frustrating, it nonetheless in some ways provides a model of how people have sought to do better and be more proactive with careful discussion of changes. This is much appreciated.
Personally, since the big switch to extensible exceptions back in 6.10, and since the split-base nonsense before that, very few changes to the core libraries have caused much disruption in my code. Since then, the old-time cleanup was the worst, and the big sin there was that time-locale-compat was only written some time after the fact by a helpful third-party contributor, rather than engineered from the start. (I will note that the time library is one of the core libraries that is _not_ maintained by the core libraries committee.)
Outside of that, the most disruptive changes to my code that I can recall have been from changes to the aeson library over the years — particularly but not only regarding its handling of doubles. I don’t begrudge these changes — they iteratively arrived at a _much_ better library than had they not been made. [3] After that, I made a few changes for Happstack and Snap API changes, if I recall. Additionally, the addition of “die” to System.Exit caused a few name clashes. My point is simply that there are many packages outside of base that also move, and “real” users with “real” code will these days often have quite a chain of dependencies, and will encounter movement and change across many of them. So if we say “base never changes”, that does not mean “packages will never break” — it just means that base will not have the same opportunity to improve that other packages do, which will eventually lead to frustration, just as it did in the past and in the lead-up to the BBP.
III. Discussions
Further, since there has been much discussion of a window of opportunity, I would like to offer a counterpoint to the (sound) advice that we take into consideration voices with long experience in Haskell. The window of opportunity is, by definition, regarding takeup of Haskell by new users. And so if newer users favor certain changes, then it is good evidence that those changes will help with uptake among other new users. So, if they are good changes on their own, then the fact that they are appealing to newer users should be seen as a point in their favor, rather than a reason to dismiss those opinions. But if we are in a situation where we see generations of adopters pitted against one another, then we already have deeper problems that need to be sorted out.
Regarding where and how to have these discussions — the decision was made some time ago (I believe at the start of the initial Haskell Prime process if not sooner, so circa 2009?) that the prime committee would focus on language extensions and not library changes, and that those changes would be delegated to the libraries@ list. The lack of structure on the libraries@ list is what prompted the creation of the libraries committee, whose ultimate responsibility it is to decide on and shepherd through these changes, in consultation with others and ideally driven by broad consensus. Prior to this structure, things broke even more, imho, and simultaneously the things that were widely desired were still not implemented. So I thank the libraries committee for their good work so far.
So, it may be that the process of community discussion on core libraries changes is not best suited to the libraries@ list. But if not there, where? I worry that a proliferation of lists will not improve things here. Those involved with Haskell have multiplied (this is good). The voices to take into account have multiplied (this is good). Necessarily, this means that there will just be _more_ stuff, and making sure that everyone can filter down to just the part they want is difficult. Here, perhaps, occasional libraries-related summary addenda to the GHC newsletter could be appropriate? Or is there another venue we should look towards? “Chair’s reports” to the Haskell Weekly News, maybe?
IV. Summing up
We should bear in mind after all that this is just about cleaning up a redundant typeclass method (albeit one in a very prominent place) and hardly the hill anyone would want to die on [4]. Nonetheless, I think it would be a good sign of progress and collaboration if we can find a way to implement a modest change like this in a way that everyone finds acceptable vis a vis a sufficiently slow pace, the lack of a need for CPP and the lack of any induced warnings. On the other hand, other opportunities will doubtless present themselves in the future.
Best,
Gershom
[1] https://mail.haskell.org/pipermail/libraries/2015-February/025009.html
[2] https://wiki.haskell.org/Library_submissions#The_Core_Libraries
[3] and in any case I am sure Bryan would be the last to want us to treat him as some sort of “guru” on these matters.
[4] for those in search of better hills to die on, this is a list of some good ones: http://www.theawl.com/2015/07/hills-to-die-on-ranked
P.S. In case there is any question, this email, as all emails I write that do not state otherwise, is not being written in any particular capacity regarding the various infra-related hats I wear, but is just an expression of my own personal views.
... for gnu's sake, let's not adopt the inane parity-implies-stability numbering convention ...
I've dealt with the IETF RFC process and the Python PEP process, and both of them worked better than that.
On Tue, Oct 6, 2015 at 7:24 PM, Mike Meyer <m...@mired.org> wrote:

> I've dealt with the IETF RFC process and the Python PEP process, and both of them worked better than that.

While both of those are good examples of mostly-working organizations shepherding foundational technical standards along, there is one thing more important than their processes: their stance. Both organizations have a very strong engineering discipline of keeping deployed things working without change. I don't think it is enough to simply model their process.
Btw, I've also seen the trick below, in which you use the aliased `A.`
prefix just once so GHC considers the import non-redundant, and don't
have to suffer from prefixed operators in the style of `A.<*>`.
Is this any better? [...]
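As I understand the trick, it looks something like the sketch below (the alias name `A` and the example definitions are mine; the behavior relied on is GHC's redundant-import check counting an import as used if any name is resolved through it). The module is imported unqualified but aliased, so one aliased reference keeps the import from being reported as redundant on GHCs whose Prelude already re-exports these names, while the operators stay prefix-free:

```haskell
-- Unqualified *and* aliased: names are in scope both bare and as A.*
import Control.Applicative as A

-- A single aliased use marks the import as non-redundant, even when
-- Prelude exports the same names:
unit :: Maybe ()
unit = A.pure ()

-- Everything else can use the operators without the A. prefix:
pair :: Maybe (Int, Int)
pair = (,) <$> Just 1 <*> Just 2
```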
Hang on a moment, are you saying that all the people writing to argue that these changes would require them to write dozens more #ifdefs actually don't have to write any at all? I never knew what the *-compat packages were all about. If that's what they're designed to do, I have a feeling they have not gotten *nearly* enough exposure.
I think there are several different conversations going on at once in this thread. I think it’s worth keeping them separate.
· Haskell Prime. The intention there is to take a set of language features that are already in wide use in GHC (i.e. have demonstrably proved valuable), work out any dark corners, formalise them, and embody the result in a new Haskell Report. That puts a useful stake in the ground, empowers alternative implementations of Haskell, gives confidence all round.
I think it’d be unusual for the Haskell Prime committee to specify a new feature of the language, one that had not been field tested.
· Libraries. AMP, BBP, and Monad(return) are small but fundamental changes to the core libraries. I think there was a consensus (not universal in the case of BBP) that the change was good. Yet AMP and BBP (especially) were controversial. The issues were mostly around how to make the transition; and, given that the transition is expensive, whether the cost/benefit tradeoff justifies the change. The question of moving ‘return’ out of Monad is in this category.
The Core Libraries Committee was formed explicitly to handle this stuff. So my prior assumption was that the CLC would handle the Monad(return) question, not Haskell Prime.
Mark’s suggestion of a “stable” GHC 7.10 and a new GHC 8.0 is a reasonable one. But I for one would find it hard to promise to back-port every bug fix, especially as the two code bases diverge (which they will).
Here is another idea. GHC knows very little about Monad. It would take work, but it probably wouldn’t be impossible to make the same GHC work with two different ‘base’ libraries, each with a different definition of the Monad class. That would not solve the problem: you still could not use one library that used old Monad with another library that used new Monad. But at least it’d decouple it from which version of GHC you were using. I stress: it would take some work to make this go, and I’d prefer not to do this.
Returning to ‘return’, my instinct is that when these pervasive breaking library changes come up, we should batch them into larger clumps. The “treadmill” complaint is real: small change A made me re-release my library; then small change B came along; and so on. Perhaps if we saved them up this would be less of an issue, for two reasons. First, the work happens once rather than many times. Second, the benefit of the batch is the sum of the benefits of the little component changes, and so is more attractive to library authors and library clients.
That line of thinking would suggest that the Core Libraries Committee might want to maintain a list of well-worked out agreed changes that are being “saved up” for execution at some later date.
Simon
With that in mind, I think the _concrete_ voices of concern have been the most useful. Gregory Collins’ list of issues requiring CPP should be very sobering. Of note, I think they point to areas where the core libraries committee has not paid _enough_ attention (or perhaps has not been sufficiently empowered: recall that not all core libraries fall under its maintenance [2]). Things like the newtype FFI issue, the changes to prim functions, the splitup of old-time and the changes to exception code were _not_ vetted as closely as the AMP and FTP were, or as the MRP is currently being. I don’t know all the reasons for this, but I suspect they just somewhat slipped under the radar.
Outside of that, the most disruptive changes to my code that I can recall have been from changes to the aeson library over the years — particularly but not only regarding its handling of doubles. I don’t begrudge these changes — they iteratively arrived at a _much_ better library than had they not been made. [3] After that, I made a few changes for Happstack and Snap API changes, if I recall. Additionally, the addition of “die” to System.Exit caused a few name clashes. My point is simply that there are many packages outside of base that also move, and “real” users with “real” code will these days often have quite a chain of dependencies, and will encounter movement and change across many of them. So if we say “base never changes”, that does not mean “packages will never break” — it just means that base will not have the same opportunity to improve that other packages do, which will eventually lead to frustration, just as it did in the past and in the lead-up to the BBP.
This is explained in the original proposal: in particular, it eliminates opportunities for errors and simplifies ApplicativeDo. Note that no one has proposed removing return from base. The only proposed change is turning return into a stand-alone function instead of a method of Monad.
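A sketch of the distinction (the class names here are hypothetical stand-ins; the real proposal edits the `Monad` class in `GHC.Base`):

```haskell
-- Roughly today's hierarchy: return is a class method duplicating pure.
class Applicative m => MonadToday m where
  bindT   :: m a -> (a -> m b) -> m b
  returnT :: a -> m a
  returnT = pure           -- the AMP already gave it this default

-- Under MRP, the method disappears from the class...
class Applicative m => MonadMRP m where
  bindM :: m a -> (a -> m b) -> m b

-- ...and return survives as an ordinary top-level alias of pure,
-- so code that mentions return keeps compiling unchanged.
returnMRP :: Applicative m => a -> m a
returnMRP = pure
```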
On Wed, Oct 7, 2015 at 4:54 PM, Bardur Arantsson <sp...@scientician.net> wrote:

> ...among web developers, who of course are the only real developers?

Please consider that the way practical development really happens[2]
[...]
I had heard that the financial users generally refused to have anything to do with the Haskell community. Now I know why.
From: Haskell-Cafe [mailto:haskell-ca...@haskell.org]
On Behalf Of Mike Meyer
Sent: 07 October 2015 00:24
To: Mark Lentczner; Johan Tibell
Cc: Haskell Libraries; haskell cafe; haskel...@haskell.org List
Subject: Re: [Haskell-cafe] MRP, 3-year-support-window, and the
non-requirement of CPP (was: Monad of no `return` Proposal (MRP): Moving
`return` out of `Monad`)
On Tue, Oct 6, 2015 at 4:15 PM Mark Lentczner <mark.le...@gmail.com> wrote:
On Thu, Sep 24, 2015 at 2:43 PM, Herbert Valerio Riedel <h...@gnu.org> wrote:

> TLDR: To complete the AMP, turn the `Monad(return)` method into a
> top-level binding aliasing `Applicative(pure)`.
Sure... if we had a language that no one uses and could keep reforming like putty until it is perfect. But we don't.
A modest proposal:
We can't keep tinkering with a language and its libraries like this AND have a growing ecosystem that serves an ever-widening base, from newcomer to commercial deployment. SO - why not do all the language tinkering in GHC 8? There can be as many prereleases of that as needed until it is just right. ...and leave GHC 7 (7.10? roll back to 7.8.4?) for all of us doing essential and dependable libraries, commercial work, and projects in Haskell that we don't want to have to go back and add #ifdefs to twice a year just to keep running - educators, people writing books. We can keep improving GHC 7 as needed, focusing on bugs, security issues, patches, cross-compatibility, etc.
I think there are several different conversations going on at once in this thread. I think it’s worth keeping them separate.
<snip>
Returning to ‘return’, my instinct is that when these pervasive breaking library changes come up, we should batch them into larger clumps. The “treadmill” complaint is real: small change A made me re-release my library; then small change B came along; and so on. Perhaps if we saved them up this would be less of an issue, for two reasons. First, the work happens once rather than many times. Second, the benefit of the batch is the sum of the benefits of the little component changes, and so is more attractive to library authors and library clients.
That line of thinking would suggest that the Core Libraries Committee might want to maintain a list of well-worked out agreed changes that are being “saved up” for execution at some later date.
Simon
This whole discussion is tilted towards the software engineering side.
> Many of us here learned from those texts or those courses. It's easy online
> to say that materials being out of date isn't a big deal, but it can turn
> people off the language when the code they paste into ghci doesn't
> work.
FWIW, I am a newbie, and currently learning a lot from the web.
I've had a different feeling. Whenever I read something like "This is
historical, and should have been fixed, but isn't yet", it was these
sentences that almost turned me off the language, because I had a
feeling that there is some stagnation going on.
From the POV of a newbie, I am all for fixing historical hiccups.
However, I realize that I have different priorities compared to
long-time users and people needing to maintain a lot of existing code.
Just my 0.01€
--
CYa,
⡍⠁⠗⠊⠕
As for use in production, it's been used for coming up to 10 years. I
will wait.
On 08/10/15 18:21, Mario Lang wrote:
> José Manuel Calderón Trilla <jm...@jmct.cc> writes:
>
>> Many of us here learned from those texts or those courses. It's easy online
>> to say that materials being out of date isn't a big deal, but it can turn
>> people off the language when the code they paste into ghci doesn't
>> work.
>
> FWIW, I am a newbie, and currently learning a lot from the web.
> I've had a different feeling. Whenever I read something like "This is
> historical, and should have been fixed, but isn't yet", it was these
> sentences that almost turned me off the language, because I had a
> feeling that there is some stagnation going on.
>
> From the POV of a newbie, I am all for fixing historical hiccups.
> However, I realize that I have different priorities compared to
> long-time users and people needing to maintain a lot of existing code.
>
> Just my 0.01€
>
Is there an understanding of how they manage to qualify as an
industrially-acceptable language in spite of this?
--
с уважениeм / respectfully,
Косырев Серёга