New book: Modern Common Lisp with FSet

Paolo Amoroso

Apr 16, 2026, 10:06:35 AM
to Medley Interlisp core
A new Lisp book is out, Modern Common Lisp with FSet by Scott Burson. The book is an introduction to functional collections and the FSet library for Lisp programmers and designers of other languages.


Ryan Burnside

Apr 16, 2026, 5:44:17 PM
to Paolo Amoroso, Medley Interlisp core
It seems sensible if you're doing something with successive streams of data, where working with sub-collections as copies would normally cons up extra cruft. There are some reference optimizations available when you know a collection can't be mutated, and some concurrency guarantees can be made.
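A minimal sketch of that point using FSet (assuming the library is loaded, e.g. via Quicklisp; operation names per the FSET package, where "updates" return fresh collections that share structure with the old ones):

```lisp
;; FSET:WITH returns a *new* set sharing structure with the old one;
;; the original is untouched, so no defensive copying is needed, and
;; a reader in another thread can safely keep using S1.
(let* ((s1 (fset:set 1 2 3))
       (s2 (fset:with s1 4)))    ; "add" 4 -> fresh set
  (list (fset:contains? s1 4)    ; NIL -- s1 is unchanged
        (fset:contains? s2 4)))  ; T
```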

I can appreciate his balanced view. I think it's of the utmost importance to keep Lisp a multi-paradigm tool for the user. It's a testament to the language design that you can have your cake and eat it too. I'm all for language bending in Lisp as long as the culture stays one of open-mindedness.

The whole point of Lisp is that you get to be the language programmer. People should be able to dabble freely and go down any avenue that seems particularly interesting. If you try to shoehorn a Lisp into being single-paradigm, you throw away a long tradition of letting the programmer know best. So I suppose I can say I'm just happy to see this kind of experimentation.

-Ryan

--
You received this message because you are subscribed to the Google Groups "Medley Interlisp core" group.
To unsubscribe from this group and stop receiving emails from it, send an email to lispcore+u...@googlegroups.com.
To view this discussion visit https://groups.google.com/d/msgid/lispcore/CAGi1hzsgy_szfOdwR5ft1hkRYJSF7cYkbndVg9xUDC-WnEKV%2BQ%40mail.gmail.com.

Paolo Amoroso

Apr 18, 2026, 11:06:52 AM
to Ryan Burnside, Medley Interlisp core
My problem with FSet is that it tends to pack special characters and short symbols into terse, visually tight code, which feels unreadable, Perl-like, and arcane to me. I get a similar impression from Clojure code.

My eyes need more breathing room with longer, descriptive symbols and less special character noise.


pixel...@gmail.com

Apr 18, 2026, 11:56:57 PM
to Medley Interlisp core
I think it's a cultural thing, where folks who insist on functional programming are already steeped in opaque math notation.
I've never been a fan of non-English symbols, aside from some grade-school staples like the standard operations (+ - * / etc.).
At first encounter you can't be sure what -> and ->> are in Clojure, for example. (Similar forms appear in Haskell and OCaml.)

It would at least be a clue to the reader had they been named "thread-first" and "thread-last", etc.
Though on the other hand, having SUBTRACT is kind of overkill to me.

Clojure in general has a few anti-patterns (in my opinion) that they culturally hold dear.
These contribute to that style:

1. Threading "pipelines" instead of new functions. At worst you see the same kind of threading over and over instead of naming a process.
ex: (-> email message body signature) copied all over rather than simply naming that process once, i.e. (get-signature email).
This hurts the code because you end up hunting for keywords and trying to find every occurrence of the pipeline, rather than adjusting get-signature once.
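Restating that fix as a hedged Common Lisp sketch (the accessor names here, like the ones in the pipeline above, are hypothetical):

```lisp
;; Instead of repeating the same access chain at every call site...
;;   (signature (body (message email)))   ; copied all over
;; ...name the process once and call it everywhere:
(defun get-signature (email)
  "Extract the signature from an email's message body."
  (signature (body (message email))))
```

Now a change to the email structure means editing one function, not grepping for every copy of the chain.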

2. Separating the language from what you write; avoiding macros.
This means that instead of tight little DSLs built with macros, you end up writing the same kind of thing over and over, because the language is never shrunk to the domain.
Clojure "the language" is often treated like, say, C++, where the language is static and beyond extending. (This is NOT actually the case, but it is culturally the norm.)

3. Data dispatch drives the show; never create specialised objects: "Hickey's Just Use Maps".
Because everything is a messy hashmap, you actually create a kind of horrible duck typing.
This means that to find out what a FOO even consists of, you may need to trace all the pipelines that interact with it many steps backward.
And even then the answer is non-deterministic, since your particular usage might not involve an expected key.

So I think a lot of what you see is the result of immutable data containers being threaded through various transitional language functions with little abstraction.

It's sort of the opposite of what SICP teaches.
Scheme doesn't have the luxury of creating new complex strong types directly.
So SICP teaches how to properly safeguard and protect what it CAN do, with contracts, naming, and predicates.
A new "thing" isn't just a generic hashmap; usually a mini DSL is created to ensure it is constructed properly and accessed properly.
It's a bit like the record package in Interlisp. (Sure, they can just be lists, but we protect access and creation via a DSL.)
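A tiny sketch of that constructor/selector/predicate style in Common Lisp (all names here are made up for illustration; the representation is deliberately hidden behind the functions):

```lisp
;; Hypothetical mini "DSL" for a point, SICP-style: a constructor,
;; selectors, and a predicate hide the fact that it's just a list.
(defun make-point (x y) (list :point x y))  ; constructor
(defun point-p (obj)                        ; predicate
  (and (consp obj) (eq (first obj) :point)))
(defun point-x (p) (second p))              ; selector
(defun point-y (p) (third p))               ; selector
```

Callers never touch the list directly, so the representation could later become a struct, a vector, or a CLOS object without breaking any client code.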

SICP: The underlying implementation doesn't matter as much as concrete accessors (selectors in SICP terms) and the abstraction.
Clojure: Just use hashmaps bro.

I didn't mean to deliver a personal sermon, but apparently here we are. :P

Sincerely,
Ryan