Reader macros in Common Lisp (gist.github.com)
80 points by wtbob on Oct 19, 2015 | hide | past | favorite | 34 comments


<rant> It's sad that Clojure opted, for no good reason IMHO (ease of implementation, perhaps?), not to allow reader macros. I know, they claim that reader macros lead to "snowflake" code. But this is Lisp. If you don't want tweakability, go back to Java. </rant>
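For readers who haven't used them, here is a minimal Common Lisp sketch of what's being given up (the bracket syntax is illustrative, not from the article):

```lisp
;; Teach the reader to parse [a b c] as an ordinary list.
;; Illustrative only; real use needs more care (vectors, interaction
;; with other read-table entries, etc.).
(defun read-bracket-list (stream char)
  (declare (ignore char))
  ;; Read forms until the matching close bracket.
  (read-delimited-list #\] stream t))

(set-macro-character #\[ #'read-bracket-list)
(set-macro-character #\] (get-macro-character #\)))

;; Now [1 2 3] reads as (1 2 3).
```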


Ease of implementation is an excuse for other things in Clojure as well; the single-pass compiler comes to mind.

But I would give up all of these for an interactive debugger. The series below shows how Common Lisp debuggers are light years ahead of Clojure's.

http://malisper.me/2015/07/07/debugging-lisp-part-1-recompil...

http://malisper.me/2015/07/14/debugging-lisp-part-2-inspecti...

http://malisper.me/2015/07/22/debugging-lisp-part-3-redefini...

And this is important. I develop Clojure for a day job and this is a constant PITA.


Ease of implementation is not the reason for either the lack of reader macros or the single-pass compiler.

Rich Hickey discusses reader macros here: http://clojure-log.n01se.net/date/2008-11-06.html

Quotes:

"Clojure doesn't allow user-defined reader macros because they can't be combined - there's no namespace support, unlike for regular macros"

"CL has user reader macros and the consensus is to avoid them"

"you are unlikely to convince me they are a good idea"

"I recognize fully this is less flexible than CL in this area, OTOH, I'd like to think Clojure will engender less desert-island programming than did CL -- speaking a common language fosters library development, shared idioms etc. When everyone creates their own language there's less of that"

"and even standard reader macros pose an understanding challenge for newcomers"

Compilation units are discussed here:

https://news.ycombinator.com/item?id=2467359


Cider now has a debugger:

http://batsov.com/articles/2015/06/16/cider-0-dot-9/

Bozhidar Batsov just recently did a podcast with Cognitect where they speak about this and other things related to Slime and the Clojure programming environment:

http://blog.cognitect.com/cognicast/080


While there are no first-class reader macros in Clojure, reader literals cover many common use cases. For the small set of use cases where you REALLY need reader macros, it's possible to hack them into the language. One such implementation: https://github.com/klutometis/reader-macros

As far as interactive debugging - that's also something I missed from CL, but is relatively easy to hack into the language. A combination of REPL support for the standard JVM debugger (https://github.com/ohpauleez/cdt), with a nice expression-based REPL debugger (https://github.com/ohpauleez/cdt/blob/master/src/cdt/debug_r...) gets you pretty far. You can even trigger this environment on any exception using the built-in JVM support (https://gist.github.com/ohpauleez/4ca0b646b89512f38ea2#file-...)

To see how far you can take reader literals, take a look at my talk from last year's Conj, "Unlocking Data-driven Systems" - https://www.youtube.com/watch?v=BNkYYYyfF48


Interactive debugging is not "relatively easy to hack into the language". In fact, it's impossible since Clojure lacks the Common Lisp condition system.

When an exception is thrown in Java, the stack unwinds, meaning all state in the stack frames between the point of the throw and the handler is thrown out.

In Common Lisp, the stack does not unwind, and high-level logic can simply restart the computation (after fixing the error or trying different strategies) without losing any intermediate state. This killer feature is what makes interactive debugging possible.
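A minimal sketch of the mechanism (restart-case, handler-bind, and invoke-restart are standard CL; the surrounding functions are made up):

```lisp
;; The error is signaled inside restart-case, so the stack is still
;; intact when the handler runs; invoke-restart resumes from there
;; instead of unwinding past the frame.
(defun parse-entry (s)
  (restart-case
      (if (every #'digit-char-p s)
          (parse-integer s)
          (error "Bad entry: ~a" s))
    (use-value (v)
      :report "Supply a replacement value."
      v)))

(defun parse-all (entries)
  ;; High-level policy: on any error, substitute 0 and keep going.
  (handler-bind ((error (lambda (c)
                          (declare (ignore c))
                          (invoke-restart 'use-value 0))))
    (mapcar #'parse-entry entries)))

;; (parse-all '("1" "oops" "3")) => (1 0 3)
```

With no handler bound, the same error drops you into the interactive debugger, which offers the use-value restart as a menu item.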

Hacks like the one you linked to are a sorry state of affairs compared to what I described.


While I appreciate the dialog, your comment is rather condescending. I've worked extensively in Common Lisp and am well aware how the condition system works. My claims had little to do with conditions and more to do with the benefits in use: it's pretty simple to have a dynamic var (say, handler), bound to anything you like, catching errors before the stack unwinds, including an expression-based REPL debugger that lets you change locals and return a value in place of the exception. It has all the limitations you can imagine, but for many cases this is plenty to be effective.


Reader macros are brilliant - but as with many powerful language features they should be used carefully. I vividly remember receiving a release of some code (early 90s - a tape in the post), looking at it, and thinking that the source wasn't Lisp at all. It turned out someone had invented their own syntax and implemented it all as reader macros on top of Lisp. Not quite as bad as trying to turn C into Pascal with #defines, but in a similar spirit.


The author does address that in the article:

> However with great power comes great responsibility, so you should learn to use them with great care. For example, while a JSON reader was a good exercise in understanding reader macros, I would never use it in real world for the simple reason that easier alternatives already exist.

and:

> There is a saying that goes along these lines: where a function will do, don't use a macro. A corollary for reader macros could be: where a sexp will do, don't use a reader macro. Yes, it might be a subjective call, but think twice before using a reader macro.

The second is particularly spot-on: sometimes you need a reader macro, but often you don't. But when you do…you do.


And what is wrong, exactly, with changing the language syntax?

DSLs become more readable and closer to the way the problem domain experts think if they are using the more convenient syntax.


Personally I think it's totally cool as long as you stay consistent and your language makes sense.

It is written in SICP that programming is in fact building layers of abstraction, the lower one serving as a programming language for the higher one. Macros or not, everyone designs new 'programming languages', though today's term of art is "API".


There is nothing wrong with it if you can justify it to the next programmer who has to deal with your code. If your gain is only marginal but I have to read and understand 200 lines of code to understand your DSL, I might be mildly annoyed.


You know there is such a thing as documentation, right?

If you have to read and understand 200 lines of code with no documentation, take it up with the incompetent developer.

This argument holds no water whatsoever. Bad programmers can cause all sorts of damage, why should everything be dialed down to their level?

Clojure forces you to deal with concurrency issues in a way that might be premature optimization at best and a tremendous waste of time at worst. Unlike with DSLs in CL built on reader macros, I have no choice in the matter; it's all or nothing. Yet I don't see you complaining.

Good DSLs (meaning they map very well to the problem domain) are irreplaceable. They can make a problem completely disappear.


A well-designed DSL should be self-documenting and obvious. And the right tools (like a proper IDE integration, with syntax highlighting and autocompletion) would make such a DSL much more obvious than a library of functions (which still must be thoroughly documented to be usable).


> A well-designed DSL should be self-documenting and obvious.

ROFL. You're adorable. :)

You're also conveniently ignoring the maintenance required on the DSL itself. DSLs are, themselves, code, and code must be tested, updated, will likely experience regressions, etc.


I've been building properly designed, syntactically rich DSLs for over a decade. It is much, much easier than building usable APIs.

DSL compilers are the simplest and cleanest code possible. They're nothing but a chain of very trivial tree rewrites, and with a help of a proper DSL you can write and maintain them easily.
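For what it's worth, one such rewrite pass can be sketched in a few lines of Common Lisp (the `inc` form is a made-up mini-DSL construct):

```lisp
;; One pass of a DSL compiler: a plain function that walks the
;; s-expression tree and rewrites a single construct. A full
;; compiler is a chain of passes like this one.
(defun rewrite-inc (form)
  "Rewrite every (inc x) into (+ x 1)."
  (cond ((atom form) form)
        ((eq (first form) 'inc)
         (list '+ (rewrite-inc (second form)) 1))
        (t (mapcar #'rewrite-inc form))))

;; (rewrite-inc '(let ((y (inc (inc x)))) y))
;; => (LET ((Y (+ (+ X 1) 1))) Y)
```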


While I don't think readability is a problem if you are judicious with your use of reader macros, the real problem is that they don't play well with editing tools: all those nice Emacs shortcuts (forward-sexp, backward-sexp, etc.) won't work correctly with any JSON that you write in your Lisp files.

Forget JSON, the problems are evident even when using something like CL-INTERPOL, which makes minimal changes to the read-table.


Yes, of course you need your IDE to play well with the reader macros. That's why I use PEGs to construct such macros and to provide a feedback for the IDE while parsing (e.g., token locations, definitions, etc.).


Programming languages carry with them certain conventions. Straying too much from the established code style will make your code a pain to work with.

It's pretty hard to hit the right balance between adhering to the original language and matching your DSL to the problem domain. If you build a DSL that requires a day of studying code and documentation in order to make a minor change, you might be better off without it.


> Straying too much from the established code style will make your code a pain to work with.

This is a theory. Reality is a bit different. There is nothing in common between, say, JavaScript and the regular-expression DSL embedded in it, yet they work fine together.

Code style is overrated.


The difference is that's a standard embedded DSL (another excellent example is C#'s LINQ).

Inventing your own alien syntax is a recipe for long-term code maintenance disaster. Anyone who's worked on a codebase that's lived longer than a decade has likely experienced this pain firsthand.


Most of the code I worked with was 30-40 years old, often full of ad hoc DSLs. Refreshing this kind of code was easier than any other kind: rewriting the DSL itself with modern technologies was sufficient, and 90% of the remaining code base did not need much maintenance after that.


Because if there is no explanation of what the new syntax is, and it's one developer's take on making Lisp look a bit more Pascal-like (which this was; it wasn't a DSL), I don't really see what it achieves other than confusion.


So if lisp is so great and so are reader macros, why didn't Macsyma/Maxima win over Mathematica? It is just a bunch of reader macros and lisp, right? It seems like a CAS is a prime use case for lisp to really shine.

Why isn't anyone else writing new CASes in lisp?


Because Lisp ran like shit on even the high-end hardware of the day. Plus it was proprietary; nothing like SBCL existed. By using C, Wolfram was able to offer a much more performant system with a much lower TCO. Combine that with his legendary self-promotion skills and you have a juggernaut in the marketplace.


Maclisp ran quite fast on the PDP-10, even faster than DEC's FORTRAN for a while (DEC was sufficiently embarrassed that they put more effort into it).

High quality Lisps being proprietary during the chaotic '80s through the early-mid '90s was certainly an issue for Maxima, but for the licensed Macsyma (see my other comment on the history) I think that would have been an acceptable part of the cost of goods; too bad Symbolics demanded you buy one of their Lisp Machines to run it for too long.

Self-promotion ... yeah, Mathematica wins hands down there ^_^.


> So if lisp is so great and so are reader macros, why didn't Macsyma/Maxima win over Mathematica?

There's one very specific reason: somehow, someway, a deal was set up where Symbolics managed to get the sole commercial licence to it, as far as we can tell to require users to buy one or more of their machines to use it, if they weren't using MIT-MC, a fast ECL KL10, but still just one machine.

Stephen Wolfram had already gotten screwed by Caltech, WRT his Symbolic Manipulation Program (SMP), at least as badly as Joel Moses did, but I suppose among other things, not having tenure, nor more than a 5-year association with Caltech (by comparison, Joel eventually became Provost), nor as much invested in SMP, he left, and after 3 years at the IAS, presumably was damned sure to dot all his i's and cross all his t's when he started the next version.

tl;dr: Macsyma was put in the hospital as a toddler and never really recovered, SMP was strangled in its cradle, and with the hindsight of history, Mathematica avoided these fates.


> Why isn't anyone else writing new CASes in lisp?

Because Maxima exists and is open source, so people interested in better Lisp-based CASes are just contributing to Maxima.


I have heard that Maxima's source code is unreadable 1960s Lisp and nobody actually tries to understand it; they just pile more onto the existing mess. Is that true?


Well, the best quote by Joel Moses, the father of Macsyma and therefore Maxima, is:

> APL is like a beautiful diamond - flawless, beautifully symmetrical. But you can't add anything to it. If you try to glue on another diamond, you don't get a bigger diamond. Lisp is like a ball of mud. Add more and it's still a ball of mud - it still looks like Lisp.

So if your thesis is correct, that shouldn't be a big problem, unless there's something really wrong with the core.


Mathematica is a Lisp.


Yeah, yeah, so are JavaScript and R, right?

No, really, why isn't Mathematica an actual lisp, one with a completely homoiconic language and with (reader) macros? Why aren't people writing modern CASes in lisp?


Doesn't answer your question specifically but it's still relevant: Why Wolfram Mathematica did not use Lisp (2002) (https://news.ycombinator.com/item?id=9797936)


Mathematica is homoiconic. And instead of reader macros it uses configurable presentation transformers (Mathematica notebooks) for different visual representations (and WYSIWYG editing) of the underlying Lisp-like lists. Even graphics in Mathematica are nothing but lists internally.



