Given Peter Norvig's work on Lisp and Python, it's a pity that after 24 years, his "Python for Lisp Programmers" essay from 2000 is still mostly true.
Python might have overtaken Lisp's role in AI, but its tooling still has some catching up to do.
"The two main drawbacks of Python from my point of view are (1) there is very little compile-time error analysis and type declaration, even less than Lisp, and (2) execution time is much slower than Lisp, often by a factor of 10 (sometimes by 100 and sometimes by 1). Qualitatively, Python feels about the same speed as interpreted Lisp, but very noticeably slower than compiled Lisp. For this reason I wouldn't recommend Python for applications that are (or are likely to become over time) compute intensive (unless you are willing to move the speed bottlenecks into C). But my purpose is oriented towards pedagogy, not production, so this is less of an issue."
It's still… not the same. In CL (and especially with SBCL), we get compile-time (type) errors and warnings in the blink of an eye, when we compile a single function with a keystroke (typically C-c C-c in SLIME).
I haven't used CL myself, but it wouldn't surprise me if CL does a better job. I'm just pointing out that Python is significantly better in this respect than it used to be. In particular, using VS Code with Pylance, I can see type errors highlighted as soon as I write the code; I don't even need a keystroke. 24 years ago, I don't think anything like that was possible; the language didn't even support type annotations.
> wouldn't recommend Python for applications that are (or are likely to become over time) compute intensive (unless you are willing to move the speed bottlenecks into C)
The compute-intensive parts of AI already happen in CUDA/C++/C anyway.
The difference is that Lisp doesn't impose the two-language problem.
Julia, Mojo, XLA, Triton, ... are picking up speed, and pressure for a JIT in CPython is mounting from Microsoft and Facebook, precisely because not everything is AI, and not everyone wants to write C++ or C to speed up Python.
On the other hand, a good Lisp implementation of neural nets would offer a specific set of macros dedicated to matrix operations. From the Lisp point of view, that set of macros is a domain-specific language. So you still have two languages.
The author was a labmate, and the person who introduced me to python. Taking over one of his codebases was rather a formative part of my career. (That particular code was considerably more normal than the one linked above).
I used Norvig’s lispy2 to build a low-code UI. I modified the interpreter to accept JSON-flavored lisp, basically replacing parens with brackets. The upside is that it was very, very easy to make a React front end that manipulates JSON (JLisp). My thinking was: I need a serialization format for operations from the front end, and a way to interpret them. I could write my own language that no one has heard of, or use lisp, which few have used.
One main thing that one gets for free this way is garbage collection. I once started writing a lisp interpreter in C++, but that kind of fell by the wayside once I realized that it is quite easy to create cyclic references in lisp, and then using shared_ptr is not good enough.
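The cycle problem is easy to demonstrate. A toy sketch in Python (the `Cons` class is hypothetical, just for illustration): a self-referential cons cell keeps its own reference count above zero forever, which is exactly why shared_ptr-style counting leaks, and why CPython needs a tracing cycle collector on top of its refcounts.

```python
import gc

class Cons:
    """A mutable cons cell, like Lisp's."""
    def __init__(self, car, cdr):
        self.car, self.cdr = car, cdr

gc.collect()          # clear any pre-existing garbage first

# (set-cdr! x x): the cell now refers to itself.
x = Cons(1, None)
x.cdr = x

# With plain reference counting (shared_ptr-style), dropping the last
# external reference leaves the count at 1 forever: a leak. CPython
# only reclaims it because its cycle detector traces the object graph.
del x
unreachable = gc.collect()  # number of unreachable objects found
```

After `del x`, refcounting alone sees a live object; `gc.collect()` reports at least one unreachable object and frees it.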
> The beauty of Scheme is that the full language only needs 5 keywords and 8 syntactic forms.
Is there a learning resource that covers exactly this for those wanting to write software in lisp in 2024?
As "first principle thinkers", in some ways all hackers crave that "fundamental building blocks" approach, a bit like wanting to know how we get from transistors to full computers and every step along the way. Most of us have made peace with accepting the many abstractions because we're slinging highly abstracted, mostly Python and JavaScript, code at startups.
So learning Lisp seems like a nice, digestible point along the continuum to start from, if indeed there's some eloquent way to learn:
>only needs 5 keywords and 8 syntactic forms.
Various Scheme texts cover writing your own interpreter in more or less detail: Structure and Interpretation of Computer Programs (AKA SICP)*, the Little Schemer, Concrete Abstractions**. Norvig's Paradigms of AI Programming also contains a Scheme interpreter, though the text mainly focuses on Common Lisp. And finally I'd throw Exploring Programming Language Architecture in Perl*** in the mix as a more in-depth look at creating a Scheme interpreter.
The style of the Little/Seasoned/Reasoned Schemer is a little different from the other books I highlighted here, but various people have said they really like it, including Guy Steele, so they may be worth a look if the others don't really work for you.
I suppose 5 is better than lots but you can totally write a lisp with zero keywords.
I'm not sure what a syntactic form means here - dot as in dotted pair, nil, quote, quasiquote, parens? Having trouble coming up with 8 distinct syntactic things.
The basis set underlying lisp is something like the lambda calculus with optional delayed evaluation, a product type and some file I/O.
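For concreteness, here is a toy sketch (in Python, in the spirit of Norvig's lispy; the function names are mine, not his) of how small that core can be: four special forms plus function application cover a surprising amount of the language.

```python
# A minimal lispy-style evaluator: quote, if, define, lambda,
# and application. Expressions are plain Python lists/strings/ints.
import operator as op

def standard_env():
    return {'+': op.add, '-': op.sub, '*': op.mul, '<': op.lt}

def evaluate(x, env):
    if isinstance(x, str):                 # variable reference
        return env[x]
    if not isinstance(x, list):            # literal
        return x
    head = x[0]
    if head == 'quote':                    # (quote e) -> e unevaluated
        return x[1]
    if head == 'if':                       # (if test conseq alt)
        _, test, conseq, alt = x
        return evaluate(conseq if evaluate(test, env) else alt, env)
    if head == 'define':                   # (define name e)
        _, name, expr = x
        env[name] = evaluate(expr, env)
        return env[name]
    if head == 'lambda':                   # (lambda (params...) body)
        _, params, body = x
        return lambda *args: evaluate(body, {**env, **dict(zip(params, args))})
    proc = evaluate(head, env)             # application
    return proc(*[evaluate(arg, env) for arg in x[1:]])

env = standard_env()
evaluate(['define', 'square', ['lambda', ['n'], ['*', 'n', 'n']]], env)
result = evaluate(['square', 7], env)  # → 49
```

Everything else (let, cond, and so on) can be layered on as sugar over these forms, which is roughly what the "5 keywords" claim is getting at.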
The optimal basis set for computation is either non-unique or not yet discovered as far as I can tell - different people arrive at different combinations.
The second edition of Friedman's Essentials of Programming Languages is very good for this. A tough read, but very good. The second edition is written in Scheme; I think the third changed? (I can't recall the details, but when I was hunting for a copy, the recommendation was to stick to the second, as I was specifically looking for Scheme.)
My way of learning about Lisp was based on first principles, after reading Norvig's blog post many times and working through SICP. What helped is that I had just finished the nand2tetris book, which is very much a first-principles speedrun of how a computer works.
So I built my own Lisp machine based on the nand2tetris architecture, with lisp instructions supported at the simulated chip level. I'm currently in the process of doing a writeup of the project; you can see my attempts here if you are interested: https://deosjr.github.io/
I have an almost working version of a REPL running on the machine now, including garbage collection and parsing Lisp with Lisp and passing the output to an eval function written in assembly (the operating system, if you will).
Many years ago I wrote a simple programming language that would macro-expand into lambda calculus statements that could then be compiled down to different sets of combinators, the simplest being just S & K, which are pretty fundamental given how simple they are. The fact that you can express things like recursion in the lambda calculus (see the name of our host site) and therefore in combinators still amazes me.
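The S and K combinators are small enough to sketch in a few lines of Python. This is just an illustration of the standard definitions, showing that the identity combinator I falls out as S K K:

```python
# S x y z = x z (y z);  K x y = x.  Curried as nested lambdas.
S = lambda x: lambda y: lambda z: x(z)(y(z))
K = lambda x: lambda y: x

# I = S K K, since  S K K x  =  K x (K x)  =  x
I = S(K)(K)
value = I(42)  # → 42
```

That two such tiny functions form a Turing-complete basis is the surprising part; recursion then comes via a fixed-point combinator built from them.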
This might sound crazy or stupid, but I really want to know: is there some Lisp with manual memory management? I love Lisp syntax. People complain about the parentheses, but for me they are a blessing. I like how extremely uniform and regular they look.
But all Lisps I've seen have garbage collectors. If I could find a Lisp with manual memory management, I could ditch C++ in favor of that Lisp. Is there one?
You can use a regular Common Lisp distribution like SBCL and just stick to foreign types (https://www.sbcl.org/manual/#Foreign-Types). This way you still have a GC for lighter tasks, but with the option of manual memory management.
You could do this in Common Lisp with global variables. They are not garbage collected. You'd have to decide how manual you want this to be. The ultimate manual setup would be something like

  (defvar *bag* (make-array 1024 :element-type '(unsigned-byte 8)))

which is just a 1k bag of bytes. Then write some functions to allocate from it, keep track of it, etc. A typical Common Lisp implementation will make a "specialized array" when it does this; that is, the array is not filled with a bunch of pointers to individually-allocated bytes.
There are other ways you could do it as well (I don't know if they would all be efficient), and there are ways you can write a FREE function to release these allocations or mark them for re-use.
These methods would not really release memory back to the OS. However, C malloc and free typically do not really free memory back to the OS either. You can easily do something in Common Lisp that's on equal footing with C in that regard.
Edi Weitz in "Common Lisp Recipes" points out that when you're doing something like this you're "pooling" memory, and in effect you're replacing automatic memory management with your own hand-rolled memory manager, and chances are it's not going to be as good as an automatic memory manager. But certainly it's doable.
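The pooling idea above can be sketched generically; here is a toy bump allocator over a preallocated byte bag (Python for illustration, the class and method names are mine), doing exactly the hand-rolled bookkeeping described: allocate by advancing an offset, FREE by putting blocks on a re-use list, and nothing ever goes back to the OS.

```python
class Pool:
    """A toy pool allocator over a preallocated 'bag of bytes'."""
    def __init__(self, size):
        self.bag = bytearray(size)  # the 1k bag of bytes
        self.top = 0                # next free offset (bump pointer)
        self.free_list = []         # (offset, size) blocks released by free()

    def alloc(self, n):
        # Reuse a freed block of exactly the right size if one exists.
        for i, (off, sz) in enumerate(self.free_list):
            if sz == n:
                del self.free_list[i]
                return off
        if self.top + n > len(self.bag):
            raise MemoryError("pool exhausted")
        off, self.top = self.top, self.top + n
        return off

    def free(self, off, n):
        # Mark the block for re-use; memory never returns to the OS,
        # just like a typical malloc/free.
        self.free_list.append((off, n))

pool = Pool(1024)
a = pool.alloc(16)   # offset 0
b = pool.alloc(32)   # offset 16
pool.free(a, 16)
c = pool.alloc(16)   # reuses a's slot: offset 0 again
```

As the Weitz point suggests, this is a real memory manager with all the classic problems (fragmentation, sizing policy), just a very naive one.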
I bet you could go quite far writing a straightforward compiler using an S-expr syntax.
And by a "compiler" I'm talking about doing something simple like taking Pascal, converting it to an s-expr grammar, and going to town.
Scheme can be a simple Algol-style language. Using s-expr syntax just makes your lexer/parser much simpler. And (simple) compilation is really not that hard (just compile to C for your first crack if you want).
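To show how little front-end work s-exprs need, here is essentially the reader from Norvig's lispy, condensed: the whole lexer is two string replaces plus split, and the parser is one recursive function.

```python
def tokenize(src):
    """Lex an s-expression: pad parens with spaces, then split."""
    return src.replace('(', ' ( ').replace(')', ' ) ').split()

def read(tokens):
    """Parse a token list into nested Python lists (the AST)."""
    tok = tokens.pop(0)
    if tok == '(':
        lst = []
        while tokens[0] != ')':
            lst.append(read(tokens))
        tokens.pop(0)  # drop the closing ')'
        return lst
    try:
        return int(tok)   # numeric atom
    except ValueError:
        return tok        # symbol

ast = read(tokenize("(define (square n) (* n n))"))
# → ['define', ['square', 'n'], ['*', 'n', 'n']]
```

Compare that with what a lexer and parser for even a small Pascal-like grammar look like; the s-expr front end is why "write your own language" is a weekend project in this family.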
This code is really beautiful and makes a lot of hard things easy to understand. I read this article many times before I developed the confidence to do it myself.
Python gives you a lot of things for free. Writing the lisp in C is quite the adventure in its own way.
I would prefer a Python interpreter written directly in rv64 assembly with a near-zero SDK. It would run on x86_64 and arm64 via an rv64 interpreter (with some code-path adaptations, of course).
I made a lisp that's somewhat close to what you described. It's a freestanding lisp that targets the Linux kernel directly. No libraries, not even libc.
It is more upfront work, for sure. But you get immunity from the planned obsolescence of syntax feature creep from ISO or GCC extensions (a 5-10 year cycle)... and from compiler implementations.
Additionally, if we are honest with ourselves, for a lot of programs the coding time of the bulk is actually negligible from a life-cycle perspective. That is not the case for all types of programs, but for a lot of system programs it is very true.
And rv64 is supposed to become THE ISA standard. One of the main reasons for C's existence is ISA abstraction, which is kind of moot here. Of course, I wish RISC-V success.
What I find promising about LISP is the ability to do term rewriting and macros.
But people write Lisp in an imperative style rather than declaratively defining desired behaviour. I don't think we've sufficiently solved how to describe desired behaviour to a computer.
Term rewriting behaviours. What are your thoughts?
I started trying to implement term rewriting in my LISP parser: the idea that we can match on trees and transform them, subsume branches, or move branches around arbitrarily. You kind of want the matching function and the transformation function to be imperative, or sometimes query-like.
So use ASTs for behaviour and relationships and imperative LISP macros for transformations and rewriting.
The reason I say this is that I dream of a compositional language where I can create a cell in a spreadsheet and say that it should have these behaviours.
Lisp is not the language for that unfortunately, it is very much an imperative language with better syntax and _some_ macros.
Scheme is close but quotation isn't thought about nearly enough. It is generally a CS problem as logic systems with quotation are very much an open problem. I think that types have gotten too much attention and quotation way too little.
Macros are basically a way to deal with the fact that neither Lisp nor Scheme has first-class quotation which you can evaluate at leisure. I understand why they did it for efficiency reasons, but I think we should have at least one language that gets quotation right.
I think a lot of confusion comes from the fact that a lot of people (including, presumably, the parent) use "Lisp" to mean Common Lisp, whereas many others take it to mean what Common Lisp people might call "lisp" or the lisp family. You can easily have a substanceless argument this way, and many often do!
There is no member of the lisp family that treats quotation as a first class citizen of the language. Granted, lisp is one of the few language families where it's even a part of the language but it's at best an afterthought, which is why you need macros to manipulate unquoted expressions.
What I was thinking about in your post was the statement "[Lisp] is very much an imperative language with better syntax and _some_ macros". Of course, some lisp-family languages are very much not imperative, and some get by without macros (e.g. Clojure).
Can you imagine a language with such a first-class quoting system?
What is the requirement defining "first class quotation"?
There is an invention known as syntactic closures, devised in support of implementing Scheme's hygienic macros. In what ways does a syntactic closure fall short of being a "first class quotation"?
Sure, whatever; what I intended to bring up was the specific way of implementing hygienic macros using syntactic closures. Syntactic closures are a way of quoting code, with context. Hygienic macros don't have to be implemented with syntactic closures.
The author you cited seems to be working on something very similar to syntactic closures. The paper has a lot of references, but very few of them are anything Lisp- or Scheme-related. It looks like the author is working entirely on his own.
In a work like this, I'd expect syntactic closures to be acknowledged, with a discussion of how the work being presented is different.
Syntactic closures are a way to hide the fact that lambdas are the only abstraction in lambda calculus.
The work that I cited has types _and_ quotation as first class abstractions.
It's a bit like asking to explain proof theory in terms of Gödel numbers. Sure, you can technically do it. You're completely missing the point if you do, however.
I think syntactic closures don't have to be concerned with type, because their job is just to ensure that the symbols included in a quote refer to what they are supposed to refer to, regardless to where that quote is moved. When the closures are evoked, reasoning about types can take place then.
Lots of term rewriting found in compilers. Usually based on pattern matching on trees followed by some guard function to deal with DAGs / graphs. Lisp/scheme don't have pattern matching in the core - possibly because it's much cleaner with a static type system doing some of the dispatch - but you can implement one or use one of the libraries.
Declarative programming is roughly that control flow is handled by the language runtime instead of the programmer. That's either some dedicated language (makefile, yacc) or a DSL in some non-declarative language. It's not lisp, but the tablegen programs used by llvm's backend are an interesting example of a concise declarative syntax turning into a load of C++.
I'd say both are examples of things lisp doesn't really do for you. They're more reasonable to implement in a lisp than in most other languages.
You think lisp is "weird and sinister or ghostly" and that 'we' are still wrestling with something 66 years later?
This sounds more like someone getting caught up in the pageantry of a niche that pragmatic people left behind a long time ago. Lisp was very influential, but those advancements have made their way into practical languages, and Lisp has been impractical for many decades at this point.
Scheme has pioneered a lot of stuff over the last 30-40 years that is still fresh for most of the PL world, hygienic macros (like Rust is trying to implement), delimited continuations (which underlies Java's new virtual threads, see https://www.youtube.com/watch?v=9vupFNsND6o), efficient closure representations (used all over the place since everybody got lambda fever). Into formal verification? Scheme was there in 1995 https://www.semanticscholar.org/paper/VLISP%3A-A-verified-im...
And there is at least half a dozen things in Racket that I wish mainstream languages could get their ass in gear and copy, but no such luck.
Insert meme of Nolan Grayson, representing Lisp/Scheme/Racket programmers, pointing at fighter jets labeled "transpilers" and saying: "Look at what they need to mimic just a fraction of our power!"
I think the roles would be reversed when trying to make fast, small interactive software that people want to use. "Powerful" is interesting but clear, straightforward and fast is better. No web browser, database, video codec, or high end video game is written in scheme. At best it's inefficient ancillary software that someone wrote in scheme because they wanted to, not because that's what a user wanted.
A language that contains all Lisp features becomes identifiable as a member of the Lisp family, and is then removed from the discussion of languages that don't have all Lisp features.
Ah yes, all of the modern practical languages allow you to connect to a running system, redefine a class, and automatically update every existing instance of that class.
It's easy to just assign different functions to a metatable in lua.
That isn't the point, though. You were saying lisp is some mystical thing and that it is 66 years old. Its influence happened decades ago. It isn't about every language having every feature. Pretty much all software is made without many of Lisp's features because not every tradeoff is worth it. Lisp itself is barely used because it isn't about a checklist of features; the pragmatism of an ecosystem, syntax, actual compilation, etc. are all crucial. Lisp is not a modern tool; it is an influential invention from over half a century ago.
When people are stuck on an airplane, they don't try to watch citizen kane, they want to watch literally anything else. Influential isn't the same as being good by modern standards. People don't want to write lisp and people don't want to use software written in lisp. Sorry for the harsh reality check.
https://norvig.com/lispy2.html
The followup expands the original to make it both more complicated and more complete.