
Better question: why not Lisp?


Performance:

Getting Lisp to run as fast as C takes major effort when at all possible.

Resources:

Lisp needs a lot of space to do its thing; and while it's certainly possible to downsize it, you're left with something that's not really Lisp anymore.

Ecosystem:

Finding solid libraries is tricky since it's not very popular professionally.

Power:

Unleashing the full power of Lisp in a diverse team is a recipe for an adventure, if not disaster.


> Getting Lisp to run as fast as C takes major effort when at all possible.

The Computer Language Benchmarks Game shows Lisp code as generally being between 2x and 10x slower than C++[1]. As fast as C? No. Way faster than Python, and more than fast enough to be used in almost every single application, modulo hard-real-time systems and AAA video games? Yes.

> Lisp needs a lot of space to do its thing; and while it's certainly possible to downsize it, you're left with something that's not really Lisp anymore.

Again, while a 13MB SBCL image might be significantly larger than a 100KB C program, given that that's the entire compiler and runtime bundled in, and that additional code scales in size roughly like C/C++, it's still viable for almost every kind of program (and still an order of magnitude smaller than Electron). Same deal with memory usage.

> Finding solid libraries is tricky since it's not very popular professionally.

This one is so true it's not even funny. (There are C, C++, and Python FFIs that cover most of the stuff you want, although that's kind of cheating.)
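For what it's worth, that kind of FFI "cheating" is how most dynamic languages borrow C's ecosystem. A minimal Python `ctypes` sketch of the same trick (the library name is platform-specific; `libm.so.6` is a glibc-Linux assumption):

```python
import ctypes
import ctypes.util

# Locate the C math library. The name is platform-specific;
# find_library usually resolves it, with "libm.so.6" as a glibc-Linux fallback.
libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")

# Declare the C signature so ctypes marshals doubles correctly.
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(2.0))  # 1.4142135623730951
```

CFFI in Common Lisp plays the same role: declare the foreign signature, then call the C function as if it were native.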

> Unleashing the full power of Lisp in a diverse team is a recipe for an adventure, if not disaster.

The list of companies using Clojure[2], in addition to the commonly-cited Viaweb/Orbitz/Grammarly, begs to differ. Anecdotally, most Lisps have fewer footguns than C++ - if people can figure out how to use Stroustrup's monster in massive video games, Lisp is easy.

[1] https://benchmarksgame-team.pages.debian.net/benchmarksgame/...

[2] https://clojure.org/community/companies


> ... AAA video games? Yes.

One of my favorite stories of Lisp ideas used commercially is the dialect that Naughty Dog developed for their game development.

It started out being developed for Crash Bandicoot on the PS1.

Which is really interesting given how limited the system was (1 MB to 2 MB of RAM, depending on what you were doing with the system!)

They later iterated on it for Jak and Daxter on the PS2. To quote the wikipedia page:

> GOAL does not run in an interpreter, but instead is compiled directly into PlayStation 2 machine code to execute.
>
> ...
>
> GOAL has extensive support for inlined assembly language code using a special rlet form, allowing programs to freely mix assembly and higher-level constructs within one function.
>
> ...
>
> It supports a long term compiling listener session which gives the compiler knowledge about the state of the compiled and thus running program, including the symbol table. This, in addition to dynamic linking, allows a function to be edited, recompiled, uploaded, and inserted into a running game without having to restart.

[https://en.wikipedia.org/wiki/Game_Oriented_Assembly_Lisp]

I almost don't believe it, honestly.

In most modern engines I've used we STILL struggle hard for a perfect live-reload system.

I've implemented many types of custom interfaces that let you tweak things on the fly in my games, and it's almost criminal how much better/productive it feels vs. the 'tweak, compile, test' loop we're usually stuck with.
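The thing that makes "tweak on the fly" work in GOAL-style setups is late binding: calls resolve through a mutable indirection, so a redefinition takes effect in the running program without a restart. A toy Python sketch of the idea (all names here are made up for illustration):

```python
# A tiny "live registry": game code calls through the registry by name,
# so re-registering an entry takes effect immediately in the running loop.
handlers = {}

def live(fn):
    """Register (or re-register) a function under its name."""
    handlers[fn.__name__] = fn
    return fn

@live
def jump_height(power):
    return power * 1.0

def simulate(power):
    # Look the handler up at call time, not at definition time.
    return handlers["jump_height"](power)

print(simulate(10))  # 10.0

# "Tweak on the fly": redefine without restarting the simulation.
@live
def jump_height(power):
    return power * 1.5

print(simulate(10))  # 15.0
```

GOAL went much further (recompiling and uploading machine code into the running game), but the late-binding principle is the same.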

All that said, I see the same reason why companies transition away from Lisp come up all the time:

> In all honesty, the biggest reason we're not using GOAL for next-gen development is because we're now part of Sony. I can only imagine Sony's shock when they purchased Naughty Dog a few years back, hoping to be able to leverage some of our technology across other Sony studios, and then realized that there was no way anyone else would be able to use any of our codebase.

It really makes me wish some big company would make a python clone in lisp, but put a tiny escape hatch in to fully utilize the lispy parts.


Hissp is a Lisp clone in Python, with a tiny escape hatch to fully utilize the Python parts. https://github.com/gilch/hissp


> It really makes me wish some big company would make a python clone in lisp, but put a tiny escape hatch in to fully utilize the lispy parts.

It’s not from big company but that describes clpython: https://common-lisp.net/project/clpython/index.html


that's basically Julia. take Lisp, remove the s-expressions, add a bunch of really good math libraries, and you've just reinvented Julia.


there was dylan before julia, so julia might just be reinventing dylan :) but that's not what's interesting

the julia project is in fact very interesting to me and has a great team developing its ecosystem, and i work with it alongside python for numerical work. however, one key drawback (compared to common lisp) for me is that it is not self-compiled: it is hosted on llvm, and over 30% of its repository is in other languages (mainly C and C++)[0]. but i am saying this only in comparison to common lisp. other languages are no different in this respect, and many are much worse. as far as scientific computing is concerned, i would work with julia over python any day

[0] https://github.com/JuliaLang/julia


It would be really cool to make Julia self-hosting. The easiest non-Julia part of Base to replace is the parser, which is currently written in femtolisp. Most of the C++ code is just LLVM, which we keep our own copy of because Julia has pretty specific LLVM version requirements that OS-provided versions are unlikely to satisfy. Getting the C code out would be interesting, since it's difficult to bootstrap a language that doesn't like being ahead-of-time compiled, but it would be a cool project.


Can I ask why it matters to you whether or not julia is self hosting?


because i want to be able to go turtles all the way down with my code. julia being hosted on llvm does not provide that. if you want to do optimisations in julia you must learn llvm[0]. in this sense hosted languages are always a front-end. common lisp is not this

[0] https://docs.julialang.org/en/v1/devdocs/llvm/


> in this sense hosted languages are always a front-end. common lisp is not this

Even self-hosted languages are front-ends for assemblers and microcode. What actually matters is being able to have arbitrary control over how code is executed. Julia has this in spades.

Julia has its own intermediate representations where you can write custom compiler passes and intercept and modify whatever you like (in pure Julia). You can think of these tools as being like macros, except instead of lexical extent they have a completely dynamic extent.

With these passes, you can arbitrarily change the meaning of any code recursively. E.g. You could define an execution context where multiplication and addition have swapped meanings, or you could locate every if-else statement in the entire callstack of a program and switch the if clause with the else clause. And again, this is not limited to the lexical extent of a macro, and it does not disable compilation. It is instead a programmable step in compilation.
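For flavor only, the "swap addition and multiplication" example can be mimicked in any language that exposes code as data. Here is a hedged Python sketch using the `ast` module (a source-level rewrite, not how Julia's compiler passes actually work, which operate on typed IR with dynamic extent):

```python
import ast

class SwapAddMult(ast.NodeTransformer):
    """Swap every + with * (and vice versa) in an expression tree."""
    def visit_BinOp(self, node):
        self.generic_visit(node)  # rewrite children first
        if isinstance(node.op, ast.Add):
            node.op = ast.Mult()
        elif isinstance(node.op, ast.Mult):
            node.op = ast.Add()
        return node

def run_swapped(src):
    """Parse an expression, swap its +/* operators, and evaluate it."""
    tree = ast.parse(src, mode="eval")
    tree = ast.fix_missing_locations(SwapAddMult().visit(tree))
    return eval(compile(tree, "<swapped>", "eval"))

print(run_swapped("2 + 3"))      # 6: + now means *
print(run_swapped("2 * 3 + 4"))  # 20: (2*3)+4 becomes (2+3)*4
```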

A classic example of this would be Zygote.jl [1], which takes a program and recursively generates code that computes the derivative of that program. Doing this requires you to modify literally every function call in the callstack, and you can do it all in pure Julia without ever knowing a lick of LLVM.
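A miniature of the "rewrite every operation to carry derivatives" idea is forward-mode AD with dual numbers. Zygote itself is reverse-mode and works on Julia's IR, so this Python sketch only illustrates the concept:

```python
class Dual:
    """A number carrying its value and derivative; arithmetic on Duals
    automatically propagates d/dx through every operation."""
    def __init__(self, val, eps=0.0):
        self.val, self.eps = val, eps

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.eps + other.eps)
    __radd__ = __add__

    def __mul__(self, other):
        # Product rule: (uv)' = u'v + uv'
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.eps + self.eps * other.val)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f at x with a unit perturbation and read off df/dx."""
    return f(Dual(x, 1.0)).eps

# d/dx (3x^2 + 2x) at x = 4 is 6x + 2 = 26
print(derivative(lambda x: 3 * x * x + 2 * x, 4.0))  # 26.0
```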

Granted, there is something ideologically appealing about the idea of a self-hosted language. I'm just arguing that, on a practical level, the compiler tools allow total control over code execution without changing languages, just as you'd get in a self-hosted language.

[1] https://github.com/FluxML/Zygote.jl


Julia is a language and its implementation.

Common Lisp is a language with multiple very different implementations.

Common Lisp is not 'self hosted' - many implementations are. Many are not.


as i said i think that Julia is an exciting project in the scientific computing space. my statements were not directed against julia or any other language. i just happen to think that common lisp also has great potential for scientific computing, one that can appeal to scientists who want to be software engineers also. for those that just want to crunch numbers right now python is a first reference then julia, imho. however, i sincerely hope that julia overtakes python

having the ability to be self hosted is definitely not ideological. [* here is a hypothetical question that can be asked: would a julia programmer be more powerful if llvm was written in julia? i think the answer is clear that they would be *]

common lisp is a language specification. it does not have an official implementation, although by far the most popular one is sbcl. in fact there is an implementation CLASP[0] that is hosted on llvm

[0] https://github.com/clasp-developers/clasp

[*...*] enclosed part added to the response subsequently to initial posting


Yes, and sorry if I also came off as combative here, it was not my intention either. I've used some Common Lisp before I got into Julia (though I never got super proficient with it) and I think it's an excellent language and it's too bad it doesn't get more attention.

I just wanted to share what I think is cool about julia from a metaprogramming point of view, which I think is actually its greatest strength.

> here is a hypothetical question that can be asked: would a julia programmer be more powerful if llvm was written in julia? i think the answer is clear that they would be

Sure, I'd agree it'd be great if LLVM was written in julia. However, I also don't think it's a very high priority because there are all sorts of ways to basically slap LLVM's hands out of the way and say "no, I'll just do this part myself."

E.g. consider LoopVectorization.jl [1] which is doing some very advanced program transformations that would normally be done at the LLVM (or lower) level. This package is written in pure Julia and is all about bypassing LLVM's pipelines and creating hyper efficient microkernels that are competitive with the handwritten assembly in BLAS systems.

To your point, yes Chris' life likely would have been easier here if LLVM was written in julia, but also he managed to create this with a lot less man-power in a lot less time than anything like it that I know of, and it's screaming fast so I don't think it was such a huge impediment for him that LLVM wasn't implemented in julia.

[1] https://github.com/JuliaSIMD/LoopVectorization.jl


great! i think that as far as scientific computing is concerned one has every right to be excited about julia. it is impressive how much work is being done in such a short time. i first came across it in 2015/6 and i am very happy about how its ecosystem evolved. i use julia and will continue to use it

this is what i would advise given the current state of affairs in machine learning

* use tensorflow / pytorch / pyro for industrial purposes. otherwise know what you are doing !

* use julia and its ecosystem for academic/research purposes. universities should be at the cutting edge of research

* use anything else if you want to experiment deeply. here is where i see the value of common lisp


for escape hatches, see cl4py: https://github.com/marcoheisig/cl4py/ and the other way around, calling Python from CL: https://github.com/bendudson/py4cl/


> Finding solid libraries is tricky since it's not very popular professionally.

If we're considering Clojure, then I would argue this point isn't true. Java interoperability in Clojure is really well supported and easy to accomplish, so leaning on the massive Java library ecosystem is possible. Also, anecdotally, a lot of Clojure libraries appear defunct but are actually just stable: they haven't needed updates in a long time because of the strong backward compatibility across Clojure versions.


Well, if we're going to venture that far into excuse land, Common Lisp has access to every C library in the world through CFFI.

Look, I love Lisp, but reality isn't going anywhere.


I haven't used CL so I can't comment on the differences between CL's C interop and Clojure's Java interop. But when I worked at a Clojure shop, "finding solid libraries" was never an issue, because we used Java libraries directly. And we usually preferred to use a Java library even when third-party Clojure wrapper libraries existed. Using Java from Clojure is natural and idiomatic.


This is a variant of the "Turing tarpit" fallacy - that is, saying that all things that are technically possible are equivalent.

Calling Java code in Clojure is completely seamless[1] - there's a world of difference from the CL FFI (which isn't that bad to use), which in turn is easier to use than the CPython FFI.

You might as well say that C is just as powerful as Common Lisp, because you can implement Common Lisp in it. Ergonomics matter.

The point of a library is that you're supposed to be able to use its interface without worrying about the code underneath - so, Clojure does have access to the entire Java ecosystem.

My comment about lack of libraries was mostly referring to non-Clojure Lisps - CL in particular, but my argument also applies to Scheme, Racket, and others.

[1] http://xahlee.info/clojure/clojure_calling_java.html


I certainly don't recall it as being seamless; Clojure is a very different language from Java, and the interop code I've seen doesn't look like the kind of Clojure anyone would write otherwise, hence the comparison to CFFI which has the same impedance issues.

But it's all JVM bytecode, so of course it's going to be somewhat smoother.


What's the excuse GP is making exactly?


>> Finding solid libraries is tricky since it's not very popular professionally.

> This one is so true it's not even funny

I'd say that's a bit out of date as Quicklisp has made a huge difference for CL. Obviously it's still not the most library-rich ecosystem but it's come a very long way.


Do you mind elaborating a bit on this? I've actually never heard of Quicklisp.


https://www.quicklisp.org/

It's a package manager for Common Lisp. It's supported in pretty much every implementation (probably every, but I won't say that with certainty).


Ohhh, duh. I've even used that before a few times but forgot lmao


I don't think you should use "Lisp" to mean Common Lisp in one reply and Clojure in the other. They are two very different languages. Though with the Armed Bear Common Lisp backend, you have access to the Java libraries.


> Getting Lisp to run as fast as C takes major effort when at all possible.

But Lisp is faster than almost every widely used language that's not C/C++/Rust, around Java speed.

> Lisp needs a lot of space to do its thing; and while it's certainly possible to downsize it, you're left with something that's not really Lisp anymore

Dunno, by modern standards it seems pretty small.

> Finding solid libraries is tricky since it's not very popular professionally.

Ish. Because it's so mature, lots of code works even if it's not updated constantly. There's code out there for most use cases, at least for back-end-y things.

> Unleashing the full power of Lisp in a diverse team is a recipe for an adventure, if not disaster

So enforce coding standards. This isn't a negative and lots of us work alone or in small teams.

Honestly, the only reason I don't mainly use Lisp is because I use an even slower/easier language because I don't really need a performant language. But if I find a problem where I want more performance Lisp is probably the next stop.


> There's code out there for most use cases, at least for back-end-y things.

I'm not sure about that. These days in the backend you need a lot of libraries for cloud providers and new databases/services. For example, is there a Common Lisp library for AWS/Azure/Google cloud? Is there one for Clickhouse?


I've come across Common Lisp SDKs for AWS, there are Heroku buildpacks, there are Kubernetes clients listed on Kubernetes' website, etc...

Clickhouse doesn't have an official CL client, but they have a CLI, TCP and HTTP interfaces, a C++ interface, etc... all listed on their website. It's not that hard to connect to an API without a library.
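As a concrete example of going library-less: ClickHouse's HTTP interface (default port 8123) accepts SQL in a `query` URL parameter, so a client is little more than URL construction plus an HTTP call. A hedged Python sketch (the host is a placeholder; nothing is actually sent here):

```python
from urllib.parse import urlencode
from urllib.request import Request

def clickhouse_request(host, sql, port=8123):
    """Build an HTTP request for ClickHouse's HTTP interface.
    The SQL travels as the `query` URL parameter; sending it is a
    plain urlopen() call."""
    url = f"http://{host}:{port}/?{urlencode({'query': sql})}"
    return Request(url)

req = clickhouse_request("localhost", "SELECT 1")
print(req.full_url)  # http://localhost:8123/?query=SELECT+1
```

Any language with an HTTP client and URL-encoding, Common Lisp included, can do the equivalent in a few lines.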

But yes, there's more Java/Python/Go/Ruby/JS stuff pre-built than Common Lisp. But you'll also find more CL stuff out there than almost any esoteric language, probably because it's easy to build stuff in CL.


Related to this, there's an AWS SDK for Clojure [0] (created by the same people who are behind Clojure), which is generated from the AWS specs themselves. Carmine [1], a popular Clojure library for Redis, does something very similar. There's also a library for generating SDKs from Swagger specs [2]. I implemented a similar solution (generating Python bindings for some of our APIs from Postman exports), and it was super simple, really.

I suspect doing the same in CL would be similarly simple.

[0] https://github.com/cognitect-labs/aws-api

[1] https://github.com/ptaoussanis/carmine

[2] https://github.com/oliyh/martian
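The spec-driven approach is appealing because the generator is mostly a loop over operations. A sketch in Python, with a completely made-up mini-spec format standing in for OpenAPI/Swagger or AWS's service JSON (the transport is injected so the sketch stays self-contained):

```python
# Hypothetical, simplified API spec -- real generators read OpenAPI or
# AWS service descriptions, but the shape of the work is the same.
SPEC = {
    "get_user":    {"method": "GET",  "path": "/users/{id}"},
    "create_user": {"method": "POST", "path": "/users"},
}

def make_client(base_url, spec, send):
    """Generate one method per operation in the spec.
    `send` is the transport, a (method, url) -> response callable."""
    class Client:
        pass
    for name, op in spec.items():
        # Bind `op` via a default argument to avoid the late-binding
        # closure pitfall inside the loop.
        def call(self, _op=op, **params):
            url = base_url + _op["path"].format(**params)
            return send(_op["method"], url)
        setattr(Client, name, call)
    return Client()

# Fake transport that just echoes the request it would have made.
client = make_client("https://api.example.com", SPEC,
                     send=lambda method, url: (method, url))
print(client.get_user(id=42))
# ('GET', 'https://api.example.com/users/42')
```

The same few dozen lines would translate almost directly to Common Lisp (a loop over the spec emitting `defun`s), which is why doing it in CL should indeed be similarly simple.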


If performance was such a problem python wouldn't have a subreddit.

I think a lot of Lispers like handling various paradigms in their head: they start with basic Lisp, then resort to EDSLs to reach more appropriate semantics/mechanical sympathy. Just like people call out to C wrappers, mostly.

The social side of Lisp I can't speak to for sure, but I'm sure it's fuzzier than it seems. I just talked to a dude saying his new guy was a Clojurist and that his thinking is way finer than the rest of the team's.


And yet there always seems to be another discussion about how to re-implement CPython so that programs run faster.


hehe. i remember a thing posted about python-on-guile, a rather ambitious project implementing python by compiling it to guile scheme.

The scheme code it produces is pretty awful. Not because it is bad, but because python is extremely unidiomatic scheme, and if you want to keep python semantics that is where you have to go.

And despite being a one-man job, despite being run on a far-from-the-fastest scheme, despite not giving the optimizer any chance to do the regular optimizations, it was about 1.5x faster than cpython.

It was on numerical code, so beating cpython is maybe not very hard. I found it funny nonetheless.


SBCL is already about as fast as Java and can get most of that speed without very much time optimizing everything.

Java has hundreds (thousands?) of developers working on it.

SBCL has a handful of part-time devs.

What could SBCL become with all those man-years behind it?

Custom syntax is hardly the big issue when using Common Lisp. I'll take that any day over the dozens of layers of useless abstraction in Java projects. I'll take it over the C code golf or "what does this undefined behavior do?". I'd even take it over the piles of garbage JavaScript I've waded through over the years.


> What could SBCL become with all those man-years behind it?

Whatever stopped that happening in the past, still stops that happening, and will continue to stop that happening in the future.


I cut my teeth on Lisp, but I've found that happiness comes from compile time guarantees. Maybe those two things aren't mutually exclusive, but a thoroughly robust type system seems like a big ask in such a dynamic language. None of the solutions I've encountered have satisfied me, but in fairness I haven't looked that hard.


I'm fairly new to Lisp programming, but there's a language called Coalton that provides static type checking to Common Lisp. I believe the two are essentially one and the same language: Coalton provides some type guarantees while still having all of the interactiveness CL devs are used to.

https://github.com/coalton-lang/coalton


I've been here before

I came to Clojure from F#/ReasonML/C#/PHP/JavaScript, so I understand the comfort in specifying something with a type system, what it's like to not have one, and what it's like to have a weak/strong one

Compile time for Clojure is when you inject new code into your running program

Imagine you have a large codebase with lots of interconnected types, and then you tasked the computer with checking all those types every time you injected code

Would that slow down the code injections for you and every other developer? Yes, maybe. What about checking for probable errors? Again yes, but slower. The feedback loop in Lisps is what makes them feel magic, so this would be a problem

What could we do to maintain dev speed and confidence?

This is my current approach:

First is clojure-lsp (clj-kondo): it checks for silly Monday-morning mistakes, and does it in a separate process, so I can code-inject unhindered but still spot errors via my editor

Second is a new library called hyperfiddle/rcf: inline tests that run on code injection, so I can write any static assertion I want about data or functions

They run at "compile"/code-injection time and can be solidified into "real" tests at any time. Maybe more importantly, they serve as great communication for how to use functions and what kind of data you can expect to flow through them, close to the original definitions

For me, those things combined with TDD, the REPL, the static analysis from IntelliJ, and writing real tests every now and again are enough confidence, and I'm in control of the "compile"-time cost, not the language

So when it comes to green/red cycles in TDD, the REPL (code injection) is great for creating code/solutions, and hyperfiddle/rcf is great for creating the safety harness required to fearlessly refactor, all triggered from inside the editor. It's really addictive


Some CL implementations (including SBCL) have optional static type checking around function boundaries.

Barring that, Julia is a lispy language with great support for optional typing.

Both of the above will get potentially better performance when you add types (the checks are (mostly?) done at compile time). Racket also has optional types, but I think those contracts are checked at runtime? I am not sure.


I have 0 experience here but that's really surprising to me.

I'd think out of any language lisp would shine here.

You could write as complex a type system as you wanted, with compile-time checks, so it's surprising someone hasn't written one that mimics at least Java/C++/whatever-style types.

I do think that is one of the downsides to Lisp, though. When you can do almost anything, it's very hard to agree on what to actually do.


> You could write as complex a type system as you wanted, with compile-time checks, so it's surprising someone hasn't written one that mimics at least Java/C++/whatever-style types.

There's at least Typed Racket.

> I have 0 experience here but that's really surprising to me.

> I'd think out of any language lisp would shine here.

I don't think you should be surprised that people don't want to write their typecheckers as macros. A few people might want to do that, but most people don't. And at this point you're not far from just creating a new language, which you could create in ML or a descendant; those have always been among the most popular options for that.


It’s a lot more practical than you think if you’re willing to roll up your sleeves and do the work.

Fortunately, people have already done the work for you. [1]

[1] https://github.com/coalton-lang/coalton


I don't think it would be more practical than just using a typed language in the first place.


Not really getting the argument, the above lib pretty much makes it a superset that adds types.

And while I don't know the scope of the library, if there's a way to disallow dynamic features, then you're already starting from having a typed language (with one library dependency).

Only now you can keep building up the type system in ways that benefit your project, unlike in most non lisp languages.


Shen is an example of that. In its original form, Qi, it was written for Common Lisp, and it was later ported to other languages.

http://shenlanguage.org/


Wow, I'm glad that they opened up the documentation. I liked what I saw a few years ago, but I'm not getting behind a language where all the usable docs are proprietary.


Can I be extremely shallow? Because the language feels like it's from the 60s. Like writing Fortran or Ada or Algol in 2021. C is timeless, it doesn't count.

I love lisps, but Racket, for example, feels more modern than Common Lisp to me. And for shallow people like me, feel and ergonomics are very important when writing code.


> C is timeless, it doesn't count.

Of course C counts. IMO it is a language that should have stayed in the last century.

> feel and ergonomics are very important when writing code

This is the reason why I can't stomach languages like Java, C++, etc. I feel like I'm fighting a very constrained interface.


C's successor is either Go or Plan9/9front's C. Outside of "Unix 2.0", you have Go.

Please, do not compare Java and C++ against C. C++ is a monster that should never have been born, and Java is a shitty TCL "alternative" from its era.


If anything, Zig or Rust is the successor; Go is a language from the 90s. We can agree on C++ and Java, though.


Go is a rehash of Alef/Limbo. It's not from the 90s.

If you say so, then we could say the same of Zig.

BTW, UTF-8 dates back to the 90s; good luck finding a better alternative. And TCP/IP is even older, but it works.

Something being newer doesn't make it better.


When we're talking about strong type systems and classes of bugs eliminated just because the program compiled (a la OCaml, Rust, Haskell)... yes, newer is better.

Your argument isn't universal; something being older doesn't say much about its quality either.


Could you elaborate a few specific points on this?

Curious because I'm on the same boat.

I was annoyed for a while by how Ruby convention uses '?' characters at the end of method names that return a boolean.

I'm not even that old and I still want to yell `[a-zA-Z0-9_]+` identifier names or get off my lawn.


In Julia, the convention is that a function like "swap!" modifies its input. It's extremely handy. Also, "odd?" is just an elegant way to name a predicate.


I saw that convention in IBM/NAG's Scratchpad/Axiom computer algebra system which was designed in the 80s. It may go further back than that.


Why would you want to limit yourself so drastically in what you can express with function and variable names?


I don't, I always follow whatever convention the language/code base uses.

I'm just saying like GP I'm shallow and prefer C style identifiers where it makes sense to.


I don't think this is drastically limiting, it's a sane subset that gets you 99% of what you need from a function name.


We use punctuation and symbols in typography and writing to provide clarity. Why not use it in our programs?


Programs? Aren't we talking about variable and function names?

But for the general case, Lisp does use punctuation and symbols. It just tends to use less. The other side of the balance is operators like ~+#>. Most languages are in the middle ground. All of these options are fine and have their uses. I wouldn't like to see stuff like ~+#> every day, but I like being able to implement <.


Programs are composed of those things.


Sure, but you don't use a lot of punctuation in identifier names. For example, in JS:

    function sumOfArray(arr) {
        return arr.reduce((sum, element) => sum + element, 0)
    }
Now in Scheme:

    (define (sum-of-list lst)
        (reduce + 0 lst))
Both have punctuation, but usually not in the identifiers (except for - and _)


In fact we do, but as in typography you do not use punctuation to write words.

Then again, I really like '-' in symbol names.


C was already pre-historic in the 1990's, of course it counts.


It's an IR for a never completed language.



no popularity, no money.



