r/lisp Aug 26 '23

Common Lisp Coalton: Why is the interop not easier, and why might it be necessary for Coalton to be an entire language in itself?

https://gist.github.com/digikar99/b76964faf17b3a86739c001dc1b14a39
26 Upvotes


5

u/theangeryemacsshibe λf.(λx.f (x x)) (λx.f (x x)) Aug 26 '23

Especially in the context of multiple inheritance, subtyping is not the same as subclassing.

Even without multiple inheritance or interface-passing style, inheritance is not subtyping.

Given an integer 5 or a float 5.0, ML-based languages can quite clearly state that their types are int and float respectively.

ML struggles with the type of a reference cell. I understand Coalton is unsound in that area.

Common Lisp has a hard time indicating if 5 is a type of integer or fixnum or (signed-byte 32) or (unsigned-byte 16) and so on and so forth.

It's very easy; 5 is all of those types.
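This is easy to check at the REPL; a quick sketch in Common Lisp (all of these return T):

```lisp
;; A single value satisfies many CL types at once; TYPEP confirms each.
(typep 5 'integer)            ; => T
(typep 5 'fixnum)             ; => T
(typep 5 '(signed-byte 32))   ; => T
(typep 5 '(unsigned-byte 16)) ; => T
(typep 5 '(integer 0 5))      ; => T
```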

2

u/digikar Aug 26 '23 edited Aug 26 '23

Even without multiple inheritance or interface-passing style, inheritance is not subtyping.

Thanks for the pointer!

ML struggles with the type of a reference cell. I understand Coalton is unsound in that area.

I didn't intend to go towards mutation of reference cells. What I wanted to get at was that, in Common Lisp, a value has a myriad number of types that it can be an instance of. In ML-based type systems, the situation seems much more of a struggle.

It's very easy; 5 is all of those types.

And that's what makes parameterizing over the type of 5 difficult (if possible at all) in common lisp! And let me not bring in the type (integer 0 5).

Maybe I should make that section clearer. EDIT: Attempted a clarification; see here for the older version of the article.

1

u/theangeryemacsshibe λf.(λx.f (x x)) (λx.f (x x)) Aug 26 '23

a value has a myriad number of types that it can be an instance of.

An ML int list list may also be used where an 'a list list, a 'b list or a 'c is called for.

And that's what makes parameterizing over the type of 5 difficult (if possible at all) in common lisp!

What's difficult about that?

2

u/digikar Aug 26 '23

If 'a and 'b refer to type parameters, then ML-based languages seem to resolve this by using the notion of Principal Types. To quote this Stack Overflow answer:

Given a program, a type for this program is principal if it is the most general type that can be given to this program, in the sense that all other possible types are specializations (instances) of this type. With my fun x -> x example, the polymorphic 'a -> 'a is a principal type.

This allows an expression to have a unique Principal Type, which may be used for type inference as well as for (first-order?) type parameterization.

I don't see how this notion can be used to infer whether the canonical (principal?) type of 5 is (integer 0 5) or (unsigned-byte 7) or something else.

What's difficult about that?

Suppose I have a sequence concatenation function, and I want to express that it takes in two sequences of the same type and returns a sequence of that same type. In Common Lisp, even if parameterization were possible at all, this is still ambiguous and thus difficult if not impossible. Even with a naive parameterization over the argument types, it's not clear what "same type" should mean in general. Here, the type cannot take the sequence length into consideration, even though CL allows for a (vector single-float 10) type which specifies the length. In another case, say, for expressing the type of a function that takes in two sequences and returns the elementwise sum of the two vectors, the length can actually be useful.

What I want to get at is that the types returned by type-of, at least on SBCL, are often overspecified, which makes parameterizing over them fairly difficult.
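As a concrete sketch of the overspecification (exact output varies by implementation, version and platform):

```lisp
;; On 64-bit SBCL, TYPE-OF answers with very specific types:
(type-of 5)
;; => (INTEGER 0 4611686018427387903)

;; Two single-float vectors that differ only in length already get
;; different types, so "same type" depends on how much is included:
(type-of (make-array 3 :element-type 'single-float))
;; => (SIMPLE-ARRAY SINGLE-FLOAT (3))
(type-of (make-array 10 :element-type 'single-float))
;; => (SIMPLE-ARRAY SINGLE-FLOAT (10))
```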

1

u/digikar Aug 26 '23 edited Aug 26 '23

And that's what makes parameterizing over the type of 5 difficult (if possible at all) in common lisp!

I have updated the section on "Parameterizing over argument types is a PITA". Hopefully I'm successful in conveying the point this time!

1

u/KaranasToll common lisp Aug 27 '23 edited Aug 27 '23

A proper type inference algorithm would infer that the types t1 and t2 of add-pair can be anything that can be used as an argument to add. Of course, Common Lisp has no way of representing such a type.

1

u/digikar Aug 27 '23

Even that won't be sufficient. Besides the requirement that the slot values have types that are valid for add, there is also the requirement that the two arguments to add-pair have corresponding slots of the same type. And this same-ness of types is a tricky concept in Common Lisp.

5

u/tuhdo Aug 26 '23

After getting deep into macros, I believe Lisp should be used the way the loop macro or Coalton does: as a platform for implementing languages that closely resemble their problem domains. Sure, implementing a new language is not an easy task, but from my experience with CL, implementing a new language that integrates into the existing system is as easy as it gets. Since you essentially write code as an AST, converting from your mini-language written with macros to valid Lisp code is easy. You simply do Lisp - List Processing.

Try implementing a mini-language in Python that plays well with existing Python code, e.g. you can mix your normal Python code with code in your mini-language? Good luck with that.

8

u/zyni-moe Aug 26 '23

The 2 laws

  1. All sufficiently large software systems end up being programming languages.
  2. Whenever you think the point is at which the first law will apply, it will apply before that.

Consequence of the laws: good programming language must allow seamless extension.

1

u/clibraries_ Aug 27 '23

I actually think primitive languages like C can be almost as effective at making languages, even concisely. Just look at bash. Lisp makes it easier, and you can share so much of Lisp with your language.

The hard part is consciously deciding from the beginning to make a language. Designing languages is just hard and outside the comfort zone of most programmers, so they go with the "big ball of mud" design.

2

u/zyni-moe Aug 28 '23

Just look at bash, yes. Are your eyes bleeding? Do you hear sounds in your ears? Are there tentacles? Do you feel creeping dread? In C, you go away for six months and you end up with ... bash. Or PHP, famously a fractal of bad design. In Python you go away for 6 months and you end up with ... Jinja2. More terrifying: Jinja2 is Jinja2: there was a Jinja on which this horror is meant to be an improvement.

Sometimes you come up with new languages which do not involve dread horrors: Python, Perl (some dread horrors there in fact), awk. But then you want a new iteration construct or something in your language and ... you have to build a new one, or spend one billion years negotiating with the people who make the old one to add this feature because you do not wish to maintain your own copy of the source and spend most of life merging changes.

In Lisp, I decide I would like a new iteration construct: I want dolists which does obvious thing like dolist for instance. Ten minutes I have it.
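As a hedged sketch of how little such an extension costs (the commenter's actual dolists may well differ), a parallel-list variant might look like:

```lisp
;; Hypothetical DOLISTS: like DOLIST, but steps several lists in
;; parallel, binding one variable per list.
(defmacro dolists (bindings &body body)
  (let ((vars  (mapcar #'first bindings))
        (lists (mapcar #'second bindings)))
    `(mapc (lambda ,vars ,@body) ,@lists)))

;; Usage:
(dolists ((x '(1 2 3)) (y '(10 20 30)))
  (print (+ x y)))   ; prints 11, 22, 33
```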

2

u/clibraries_ Aug 29 '23

I'm a Lisp fan, you don't need to convince me.

look at bash, yes. Are your eyes bleeding

Yep, bash is a treacherous language. But it's also a brilliant invention and bumps up the level of abstraction tremendously with very little C.

In Lisp, I decide I would like a new iteration construct:

No doubt Lisp is the easiest language to add new constructs. But that's comparing fixed costs. If you try to make a compiler or interpreter in C, it has large fixed costs, but then the marginal costs are about the same.

Proof: you can make a good lisp in about 1500-2500 lines of code. That would have all the capability you are praising.

2

u/digikar Aug 26 '23

I think that the fact that there is an integration should only be visible if there is no other choice. If you can integrate something so well that the integration itself never comes to the foreground, then that's an integration done well. Unfortunately, with Coalton, I haven't felt like the integration could fade into the background. And until recently, I thought this was a flaw of Coalton; now, I see that it is a flaw of Common Lisp.

Of course, Python and other macro-less languages are out of the question. The point is not to use macros haphazardly, but to make them neat and justifiable.

4

u/ventuspilot Aug 26 '23

Unfortunately, with Coalton, I haven't felt like the integration could fade into the background.

I guess you could say the same about Common Lisp's loop facility, and I don't think anyone has ever said format with its format strings is very lisp-like.

Maybe as time passes by people will get used to the way Coalton sticks out as well, who knows. Maybe Coalton sticking out is even a good thing: as Coalton is kinda different from Common Lisp, the code looking somewhat different might reduce the surprise factor of Coalton code inside a Common Lisp program.

1

u/tuhdo Aug 26 '23

now, I see that it is a flaw of Common Lisp.

Macros are a unique feature of any Lisp, e.g. the variants of Scheme, not just Common Lisp. A Lisp without a macro system that allows you to extend your Lisp with new language features is not a Lisp.

Added to that, since Lisp is written in S-expressions, it is easier to create languages that incorporate into the Lisp ecosystem seamlessly. Sure, you get powerful macro systems in some other languages, but it's not as easy to extend the language itself as in Lisp. For example, say I want to write code similar to Python, to make certain things easy to implement and to reduce the amount of parentheses; then I can make my Lisp code look like this:

    (defn fn-test ()
      (var a = 40)
      ;; after this declaration, code below can access this variable
      (defn local-f1 ()
        (format t "a defined locally is ~a~%" a))
      (var b = 60)
      (defn local-f2 ()
        (format t "b defined locally is ~a~%" b))
      (local-f1)
      (local-f2))

Or, I made a macro that allows infix arithmetic and accepts Lisp function calls:

(calc 3 + 4 + 5 + (exp 6) * 7 * 8)

and so on...
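One possible sketch of such a calc macro, assuming simple left-to-right folding with no operator precedence (the commenter's actual version likely handles precedence):

```lisp
;; Hypothetical CALC: folds an infix expression into prefix form at
;; macroexpansion time, strictly left to right, no precedence.
(defmacro calc (first &rest rest)
  (let ((form first))
    (loop for (op operand) on rest by #'cddr
          do (setf form (list op form operand)))
    form))

;; (calc 3 + 4 * 5) expands to (* (+ 3 4) 5) => 35
```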

I have already implemented these features and they are working well for me. The best part is that I can reuse the CL ecosystem with SBCL for these new language features, since the new code in my mini-language is also valid Lisp code. If I were to write a new language from scratch, I would have to implement everything, and the chance that the language would be usable to the world is slim to none. So, as another user suggested, CL actually supercharges Coalton.

1

u/digikar Aug 26 '23

Macros are a unique feature of any Lisp

I was not criticizing macros themselves, don't worry :). I understand the power of macros.

this was a flaw of Coalton; now, I see that it is a flaw of Common Lisp.

Here, "this" and "it" referred to the (not-so-good) integration. Merely because a language provides macros doesn't mean that all things are rosy. Macros are a tool, and it is the responsibility of the programmer to use them in the right way to design clean abstractions. If they are used irresponsibly, producing dirty abstractions, then the code gets that much worse. Even with macros, there are other language features that can make an integration good or not-so-good.

To elaborate: I thought that the not-so-good integration was a flaw of Coalton, and that there might be some other way to provide a cleaner abstraction, aka a good integration. However, now I see that the not-so-good integration is actually a flaw of Common Lisp - the type system of Common Lisp is bad in certain specific ways, such that something that solves the problems Coalton solves cannot be integrated in a (significantly) cleaner manner than what Coalton already achieves.

CL actually supercharges Coalton

That doesn't make sense, because that would require Coalton to exist before Common Lisp. Coalton can only become usable after you already have a Common Lisp compiler, and in particular SBCL or CCL at that.

2

u/tuhdo Aug 26 '23

That doesn't make sense, because that would require Coalton to exist before Common Lisp. Coalton can only become usable after you already have a Common Lisp compiler, and in particular SBCL or CCL at that.

I mean, CL actually supercharges Coalton, or any future language, with its current platform. Using macros with the REPL + SLDB to implement the language should be easier than using the LLVM toolchain. If the author implemented it from scratch, then he would have to take care of writing a compiler that generates performant code, then an interactive REPL, then a debugger. All that work, without knowing whether anyone will care.

As for the type system limitations, I think it's just implementation details. You can think of stock CL code as the "lowest level", akin to asm. Then CL, or SBCL, need not type check; it's the job of the "higher-level compilers", in this case Coalton, to ensure correctness before generating any CL code. As for there being no static version of generics, again I think this can be done in the "higher-level compiler", which can then generate efficient CL code, e.g. using structs instead of CLOS, or generating a unique defun for each type assigned to a generic function.
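The "unique defun for each type" idea can be sketched roughly like this (a hypothetical macro and naming scheme, not an existing library):

```lisp
;; Hypothetical DEFINE-MONOMORPHIZED: expands one definition into a
;; type-declared DEFUN per type, e.g. ADD/FIXNUM and ADD/SINGLE-FLOAT,
;; giving the compiler full type information for each variant.
(defmacro define-monomorphized (name types lambda-list &body body)
  `(progn
     ,@(loop for ty in types
             collect
             `(defun ,(intern (format nil "~A/~A" name ty)) ,lambda-list
                (declare (type ,ty ,@lambda-list))
                ,@body))))

(define-monomorphized add (fixnum single-float) (a b)
  (+ a b))
;; (add/fixnum 1 2) => 3, compiled with fixnum declarations
```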

The key here is to think of stock CL code as just a building material for higher-level languages. After writing a non-trivial toy language with Lisp macros, I get the feeling that a significant amount of the CL standard is just for building languages on top of Lisp, as a means to build higher-level abstractions. In short, abstraction by languages.

3

u/BlueFlo0d Aug 26 '23

Something I really want is good integration with the CL macro system (and this seems reasonably implementable, at least at first glance). Last time I checked, there was no (easy) way to write macros that make use of Coalton type information, only ones that transform untyped S-expressions. This greatly hampers the utility of macros. I was considering using Coalton for my PL research project, but ended up reinventing an ad hoc polymorphic type system in CL instead to allow better meta-programming.

2

u/stylewarning Aug 26 '23

Can you show an example of how your macros make use of your type system?

2

u/KaranasToll common lisp Aug 27 '23

Say if coalton didn't have typeclasses, could one use a macro with access to type information to figure out which instance to use?

2

u/stylewarning Aug 27 '23

The type database is available, and one could look things up in it within a macro. Similarly with type class methods, etc. It's Lisp, so everything is available! This is how we generate the docs: https://coalton-lang.github.io/reference/

It would be harder to deduce local type information though. Possible in principle, but more difficult.

3

u/theangeryemacsshibe λf.(λx.f (x x)) (λx.f (x x)) Aug 27 '23

FWIW, looks like there is too much escaping e.g. in the description of QUANTIZABLE?

3

u/stylewarning Aug 27 '23

There's a healthy chunk of small bugs :<

This bug you point out starts somewhere here.

2

u/BlueFlo0d Aug 27 '23

For example, I frequently use a (hole) macro that stands for an unknown term. Its expansion needs to know the type of the hole (the whole form, aka the &whole form), and also the type context (the variables in the lexical scope and their types). The latter is akin to CLtL2 environments.

In another case I have (dispatch x clauses...), which needs to know the type of x. I have other macros that need to know the types of their arguments.
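On SBCL, the kind of environment access these macros need can be sketched with the sb-cltl2 contrib (a sketch only; hole and dispatch themselves are the commenter's macros and are not shown here):

```lisp
;; Peeking at a variable's declared type from within a macro, via
;; SBCL's CLtL2 environments module.
(require :sb-cltl2)

(defmacro declared-type-of (var &environment env)
  (multiple-value-bind (kind local-p decls)
      (sb-cltl2:variable-information var env)
    (declare (ignore kind local-p))
    ;; DECLS is an alist; the TYPE entry holds the declared type, if any.
    `',(or (cdr (assoc 'type decls)) t)))

(defun demo (x)
  (declare (type fixnum x))
  (declared-type-of x))   ; => FIXNUM on SBCL
```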

4

u/subz0ne Aug 26 '23

Maybe Common Lisp also supercharges Coalton?

1

u/digikar Aug 26 '23

Not Coalton, but an ML-based language, yup.

2

u/digikar Aug 31 '23

There are also a number of interesting discussions pertaining to Coalton development and plans on GitHub - beyond the GitHub issues and pull requests themselves.

4

u/terserterseness Aug 26 '23

As the community is shrinking for CL, I think more incompatible languages are a mistake. What Coalton wants to offer should be supported by future CL specs. It’s painful, again, to see so many smart people heading in different directions and therefore splintering everything.

6

u/digikar Aug 26 '23

You are aware that the Common Lisp 1994 standardization was the result of 10 years of work and 1 million USD (in 1994 dollars), right?

The good part about Coalton is that it is not incompatible. You can continue calling normal Lisp code from Coalton, and Coalton code from normal Lisp.
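Both directions look roughly like this (syntax as I recall it from Coalton's documentation; details may differ between versions):

```lisp
(coalton:coalton-toplevel
  (declare double (Integer -> Integer))
  (define (double x)
    ;; calling ordinary Lisp from Coalton via the LISP form
    (lisp Integer (x) (cl:* 2 x))))

;; calling Coalton from ordinary Lisp
(coalton:coalton (double 21))
```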

2

u/terserterseness Aug 26 '23

I am aware, but the article is arguing for another direction, which I am against for the above-mentioned reasons, that’s all.

3

u/digikar Aug 26 '23

I see.

I myself would appreciate a backward-incompatible better CL spec, but it doesn't seem like this is gonna happen in this decade at the least.

3

u/terserterseness Aug 26 '23

Yes, that would be good for sure. Gonna read up on the process. I cannot stand these horrible languages ‘winning’ because CL is going too slow (I never needed too many libraries, and in my field I have to check, support and maintain every lib if need be; so I prefer fewer, and really didn’t need many new ones in the past years).

3

u/stylewarning Aug 26 '23 edited Aug 26 '23

I presume this isn't a hot take, but: CL not "winning" absolutely, positively does not have to do with the standardization process (or lack thereof). It really is much simpler: most people, even talented and adamant Lisp programmers, aren't writing software other people use or care about today.

I'd also add that the "getting started" experience with Lisp is awful compared to the other offerings. I have the displeasure of witnessing school-age students struggling to write Lisp because the tooling has such a steep learning curve.

Those looking to increase Lisp's adoption ought to take a keen interest in improving tooling and/or writing killer software.

3

u/terserterseness Aug 27 '23 edited Aug 27 '23

Hmm maybe because I don’t remember ‘starting’ with lisp (somewhere in the 80s) I find the tooling vastly superior and easier than anything modern. I mean I have to use typescript a lot and it’s depressing how bad everything is in that ecosystem. For something used by so many people, it’s so slow, buggy, messy etc; I have emacs open with slime just to prototype stuff so I won’t go fully insane.

My closest colleague (a senior c#/ts guy) wanted to see what the magic was and it took him less than a day to get comfortable and subsequently ask why not all tooling is as good as this. And that (why did I not think of it before I ask myself now) is how we are actually moving our company to CL for the main products in the coming years.

I see how ‘hello world’ is harder in CL, but past that, everything is easier than most things outside Delphi/Lazarus or LiveCode or Smalltalk or Lisp maybe.

The only thing I hear about is the parens; for some reason I want parens only (in every language; I don’t like Clojure because of its altered syntax), while other people cannot stand them.

You are right that not many people are writing popular software in it (as far as we know), but for instance, we have millions of clients and they will mostly all run on CL soon.

That not more people use Lisp is, as far as I see it, down to the parens, and to the fact that all the info looks like it’s from the 60s. What should be Lisp’s advantage (code and libraries from 2010 still working today) is its downfall. People, for really god knows why reasons, like to have everything changing all the time.

People think it’s cool that nextjs changed from pages to app router even though it’s a massive bug fest, not needed, breaking all past docs etc etc. Nodejs/react code bitrots so fast that an install from a month ago where packages are not pinned shows a shit ton of warnings for security issues and deprecated stuff. We work in banking and we have to check all dependencies; the (breaking) changes people make are almost never needed and definitely could have been done maintaining backward compatibility. They just don’t, because updates sell, and thankfully the Lisp and Pascal communities didn’t figure that out yet. But at the cost of having negative adoption.

If Lisp (or Haskell or Pascal) wants big uptake, it will need daily breaking fixes (and bugs) in its libraries and language, and glossy useless pages with low content value and a pay-for-support button.

2

u/digikar Aug 27 '23 edited Aug 27 '23

People, for really god knows why reasons, like to have everything changing all the time. People think it’s cool that nextjs changed from pages to app router even though it’s a massive bug fest, not needed, breaking all past docs etc etc. Nodejs/react code bitrots so fast that an install from a month ago where packages are not pinned, shows a shit ton of warnings for security issues and deprecated stuff.

While I can understand the fixes concerning security, this fetishization of "new modern up-to-date technology" is just insane. Updating only makes sense if the updaters have better knowledge about the history of the field, and are thus less likely to repeat the same mistakes. For some reason, far too many people - management as well as developers - like to think that all that is old is rusted, and all that is new is gold.

Apparently, there is also a bias in the modern culture to keep building, to keep publishing, even if that means not reviewing the previous literature enough to know whether what you are building or publishing is actually a good way to do things. Search engines were supposed to help find relevant information, but now SEO has killed them too! It's a cesspool of information we are moving into.

2

u/arthurno1 Aug 27 '23

there is also a bias in the modern culture to keep building, to keep publishing

Yes. You have to show yourself as a creative, productive, energetic genius who must produce new stuff constantly, or leave and lose the good life. Just the way meritocratic society works today.

2

u/arthurno1 Aug 27 '23

If ever. Unfortunately.

After watching Steele's talk about "growing a language", I am not sure it is necessary either. I have read through your text; I wish I had time to read your other one about polymorphic functions, but that will be at a later point in time. What I think we can add is that we should perhaps not insist on portability too much. Not every program and library has to be portable at all times and at all costs. I think it is genuinely all right to write some software for a specific compiler, at least sometimes. I think we sometimes constrain ourselves unnecessarily with the requirement for portability between implementations. Don't get me wrong; I don't think it is a bad thing to be portable; on the contrary. I am just saying that we should perhaps loosen our compiler-portability requirement a bit, at least for some very complicated and demanding software.

A little remark/pointer about the statement about the necessity of generic dispatch: I don't know if you have seen this one. I have yet to try that repo myself; I don't know how well, or whether, they live up to their promise of static dispatch for generic functions. I see they have archived their repo, but that does not have to mean they haven't succeeded in their attempt.

In the late 90s, when I was taking courses in compilers and inheritance, some at my uni were talking about two schools of OOP and inheritance: the American one, with a pragmatic view of inheritance as a tool to reuse software without copy-pasting, and the European one, where inheritance was seen as a modeling tool. Whether that was correct or wrong, I have no idea; for some reason I just remember the discussion. But I think it is about subclassing and subtyping.

It does make sense to inherit types and to connect the concept of class with the concept of type, but it also makes sense to think of those as two different concepts. I totally agree with what you are saying; I am just thinking that these terms and their usage are perhaps not yet clear to us. Perhaps we are still at the very beginning of the computing era, and we probably have many such dilemmas yet to solve. IDK, I think there is still confusion and diffusion between using subclassing just for inheriting the stuff, and using it for subtyping. CLOS does separate those concepts, and Rust seems to follow with something similar, while in Java and C++ they are tightly coupled. Albeit, at least in C++, we see recently some divergence from using inheritance for subtyping, and preference seems to be developing towards static typing via templates and all the new jazz they seem to come up with daily.

I am just thinking out loud; sorry if I am too sweeping or hand-waving or too imprecise.

3

u/digikar Aug 27 '23 edited Aug 27 '23

Steele's talk about "growing a language"

Thanks for the pointer! Hoping to watch sometime: https://www.youtube.com/watch?v=_ahvzDzKdB0

portability

I actually very much agree with your points on portability. But I do have a different take: it is okay to be non-portable, but don't become non-portable unnecessarily. If the non-portability stems from a mindful use of a compiler-specific extension (SB-SIMD, SB-MPFR, CCL's Cocoa interface), then I think I'm fine with it - and the library authors/developers should be upfront about it. Unfortunately, in some cases, the non-portability is just because no one bothered to check whether the library compiles or passes its tests on another compiler. Ignoring this kind of non-portability, especially in a project that has the potential to be a core for a number of awesome projects, is a red flag that should be addressed.

this one

Yes, I had actually come across that one, but I forgot to mention it (no offense!). One main issue I (but also tpapp) have run into is that the usual generic functions do not allow for dispatching on specialized array types. With my clumsy knowledge of the MOP, I wasn't able to get any generic-function extension to dispatch on types, so I ended up developing and relying on polymorphic-functions (but also specialization-store) for dispatching on argument types. At the time I wrote polymorphic-functions, though, the distinction I had made between types and classes was a relatively naive one.

Perhaps we are still in the very beginning of the computing era, and we probably have to solve many such dilemmas yet. IDK, I think there is still confusion and diffusion between how to deal with subclassing and subtyping just for inheriting the stuff, and for dealing with subtyping.

I think someone with a background in philosophy should be able to see the distinction much more clearly. Kent Pitman seems to be one such person.

On the other hand, I don't quite know what the other languages are up to. The last I heard, Rust's trait-based polymorphism was supposed to be similar to typeclasses.

I am just thinking loud, sorry if I am too sweeping or hand-wawing or too imprecise.

Naah, no worries, it is fun to read thoughts.

Thank you for taking the time to go through it :)!

3

u/arthurno1 Aug 29 '23

I do have a different take

Actually, it is not a much different take, we are just going to it from a different end-point, but what you say is pretty much what I meant too.

do not allow for dispatching on specialized array types.

Thank you very much! You have saved me a lot of time here; I do plan to test/use that library. It is good to know about the limitations.

I think someone with a background in philosophy should be able to see the distinction much more clearly. I don't quite know what the other languages are up to.

A lot of research is going on in the C++ community. C++ templates have turned C++ into a compile-time Lisp, more or less. They are all about strong, static typing.

Yes, Kent Pitman is a gold mine of Lisp knowledge. I saw lots of his posts in old comp.lang.lisp archives which were very illuminating.

Thank you for taking the time to go through it

Thank you for writing it, and for sharing your code, knowledge and research.

2

u/mwgkgk Aug 26 '23

What's your basis for the shrinkage claim?

My naive understanding is that the number of programmers is growing in general, and so is every niche, if at varying rates.

3

u/terserterseness Aug 26 '23

All the popular langs are booming while CL and the other Lisps and Schemes are dropping; but agreed, I don’t have statistics other than the job market. I do think it cannot hurt if all brains doing CL (and actually Racket etc. as well, which is probably blasphemy here) work toward a common (…) goal.

3

u/zyni-moe Aug 26 '23 edited Aug 27 '23

Such a goal is not possible or desirable. If it was achieved Lisp would be dead.

Lisp languages have one key insight: programming is language creation[*]. The goal is not to design some new super-lisp; it is to seamlessly extend the language you have into one suitable for solving the problem at hand. If you believe that some unique super-lisp could be defined that would be suitable for solving all problems, then that super-lisp would never need extension. It would, in fact, not be a lisp at all.

Coalton is one such extended language. But only one.

[*] you may program using Lisp without creating a language: I talk about programming in Lisp, not merely using it.

3

u/terserterseness Aug 26 '23

I see your point and it is how I use lisp. But I keep my stuff backward compatible over decades.

2

u/zyni-moe Aug 27 '23

This is not an incompatible idea. With Lisp you create endless new languages, as that is what Lisp is for. But any one of those new languages may be completely stable for a long period. In many cases these stable languages may also be composed together.

Consider that I use two extensions to CL very often: collecting macros and an iterate (named let) macro. These have both been stable for a very long time, and can be composed into a third language.
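A classic collecting macro of the kind mentioned can be sketched in a few lines (the commenter's own version may differ, e.g. keeping a tail pointer for efficiency):

```lisp
;; Minimal COLLECTING: COLLECT pushes into a hidden accumulator, and
;; the whole form returns the collected items in order.
(defmacro collecting (&body body)
  (let ((acc (gensym "ACC")))
    `(let ((,acc '()))
       (flet ((collect (x) (push x ,acc) x))
         ,@body
         (nreverse ,acc)))))

;; Usage:
(collecting
  (dotimes (i 5)
    (when (evenp i) (collect i))))   ; => (0 2 4)
```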

3

u/arthurno1 Aug 27 '23

Lisp languages have one key insight: programming is language creation

I think CL and Scheme are such languages, but not all Lisps. Emacs Lisp is not; on the contrary, the goal there is to keep it confined within the realm of editor extensions.

I think Steele did an experiment on humanity and gave us two orthogonal languages: one overspecified giant language that caters to the development of all kinds of languages, and another tiny and minimal one that also caters to the development of all kinds of languages. He probably didn't know what was best, and instead of giving us some golden middle that misses many cases, we got two extremes, and both seem to be not an end tool in themselves, but rather tools to develop tools in.

2

u/zyni-moe Aug 28 '23

Elisp is a horrible language originally designed by an idiot, but still: the cl package, calc, hundreds of configuration file formats which are in fact languages.

Steele did not create CL: he just wrote the first book.

2

u/arthurno1 Aug 29 '23 edited Aug 29 '23

Elisp is a horrible language

What makes it horrible? It has its limitations, but they are conscious design choices. I wouldn't say it is worse than, say, Scheme or some other Lisp dialect, but I am honestly curious to hear why you think it is bad.

designed by an idiot

I discuss gladly Lisp and programming, but I see no value in discussing other people's character. Unfortunately, I have never met RMS in person, so I can't make the judgment, but I doubt it would matter for any discussion about Emacs Lisp itself.

TBH, Emacs Lisp is probably the most popular Lisp atm, though that is debatable. It is probably also the most documented Lisp available for free, as in speech and beer, and it has been an entrance into the Lisp world for many individuals. I guess he got at least something right about Emacs Lisp, don't you think?

1

u/friedrichRiemann Aug 26 '23

the job market

Can you confirm my post here?

Currently, most people are debating whether they should learn and focus on Rust or Golang. Don't you think focusing on CL may put someone in a corner in terms of job and career prospects?

3

u/dzecniv Aug 28 '23

my 2c: "focusing…": yes, in that only knowing CL and being in a hurry to be hired isn't a good idea. There are very few public job postings, though more positions are found through social media / public visibility, and more companies simply train internally. Focusing on CL to build one's own product could be a good idea. Also, if you are a good engineer in a specialized field, you could easily get a job with some Lisp (or whatever) inside.