I do not think all this language fragmentation is a good thing. A million little obscure languages that all at the end of the day do the same thing.
Yeah, we need language research to keep devising new features and more efficient ways of programming, but this is different.
I wish the world would get behind a couple well thought out languages that cover most programming needs (functional, systems/bare metal, scripting) and stick with those.
I've seen some ridiculous and unsustainable stacks at some shops because everyone got to pick their favorite language. Then the morale of subsequent hires is in the toilet because there's such a cognitive load to learn all these little crap languages.
I feel like some of these languages come about because someone needed to do some task and didn't understand, or didn't take the time to learn, how to do it in an existing language. Among the latest versions of the mainstream languages, there is no programming paradigm you cannot use.
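To make that concrete, here is a toy illustration (mine, not the parent's) of two paradigms that once required niche languages, written in plain modern Python (3.10+ for the pattern matching):

    # Functional style and structural pattern matching in mainstream Python.
    from functools import reduce

    # map/reduce pipeline, no mutation
    doubled_sum = reduce(lambda acc, x: acc + x, map(lambda x: x * 2, range(10)))

    # declarative pattern matching (Python 3.10+), once a niche-language feature
    def area(shape):
        match shape:
            case ("circle", r):
                return 3.14159 * r * r
            case ("rect", w, h):
                return w * h

    print(doubled_sum, area(("rect", 3, 4)))  # 90 12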
And where these obscure languages REALLY fall down is tooling. Got debugger support for this? Got perftool support? No, of course you don't.
With a small number of languages, work can be put into serious tooling, and fixing compiler bugs, rather than a few devs spread thin trying to keep up with the bugs in their hobby language.
> I wish the world would get behind a couple well thought out languages that cover most programming needs (functional, systems/bare metal, scripting) and stick with those.
Uh, that's basically been the case for ... 20 years or so? I mean, I dislike Java and C like the plague, but they're nonetheless the de facto enterprise standards at the moment for performance-critical work and ... everything else.
Just look at the TIOBE Index [1], for example. I'll admit that it's a pretty skewed statistic, but it's nonetheless pretty representative of the basic idea behind it.
There are, of course, lots of services / software in other languages -- and often with very good reasons (take Erlang, for example, for layer < 4 routing) -- but that's basically because every other language was insufficient for the use case they had. Or because they wanted to deliver the software today, and not next week (I'm looking at you, PHP).
As an aside: <5% is still a staggering amount of code, so please don't feel marginalized, everyone ;)
Only Java and C? What about C++, JavaScript, Objective-C/Swift, Python, etc.? And there's also stuff like MATLAB and R, which are the de facto standards in their own areas ...
It’s no different from ecosystem/library fragmentation within a language. For example, there are at least six approaches to accelerating numerical code in Python (weave, plain C + ctypes, Cython, f2py, numexpr, numba) each with its own cognitive load, interop and debug problems (Julia advocates rejoice, but it’s coming for them too).
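For a sense of what just one of those six approaches looks like in practice, here's a minimal numba sketch (my own toy example); each of the other five has a completely different workflow, which is exactly the cognitive-load problem:

    import numpy as np
    from numba import njit  # numba: one of the six approaches listed above

    @njit  # compiles the function to machine code on first call
    def dot(a, b):
        total = 0.0
        for i in range(a.shape[0]):
            total += a[i] * b[i]
        return total

    a = np.random.rand(1_000_000)
    b = np.random.rand(1_000_000)
    print(dot(a, b))  # first call pays compilation cost; later calls run at native speed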
It seems like a community needs huge incentives to avoid churn and fragmentation, e.g. (a) strong backwards-compatibility commitments, (b) a big backing company to do the endless boring stuff, (c) a strong benevolent dictator, ...
The same is true for languages, except that there's no outer “scope”, apart from platforms like iOS that might dictate toolchains, or a company where the CTO might make such choices.
But trying to avoid fragmentation among hackers seems like barking up the wrong tree.
"Julia advocates rejoice, but it’s coming for them too"
Why do you say that? The whole point of Julia is to create a language that's similar to Python in ease of development but is natively fast. Do you think it fails at this?
I think Julia was well designed to stay fast for a long time, for many tasks. But anything that isn’t universally and optimally addressable by a Lispish front end to an LLVM JIT is going to grow multiple approaches that aren’t fully compatible, in the same way that Python, not built for speed of execution, grew multiple approaches to fast code (I just realized I forgot about PyPy in my list above). So I expect there to be multiple, eventually incompatible, approaches to AOT compilation, web frameworks, GUIs, etc. Julia’s youth restricts divergence in the short run, but in the long run I think divergence is a healthy part of any ecosystem, and not to be disparaged as in the post I originally replied to.
In the general case that can't really be true, because the Lua interpreter is written in C.
With regards to GC languages in general, if you spend a lot of time working around the GC by doing things like object pooling, which is really just reinventing manual memory allocation, you can get close to a non GC language in terms of performance.
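A minimal sketch of that pooling pattern (names are hypothetical, and Python is just for illustration; the GC details differ by runtime):

    class Particle:
        __slots__ = ("x", "y", "alive")  # fixed layout, no per-instance dict
        def __init__(self):
            self.x = self.y = 0.0
            self.alive = False

    class ParticlePool:
        # Allocate everything up front; reuse objects instead of creating
        # garbage each frame, so the collector has (almost) nothing to do.
        def __init__(self, size):
            self._free = [Particle() for _ in range(size)]

        def acquire(self):
            return self._free.pop() if self._free else Particle()

        def release(self, p):
            p.alive = False
            self._free.append(p)  # recycle rather than letting it become garbage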
GC languages are obviously fine for plenty of use cases, and for some code snippets they can be faster, but there is no way to make a GC free: there's going to be some overhead no matter what you do.
The point of a tracing JIT is that it runs code in an interpreter, then generates machine code for loops and hot spots. By doing this at runtime you can take advantage of knowledge that a C compiler doesn't have. This is why LuaJIT is often faster than C.
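A toy illustration of that "runtime knowledge" point (not LuaJIT's actual machinery): if a tracing JIT observes scale == 1 on every iteration, it can record a trace with the multiply eliminated, guarded by a cheap check, whereas an AOT C compiler has to keep the general code:

    def hot_loop(data, scale):
        # scale comes from config at runtime; an AOT compiler must assume
        # it can be anything and emit the multiply.
        out = 0
        for x in data:
            out += x * scale
        return out

    def hot_loop_trace_scale_1(data):
        # What the recorded trace effectively becomes after the JIT has
        # watched scale == 1 for a while; a guard (not shown) falls back
        # to the generic loop if scale ever changes.
        out = 0
        for x in data:
            out += x
        return out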
LuaJIT can be faster than C for some code. Just like C can be faster than someone's naive hand coded assembly.
That doesn't change the fact that in the general case C is still faster, and there are classes of critical high performance code that have to be written in C (or Assembly, Rust, or even Fortran). Sometimes, manual memory management is necessary to get acceptable performance (also determinism is occasionally required).
All else being equal, GC is always going to be slower than non-GC because a GC introduces unavoidable overhead.
I've worked in this space btw and I've never seen any evidence that LuaJIT is actually faster than C for anything outside of very specific micro-benchmarks.
What I'd want to see: multiple large programs written in LuaJIT that have better performance than the same programs written in optimized C.
The vast majority of benchmarks I've seen are down to LuaJIT performing specific optimizations out of the box that the C compiler used in the comparison can perform but doesn't.
In particular, the last time I looked at LuaJIT vs. C++ benchmarks, the C++ compiler flags weren't set to allow the use of SIMD instructions, while LuaJIT emits them by default.
There was another recent example I saw where LuaJIT was calling C functions faster than C in a benchmark. Then someone pointed out what the LuaJIT interpreter was actually doing, and how to implement the same speed up in C.
Java people made the same arguments years ago: "Java is just as fast or faster than C++". You'll notice that after 20 years of comparisons, no one who writes high performance code for a living makes that claim.
Java is fast enough that the increased programmer productivity of the GC and other features wins out in many cases. People aren't choosing Java over C++ because it results in generally more performant code.
People make the same argument against having dozens of Linux distributions and dozens of window managers. Having a lot of choice doesn't hurt anything, and benefits everybody in the long run.
> I wish the world would get behind a couple well thought out languages that cover most programming needs (functional, systems/bare metal, scripting) and stick with those.
For the most part, "the world" has got behind a small handful of languages. Java, C++, Python, Javascript and maybe a half dozen others make up the majority of real world projects.
> With a small number of languages, work can be put into serious tooling, and fixing compiler bugs, rather than a few devs spread thin trying to keep up with the bugs in their hobby language.
That assumption probably doesn't hold. If the authors of Nim weren't working on Nim, there's no guarantee they'd go work on tooling for some other language. Furthermore, who would decide which small group of languages people can work on?
With the exceptions of Java and Javascript, almost all of today's popular languages started off as somebody's small pet project. The good ones (by some metric, anyway) grew and their usage spread and the not so good ones died out. The best way to make forward progress is to try new things and see what works and what doesn't.
> Having a lot of choice doesn't hurt anything, and benefits everybody in the long run.
But it does hurt in many cases: there is only so much programmer time, and if people invested their time in fewer projects, those projects would move forward quicker.
I am as guilty of this as probably most people here, having written frameworks, ORMs, parsers, libs, languages, game engines, etc. instead of helping out on existing projects. I know I did it because I thought I could do better than what was there; usually that was false, though sometimes, at least I thought, it was true. For the bigger ones (frameworks, Linux distros, languages) it was always false, so I would say it does hurt: I wasted time and the world did not improve. Only I improved, since I enjoyed it and learned, but I would have improved just as much by helping an existing project.
Another point is that a lot of people, especially people who use computers solely to make money to survive, really do not like choice, in my experience. A lot of my colleagues ask me what to use, and they hate the fact that when they Google it there is a choice. They do not want choice; they want to use whatever is the de facto standard for the particular case, in every case. With choice and fast change, there is no de facto standard, and even more experienced people feel the same angst beginners have when learning frontend web dev. It causes a lot of (probably underestimated) stress in the workplace.
Folks that I worked with had the same idea: “Fortran was good enough for our ancestors, it is good enough for the world. And COBOL.”
This was before the obscure language C was invented.
And as far as debugging is concerned, debuggers are overused. Reading “Programmers at Work” you see that: 1. they used the equivalent of printf for debugging; 2. mostly they had not read TAOCP; 3. they did not like C++.
These are great arguments for why we should all just use Java. Great tooling! Lots of high-quality libraries! Nothing to make you get out of bed in the morning!
After twenty years of professional Python (mostly) development I have suddenly and recently become a neophyte Prolog programmer.
Bottom line: If you're not using Prolog you are almost certainly wasting your time.
Almost all PLs you're likely to be acquainted with can be thought of as syntactic sugar over Lambda Abstraction. Prolog, from this POV, is syntactic sugar over Predicate Logic. It's actually a fantastically simple language, both in syntax and in its underlying operation, which could be summarized as Logical Unification with chronological backtracking.
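For a flavor of that underlying operation, here's a minimal unification sketch in Python (my own toy encoding: variables are capitalized strings, compound terms are tuples; real Prolog adds clause search and chronological backtracking on top of this):

    def walk(t, s):
        # Chase variable bindings in substitution s to the final value.
        while isinstance(t, str) and t[:1].isupper() and t in s:
            t = s[t]
        return t

    def unify(a, b, s):
        # Return an extended substitution if a and b unify, else None.
        a, b = walk(a, s), walk(b, s)
        if a == b:
            return s
        if isinstance(a, str) and a[:1].isupper():
            return {**s, a: b}  # bind variable a
        if isinstance(b, str) and b[:1].isupper():
            return {**s, b: a}  # bind variable b
        if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
            for x, y in zip(a, b):
                s = unify(x, y, s)
                if s is None:
                    return None
            return s
        return None

    print(unify(("point", "X", 2), ("point", 1, "Y"), {}))  # {'X': 1, 'Y': 2}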
I have been working with Prolog for only about two months but I am already way more productive. Typically a piece of code that might have been five pages of Python will amount to a page or less of Prolog. I hasten to point out that 1/5 the code means 1/5 the bugs, but it is also much more difficult to introduce bugs into Prolog code, so the total ratio of bugs per unit of functionality is much lower.

Further, Prolog code typically operates in several "directions", e.g. the Prolog append() relation (not function) will append two lists to make a third, but can also be used to extract all prefixes (or suffixes) of a list, etc.; a Prolog Sudoku program will solve, validate, and generate puzzles from one problem description.[1]

So you get more functionality with less code and many fewer bugs. It's also very easy to debug Prolog code when you do encounter errors. I'm spending fewer hours typing in code, fewer hours debugging, and I'm still more productive than I was. Looking back, I estimate that as much as half of my career was wasted effort due to not using Prolog.
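To see what "directions" means in practice: Prolog's append(Xs, Ys, Zs) with only Zs bound enumerates every prefix/suffix split of Zs. Getting that "backwards" mode in Python means writing a second function by hand (a sketch):

    def append_splits(zs):
        # Python equivalent of running append/3 "backwards": enumerate
        # every way to split zs into a prefix and a suffix.
        for i in range(len(zs) + 1):
            yield zs[:i], zs[i:]

    print(list(append_splits([1, 2, 3])))
    # [([], [1, 2, 3]), ([1], [2, 3]), ([1, 2], [3]), ([1, 2, 3], [])]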
I'm implementing a sort of compiler in Prolog and I am impressed with the amount of work I haven't had to do. I'm beginning to suspect that most high-level languages are actually a dead-end. For efficient, provably-correct software generated by an efficient development process, I think we should be using Prolog with automatic machine-code generators.
Last but not least, Prolog is old. It's older than many of the folks reading this. Almost everything has been done, years ago, often by researchers in universities. Symbolic math? Differentiation? Partial evaluation? Code generators? Hardware modeling? Reactive extensions? Constraint propagation? Done and done. You probably can't name something that hasn't been explored yet.
This.
But also "Javascript" is a way overloaded term. The JS community is quite fragmented. Same could really be said of all of them. "Just use those" may come with a caveat of "also be conservative about what tooling you adopt around the language you choose."
I almost left Javascript off the list because of its peculiar (but interesting!), prismatic nature. Still, they all transpile to Javascript (I think?), so really it's all just javascript ;)
While we're at it, let's dispel the notion that we need tiny domain specific languages to accomplish daily tasks. One that covers HTML, Makefiles, SQL, and Awk is all anyone will need.
You know, I'll have to add to the disagreement. Partly because of all the technical reasons (yeah, your way would stop language development), but also because herding cats is a really counterproductive thing to do; every time somebody powerful enough to get a chance tries it, everybody loses in the end. (But you are free to keep thinking this way. Of course, I won't change to your preferred set, but if you want to reduce fragmentation, you are free to surrender your preferences.)
Anyway, perftools are only a must-have when performance becomes a problem, step-through debuggers are way overrated, there are many features that no amount of libraries or tools will give you, and ecosystem quality matters about as much as size.
Realistically, what features can be provided by the language but not by libraries? If there are many cases, would that not indicate that the language's expressiveness is lacking?
IMO, the goal of languages isn't to provide many features, but to provide constraints, so that programmers can stick to a set of rules that their peers can understand and agree upon. The most powerful language is the machine code for whatever CPU you're using, because there are no constraints: you have all of the power of the processor.
I think it's great that a lot of these minor languages get some play in companies. If they're good enough to overcome the problems you mention, it improves things for everyone. If they're not, then the company dies and so does an evangelist for that language. It's a little like programming language Darwinism; a few companies need to die in the process, but ultimately it's better for programmers worldwide.
"I do not think all this language fragmentation is a good thing."
I both agree and disagree with this.
I love the creativity and imagination that goes into a language like nim. But at one time I thought the same about python! Sometimes obscure little languages become important.
On the other hand, the tooling statement is dead on. At the point I find out it doesn't have a step-through debugger I'm just reading the docs for fun and then moving on.
The honest truth is that it’s easier to be famous as a big dev in a small pond than a small dev in a big pond. As communities stagnate, it is harder to get your name known and becomes more political than technical.
"Can use" is different from "can use productively." Technically if it's compiled to C/C++ you can use gdb/lldb etc, BUT the compiled version may be so drastically different from the input that it's effectively useless to track down logic errors in the original.
C has this nice preprocessor directive that indicates the actual source location:
#line 42 "actual_source.lang"
Subsequent tools like gdb/lldb pick up on that and point to your source code instead of the intermediate C. So you don't care that the C code is wildly different from your own; the tools can still point to the right place.
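A sketch of how a compile-to-C language's code generator might use this (the names here are hypothetical; the directive itself is standard C):

    def emit(out, c_code, src_file, src_line):
        # Before each chunk of generated C, record where it came from, so
        # gdb/lldb breakpoints and backtraces map to the original source.
        out.write(f'#line {src_line} "{src_file}"\n')
        out.write(c_code + "\n")

    with open("gen.c", "w") as out:
        emit(out, "int x = add(1, 2);", "actual_source.lang", 42)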