The author's experience is based on toy examples. Programming (and designing) in a professional setting is more than that, though. And the solutions are not the same for everyone: writing an Office plugin has different needs than writing a mobile game. For many of us, the availability of tooling and libraries is crucial. I'm not going to reinvent the web server by building on OS sockets or write a graphics rendering engine from scratch.
Yeah, languages are "easy" to learn when you already know a few from different paradigms. I can skim through the docs of an unknown language and know enough to start writing in it quite quickly. What takes time is learning all the tooling, not just the syntax: how do you build it, debug it and deploy it, what frameworks and libraries should you use, and how long will it take to learn them?
Take JavaScript, for instance. If you're used to a C-style language with some functional patterns, jumping into a codebase and understanding it is quite easy. But around the language itself you also need to know things like npm, webpack or similar, how the browser works, HTTP concepts, etc.
Or Python: you need to deal with pip, virtualenvs, perhaps Poetry; you need to understand the limitations on concurrency because of the GIL and how they affect deployments; perhaps Django or some other framework; etc.
Right. The languages themselves are usually pretty straightforward; 80% of them share a "common DNA" with most other languages, which an experienced programmer can pick up quickly. The ecosystems, on the other hand, have a lot less commonality, though more recent languages like Rust and Go bundle more of that stuff, so it's less chaotic than C and JavaScript.
Even that is not a problem. Tooling and deployment are the most boring parts of any language.
What's more interesting and what's more difficult to find shortcuts for is understanding the spirit of the language, knowing its weaknesses, bottlenecks and inner workings. The threading model, what's done at compile time and what's done at run time, etc etc. There are so many things you need to learn before you can judge any given language relative to others that you know.
I almost chose Nim over Rust, but then Rust is an industrial machine with a ton of weight behind it. Nim may be the better language for my needs, but it doesn't have as good an ecosystem. Sad but true.
I agree. I love Nim, it's such a great language but I try to stick to Rust because I have a feeling that's where the industry is headed. And Rust will improve over time, maybe at some point it will be more pleasant to use.
> Maybe at some point it will be more pleasant to use.
Unless there is a complete rewrite of the borrowing rules, and thus breaking changes, that's highly unlikely to ever happen. Instead of following what the industry promotes (the same industry that lays off thousands of people on a whim), maybe start following community projects. Do what you like most, don't expect them to change to fit your needs and wants, and stop trying to get into every trend that is hot. At least that's my mantra to prevent having regrets later on. Good programmers have a future no matter what language they use.
Provided that one ecosystem is huge (e.g. Python) and the other one small.
With that said, learning to write production code in a small language builds a better mindset in certain ways (e.g. Stack Overflow? Forget about it) and a better feel for the quirks of the language, which you will see (to a lesser extent) in other languages.
No, not at all. These projects were driven without a direct, professional need. That grew them so much that they became viable in professional settings as well. Nim could get big too, but it has a lot of competition to overcome. Python and Linux were seeded and nurtured in education; that's hard to beat.
> most programmer convenience is sacrificed on the altar of "never crash". This makes it a good choice for sending a rocket to space
When it comes to safety- or mission-critical software, crashing due to a dangling pointer, a stack overflow or a failed allocation are equally catastrophic, and not crashing but producing a wrong answer due to an algorithmic error is also equally catastrophic. In fact, producing the correct answer late can also be just as catastrophic when the system is real-time -- we don't care if the operation takes 1ms or 99ms, but if the deadline for emitting a control command is at 100ms, emitting it at 100.001ms is just as catastrophic as a crash (while completing the calculation in 20ms or 95ms is equally good). Never crashing for any reason, never producing a wrong result, and never producing a correct result late are all equally critical.
How are such systems written? They normally require very simple code and very simple algorithms -- to make analysis by tools, and even more importantly, by humans as easy as possible. Any compiler magic or implicitness in the code may be problematic (possibly including reference-counting if a bound on deallocation time cannot be guaranteed). We don't want to make code as fast as possible, but as non-surprising and as predictable as possible.
It depends on the domain. Sometimes a wrong answer is worse than a crash (e.g. a bank). Sometimes it is better (e.g. a game).
Similarly the worst bugs can be crashes (random, non-deterministic, release-mode only, timing sensitive heisenbugs), or they can be undetected algorithmic errors.
It entirely depends on the specifics. But either way, Rust's tradeoffs are easily worth it in terms of eliminating bugs unless you really don't care.
The domain is "sending a rocket to space" whose specifics are, in general, not only that you really don't care, but that one of the things you want to maximise and make as easy as possible is inspectability by humans.
I have used Nim for personal projects for 6 years now and it continues to surprise me how well suited it is to so many problem domains. I am fond of its SPA framework, karax (https://github.com/karaxnim/karax), for which I wrote a translation utility, https://github.com/nim-lang-cn/html2karax. The latest Nim v2 release candidate has improved the ergonomics and syntax that affect compilation to JS, so I was able to clean up my webapp's code to be less verbose. There have also been a few projects that touch GPU programming, most notably https://github.com/treeform/shady
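For readers who haven't tried karax, its hello-world looks roughly like the following (reproduced from memory of the README, so treat the exact API surface as an assumption rather than gospel):

    include karax / prelude

    proc createDom(): VNode =
      result = buildHtml(tdiv):
        button:
          text "Say hello!"
          proc onclick(ev: Event; n: VNode) =
            echo "hello from karax"   # lands in the browser console on the JS backend

    setRenderer createDom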
Also on the game development front, I maintain a raylib wrapper, https://github.com/planetis-m/naylib. As long as utilities like c2nim (https://github.com/nim-lang/c2nim) exist, it's trivial to create bindings for C/C++ libraries. One thing I want to experiment with more is making it more automatic, by writing a callback exposed by c2nim that transforms the generated code using Nim's AST. But regardless, in that project I was able to write safe language abstractions on top of the bindings that provide a more native experience. It has scope-based memory management, generics and ... function/enum overloading.
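As a toy illustration of that layering (my own sketch, not naylib's actual code): a c2nim-style raw importc binding on the bottom, and a small idiomatic proc on top so callers never touch C types directly.

    # toy example: a raw binding of the C `abs` function, as c2nim might emit it
    proc c_abs(x: cint): cint {.importc: "abs", header: "<stdlib.h>".}

    # a friendlier Nim-level wrapper hiding the C types
    proc absolute*(x: int): int =
      int(c_abs(cint(x)))

    when isMainModule:
      echo absolute(-42)   # 42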
The author tests seven languages (Python, Julia, Rust, C#, Swift, C and Nim) by programming a Yahtzee bot in each and comparing them. The "winner" is Nim.
Unfortunately the author only provides the code for nim.
So it is subjective, based on a simple practical project (1000 LOC). Unfortunately it leaves out all the other "cool kids" like Crystal, Zig, V, Dart, D and so forth...
> But the big dealbreaker with Julia is it only compiles on the fly. That means you can't just hand a compiled executable to someone. You must give them instructions to install Julia and all your dependencies first. That's no good for anything you want to ship like a desktop app or a game. You also don't want Julia on your server recompiling with every web request.
That's an issue with Python as well, but the Julia team is working on that.
How well does this work? I've seen this library but never used it. Julia is really slow from a cold start, which limits its utility for scripting.
About a year ago I was very put off by someone on Reddit who sort of scolded me for wanting to use Julia as a general-purpose scripting language at the CLI. The Redditor said this wasn't an idiomatic usage of Julia. Idk how representative this is of the community at large.
PackageCompiler is one of the 6 projects pinned in the JuliaLang repo, along with Pkg (the package manager), the juliaup installer, IJulia Jupyter kernels, the Julialang.org website, and Julia itself.
It's not "idiomatic usage of Julia". What the hell? It's literally treated by the language developers as first class, just like the package manager, and put on the website right next to it.
By any chance, was your post about starting Julia in a tight inner loop? If so, this really isn't idiomatic and exactly one of the few cases where Julia isn't great. Though it can be circumvented.
Took some digging, but [here it is](https://www.reddit.com/r/Julia/comments/n2mje4/comment/gwmt1...). I was talking about invoking Julia from the shell as a script. Because loading packages takes time, I found that my script was significantly faster in Python than Julia.
I never did get around to trying the daemon mode thing someone suggested. Someday I'll have a reason to do this again and try it.
Scolded is a bit exaggerated, I think. Python starts faster, but as soon as you have long running complex tasks Julia takes off - it's just the tradeoff you were hitting.
Idk man. I started with BASIC on the CPC 464. Then moved over to the C64 and ASM.
I believe it was good starting so low level. At times I had to type in the actual machine code to program.
When you're young your mind is very capable of learning. The more complex the better. The lower level the better.
Although a former co-worker said in a break-chat that it's best to learn something from both sides top down and bottom up.
I'm just not sure how that would work.
I have to say, I was quite impressed at the speed of Julia in this test. Getting speeds faster than Rust in a high level language is extremely impressive; maybe I should really make an effort to learn Julia again.
I was with you till Julia but why C and C++? They're OG languages but I can't think of one situation where you have to use them for Machine Learning.
Out of Python, R and Julia, just learn ONE for 1y and become the best at it. That way, you can build anything in that language and then use that to build something mind-blowing.
I like nim but I've never really gotten used to the slicing syntax. I guess that I'm just too used to python, plus you have to be a bit careful to leave a whitespace when doing backwards indexing so that the operators work correctly.
So python:
a = [0,1,2,3,4,5]
a[0:5] => [0,1,2,3,4] - half closed interval
a[0:-1] => [0,1,2,3,4] - negative indexing
a[:5] => [0,1,2,3,4] - skipping start index
a[3:] => [3,4,5] - skipping end index
nim:
var a = [0,1,2,3,4,5]
a[0..5] = [0,1,2,3,4,5] - closed interval
a[0..^1] = [0,1,2,3,4,5] - closed interval with negative indexing
a[..4] = [0,1,2,3,4] - skipping start index
a[..<5] - Error - doesn't work
a[3..] - Error - doesn't work
a[0..<5] = [0,1,2,3,4] - half closed interval
a[0..< ^1] = [0,1,2,3,4] - half closed interval with negative indexing
(Important - the white space is required, it will crash otherwise!)
I'm sure that there are good reasons for this behaviour and that if you use nim a lot, you just get used to it but I find the python syntax just nicer.
To explain 'why' it behaves this way is quite simple: the stdlib does not define slicing operators for all of your expected cases, and Nim does not have postfix operators (aside from [], {}, *), so `3..` is just impossible to define. If one really wanted to, https://play.nim-lang.org/#ix=4nSQ solves most of the issues.
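As a rough illustration (my own sketch, not the linked playground code): the forms that already exist cover most cases, and a tiny helper proc can stand in for the missing postfix spelling.

    # half-open and BackwardsIndex slices work out of the box;
    # a helper proc approximates the unsupported `a[3..]` spelling
    proc tailFrom[T](a: seq[T]; start: int): seq[T] =
      a[start .. a.high]

    let a = @[0, 1, 2, 3, 4, 5]
    echo a[0 ..< 5]      # @[0, 1, 2, 3, 4]
    echo a[3 .. ^1]      # @[3, 4, 5] -- idiomatic stand-in for a[3..]
    echo a.tailFrom(3)   # @[3, 4, 5]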
Funcs with the same name can have different named parameters and return types.
func iconForFile
func iconForFiles
func iconForContentType
Will be the alternative in every other language (parameter-type overloading is available in C-family/Java/Kotlin to an extent, but it is less explicit). Swift sounds neat, doesn't it?
You can see more examples; in particular, with class overloads it looks cleaner and is more convenient to communicate. Verbosity is not an issue; code is optimised for reading.
I realise this is only a surface-level comparison, but the Julia code uses the built-in factorial function, which I think uses lookup tables, while the C# version uses a recursive static method and the Python code has your own lookup table. I guess that's on the hot path? I haven't taken the time to check if you've put a lookup or memoized it somewhere.
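(For reference, the lookup-table approach being described is tiny in any of these languages; a hedged sketch in Nim, with illustrative names rather than the article's code, looks like this:)

    # precompute a small factorial table at compile time so the hot path
    # is a constant-time array read instead of a recursive call
    const Factorials = block:
      var t: array[0 .. 12, int]
      t[0] = 1
      for i in 1 .. 12:
        t[i] = t[i - 1] * i
      t

    proc factorial(n: int): int =
      Factorials[n]

    echo factorial(5)   # 120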
Complains about words, semi-colons, Julia syntax which prevents writing one-liners, wants a REPL, wants symbols and brevity and optimised numeric work, doesn't try APL?
Swift: 840 lines
Nim: 768 lines
C#: 872 lines
Julia: 761 lines
Rust: 861 lines
Python: 550 lines
I am curious what a good APL-er [better than me] could do without trying to codegolf it, and how it performs.
They do, but like so many other Rust-projects, they're in their infancy - from the project README: "The project is still in early development, expect bugs, safety issues, and things that don't work".
> It's hard to feel happy grinding out a contrived LINQ expression for the same result as a "practically English" list comprehension in Python.
I feel the other way. LINQ is great for building pipelines. List comprehensions are kinda like English, but it gets ugly when you use the flattening feature, or when you want something more than just map and filter.
Looking at their C# code, they’re going at it in a fairly low-level way, with a lot of aliases for numeric types and a lot of casting between them. Take their `DistinctArrangementsFor` function, for example.
If you take out the unnecessary number type abuse and just use `int` everywhere, you get something much clearer:
private static float DistinctArrangementsFor(DieVal[] dievals) {
    var dievalCounts = dievals
        .GroupBy(x => x)
        .Where(group => group.Key != 0)
        .Select(group => group.Count())
        .ToArray();
    var divisor = dievalCounts
        .Select(factorial)
        .Aggregate(1, (a, b) => a * b);
    var nonZeroDievals = dievalCounts.Sum();
    return factorial(nonZeroDievals) / divisor;
}
Also, even with all this hard work with weird types, this function has a suspicious return type, since it does integer division and returns a float.
There are a few more weird things, like not using switch statements/switch expressions, or this function:
// calculate relevant counts for gamestate: required lookups and saves
public int counts() {
var ticks = 0;
var false_true = new bool[] {true, false};
var just_false = new bool[] {false};
foreach (var subset_len in Range(1,open_slots.Count)){
var combos = open_slots.Combinations(subset_len);
foreach (var slots_vec in combos ) {
var slots = new Slots(slots_vec.ToArray());
var joker_rules = slots.has(YAHTZEE); // yahtzees aren't wild whenever yahtzee slot is still available
var totals = Slots.useful_upper_totals(slots);
foreach (var _ in totals) {
foreach (var __ in joker_rules? false_true : just_false ){
// var slot_lookups = (subset_len * subset_len==1? 1 : 2) * 252; // * subset_len as u64;
// var dice_lookups = 848484; // // previoiusly verified by counting up by 1s in the actual loop. however chunking forward is faster
// lookups += (dice_lookups + slot_lookups); this tends to overflow so use "normalized" ticks below
ticks++; // this just counts the cost of one pass through the bar.tick call in the dice-choose section of build_cache() loop
} } } }
return (int)ticks;
}
In this function, the weird things include the pointless foreach loops at the end (which could be replaced with a bit of arithmetic) and the use of a foreach loop where a regular for loop would be better.
If this code was rewritten in more idiomatic C#, it could be the basis of a good comparison between languages’ performance and syntax. But I’m not sure if it’s a good example at this point.
> I like named parameters for self-documenting code. Julia allows named parameters but naming them doesn't provide any flexibility with regard to their position when calling the function.
I don't understand this complaint? "Named parameters" i.e. keyword arguments in functions work great in Julia!
julia> foo(;x = 1, y = 2, z = 3) = x + y + z
foo (generic function with 1 method)
julia> foo(; z = 4, y = 0)
5
Perhaps the author meant default arguments?
julia> foo(x = 1, y = 2, z = 3) = x + y + z
foo (generic function with 3 methods)
julia> foo(3)
8
julia> foo(0, 0, 0)
0
> this flexibility does tend to cause bugs working with other people's code. This is compounded by Julia's pursuit of composability. If you cram your custom data value into someone else's function, and if it seems to be the right shape, it will probably work! Unless it doesn't. In that case you just get silently wrong answers. This is a deal-breaker for a lot of scientific computing people where Julia would otherwise shine.
This is almost entirely mitigated by most people using `eachindex` or `axes` or other functions that make the array related code index agnostic. The only reason I say almost entirely is because there's probably some really old code that doesn't work the right way and would silently fail or do the wrong thing if you changed the indexing convention. That said, calling this a "deal-breaker for scientific computing" seems extreme.
> I also find Julia's errors to be fairly obtuse. And I see a lot of them because dynamic typing means the tools can't catch most errors before run-time.
I 100% agree. Julia errors are my biggest gripe with the language at the moment.
> But the big dealbreaker with Julia is it only compiles on the fly. That means you can't just hand a compiled executable to someone. You must give them instructions to install Julia and all your dependencies first. That's no good for anything you want to ship like a desktop app or a game. You also don't want Julia on your server recompiling with every web request.
There are packages like PackageCompiler that work pretty well. And I'm positive in the near future (3 years?) we'll have a version of Julia where PackageCompiler will produce small binaries. That said, the convenience of writing in Julia and shipping a precompiled app is pretty awesome, and I personally don't mind it taking more space on disk.
> Like all compiled languages, the development cycle involves a lot of recompiling as you go. For small programs it doesn't matter, but this creeps up as the program grows -- or when you add a heavy dependency, like a plotting library. Julia suffers from this compiled-language drawback, but without the normal advantage of getting a compiled executable you could distribute.
This used to be one of my biggest gripes, but things have gotten a lot better with every version of Julia.
----
My first order approximation when picking a language is this:
1) if I think it'll be easy to write in Python, I should write it in Julia.
2) If I want to ship a precompiled binary that exposes a command line interface, I'll think about using Julia first, and if the size of the binaries is an issue, I might pick Rust.
wrt Nim, I'm waiting to see how the ORC story shakes out, for better documentation about ORC to appear on the scene, and for more packages to adopt it. I think Nim currently has too small a community. I've found that packages that are commonplace in Rust or Julia are just not even available in Nim. Sometimes a package is 5 years old and hasn't been updated. Yes, it is easy to write interfaces, but that still requires a lot of work. If it is a project where I'm writing code just by myself, Nim might be a good choice, but if I'm working with someone else, having them learn Nim is much harder. The error messages in Nim also need to get better, in my opinion.
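(For what it's worth, ORC has been selectable per build since Nim 1.6 and became the default memory manager in Nim 2.0; `myapp.nim` below is just a placeholder:)

    nim c --mm:orc -d:release myapp.nim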
> You don't have to decide between snake_case or camelCase. You can define your variable either way and both references will work. Ditto for most cases of capitalization. I thought this might be problematic, but in practice it's brilliant. I think that sentiment applies to many of Nim's unexpected design choices.
I personally don't like this at all. Every time I search a nim codebase, I have to use `nimgrep` instead of using `ripgrep`. I have SO many aliases built on top of things like `ripgrep` and `fzf` and none of them are certain to work in `Nim`. It's frustrating that the community is so divided on this, because even though I can see there are benefits to this approach, the benefits pale in comparison to getting user adoption and buy in to use the language.
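For context on what the rule actually covers: only the first character of an identifier is case-sensitive, and the rest is compared ignoring case and underscores. A minimal sketch (my own example, not from the article):

    proc rollDice(count: int): int = count * 6

    echo roll_dice(2)    # resolves to rollDice: case/underscores ignored after the first letter
    echo rollDice(2)     # same proc
    # echo Roll_Dice(2)  # error: the first letter is still case-sensitive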
One of the senior developers on my team agreed to learn Nim in their spare time as a favor to me, to evaluate it for a project at work, and as soon as they came across this "feature" they were so turned off that they wrote the language off as being too weird.
There are a LOT of average programmers out there, and a lot more analysts and data scientists who just want to get shit done, and from personal experience I think it's extremely hard to get adoption for Nim in the scientific community. If 7 out of 10 people have used Python, 4 out of 10 people may have heard of Julia and heard that it's new and modern, but 0 people have even heard of Nim. I've gotten SO many quizzical looks over the years, it is not even funny.
> I've gotten SO many quizzical looks over the years, it is not even funny.
May I ask what your involvement in the Nim community is? I don't recognize your username. Also, sorry to say, but if your colleagues look at you like that, maybe they don't respect you?
That said: Nim does look nice.