
> From what I understand compiler frontends can be augmented to solve all of the problems Rust is trying to solve.

This is not correct. If it were this easy, we would have done it!



Then perhaps correct me on what problems Rust is specifically trying to solve? As far as I can tell, Rust stands on the shoulders of the LLVM backend: there are existing frontends, like C and C++, but Rust started from scratch.

My question is for what purpose? The problems we face today syntactically relate to intuitively expressing global superword-level parallelism to fully leverage AVX-512. No language is currently sufficient in this regard. Was Rust's syntax designed to maximally inform the autovectorizer in the backend to do this? That would be a worthy goal for this kind of investment. Otherwise it's wheel reinvention to me, and I feel Rust is being pushed really, really hard.


Writing vectorizable code is certainly important for some use cases, but it’s not relevant for the majority of programming tasks, even in systems/infra programming. Most things outside of numerical work just aren’t bound by the effective width of the core backend.

Rust doesn’t have to solve every problem in order to be useful. The major problem it does solve (relative to C++) is explicit compile-time tracking and enforcement of the scope of object validity, preventing (or at least making less likely) a class of bugs including use-after-free, uninitialized reads, buffer overflows, dangling pointers, and so on — without requiring a garbage collector.

This C++ code, a real mistake I’ve seen in the wild, would be totally impossible to write in Rust without explicitly using the “unsafe” keyword:

    SomeType* foo()
    {
        SomeType x;    // local with automatic storage duration
        return &x;     // returns a dangling pointer: x dies here
    }
None of this has to do with vectorization, but it’s still important and useful for many tasks.
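For comparison, here is a sketch of the Rust equivalent (using a placeholder `SomeType` of my own invention). The direct translation is rejected at compile time; the idiomatic version hands ownership of the value to the caller instead.

```rust
// Placeholder standing in for the C++ SomeType.
struct SomeType {
    value: i32,
}

// The direct translation does not compile: `x` is dropped at the end of
// the function, so the returned reference would dangle.
//
// fn foo() -> &SomeType {
//     let x = SomeType { value: 42 };
//     &x  // error: cannot return reference to local variable `x`
// }

// The idiomatic fix: move ownership of the value out to the caller.
fn foo() -> SomeType {
    SomeType { value: 42 }
}

fn main() {
    let result = foo();
    println!("{}", result.value);
}
```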

(As an aside: I would be curious to know if there are any languages that make it easier to write auto-vectorizing compilers. Personally I’ve never seen numeric tight loop kernels written without explicit use of architecture-specific intrinsics and/or raw assembly language. But I’ve only worked on that sort of code for at most a few months in my career, so I’d be happy to learn if there’s research in this area.)


> Explicit compile-time tracking and enforcement of the scope of object validity, preventing (or at least making more unlikely) a class of bugs including use-after-free, uninitialized reads, buffer overflows, dangling pointers.

Why does an entirely new syntax have to be invented from the ground up to solve this problem? Does C-like syntax not adequately inform a compiler that wants to implement something like Rust's borrowing? It appears to work for `std::unique_ptr` and `std::move` semantics. It doesn't appear that Rust's syntax conveys an order of magnitude more information. Minor augmentations to C/C++, taking it in some other direction that fixes whatever shortcomings exist for next-generation static analysis tools, could be the focus of effort instead. I don't see why Rust first has to invent the universe.

Here's an example of the overhead effort required by Rust: https://github.com/rust-vmm/kvm-bindings. This has already been implemented in C and distributed with most operating systems. Why wrap it? Now when KVM gets new features, Rust has to update its wrapper interface too. Isn't that just a propagation delay?


> Why does an entirely new syntax have to be invented from the ground up to solve this problem?

I don’t know. I’m not on the core Rust team.

Whether it’s theoretically possible to write static analyzers that implement Rust-like guarantees in C-like languages seems a bit irrelevant to me, since in the real world, none actually exist. Rust does.

I will happily continue using Rust, and happily reevaluate that choice if these next-generation static analyzers ever materialize.

(Also: personally, understanding Rust syntax is actually much easier for me than trying to remember what && means in various contexts, the difference between std::move and std::forward, the difference between decltype(foo) and decltype((foo)), and various other C++ gotchas. But YMMV.)


> Whether it’s theoretically possible to write static analyzers that implement Rust-like guarantees in C-like languages seems a bit irrelevant to me, since in the real world, none actually exist. Rust does.

Actually, MSVC has had a lifetime tracker for a year now: https://devblogs.microsoft.com/cppblog/lifetime-profile-upda...

And Clang's is progressing nicely: https://llvm.org/devmtg/2019-04/slides/TechTalk-Horvath-Impl...


Both of those are much newer than Rust. So it appears Rust was valuable after all, even if all it did was influence C++ static analysis developers!

I will also point out that these are still much less than what Rust offers — read through the list of caveats in the MSVC post. Or the fact that “analysis is only function-local” in Clang...

Oh, and for these to catch everything you would need to only depend, transitively, on things that also used them. Just like in Rust you need to only depend transitively on things that don’t abuse `unsafe`, but that seems less unreasonable to me.


> Or the fact that “analysis is only function-local” in Clang...

This is not inherently a problem; Rust's analysis is also function-local, it's actually a design goal.
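A rough illustration of why function-local analysis suffices in Rust (example names are my own): lifetime information is part of every function signature, so each function body can be checked against its callees' signatures alone, without whole-program analysis.

```rust
// The signature declares that the returned reference lives no longer
// than the two inputs. Callers are checked against this contract alone;
// the body of `longest` never needs to be re-analyzed at call sites.
fn longest<'a>(a: &'a str, b: &'a str) -> &'a str {
    if a.len() >= b.len() { a } else { b }
}

fn main() {
    let s1 = String::from("borrow");
    let s2 = String::from("checker");
    // Verified locally: both borrows outlive the call.
    println!("{}", longest(&s1, &s2));
}
```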

That said, you're 100% right about the issues here: this does not guarantee memory safety, and it is far less than what Rust provides. Still, I'm happy it exists; anything that makes stuff better is a win.


Thanks; you’re right, of course. I think I was slightly misinterpreting the meaning of “function-local” when I originally wrote that.


As someone who wrote a precursor to that code (I work on crosvm): every language other than C needs binding code written for KVM (or any kernel interface, for that matter). Are you suggesting that the only language that should get to use KVM is C (and other languages that can consume C headers)?

Additionally, when KVM gets new features, the hard part will not be adding more bindings to that crate; it will be utilizing that feature, which would have to happen in any language.


> Are you suggesting that the only language that should get to use KVM is C

What I'm asking is why Rust can't read the C header. I know that sounds like a ridiculous question, but I'm asking it honestly and seriously.
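For context on why a crate like kvm-bindings exists at all: the Rust compiler contains no C preprocessor or C parser, so C struct and function declarations have to be re-expressed in Rust syntax, typically machine-generated with bindgen. A hand-written sketch with made-up type and field names (not the real KVM layout):

```rust
// Hypothetical mirror of a C struct; #[repr(C)] forces the C-compatible
// field layout that the kernel ABI expects.
#[repr(C)]
struct KvmRegsExample {
    rax: u64,
    rbx: u64,
    rip: u64,
}

// Hypothetical mirror of a C function declaration. The Rust compiler
// only needs the signature in Rust syntax; nothing in the C header can
// supply it directly.
//
// extern "C" {
//     fn ioctl(fd: i32, request: u64, ...) -> i32;
// }

fn main() {
    // Layout matches what the C side expects: three u64 fields, no padding.
    assert_eq!(std::mem::size_of::<KvmRegsExample>(), 24);
    println!("ok");
}
```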


> Does C-like syntax not adequately inform a compiler that wants to implement something like Rust's borrowing? It appears to work for `std::unique_ptr` and `std::move` semantics.

It does not. Note that Rust's move semantics are different from C++'s: Rust has what C++ folks call "destructive move."
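A minimal sketch of the difference: in C++, a moved-from object still exists in a valid-but-unspecified state and can be used again without a diagnostic; in Rust, any use of a moved-from value is a hard compile error. (Rust shown here; the commented line is the part that would not compile.)

```rust
fn consume(s: String) -> usize {
    s.len() // takes ownership; `s` is dropped when this function returns
}

fn main() {
    let msg = String::from("hello");
    let n = consume(msg); // ownership of `msg` moves into `consume`
    println!("{}", n);

    // In C++, touching a moved-from object compiles fine. In Rust, the
    // equivalent is statically rejected:
    // println!("{}", msg); // error[E0382]: borrow of moved value: `msg`
}
```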


>This has already been implemented in C and distributed with most operating systems. Why wrap it?

Because the wrapper will offer Rust features that C failed to implement for almost half a century.


> Then perhaps correct me on what problems Rust is specifically trying to solve?

Here's the very first presentation about Rust: http://venge.net/graydon/talks/intro-talk-2.pdf

There are a few different goals, but one of the largest ones is memory safety. C and C++ are not memory safe, but Rust is. Memory safety is impossible to retrofit onto these languages, and so, you need a new language.

> As far as I can tell Rust stands on the shoulders of the LLVM backend- there are existing frontends, like C and C++ but Rust started from scratch.

Implementations are irrelevant; the issue is semantics.


> Implementations are irrelevant

And that's why Rust started from scratch. I totally understand the problem and the justification to address it. Yet nobody has ever been able to justify why it's necessary to dispense with the entirety of C and C++ to achieve memory safety. The power of frameworks like LLVM gives you freedom to implement whatever semantics you want. Rust wouldn't have gotten off the ground if it weren't for that kind of power. I feel this power has been abused. For example, you could create a fork of C++ that merely compile-errors when you use a pointer and not a reference unless you enclose it in some `unsafe` block that you invented -- anything you want like that is possible. One could simply start with C++ augmented so that everything is const by default; that is the ideal progress to me.

There's still no justification for Rust to be completely Hausdorff to the last 50 years of established C systems and start over. If Rust doesn't really offer anything substantive for this amount of divergence, I feel it will always be relegated to wrappers and a niche community.

That just makes Rust a simulacrum.


> Yet nobody has ever been able to justify why it's necessary to dispense with the entirety of C and C++ to achieve memory safety.

As soon as you make a backwards incompatible change, you've already made a new language, effectively. Because:

> For example, you could create a fork of C++ that merely compile-errors when you use a pointer and not a reference unless you enclose it in some `unsafe` block that you invented

This means that effectively all C++ code would fail to compile under this implementation.

> One can simply start with C++ augmented so that everything is const by default; that is the ideal progress to me.

This is nowhere near enough to achieve the goal. Sure, it's a thing someone could do. But it's kind of irrelevant.


> This means that effectively all C++ code would fail to compile under this implementation.

You're implying that all C++ code uses pointers, and that's patently false. It's even false for C code. Pointers were the first thing C++ boxed with references. Box them more. Forks of C++ even exist, like the GNU++ extensions. There's a whole spectrum of possibilities between that and where you are right now, and I don't see why everyone needs to cross this chasm and embrace Rust at this cost.

Breaking one part of the language doesn't break everything for everyone. That's definitely a fallacy if it couches the justification I'm chasing here.


> For example, you could create a fork of C++ that merely compile-errors when you use a pointer and not a reference unless you enclose it in some `unsafe` block that you invented -- anything you want like that is possible. One can simply start with C++ augmented so that everything is const by default; that is the ideal progress to me.

But then it wouldn't be C++, it would be something else. You would very quickly end up with a language incompatible with C or C++ if you set off to implement the safety mechanisms of Rust on top of them.

I mean, look at how long it took to get modules into C++, and that's a feature that is opt-in and won't change the semantics of existing code. You can't "simply start with C++" for anything.

> There's still no justification for Rust to be completely Hausdorff to the last 50 years of established C systems and start over.

C and C++ aren't static either. They're evolving too, and modern iterations of the languages aren't compatible with, or even recognizable from, the earlier incarnations.

(Is the big hangup here that Rust puts the type on the right side of the variable name?)

> If Rust doesn't really offer anything substantive for this amount of divergence I feel it will always be relegated to wrappers and a niche community.

If you're coming from a background of C++ development, it's not that hard to pick up Rust. The borrow checker can be a bit of a pain, but it isn't mysterious; it's got a well-defined set of rules it plays by. There's a lot more to Rust than memory safety: it has a well-laid-out feature set that is refreshing and probably not implementable in C++ in my lifetime. The trait-based generics are great, the metaprogramming facilities are great, pattern matching is killer, that (nearly) everything is an expression is something C and C++ desperately need, and the out-of-the-box tooling is phenomenal.
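A toy example of some of those features (my own illustration, not from the thread): trait-based generics, exhaustive pattern matching, and `match` as an expression.

```rust
use std::fmt::Display;

// Trait-based generics: works for any type that implements Display.
fn label<T: Display>(item: T) -> String {
    format!("value: {}", item)
}

enum Shape {
    Circle { radius: f64 },
    Rect { w: f64, h: f64 },
}

// `match` is an expression, and the compiler checks exhaustiveness:
// forgetting a variant is a compile error, not a runtime surprise.
fn area(s: &Shape) -> f64 {
    match s {
        Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
        Shape::Rect { w, h } => w * h,
    }
}

fn main() {
    println!("{}", label(42));
    println!("{}", area(&Shape::Rect { w: 3.0, h: 4.0 }));
}
```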

I get that Rust isn't for everyone, but painting it as irrelevant and unnecessary is too extreme, and arguably wrong if it's inspiring linting tools that bring some of Rust's rules to the C/C++ world, or inspiring language features.

On a side note, what do you think of Apple moving away from Objective C toward Swift?


So my running conclusion is that Rust's founding decisions were made with a sort of "go big or go home" logic given this premise. To conclude that breaking even one part of C/C++ is grounds to throw up one's hands and throw the baby out with the bathwater makes sense through this prism. At that point one might as well pore over the literature and academic languages to pick out all the goodies and gimmicks that haven't been mainstreamed before, too.

The problem here is that Rust is not a graceful replacement for C/C++. I'm being sold on Rust from a systems perspective, and the system is already written in C. C is a simple and incomplete language begging for extension, but it's done a good job of abstractly representing commodity computing hardware to a programmer for 50 years. Rust doesn't even make an attempt to gracefully regrade C/C++ into a Safe-C++, and I can't come to any conclusion other than that this is for nothing more than a lack of creativity.

There is an ignorance to the real costs of reinventing the wheel when one doesn't replace the original wheel too. That's why I am starting to truly convince myself Rust is actually harmful. If Rust gracefully took C in another direction (like C++ did), I could add Rust compilation units to my projects and share C interfaces, which would basically give me access to the entire existing universe so long as I used something like `extern "Rust" {`, or vice versa. If Rust broke ~10% of C++, I could put forth ~10% effort to port my projects to it and integrate it entirely. At a certain point down that vector, Rust could be the messianic "C++ done right" without looking back.
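For what it's worth, something close to the `extern` mixing described above does exist today: Rust can both import and export C-ABI symbols, so Rust compilation units can share C interfaces with an existing C codebase. A sketch of the exporting direction (function name is my own):

```rust
// `extern "C"` gives the function the platform's C calling convention;
// adding #[no_mangle] would also pin the symbol name so a C caller
// could declare it as:  int32_t add_checked(int32_t a, int32_t b);
pub extern "C" fn add_checked(a: i32, b: i32) -> i32 {
    a.wrapping_add(b)
}

fn main() {
    // Callable from Rust too; the ABI annotation only affects linkage.
    println!("{}", add_checked(2, 3));
}
```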


With the introduction of modules in C++20 there is now an intriguing possibility of implementing a new/cleaned-up language with compatible object model/ABI and using modules to interface with existing C++ and even C (via header units).



