Also, C++ could be adopted incrementally by C developers. You could use it as “C with classes”, or just use operator overloading to make vector math more tolerable, or whatever subset you happened to like.
So there’s really three forces at play in making C++ the standard:
1) The Microsoft ecosystem. They literally stopped supporting C by not adopting the C99 standard in their compiler. If you wanted any modern convenience, you had to compile in C++ mode. New APIs like Direct3D were theoretically accessible from C (via COM) but in practice designed for C++.
2) Better compilers and more CPU cycles to spare. You could actually count on the compiler to do the right thing often enough.
3) Seamless gradual adoption for C developers.
Rust has a good compiler, but it lacks that big-ticket ecosystem push and is not entirely trivial for C++ developers to adopt.
Repo contributor here, just to curb some expectations a bit: it's one very smart guy (Kenny), his unpaid volunteer sidekick (me), and a few unpaid external contributors. (I'm trying to draw a line between those with and without commit access, hence all the edits.)
There's no other internal or external Microsoft /support/ that I'm aware of. I wouldn't necessarily use it as a signal of the company's intentions at this time.
That said, there are Microsoft folks working on the Rust compiler, toolchain, etc. side of things too. Maybe those are better indicators!
Don't be; they also killed C++/CX, and even went to CppCon 2016 to tell us what a great future C++/WinRT would bring us.
Now, almost a decade later, VS tooling is still not there, stuck in an ATL/VC++ 6.0-like experience (they blame it on the VS team); C++/WinRT is in maintenance mode, getting only bug fixes, and all the fun is on Rust/WinRT.
I would never trust this work for production development.
The most infuriating thing is their habit of rebuilding things just about the time they reach a mature and highly stable state, creating an entirely new unstable and unreliable system. And then, by the time that new system almost reaches a stable state, it's scrapped and it all starts over again.
WPF -> UWP -> WinUI -> WinUI 2 -> WinUI 3 is just such a ridiculous chain. WPF was awesome, highly extensible, and could easily and modularly have been extended indefinitely, while also maintaining its widespread (if unofficial) cross-platform support and its general rock-solid performance/stability. Instead it's the above pattern over and over and over.
And now it seems WinUI 3 is also dead, alas without even bothering with a replacement. Or maybe that's Xamarin, wait, I mean MAUI? Not entirely joking: I never bothered to follow that seemingly completely parallel system doing pretty much the same things. On the bright side, this got me to finally migrate away from Microsoft UI solutions, which has made my life much more pleasant since!
I'd say the inertia is far more social than codebase-size related. Right now, whilst there are pockets of interest, there is no broader reason to switch. Bevy as the leading contender isn't going to magic its way to being capable of shipping AAA titles unless a studio actually adopts it. I don't think it's actually shipped a commercially successful indie game yet.
Also game engines emphatically don't have to be huge. Look at Balatro shipping on Love2d.
> Also game engines emphatically don't have to be huge. Look at Balatro shipping on Love2d.
Balatro convinced me that Love2D might be a good contender for my next small 2D game release. I had no idea you could integrate Steamworks, or 2D shaders that look that good, into Love2D. And it seems to be very cross-platform, since Balatro released on pretty much every platform on day 1 (with, it seems, some porting help from a third-party developer).
And since it's Lua based, I should be able to port a slightly simpler version of the game over to the Playdate console.
There’s a pretty big difference between the Playdate and anything else, in performance but also in asset requirements. So much so that I hope your idea is scoped accordingly. But yeah, Love2d is great.
It is. I've already half-ported one of my games to the Playdate (and own one), so I'm pretty aware of its capabilities.
The assets are what I struggle with most. 1-bit graphics that look halfway decent are a challenge for me. In my half-ported game, I just draw the tiles programmatically, like I did in the Pico-8 version (and they don't look anywhere near as good as a lot of Playdate games, so I need to someday sit down and try to get some better art into it).
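(Tangent for the curious: "drawing tiles programmatically" on the Playdate looks roughly like the sketch below. It uses the Playdate C API's fillRect/drawLine calls; the 16x16 brick pattern itself is made up for illustration.)

    #include "pd_api.h"

    /* A 16x16 1-bit "brick" tile drawn with primitives instead of a sprite
       sheet. The Playdate screen is strictly black/white, so any shading has
       to come from patterns, like the staggered mortar lines here. */
    static void draw_brick_tile(PlaydateAPI* pd, int x, int y) {
        pd->graphics->fillRect(x, y, 16, 16, kColorWhite);
        /* horizontal mortar lines every 8 pixels */
        pd->graphics->drawLine(x, y + 7,  x + 15, y + 7,  1, kColorBlack);
        pd->graphics->drawLine(x, y + 15, x + 15, y + 15, 1, kColorBlack);
        /* vertical joints, staggered between rows so they read as bricks */
        pd->graphics->drawLine(x + 4,  y,     x + 4,  y + 6,  1, kColorBlack);
        pd->graphics->drawLine(x + 11, y + 8, x + 11, y + 14, 1, kColorBlack);
    }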
Speaking as a Godot supporter, I don't think sales numbers of shipped games are relevant to anyone except the game's developer.
When evaluating a newer technology, the key question is: are there any major non-obvious roadblocks? A finished game (with presumably decent performance) tells you that if there are problems, they're solvable. That's the data.
Game engines are tools, not fan clubs. It’s reasonable to judge them on how well they do the job they’re designed for. To someone who cares about the commercial viability of their technology choices, this is a small but positive signal.
What it tells me is someone shipped something and it wasn’t awful. Props to them!
Or maybe Rust allowed them to develop twice as fast. Who knows? We're going by data here, and this data point shows that games can be made in Bevy. No more and no less.
So far I am way less productive in Rust than in any language I've ever used for actual work, so rewriting an entire game engine would seem like commercial suicide.
I was the same the first two times I tried to use Rust (earnestly). However, one day it just "clicked", and now my productivity exceeds that of almost anything else for the specific type of work I'm doing (scientific computation).
I think we shouldn't expect any language to lead different programmers to the same experiences. Rust has the initial steep learning curve, and after that it's a matter of taste whether one is willing to forge on and turn it into a honed tool. Also, I think it's clear that Rust excels in some fields far more naturally than in others. Making blanket statements about how Rust, or any language, is (un)productive is a disservice to everyone.
Thanks for the link. This one was also posted a while back in a Rust comment, and when I first read it, I thought Google had used Rust in the V8 sandbox. But re-reading it, it seems the article uses Rust as an ‘example’ of a memory-safe language and does not explicitly say that it was used. Maybe someone with more knowledge can confirm whether Rust was (or was not) used in the V8 Google Chrome sandbox example.
"Theoretically accessible" describes the experience of trying to use D3D from C very well!
I was trying to use it with some flavor of GCC for Windows. The C++ support was still lacking some required features, so the advice was to use D3D from C instead of C++. There were some helper macros, but overall I was glad when Microsoft started releasing their Express (and later Community) editions of Visual Studio.
I access D3D(11) from C in my libraries, and tbh it's not any different from C++ in terms of usability (the only difference is that the "this" argument and vtable indirection are implicit in C++, but that's just syntax sugar that can be wrapped in a macro in C).
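To make that concrete, here's a minimal sketch of what such a call looks like in C, both as a raw vtable call and via the COBJMACROS helper macros that the D3D headers provide (the function name and variables are made up for illustration):

    /* Compile as C on Windows; defining COBJMACROS before including the
       header enables the ID3D11..._Method() convenience macros. */
    #define COBJMACROS
    #include <d3d11.h>

    void clear_render_target(ID3D11DeviceContext* ctx, ID3D11RenderTargetView* rtv) {
        const FLOAT black[4] = { 0.0f, 0.0f, 0.0f, 1.0f };

        /* Raw COM in C: the vtable and the "this" pointer are explicit... */
        ctx->lpVtbl->ClearRenderTargetView(ctx, rtv, black);

        /* ...or the same call through the SDK's helper macro, which expands
           to exactly the line above. In C++ this would simply be
           ctx->ClearRenderTargetView(rtv, black); */
        ID3D11DeviceContext_ClearRenderTargetView(ctx, rtv, black);
    }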