Many new languages are still recycling ideas from the 1970's research labs.
Outside affine types, all the praise for Rust's type system traces back to ML, which dates to the mid-1970s.
The heresy of doing systems programming in GC-enabled programming languages (GC in the CS sense, including RC) goes back to research at Xerox PARC, DEC, and ETHZ in the late 1970s and early 1980s.
Other things we know today, like dependent types, effects, formal proofs, capabilities, and linear and affine type systems, are similarly decades old, dating to the 1980s and early 1990s.
Unfortunately, while we have progressed, it seems easier to sell stuff like ChatGPT than better ways to approach software development.
Earlier than the sixties? All the elements of OOP were known before 1967, but their combination, which we still use today under the name OOP, appeared for the first time in Simula 67. I think the first appearance of a GC in the literature was in 1960.
I think the first mark-and-sweep collector was published in McCarthy's 1960 Communications of the ACM paper "Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I". It's reasonable to assume that they already had it when Steve Russell implemented the first Lisp evaluator, but we don't know exactly when it was added.
No, that paper doesn't describe the garbage collector. I do think it is true that before it was published Slug Russell had implemented the GC, but I think it's correct that we don't have listings from that early.
Edit: yes, yes it does describe the garbage collector.
Isn't the "Free-Storage List" in section 4c, starting on page 26, a mark & sweep collector? They didn't use the term then, but I think it describes one. I'm not aware of any earlier publication.
The paper only has 12 pages, but there is a section 4c with that title starting on page 192 of https://dl.acm.org/doi/pdf/10.1145/367177.367199, and you are correct that it describes a mark-and-sweep collector. I was mistaken, and I appreciate the correction.
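To make concrete what that section is describing, here is a minimal sketch of the mark-and-sweep idea in modern terms. This is not McCarthy's code; the names, the Vec-backed arena, and the helper functions are all my own illustration. Reachable cells are marked by following car/cdr pointers from the roots, and everything left unmarked is swept back onto the free-storage list.

```rust
// Minimal mark-and-sweep sketch over an arena of cons cells.
// Illustrative only: names and representation are not from the paper.

#[derive(Clone, Copy)]
enum Slot {
    Free, // on the free-storage list
    Cell { car: Option<usize>, cdr: Option<usize>, marked: bool },
}

struct Heap {
    slots: Vec<Slot>,
    free: Vec<usize>, // indices of free cells
}

impl Heap {
    fn new(capacity: usize) -> Heap {
        Heap { slots: vec![Slot::Free; capacity], free: (0..capacity).collect() }
    }

    fn alloc(&mut self, car: Option<usize>, cdr: Option<usize>) -> usize {
        let idx = self.free.pop().expect("out of cells: collect first");
        self.slots[idx] = Slot::Cell { car, cdr, marked: false };
        idx
    }

    // Mark phase: follow pointers from a root, marking every reachable cell.
    fn mark(&mut self, idx: usize) {
        let slot = self.slots[idx];
        if let Slot::Cell { car, cdr, marked } = slot {
            if marked {
                return;
            }
            self.slots[idx] = Slot::Cell { car, cdr, marked: true };
            if let Some(c) = car { self.mark(c); }
            if let Some(c) = cdr { self.mark(c); }
        }
    }

    // Sweep phase: unmarked cells go back on the free list,
    // marked cells are unmarked for the next collection.
    fn sweep(&mut self) {
        for (i, slot) in self.slots.iter_mut().enumerate() {
            match slot {
                Slot::Cell { marked, .. } if *marked => *marked = false,
                Slot::Cell { .. } => {
                    *slot = Slot::Free;
                    self.free.push(i);
                }
                Slot::Free => {}
            }
        }
    }

    fn collect(&mut self, roots: &[usize]) {
        for &r in roots {
            self.mark(r);
        }
        self.sweep();
    }
}

fn main() {
    let mut heap = Heap::new(8);
    let a = heap.alloc(None, None);
    let root = heap.alloc(Some(a), None);  // root -> a
    let _garbage = heap.alloc(None, None); // reachable from nothing
    heap.collect(&[root]);
    println!("free cells after collect: {}", heap.free.len()); // 6
}
```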
Yes, but ideas are (mostly) worthless. I mean, they are necessary, but that's the easy part; building the technical foundations that make them possible is the hard part.
The internet needs wires and routers, distributed computing needs a good network (i.e. the internet), current-day AI needs GPUs, and GPUs need silicon chips that defy the laws of physics. Really, looking at the EUV lithography process makes all of computer science feel insignificant by comparison; everything about it is absurd.
The real progress is that now we can implement the ideas from the 70s, the good ones at least. I don't want to diminish the work of the founders of computer science, there is real genius there, but out of the billions of people on this planet, individual geniuses are not in short supply. The real progress comes from the millions of people who worked on the industrial complex, supply chains, and trade that led to modern GPUs, among everything else that defines modern computing.
"Fully specified" is the important part here, and the work that went into making the idea fully specified is where the value lies.
For this, we could look at intellectual property law. Ideas are not protected: not by patents, nor copyright, nor trademark. If you want to make your idea worth something in the eyes of the law, you have to "fully specify" it, turning it into an invention (patent) or code (copyright).
Aside from that one groundbreaking feature, Rust doesn't bring in any other groundbreaking changes. It could be argued that bringing lesser-known features to more people is a good thing in its own right.
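For readers unfamiliar with the affine-types feature being referred to, here is a minimal sketch of what it means in practice: a value of a non-Copy type can be consumed at most once, and the compiler rejects a second use. The names here are purely illustrative.

```rust
// Minimal illustration of affine typing in Rust: a non-Copy value can be
// consumed at most once. `Token` and `consume` are made-up names.
struct Token(String);

fn consume(t: Token) -> usize {
    // Taking `t` by value moves it into this function, so the caller
    // can no longer use it afterwards.
    t.0.len()
}

fn main() {
    let t = Token(String::from("one-shot"));
    let n = consume(t); // `t` is moved here
    println!("{n}");
    // A second use would be rejected at compile time:
    // let m = consume(t); // error[E0382]: use of moved value: `t`
}
```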