
Because what we know about programming has progressed, and new languages appear to take advantage of that.


Many new languages are still recycling ideas from the 1970's research labs.

Outside affine types, all the praise for Rust's type system traces back to Standard ML from 1976.
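For readers unfamiliar with the term: an affine type is one whose values may be used at most once, and Rust's move semantics are the everyday face of this. A minimal sketch (names here are illustrative, not from any particular codebase):

```rust
// Affine use in Rust: a value that has been moved cannot be used again.
// `consume` takes ownership of its argument, so the caller gives it up.
fn consume(s: String) -> usize {
    s.len()
}

fn main() {
    let s = String::from("hello");
    let n = consume(s); // `s` is moved into `consume` here
    // println!("{}", s); // compile error E0382: use of moved value `s`
    println!("{}", n); // prints 5
}
```

Uncommenting the second `println!` makes the program fail to compile, which is the "at most once" guarantee enforced statically.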

The heresy of doing systems programming in GC-enabled programming languages (GC in the CS sense, including RC) goes back to research at Xerox PARC, DEC and ETHZ in the late 1970's and early 1980's.

Other things that we know, like dependent types, effects, formal proofs, capabilities, and linear and affine type systems, are equally a few decades old, dating from the 1980's and early 1990's.

Unfortunately, while we have progressed, it seems easier to sell stuff like ChatGPT than better ways to approach software development.


It seems like most things boil down to ideas from the 70's. The internet, distributed computing, AI, etc...


> most things boil down to ideas from the 70's

Rather from the sixties. E.g. OOP including dynamic dispatch, late binding, GC etc. appeared in 1967 with Simula.


GC was a bit earlier. :)


Earlier than the sixties? All elements of OOP were known before 1967, but their combination, which we still use today under the title OOP, appeared in Simula 67 for the first time. I think the first appearance of a GC in literature was in 1960.


Did Lisp have GC from the beginning? If so, that would be 1958.


I think the first mark-and-sweep collector was published in McCarthy's 1960 Communications of the ACM paper "Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I". It's reasonable to assume that they already had it when Steve Russell implemented the first Lisp evaluator, but we don't know exactly when it was added.


No, that paper doesn't describe the garbage collector. I do think it is true that before it was published Slug Russell had implemented the GC, but I think it's correct that we don't have listings from that early.

Edit: yes, yes it does describe the garbage collector.


Isn't the "Free-Storage List" in section 4c, starting on page 26, a mark & sweep collector? They didn't use the term then, but I think it describes one. I'm not aware of any earlier publication.


The paper only has 12 pages, but there is a section 4c with that title starting on page 192 of https://dl.acm.org/doi/pdf/10.1145/367177.367199, and you are correct that it describes a mark-and-sweep collector. I was mistaken, and I appreciate the correction.
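For readers who don't want to dig through the PDF: the "Free-Storage List" scheme in section 4c amounts to what we'd now call mark-and-sweep. A rough modern sketch of the idea (in Rust, with illustrative names; McCarthy's version of course operated on IBM 704 words, not structs):

```rust
// Minimal mark-and-sweep sketch over an arena of cons cells.
// Illustrative only; not a faithful transcription of the 1960 paper.

#[derive(Clone, Copy)]
enum Field {
    Nil,
    Ref(usize), // index of another cell in the arena
}

struct Cell {
    car: Field,
    cdr: Field,
    marked: bool,
}

struct Heap {
    cells: Vec<Cell>,
    free: Vec<usize>, // the free-storage list
}

impl Heap {
    fn new(size: usize) -> Heap {
        Heap {
            cells: (0..size)
                .map(|_| Cell { car: Field::Nil, cdr: Field::Nil, marked: false })
                .collect(),
            free: (0..size).collect(),
        }
    }

    // Allocate a cell off the free list.
    fn cons(&mut self, car: Field, cdr: Field) -> usize {
        let i = self.free.pop().expect("out of cells; run collect()");
        self.cells[i] = Cell { car, cdr, marked: false };
        i
    }

    // Mark phase: flag everything reachable from a root.
    fn mark(&mut self, i: usize) {
        if self.cells[i].marked {
            return;
        }
        self.cells[i].marked = true;
        let (car, cdr) = (self.cells[i].car, self.cells[i].cdr);
        if let Field::Ref(j) = car { self.mark(j); }
        if let Field::Ref(j) = cdr { self.mark(j); }
    }

    // Sweep phase: unmarked cells go back on the free list.
    fn collect(&mut self, roots: &[usize]) {
        for &r in roots {
            self.mark(r);
        }
        self.free.clear();
        for i in 0..self.cells.len() {
            if self.cells[i].marked {
                self.cells[i].marked = false; // reset for the next cycle
            } else {
                self.free.push(i);
            }
        }
    }
}

fn main() {
    let mut heap = Heap::new(4);
    let a = heap.cons(Field::Nil, Field::Nil);
    let root = heap.cons(Field::Ref(a), Field::Nil);
    let _garbage = heap.cons(Field::Nil, Field::Nil); // never referenced again
    heap.collect(&[root]);
    println!("free cells after collect: {}", heap.free.len()); // prints 2
}
```

Only `root` and the cell it points to survive the collection; the garbage cell and the never-allocated cell end up back on the free list.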


Ok, I see. Just downloaded the first hit from Google, which has 34 pages and is nicely printed: https://www-formal.stanford.edu/jmc/recursive.pdf.


The concept of automatic memory management dates back to the 1950s.

However, John McCarthy, the creator of Lisp, introduced the first widely recognized garbage collection mechanism around 1959/1960.


If so, he didn't write about it in that paper, and I don't think he introduced it at all; I think Slug Russell did.

Correction: as Rochus pointed out, he certainly did write about it in that paper, giving a complete description.


Indeed, everything old is new again.


Yes, but ideas are (mostly) worthless. I mean, they are necessary, but that's the easy part; building the technical foundation that makes them possible is the hard part.

The internet needs wires and routers, distributed computing needs a good network (i.e. the internet), current-day AI needs GPUs, and GPUs need silicon chips that defy the laws of physics. Really, looking at the EUV lithography process makes all of computer science feel insignificant by comparison; everything about it is absurd.

The real progress is that now we can implement the ideas from the 70s, the good ones at least. I don't want to diminish the work of the founders of computer science, there is real genius there, but out of the billions of people on this planet, individual geniuses are not in short supply. The real progress comes from the millions of people who worked on the industrial complex, supply chains and trade that led to modern GPUs, among everything else that defines modern computing.


By that measure, software is worthless as well, since it is basically fully specified ideas.


"Fully specified" is the important part here, and the work that went into making the idea fully specified is where the value lies.

For this, we could look at intellectual property laws. Ideas are not protected, neither by patents, nor copyright, nor trademark. If you want to make your idea worth something in the eyes of the law, you have to "fully specify" it, turning it into an invention (patent) or code (copyright).


Aside from its one groundbreaking feature, Rust doesn't bring in any groundbreaking changes. It could be argued that bringing lesser-known features to more people is a good thing in its own right.


Hey Walter, while we have you. Have you ever had the itch to make a newer D that takes all the fancy constructs while retaining the good parts?


We're doing just that by introducing an "Editions" feature that will streamline the feature set.



