
It feels a bit like the field of physics claiming the invention of AI, where we all know that mathematics and/or CS deserve the honor.


Someone changed the Wikipedia article today to call Hopfield a "physicist". Previously the article called him simply a scientist, because his main work wasn't limited to physics. I've changed it back; let's see if it holds up.


It’s ‘physicist’ again now.


User "ReyHahn" has changed it back to physicist. Justification: "he defines himself as physicist but he has worked i mnay fields"

https://en.wikipedia.org/w/index.php?title=John_Hopfield&dif...


To use the local lingo…

[citation needed]

I suppose some might argue that being awarded the Nobel Prize in Physics is enough to call yourself a physicist.

…it does have the unfortunate implication, however, that nominations need not be restricted to physicists at all since any winner becomes a physicist upon receipt of the prize.

It’s sort of like the No True Scotsman but inverted, and with physicists instead of Scotsmen.


The Nobel Committee doesn’t represent the field of physics. I talked to a few former colleagues (theoretical physicists) just now and every one of them found this bizarre.


I think the Nobel committee doesn't want any scientific advance to fall entirely outside the range of the awards.


>where we all know that mathematics and/or CS deserve the honor

Or semiconductor manufacturers.

All the math and CS needed for AI fits on a napkin and has been known for 200+ years. It's the extreme scaling enabled by semiconductor science that really makes the difference.


That's absurd. The computer science needed for AI has not been known for 200 years. For example, transformers were only invented in 2017, diffusion models in 2015.

(When the required math was invented is a different question, but I doubt all of it was known 200 years ago.)


TBF, backpropagation was introduced only in the 1970s, although in hindsight it's a fairly trivial application of the chain rule.
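
(A rough illustration of how trivial: below is a hedged, minimal sketch in plain Python of gradient descent on a single sigmoid neuron, with every gradient written out by hand as a chain-rule product. All names and numbers are illustrative.)

    # Backprop on one neuron y = sigmoid(w*x + b) with squared-error loss.
    # Everything below is the chain rule, applied link by link.
    import math

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    x, target = 0.5, 1.0
    w, b, lr = 0.1, 0.0, 0.5

    for step in range(100):
        z = w * x + b                  # forward pass
        y = sigmoid(z)
        loss = (y - target) ** 2

        dloss_dy = 2.0 * (y - target)  # d(loss)/dy
        dy_dz = y * (1.0 - y)          # sigmoid'(z)
        dloss_dz = dloss_dy * dy_dz    # chain rule
        w -= lr * dloss_dz * x         # dz/dw = x
        b -= lr * dloss_dz             # dz/db = 1

    print(w, b, loss)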

There were also plenty of "hacks" involved to make the networks scale, such as dropout regularization, batch normalization, semi-linear activation functions (e.g. ReLU), and adaptive stochastic gradient descent methods.

The maths for basic NNs is really simple, but the practice is really messy.
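
(To make the messiness concrete, here is a hedged PyTorch sketch stacking the tricks named above: batch normalization, ReLU, dropout, and Adam as the adaptive gradient method. Layer sizes, learning rate, and the dummy batch are arbitrary placeholders.)

    # Illustrative only: a tiny MLP wired up with the usual "hacks".
    import torch
    import torch.nn as nn

    model = nn.Sequential(
        nn.Linear(784, 256),
        nn.BatchNorm1d(256),   # batch normalization
        nn.ReLU(),             # semi-linear activation
        nn.Dropout(p=0.5),     # dropout regularization
        nn.Linear(256, 10),
    )
    # Adam: an adaptive stochastic gradient descent method.
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    x = torch.randn(32, 784)               # dummy batch
    targets = torch.randint(0, 10, (32,))
    loss = nn.functional.cross_entropy(model(x), targets)
    optimizer.zero_grad()
    loss.backward()                        # the chain rule, automated
    optimizer.step()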


Residual connections are also worth mentioning as an extremely ubiquitous adaptation. You'll be hard-pressed to find a modern architecture that doesn't use them at least to some extent, to the point where the original ResNet paper sits at over 200k citations according to Google Scholar [1].

[1] https://scholar.google.com/citations?view_op=view_citation&h...
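
(For anyone unfamiliar with the idea: a residual connection simply adds a block's input back to its output, so the layers only have to learn a correction to the identity. A hedged, minimal PyTorch sketch with made-up dimensions:)

    import torch
    import torch.nn as nn

    class ResidualBlock(nn.Module):
        """Computes y = x + F(x): the block learns the residual F."""
        def __init__(self, dim):
            super().__init__()
            self.f = nn.Sequential(
                nn.Linear(dim, dim),
                nn.ReLU(),
                nn.Linear(dim, dim),
            )

        def forward(self, x):
            return x + self.f(x)  # the skip connection itself

    x = torch.randn(8, 64)
    print(ResidualBlock(64)(x).shape)  # torch.Size([8, 64])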


Highway networks introduced them slightly earlier (2015), building on the gated skip connections of 1990s LSTMs.


> All the math and CS needed for AI can fit on a napkin, and had been known for 200+ years.

This isn't really true. Physics textbooks from the early 1900s didn't express multivariate calculus and linear algebra as concisely as we do now; it would take several napkins. Plus, statistical mechanics, which is important for the probability theory involved, was quite rudimentary.


> If you read a physics textbook from the early 1900s, they didn't really have multivariate calculus and linear algebra expressed as concisely as we do now.

This is completely incorrect.


I don’t think calculus existed with sufficient rigor 200 years ago.

Computer science wasn’t even a thing 100 years ago.


Calculus has been around for quite some time.

If Newton had the machinery to fit large models to data, he would have done so. No doubt.


Cauchy’s main work was under 200 years ago, and there’s been quite a lot of work since.

Again, I’m unsure that calculus existed at a sufficient level of rigor 200 years ago; it didn’t appear in its modern form from either Leibniz or Newton.



