bakkoting's comments | Hacker News

Neat! I wonder how slow this would be running in wasm. In my dream world it would also use WebGPU but that's a much bigger lift.


The README says it's optimized for Metal. If it really is using Metal compute shaders, the programming model is apparently fairly similar to WebGPU's. You could try asking Claude to translate it :)


You're thinking of a previous report from a month ago, #897 or #481, or the one from two weeks ago, #728. There's a new one from a week ago, #205, which is genuinely novel, although it is still a relatively "shallow" result.

Terence Tao maintains a list [1] of AI attempts (successful and otherwise). #205 is currently the only success in section 1, the "full solution for which subsequent literature review did not find new relevant prior partial or full solutions" section - but it is in that section.

As to speed, as far as I know the recent results are all due to GPT-5.2, which is barely a month old, or Aristotle, which is a system built on top of some frontier LLMs and which has only been accessible to the public for a month or two. I have seen multiple mathematicians report that GPT-5.2 is a major improvement in proof-writing, e.g. [2]

[1] https://github.com/teorth/erdosproblems/wiki/AI-contribution...

[2] https://x.com/AcerFur/status/1999314476320063546


Thanks for the wiki link, very interesting, in particular

- the long tail aspect of the problem space: 'a "long tail" of under-explored problems at the other, many of which are "low hanging fruit" that are very suitable for being attacked by current AI tools'

- the expertise requirement: literature review, but also 'Do I understand what the key ideas of the solution are, and how the hypotheses are utilized to reach the conclusion?' So basically one must already be an expert (or able to become one) to actually use this kind of tooling

and finally the outcomes, which, taking the previous two points into account, are very different from what most people would assume "AI contributions" to look like.


Formally, JavaScript is specified as having TCO as of ES6, although for unfortunate and painful reasons this is spec fiction - Safari implements it, but Firefox and Chrome do not. Neither did QuickJS last I checked, and I don't think this does either.
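For illustration, a minimal sketch (hypothetical function name) of the kind of call the spec's proper-tail-call rules cover - in a conforming engine it runs in constant stack space, while in engines without TCO it blows the stack for large enough n:

    'use strict'; // ES2015 proper tail calls only apply in strict mode
    function countdown(n) {
      if (n === 0) return 'done';
      return countdown(n - 1); // tail position: the spec says to reuse the frame
    }
    // Completes in Safari (JavaScriptCore); throws a RangeError in
    // Chrome and Firefox, which never shipped proper tail calls.
    countdown(1e6);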


ES is now ES2025, not ES6/2015. There are still platforms that don't even fully implement enough to shim out ES5 completely, let alone ES6+. Portions of ES6 require buy-in from the hosting/runtime environment that isn't even practical in some environments... so I feel the statement itself is kind of ignorant.


Have you seen the YoWASP toolchain for VSCode [1]? It sounds pretty similar.

[1] https://github.com/YoWASP/vscode


Yes! YoWASP is fantastic. In fact, that extension came to be after we contracted the dev to create NPM packages for the WebAssembly bundles they're maintaining. We use the exact same bundles if the extension detects that it is running in a browser (or if the user explicitly wants to use them). However, if possible we prefer to download and maintain native tool bundles for performance reasons.

Their VSCode extension is a lot more basic than ours, but it might be more suitable for advanced users. It's basically just a wasm tool runner that you pass command line options into, whereas we also include things such as project management and various visualization options. Which one to use depends on what your needs are, really.


And there's an open issue for that already: https://github.com/bearcove/arborium/issues/62


This hasn't been true since version 5.4.2, released in 2017.

`npm install` will always use the versions listed in package-lock.json unless your package.json has been edited to list newer versions than are present in package-lock.json.

The only difference with `npm ci` is that `npm ci` fails if the two are out of sync (and it deletes `node_modules` first).


Very few packages published on npm include polyfills, especially packages you'd use when doing local scripting.


I'm sorry, but this is just incorrect. Have you ever heard of ljharb[0]? The NPM ecosystem is rife with polyfills[1]. I don't know how you can draw a distinction about which libraries would be used for "local scripting", as I don't think many library authors make that distinction.

[0] - TC39 member who is self-described as "obsessed with backwards compatibility": https://github.com/ljharb

[1] - Here's one of many articles describing the situation: https://marvinh.dev/blog/speeding-up-javascript-ecosystem-pa...


Yes. I'm on TC39 as well, and I've talked to Jordan about this topic.

It's true that there are a few people who publish packages on npm including polyfills, Jordan among them. But these are a very small fraction of all packages on npm, and none of the compromised packages were polyfills. Also, he cares about backwards compatibility _with old versions of node_; the fact that JavaScript was originally a web language, as the grandparent comment says, is completely irrelevant to the inclusion of those specific polyfills.

Polyfills are just completely irrelevant to this discussion.


Fair enough. Thank you for the clarification, and I apologize for not recognizing your status as a TC39 member.


If you look at the list of compromised packages, very few of them could reasonably be included in a standard library. It's mostly project-specific stuff like `@asyncapi/specs` or `@zapier/zapier-sdk`. The most popular generic one I see is `get-them-args`, which is a CLI argument parser - something Node has had built in as `util.parseArgs` since v16.17.0.
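For reference, a minimal sketch of what the built-in parser looks like (the option names here are purely illustrative):

    // Illustrative only: parse `node script.js --verbose --name foo bar`
    // with Node's built-in util.parseArgs (added in v18.3.0, backported to v16.17.0).
    const { parseArgs } = require('node:util');

    const { values, positionals } = parseArgs({
      options: {
        verbose: { type: 'boolean', short: 'v' },
        name: { type: 'string' },
      },
      allowPositionals: true,
    });

    console.log(values.verbose, values.name, positionals); // true 'foo' [ 'bar' ]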


Well, they clearly lacked marketing? Pretty sure red text from npm every time that package was installed saying "hey, we have a better way to do this with Node alone" would have made a dent in the library's usage, but they didn't do anything of the sort.


I don't think there are literally any conforming implementations of modern ECMAScript by that definition.


If you mean "latest" ECMAScript, then that is true. Even the latest gcc or clang does not support all features from C++23: https://en.cppreference.com/w/cpp/compiler_support.html.


Google mostly does obey web standards that are set by an industry consortium (WHATWG, W3C, or, in the case of JavaScript, Ecma).

Chrome has the best compliance with standards of any of the big three (see wpt.fyi) - which is not surprising, because they also have the most engineering time dedicated to their browser, and the most people working on standards.

These bodies require buy-in from multiple vendors, but generally not unanimity. That said, browsers can and do ship things which haven't been standardized (e.g. WebUSB, which is still only a draft because only Chrome wants to ship it). In a lot of cases this pretty much has to happen pre-standardization, because it is difficult to come up with a good standard from the ivory tower with no contact with actual use. Chrome is unusually good about working in public to develop specifications for such features even when other browsers aren't currently interested in shipping them.

I don't know what problem you think this proposal would solve.


> Chrome is unusually good about working in public to develop specifications for such features even when other browsers aren't currently interested in shipping them.

That is, if there's a promotion, or a company bet, or a need to establish/secure market dominance for one property or another, Chrome dumps a scribble on a napkin, barely engages in any conversation, and ships to production within a few weeks of dumping said scribble.

Once it's out there, it couldn't care less what other browser vendors say. Dominant market share and an army of developers who never bothered to learn about standards processes will make sure that this is now a standard.

