Hacker News | nurumaik's comments

If you ban Grok, people will just generate with unlocked open Chinese models.

Also, this has always existed in one form or another: drawing, photoshopping, imagining, discussing imaginary intercourse with a popular person online or IRL.

It's not worth intervening over, because it will happen anyway and it doesn't fundamentally change much.


Nukes make a lot of difference, though; I wouldn't be so sure about Russia.

Minified JSON would use even fewer tokens

Yeah, but I tried switching to minified JSON on a semantic labelling task and saw a ~5% accuracy drop.

I suspect this happened because most of the pre-training corpus is pretty-printed JSON, so the LLM was forced off its likely path and also lost all the "visual cues" of nesting depth.

This might happen here too, but maybe to a lesser extent. Anyways, I'll stop building castles in the air now and try it sometime.
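For concreteness, here's the difference between the two formats (character count is only a rough proxy for token count; real savings depend on the model's tokenizer):

```javascript
// Rough illustration: minified vs pretty-printed JSON.
// The record here is just a made-up example.
const record = { user: { id: 42, name: "Ada", tags: ["a", "b"] } };

const pretty = JSON.stringify(record, null, 2); // closer to most training data,
                                                // indentation encodes nesting depth
const minified = JSON.stringify(record);        // fewer characters, but no visual
                                                // cues for nesting
```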


If you really care about structured output, switch to XML. It gives much better results, which is why AI providers tend to use pseudo-XML in their system prompts and tool definitions.
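For illustration, the pseudo-XML style looks something like this (a hypothetical prompt, not any provider's actual format):

```javascript
// Hypothetical prompt assembly using pseudo-XML tags to delimit sections.
// The tags give the model unambiguous boundaries without requiring
// strictly valid XML.
const prompt = [
  "<instructions>",
  "Label each item as positive, negative, or neutral.",
  "</instructions>",
  "<items>",
  '<item id="1">The battery life is great.</item>',
  '<item id="2">The screen cracked after a week.</item>',
  "</items>",
].join("\n");
```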

The example from this article looks more like "unspecified" behavior than "undefined". The title made me expect nasal demons; now I'm a bit disappointed.

At least now it should be pretty easy for any tech person to patch the APK to remove this check

Probably not, because whatever Google is calling its remote attestation scheme this week (SafetyNet? Play Integrity?) has a way to check where the app was sourced and whether it has been altered.

Google is an asshole for making this. When Microsoft first proposed a scheme like that for PCs under the name Palladium, everyone knew it was a corporate power grab. Somehow, it got normalized.


I'd rather complain about somebody deciding for me which websites I'm allowed to open


Looks like they are acquiring the team rather than the product


No, they're clearly acquiring the technology. They're betting Claude Code on Bun, so they have a vested interest in the health of Bun.


Why would they want to bet on nascent technology when Node.js has existed for a good 15 years?


Because they needed something that could produce a single binary that works on every platform. They started shipping Claude Code with Bun back in July: https://x.com/jarredsumner/status/1943492457506697482



Every time I see people mention things like this in Node vs. Bun or Deno conversations, I wonder if they even tried them.

>The single executable application feature currently only supports running a single embedded script using the CommonJS module system.

>Users can include assets by adding a key-path dictionary to the configuration as the assets field. At build time, Node.js would read the assets from the specified paths and bundle them into the preparation blob. In the generated executable, users can retrieve the assets using the sea.getAsset() and sea.getAssetAsBlob() APIs.
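The workflow those docs describe looks roughly like this (a sketch; file names and the esbuild bundling step are illustrative, and the sentinel fuse value is the one given in the Node.js SEA docs):

```shell
# 1. Bundle to a single CommonJS file first (SEA only supports CJS):
npx esbuild bin/start.ts --bundle --platform=node --format=cjs --outfile=dist/bundle.js

# 2. Describe the executable and its assets in a separate config file:
cat > sea-config.json <<'EOF'
{
  "main": "dist/bundle.js",
  "output": "sea-prep.blob",
  "assets": { "en.json": "locales/en.json" }
}
EOF

# 3. Generate the preparation blob:
node --experimental-sea-config sea-config.json

# 4. Copy the node binary and inject the blob into it:
cp "$(command -v node)" myprogram
npx postject myprogram NODE_SEA_BLOB sea-prep.blob \
    --sentinel-fuse NODE_SEA_FUSE_fce680ab2cc467b6e072b8b5df1996b2
```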

Meanwhile, here's all I need to do to get an exe out of my project right now, assets and all:

> bun build ./bin/start.ts --compile --outfile dist/myprogram.exe

> [32ms] bundle 60 modules

> [439ms] compile dist/myprogram.exe

It detects my dynamic imports of JSON assets (language files, default configuration) and bundles them into the executable accordingly. I don't need a separate file to declare assets or imports, or to do anything other than run this command line. I don't need to survey the various bundlers to find one that works with my CLI tool and converts its ESM/TypeScript to CJS; Bun just knows what to do.
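The kind of dynamic import being picked up looks something like this (a sketch; the locale directory and file layout are illustrative):

```javascript
// A dynamic import with a template-literal path; the bundler detects
// imports like this and embeds the matching JSON files into the
// executable, so no separate asset manifest is needed.
async function loadLocale(lang) {
  const { default: messages } = await import(`./locales/${lang}.json`);
  return messages;
}
```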

Node is death by a thousand cuts compared to the experience offered by Bun.

Node adds quite a bit of startup latency over Bun too, and is just not pleasant for CLI scripts.


I agree, they seem to have never tried it at all! Bun DX is the best, and Bun is the trend setter. Others are just catching up!


Yeah, just switch to a non-npm-compatible and nonexistent ecosystem just because you need to specify a few parameters in a config.

Anthropic is trying to IPO at a valuation of $300B, so if neither their engineers nor their AI can be bothered to do this, then maybe they're not as good as they think they are.


>Yeah, just switch to a non-npm-compatible and nonexistent ecosystem just because you need to specify a few parameters in a config.

Yeah.. you do not know what you are talking about.

I run my test suites on both Node and Bun because I prefer to maintain compatibility with Node, and it works just fine. I use some of Bun's APIs, like stringWidth, and have a polyfill module that detects when it's not running under Bun and loads an alternative library through a dynamic import.
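A runtime-detection polyfill like that might look like this (a sketch; in a real project the fallback branch would dynamically import a library such as `string-width` instead of the naive implementation here):

```javascript
// Sketch of a polyfill module: prefer Bun's native Bun.stringWidth when
// running under Bun, otherwise fall back to an alternative implementation.
async function getStringWidth() {
  if (typeof Bun !== "undefined" && typeof Bun.stringWidth === "function") {
    return (s) => Bun.stringWidth(s); // fast native path under Bun
  }
  // Naive fallback: one column per code point
  // (ignores wide and zero-width characters).
  return (s) => [...s].length;
}
```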

Not npm compatible? Bun is literally a drop-in replacement for npm; in fact, you could use it solely as a replacement for npm and only use Node for actual execution, if that's your thing. bun install is much faster than npm install, and it uses less disk space if you have many projects.

The developer experience on Bun is so much better it isn't funny. It remains prudent, I agree, not to depend too heavily on it, but Bun is many things: a runtime, a set of additional APIs, a replacement for the dog-slow npm, a bundler, etc. Pick the parts that are easily discarded if you fear Bun's gonna go away, and when using their APIs for their better performance, write polyfills.

>Anthropic is trying to IPO at a valuation of $300B, so if neither their engineers nor their AI can be bothered to do this, then maybe they're not as good as they think they are.

Or maybe you should take a cold, hard look in the mirror first and think twice before opening your mouth, because the more you write, the more your ignorance shows.


They evidently evaluated Node.js in comparison to Bun (and Deno) earlier this year and came to a technical decision about which one worked best for their product.


The JS ecosystem is driven mostly by hype, so I highly doubt the Node.js option was even put on the table in an internal issue tracker.


Claude Code shipped on top of Node.js for the first four months of its existence.

Why wouldn't they consider their options for bundling that version into a single binary using Node.js tooling before adopting Bun?


Because Microsoft already owns that.


Are you referring to node? MS doesn't own that. It's maintained by Joyent, who in turn is owned by Samsung.


Joyent handed Node.js over to a foundation in 2015, and that foundation merged into the JS Foundation to become the OpenJS Foundation in 2019.

I'm not sure if Joyent have any significant role in Node.js maintenance any more.


Oops, thank you :)

Regardless, it's certainly not MS.


Microsoft owns npm outright and controls every aspect of the infrastructure that Node.js relies on. It also sits on the board (and is one of the few platinum members) of the Linux Foundation, which controls OpenJS. It is certainly MS.


It starts fast and does a better job than Node.js for their product


That was my thinking: this would be useful for Claude Code.


Using Google was more productive


Seems like the pelican benchmark has finally been added to the model training process


Mission accomplished for Simon:

> Truth be told, I’m playing the long game here. All I’ve ever wanted from life is a genuinely great SVG vector illustration of a pelican riding a bicycle. My dastardly multi-year plan is to trick multiple AI labs into investing vast resources to cheat at my benchmark until I get one.

https://simonwillison.net/2025/Nov/13/training-for-pelicans-...


It's easy to test. Use "one-eyed octopus on a tricycle" or something.


If you want an Electron app that doesn't lag terribly, you'll end up rewriting the UI layer from scratch anyway. VSCode already renders the terminal on the GPU, and a GPU-rendered editor area is experimental. Soon there will be no web UI left at all.


> If you want an Electron app that doesn't lag terribly

My experience with VS Code is that it has no perceptible lag, except maybe 500ms on startup. I don't doubt people experience this, but I think it comes down to which extensions you enable, and many people enable lots of heavy language extensions of questionable quality. I also use Visual Studio for Windows builds on C++ projects, and it is pretty jank by comparison, both in terms of UI design and resource usage.

I just opened up a relatively small project (my blog repo, which has 175 MB of static content) in both editors and here's the cold start memory usage without opening any files:

- Visual Studio Code: 589.4 MB

- Visual Studio 2022: 732.6 MB

update:

I see a lot of love for Jetbrains in this thread, so I also tried the same test in Android Studio: 1.69 GB!


I easily notice lag in VSCode even without plugins, especially when using it right after Zed. Ngl, they made it astonishingly fast for an Electron app, but there are physical limits to what can be done in a web stack with garbage-collected JS.


That easily takes the prize for worst-designed benchmark, in my opinion.

Have you tried Emacs, Vim, Sublime, Notepad++, ...? Visual Studio and Android Studio are full IDEs, meaning that upon launch they run a whole host of modules, and the editor is just a small part of that. IDEs are closer to CAD software than to text editors.


- notepad++: 56.4 MB (went gray-window unresponsive for 10 seconds when opening the explorer)

- notepad.exe: 54.3 MB

- emacs: 15.2 MB

- vim: 5.5MB

I would argue that notepad++ is not really comparable to VSCode, and that VSCode is closer to an IDE, especially given the context of this thread. TUIs are not offering a similar GUI app experience, but vim serves as a nice baseline.

I think that when people dump on Electron, they are picturing an alternative implementation like win32 or Qt that offers a similar UI-driven experience. I'm using this benchmark because it's the most common critique I read with respect to Electron when these are suggested.

It is obviously possible to beat a browser-wrapper with a native implementation. I'm simply observing that this doesn't actually happen in a typical modern C++ GUI app, where the dependency bloat and memory management is often even worse.


Try gvim, neovim-qt, or any other Neovim GUI client before calling Vim a "TUI-only experience".

Also, Emacs has been a GUI app since the '90s.


I never understand why developers spend so much time complaining about "bloat" in their IDEs. RAM is so incredibly cheap compared to 5/10/15/20 years ago, that the argument has lost steam for me. Each time I install a JetBrains IDE on a new PC, one of the first settings that I change is to increase the max memory footprint to 8GB of RAM.


> RAM is so incredibly cheap compared to 5/10/15/20 years ago

Compared to 20 years ago that's true. But most of the improvement happened in the first few years of that range. With the recent price spikes RAM actually costs more today than 10 years ago. If we ignore spikes and buy when the cycle of memory prices is low, DDR3 in 2012 was not much more than the price DDR5 was sitting at for the last two years.


> I never understand why developers spend so much time complaining about "bloat" in their IDEs. RAM is so incredibly cheap compared to 5/10/15/20 years ago, that the argument has lost steam for me. Each time I install a JetBrains IDE on a new PC, one of the first settings that I change is to increase the max memory footprint to 8GB of RAM.

I had to do the opposite for some projects at work: when you open about 6-8 instances of the IDE (different projects, front end in WebStorm, back end in IntelliJ IDEA, DB in DataGrip sometimes) then it's easy to run out of RAM. Even without DataGrip, you can run into those issues when you need to run a bunch of services to debug some distributed issue.

Had that issue with 32 GB of RAM on a work laptop, in part also because the services themselves took between 512 MB and 2 GB of memory to run (thanks to Java and Spring Boot).


I don’t really complain about bloat in IDEs. They have their uses. But VSCode's feature set is that of a text editor, and it’s really bloated for that.


I prefer my RAM to be used for fs cache or other more useful stuff, instead of launching full lobotomized web browsers.


Anyone saying that Java-based Jetbrains is worse than Electron-based VS Code, in terms of being more lightweight, is living in an alternate universe which can’t be reached by rational means.


> VSCode already renders terminal on GPU

When did they add that? Last time I used it, it was still based on xterm.js.

Also, technically Chromium/Blink has GPU rendering built in for web pages, so everything could run on GPU.


It's been enabled by default for about a year:

> GPU acceleration driven by the WebGL renderer is enabled in the terminal by default. This helps the terminal work faster and display at a high FPS by significantly reducing the time the CPU spends rendering each frame

https://code.visualstudio.com/docs/terminal/appearance#_gpu-...
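For anyone who wants to check or override it, the relevant setting in settings.json is, if I recall correctly:

```jsonc
{
  // "auto" (default), "on", or "off" — controls the terminal's
  // GPU-accelerated renderer
  "terminal.integrated.gpuAcceleration": "auto"
}
```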


It's actually been the default since v1.55 which released early April 2021: https://code.visualstudio.com/updates/v1_55#_webgl-renderer-...

Before that from v1.17 (~October 2017) it was using a 2d canvas context: https://code.visualstudio.com/blogs/2017/10/03/terminal-rend...


Wow, it's true: the terminal is <canvas>, while the editor is DOM elements (for now). I'm impressed that I use both every day and never noticed any difference.


I'm not sure how you went from terminal and editor GPU rendering, which can benefit from it, to "there will soon be no web ui left at all".

