I built a window vent with a variable-speed computer fan and an ESP8266 that communicates with a CO2 monitor and raises the fan speed when the CO2 gets too high. I have a very fine mesh on the window screen that's designed to filter out pollen and dust. It works really well. I usually only turn it on at night, because that's when my wife and I are both in the room and the CO2 can really build up.
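(For anyone curious about the control side: the real firmware would be C++ on the ESP8266, but the core logic is just a clamped linear ramp between two CO2 setpoints. The thresholds below are my own illustrative numbers, not from the build.)

```javascript
// Map a CO2 reading (ppm) to a PWM duty cycle (0-255) with a clamped linear ramp.
// CO2_LOW / CO2_HIGH / PWM_MIN are illustrative setpoints, not from the original post.
const CO2_LOW = 600;   // below this: fan at minimum
const CO2_HIGH = 1200; // above this: fan at full speed
const PWM_MIN = 40;    // keep the fan spinning quietly rather than stopping it
const PWM_MAX = 255;

function fanDuty(ppm) {
  // Normalize the reading to 0..1 between the two setpoints, then clamp.
  const t = Math.min(1, Math.max(0, (ppm - CO2_LOW) / (CO2_HIGH - CO2_LOW)));
  return Math.round(PWM_MIN + t * (PWM_MAX - PWM_MIN));
}

console.log(fanDuty(500));  // 40  (minimum)
console.log(fanDuty(900));  // 148 (halfway up the ramp)
console.log(fanDuty(1500)); // 255 (full speed)
```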
I still see a disturbing number of people on LinkedIn claiming it matters a whole lot to them. Hell, Sam Altman himself made a big deal about someone he knows "committing 100k lines of code per day with AI," as if that code was anything other than complete garbage.
I noticed this a few years ago and decided to start giving any automated systems a special email address so I can filter automated email into a separate folder. I only give out my personal email address to actual humans. It's been a huge improvement to only see human-written emails in my inbox.
I knew npm was a train wreck when I first used it years ago and it pulled in literally hundreds of dependencies for a simple app. I avoid anything that uses it like the plague.
Lots of language ecosystems have this problem, but it is especially prominent in JS, and it lies on a spectrum. For comparison, in the C/C++ ecosystem it is common for libraries to advertise that they are zero-dependency and header-only, or to depend on one common major library like Boost.
No, they are not as susceptible - auto-updating dependencies, post-install scripts, and a culture of thousands of crappy micro-packages (like left-pad) are mainly an npm issue.
The JavaScript ecosystem has a major case of import-everything disease that acts as a catalyst for supply chain attacks; left-pad is just one example of many.
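For reference, the entirety of what left-pad does has been built into the language itself since ES2017:

```javascript
// left-pad's whole useful behavior, as a one-line wrapper:
// String.prototype.padStart shipped with ES2017, so the package is redundant today.
function leftPad(str, len, ch = ' ') {
  return String(str).padStart(len, ch);
}

console.log(leftPad('5', 3, '0')); // "005"
console.log(leftPad(42, 4));      // "  42"
```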
Just more engineering-leaning than you. Actual engineers have to analyze their supply chains, so it makes sense they would be baffled by the npm dependency trees that utterly normal projects grow in the JavaScript ecosystem.
Good thing that at scale, teams run private package repositories or even develop in-house. Personally, I would argue that an engineer who can't tell perfect apart from good isn't a very good engineer in my book, but some engineers are unable to make compromises.
Do you think companies using node don't analyze supply chains? That's nonsense. Have you cargo-installed a Rust app recently? This isn't just a JS issue. This needs to be solved across the industry, and npm frankly has done a horrible job at it. We let people whose packages get billions of downloads a month publish right after a recent password/2FA change? Why don't we pool resources as a collective to scan newly published packages before they're allowed to be installed? These types of things really should exist across all package registries (and my really hot take is that we probably don't need a registry for every language, either!).
> Do you think companies using node don't analyze supply chains?
I _know_ many don’t. In fact suggesting doing it is a good way to be looked at like a crazy person and be told something like “this is a yes place not a no place.”
It is solved across the industry for those who care. If you use cargo, npm, or a Python package manager, you may have a service that handles static versioning of dependencies for security purposes. If you don't, you generally aren't working in a language that encourages so much package use.
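One concrete, low-effort form of that static versioning (a sketch, not any particular company's setup): pin exact versions in package.json instead of `^` ranges, so a compromised patch release can't slip in on its own:

```json
{
  "dependencies": {
    "express": "4.19.2",
    "lodash": "4.17.21"
  }
}
```

`npm config set save-exact true` makes npm write pinned versions like this by default, and `npm ci` installs exactly what the lockfile records instead of re-resolving ranges.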
Ah yes, this old way of thinking. Bro, we live in a world where, at least in web (and plenty of other domains), the velocity demanded of developers is exceedingly high - not necessarily because that's what those developers want, but because that's what management wants.
Most of my career Node.js has paid the bills and I'm very grateful to fate for that; but I have also worked in C/asm/etc. for embedded firmware. Implying that the JS ecosystem is composed only of terrible devs is classic gatekeeping, holier-than-thou type shit.
"I knew you weren't a great engineer the moment you started pulling dependencies for a simple app"
You realize my point, right? People are taught not to reinvent the wheel at work (mostly for good reasons), so that's what they do - you and me included.
You ain't gonna be bothered to write HTML and do manual DOM manipulation; the people who give you libraries for that won't be bothered reimplementing parsers and file watchers; file watcher authors won't be bothered reimplementing file system utils; file system util developers won't be bothered reimplementing structured cloning or event loops; etc., etc.
Just the other day I had the task of converting HTML to Markdown - I don't remember whether it was the Jira or GitHub API that returns comments as HTML. Despite it being mostly a few hours of work that would get us 90% there, everybody was in favor of pulling in a dependency to do it (with its own dependencies), thus further exposing our application to those risks.
One that gets me 90% there would take a few hours; one that gets me 99% there, a few months - which is why eventually people would rather pull a dependency.
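For the curious, a "90% there" converter really is about this much code - a dependency-free sketch, with the tag coverage chosen arbitrarily for illustration rather than taken from the actual task:

```javascript
// A deliberately minimal HTML-to-Markdown converter: the "90% solution".
// Handles links, bold, italics, inline code, line breaks, and paragraphs;
// any tag it doesn't recognize is simply stripped.
function htmlToMarkdown(html) {
  return html
    .replace(/<a\s+href="([^"]*)"[^>]*>(.*?)<\/a>/gis, '[$2]($1)')
    .replace(/<(?:b|strong)>(.*?)<\/(?:b|strong)>/gis, '**$1**')
    .replace(/<(?:i|em)>(.*?)<\/(?:i|em)>/gis, '*$1*')
    .replace(/<code>(.*?)<\/code>/gis, '`$1`')
    .replace(/<br\s*\/?>/gi, '\n')
    .replace(/<\/p>\s*<p>/gi, '\n\n')
    .replace(/<[^>]+>/g, '') // strip any remaining tags
    .trim();
}

console.log(htmlToMarkdown('<p>See <a href="https://example.com"><b>the docs</b></a></p>'));
// → See [**the docs**](https://example.com)
```

The missing 9% (nested lists, tables, entity decoding, malformed markup) is exactly where the "few months" go.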
LLMs are pretty good at greenfield projects and especially if they are tasked with writing something with a lot of examples in the training data. This approach can be used to solve the problem of supply-chain attacks with the downside being that the code might not be as well written and feature complete as a third-party package.
Not for the parser, only for the demo server! And I guess the dev dependencies as well, but with a much smaller surface area. But yeah, I don't think a TypeScript compiler is within the scope of an LLM.
I try to avoid JS, as it is a horrible language by design. That does include TS, but TS at least is usable, if barely - because it is still tied to JS itself.
Off-topic, but I love how different programmers think about things, and how nothing really is "correct" or "incorrect". Started thinking about it because for me it's the opposite, JS is an OK and at least usable language, as long as you avoid TS and all that comes with it.
Still, even I who'd call myself a JavaScript developer also try to avoid desktop applications made with just JS :)
JS's issue is that it allows you to run objectively wrong code without throwing an explicit error to the user - it just fails silently or does something magical. Seems innocent, until you realize what we use JS for, other than silly websites or ERP dashboards.
It is full of gotchas that serve no purpose nowadays.
Also remember that it is basically a Lisp wearing a Java skin, originally designed in less than two weeks.
TypeScript is one of the few things that puts up a safety barrier and sane static error checking, making JS bearable to use - but in the end it still has to fall back onto how JS works, so it suffers from the same core architectural problems.
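To make "fails silently or does something magical" concrete, here are a few standard examples - all of them run to completion without a single error:

```javascript
// Classic silent-coercion gotchas; every line executes without throwing.
console.log('5' - 1);            // 4    (string coerced to number)
console.log('5' + 1);            // "51" (number coerced to string)
console.log([] + {});            // "[object Object]"
console.log(parseInt('12px'));   // 12   (trailing garbage silently ignored)

const user = {};
console.log(user.name);          // undefined - no error for a missing property
```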
> JS's issue is that it allows you to run objectively wrong code without throwing an explicit error to the user - it just fails silently or does something magical. Seems innocent, until you realize what we use JS for, other than silly websites or ERP dashboards.
What some people see as a fault, others see as a feature :) For me, that's there to prevent an entire website from breaking because some small widget in the bottom-right corner breaks, for example. Rather than stopping the entire runtime, it just surfaces the error in the developer tools and lets the rest continue working.
Then of course entire web apps crash because of one tiny error somewhere (remember seeing a blank page with just some short error text in black in the middle? Those), but that doesn't mean that's the best way of doing things.
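For what it's worth, that containment can also be made explicit at the app level, so a broken widget degrades instead of taking the page down - a sketch with made-up widget names:

```javascript
// Isolation pattern: initialize each widget independently, so one failure
// is logged and contained rather than crashing everything else.
const widgets = [
  { name: 'search', init: () => 'search ready' },
  { name: 'chat',   init: () => { throw new Error('chat backend down'); } },
  { name: 'footer', init: () => 'footer ready' },
];

const results = [];
for (const w of widgets) {
  try {
    results.push(`${w.name}: ${w.init()}`);
  } catch (err) {
    // Surface the error (dev tools / telemetry) but keep the page alive.
    console.error(`widget "${w.name}" failed:`, err.message);
    results.push(`${w.name}: disabled`);
  }
}
console.log(results);
```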
> Also remember that it is basically a Lisp wearing Java skin on top
I guess that's why I like it better than TS, which tries to move it away from that. I mainly do Clojure development day-to-day, and static types hardly ever give me more "safety" than other approaches do. But again, what I do isn't more "correct" than what anyone else does; it's largely based on "it's better for me to program this way".
> it's there to prevent entire websites from breaking because some small widget in the bottom right corner breaks, for example.
The issue is that it prevents that, but also lets you push completely corrupt data forward, which can create a horrible cascade of errors down the pipeline - because other components made assumptions about the correctness of the data passed to them.
Such display errors should be caught early in development, should be tested, and should never reach prod instead of being swept under the rug - for anything other than a prototype.
But I agree - going fully functional with dynamic types beats the average JS experience any day.
It is just piling more mud onto a giant mudball.
> JS is an OK and at least usable language, as long as you avoid TS and all that comes with it.
Care to explain why?
My view is this: since you can write plain JS inside TS (just misconfigure tsconfig badly enough), I honestly don’t see how you arrive at that conclusion.
I can just about understand preferring JS on the grounds that it runs without a compile step. But I’ve never seen a convincing explanation of why the language itself is supposedly better.
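For reference, this is roughly what "misconfigure tsconfig badly enough" looks like in practice - these are all real compiler options, just set to their loosest values, at which point TypeScript will happily accept plain JS:

```json
{
  "compilerOptions": {
    "allowJs": true,
    "checkJs": false,
    "strict": false,
    "noImplicitAny": false
  }
}
```

With `allowJs` on and `checkJs` off, `.js` files pass straight through the compiler with no type checking at all.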
I was hyped for wasm because I thought it was supposed to solve this problem, allowing any programming language to be compiled to run in browsers.
But apparently they only made it do like 95% of what JS does, so you can't actually replace JS with it. To me it seems like a huge blunder. I don't give a crap about making niche applications a bit faster, but freeing the web from the curse of JS would be absolutely huge. And they basically did it, except not quite. It's so strange to me - why not just go the extra 5%?
That 5% of JS glue code necessary right now is just monumentally difficult to get rid of; it's like a binary serialization interface (ABI) over all the DOM/BOM APIs, and those APIs are huge, dynamic, callback-heavy and object-oriented. It's much easier to have that glue compiler-generated, which you can already do right now (you can write your entire web app in Rust if you want):
This is also being worked on, in the future this 5% glue might eventually entirely disappear:
> Designed with the "Web IDL bindings" proposal in mind. Eventually, there won't be any JavaScript shims between Rust-generated wasm functions and native DOM methods
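To be fair, for pure compute (no DOM involved) the glue is already tiny. Here's a complete, runnable example in plain Node: a hand-assembled wasm module exporting an `add` function, instantiated with nothing but the built-in `WebAssembly` API - the bytes are the standard minimal "add two i32s" module:

```javascript
// A minimal WebAssembly module, hand-assembled into its binary form:
//   (func (export "add") (param i32 i32) (result i32))
// No bundler, no generated glue beyond these few lines.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic "\0asm" + version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type section: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // function section: 1 func, type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export section: "add" = func 0
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section header
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0, local.get 1, i32.add, end
]);

const { exports } = new WebAssembly.Instance(new WebAssembly.Module(bytes));
console.log(exports.add(2, 3)); // 5
```

The hard part is the DOM/BOM surface the parent comment describes, not this core mechanism.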
Maybe it's something about sharing memory with the JS side that would introduce serious vulnerabilities, so they can't let wasm code have access to everything.
The only way to remove JS is to create a new browser that doesn't use it. That fragments the web, yes, and probably nobody will use it.
Depends on the use case; I don't think one language can fit all cases. 100% correctness is required for systems work, but it's a hindrance in non-critical systems, and robust type systems mean long compile times, which hurt iterating on the codebase.
Systems? Rust - but it is still far from perfect; too much focus on saving a few keystrokes here and there.
General-purpose corporate development? C# - despite the post-.NET 5 direction of stapling legacy parts of .NET Framework onto .NET Core, it does most things well enough.
Scripting, and just scripting? Python.
Web? There's only one, bad, option, and that's JS/TS.
Most hated, in order: JS, Go, C++, Python.
I mean, it's hard to avoid indirectly using things that use npm, e.g. websites or whatever. But it's pretty easy to never have to run npm on your local machine, yes.
In my experience, this is how it goes whenever there are more than 2 people in a conversation. The speed of the topic switching defaults to the fastest processor in the group, basically excluding any slow processors from the conversation.
Removing/degrading the UX for keyboard and mouse users would be (professional) suicide for Blender, I'm 99% confident Ton would never do something like that, unless most of the user base suddenly find themselves using tablets exclusively. And even then I'd doubt he'd leave us desktop users behind.
Note that I said it's controversial because it's considered implicitly sexual, not that it's illegal. The article you posted literally mentions the social stigma I'm referring to.
Before I started self-hosting my LLMs with Ollama, I imagined that they required a ton of energy to operate. I was amazed at how quickly my local LLM operates with a relatively inexpensive GeForce RTX 4060 with 8GB VRAM and an 8b model. The 8b model isn't as smart as the hosted 70b models I've used, but it's still surprisingly useful.