Deno (with Typescript of course) is my "smaller Rust". Whenever I don't need the performance and efficiency of Rust, I fall back to Deno. The development speed is much higher while the result is reasonably fast and safe.
- Typescript (compared to other mainstream languages like Go or Java) comes pretty close to Rust regarding the type system, e.g. discriminated unions (https://mkosir.github.io/typescript-style-guide/#discriminat...) or null-checks as "the second best version of null safety". (The best is still the option type.)
- Deno has an exceptional core api and std lib. Performance is great. In my experience it's very stable (except some newer Node APIs). The tooling is great, and now the move to JSR and Node compatibility widens the ecosystem while trying to make it safer. I wonder how the Deno team plans to push JSR's adoption.
- It's single-threaded. Yes, I think that's a feature. Multithreading via Web Workers might not be as fast as sharing memory, but it's much safer. I even think it has an edge over Go, if you don't need the extra performance of Go. Go is generally faster, esp. for computations. Web stuff in Deno is fast enough, as they use Hyper under the hood.
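A quick sketch of the two type-system features mentioned above (discriminated unions and strict null checks), in plain TypeScript; the `Shape`, `area`, and `greet` names are made up for illustration:

```typescript
// A discriminated union: the `kind` field lets the compiler narrow the type
// in each switch branch, and exhaustiveness is checked for free.
type Shape =
  | { kind: "circle"; radius: number }
  | { kind: "rect"; width: number; height: number };

function area(s: Shape): number {
  switch (s.kind) {
    case "circle":
      return Math.PI * s.radius ** 2; // s is narrowed to the circle variant
    case "rect":
      return s.width * s.height; // s is narrowed to the rect variant
  }
}

// With strictNullChecks, `string | null` forces an explicit null check
// before the value can be used as a string.
function greet(name: string | null): string {
  if (name === null) return "hello, stranger";
  return `hello, ${name}`; // name is narrowed to string here
}

console.log(area({ kind: "rect", width: 3, height: 4 })); // 12
console.log(greet(null)); // "hello, stranger"
```

Not quite an option type, but the compiler-enforced narrowing gets you most of the practical benefit.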
This seems like a good step with import maps, especially following web standards. I like the idea behind jsr, but I'm interested in how they will integrate additional specifiers. I don't philosophically like having hard-coded [0] strings vs smart redirect URLs. I'd be interested to know why jsr couldn't have used standard, semver-aware 301 redirects to dynamically load packages, combined with an import-map lock file.
For example:
`import { assertEquals } from "jsr:@std/assert@1";`
could be:
`import { assertEquals } from "jsr.io/std@1/assert";`
With this, Deno could assume https, as well as follow configs/redirects to get the correct import, then lock those resolved paths using import maps (the logistics of this would need hashing out, but it doesn't seem impossible, considering they already do it with the "special" `jsr:` specifier).
It seems like this would maintain the same pros while avoiding the "special" specifier hard-coding that exists now (it seems to only be for npm: and jsr: right now; no github:, for example).
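For comparison, a standard import map can already express both styles. In the sketch below, the first mapping is how a `deno.json` `"imports"` entry works today; the second entry's https URL is hypothetical, approximating the proposed redirect-then-lock scheme (the exact path JSR would serve is an assumption):

```json
{
  "imports": {
    "@std/assert": "jsr:@std/assert@1",
    "@std/assert-via-url": "https://jsr.io/@std/assert@1/mod.ts"
  }
}
```

With the first entry, `import { assertEquals } from "@std/assert";` resolves through the jsr: specifier; the proposal would instead lock the bare specifier to a plain, resolved https URL.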
There is still a problem with unreliable hosts. They might be temporarily down, or they might change the source code out from under you.
Although that's pretty rare, it's a security vulnerability that we're more wary of nowadays due to supply-chain attacks. You still have to check new releases, but it would be even worse if old, widely used code were modified to mine crypto or something.
Go ran into a similar problem because 'go get' would import directly from source control systems. For GitHub that's mostly fine, but other hosts are sometimes less reliable. They solved it with an indexing proxy server. You can still turn it off and download directly from source control, or swap in your own proxy server, so it's less centralized than jsr.io.
In practice, most people use the default proxy server, so on most days, jsr.io should work just as well as Go's solution. But it's nice to have alternatives.
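Assuming the standard GOPROXY mechanism (the module path below is a placeholder), Go's escape hatches look roughly like this:

```shell
# Default: fetch through the public module proxy, falling back to direct
go env GOPROXY    # typically "https://proxy.golang.org,direct"

# Bypass the proxy and fetch straight from source control
GOPROXY=direct go get example.com/some/module

# Or point at a self-hosted proxy instead
GOPROXY=https://goproxy.internal.example go get example.com/some/module
```

It's that last knob, a first-class way to swap in your own registry, that jsr.io hasn't clearly matched yet.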
This does look a bit more complicated, indeed.
But I like that someone is working on a standardized alternative to bundling, which comes with its own host of problems:
- it makes debugging harder (I know about source maps, but they are not a full replacement for plain old javascript files downloaded separately by the browser, and they come with their own costs)
- when you want to update something, the whole bundle needs to be rebuilt, and every client needs to redownload everything from scratch - for applications that are deployed often, you might as well not have a cache at all
- bundles are often full of unused parts, and dead code elimination doesn't catch everything, because you can't always determine whether code is dead for every consumer
- they are hard on unreliable connections / slow hardware: there's one big thing to download and then parse, and if the connection fails in the middle, you probably have to redownload everything again. But then again, having to connect to many hosts or to download many files is also problematic in its own way.
- they mix up dependency management, compiling optimization, polyfilling, browser support… it's a mess, especially when you want to address one of those issues separately, or just to set it up. You end up depending on some bundle you don't control that makes choices different from what you would have made.
- The centralization has issues. Now, it sounds like we are back to centralized again with JSR, I hope someone can provide their own JSR server if they want to. It doesn't seem impossible from a cursory read.
The problem here is orthogonal to bundling, unless I've missed something in the discussion. You can have a Node/NPM-style package manager without doing bundling, and you can bundle together a Deno project if you wanted.
Bundling is mainly a front-end technique used for performance reasons - you don't want to resolve and load each import separately, so you combine all the files into a single one for the sake of efficiency. But for server-side Javascript - be it Node.js or Deno - you typically don't need to bundle anything, because resolving files doesn't take a slow network call.
IMO package.json is beautiful. It has become the gold standard for package management. But it looks like Deno seriously hates npm and is cooking up solutions like this. If npm had to be replaced, they could do Go-like package management with direct links to GitHub, which package.json already supports.
I think the biggest issue with npm these days is the same as with Node.js in general: you wind up in a weird effort to mix ESM and CJS packages, which becomes a bit of a convoluted mess. Deno/jsr, by contrast, is based on TS/JS source trees with ESM patterns as the primary supported method, with npm/cjs as a compatibility shim.
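To illustrate the split: the same standard-library import written in both module systems. The ESM form is what Deno, JSR packages, and `"type": "module"` Node projects use natively; the CJS form only works inside CommonJS modules:

```typescript
// ESM: static import syntax, analyzable at build time.
import { join } from "node:path";

console.log(join("pkg", "mod.ts")); // "pkg/mod.ts" on POSIX systems

// CJS: the legacy Node form. `require` is not defined inside an ES module,
// so mixing the two in one package needs interop shims or dual builds.
// const { join } = require("node:path");
```

Packages that ship only one format force every consumer downstream to deal with the interop layer, which is exactly the convoluted mess described above.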
Http (with and without import-maps) supports direct links to github as well.
I don't think people will migrate to Go, as they most likely chose Node (or Deno) because of full-stack Typescript or Javascript. People who need the great performance of Go are probably already using it. For a Rust, Typescript, C#, Kotlin or similar user, Go is a hard pill to swallow due to its limited type system and cumbersome DX. Also, Go modules are typically unintuitive.
I also think that Bun will have a very hard time becoming relevant. The hype is already over. Their benchmarks and marketing are old and sketchy. I mean, they're using uWebSockets.js under the hood, which you could use with Node or Deno as well, if it's important to you to serve a static "hello world" 2,000,000 times per second. Startup times for JS runtimes have been debunked recently. (You can prepopulate some caches, which makes the startup surprisingly fast: https://deno.com/blog/aws-lambda-coldstart-benchmarks)
If you take a closer look at Deno's core api and std lib, there's much to like. The advantages compared to Node are significant, but I agree that they might not be decisive for most projects. Time will tell.
Ha, in what I believe was Ryan's first talk about Deno, he mentioned something about avoiding cute features or so, and then he said Deno has URL-based imports "because they're cute". It didn't take long for someone to bring it up in 2018:
It's slightly annoying to me that the messaging around this, including in this article, seems to be a bit all over the place.
What was always actually cool about HTTP imports, and several other Deno features, was browser parity. And I applaud their choice to follow the import map spec for the same reason, instead of inventing a new custom manifest format.
So to me the real message should be "well, we have to have HTTP imports, because that's how the web works. But that doesn't scale to this-or-that use case, so here's what we've done to help with that"
We are using deno to allow our customers to run custom code with imports. This is only made possible with http imports and I for one am very happy it exists.
I really hope JSR adoption will grow.