I think one obvious approach would be to assign "roles" - one person is "operations" and is the only one allowed to read the instructions to you, one person is "research" and has access to the list of ingredients, etc. But that probably bottlenecks things too hard and you have to figure out a fast way to assign roles. You could just increase the difficulty by requiring more precise instructions? Ah, you split the list of instructions into four parts and put one list in each corner of the classroom, then randomly sort people into the corners - one corner has ingredients, one corner has operations, one corner has conditionals, one corner has goals, and the class has to communicate to build valid instructions. Maybe give the ingredients tongue-twister names and make them devise ways to communicate without getting things confused. And obviously the end of the demo is "so why didn't any of you just take a list of ingredients and walk over to the list of operations so you didn't have to shout?".
You know, looking back at this eight hours later, I realize that this post sounds way worse than I intended. I was attempting to signal my own epistemic uncertainty, not imply sarcastic dismissal. Sorry. :/
Raspberry Pi definitely works! I have a project you can take a look at; you'll have to modify it slightly since you want a keyboard rather than a joystick, but they're both HID so the majority of it should work pretty much out of the box: https://github.com/saulrh/composite-joystick.
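For the keyboard half specifically: a boot-protocol HID keyboard just emits fixed 8-byte reports, so once the Pi's USB gadget is configured, sending a keystroke is a matter of writing those bytes to the gadget device. A minimal sketch, assuming a gadget already set up and exposed at /dev/hidg0 (that path and the setup vary by configuration; keycode 0x04 is 'a' in the HID usage tables):

```python
# Sketch of sending a keypress from a Pi acting as a USB gadget keyboard.
# Assumes the HID function is configured and exposed at /dev/hidg0.
# A boot-protocol keyboard report is 8 bytes:
#   [modifier bits, reserved, key1 .. key6]

MOD_LSHIFT = 0x02  # bit 1 = left shift

def keyboard_report(keycodes=(), modifiers=0):
    """Build an 8-byte boot-protocol keyboard report."""
    if len(keycodes) > 6:
        raise ValueError("boot protocol carries at most 6 simultaneous keys")
    keys = list(keycodes) + [0] * (6 - len(keycodes))
    return bytes([modifiers, 0] + keys)

def press_and_release(dev, keycode, modifiers=0):
    """Write a key-down report followed by an all-keys-up report."""
    dev.write(keyboard_report([keycode], modifiers))
    dev.write(keyboard_report())  # empty report = release everything

# Usage on the Pi (commented out so the sketch runs anywhere):
# with open("/dev/hidg0", "wb", buffering=0) as dev:
#     press_and_release(dev, 0x04)              # 'a'
#     press_and_release(dev, 0x04, MOD_LSHIFT)  # 'A'
```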
If you're starting over with a new backing store using jj's pluggable backends, can you give us a native lazily-materialized store for megarepos? It's a tragedy that the open-source world has no answer to piper.
I know someone who uses a Twiddler full-time, and I used mine for about a month when I broke my dominant hand about a decade ago. Works very well if your hand is the right size for it.
I have a tap strap, but I use it mostly as a remote control for my TV, not as a primary input device. It probably works, but I'm not good enough with it to have the kind of error rate I'd really like.
Android has a Morse input method which would be entirely suitable for one-handed text input, and there are certainly solutions for using an Android phone as a keyboard, but I don't know how it'd handle things like arrow keys.
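To illustrate why Morse suits one-handed input: the whole alphabet collapses onto one key's timing, so a single button suffices. A toy encoder/decoder, letters only, and purely illustrative of the encoding rather than of how Android implements it:

```python
# Minimal Morse table (letters only, for brevity).
MORSE = {
    "a": ".-", "b": "-...", "c": "-.-.", "d": "-..", "e": ".", "f": "..-.",
    "g": "--.", "h": "....", "i": "..", "j": ".---", "k": "-.-", "l": ".-..",
    "m": "--", "n": "-.", "o": "---", "p": ".--.", "q": "--.-", "r": ".-.",
    "s": "...", "t": "-", "u": "..-", "v": "...-", "w": ".--", "x": "-..-",
    "y": "-.--", "z": "--..",
}
DECODE = {code: ch for ch, code in MORSE.items()}

def to_morse(text):
    """Encode letters as Morse, separating letters with spaces."""
    return " ".join(MORSE[ch] for ch in text.lower() if ch in MORSE)

def from_morse(code):
    """Decode space-separated Morse back to text."""
    return "".join(DECODE[tok] for tok in code.split())
```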
Works even if you're not in a college town. I once pulled a $4000 set of speakers out of my building's trash room - Boston Acoustics floor speakers, Polk Audio subwoofer - and I was just in a random apartment building in the Bay Area. It turned out the tweeter on one of the speakers needed replacing, but that was like a $40 part on eBay and ten minutes of work with a screwdriver; I didn't even need a soldering iron. You can get some crazy stuff if you're in the right building. It really sucks seeing things go to waste when they aren't something you can take; I always have to fight myself to leave some things behind.
I had those speakers for a few years before someone else noticed it, lol. The other tweeter worked just fine, and the speakers as a whole were so good that even without the dedicated hardware for higher frequencies it was still better in those ranges than what I'd been using.
I don't know how likely it'd be for something like that to turn out unsalvageable. I think essentially everything at that level uses wooden enclosures, so it'd come down to whether the driver is set into the enclosure with screws or adhesive, and I don't know enough about the industry to know what the ratio is on that. Probably mostly screws. Then getting a compatible driver is probably guaranteed; at worst you have to replace both sides to keep them balanced.
I'll point out some higher-order impacts of this, since the article doesn't: Losing these forecasts will be catastrophic for American farmers. Crops literally live and die on weather forecasts; forecasts tell farmers what to plant, when to plant it, how to plant it, how to water it, and when to harvest it. Without these forecasts we will see negative effects on the entire American agricultural industry and all of the people it feeds, US citizens or not. I'm not a farmer myself, so I can't tell you exactly how severe the impact will be, but there will be an impact, whether it's an immediate crop failure and outright famine this year or "only" shockwaves bouncing through our entire economy as farmers plant the wrong crops and go out of business over the next few years. This is one of the scenarios I've been most worried about when it comes to the stability of human civilization, up there alongside the looming specter of nuclear war and the randomization of US foreign policy.
I hate that my cynicism has reached the level where I think that, if they end up privatizing it, one of the things they will do is use price discrimination to hide 'sticking it to those liberal states'.
For example, a massive part of California's economy is in farming, and while most farmers are more conservative-leaning, I am sure a vindictive admin will have no qualms about sacrificing them to stick it to California. Similarly with Oregon and Washington. And when prices of food go up commensurately in those states, they will spin it as "look at how shitty those places are, look at how much they charge you for food."
Or they will give discounts to large farmers while lowering data quality and raising prices for the small ones, impoverishing them or driving them out of business so that they can be cheaply bought out, further consolidating the ag sector into a small handful of massive companies.
Another secondary effect that worries me is how much aviation relies on weather data. Commercial aviation probably won't suffer much immediately -- they'll pay for the good data and pass the costs on to their passengers. With how much airline prices already fluctuate, the difference will probably get lost in the noise, so there won't be much public outrage. But GA pilots, especially ones operating from smaller airports, might be affected pretty significantly. Flying a plane isn't exactly a cheap endeavor as it is, and having to pay much more for weather data to fly safely might turn even more people off from flying. Down the line, that means fewer pilots that can fly commercial, which leads to yet another pilot shortage. It will also probably mean more GA accidents due to insufficient or shoddy weather data.
Not just farmers: pilots, and anything that flies long distances, will be hugely affected.
Pilots rely not just on forecasts, but also on warnings for turbulence, icing, thunderstorms, low visibility, and so on. Without that precision, relying on other sources will mean a higher risk of encountering unexpected weather. It will increase costs, since operators will have to subscribe to alternative weather services, and emergency response will be affected as well. Need a helicopter to medevac someone? Well, better hope they updated their weather subscription, and that it covers the area you need to be evacuated from and to. It'll also increase delays due to adverse weather events encountered unexpectedly.
Farmers will pay for some less precise but sufficient private weather service. It will make them poorer and consequently more resentful, so they will vote for conservatives again, and harder, to punish the perceived left.
This, as it has always been with so many sectors previously in the US: people getting screwed by corporations lobbying Republicans, while conservative media propaganda pins the outrage and consequences on the "woke" left.
At what price? Privatization always turns into either monopoly or cartels, that is, price inflation. Those who believe that privatization will also help them pay lower taxes either don't know how it works or are part of the plan.
The alternative is tramp's approach, which, from what I know, treats the remote as a network filesystem instead of an execution host. I don't believe that tramp deploys any binaries: it reads and writes bytes over pipes, and all meaningful execution happens locally. Notably, it does not achieve persistence, because there's a difference between "VSCode plugins have access while you're SSH'd in" and "VSCode plugins have access forever".
When you’re in a buffer displaying a remote file, most commands take that into account and execute in the remote context. And more often than not, that means connecting through pipes (and files) inside the ssh tunnels. Eglot (with gopls) works fine and fast for me. Executing ‘shell’ opens a remote shell, as does launching tasks through compilation mode. Grepping and finding files, as well as dired, also work fine.
Persistence is important to me, and making it read-only significantly reduces its usefulness. I regularly SSH into a dev machine to run scripts and update configurations. As long as a tool lets me do that without getting in my way, I'm good with any solution that works.
Tramp is perfectly able to write; it just does it by writing a temp file locally and then using ssh to transfer the file to the remote, rather than installing a copy of itself on the remote and acting through that. It only uses executables it finds on the remote. So if make and gcc and sed and such are available, it's basically transparent, indistinguishable from local editing except for network round trips, and the only changes it leaves behind are the files you edit.
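The write path described above can be sketched as a fetch/edit/push round trip. The ssh argv lists below are illustrative placeholders (hypothetical host and path), and real tramp is considerably more sophisticated about methods and escaping:

```python
import subprocess

def remote_edit(fetch_cmd, push_cmd, edit_fn):
    """Fetch a file's bytes, transform them locally, and push the result
    back -- the same shape as tramp's save path, with nothing installed on
    the remote. The commands are argv lists, e.g. (hypothetical):
        fetch_cmd = ["ssh", "host", "cat", "/path/to/file"]
        push_cmd  = ["ssh", "host", "sh", "-c", "cat > /path/to/file"]
    """
    data = subprocess.run(fetch_cmd, check=True, capture_output=True).stdout
    subprocess.run(push_cmd, input=edit_fn(data), check=True)
```

You can exercise the same round trip entirely locally by substituting local commands for the ssh invocations.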
Then tramp would be a perfect fit for you as long as you’re willing to learn emacs. If you open a remote file, almost all actions when that buffer is selected will execute in the remote context (launching commands, visiting directories, opening a shell,…)
You don't know what you're missing, then, and I'm not sure such opinions count. It's probably for the best to refrain from criticizing things one has no experience with.
Not wanting to pick up a tool isn't criticism. I just said that I couldn't care less. It's an editor. I work mostly with VSCode and sometimes with GoLand or IntelliJ, and don't think about them too often. Editor wars are lame and old; so are linter and formatting wars.
The thing about jj is that it doesn't actually enable any new capabilities. I tell people to use emacs or vim or vscode or whatever instead of notepad because it gives them new capabilities, things that are simply unavailable unless you're talking to an LSP or running a full-powered scripting engine. jj doesn't make anything possible the way going from notepad to a real editor does. What jj does do is make everything so much easier that you can now actually use all of those features that git theoretically gave you. Rebases? No more fiddling, no more wedges, no more digging through the reflog because you fat-fingered something, you just go `jj rebase` and... that's it. And if you get it wrong, you just do `jj undo` and... that's it. And if you just had a six-hour manic coding marathon without committing anything and now you want to spread its batch of changes back down through your patch stack, you just do `jj absorb` and... that's it. It's not the difference between notepad and emacs where you're going from no LSP to LSP, it's the difference between emacs@2016 where LSP support is a week-long adventure and emacs@2024 where LSP support is five lines copy-pasted out of the info page.
As a source control expert and jj's number one fan [1], I would count being able to defer merge conflict resolution as a new capability, FWIW. In general I think jj's greater than the sum of its (very good) parts because of how its features work together to create a coherent and pleasant user experience.
I guess I was thinking in terms of the patches you push up to github. `jj` is a joy to use and it absolutely enables me to implement workflows that I wouldn't even vaguely consider without it helping me; the big one I think of is the one where you work in a merged dir with like six parents and use `jj absorb` to instantly spread your changes out to the different PRs. I've been forced to do that in git. It was a nightmare and took me two days. Not impossible! Just utterly impractical. `jj` takes end results that were theoretically-possible-but-practically-infeasible and makes them usable. Which I suppose counts as a new capability from the UX perspective. :P
Absolutely! jj is a real advancement in the state of the art. I think it's the second time in the history of source control where the authors of a new system have spent a long time working on existing systems + deploying them at scale, and have brought their full expertise to bear on the design of the new system.
(The first was BitKeeper, which also was a tremendous achievement.)
> In February 2000, they contacted Karl Fogel, the author of Open Source Development with CVS (Coriolis, 1999), and asked if he'd like to work on this new project. Coincidentally, at the time Karl was already discussing a design for a new version control system with his friend Jim Blandy. In 1995, the two had started Cyclic Software, a company providing CVS support contracts, and although they later sold the business, they still used CVS every day at their jobs
You may recognize Jim as one of the authors of Programming Rust.
jj is not just a new interface for git, it has a lot of new and powerful features to offer, even when you're using the git backend with a colocated repo.
Just to name a few:
- Deferred conflict resolution
- The very expressive revset language
- The op log and the ability to undo any operation
Many things that you can do with the git cli are significantly easier, and in some cases comparatively effortless, using jj. If all you do is `git add` and `git commit` then you probably aren't missing out on much, but if you ever split or rebase commits you should definitely try jj.