never played with windows 2 but looks pretty awesome.
i think in this era the home market was largely saturated by home machines (atari st, amiga, apple 2+, a little macintosh). i don't remember a lot of pc juniors or other machines running windows 2, maybe some tandy machines but i think they were still more expensive than the home stuff.
> The actual code powering Gmail probably dates back to the late 80s or early 90s and has had several hundred thousands of hours of work put into it.
no. google did not exist until the late 90s.
various forms of internet email sure did, but most popular mtas of the google era shared very little code with predecessors from the 80s and early 90s (maybe sendmail) and google almost certainly wrote their own from scratch.
but your first point, that an archive browser that looks like gmail is not equivalent to a full-tilt email service backend, is valid.
julia is still clunky for these purposes! you can't even plot two things at the same time without it being weird and there's still a ton of textual noise when expressing linear algebra in it. (in fact, i'd argue the type system makes it worse!)
matlab is what it would look like to put the math in an ascii email, just like python is what it would look like to write pseudocode, and in both cases that's a good thing.
Interesting. I just looked at the page source and it is in fact using a table layout. I always assumed it was an image map, which I assume would be even more obscure for the LLM.
We should check the Wayback Machine, but in my memory this was built with an image map. Maybe like, 10 years ago or something. I was googling around when writing this post and saw that there are folks still tasked with making sure it's up and running. I wonder if they migrated it to tables at some point in the last decade.
i like to put it in live mode and point it at my plants and have conversations about how they're doing. it properly identifies them and flags any signs of disease and then provides correct next steps.
as i understand it: numerical methods are analytically inspired, computationally efficient techniques that smooth out noise from sampling/floating point error/etc, whereas monte carlo is computationally expensive brute-force random sampling where you can improve accuracy by throwing more compute at the problem.
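a minimal sketch of that contrast (my own toy example, not from the thread): estimating the same integral, ∫₀¹ eˣ dx = e − 1, with a deterministic numerical method (trapezoid rule) versus plain monte carlo sampling.

```python
import math
import random

def trapezoid(f, n):
    # deterministic numerical method: for smooth f the error
    # shrinks like O(1/n^2), so it is very cheap to get accurate
    h = 1.0 / n
    total = 0.5 * (f(0.0) + f(1.0))
    for i in range(1, n):
        total += f(i * h)
    return total * h

def monte_carlo(f, n, seed=0):
    # brute-force random sampling: error shrinks like O(1/sqrt(n)),
    # so accuracy improves only by throwing more samples at it
    rng = random.Random(seed)
    return sum(f(rng.random()) for _ in range(n)) / n

exact = math.e - 1
print(abs(trapezoid(math.exp, 1000) - exact))    # tiny error
print(abs(monte_carlo(math.exp, 1000) - exact))  # orders of magnitude larger
```

with the same budget of 1000 evaluations, the trapezoid estimate is far more accurate here; the monte carlo version earns its keep when the problem is high-dimensional or has no nice analytic structure to exploit.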
the self-hosted runner host is some horrific dotnet/csharp/mono monstrosity, and "language" is some javascript wrapper nonsense that needlessly creates a half-baked dsl around running basic shell commands.
Not resistant at all, because the model is its weights, and fine-tuning changes those weights. So that's like asking if a program is still bug-free after you add a bug to it.
What's stopping it is a different thing from "resistant". If you make the model evil in one way it becomes stupid/evil in every other way at once and can't pass any benchmarks.