This is someone retelling a story they were told by a co-worker about an event over 20 years prior. It’s not surprising that he doesn’t go into the details of exactly what was tried, beyond the key parts of the story.
I won't repeat it here, but I posted what I saw as an insider. I don't think all of the facts were quite right, but some of the overtones definitely are.
It’s an interesting theory, but I had to downvote as you didn’t provide any references for your bold assertion. Is there data that bears this out? And even if there were, how could it be distinguished from more expensive lawyers simply doing better at representing their clients?
I’ve gotten into several arguments over the years where webdevs insisted on showing tabular data using flexbox or hardcoded div widths or worse. They insisted that HTML tables were never, ever to be used and couldn’t be persuaded otherwise.
If you try to render tables with millions of cells, the browser does a really poor job and the performance is abysmal. The only solution when you need to render that many cells is to virtualize the table and only have the visible cells (plus some buffer) in the DOM at any time. That, plus the weird restrictions browsers put on certain table elements (looking at you, thead) that prevent them from being "sticky" headers, means the developer is left with absolutely positioned divs as the only solution. Blame browser vendors for not providing a native way to present tabular data with more than a few hundred thousand rows without causing performance issues.
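The core of the virtualization described above is just windowing math: from the scroll offset, compute which slice of rows should actually exist in the DOM. A minimal sketch, assuming fixed-height rows (the function name, parameters, and buffer size are illustrative, not from any particular library):

```typescript
// Given the scroll position, return the index range of rows to keep
// mounted: the visible rows plus a small buffer on each side.
function visibleRange(
  scrollTop: number,
  viewportHeight: number,
  rowHeight: number,
  totalRows: number,
  buffer = 5
): { start: number; end: number } {
  const first = Math.floor(scrollTop / rowHeight);
  const visibleCount = Math.ceil(viewportHeight / rowHeight);
  const start = Math.max(0, first - buffer);
  const end = Math.min(totalRows, first + visibleCount + buffer);
  return { start, end }; // render rows [start, end), absolutely positioned
  // inside a container whose height is totalRows * rowHeight
}
```

With a million rows this keeps only a few dozen elements in the DOM at once; the tall container provides the real scrollbar, so no fake scrollbar is needed.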
There's `table-layout: fixed`, which makes rendering of large tables much faster.
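For reference, a minimal sketch of that setup; with `fixed` layout the browser sizes columns from the first row (or explicit column widths) instead of scanning every cell:

```css
table {
  table-layout: fixed;
  width: 100%; /* fixed layout needs a definite table width */
}
```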
I'd argue that if you have so many rows that the DOM can't handle them, humans won't be able to either. Then you need search, filtering, and data exports, not JS attaching a faked scrollbar to millions of rows.
In fairness, the default `display: table` setup is often a pain to work with, so I can understand why people would opt for flexbox instead. One better option, though, might be to use `table` elements under the hood, styled with `display: grid` (and judicious use of subgrid for the intermediate elements) to get more precise control over the layout, while still using the right semantic elements underneath.
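A minimal sketch of that approach, keeping the semantic `table` markup while laying it out as a grid (the column count here is illustrative; the intermediate elements use subgrid to pass the shared column tracks down to the cells):

```css
table {
  display: grid;
  grid-template-columns: repeat(4, minmax(0, 1fr));
}
thead, tbody, tr {
  /* Each intermediate element spans all columns and inherits the
     table's tracks via subgrid, so cells align across rows. */
  display: grid;
  grid-column: 1 / -1;
  grid-template-columns: subgrid;
}
th, td {
  padding: 0.25rem 0.5rem;
}
```

Screen readers still see a table, but the layout is fully under grid's control.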
Related self-promotion: this factoid about spaces, along with other fun slices of the evolution of writing, features in my decade-old Ignite talk “For the love of letters”
> Factoids are things which resemble facts, but aren't actually facts.
I think you might be right, but not definitively so: the Oxford dictionary has your definition, as does the New Oxford American Dictionary, which also lists the following as North American usage:
Yeah, but that's the same lax descriptivist school that also tells you "literally" and "I could care less" should somehow be accepted as meaning their exact opposites. They're just wrong. :p
Is it equally accepted for "peoples" to be possessive and "people's" to be plural? At what point does something that began as an unambiguous error become rescued by the popularity of the mistake?
As we don’t have an official or authoritative body that determines “proper” English usage, as other languages do, appealing to a dictionary strikes me as a mite better than prescriptivism or pedantry, though I don’t think that was your intention either.
> Is it equally accepted for "peoples" to be possessive and "people's" to be plural?
That’s entirely unrelated and uncontroversial; one is the plural of a “people,” as in multiple distinct groups of folks with shared culture, nationality, or other traits, whereas the other is the possessive form of a word that is already plural, so I’m not sure if that’s a red herring or if you’ve actually seen such incorrect usage being advocated for.
The entire English language is a series of unambiguous errors that have been rescued by the popularity of the mistake. Were it not, we would be speaking some version of Ur-German.
That's just survivorship bias on a very long timeframe: Given enough time everything accumulates the status of "historical mistake", but what about the hundreds of thousands of words that didn't change and the days they didn't change in? Quite reasonably, we just don't pay attention to the mistakes that were squelched or whose trajectory never broke the ceiling of temporary slang.
There are some analogies to biology. Virtually all our DNA is the result of an error at some point (barring creationist theories), but that backstory isn't a reason to dismiss concerns against (or even for) a particular mutation. Surely nobody would downplay the deletion of 3 base pairs as "acktually normal when you look at the big picture for our species" when talking to people suffering from cystic fibrosis.
Hypocrisy: You're just claiming a different community of native speakers are wrong.
For some of the samples on that site, I'd question whether they even have majority support as "correct" when brought to people's conscious attention, as opposed to simply being a popular mistake people don't object to. (Do any polls exist? The nature of the content evades easy search terms.)
There were specs competing for adoption, but only tables (the old way) and CSS were actually adopted by browsers, so there was no point trying to use some other positioning technique.
Indeed. I just wish we could get a better sense of the scale, which is always hard in nature shots devoid of trees or human structures. A productive use of AI would be to place some houses and automobiles in the video for scale.
I used this extensively in a past job where I had to have a ton of terminals open and monitor/use them all, with each one serving a different role. (We were prototyping some really complicated experiences.) I used this tool to give each terminal a distinctive “look”, with some coding for effects: e.g., all green screens were backends, different fonts for the different OSs, etc. It looked wild while in use, but really did help.
I empathize. My dad is 98 and can mostly use his iPhone fine, but I just wish I could turn off all the “shortcuts”: He doesn’t get swiping down from different edges of the screen for control panel vs notifications. He doesn’t get hard-pressing on icons for different options (like the flashlight), and so on. Wish I could turn off Siri and Apple Pay, because hitting the “sleep” button just slightly wrong can invoke them and then he’s stumped.
It's not just your dad; the vast majority of people don't use these features either.
The human brain has a natural upper limit on how many times its beliefs can update per year. If the total number of new features shipped by every company in the land, every year, exceeds that limit, most of it is a gigantic waste.
Large, cash-rich companies beyond a point attract opportunists, and soon they outnumber the innovators.
After that happens we get runaway involution (change without purpose).
There is a never-ending amount of work going on, hyper-specialization, Elon/Trump-style self-glorification and back-patting, all happening with very little purpose or meaning being produced.
The solution is well known: orgs which have purpose are tuned into the limits baked into the system.
Try Settings -> Apple Intelligence & Siri -> Talk and Type to Siri
You can individually turn off 1) voice activation phrases, 2) press and hold side button, and 3) double tap bottom edge to type
For the flashlight, I assume you're talking about on the lock screen. You can customize the lock screen and remove that button entirely. If he has a newer iPhone, flashlight is probably a good use for the "Action Button" on the left, if he doesn't want to use that for toggling ringer/vibrate.
I agree that you should always have a pre-recorded version as backup, but live demos communicate a confidence in your product, and that can be worth something. Whenever I see a pre-recorded demo I wonder how many takes it took them, or how many pauses were taken out in editing, etc.
It helps if you have a master of ceremonies capable of running three rings simultaneously. Jobs could do it; looking at what Apple is doing, they feel Cook is not. You lean into your strengths and avoid your weaknesses. Meta should realize that the Zuck is not Jobs and is much closer to Cook.
I said that I thought not having, e.g., healthcare doesn't make you as sad as not having it while someone else does have it.
Nothing I've used in that relative comparison breaks the science you quote. And I do not need a citation to think something, as opposed to stating it as fact. It would be odd indeed to require a citation merely to think of something.