
That's the thing though -- no one made Jimmy Carter sell his farm[0].

But Jimmy Carter was an honorable human, and, well...there are fewer people fitting that description sitting behind the Resolute desk, today.

[0] He didn't sell it; he put it into a blind trust. He should have sold it. When he left office, the farm was $1MM in debt.


(tangent)

This is true, although entertainingly, the "server" part has always been easily confused.

In X11, the "server" runs on your local machine, and the "client" frequently runs on a remote system.


The server runs on the machine that allows clients to connect to it. What is the confusing part about this?

X has the terminology the other way around compared to all other consumer-facing software.

This is because of its mainframe-style history, and technically it does make sense; it's just that everybody else does things the other way around.

For people who weren't around in the ancient mainframe times and end up messing with Linux for the first time, this is confusing for a while.


Xhost and xapps

The part that is counterintuitive to most people when it comes to the "server" terminology is that, with X, your end-user workstation (which may be an incredibly dumb X terminal) is the "display server", which means you remote into a server (in the traditional sense) elsewhere, which then acts as an X client by making requests to your local machine to display windows.

The way most people think about it, "client" is your local machine and "server" is the remote machine that has lots of applications and is potentially multi-user, but X turns that backwards. The big iron is the client and the relatively dumb terminal is the server.
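
For the concrete mechanics, a minimal sketch (made-up hostname, and assuming X-over-TCP is even enabled, which modern distros usually turn off in favour of local sockets): on the big-iron side, DISPLAY names the workstation running the X server, and the application initiates a connection to it on TCP port 6000 plus the display number. Roughly, in Python:

    import os, socket

    # On the machine where the application (the X *client*) runs, DISPLAY
    # points at the workstation running the X *server*, e.g.
    # "workstation.example.com:0.0" (hostname made up for illustration).
    display = os.environ.get("DISPLAY", "workstation.example.com:0.0")

    host, _, rest = display.partition(":")
    display_number = int(rest.split(".")[0] or 0)

    # X over TCP listens on 6000 + display number -- on the workstation,
    # not on the remote machine where the application runs.
    port = 6000 + display_number

    # The application (the client) initiates; the "dumb terminal" end accepts.
    with socket.create_connection((host or "localhost", port), timeout=5) as sock:
        print(f"connected to the X server at {host or 'localhost'}:{port}")

(These days you'd normally see a local Unix socket and -nolisten tcp, but the direction of the conversation is the same: the application connects, your display accepts.)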


I think most of the confusion arises because when you are tunneling X via ssh, the X client/server is the reverse of the ssh client/server.

Add to that that the user manages the ssh connection while the X connection is managed for them (sshd on the remote end sets DISPLAY and quietly forwards the X traffic back through the tunnel to your local X server)...


I think the confusion is obvious, given a little empathy for the range of people who use computers.

The server is usually a remote machine, especially back in the time when "client-server" architecture was emerging in mainstream (business) vernacular.


The server is not usually a remote machine. The server is the app accepting remote connections.

This has been true for decades.

https://en.wikipedia.org/wiki/Server_(computing)
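
In the most literal terms (a throwaway sketch; the port number is arbitrary): the server is whichever process calls listen()/accept(), wherever it happens to run.

    import socket

    # The "server" is simply the process that listens for and accepts
    # connections; it can run on your laptop, a LAN box, or a machine
    # on the other side of the world.  Port 5000 is an arbitrary choice.
    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.bind(("0.0.0.0", 5000))
    listener.listen()

    conn, addr = listener.accept()   # blocks until some client connects
    print("client connected from", addr)
    conn.close()
    listener.close()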


Please don't imagine that I don't fully understand this.

Nevertheless, X11 "server" and "client" have confused very smart and highly technical people. I have had the entertainment of explaining it dozens of times, though rarely recently.

And honestly, still, a server is usually a remote machine in all common usage. When "the server's down", it is usually not a problem on your local machine.


Yes, it’s simultaneously logical if you look at how it works and immensely strange if you don’t understand the architecture. (As has been noted all the way back to the UNIX-HATERS Handbook[1], although, pace 'DonHopkins, the NeWS Book uses the same terminology—possibly because it was written late enough to contain promises of X11/NeWS.)

[1] https://www.donhopkins.com/home/catalog/unix-haters/x-window...


There is nothing at all strange about the terminology. Go run ps on macOS and marvel at the "WindowServer" process. The generic architectural term is "display server".

https://en.wikipedia.org/wiki/Windowing_system#Display_serve...


> Something awful that lets him invoke some obscure rule that lets him stay in power with congressional approval

There is literally no such obscure rule, and a new Congress will be seated two weeks before the 2029 Presidential Inauguration.

Elections, and the compulsory ends of terms, inauguration of new Congresses, etc, happen on schedule without regard to any exceptional cases, including Civil War.

If he can get a majority of the Electoral College for a third term, and a majority in both houses of Congress in 2028, then things get much more complicated.

But there is no other path. Elections matter, and don't let anyone convince you that they don't matter enough to bother voting.


State schools also offer journalism degrees.

And FWIW, in my very limited and anecdotal experience, the programs are inhabited by people who fully understand their employment and salary prospects, but believe in the work, and often have above-average family wealth to compensate for the gaps. They're good people, but they are not experts.


If you had a better understanding of math and science, you'd know the difference between the concepts of "one" and "statistically meaningful".

> It’s worth pointing out one contradiction to someone who passes such vast and foolish judgement.

It really isn't.

Obviously, "who becomes a journalist in this age" does not translate to "every person who is alive now who has ever been a journalist".

I'm not sure if your error lies in parsing colloquial English, or in basic statistics. Either way, I think you have fully illustrated the commenter's point.

Journalists are not reliably selected for, or demonstrative of, comprehension or accuracy.


This is someone dumb trying to call others dumb. The argument is not just inhumane, it’s also wrong: the average of something assumed does not negate a real data point. If you did even a bit of data science you’d know that. But it’s just another HNer calling someone dumb while being confidently wrong, and it’s ironic to call others dumb because of it. So think on that.

Maybe Christmas just leaves the worst on HN … statistically.


A failure of reading comprehension, or a visceral reaction to a generalized statement that pertains to you personally, does not make one dumb.

Defending your mistakes doesn't either, but I can understand the confusion.

Happy holidays.


Again, methinks the fool speaks of himself.

(You can’t engage logically, technically, or even correctly here and keep spouting that others are wrong. Think hard on how poor your comprehension is here, even when it has been explained why you are wrong.)


I'm struggling to find my error in this thread. Please quote my statement and your points of disagreement.

Continuous circling is tiresome, troll. Just stop.

Points have been illustrated contradicting the statement. No points have been made supporting it.

Your argument boils down to “‘all X is bad’ is valid by default, and any Y that contradicts it is inherently ‘statistically invalid’”. Do you not get how horribly dumb your logic is?

By this logic I could state that all HNers posting on Christmas are idiots and wrong by default. This of course can’t be contradicted by any statement you make, because you are just a data point of one and therefore invalid. Also, the original point is supported with exactly 0 data points, so in actuality a data point of 1 > 0. So, my guy. Jesus. Learn stats. Or anything.


Are we agreeing that neither of us can find my error in this thread?

Modern fire trucks, and police cars usually, are built to be able to push vehicles out of the way. It's a very common need.

(Not an argument against Waymo doing better in this situation though!)


> Modern fire trucks, and police cars usually, are built to be able to push vehicles out of the way.

Not so much police cars, anymore.

Back in the time of the B-Body Caprice and the Crown Vic, sure. These days, with the exception of the Tahoe, the most common police vehicles are all unibody platforms: Charger, Durango, Explorer, Taurus, and the rare Australian Caprice.

You can still bolt a push bumper to them, and most departments do, but it has to be used with a lot more caution and a lot less aggression than in the days of body-on-frame sedans, to avoid damaging the vehicle.

Fire trucks, on the other hand: yeah, they're basically the opposite, in that there might be a couple of Explorers or Durangos in the fleet, but most everything else is a medium-duty truck or a custom chassis built specifically for fire service.


I have not purchased a TV or car with these misfeatures, but I expect I will have to at some point in the future.

The most vulnerable part might be the antenna? Required by the laws of physics to be a certain size and shape, and not easily integrated into another, more essential component?

If found, it can be removed entirely, or replaced with a dummy load to satisfy any presence-detection circuits. Either way, the radiation can be minimized or eliminated.

Now obviously a device can choose not to function (or to be especially annoying in its UI) if it doesn't find a network. But people take cars (and TVs) to places with no WiFi or mobile coverage, and I don't know how the device manufacturers deal with that.


Nothing to do with Google.

(Apologies if this is pedantic, but:)

The letter "e" (for "exponent") has meant "multiplied by ten to the power of" since the dawn of computing (Fortran!), when it was impossible to display or type a superscripted exponent.

In computing, we all got used to it because there was no other available notation (this is also why we use * for multiplication, / for division, etc). And it's intuitive enough, if you already know scientific notation, which Fortran programmers did.

Scientific notation goes back even further, and is used when the magnitude of the number exceeds the available significant digits.

E.g., Avogadro's number is 6.02214076 × 10²³. In school we usually used it as 6.022 × 10²³, which is easier to work with and was more appropriate for our classroom precision. In E notation, that'd be 6.022E23.

1.3076744e+12 is 1.3076744 × 10¹². The plus sign is for the positive exponent, not addition. You could argue that the plus sign is redundant, but the explicit sign can be helpful when working with numbers whose exponents can swing from positive to negative.
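
If you want to poke at the notation directly, Python (just one arbitrary example of a language that follows the Fortran convention) reads and writes it the same way:

    # "e"/"E" notation is parsed directly:
    n = float("6.022e23")      # 6.022 x 10^23 (Avogadro, rounded)
    q = float("1.6e-19")       # negative exponent: 1.6 x 10^-19

    # ...and produced by formatting, with an explicit sign on the exponent:
    print(f"{1.3076744e12:e}")   # -> 1.307674e+12
    print(f"{n:.3e}")            # -> 6.022e+23
    print(f"{q:.1e}")            # -> 1.6e-19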


> Is there something I'm missing here about why people might prefer relative timestamps?

I think most people are uncomfortable parsing timestamps for small-interval differences, e.g. `2025-12-19T16:28:09+00:00` for "31 seconds ago".

For larger intervals, I agree that timestamps are more useful. "1 day ago" is a particular bugbear of mine. One day meaning 13 hours, or meaning 35 hours? Sometimes that's important!

The original advice when relative timestamps became a thing was to choose based on the activity level of the content. If new content is constantly appearing and older stuff fades out of relevance quickly, then choose relative timestamps. Otherwise, use absolute timestamps.

The worst is inconsistency, and the best is sometimes both (when presented in a discoverable and convenient way -- hover text used to be that way, but this degrades on mobile).
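
A sketch of that rule (the cutoffs are hypothetical, not anyone's standard): render a relative string while the item is fresh, then fall back to the unambiguous absolute form.

    from datetime import datetime, timezone, timedelta

    FRESH = timedelta(hours=24)   # hypothetical cutoff for "fast-moving" content

    def render(ts, now=None):
        now = now or datetime.now(timezone.utc)
        age = now - ts
        if age < timedelta(minutes=1):
            return f"{int(age.total_seconds())} seconds ago"
        if age < timedelta(hours=1):
            return f"{int(age.total_seconds() // 60)} minutes ago"
        if age < FRESH:
            return f"{int(age.total_seconds() // 3600)} hours ago"
        return ts.strftime("%Y-%m-%d %H:%M %Z")   # e.g. "2025-12-18 05:28 UTC"

    now = datetime(2025, 12, 19, 16, 28, 40, tzinfo=timezone.utc)
    print(render(datetime(2025, 12, 19, 16, 28, 9, tzinfo=timezone.utc), now))  # "31 seconds ago"
    print(render(datetime(2025, 12, 18, 5, 28, 0, tzinfo=timezone.utc), now))   # absolute, ~35h old

The "sometimes both" version just returns the pair (relative for display, absolute for the hover text); the cutoff logic is the same.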


To clarify, I don't mean to literally imply an exact timestamp format. Showing something like "December 19, 2025 4:28 PM" or "19 December 2025 16:28" seems strictly better to me than "31 seconds ago" because it doesn't either become inaccurate quickly or require having the page update in real-time.

> with no actual knowledge of cognition, psychology, biology

... but we also need to be careful with that assertion, because humans do not understand cognition, psychology, or biology very well.

Biology is the furthest developed, but it turns out to be like physics -- superficially and usefully modelable, but fundamental mysteries remain. We have no idea how complete our models are, but they work pretty well in our standard context.

If computer engineering is downstream from physics, and cognition is downstream from biology ... well, I just don't know how certain we can be about much of anything.

> this thread sure attracts a lot of armchair experts.

"So we beat on, boats against the current, borne back ceaselessly into our priors..."

