> Saying that something happened x-number of [...] days or weeks ago (or in the future) is simple
It's not, actually. Does 2 days and 1 hour ago mean 48, 49 or 50 hours, if there was a daylight saving jump in the meantime? If it's 3PM and something is due to happen in 3 days and 2 hours, the user is going to assume and prepare for 5PM, but what if there's a daylight saving jump in the meantime? What happens to "in 3 days and 2 hours" if there's a leap second happening tomorrow that some systems know about and some don't?
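To make the ambiguity concrete, here's a minimal Python sketch using the stdlib zoneinfo; the zone and the 2024 US spring-forward date are just illustrative:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

NY = ZoneInfo("America/New_York")

# 3 PM the day after the US spring-forward jump (2024-03-10).
now = datetime(2024, 3, 11, 15, 0, tzinfo=NY)

# "2 days and 1 hour ago" as wall-clock arithmetic:
wall = now - timedelta(days=2, hours=1)

# "2 days and 1 hour ago" as elapsed time (do the math in UTC):
elapsed = (now.astimezone(ZoneInfo("UTC"))
           - timedelta(days=2, hours=1)).astimezone(NY)

print(wall)     # 2024-03-09 14:00:00-05:00
print(elapsed)  # 2024-03-09 13:00:00-05:00 -- a full hour earlier
```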
You rarely want to be thinking in terms of deltas when considering future events. If there is an event that you want to happen on jan 1, 2030 at 6 PM CET, there is no way to express that as a number of seconds between now and then, because you don't know whether the US government abolishes DST between now and 2030 or not.
To reiterate this point, there is no way to make an accurate, constantly decreasing countdown of seconds to 6PM CET on jan 1, 2030, because nobody actually knows when that moment is going to happen yet.
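The best you can do is store the intent (wall time plus zone) and recompute against the current tz database on every tick. A sketch, using Europe/Paris as a stand-in for "CET":

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Store the intent, not a precomputed number of seconds.
TARGET_WALL = datetime(2030, 1, 1, 18, 0)   # 6 PM on the wall clock
TARGET_ZONE = "Europe/Paris"                # a CET/CEST zone

def seconds_remaining() -> float:
    # Re-resolve the wall time against the tz database on every call,
    # so a future rule change shows up after a tzdata update.
    # (A long-running process would also need ZoneInfo.clear_cache().)
    target = TARGET_WALL.replace(tzinfo=ZoneInfo(TARGET_ZONE))
    return (target - datetime.now(ZoneInfo("UTC"))).total_seconds()
```

The number this returns can change discontinuously when the tz rules change; that's the point, not a bug.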
No. The problems begin because GP included the idea of saying "N <calendar units> in the future".
If the definition of a future time was limited to hours, minutes and/or seconds, then it would be true that the only hard part is answering "what calendrical time and date is that?"
But if you can say "1 day in the future", you're already slamming into problems before even getting to ask that question.
The real problem here is that people keep trying to screw up the simple thing.
If you want to know the timestamp of "two days from now" then you need to know all kinds of things like what time zone you're talking about and if there are any leap seconds etc. That would tell you if "two days from now" is in 172800 seconds or 172801 seconds or 169201 or 176400 etc.
But the seconds-counting thing should be doing absolutely nothing other than counting seconds and doing otherwise is crazy. The conversion from that into calendar dates and so on is for a separate library which is aware of all these contextual things that allow it to do the conversion. What we do not need and should not have is for the seconds counting thing to contain two identical timestamps that refer to two independent points in time. It should just count seconds.
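In Python terms, just as an illustration of the split: the counting thing is time.monotonic(), and everything calendar-ish lives in a separate, context-aware layer:

```python
import time
from datetime import datetime, timezone

# The counting layer: seconds, nothing else. Never jumps backwards,
# knows nothing about zones, DST, or calendars.
t0 = time.monotonic()
time.sleep(1.5)
elapsed = time.monotonic() - t0   # ~1.5 seconds, full stop

# The conversion layer: a separate library that knows the context.
print(datetime.now(timezone.utc).astimezone())  # local calendar time
```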
Agree, but people often miss that there are two different use cases here, with different requirements.
"2 days from now" could either mean "after 2*86400 seconds have ticked" or it could mean "when the wall clock looks like it does now, after 2 sunset events". These are not the same thing.
The intent of the thing demanding a future event matters. So you can have the right software abstractions all you like and people will still use the wrong thing.
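For what it's worth, the two readings are easy to show side by side; across a DST jump, the "same wall clock" reading is only 47 hours of actual ticking (dates below are illustrative):

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

NY = ZoneInfo("America/New_York")
now = datetime(2024, 3, 9, 15, 0, tzinfo=NY)  # day before spring-forward

# Reading 1: after exactly 2 * 86400 seconds have ticked.
absolute = (now.astimezone(ZoneInfo("UTC"))
            + timedelta(days=2)).astimezone(NY)

# Reading 2: the wall clock looks like it does now, two days later.
wall = now + timedelta(days=2)

print(absolute)                             # 2024-03-11 16:00:00-04:00
print(wall)                                 # 2024-03-11 15:00:00-04:00
print((wall - now).total_seconds() / 3600)  # 47.0 -- not 48
```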
The problem is that programmers are human, and humans don't reason in monotonic counters :)
One might also recall the late Gregory Bateson's reiteration that "number and quantity are not the same thing - you can have 5 oranges but you can never have 5 gallons of water" [0]
Seconds are numbers; calendrical units are quantities.
[0] Bateson was, in some ways, anticipating the divide between the digital and analog worlds.
> "2 days from now" could either mean "after 2*86400 seconds have ticked" or it could mean "when the wall clock looks like it does now, after 2 sunset events". These are not the same thing.
Which is why you need some means to specify which one you want from the library that converts from the monotonic counter to calendar dates.
Anyone who tries to address the distinction by molesting the monotonic counter is doing it wrong.
I recently built a small Python library to try getting time management right [1]. Exactly because of the first part of your comment, I concluded that the only way to apply a time delta in "calendar" units is to provide the starting point. It was fun developing variable-length time spans :) I however did not address leap seconds.
You are very right that future calendar arithmetic is undefined. I guess that the only viable approach is to assume that it works based on what we know today, and to treat future changes as unpredictable events (as if the Earth were to slow its rotation). Otherwise we should just stop using calendar arithmetic, but in many fields that is simply infeasible...
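To illustrate why the starting point is required: the same "+1 month" lands on different days depending on where you start. A stand-in sketch (add_months here is a hypothetical helper written for this comment, not my library's API):

```python
import calendar
from datetime import date

def add_months(start: date, months: int) -> date:
    # A calendar delta is meaningless without its anchor: the same
    # "+1 month" maps Jan 31 -> Feb 29 in 2024 but Feb 28 in 2025.
    y, m = divmod(start.month - 1 + months, 12)
    year, month = start.year + y, m + 1
    day = min(start.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

print(add_months(date(2024, 1, 31), 1))  # 2024-02-29
print(add_months(date(2025, 1, 31), 1))  # 2025-02-28
print(add_months(date(2025, 3, 31), 1))  # 2025-04-30
```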
> I guess that the only viable approach is to assume that it works based on what we know today, and to treat future changes as unpredictable events
No, the only way is to store the user's intent, and recalculate based on that intent when needed.
When the user schedules a meeting for 2PM while being in Glasgow, the meeting should stay at 2PM Glasgow time, even in a hypothetical world where Scotland achieves independence from the UK and they get different ideas whether to do daylight saving or not.
The problem is determining what the user's intent actually is; if they set a reminder for 5PM while in NY, do they want it to be 5PM NY time in whatever timezone they're currently in (because their favorite football team plays at 5PM every week), or do they want it to be at 5PM in their current timezone (because they need to take their medicine at 5PM, whatever that currently means)?
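One way to make that distinction storable is to record the two intents as two different shapes and only resolve to a concrete instant at fire time. A hypothetical sketch (the names and record shapes are made up for illustration):

```python
from dataclasses import dataclass
from datetime import datetime
from zoneinfo import ZoneInfo

@dataclass
class ZoneAnchoredReminder:
    """'5 PM New York time, wherever I am' (the football game)."""
    wall_time: str   # "17:00"
    zone: str        # "America/New_York"

@dataclass
class FloatingReminder:
    """'5 PM on whatever clock is around me' (the medicine)."""
    wall_time: str   # "17:00" -- resolved against the device's current zone

def resolve(r: ZoneAnchoredReminder, now: datetime) -> datetime:
    # Resolved only at fire time, against the rules in force *then*
    # (roll-over to the next day omitted for brevity).
    h, m = map(int, r.wall_time.split(":"))
    return now.astimezone(ZoneInfo(r.zone)).replace(
        hour=h, minute=m, second=0, microsecond=0)
```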
And if the number of seconds changes, such as in a batch job on a supercomputer, you should adjust the time of the computer first, and then adjust the billing for the job after it completes. I asked IBM if their quantum cloud could count the time in either direction... At first they were confused, but then they got the joke.
I understand what you’re saying, but minutes, hours, days and weeks are fixed time periods that can be reduced to a number of seconds. Months and years are not which is why I did not include those in my earlier post.
Calculating the calendar date for an event that’s 365 days in the future needs to consider whether leap-time corrections need to be made during the period. We do that already for days with our standard calendar.
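A two-line illustration of the difference that makes: 365 days from mid-2027 is not "the same date next year", because the leap day 2028-02-29 falls inside the interval:

```python
from datetime import date, timedelta

start = date(2027, 6, 1)
print(start + timedelta(days=365))         # 2028-05-31 (2028-02-29 intervenes)
print(start.replace(year=start.year + 1))  # 2028-06-01, "a year later"
```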
If someone says an event is in X days, they almost never mean a multiple of 86k seconds.
Really, I don't think you can reduce any of these to a specific number of seconds. If someone says an event is in 14 hours, the meaning is a lot closer to (14 ± ½) * 3600 than it is to 14 * 3600.