Also "blue" is a poorly defined description of color and shouldn't be used to describe a color.
The truth in both cases is of course that while the meaning is contextual and imprecise, it's still useful because people generally don't use precise measurements in their day-to-day activities. When someone tells me that something will happen in a month, I take it to mean something in the ballpark of 28-31 days, and that information can be useful to me.
Because of leap seconds, the only really unexceptional unit of time is seconds, but someone telling me that something will happen in 2592000 seconds isn't particularly useful to me.
In the context of the calculator, you already know the start and end times, and it shows the duration in terms of days (which is unambiguous given that we know the interval when the duration takes place). I think that it's clear that the duration in terms of months, weeks and hours is there just to put it in more easily graspable terms, at the expense of precision.
As it happens, neither is a day, or even an hour. The exceptions are rarer, which only makes the bugs more insidious when they do happen (leap seconds, for example: some hours have 3601 seconds!).
i don’t think either is correct. the leap second is an extra second, that’s it. the notation for the extra leap second is lacking so we are “forced” to call it 23:59:60 but that doesn’t mean that minute has 61 seconds. it just means after that minute but before the next one, there’s an inserted second.
it’s far more sane to define it that way than to say the duration of a minute (and hence hour, and day, and week) is variable.
The Python "dateutil" library includes a "relativedelta" that works how one would typically expect. Though, I've usually used it to add known deltas, not to subtract two dates and arrive at a delta, so IDK if it supports that.
(It has rules, of course, for dealing with 30 day months when you're on the 31st and you add a "month". The normal timedelta class from the standard library doesn't do "months", since, as you note, it's not really a measure of time.)
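(For what it's worth, the two-argument form `relativedelta(end_date, start_date)` does exist and returns the decomposed difference.) Here's a rough stdlib-only sketch of that kind of calendar arithmetic — my own toy code with made-up names, not dateutil's actual implementation — count whole months first, clamping day-of-month as needed, then take the leftover days:

```python
from datetime import date, timedelta

def add_months(d: date, n: int) -> date:
    """Shift d by n calendar months, clamping to the last valid day
    (e.g. Jan 31 + 1 month -> Feb 29 in a leap year)."""
    y, m = divmod(d.year * 12 + d.month - 1 + n, 12)
    m += 1
    # last day of the target month
    last = (date(y + (m == 12), m % 12 + 1, 1) - timedelta(days=1)).day
    return date(y, m, min(d.day, last))

def calendar_diff(start: date, end: date) -> tuple[int, int, int]:
    """(years, months, days) between two dates, the way people speak:
    as many whole calendar months as fit, then the leftover days."""
    months = (end.year - start.year) * 12 + (end.month - start.month)
    while add_months(start, months) > end:
        months -= 1
    days = (end - add_months(start, months)).days
    return months // 12, months % 12, days
```

For example, `calendar_diff(date(2015, 5, 15), date(2019, 6, 19))` gives `(4, 1, 4)`, and the clamping rule handles the month-end edge cases.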
In Unix timestamps, over a leap second, a difference of 1 Unix second is occasionally 2 seconds long.
This is what we do. "One month and 3 days" is not a well defined duration, but it's what we use in casual conversation.
The date that occurs "in one month" is the same day-of-month in the following month. The same is true for years. Between Christmas Day one year and Boxing Day the next, it's "one year and one day", regardless of whether that actually means 366 or 367 days.
If on Christmas Day you say "in one year and one day we'll travel to Australia", you aren't really specifying a duration but a date in the future. Listeners understand that, regardless of whether the following year is a leap year, what you are describing is Boxing Day the following year.
While you could specify the same thing with a more precise interval such as 367 days, that's actually less useful as a way of describing the date of the event, though a more exact way of describing the interval of time until it.
So what the calculator shows is how to communicate a date to another human, in the inexact way that they expect. This is actually a tricky problem in programming (as this bug makes obvious).
Months are pretty well and exhaustively defined. There are 13 of them. It's just that it's not enough to say that there is a 3-month interval. You need to specify either the dates or the months.
To say months shouldn't be used is sort of like saying that complex numbers or vectors shouldn't be used. They are absolutely useful. We just deal with their component parts separately.
I think that you are missing the point. GP means to say that "a month" is poorly defined as a unit of duration, specifically. Not even ISO 8601 specifies what exactly a month means in terms of duration, despite specifying a duration description format that includes months.
In some cases you can take "a month from now" to mean the same calendar day of the next month. This leaves what a month from January 31 means ambiguous, and also dependent on when the duration starts.
You can take an easier route and say that a month is 30 days. Still, the duration is ambiguous because it depends on when it takes place (i.e. an interval), because of summer/winter time adjustment. You can say that it's 720 hours, but that's still dependent on the start of the duration because of leap seconds. In the end, the only way to describe a duration unambiguously regardless of its start time is with seconds. There is no such thing as a well defined duration for any other unit of time.
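To make the DST part concrete, here's a sketch with Python's stdlib `zoneinfo` (assuming the `Europe/Berlin` zone, where clocks sprang forward on 2024-03-31): "30 calendar days later" is not 720 elapsed hours. One gotcha worth a comment: subtracting two datetimes that share the same tzinfo does wall-clock arithmetic in Python, so you have to compare the actual instants in UTC to see the difference.

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

berlin = ZoneInfo("Europe/Berlin")
start = datetime(2024, 3, 15, 12, 0, tzinfo=berlin)

# "30 days later": same wall-clock time, 30 calendar days on.
later = start + timedelta(days=30)  # 2024-04-14 12:00 local

# Same-zone subtraction ignores the offset change (wall-clock math):
wall = later - start                # 30 days

# Real elapsed time: compare the actual instants in UTC.
elapsed = later.astimezone(timezone.utc) - start.astimezone(timezone.utc)
# Only 719 hours passed, because clocks jumped forward on 2024-03-31.
```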
A month is not a well defined unit. There are four different month units: 28, 29, 30, 31 days. If you do calculations respecting this there will be no problem.
Complex numbers and vectors are not a good analogy.
The mapping from "month" units to "days" units is well-defined. It's just some function that happens to be non-constant over the month ordinal.
Heck, if we start down the "precise time definition" rabbit hole, then "day" doesn't have some philosophically unassailable definition either, cf. leap seconds. Even the unit of "second" ends up pulling in a whole heck of a lot of physics machinery just to nail down some semblance of rigor.
Anyway, despite being such an intuitively simple and practically functional concept, the notion of time and time measurement turns out to be surprisingly subtle and to have a fascinating history. I highly recommend jumping down that rabbit hole. Hehe
Anyway, I'm surprised calc.exe doesn't calculate with and store dates using some kind of epoch time.
Uh...October, November, December and Undecimber? I don't see the last one very often. (In other words, once you start delving into it, turns out it's neither simple nor unambiguous - which calendar? If Gregorian, when did the switch from Julian happen? Etc etc.)
Because approximate durations suffice for casual conversation. When you are talking about the exact date (as computers do) you need something more accurate.
It is clear that the duration between 2015-05-15 [1] and 2019-06-19 is about 2019 - 2015 = 4 years, or exactly 1,496 days (or 1,497 days including both endpoints; yeah, this is also somewhat ambiguous). It is much less clear that the duration in question is 4 years, 1 month and 4 days; depending on the use case, months can be uniformly 30 days long, may or may not count the "excess" days, or may not even matter (in which case the duration would be 4 years and 35 days). Generally such a duration is ambiguous without context, and it would be misleading to present it as exact.
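The exact day count, by contrast, is a one-liner with the stdlib (a quick check of the figures above):

```python
from datetime import date

start, end = date(2015, 5, 15), date(2019, 6, 19)
exact = (end - start).days                        # 1496 days, unambiguous

# The "4 years and N days" framing: anchor the whole years, count the rest.
anniversary = start.replace(year=start.year + 4)  # 2019-05-15
leftover = (end - anniversary).days               # 35 days
```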
June 15th 2015 and July 19th 2019 are 4 years, 1 month and 4 days apart. That is the duration that elapsed between the two dates, and it is totally unambiguous to non-programmers.
The fact that calendar distance and "time" distance don't have a linear relationship, and that the math governing the relationship between the different units isn't straightforward, doesn't make it imprecise or misleading.
It's totally exact. Just because you can't convert the calendar distance to time distance without context doesn't make it any less exact. Not so different to being unable to compute distance traveled by the revolution count of your wheels.
If you are doing the financial calculation or similar, the month unit is absolutely inaccurate and you will keep asking about edge cases (otherwise you may lose money). And if you don't do that, you can cope with approximate units like months, half-months or so. There is no reason that the exact calculation of duration should be done in approximate units.
Once again, we're talking about the Windows built-in calculator. I really hope the specification is "make it output something that the average user will understand and agree with". And the operative keyword here would be "average".
I'm saying that the Windows Calculator does the general public a disservice by showing the exact-ish year-month-day duration at all; it's like experts oversimplifying a complex situation and making laypersons more confused. Sorry if this was not clear.
But the way people use it is not for exact dates and deltas. If it's March and I know something will happen in August, I'll say "this will happen 5 months from now", but I've never heard anyone add days and weeks to such a duration. If today is the 31st of March and you tell me "next month", that could mean anything between one and thirty days; i.e. the difference between the 31st of March and the 1st of April is "one month".
Edit: Also the same way that "tomorrow" can be in 5 minutes.
A month makes sense as an absolute value, not as a relative one.
The Windows calculator is terrible design. To calculate deltas between two dates you need to convert them to seconds, subtract them, and convert the result back to an absolute time.
But then you get the (in some ways correct) result that the difference between the 15th of one month and the 15th of another month is a couple months and couple of days, instead of the intuitive whole number of months.
That gives you an absolute time result, not a practical one. Using that methodology the calculator wouldn't be able to tell whether something is in one month or not.
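Right — an elapsed-seconds (or elapsed-days) figure alone can't recover the month answer, because the same span of time may or may not be a whole number of months depending on where it sits in the calendar. A minimal stdlib illustration (dates chosen by me for the example):

```python
from datetime import date

# Both pairs are exactly 28 days apart...
a = (date(2021, 3, 1) - date(2021, 2, 1)).days   # Feb 1 -> Mar 1: "one month"
b = (date(2021, 3, 29) - date(2021, 3, 1)).days  # Mar 1 -> Mar 29: not a month
# ...yet only the first pair is "one month apart". The duration alone
# doesn't determine the month count; the endpoints do.
```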
I think most normal people (non programmers) would be surprised by how complex such seemingly simple types like real numbers, dates and text can become.