> POSIX time, also known as Unix time, is the number of seconds since the Unix epoch, which was 1970-01-01 at 00:00:00. … I think there should be a concise explanation of the problem.
I don’t think that the definition that software engineers believe is wrong or misleading at all. It really is the number of seconds that have passed since Unix’s “beginning of time”.
But to address the problem the article brings up, here’s my attempt at a concise definition:
POSIX time, also known as Unix time, is the number of seconds since the Unix epoch, which was 1970-01-01 at 00:00:00, and does not include leap seconds that have been added periodically since the 1970s.
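To make that definition concrete: the POSIX mapping from a UTC date to a timestamp is pure arithmetic, with every day treated as exactly 86400 seconds, so leap seconds never enter the count. A minimal sketch (`posix_timestamp` is a hypothetical helper, not a standard function):

```python
from datetime import datetime, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def posix_timestamp(dt):
    """POSIX formula: days since the epoch times 86400, plus seconds
    into the day. Leap seconds are simply never counted."""
    days = (dt.date() - EPOCH.date()).days
    return days * 86400 + dt.hour * 3600 + dt.minute * 60 + dt.second

# 2017-01-01 00:00:00 UTC is 17167 days after the epoch:
print(posix_timestamp(datetime(2017, 1, 1, tzinfo=timezone.utc)))  # 1483228800
```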
A second was originally defined as a fraction of a day, i.e. of one Earth rotation: clocks count 86400 seconds and then roll over to the next day. But Earth's rotation speed changes, so how much "time passing" fits in those 86400 seconds varies a little, and clocks based on Earth's rotation drift out of sync with atomic clocks.
Leap seconds are inserted into rotation-based clocks so that their dates match the atomic-clock measure of how much time has passed. They are time that has actually passed but that ordinary civil time has not accounted for, so it's inconsistent to say both "Unix time really is the number of seconds that have passed" and "it does not include leap seconds": those leap seconds are time that has passed.
It really is the number of seconds that have passed since Unix's "beginning of time", minus twenty-seven. Some UTC days have 86401 seconds; Unix assumes they all had 86400.
It's wrong and misleading in precisely the way you (and other commenters here) were wrong and misled, so it seems like that's a fair characterization.
Strictly speaking, Unix time is monotonic: it counts an integer number of seconds and never goes backwards, it only repeats a value during leap seconds.
POSIX does define "the amount of time (in seconds and nanoseconds) since the Epoch", for the output of clock_gettime() with CLOCK_REALTIME [0]. That "amount of time" must be stopped or smeared or go backward in some way when it reaches a leap second. This isn't the 80s, we have functions that interact with Unix time at sub-second precision.
“Monotonic” means non-decreasing (or non-increasing if you’re going the other way). Values are allowed to repeat. The term you’re looking for is “strictly increasing.”
I guess this hinges on whether you think Unix time is an integer or a float. If you think it's just an integer, then yes, you can't get a negative delta.
If, however, you think it's a float, then you can.
Because a day, that is, the time between one midnight UTC and the next, is not always exactly 86400 seconds, due to leap seconds. But Unix time always increases by exactly 86400 over that interval.
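This can be checked with Python's `calendar.timegm`, which implements exactly that POSIX day-count arithmetic: the Unix timestamps of the two UTC midnights bracketing the 2016-12-31 leap second differ by exactly 86400, even though that UTC day contained 86401 real seconds:

```python
import calendar

# Unix timestamps of the UTC midnights before and after 2016-12-31,
# a day that ended with a leap second (86401 real seconds long):
midnight_before = calendar.timegm((2016, 12, 31, 0, 0, 0))  # 1483142400
midnight_after = calendar.timegm((2017, 1, 1, 0, 0, 0))     # 1483228800

print(midnight_after - midnight_before)  # 86400
```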
I think you're describing the exact confusion that developers have. Unix time doesn't include leap seconds, but they are real seconds that happened. Consider a system that counts days since 1970, but ignores leap years so doesn't count Feb 29. Those 29ths were actual days, just recorded strangely in the calendar. A system that ignores them is going to give you an inaccurate number of days since 1970.
They are not. They are inserted because two time scales, one based on the rotation of the Earth and the other on atomic clocks, have slowly drifted apart to the point that a virtual second is inserted or removed to bring them back into agreement. To the extent they exist, by the time they are accounted for, they've already occurred fractionally over several months or years.
> A system that ignores them is going to give you an inaccurate number of days since 1970.
It depends on your frame of reference. If you're looking at an atomic clock it's inaccurate, if you're looking at the movement of the earth with respect to the sun and the stars, it's perfectly accurate.
It's easier to me if you separate these into "measured time" and "display time." Measured time is necessary for doing science. Display time is necessary for flying a plane. We can do whatever we want with "display time," including adding and subtracting an entire hour twice a year, as long as everyone agrees to follow the same formula.
Are you sure they actually happened? As you say, at least one of us is confused. My understanding is that the added leap seconds never happened, they are just inserted to make the dates line up nicely. Perhaps this depends on the definition of a second?
Leap seconds are exactly analogous to leap days. One additional unit is added to the calendar, shifting everything down. For leap days we add a day 29 when normally we wrap after 28. For leap seconds we add second 60 when normally we wrap after 59.
Imagine a timestamp defined as days since January 1, 1970, except that it ignores leap years and says all years have 365 days. Leap days are handled by giving February 29 the same day number as February 28.
If you do basic arithmetic with these timestamps to answer the question, “how many days has it been since Nixon resigned?” then you will get the wrong number. You’ll calculate N, but the sun has in fact risen N+13 times since that day.
Same thing with leap seconds. If you calculate the number of seconds since Nixon resigned by subtracting POSIX timestamps, you’ll come up short. The actual time since that event is 20-some seconds more than the value you calculate.
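The leap-day analogy above can be sketched in a few lines. Here `naive_day_number` is a hypothetical toy function implementing the "every year has 365 days, Feb 29 shares Feb 28's number" scheme, and the end date is fixed at 2024-08-09 for reproducibility:

```python
from datetime import date

# Cumulative days before each month in a non-leap year.
DAYS_BEFORE_MONTH = [0, 31, 59, 90, 120, 151, 181, 212, 243, 273, 304, 334]

def naive_day_number(d):
    """Toy timestamp: days since 1970-01-01, pretending every year
    has 365 days. Feb 29 gets the same number as Feb 28."""
    day = min(d.day, 28) if d.month == 2 else d.day
    return (d.year - 1970) * 365 + DAYS_BEFORE_MONTH[d.month - 1] + (day - 1)

nixon = date(1974, 8, 9)   # the day Nixon resigned
later = date(2024, 8, 9)   # fixed "now" for reproducibility

naive = naive_day_number(later) - naive_day_number(nixon)  # 18250
real = (later - nixon).days                                # 18263
print(real - naive)  # 13 -- the leap days the toy scheme skipped
```

The subtraction is internally consistent, but it undercounts the actual number of sunrises by the 13 leap days between those two dates, exactly the kind of error described above.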
I think you make an interesting point here, but then your example is exactly backwards.
If you have a timestamp defined as days since January 1, 1970: If you do basic arithmetic to answer the question "How many days has it been since Nixon resigned" you will _always get the right number_. There are no leap days, they are just normal days.
The problem only comes in when you try to convert between this date type and other types. Our "days since the epoch" date type is fully internally consistent. As long as you know the correct value for "the day Nixon resigned" and "now", it's just a subtraction.
I'm honestly just diving into this now after reading the article, and not a total expert. Wikipedia has a table of a leap second happening across TAI (an atomic clock that purely counts seconds), UTC, and Unix timestamps according to POSIX: https://en.wikipedia.org/wiki/Unix_time#Leap_seconds
It works out that Unix time spits out the same integer for two seconds.
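You can reproduce that repeat with Python's `calendar.timegm`, which applies the pure POSIX formula and doesn't validate the seconds field, so it accepts the leap second 23:59:60 directly:

```python
import calendar

# The leap second at the end of 2016-12-31 (UTC):
before = calendar.timegm((2016, 12, 31, 23, 59, 59))  # 1483228799
leap = calendar.timegm((2016, 12, 31, 23, 59, 60))    # 1483228800
after = calendar.timegm((2017, 1, 1, 0, 0, 0))        # 1483228800

# The leap second and the following midnight get the same timestamp:
print(leap == after)  # True
```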
I thought you were wrong because if a timestamp is being repeated, that means two real seconds (that actually happened) got the same timestamp.
However, after looking hard at the tables in that Wikipedia article comparing TAI, UTC, and Unix time, I think you might actually be correct-- TAI is the atomic time (that counts "real seconds that actually happened"), and it gets out of sync with "observed solar time." The leap seconds are added into UTC, but ultimately ignored in Unix time.* ~~So Unix time is actually more accurate to "real time" as measured atomically than solar UTC is.~~
The only point of debate is that most people consider UTC to be "real time," but that's physically not the case in terms of "seconds that actually happened." It's only the case in terms of "the second that high noon hits." (For anyone wondering, we can't simply fix this by redefining a second to be an actual 24/60/60 division of a day because our orbit is apparently irregular and generally slowing down over time, which is why UTC has to use leap seconds in order to maintain our social construct of "noon == sun at the highest point" while our atomic clocks are able to measure time that's actually passed.)
*Edit: Or maybe my initial intuition was right. The table does show that one Unix timestamp ends up representing two TAI (real) timestamps. UTC inserts an extra second, while Unix time repeats a second, to handle the same phenomenon. The table is bolded weirdly (and I'm assuming it's correct while it may not be); and beyond that, I'm not sure if this confusion is actually the topic of conversation in the article, or if it's just too late in the night to be pondering this.