A tale of two cities' time zone offsets #
A while back, I was at work testing some time-zone-related issues. I needed to provide a date. I didn't really care about the exact date, so I just used a blank value or some long-ago date near the epoch for our database. (The epoch for Epic's M code is, IIRC, 1841-01-01, or something similar -- long story about why...)
For my usage, I had a time of day and a date (in real use, a realistic, current date) and needed an offset from UTC. I was expecting to get 5 or 6 hours -- 18000 or 21600 seconds -- since Chicago is UTC-5 during daylight saving time and UTC-6 the rest of the year.
But I kept getting...21036 seconds? What? That's about 9 minutes off from 6 hours, which makes no sense.
I poked at it, and eventually discovered that for 1883-11-18, I got an offset of 21036; for 1883-11-19, I got the expected 21600.
The offset is different from one day to the next?!
It turns out that, for time zones, November 18, 1883 is a very special day: at noon that day, the US railroads adopted standard time zones (the "Day of Two Noons"), making November 19 the first full day on standard time.
Before then, time zones as we know them really didn't exist, and most people used local solar time -- whatever time the sun was highest was 12:00 noon. But this caused havoc with railroad schedules, and the 1883 adoption of standard time zones addressed that.
So, the above 564-second discrepancy arises because local solar time in Chicago isn't quite the same as the America/Chicago time zone's standard offset. Chicago is at longitude -87.6293 degrees; each 15 degrees of longitude corresponds to a 1-hour offset from UTC (or, in 1883, from Greenwich time). So the offset for Chicago's local solar time is about 5.841953 hours -- that's 569 seconds short of 6 hours.
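If you want to check my arithmetic, here's the back-of-the-envelope version in Python (taking -87.6293 degrees as "the" longitude of Chicago -- more on that choice below):

```python
# Local mean (solar) time offset implied by a longitude: 15 degrees per hour.
longitude = -87.6293                   # downtown Chicago, degrees east of Greenwich
solar_seconds = longitude / 15 * 3600  # about -21031 seconds
print(round(-solar_seconds))           # 21031 seconds west of Greenwich
print(round(21600 + solar_seconds))    # 569 seconds short of 6 hours
```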
So, when my code asked the time zone conversion library for a time of day on November 18, 1883, it got the local solar time offset (the changeover happened at noon, so earlier times that day still fall under the old rule); for the following day, it got the historically correct, standard 6-hour offset!
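You can reproduce the whole thing with Python's zoneinfo module, which reads the same OS time zone database that bit me (a minimal sketch, not our actual library's API, and it assumes your system's tzdata includes the 1883 rules -- most do):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+; reads the OS tz database

chicago = ZoneInfo("America/Chicago")
for day in (18, 19):
    # Morning times, since the 1883 changeover happened around noon on the 18th.
    dt = datetime(1883, 11, day, 6, 0, tzinfo=chicago)
    print(dt.date(), int(-dt.utcoffset().total_seconds()), "seconds west of UTC")
# 1883-11-18 21036 seconds west of UTC
# 1883-11-19 21600 seconds west of UTC
```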
But wait: there's a 5-second discrepancy between the 564 seconds the library reports and the 569 seconds from the geodetic computation above -- why is that?
That comes down to what longitude you use for Chicago. A 5-second discrepancy corresponds to about 0.021 degrees of longitude; in fact, working backwards, 21036 seconds corresponds exactly to longitude 87 degrees 39 minutes west (-87.65 degrees), about 0.021 degrees west of the -87.6293 figure above. At Chicago's latitude, that's about 1.73 kilometers east or west -- roughly a mile. These days, Chicago is about 15 miles wide; even in 1883, Chicago was 7 or 8 miles wide, so you could easily get different solar times depending on which point in the city you take for its longitude.
(Given Chicago's particular geography on the shore of Lake Michigan, one could reasonably declare "the" longitude for Chicago to be close to the easternmost point in the city, near downtown on the shore; on the other hand, if you use some kind of geographic center, you'd be several miles west.)
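For completeness, here's that seconds-to-distance conversion, taking 41.88 degrees north as an assumed downtown latitude:

```python
import math

deg = 5 / 3600 * 15  # 5 clock seconds of offset -> degrees of longitude
# One degree of longitude spans ~111.32 km at the equator, scaled by cos(latitude).
km = deg * 111.32 * math.cos(math.radians(41.88))
print(round(deg, 4), round(km, 2))  # 0.0208 degrees, ~1.73 km
```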
What was the library code using to get that value? #
Our library code ultimately uses the operating system's time zone database (the IANA tz database, a.k.a. zoneinfo). That database is vastly more complicated than you might think; as we see here, time zone offsets are date-dependent.
(And let's not even get started on falsehoods you may believe about time. For one fun example: do you think the day after December 29 is December 30? Not always! The day after December 29, 2011, in Samoa and Tokelau was December 31, 2011!)
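You can watch Samoa's skipped day happen with the same zoneinfo approach (assuming I have the transition right: Samoa was on daylight time at UTC-10 and jumped across the date line to UTC+14):

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

apia = ZoneInfo("Pacific/Apia")
utc = datetime(2011, 12, 30, 9, 59, tzinfo=timezone.utc)
for t in (utc, utc + timedelta(minutes=1)):
    # One minute of UTC time; a whole local calendar day disappears.
    print(t.isoformat(), "->", t.astimezone(apia).isoformat())
# 2011-12-30T09:59:00+00:00 -> 2011-12-29T23:59:00-10:00
# 2011-12-30T10:00:00+00:00 -> 2011-12-31T00:00:00+14:00
```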
tl;dr use recent-ish dates when testing #
The lesson here is: if you're testing code that involves time zones, and you're sending it dates, and you think "meh, the exact date doesn't matter" -- that may be true for relatively recent dates. But don't just send blank or very old values; use something recent-ish.
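For example, something like this hypothetical test fixture -- the specific date is arbitrary, just keep it recent:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Pin a fixed, recent-ish date rather than a blank or epoch-era value,
# so pre-1883 local-mean-time rules can't sneak into the test.
TEST_DT = datetime(2024, 1, 15, 9, 30, tzinfo=ZoneInfo("America/Chicago"))
assert TEST_DT.utcoffset().total_seconds() == -6 * 3600  # CST in January
```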
The elephant DeLorean in the room #
Having gone down that rabbit hole, let me end by saying that the classic Back to the Future Part III really missed an opportunity to make a very, very subtle time travel reference: in that movie, Marty travels back to 1885. He should have traveled back to November 1883!