Recently, I had a conversation with a junior developer on my team. Let’s call him Alan. We were talking about a new notification feature that would send reminder e-mails to potentially thousands of people who had forgotten to enter certain data in the last month or so. Alan was confident that the code he’d written was correct. “I’ve tested it well,” he said…
Me neither, buddy, me neither…
I had to learn this the hard way… I was working on a platform that pulled measurements from sensors. The sensors didn’t declare a timezone for their measurement timestamps, and the platform broke twice around daylight saving transitions. The first time, clocks fell back and we ended up with duplicate records that caused conflicts; the second time, clocks sprang forward and we weren’t handling the local timestamps that never exist at all.
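To make those two failure modes concrete, here is a minimal Python sketch using zoneinfo. Europe/Berlin and the 2021 transition dates are just example choices; the comment doesn’t say which zone the platform actually used.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

berlin = ZoneInfo("Europe/Berlin")   # example zone; the comment doesn't name one

# Fall-back: 02:30 local time happens twice on 2021-10-31, so a naive sensor
# timestamp of "02:30" maps to two different instants (duplicate records).
first = datetime(2021, 10, 31, 2, 30, tzinfo=berlin, fold=0)
second = datetime(2021, 10, 31, 2, 30, tzinfo=berlin, fold=1)
print(first.utcoffset(), second.utcoffset())   # 2:00:00 vs 1:00:00

# Spring-forward: 02:30 local time on 2021-03-28 never exists, but nothing
# stops a naive parser from constructing it ("impossible" timestamps).
ghost = datetime(2021, 3, 28, 2, 30, tzinfo=berlin)
roundtrip = ghost.astimezone(ZoneInfo("UTC")).astimezone(berlin)
print(roundtrip.replace(tzinfo=None) == ghost.replace(tzinfo=None))  # False: it never existed
```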
I had a client whose clock was only a few milliseconds behind the server’s, but due to timezone crap it was effectively one hour in the past. And the signature was only valid for one hour.
So every request arrived right at the edge of the validity window. If the network just happened to be too congested, validation failed; the next request went through just fine. Took us forever to track down.
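Roughly what that looks like in code: a minimal Python sketch, assuming the timezone mix-up leaves the signature looking almost exactly one hour old on arrival. The window, the skew, and the delay values are made up for illustration.

```python
from datetime import datetime, timedelta, timezone

VALIDITY = timedelta(hours=1)               # the one-hour window from the anecdote
CLOCK_SKEW = timedelta(milliseconds=5)      # "a few milliseconds" of clock skew

def is_fresh(signed_at_utc: datetime, received_at_utc: datetime) -> bool:
    """Accept a signature only if it is at most one hour old on arrival."""
    return received_at_utc - signed_at_utc <= VALIDITY

true_now = datetime(2022, 3, 15, 12, 0, 0, tzinfo=timezone.utc)

# The timezone mix-up makes the signing time look a full hour older than it
# really is, so every request arrives with an apparent age of "one hour,
# give or take the skew and the network delay".
apparent_signed_at = true_now - timedelta(hours=1) + CLOCK_SKEW

for delay_ms in (1, 50):                    # quiet network vs. congested network
    received_at = true_now + timedelta(milliseconds=delay_ms)
    verdict = "valid" if is_fresh(apparent_signed_at, received_at) else "rejected"
    print(f"{delay_ms} ms delay -> {verdict}")
# 1 ms delay -> valid
# 50 ms delay -> rejected
```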
I don’t really get why people use anything other than ms/seconds since the epoch for anything except displaying the time to the end user. Having time be a single number with no timezone shenanigans makes writing logic like that so much easier.
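A minimal sketch of the kind of logic this comment has in mind, with a made-up stored value: the “no entry in the last month?” check behind the article’s reminder feature becomes plain arithmetic on epoch seconds.

```python
import time

# "Has it been more than a month since the last entry?" is plain arithmetic
# on epoch seconds. The stored value below is hypothetical.
THIRTY_DAYS = 30 * 24 * 60 * 60
last_entry_at = 1_660_000_000          # seconds since the epoch (hypothetical)
needs_reminder = time.time() - last_entry_at > THIRTY_DAYS
print(needs_reminder)
```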
Epochs aren’t that simple either.
First of all, local time can be relevant, so you have to store timezone information somewhere anyway.
Epochs also get iffy as soon as you need calendar math: leap years and leap seconds both get in the way.
And finally: write me an SQL query that retrieves all entries submitted in 2022 using just epochs (see the sketch below).
Timezones are annoying as fuck, don’t get me wrong, but simply ignoring them isn’t a solution either.
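For the record, here is roughly what the epoch-only version of that “2022” query ends up looking like, as a Python sketch with made-up table and column names: a range filter works fine, but you still have to pick a timezone before you can even compute where 2022 starts and ends.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Whose 2022? Even the epoch-only query needs that decision; UTC is assumed here.
tz = ZoneInfo("UTC")
start = int(datetime(2022, 1, 1, tzinfo=tz).timestamp())
end = int(datetime(2023, 1, 1, tzinfo=tz).timestamp())
print(start, end)  # 1640995200 1672531200

# The query itself is then just a range filter on the stored epoch column,
# something like (table and column names are made up):
#   SELECT * FROM entries WHERE submitted_at >= :start AND submitted_at < :end
```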
I don’t really remember SQL; does it prevent you from using a range of values? I can understand why leap seconds would be an issue.
I sure do hate time zones.
So say we all
One from JWZ: mysqldump writes out a date that it cannot parse (and more)