For what it’s worth, I have been a convert from naive to aware for a couple of years now. I used to like to think naive == UTC, but when data comes from unverifiable sources, you can’t know that for certain…
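To make the distinction concrete, here’s a minimal sketch of naive vs. aware in the stdlib (the timestamps are just illustrative):

```python
from datetime import datetime, timedelta, timezone

# A naive datetime carries no timezone information at all --
# whether it "means" UTC is purely a convention in your head.
naive = datetime(2024, 1, 15, 12, 0)
print(naive.tzinfo)  # None

# An aware datetime pins the same wall-clock reading to an offset.
aware = datetime(2024, 1, 15, 12, 0, tzinfo=timezone.utc)
print(aware.utcoffset())  # 0:00:00

# Comparing the two raises TypeError, which is the whole point:
# Python refuses to guess what the naive one means.
```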
Honestly, I’m working on a system with data we control, and we’ve been migrating pg columns from WITHOUT to WITH TIME ZONE. Our server natively runs in PST rather than UTC for convenience, and every time we find a naive timestamp we have two main options to consider: did the dev properly store the value as UTC, or did they just use a now()-like function and get PST?
I have to ask: does your server switch to storing PDT during daylight saving time? Either way, I’m so sorry you have to deal with that.
So all of our dates should be stored in UTC… but every time we have to deal with an unmarked column we have to check whether it’s in PST, because a developer forgot. And if a developer did forget, they’d probably have grabbed the PDT value during DST.
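The ambiguity is easy to demonstrate. A hypothetical naive value from an unmarked column admits (at least) two readings, and in summer they disagree by 7 hours rather than 8 (this assumes the tzdata database is available for zoneinfo):

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo  # stdlib since Python 3.9

# A hypothetical naive value pulled from an unmarked pg column.
naive = datetime(2024, 7, 1, 9, 30)

# Reading 1: the dev stored UTC, as policy says they should.
as_utc = naive.replace(tzinfo=timezone.utc)

# Reading 2: the dev used a now()-like call on a Pacific-time server.
# In July that's PDT (UTC-7), not PST (UTC-8) -- DST compounds the mess.
as_pacific = naive.replace(tzinfo=ZoneInfo("America/Los_Angeles"))

diff = as_pacific - as_utc
print(diff)  # 7:00:00 -- same digits, two moments seven hours apart
```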
- The return value of time.time() is actually a floating-point number … It’s also not guaranteed to be monotonically increasing, which is a whole other thing that can trip people up, but that will have to be a separate blog post.
Oh god, I didn’t realize that about Python and the POSIX spec. Cautiously, I’m going to guess that GPS seconds are one of the few reliable ways to uniformly convey a monotonically-increasing time reference.
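Short of GPS hardware, the practical in-process answer is `time.monotonic()`; a small sketch of the difference the article is pointing at:

```python
import time

# time.time() is wall-clock: a float of seconds since the epoch, and it
# can jump backwards if NTP or an admin adjusts the system clock.
wall = time.time()
print(type(wall))  # <class 'float'>

# time.monotonic() never goes backwards within a process, which makes it
# the right tool for measuring elapsed intervals (its epoch is undefined,
# so the absolute value is meaningless across processes).
start = time.monotonic()
time.sleep(0.01)
elapsed = time.monotonic() - start
print(elapsed > 0)  # True
```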
Python has long since deprecated the datetime.datetime.utcnow() function, because it produces a naive object that is ostensibly in UTC.
Ok, this is just a plainly bad decision then and now by the datetime library people. What possible reason could have existed to produce a TZ-naive object from a library call that only returns a reference to UTC?
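For anyone following along, the deprecated call and its recommended replacement side by side (the warning suppression is just so the legacy call runs cleanly on 3.12+):

```python
import warnings
from datetime import datetime, timezone

# The deprecated call: reads as "UTC" but returns a *naive* object.
# It emits a DeprecationWarning as of Python 3.12.
with warnings.catch_warnings():
    warnings.simplefilter("ignore", DeprecationWarning)
    legacy = datetime.utcnow()
print(legacy.tzinfo)  # None

# The recommended replacement: the same instant, but actually aware.
now = datetime.now(timezone.utc)
print(now.tzinfo)  # UTC
```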
You also have CLOCK_MONOTONIC, which may or may not be the number of seconds since the last reboot.
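You can poke at this from Python directly; a POSIX-only sketch (`clock_gettime` isn’t available on Windows, hence the guard). On Linux CLOCK_MONOTONIC happens to count from boot (excluding suspend), but POSIX only promises it never goes backwards; the epoch is unspecified:

```python
import time

if hasattr(time, "clock_gettime"):
    # On Linux this is roughly seconds since boot; elsewhere, who knows.
    mono = time.clock_gettime(time.CLOCK_MONOTONIC)
    print(mono > 0)  # True

# get_clock_info reveals which clock backs time.monotonic() on this platform.
info = time.get_clock_info("monotonic")
print(info.monotonic)  # True
```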
To be honest, this mess was directly inherited from POSIX C system calls.
Professionals have standards. (Implying that Python is not professional; mind you, I love Python and have used it for a decade.)