
Nowadays I'd agree with you, UTC is probably the best bet. But back then, it wasn't.




> But back then, it wasn't.

UTC was standardized in 1963 [0]

it was already a 40-year-old standard at the time you're talking about.

awareness of UTC being the correct choice has definitely increased over time, but UTC being the correct choice has not changed.

you say reddit servers use UTC now, which implies there was a cutover at some point. were you still at reddit when that happened? were you still hands-on with server maintenance? any anecdotes or war stories from that switchover you want to share?

because I can easily imagine parts of the system taking a subtle dependency on Arizona being Reddit Standard Time, and the transition to UTC causing headaches when that assumption was broken. your memory of this "clever" trick might be different if you had to clean up the eventual mess as well.

0: https://en.wikipedia.org/wiki/Coordinated_Universal_Time


using UTC on servers was very common in 2005

I’d say it was common enough but not universal, given the number of arguments I had from 2005 to 2015 about this exact issue.

Hold on, I'm not a sysadmin guy. Are you folks saying the server should not know what part of the world it's in, that basically it should think it's in Greenwich?

I would have thought you configure the server to know where it is, have its clock set correctly for the local time zone, and have the software running on the server operate on UTC.


From a logging perspective, there is a time when an event happens, and the timestamp for that should be absolute. Then there's the interaction with the viewer of the event, the person looking at the log, and where he is. If the timestamp is absolute, the event can be translated into the viewer's local time. If the event happens in a different TZ, for example a sysadmin sitting in PST looking at a box in EST, it's easier to translate using the sysadmin's TZ env (and any other sysadmin's TZ anywhere in the world) than to fiddle with the timestamp of the original event.

It's a minor irritation if you run your server in UTC and have to add or subtract the offset yourself, e.g. if you want your cron to run at 6 PM EDT, you have to write the cron for 0 22 * * *. You also have to do this mental arithmetic when you look at your local system logs: activities at 22:00:00 seem suspicious, but are they really?

Avoid the headaches: set all your systems to UTC, and throw the logs into a tool that does the time translation for you.

The server does not "know" anything about the time, that is, it's really about the sysadmin knowing what happened and when.
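
For the crontab example above, here's a minimal sketch of the conversion using Python's zoneinfo (the date and zone names are just illustrative, not from the original comment):

    from datetime import datetime
    from zoneinfo import ZoneInfo  # Python 3.9+

    # The wall-clock time you actually care about: 6 PM Eastern (daylight time).
    local = datetime(2024, 7, 1, 18, 0, tzinfo=ZoneInfo("America/New_York"))

    # Convert to UTC to find the hour to put in the crontab on a UTC server.
    utc = local.astimezone(ZoneInfo("UTC"))
    print(utc.strftime("%H:%M"))  # 22:00 -> crontab entry: 0 22 * * *

Note that the offset changes when daylight saving ends, so the "22" becomes "23" in winter, which is exactly the minor irritation described above.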


1) Most software gets its timestamps from the system clock.

2) If you have a mismatch between the system time and the application time, then you just have log timestamps that don't match up; it's a nightmare - even more so around DST/ST transitions.
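
To illustrate why DST transitions make naive local timestamps such a pain, a small sketch (the zone and date are chosen arbitrarily):

    from datetime import datetime
    from zoneinfo import ZoneInfo

    tz = ZoneInfo("America/New_York")

    # At the autumn fall-back the local wall clock reads 01:30 twice.
    # A naive local timestamp can't distinguish the two; Python uses the
    # 'fold' attribute to disambiguate.
    first = datetime(2024, 11, 3, 1, 30, fold=0, tzinfo=tz)
    second = datetime(2024, 11, 3, 1, 30, fold=1, tzinfo=tz)

    print(first.astimezone(ZoneInfo("UTC")))   # 2024-11-03 05:30:00+00:00 (still EDT)
    print(second.astimezone(ZoneInfo("UTC")))  # 2024-11-03 06:30:00+00:00 (now EST)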

you've got it backwards - the server clock should be in UTC, and if an individual piece of software needs to know the location, that should be provided to it separately.

for example, I've got a server in my garage that runs Home Assistant. the overall server timezone is set to UTC, but I've configured Home Assistant with my "real" timezone so that I can define automation rules based on my local time.

Home Assistant also knows my GPS coordinates so that it can fetch weather, fire automation rules based on sunrise/sunset, etc. that wouldn't be possible with only the timezone.
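
This isn't how Home Assistant's configuration actually looks, but the general pattern described here (system clock in UTC, application told its timezone separately) might look like this in Python; APP_TZ and the default zone are made up for illustration:

    import os
    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    # The system clock and all logs stay in UTC.
    now_utc = datetime.now(timezone.utc)

    # The application is told its "real" timezone separately, e.g. via an
    # environment variable (APP_TZ is a hypothetical name, not a real setting).
    app_tz = ZoneInfo(os.environ.get("APP_TZ", "America/Denver"))

    # Automation rules, display, etc. work on the converted local time.
    now_local = now_utc.astimezone(app_tz)
    print(now_utc.isoformat(), "->", now_local.isoformat())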


I kind of assumed all computer clocks were UTC, but that you also specified a location, and when asked what time it is, it did the math for you.

Windows assumes computer clocks are local time. It can be configured to assume UTC. Other operating systems assume computer clocks are UTC. Many log tools are not time zone aware.

A computer clock is just a counter: if you set the starting point to UTC, then it's UTC; if you set it to local time, then it's local time.

that's the difference between "aware" and "naive" timestamps. Python has a section explaining it in their docs (though the concept applies to any language):

https://docs.python.org/3/library/datetime.html#aware-and-na...
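
A quick sketch of the distinction in Python:

    from datetime import datetime, timezone

    naive = datetime(2005, 6, 1, 12, 0)                       # no tzinfo: "naive"
    aware = datetime(2005, 6, 1, 12, 0, tzinfo=timezone.utc)  # tzinfo set: "aware"

    print(naive.tzinfo)  # None - nothing says which wall clock this refers to
    print(aware.tzinfo)  # UTC

    # Mixing the two raises an error, which is the point of the distinction:
    try:
        aware - naive
    except TypeError as e:
        print(e)  # can't subtract offset-naive and offset-aware datetimes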


AKA time zones

A server doesn't need to "know" where in the world it is (unless it needs to know the position of the sun in the sky for some reason).

A server doesn't "think" and the timezone has no relevance to where it is located physically.

I'm not sure why you're getting downvoted.

Yes, that's exactly what I'm saying :). In fact, I've run servers where I didn't even physically know where it was located. It wouldn't have been hard to find out given some digging with traceroute, but it didn't matter. It was something I could SSH into and do everything I needed to without caring where it was.

Everyone else down-thread has clarified the why of it. Keep all of your globally distributed assets running on a common clock (UTC) so that you can readily correlate things that have happened between them (and the rest of the world) without having to do a bunch of timezone math all the time.
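
As a sketch of what that correlation looks like when everything is stamped in UTC (the timestamps and zones below are made up):

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    # Two events logged by servers in different regions, both stamped in UTC.
    event_a = datetime(2024, 3, 12, 6, 55, tzinfo=timezone.utc)
    event_b = datetime(2024, 3, 12, 7, 2, tzinfo=timezone.utc)

    # Ordering and deltas need no timezone math at all.
    print(event_b - event_a)  # 0:07:00

    # Rendering for a particular viewer is purely a presentation step.
    print(event_a.astimezone(ZoneInfo("America/Los_Angeles")))
    print(event_a.astimezone(ZoneInfo("Europe/Berlin")))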


Common, but not universal - from 2005 to as late as 2014 I worked for companies that used Pacific time on their servers.

the standard for service providers was UTC in 1995

I have photos showing that my dad (born 1949, never in the military) kept his watch on UTC in the early 70s.

Would he by any chance refer to it as Zulu or Zebra time? The Z-suffix shorthand for UTC/GMT standardisation has nautical roots IIRC and the nomenclature was adopted in civil aviation also. I sometimes say Zulu time and my own dad, whose naval aspirations were crushed by poor eyesight, is amongst the few that don’t double-take.


