Please don't call it Epoch Time

Every so often, I'll come across a StackOverflow question or other Internet posting that says something like:

How do I get the epoch time?

Or maybe:

I have an epoch time and I want to get the next day's epoch time.

Or even better:

I changed my epoch time to a different time zone and...

This is rubbish.  Stop.  Please, just stop and look at the words you are using.  Maybe you just have some large integer and someone told you it was an "epoch time", so you think "epoch" is some name assigned to this sort of thing.  It's not.  Epoch is an English word.

According to Merriam-Webster:

epoch
noun | ep·och  \ˈe-pək, ˈe-ˌpäk, US also and British usually ˈē-ˌpäk\
1a: an event or a time marked by an event that begins a new period or development
1b: a memorable event or date
2a: an extended period of time usually characterized by a distinctive development or by a memorable series of events
2b: a division of geologic time less than a period and greater than an age
3: an instant of time or a date selected as a point of reference (as in astronomy)

Note in particular that the word is a noun.  When one says "epoch time", one is using epoch as if it were an adjective.  It is not.

Of the above definitions, the third is the only one that applies in computing - in the same way that it does in astronomy.

An epoch is a reference point. It is the timestamp with the value 0.  It makes no sense to take a timestamp like 1507163237 and call it an epoch!

Wait, I thought epoch time was about the Unix epoch?

Well, you're getting a little closer to the truth now.  But please understand, while 1970-01-01T00:00:00Z is indeed the "Unix epoch", it is called this because it is the instant we assign the value 0 in a Unix timestamp.  So when you have a value like 1507163237, that is a Unix timestamp, not an epoch.
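
To see the distinction in code, here's a minimal Python sketch (my own illustration, not anything from the original questions): the epoch is simply the instant a timestamp of 0 refers to, and 1507163237 is just a count of seconds measured from it.

    from datetime import datetime, timezone

    # The Unix epoch is the instant that a Unix timestamp of 0 refers to.
    epoch = datetime.fromtimestamp(0, tz=timezone.utc)
    print(epoch)  # 1970-01-01 00:00:00+00:00

    # 1507163237 is not an epoch; it is a timestamp counted from that epoch.
    ts = datetime.fromtimestamp(1507163237, tz=timezone.utc)
    print(ts)     # 2017-10-05 00:27:17+00:00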

A few things you should know about Unix timestamps:

  • Unix timestamps are always based on UTC (otherwise known as GMT).  It is illogical to think of a Unix timestamp as being in any particular time zone.
  • Unix timestamps do not account for leap seconds.  They assume a perfect succession from one second to the next, without any leaps ever occurring.  This isn't the reality, of course, so if you are expecting a series of Unix timestamps to be completely contiguous, you may be in for a surprise every so often.  One doesn't commonly need to concern oneself with this, however.
  • Traditionally, Unix timestamps were defined in terms of whole seconds.  However, many modern programming languages (such as JavaScript and others) give values in terms of milliseconds.  So be certain you know which you are working with.  It is reasonable to say "a Unix timestamp in seconds", or "a Unix timestamp in milliseconds".  Some prefer the phrasing "milliseconds since the Unix epoch (without regard to leap seconds)".  There's a quick sketch of the difference just after this list.
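
Here's the quick sketch promised above, in Python (the variable names are mine, purely for illustration).  It contrasts a timestamp in seconds with the same instant in milliseconds, and shows that both are interpreted against UTC:

    import time
    from datetime import datetime, timezone

    # A Unix timestamp in seconds (time.time() returns this as a float).
    seconds = int(time.time())

    # The same instant as a Unix timestamp in milliseconds, which is the
    # scale JavaScript's Date.now() uses.
    milliseconds = seconds * 1000

    # Both describe the same instant on the UTC time scale; neither carries
    # any time zone of its own.
    print(datetime.fromtimestamp(seconds, tz=timezone.utc))
    print(datetime.fromtimestamp(milliseconds / 1000, tz=timezone.utc))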

So if the epoch is 0 and that happened in 1970, what's the big deal?

One only needs to point at the Wikipedia article on Notable epoch dates in computing.  Yes, there are many of them.  Depending on what you're doing, 0 might not be the Unix epoch, but some other epoch entirely.  Also note that while the Unix epoch always describes time on an absolute, universal time scale, some of the other epochs are not necessarily UTC based.
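
For instance (my own example, not text taken from that Wikipedia list), NTP timestamps count seconds from an epoch of 1900-01-01T00:00:00, so re-expressing one as a Unix timestamp is just a shift by the difference between the two epochs.  A rough Python sketch, with names of my own choosing:

    # Seconds between the NTP epoch (1900-01-01T00:00:00) and the
    # Unix epoch (1970-01-01T00:00:00Z): 70 years, including 17 leap days.
    NTP_TO_UNIX_OFFSET = 2_208_988_800

    def ntp_seconds_to_unix(ntp_seconds: int) -> int:
        """Re-express a whole-second NTP timestamp as a Unix timestamp."""
        return ntp_seconds - NTP_TO_UNIX_OFFSET

    # Viewed from the NTP epoch, the Unix epoch is itself just a timestamp.
    print(ntp_seconds_to_unix(2_208_988_800))  # 0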

As an example, consider that "ticks" in a .NET DateTime object are based on a 0001-01-01T00:00:00.0000000 epoch.  One has to consider both "ticks" and "kind" to construct a DateTime, and "kind" is sometimes unspecified, meaning it cannot necessarily be mapped back to a point in UTC time.  O_o
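
Here's a rough Python sketch of how those two epochs relate (the helper is hypothetical; the constants are the standard .NET values of 10,000,000 ticks per second and 621,355,968,000,000,000 ticks from 0001-01-01 to the Unix epoch, and the mapping only holds when the kind is Utc):

    # Ticks are 100-nanosecond intervals: 10,000,000 per second.
    TICKS_PER_SECOND = 10_000_000

    # Ticks from the .NET epoch (0001-01-01T00:00:00.0000000) up to the
    # Unix epoch (1970-01-01T00:00:00Z).
    UNIX_EPOCH_TICKS = 621_355_968_000_000_000

    def unix_seconds_to_dotnet_ticks(unix_seconds: int) -> int:
        """Map a Unix timestamp onto .NET ticks, assuming a Utc kind."""
        return unix_seconds * TICKS_PER_SECOND + UNIX_EPOCH_TICKS

    print(unix_seconds_to_dotnet_ticks(0))  # 621355968000000000
    print(unix_seconds_to_dotnet_ticks(1507163237))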

Does it really matter?

Probably not.  I think people know what you mean when you say "epoch time".  I just think it's a bit silly.