Sort local time by UTC time

Is there anything wrong with sorting local time by UTC timestamp?
I store times as Unix timestamps in UTC and display the local time in a fairly elaborate, user-friendly format that would be a nightmare to sort, so I decided to sort by the UTC timestamp instead.
Here is an exaggerated example:
<td data-utc-timestamp="1514372400">
<b>7:00<i>PM</i></b> December 27th,
beautiful Wednesday evening in Shanghai,
the year of the Rooster, 2017 :D
</td>
Seems perfectly fine to me to sort this by UTC timestamp, but I am not an expert in the field so I am having doubts.
Is there any scenario where an array of UTC times will have a different order than the same array of UTC times converted to local times?

UTC is constant: it never observes daylight saving time or any other clock change, so UTC timestamps grow monotonically with real time. Local time does not: during a DST fall-back transition the local clock repeats an hour, so two distinct UTC instants can map to the same (or an apparently earlier) local wall-clock time. Sorting by the UTC timestamp still yields the correct chronological order in that case; it's sorting by the displayed local string that would break. Your approach is sound.
This answer has more details: Does UTC observe daylight saving time?
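For what it's worth, the sorting step itself stays trivial if you keep the timestamp in the markup. A minimal JavaScript sketch, assuming each table row contains a cell carrying the data-utc-timestamp attribute from the example above:

const tbody = document.querySelector("table tbody");
const rows = Array.from(tbody.querySelectorAll("tr"));
// Compare rows by the raw UTC timestamp, ignoring the pretty local text
rows.sort((a, b) =>
  Number(a.querySelector("[data-utc-timestamp]").dataset.utcTimestamp) -
  Number(b.querySelector("[data-utc-timestamp]").dataset.utcTimestamp)
);
// Re-append in sorted order; appendChild moves the existing nodes
rows.forEach(row => tbody.appendChild(row));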

Related

A DATE with the ORACLE - The Time Zone Dilemma. Is it midnight or is there no time?

So I'm writing some code to convert some Oracle dates I received into a different time zone. The problem is that when the time portion is 00:00:00, I don't know how to determine whether it is legitimately midnight or whether the value was never meant to include a time.
Currently I assume that a time of 00:00:00 means a time-free date, because unfortunately that is sometimes the case. But there is a small, statistically real chance that the value is legitimately midnight, and I have had no success finding a better approach.
I can't assume every 00:00:00 is midnight, because if the data was intended to carry only a date, converting it to most other US time zones would change the date.
Any suggestions?
Check the other values in the same column to see what the distribution of 00:00:00 values looks like. If every entry is 00:00:00, the column is almost certainly date-only; if midnight shows up in roughly 1 in 86,400 (24*60*60) rows, they are almost certainly real timestamps; if it's somewhere in between, you've got a problem. (A sketch of this check follows these suggestions.)
You could also look for a check constraint to see if times are constrained to be midnight.
You could look at the semantics of the column also -- what does the name tell you?
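A minimal JavaScript sketch of the distribution check, assuming the column's values have already been fetched into an array of Date objects (midnightFraction is a made-up helper name for illustration):

function midnightFraction(dates) {
  // Count values whose time-of-day fields are exactly 00:00:00
  const midnights = dates.filter(
    d => d.getHours() === 0 && d.getMinutes() === 0 && d.getSeconds() === 0
  ).length;
  return midnights / dates.length;
}
// ~1.0        -> almost certainly date-only values
// ~1/86400    -> genuine timestamps that occasionally land on midnight
// in between  -> mixed data; no safe automatic rule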

Timezone confusion with Ruby's Time class

It's worth noting I am using Ruby 2.1.2, where the Time class uses a signed 63-bit integer. The integer is used when it can represent the number of nanoseconds since the epoch; otherwise Bignum or Rational is used, according to the documentation.
When I use ::new without arguments, it gives me the current time using my local time zone (not UTC):
> Time.new
=> 2015-06-30 18:29:08 -0400
It's correct. It's 6:29pm on the East coast of the US. Now I want to check the local time in two timezones to the east of EDT:
> t = Time.new(2015,6,30,18,29,8,"+02:00")
=> 2015-06-30 18:29:08 +0200
This is where my confusion comes in. When I specify two timezones to the east of me, I expect there to be two additional hours because each timezone is 15 degrees longitude and each is represented by 1 hour.
Why did it give me the same time as my local time, instead of two hours later?
What you think is happening isn't what's happening. The final argument to Time.new is a UTC offset, not a shift relative to your current time zone: you built the same wall-clock time (18:29:08) and stamped it with a +02:00 offset, so it denotes a different instant rather than your instant displayed two zones east.
If you want to see the time at the offset of two hours ahead of where you are, then you want to create the instance of Time, get your local time, and shift that by the GMT offset.
Time.now.getlocal("-02:00")  # same instant, rendered with a -02:00 offset (two zones east of EDT)
If you want to compute this, look at your local time's utc_offset first, then add or subtract 3600 seconds per time zone you want to move. Note that this only moves across whole-hour zones and will break for zones whose offsets are not whole hours (e.g. Newfoundland, which sits on a half-hour offset).
t = Time.now
t.getlocal(t.utc_offset + (3600 * 2))  # local offset in seconds, shifted two hours (two zones) east

What is this Elasticsearch timestamp format?

I'm reverse engineering a query from a Kibana dashboard, and it has timestamp values in it like '1408884022624'.
Reading over the Elasticsearch date mapping docs, I don't see anything about what appears to be some sort of millisecond or tick format. Could someone tell me what the number above represents in my query? (I'm pretty sure we're not using a custom date format.)
It's the number of milliseconds since the start of the Unix epoch, 00:00:00 UTC on January 1, 1970, sometimes referred to as Java epoch time. Technically it isn't Unix epoch time, which counts the number of seconds since that date, but many tools and converters handle both seconds and milliseconds.
Care should be taken, though, as it's quite easy to accidentally get the time in one format (let's say seconds) and pass it to a function or method expecting it in the other.
http://en.wikipedia.org/wiki/Unix_time
http://www.javaworld.com/article/2074293/core-java/groovy--java--and-the-unix-epoch-time.html
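A quick JavaScript illustration using the value from the question; the seconds/milliseconds mix-up mentioned above is the thing to watch for:

const ts = 1408884022624;                        // milliseconds since the Unix epoch
console.log(new Date(ts).toISOString());         // "2014-08-24T12:40:22.624Z"
// Classic bug: treating milliseconds as seconds (or vice versa)
console.log(new Date(ts / 1000).toISOString());  // lands in mid-January 1970, clearly wrong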

Subroutine to apply Daylight Bias to display time in local DST?

The UK is currently 1 hour ahead of UTC due to Daylight Saving Time. When I check the DaylightBias value from GetTimeZoneInformation, it is currently -60. Does that mean that translating UTC to DST means DST = UTC + (-1 * DaylightBias), i.e. negate and add?
I thought that in this case adding DaylightBias to UTC would be the correct operation, which would require DaylightBias to be 60 rather than -60.
It's
UTC = DST + DaylightBias [for your specific time zone]
so yes, you subtract the bias from UTC to get local time. (Strictly, while daylight time is in effect Windows applies the combined value Bias + DaylightBias; for the UK, Bias is 0, so DaylightBias alone gives the -60.)
Here's a quote from the MS glossary:
time zone bias: The positive, zero, or negative offset in minutes from Coordinated Universal Time (UTC). For example, Middle European Time (MET, GMT+01:00) has a time zone bias of "-60" because it is one hour ahead of UTC. Pacific Standard Time (PST, GMT-08:00) has a time zone bias of "+480" because it is eight hours behind UTC.
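A minimal JavaScript sketch of applying such a bias, assuming the combined bias in minutes has already been obtained (utcToLocal is a hypothetical helper); the resulting Date's UTC fields carry the local wall-clock values:

function utcToLocal(utcMillis, biasMinutes) {
  // UTC = local + bias, therefore local = UTC - bias
  return new Date(utcMillis - biasMinutes * 60 * 1000);
}
const noonUtc = Date.UTC(2015, 5, 30, 12, 0, 0);      // 2015-06-30 12:00 UTC
console.log(utcToLocal(noonUtc, -60).toISOString());  // "2015-06-30T13:00:00.000Z", i.e. 1 PM BST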

JavaScript date constructor and timezone

The Date constructor in JavaScript/ECMAScript/JScript allows passing the number of milliseconds since midnight, 1/1/1970. I can't find documentation anywhere on whether this is midnight in the client machine's time zone or midnight GMT. Which is it? Can it be relied on between different browsers and versions? Is this officially documented anywhere?
From the ECMAScript specification:
Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC. In time values leap seconds are ignored. It is assumed that there are exactly 86,400,000 milliseconds per day. ECMAScript Number values can represent all integers from –9,007,199,254,740,991 to 9,007,199,254,740,991; this range suffices to measure times to millisecond precision for any instant that is within approximately 285,616 years, either forward or backward, from 01 January, 1970 UTC.
The actual range of times supported by ECMAScript Date objects is slightly smaller: exactly –100,000,000 days to 100,000,000 days measured relative to midnight at the beginning of 01 January, 1970 UTC. This gives a range of 8,640,000,000,000,000 milliseconds to either side of 01 January, 1970 UTC.
The exact moment of midnight at the beginning of 01 January, 1970 UTC is represented by the value +0.
So to answer your question, it's Coordinated Universal Time.
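A quick demonstration; the local rendering will of course vary with the machine's zone:

// The millisecond argument counts from 1970-01-01T00:00:00 UTC,
// regardless of the client machine's time zone.
const epoch = new Date(0);
console.log(epoch.toISOString());  // "1970-01-01T00:00:00.000Z" everywhere
console.log(epoch.toString());     // local, e.g. "Wed Dec 31 1969 19:00:00 GMT-0500" on the US East Coast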
