Subroutine to apply Daylight Bias to display time in local DST?

The UK is currently 1 hour ahead of UTC due to Daylight Saving Time. When I check the DaylightBias value from GetTimeZoneInformation, it is currently -60. Does that mean that translating UTC to DST means DST = UTC + -1 * DaylightBias, i.e. negate and add?
I thought that in this case adding the Daylight Bias to UTC would be the correct operation, which would require DaylightBias to be 60 rather than -60.

It's
UTC = local time + Bias + DaylightBias [for your specific timezone, while daylight time is in effect]
so yes, you negate the bias and add it. In the UK the base Bias is 0 and DaylightBias is -60, so local DST time = UTC - (-60 minutes) = UTC + 1 hour.
Here's a quote from the MS glossary:
time zone bias: The positive, zero, or negative offset in minutes from Coordinated Universal Time (UTC). For example, Middle European Time (MET, GMT+01:00) has a time zone bias of "-60" because it is one hour ahead of UTC. Pacific Standard Time (PST, GMT-08:00) has a time zone bias of "+480" because it is eight hours behind UTC.
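
Here is a minimal sketch of that rule in Win32 C, using only the documented TIME_ZONE_INFORMATION fields; the return value of GetTimeZoneInformation tells you which bias currently applies:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        TIME_ZONE_INFORMATION tzi;
        DWORD mode = GetTimeZoneInformation(&tzi);

        LONG bias = tzi.Bias;                 /* minutes; UTC = local time + bias */
        if (mode == TIME_ZONE_ID_DAYLIGHT)
            bias += tzi.DaylightBias;         /* e.g. -60 for UK summer time */
        else if (mode == TIME_ZONE_ID_STANDARD)
            bias += tzi.StandardBias;         /* usually 0 */

        /* Invert the sign to go the other way: local = UTC - bias. */
        printf("UTC = local + (%ld) min, so local = UTC - (%ld) min\n", bias, bias);
        return 0;
    }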


Time Zone format

When I find a time zone format like
"CET-1CEST,M3.5.0,M10.5.0/3"
what's the meaning of the three fields? I understand the CET-1CEST part, but what about the last two, M3.5.0 and M10.5.0/3?
I found this format in this tutorial for ESP32 NTP Time and in this list.
Well, if you scroll down a little, you'll notice that such a string is called a timezone string (at least in the article). After a quick search, I came across this link, where the meaning of such a string is explained:
std offset dst [offset],start[/time],end[/time]
For example,
Here are some example TZ values, including the appropriate Daylight Saving Time and its dates of applicability. In North American Eastern Standard Time (EST) and Eastern Daylight Time (EDT), the normal offset from UTC is 5 hours; since this is west of the prime meridian, the sign is positive. Summer time begins on March’s second Sunday at 2:00am, and ends on November’s first Sunday at 2:00am.
EST+5EDT,M3.2.0/2,M11.1.0/2
So for CET-1CEST,M3.5.0,M10.5.0/3:
The standard timezone is CET (Central European Time)
The offset is −1; in POSIX TZ strings the sign is inverted (positive means west of the prime meridian, as in the EST example above), so CET is one hour ahead of UTC
The DST timezone is CEST (Central European Summer Time)
DST starts at:
3: the third month of the year (March)
5: the last…
0: …Sunday of the month
(no time specifier, defaults to 2 AM)
DST ends at:
10: the tenth month of the year (October)
5: the last…
0: …Sunday of the month
3: at 3 AM
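As a quick check, here is a sketch that installs that exact TZ string; it assumes a POSIX-style C library (the ESP32 toolchain's newlib accepts the same setenv/tzset calls and TZ syntax):

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    int main(void)
    {
        /* Install the TZ string from the question, then re-read it. */
        setenv("TZ", "CET-1CEST,M3.5.0,M10.5.0/3", 1);
        tzset();

        time_t now = time(NULL);
        char buf[64];
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S %Z", localtime(&now));
        printf("%s\n", buf);   /* %Z prints CET in winter, CEST in summer */
        return 0;
    }

The M3.5.0 and M10.5.0/3 rules are applied for you: the library switches the abbreviation and the offset on the last Sunday of March and October.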

What happens to milliseconds since epoch when daylight saving time begins/ends?

Milliseconds since epoch represents the number of milliseconds that have elapsed since 1970. At the instant daylight saving time ends, when we set the clocks back from 2:00 to 1:00, do the milliseconds since epoch fall back as well, or do they keep counting forward?
Another question: if I live in California, US, which is on Pacific (Daylight/Standard) Time, are the milliseconds since epoch the same there as they are in, say, New York, on Eastern (Daylight/Standard) Time?
The milliseconds since epoch are not influenced by timezones or daylight saving time (daylight saving time just shifts the local UTC offset by -1 / +1 hour).
The milliseconds/seconds since epoch are always in UTC (or GMT+0).
Milliseconds since the UNIX epoch (January 1, 1970 00:00:00 UTC) aren't affected by daylight saving time or timezones, as J. van Dijk mentioned.
To answer your 2nd question explicitly, which I think is important for understanding UTC itself: if two people call System.currentTimeMillis() in Java or new Date().getTime() in JavaScript at the same time, one of them in California and one in New York, they should get the same number of milliseconds.
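
To illustrate, a sketch in C rather than Java/JavaScript, assuming a POSIX system with the IANA zone database installed: changing the timezone changes only how an instant is rendered, never the epoch count itself.

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    static void show(const char *zone, time_t t)
    {
        char buf[64];
        setenv("TZ", zone, 1);    /* pretend to be an observer in this zone */
        tzset();
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S %Z", localtime(&t));
        printf("%-20s %s  (epoch: %lld)\n", zone, buf, (long long)t);
    }

    int main(void)
    {
        time_t now = time(NULL);            /* one instant, captured once */
        show("America/Los_Angeles", now);   /* California */
        show("America/New_York", now);      /* New York */
        /* Both lines report the same epoch value; only the wall-clock
           rendering (and the %Z abbreviation) differs. */
        return 0;
    }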

UTC time explanation

Can anybody explain to me what it means when I have the following time:
2012-12-28T18:12:33+01:00
I'm new to the whole datetime stuff and I can't find a good explanation on the web.
Currently I'm in Holland. So does it mean:
2012-12-28T18:12:33+01:00 = 2012-12-28 19:12:33
or
2012-12-28T18:12:33+01:00 = 2012-12-28 17:12:33
or
2012-12-28T18:12:33+01:00 = 2012-12-28 18:12:33
The 2012-12-28T18:12:33+01:00 date string indicates that it is 2012-12-28 at 18:12:33 in the timezone that corresponds to +1 hour from UTC, which is the CET timezone.
This appears to be the ISO 8601 format. The T indicates the start of the time element.
Times are expressed in local time, together with a time zone offset in hours and minutes. A time zone offset of "+hh:mm" indicates that the date/time uses a local time zone which is "hh" hours and "mm" minutes ahead of UTC. A time zone offset of "-hh:mm" indicates that the date/time uses a local time zone which is "hh" hours and "mm" minutes behind UTC.
The value you provided, 2012-12-28T18:12:33+01:00, is a DateTime+Offset value in ISO 8601 format, meaning "December 28th, 2012 at 18:12:33, one hour ahead of UTC".
The +01:00 portion represents an offset, not a time zone. See TimeZone != Offset.
The time zone for Holland is either Europe/Amsterdam in the IANA/Olson database, or the entry in the Windows database that has the Id of W. Europe Standard Time and the English display name of "(UTC+01:00) Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna".
This zone is in the +01:00 offset during part of the year, and uses the +02:00 offset during European Summer Time.
Just because you have +01:00 in December, does not mean that is the correct offset to use year-round. It also does not tell you that the timestamp is in Holland. There are several other time zones that use the same offset, and not always at the same times of year.
To convert from one time zone to another, you need to first apply the offset you have. Use the inverse of the sign you have. Since you have +01:00, you would subtract an hour to get the UTC time of 17:12:33. Then you need to know what the correct offset is for the target time zone at that time of year. For that, you need a time zone database.
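Here is a sketch of the first step of that recipe in C; the parsing is hand-rolled for illustration, and note that timegm() is a common glibc/BSD extension rather than ISO C:

    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        const char *stamp = "2012-12-28T18:12:33+01:00";
        struct tm tm = {0};
        char sign;
        int oh, om;

        if (sscanf(stamp, "%d-%d-%dT%d:%d:%d%c%d:%d",
                   &tm.tm_year, &tm.tm_mon, &tm.tm_mday,
                   &tm.tm_hour, &tm.tm_min, &tm.tm_sec,
                   &sign, &oh, &om) != 9)
            return 1;
        tm.tm_year -= 1900;                 /* struct tm counts years from 1900 */
        tm.tm_mon  -= 1;                    /* ...and months from 0 */

        /* Apply the offset you have, inverting its sign. */
        time_t as_utc = timegm(&tm);        /* fields interpreted as UTC */
        long offset = (oh * 3600L + om * 60L) * (sign == '-' ? -1 : 1);
        time_t utc = as_utc - offset;       /* +01:00 -> subtract one hour */

        char buf[32];
        strftime(buf, sizeof buf, "%Y-%m-%dT%H:%M:%SZ", gmtime(&utc));
        printf("%s\n", buf);                /* 2012-12-28T17:12:33Z */
        return 0;
    }

The second step, converting onward to the target time zone, is then a lookup against a time zone database, as noted above.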
International Standard ISO 8601 specifies numeric representations of date and time.
YYYY-MM-DDThh:mm:ss.sTZD (eg 1997-07-16T19:20:30.45+01:00)
where:
YYYY = four-digit year
MM = two-digit month (01=January, etc.)
DD = two-digit day of month (01 through 31)
hh = two digits of hour (00 through 23) (am/pm NOT allowed)
mm = two digits of minute (00 through 59)
ss = two digits of second (00 through 59)
s = one or more digits representing a decimal fraction of a second
TZD = time zone designator (Z or +hh:mm or -hh:mm)
Times are expressed in UTC (Coordinated Universal Time), with a special UTC designator ("Z").
Times are expressed in local time, together with a time zone offset in hours and minutes. A time zone offset of "+hh:mm" indicates that the date/time uses a local time zone which is "hh" hours and "mm" minutes ahead of UTC. A time zone offset of "-hh:mm" indicates that the date/time uses a local time zone which is "hh" hours and "mm" minutes behind UTC.
In your case, 2012-12-28T18:12:33+01:00 = 2012-12-28 18:12:33 is true, meaning the time in Holland is 18:12:33 and you are 1 hour ahead of UTC.

Does Time.to_i always return number of seconds since EPOCH in UTC?

Is the timezone difference always ignored, regardless of which zone the time is expressed in?
Intuitively, the number of seconds passed since the epoch should be higher for those who are, for example, in UTC+2. However, this seems not to be the case.
Epoch is based on the UTC timezone (https://en.wikipedia.org/wiki/Unix_time); it does not depend on the timezone you're currently in.
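To see why the intuition fails, here is a sketch under POSIX assumptions (the TZ strings below are illustrative): the same instant always yields the same epoch value, but the same wall-clock reading in different zones denotes different instants.

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    static time_t epoch_of_local_noon(const char *tz)
    {
        struct tm tm = {0};
        tm.tm_year = 2021 - 1900;   /* an arbitrary example date */
        tm.tm_mon  = 0;             /* January */
        tm.tm_mday = 1;
        tm.tm_hour = 12;            /* 12:00:00 on the local wall clock */
        tm.tm_isdst = -1;           /* let the library decide DST */
        setenv("TZ", tz, 1);
        tzset();
        return mktime(&tm);         /* interprets the fields in the current TZ */
    }

    int main(void)
    {
        /* POSIX sign convention: "UTC-2" names a zone 2 hours AHEAD of UTC. */
        printf("noon, UTC:   %lld\n", (long long)epoch_of_local_noon("UTC0"));
        printf("noon, UTC+2: %lld\n", (long long)epoch_of_local_noon("UTC-2"));
        /* The second value is 7200 seconds smaller, not larger: local noon
           strikes earlier, in absolute terms, east of Greenwich. */
        return 0;
    }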

JavaScript date constructor and timezone

The Date constructor in JavaScript/ECMAScript/JScript allows passing the number of milliseconds since midnight, 1/1/1970. Nowhere can I find documentation on whether this is midnight in the client machine's timezone or midnight GMT. Which is it? Can it be relied on between different browsers and versions? Is this officially documented anywhere?
From the ECMAScript specification:
Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC. In time values leap seconds are ignored. It is assumed that there are exactly 86,400,000 milliseconds per day. ECMAScript Number values can represent all integers from –9,007,199,254,740,991 to 9,007,199,254,740,991; this range suffices to measure times to millisecond precision for any instant that is within approximately 285,616 years, either forward or backward, from 01 January, 1970 UTC.

The actual range of times supported by ECMAScript Date objects is slightly smaller: exactly –100,000,000 days to 100,000,000 days measured relative to midnight at the beginning of 01 January, 1970 UTC. This gives a range of 8,640,000,000,000,000 milliseconds to either side of 01 January, 1970 UTC.

The exact moment of midnight at the beginning of 01 January, 1970 UTC is represented by the value +0.
So to answer your question, it's Coordinated Universal Time.
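The same definition is easy to check outside JavaScript; this C sketch renders time value 0, the instant that new Date(0) denotes in JS:

    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        time_t zero = 0;            /* the time value +0 from the spec */
        char buf[64];

        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&zero));
        printf("%s UTC\n", buf);    /* 1970-01-01 00:00:00 UTC */

        /* Rendered in the machine's local zone, the same instant may read
           differently: the value is anchored to UTC, not to local time. */
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S %Z", localtime(&zero));
        printf("%s\n", buf);
        return 0;
    }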
