Get time_t from microseconds in the past - c++11

Working on a C++11 function that returns a string from an epoch timestamp with millisecond resolution. Doing this with the current date seems straightforward:
auto currentTime = std::chrono::system_clock::now( );
const time_t time = std::chrono::system_clock::to_time_t( currentTime );
However, I'm having a hard time figuring out how to initialize the time_point without now(), using a timestamp from the past instead. I'm trying to do this with the standard library, but I can't quite see how to construct the time_point from a past timestamp.

How about using the std::chrono::duration class? Below is an example.
const unsigned long secondsSinceEpoch = 1521196319; // Mar 16 10:31:59 2018 UTC
std::chrono::seconds duration(secondsSinceEpoch);
std::chrono::system_clock::time_point pastTime(duration);
Adjust secondsSinceEpoch to the value you want, or you can even calculate it from std::chrono::system_clock::now().
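For the asker's millisecond-resolution case, here is a minimal self-contained sketch (the 64-bit value is just a hypothetical past timestamp) that builds the time_point from a millisecond count and converts it back to time_t:
#include <chrono>
#include <ctime>
#include <iostream>

int main( )
{
    // Hypothetical past timestamp with millisecond resolution.
    const long long millisecondsSinceEpoch = 1521196319000LL; // Mar 16 10:31:59 2018 UTC

    const std::chrono::milliseconds sinceEpoch( millisecondsSinceEpoch );
    const std::chrono::system_clock::time_point pastTime(
        std::chrono::duration_cast<std::chrono::system_clock::duration>( sinceEpoch ) );

    // Same conversion as with now(), just starting from a past time_point.
    const time_t time = std::chrono::system_clock::to_time_t( pastTime );
    std::cout << std::ctime( &time );
}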

Related

How to convert a hex TimeDateStamp DWORD value into human readable format?

Can anyone explain how to convert a hex TimeDateStamp DWORD value into human-readable format?
I'm just curious how a value such as 0x62444DB4 is converted into
"Wednesday, 30 March 2022 10:31:48 PM".
I tried googling, of course, and could not find any explanation, though there are online converters available.
I'm just interested in converting these values myself.
Your datetime value is a 32-bit Unix timestamp: the number of seconds since 1 January 1970.
See https://unixtime.org/
In most programming languages you can work with the hexadecimal notation directly.
Don't implement the conversion yourself, though, since a lot of engineering goes into it: leap years, even leap seconds, time zones, daylight saving time, UTC... all of these need to be addressed when working with timestamps. Definitely use an existing package or library.
I have added my rough calculation below as a demonstration; see the JavaScript code.
There I multiply your value by 1000 because JavaScript works in milliseconds, but otherwise this applies the same to other systems.
let timestamp = 0x62444DB4;
let dateTime = new Date(timestamp * 1000);
console.log('Timestamp in seconds:', timestamp);
console.log('Human-Readable:', dateTime.toDateString() + ' ' + dateTime.toTimeString());
// Rough output, just for the time.
// Year month and day get really messy with timezones, leap years, etc.
let hours = Math.floor(timestamp/3600) % 24;
let minutes = Math.floor(timestamp/60) % 60;
let seconds = Math.floor(timestamp) % 60;
console.log('Using our own time calculation:', hours + ':' + minutes + ':' + seconds);
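For comparison, here is a minimal C++ sketch of the same conversion that lets the C library do the calendar math. Note that it prints the time in UTC, whereas the string quoted in the question is in a local time zone:
#include <ctime>
#include <cstdio>

int main()
{
    // The 32-bit Unix timestamp from the question, written in hex.
    const time_t timestamp = 0x62444DB4; // seconds since 1970-01-01 00:00:00 UTC

    // gmtime/strftime handle leap years, month lengths, etc. for us.
    char buffer[64];
    std::strftime(buffer, sizeof buffer, "%A, %d %B %Y %I:%M:%S %p",
                  std::gmtime(&timestamp));
    std::printf("%s\n", buffer); // "Wednesday, 30 March 2022 12:31:48 PM" (UTC)
}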

mktime shifts a time by one hour

I've run into an interesting problem with the mktime function. I use the Russian time zone (UTC+03:00) Volgograd, Moscow, Saint Petersburg (RTZ 2) and am trying to construct a time_t for "7.01.2009 00:00:00":
tm localTM;
localTM.tm_sec = 0;
localTM.tm_min = 0;
localTM.tm_hour = 0;
localTM.tm_mday = 7;
localTM.tm_mon = 0;
localTM.tm_year = 109;
time_t t = mktime(&localTM);
After mktime executes, the date and time are changed to "6.01.2009 23:00:00".
I have no problems when I construct the time for "06.01.2009 00:00:00" or "08.01.2009 00:00:00".
If I switch to another time zone, I have no problem with "7.01.2009 00:00:00" either.
What could be the reason for this oddity, and how can I work around the issue?
When performing the conversion to time_t, mktime needs to guess whether the input is DST (Daylight Saving Time) or not.
For that, the tm.tm_isdst field is used. From man mktime:
tm_isdst  A flag that indicates whether daylight saving time is in effect at the time described. The value is positive if daylight saving time is in effect, zero if it is not, and negative if the information is not available.
Since you do not initialize tm_isdst in your code, its value is indeterminate (in your case it evidently behaves as 0), making mktime think the time falls in a no-DST period.
To fix it in your code, simply add:
localTM.tm_isdst = -1;
Note: this logic is necessary because, for some moments in time, the "wall clock" information stored in the tm alone is not sufficient to determine the exact time.
And yes, the fact that you get this behavior by default is a bit messed up :)
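Putting it together, a corrected version of the snippet from the question, zero-initializing the struct and letting mktime determine DST:
#include <ctime>
#include <cstdio>

int main()
{
    std::tm localTM = {};     // zero-initialize all fields first
    localTM.tm_sec   = 0;
    localTM.tm_min   = 0;
    localTM.tm_hour  = 0;
    localTM.tm_mday  = 7;
    localTM.tm_mon   = 0;     // January (tm_mon is 0-based)
    localTM.tm_year  = 109;   // years since 1900, i.e. 2009
    localTM.tm_isdst = -1;    // let mktime decide whether DST is in effect

    const time_t t = std::mktime(&localTM);
    // mktime normalizes localTM in place, so we can print it directly.
    std::printf("time_t = %lld, local time: %s", (long long)t, std::asctime(&localTM));
}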

C++11 chrono create time_point from number

I'm converting a std::chrono::time_point<std::chrono::high_resolution_clock> timestamp to a 64-bit timestamp with millisecond precision using
std::chrono::duration_cast<std::chrono::milliseconds>(
    getTimestamp().time_since_epoch()
).count()
This is needed for serializing the data. Later on I need to convert those timestamps back to a std::chrono::time_point<std::chrono::high_resolution_clock> for further processing. What is the proper way to do this in C++11?
Convert the number of milliseconds to a duration and add it to an epoch time_point:
auto epoch = std::chrono::time_point<std::chrono::high_resolution_clock>();
auto since_epoch = std::chrono::milliseconds(deserialised);
auto timestamp = epoch + since_epoch;
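A minimal round-trip sketch of the same idea (here deserialised stands in for whatever your serialization layer produced):
#include <chrono>
#include <cstdint>
#include <iostream>

int main()
{
    using Clock = std::chrono::high_resolution_clock;

    // Serialize: 64-bit millisecond count since the clock's epoch.
    const std::int64_t deserialised =
        std::chrono::duration_cast<std::chrono::milliseconds>(
            Clock::now().time_since_epoch()).count();

    // Deserialize: epoch time_point plus the stored duration.
    const auto epoch = std::chrono::time_point<Clock>();
    const auto since_epoch = std::chrono::milliseconds(deserialised);
    const auto timestamp = epoch + since_epoch;

    // timestamp now matches the original to millisecond precision.
    std::cout << std::chrono::duration_cast<std::chrono::milliseconds>(
        timestamp.time_since_epoch()).count() << '\n';
}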

GetDateFormat() fails on dates before 1/1/1601

I am trying to format a date using the Windows GetDateFormat API function:
nResult = GetDateFormat(
    localeId,   // 0x409 for en-US, or LOCALE_USER_DEFAULT if you're not testing
    0,          // flags
    &dt,        // a SYSTEMTIME structure
    "M/d/yyyy", // the format we require
    NULL,       // the output buffer to receive the string (NULL for now while we get the length)
    0);         // the length of the output buffer (zero while we get the length)
Now we pass it a date/time:
SYSTEMTIME dt;
dt.wYear = 1600;
dt.wMonth = 12;
dt.wDay = 31;
In this case nResult returns zero:
The function returns 0 if it does not succeed. To get extended error information, the application can call GetLastError, which can return one of the following error codes:
ERROR_INSUFFICIENT_BUFFER. A supplied buffer size was not large enough, or it was incorrectly set to NULL.
ERROR_INVALID_FLAGS. The values supplied for flags were not valid.
ERROR_INVALID_PARAMETER. Any of the parameter values was invalid.
If, however, I pass a date one day later:
SYSTEMTIME dt;
dt.wYear = 1601;
dt.wMonth = 1;
dt.wDay = 1;
Then it works.
What am I doing wrong? How do I format dates?
e.g. the date of the birth of Christ:
12/25/0000
or the date when the universe started:
-10/22/4004 6:00 PM
or the date Caesar died:
-3/15/44
Bonus Reading
Sorting It All Out: GetDateFormat is Gregorian based
GetDateFormatEx function
This is actually a limitation of SYSTEMTIME:
...year/month/day/hour/minute/second/milliseconds value since 1 January 1601 00:00:00 UT... to 31 December 30827 23:59:59.999
I spent some time looking for a way around this limitation, but since GetDateFormat() takes a SYSTEMTIME you'll probably have to bite the bullet and write your own format() method.
The SYSTEMTIME struct is valid only from the year 1601 through 30827, because on Windows machines the system time is counted as intervals elapsed since 1.1.1601 00:00. See the
Wikipedia article.
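A minimal demonstration of the boundary (using the ANSI variant of the function so it compiles without TCHAR setup; the error codes are those quoted from the documentation above):
#include <windows.h>
#include <stdio.h>

int main(void)
{
    SYSTEMTIME dt = {0};
    dt.wYear = 1600; dt.wMonth = 12; dt.wDay = 31;

    // Ask for the required buffer length first (NULL buffer, zero size).
    int nResult = GetDateFormatA(0x0409 /* en-US */, 0, &dt, "M/d/yyyy", NULL, 0);
    if (nResult == 0)
        printf("1600-12-31 fails, GetLastError() = %lu\n", GetLastError());

    dt.wYear = 1601; dt.wMonth = 1; dt.wDay = 1;
    char buffer[32];
    nResult = GetDateFormatA(0x0409, 0, &dt, "M/d/yyyy", buffer, sizeof buffer);
    if (nResult != 0)
        printf("1601-01-01 works: %s\n", buffer); // prints "1/1/1601"

    return 0;
}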

Date and time type for use with Protobuf

I'm considering using Protocol Buffers for data exchange between a Linux-based and a Windows-based system.
What's the recommended format for sending date/time (timestamp) values? The field should be small when serialized.
There has been a Timestamp message type since protobuf 3.0; here's how to use it in a model:
syntax = "proto3";
import "google/protobuf/timestamp.proto";
message MyMessage {
google.protobuf.Timestamp my_field = 1;
}
The timestamp.proto file contains examples of using Timestamp, including ones relevant to Linux and Windows programs.
Example 1: Compute Timestamp from POSIX time().
Timestamp timestamp;
timestamp.set_seconds(time(NULL));
timestamp.set_nanos(0);
Example 2: Compute Timestamp from POSIX gettimeofday().
struct timeval tv;
gettimeofday(&tv, NULL);
Timestamp timestamp;
timestamp.set_seconds(tv.tv_sec);
timestamp.set_nanos(tv.tv_usec * 1000);
Example 3: Compute Timestamp from Win32 GetSystemTimeAsFileTime().
FILETIME ft;
GetSystemTimeAsFileTime(&ft);
UINT64 ticks = (((UINT64)ft.dwHighDateTime) << 32) | ft.dwLowDateTime;
// A Windows tick is 100 nanoseconds. Windows epoch 1601-01-01T00:00:00Z
// is 11644473600 seconds before Unix epoch 1970-01-01T00:00:00Z.
Timestamp timestamp;
timestamp.set_seconds((INT64) ((ticks / 10000000) - 11644473600LL));
timestamp.set_nanos((INT32) ((ticks % 10000000) * 100));
Although you don't say which languages you are using or what kind of precision you need, I would suggest using Unix time encoded as an int64. It is fairly easy to handle in most languages and on most platforms (see here for a Windows example), and protobuf's varint encoding keeps the size small without limiting the representable range too much.
In the latest protobuf version (3.0), Timestamp is available for C# as a WellKnownType. Check this.
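To tie this back to C++, here is a small sketch converting a std::chrono::system_clock::time_point into a Timestamp, using only the set_seconds/set_nanos setters shown in the examples above. It assumes system_clock counts from the Unix epoch, which every major implementation does even though C++11 doesn't guarantee it:
#include <chrono>
#include <google/protobuf/timestamp.pb.h>

google::protobuf::Timestamp toProto(std::chrono::system_clock::time_point tp)
{
    using namespace std::chrono;
    const auto sinceEpoch = tp.time_since_epoch();
    const auto secs = duration_cast<seconds>(sinceEpoch);
    const auto nanos = duration_cast<nanoseconds>(sinceEpoch - secs);

    google::protobuf::Timestamp ts;
    ts.set_seconds(secs.count());                   // whole seconds since the Unix epoch
    ts.set_nanos(static_cast<int>(nanos.count()));  // remaining sub-second part
    return ts;
}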
