Why are time_points created from different durations (std::chrono::milliseconds and std::chrono::nanoseconds) so different? - C++11

I created std::chrono::milliseconds ms and std::chrono::nanoseconds ns
from std::chrono::system_clock::now().time_since_epoch(). From those durations I created time_points, converted them to time_t using system_clock::to_time_t, and printed them using the ctime function. But the times printed are not the same. As I understand it, a time_point has a duration, and a duration has a rep and a period (ratio). So both time_points should have the same value up to millisecond precision. Why is the output different?
Here is my code:
#include <ctime>
#include <ratio>
#include <chrono>
#include <iostream>

using namespace std::chrono;

int main()
{
    std::chrono::milliseconds ms = std::chrono::duration_cast<std::chrono::milliseconds>(std::chrono::system_clock::now().time_since_epoch());
    std::chrono::nanoseconds ns = std::chrono::duration_cast<std::chrono::nanoseconds>(std::chrono::system_clock::now().time_since_epoch());
    std::chrono::duration<unsigned int, std::ratio<1, 1000>> today_day(ms.count());
    std::chrono::duration<system_clock::duration::rep, system_clock::duration::period> same_day(ns.count());
    system_clock::time_point abc(today_day);
    system_clock::time_point abc1(same_day);
    std::time_t tt;
    tt = system_clock::to_time_t(abc);
    std::cout << "today is: " << ctime(&tt);
    tt = system_clock::to_time_t(abc1);
    std::cout << "today is: " << ctime(&tt);
    return 0;
}

This line:
std::chrono::duration<unsigned int,std::ratio<1,1000>> today_day (ms.count());
is overflowing. The number of milliseconds since 1970 is on the order of 1.5 trillion. But unsigned int (on your platform) overflows at about 4 billion.
Also, depending on your platform, this line:
std::chrono::duration<system_clock::duration::rep,system_clock::duration::period> same_day(ns.count());
may introduce a conversion error. If you are using gcc, system_clock::duration is nanoseconds, and there will be no error.
However, if you're using llvm's libc++, system_clock::duration is microseconds and you will be silently multiplying your duration by 1000.
And if you are using Visual Studio, system_clock::duration is 100 nanoseconds and you will be silently multiplying your duration by 100.
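One way to check which of these cases applies on your platform is to print system_clock::period directly. A minimal sketch:

#include <chrono>
#include <iostream>

int main()
{
    using period = std::chrono::system_clock::period;
    // Tick period of system_clock as a fraction of a second:
    // 1/1000000000 on gcc, 1/1000000 on libc++, 1/10000000 on Visual Studio.
    std::cout << period::num << '/' << period::den << " seconds per tick\n";
}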
Here is a video tutorial for <chrono> which may help, and contains warnings about the use of .count() and .time_since_epoch().

The conversions you do manually do not look right.
You should use duration_cast for conversions because they are type-safe:
auto today_day = duration_cast<duration<unsigned, std::ratio<86400>>>(ms);
auto same_day = duration_cast<system_clock::duration>(ns);
Outputs:
today is: Thu Jul 26 01:00:00 2018
today is: Thu Jul 26 13:01:08 2018
(The first line shows midnight because today_day was truncated to whole days; the 01:00:00 is presumably a local UTC+1 offset applied by ctime.)

Because you throw away the duration information by calling .count(), and then reinterpret the bare integer as a different duration type:
std::chrono::duration<unsigned int,std::ratio<1,1000>> today_day (ms.count());
milliseconds -> dimensionless -> 1 / 1000 seconds (i.e. milliseconds)
std::chrono::duration<system_clock::duration::rep,system_clock::duration::period> same_day(ns.count());
nanoseconds -> dimensionless -> system_clock ticks
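To see the silent rescaling concretely, here is a minimal sketch; the factor depends on your platform's system_clock tick, as described in the other answer (on gcc, where the tick is already 1 ns, both lines print the same value):

#include <chrono>
#include <iostream>

int main()
{
    using namespace std::chrono;
    nanoseconds ns{5000};
    // Reinterprets the bare count 5000 as native ticks: on libc++
    // (microsecond ticks) this silently turns 5000 ns into 5000 us.
    system_clock::duration wrong{ns.count()};
    // Converts the quantity itself: 5000 ns -> 5 us on libc++.
    system_clock::duration right = duration_cast<system_clock::duration>(ns);
    std::cout << wrong.count() << " vs " << right.count() << '\n';
}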
You should instead just use duration_cast again:
#include <ctime>
#include <ratio>
#include <chrono>
#include <iostream>

using namespace std::chrono;

int main()
{
    milliseconds ms = duration_cast<milliseconds>(system_clock::now().time_since_epoch());
    nanoseconds ns = duration_cast<nanoseconds>(system_clock::now().time_since_epoch());
    system_clock::time_point abc(duration_cast<system_clock::duration>(ms));
    system_clock::time_point abc1(duration_cast<system_clock::duration>(ns));
    std::time_t tt;
    tt = system_clock::to_time_t(abc);
    std::cout << "today is: " << ctime(&tt);
    tt = system_clock::to_time_t(abc1);
    std::cout << "today is: " << ctime(&tt);
    return 0;
}

Related

Storing a time_point outside of the application

I am using a std::chrono::system_clock::time_point in my program.
When the application stops I want to save the time_point to a file and load it again when the application starts.
If it were a UNIX timestamp, I could simply store the value as an integer. Is there a way to store a time_point similarly?
Yes. Choose the precision you desire the timestamp in (seconds, milliseconds, ... nanoseconds). Then cast the system_clock::time_point to that precision, extract its numeric value, and print it:
cout << time_point_cast<seconds>(system_clock::now()).time_since_epoch().count();
Though not specified by the standard, the above line (de facto) portably outputs the number of non-leap seconds since 1970-01-01 00:00:00 UTC. That is, this is a UNIX-Timestamp.
I am attempting to get the above code blessed by the standard to do what it in fact does by all implementations today. And I have the unofficial assurance of the std::chrono implementors, that they will not change their system_clock epochs in the meantime.
Here's a complete roundtrip example:
#include <chrono>
#include <cstdint>
#include <iostream>
#include <sstream>

int main()
{
    using namespace std;
    using namespace std::chrono;
    stringstream io;
    io << time_point_cast<seconds>(system_clock::now()).time_since_epoch().count();
    int64_t i;
    system_clock::time_point tp;
    io >> i;
    if (!io.fail())
        tp = system_clock::time_point{seconds{i}};
}
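The same round trip works at any precision. For example, a sketch at millisecond precision, assuming the same de facto epoch as above:

#include <chrono>
#include <cstdint>
#include <sstream>

int main()
{
    using namespace std;
    using namespace std::chrono;
    stringstream io;
    io << time_point_cast<milliseconds>(system_clock::now()).time_since_epoch().count();
    int64_t i;
    io >> i;
    // milliseconds converts implicitly (and losslessly) to
    // system_clock::duration on all known implementations.
    system_clock::time_point tp{milliseconds{i}};
}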

std::get_time fails at midnight or noon using %I format specifier

When using std::get_time, I get an exception when parsing a datetime at midnight or noon. I am using the %I format specifier instead of %H because the hour should be between 1-12.
Exception: "ios_base::failbit set: iostream stream error"
I am using Microsoft Visual Studio 2013.
Why am I getting an exception when parsing this datetime? Is there a different format mask I can use?
#include <iostream>
#include <iomanip>
#include <ctime>
#include <cstring>
#include <sstream>

int main()
{
    std::time_t time;
    std::string timeString = "1/5/15 12:00 AM";
    std::string formatMask = "%m/%d/%y %I:%M %p";
    std::tm tm;
    std::memset(&tm, 0, sizeof(std::tm));
    std::istringstream ss(timeString);
    ss.exceptions(std::ios::failbit | std::ios::badbit);
    try
    {
        ss >> std::get_time(&tm, formatMask.c_str());
        time = std::mktime(&tm);
    }
    catch (const std::exception& ex)
    {
        std::cout << ex.what() << std::endl;
    }
    std::cout << time << std::endl;
    return 0;
}
ios_base::failbit generally means some logical error occurred while processing the istringstream. The range of the %I format appears to be specified inconsistently. The POSIX specification gives the range as [01,12] (thanks to @Cubbi for checking), and so does the MSVS 2013 documentation. However, the MSDN documentation for time_get::do_get specifies the range [00,11], and that seems to be what std::get_time actually uses in VS2013 (@Cubbi found this as well). So with the Visual Studio 2013 compiler the code works for hours 00:00 through 11:59, and an hour field of 12 sets failbit.
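A possible workaround sketch (the field-by-field parsing is my own, not from the original code): parse the date portion with std::get_time, but read the hour, minute, and AM/PM fields by hand, so the result never depends on the implementation's %I range:

#include <iostream>
#include <iomanip>
#include <sstream>
#include <ctime>
#include <cstring>
#include <string>

int main()
{
    std::string timeString = "1/5/15 12:00 AM";
    std::tm tm;
    std::memset(&tm, 0, sizeof(std::tm));
    std::istringstream ss(timeString);
    int hour = 0;
    std::string ampm;
    ss >> std::get_time(&tm, "%m/%d/%y") >> hour;
    ss.ignore(1);                // skip the ':'
    ss >> tm.tm_min >> ampm;
    tm.tm_hour = hour % 12 + (ampm == "PM" ? 12 : 0);  // 12 AM -> 0, 12 PM -> 12
    std::time_t time = std::mktime(&tm);
    std::cout << time << std::endl;
    return 0;
}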

chrono C++11 version of Matlab's datenum

Is there a C++11 version of Matlab's datenum function in #include <chrono>?
I already know it exists in Boost, thanks to this post.
No, there is not. However here is a "how-to" manual for how to write the algorithms for your chrono-compatible date library. Using these algorithms, I can easily, for example, do this:
#include "../date_performance/date_algorithms"
#include <ratio>
#include <chrono>
#include <iostream>
typedef std::chrono::duration
<
int,
std::ratio_multiply<std::ratio<24>, std::chrono::hours::period>
> days;
typedef std::chrono::time_point<std::chrono::system_clock, days> date_point;
int
main()
{
using namespace std::chrono;
date_point datenum{days{days_from_civil(2014, 2, 5)}};
auto d = system_clock::now() - datenum;
auto h = duration_cast<hours>(d);
d -= h;
auto m = duration_cast<minutes>(d);
std::cout << "The current UTC time is " << h.count() << ':' << m.count() << '\n';
date_point datenum2{days{days_from_civil(2014, 3, 5)}};
std::cout << "There are " << (datenum2-datenum).count() << " days between 2014-03-05 and 2014-02-05\n";
}
Which for me outputs:
The current UTC time is 22:12
There are 28 days between 2014-03-05 and 2014-02-05
The first thing you need to do is create a chrono::duration to represent a day, named days above. Then it is handy to create a chrono::time_point based on the days duration. This time_point is compatible with every known implementation of system_clock::time_point; i.e. you can subtract them.
I then subtract the current date (datenum, a time_point at midnight UTC) from now() to get the hours and minutes of the day in the UTC timezone. I also demonstrate how to compute the number of days between any two dates.
Feel free to use these algorithms, and perhaps wrap all of this up in a type-safe date class. The link only provides the algorithms, and not an actual date class.
I might post a date class based on these algorithms in the future if I get the time...
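For reference, here is a sketch of the days_from_civil algorithm used above, based on the well-known chrono-compatible date algorithms (written as a plain function here, since a multi-statement constexpr requires C++14):

// Returns the number of days since civil 1970-01-01 in the proleptic
// Gregorian calendar. Negative values mean days before 1970-01-01.
int days_from_civil(int y, unsigned m, unsigned d)
{
    y -= m <= 2;                                   // Jan/Feb count as months 13/14 of the prior year
    const int era = (y >= 0 ? y : y - 399) / 400;  // 400-year Gregorian cycle
    const unsigned yoe = static_cast<unsigned>(y - era * 400);            // [0, 399]
    const unsigned doy = (153 * (m + (m > 2 ? -3 : 9)) + 2) / 5 + d - 1;  // [0, 365]
    const unsigned doe = yoe * 365 + yoe / 4 - yoe / 100 + doy;           // [0, 146096]
    return era * 146097 + static_cast<int>(doe) - 719468;                 // shift epoch to 1970-01-01
}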
I don't know what Matlab's datenum is, but here is how to do basically the same as the accepted answer of the question you link to, i.e. arithmetic with time points and durations, in C++11 without Boost:
#include <chrono>
#include <ctime>
#include <iostream>

using namespace std;
using namespace std::chrono;

int main() {
    duration<long> one_day{ hours(24) };
    system_clock::time_point now = system_clock::now();
    system_clock::time_point tomorrow = now + one_day;
    time_t t = system_clock::to_time_t(tomorrow);
    cout << "Tomorrow: " << ctime(&t) << '\n';
}
Hope this helps.

std::chrono, adding duration to time_point

I am trying to add some duration to a time_point in Qt (C++11/MinGW) and I am having trouble:
Initialization (when the program starts):
auto program_start_time = std::chrono::system_clock::now();
auto offline_mark_time = std::chrono::system_clock::now();
...
Some activity goes offline:
offline_mark_time = std::chrono::system_clock::now();
...
When the activity resumes, I need to add the offline time to my start time:
auto now = std::chrono::system_clock::now();
program_start_time += (now - offline_mark_time); // <- Does not seem to work
Even though it compiles and runs fine, the program behaves as if I am adding zero.
How do you add or subtract a duration to a time_point?
This complete program, based on the snippets of code in your question:
#include <iostream>
#include <thread>
#include <chrono>

int main()
{
    auto program_start_time = std::chrono::system_clock::now();
    auto copy_of_program_start_time = program_start_time;
    auto offline_mark_time = std::chrono::system_clock::now();
    std::this_thread::sleep_for(std::chrono::microseconds(100));
    auto now = std::chrono::system_clock::now();
    program_start_time += (now - offline_mark_time);
    std::cout << (program_start_time > copy_of_program_start_time) << '\n';
}
for me prints out:
1
If the time duration between the construction of offline_mark_time and now is less than the precision of system_clock::duration (1 microsecond for me), then now and offline_mark_time will likely be equal, and thus 0 would be added to program_start_time in that case.
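A quick way to check whether that is what you are seeing (a diagnostic sketch, condensed to two consecutive now() calls):

#include <chrono>
#include <iostream>

int main()
{
    auto offline_mark_time = std::chrono::system_clock::now();
    auto now = std::chrono::system_clock::now();
    auto delta = now - offline_mark_time;
    // If this prints 0, both now() calls landed on the same clock tick,
    // and adding delta to a time_point changes nothing.
    std::cout << std::chrono::duration_cast<std::chrono::nanoseconds>(delta).count()
              << " ns\n";
}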

Equivalent of gettimeofday() for Windows

Does anyone know of an equivalent of the gettimeofday() function for the Windows environment? I am comparing code execution time on Linux vs. Windows. I am using MS Visual Studio 2010 and it keeps saying: identifier "gettimeofday" is undefined.
Here is a free implementation:
#define WIN32_LEAN_AND_MEAN
#include <Windows.h>
#include <stdint.h> // portable: uint64_t   MSVC: __int64

// MSVC defines this in winsock2.h!?
typedef struct timeval {
    long tv_sec;
    long tv_usec;
} timeval;

int gettimeofday(struct timeval * tp, struct timezone * tzp)
{
    // Note: some broken versions only have 8 trailing zeros, the correct epoch has 9 trailing zeros
    // This magic number is the number of 100 nanosecond intervals since January 1, 1601 (UTC)
    // until 00:00:00 January 1, 1970
    static const uint64_t EPOCH = ((uint64_t) 116444736000000000ULL);

    SYSTEMTIME system_time;
    FILETIME file_time;
    uint64_t time;

    GetSystemTime(&system_time);
    SystemTimeToFileTime(&system_time, &file_time);
    time = ((uint64_t)file_time.dwLowDateTime);
    time += ((uint64_t)file_time.dwHighDateTime) << 32;

    tp->tv_sec = (long)((time - EPOCH) / 10000000L);
    tp->tv_usec = (long)(system_time.wMilliseconds * 1000);
    return 0;
}
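A quick usage check, compiled together with the implementation above (the printf format assumes the long fields declared there):

#include <cstdio>

int main()
{
    timeval tv;
    gettimeofday(&tv, nullptr);
    std::printf("%ld.%06ld\n", tv.tv_sec, tv.tv_usec);
    return 0;
}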
GetLocalTime() for the time in the system timezone, GetSystemTime() for UTC. Those return the date/time in a SYSTEMTIME structure, where it's parsed into year, month, etc. If you want a seconds-since-epoch time, use SystemTimeToFileTime() or GetSystemTimeAsFileTime(). The FILETIME is a 64-bit value with the number of 100ns intervals since Jan 1, 1601 UTC.
For interval taking, use GetTickCount(). It returns milliseconds since startup.
For taking intervals with the best possible resolution (limited by hardware only), use QueryPerformanceCounter().
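For example, a minimal interval-timing sketch with QueryPerformanceCounter():

#include <Windows.h>
#include <iostream>

int main()
{
    LARGE_INTEGER freq, start, stop;
    QueryPerformanceFrequency(&freq);   // ticks per second, fixed at boot
    QueryPerformanceCounter(&start);
    // ... code to be timed ...
    QueryPerformanceCounter(&stop);
    double seconds = double(stop.QuadPart - start.QuadPart) / double(freq.QuadPart);
    std::cout << seconds << " s\n";
}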
This is a C++11 version that uses chrono. Thank you, Howard Hinnant, for the advice.
#if defined(_WIN32)
#include <chrono>
int gettimeofday(struct timeval* tp, struct timezone* tzp) {
  namespace sc = std::chrono;
  sc::system_clock::duration d = sc::system_clock::now().time_since_epoch();
  sc::seconds s = sc::duration_cast<sc::seconds>(d);
  tp->tv_sec = s.count();
  tp->tv_usec = sc::duration_cast<sc::microseconds>(d - s).count();
  return 0;
}
#endif // _WIN32
If you really want a Windows gettimeofday() implementation, here is one from PostgreSQL that uses Windows APIs and the proper conversions.
However if you want to time code, I suggest you look into QueryPerformanceCounter() or by directly invoking the TSC if you're only going to run on x86 for example.
Nowadays I would use the following for gettimeofday() on Windows, which uses GetSystemTimePreciseAsFileTime() if compiled for Windows 8 or higher and GetSystemTimeAsFileTime() otherwise:
#include <Windows.h>

struct timezone {
    int tz_minuteswest;
    int tz_dsttime;
};

int gettimeofday(struct timeval *tv, struct timezone *tz)
{
    if (tv) {
        FILETIME filetime; /* 64-bit value representing the number of 100-nanosecond intervals since January 1, 1601 00:00 UTC */
        ULARGE_INTEGER x;
        ULONGLONG usec;
        static const ULONGLONG epoch_offset_us = 11644473600000000ULL; /* microseconds between Jan 1, 1601 and Jan 1, 1970 */

#if _WIN32_WINNT >= _WIN32_WINNT_WIN8
        GetSystemTimePreciseAsFileTime(&filetime);
#else
        GetSystemTimeAsFileTime(&filetime);
#endif
        x.LowPart = filetime.dwLowDateTime;
        x.HighPart = filetime.dwHighDateTime;
        usec = x.QuadPart / 10 - epoch_offset_us;
        tv->tv_sec = (time_t)(usec / 1000000ULL);
        tv->tv_usec = (long)(usec % 1000000ULL);
    }
    if (tz) {
        TIME_ZONE_INFORMATION timezone;
        GetTimeZoneInformation(&timezone);
        tz->tz_minuteswest = timezone.Bias;
        tz->tz_dsttime = 0;
    }
    return 0;
}
Since Visual Studio 2015, timespec_get is available:
#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

static uint64_t time_ns(void)
{
    struct timespec ts;
    if (timespec_get(&ts, TIME_UTC) != TIME_UTC)
    {
        fputs("timespec_get failed!", stderr);
        return 0;
    }
    return 1000000000 * ts.tv_sec + ts.tv_nsec;
}

int main(void)
{
    printf("%" PRIu64 "\n", time_ns());
    return EXIT_SUCCESS;
}
Compile using cl t.c and run:
C:\> perl -E "system 't' for 1 .. 10"
1626610781959535900
1626610781973206600
1626610781986049300
1626610781999977000
1626610782014814800
1626610782028317500
1626610782040880700
1626610782054217800
1626610782068346700
1626610782081375500
