I can't seem to get a reliable timestamp using winapi functions. For example:
int main(int argc, char *argv[])
{
    HANDLE file;
    BY_HANDLE_FILE_INFORMATION finfo;
    SYSTEMTIME systime;

    file = CreateFile("test.txt", GENERIC_READ, FILE_SHARE_READ, NULL,
                      OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    GetFileInformationByHandle(file, &finfo);
    FileTimeToSystemTime(&finfo.ftLastWriteTime, &systime);
    printf(" %s %02d:%02d:%02d %d/%d/%d\n", "test.txt",
           systime.wHour, systime.wMinute, systime.wSecond,
           systime.wDay, systime.wMonth, systime.wYear);
}
gives nonsense on all my files, like:
test.txt 00:03:30 33/5/3
wDay seems to have values outside the range 1-31, and the times and dates are totally wrong. All the other values in the BY_HANDLE_FILE_INFORMATION, like name and size, are correct, and in my full code I check for errors from all the functions, but they all report success. Does anyone know what I am doing wrong?
I think the problem is your use of filesize.QuadPart. Try removing that from both the format string and the argument list, and see if it works.
If that helps, then look up the correct format string to use for a 64-bit quantity in printf().
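For reference, a minimal sketch of how a 64-bit size could be printed alongside the time fields. It reuses finfo and systime from the question, and assumes the full code keeps the size in a LARGE_INTEGER named filesize as mentioned above:

LARGE_INTEGER filesize;
filesize.LowPart  = finfo.nFileSizeLow;
filesize.HighPart = (LONG)finfo.nFileSizeHigh;

/* %lld matches the 64-bit argument (older MSVC runtimes may need %I64d instead),
   so the SYSTEMTIME fields that follow are read from the correct varargs slots. */
printf(" %s %lld %02d:%02d:%02d %d/%d/%d\n", "test.txt", filesize.QuadPart,
       systime.wHour, systime.wMinute, systime.wSecond,
       systime.wDay, systime.wMonth, systime.wYear);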
I've used the code from one of the answers on this forum, but it doesn't appear to be correct. I don't know if it's the code or the way I'm logging it.
unsigned long long int NQCTestInstance::getCurrentTimeInMs() {
    unsigned long milliseconds_since_epoch =
        std::chrono::duration_cast<std::chrono::milliseconds>
            (std::chrono::system_clock::now().time_since_epoch()).count();
    Log("Timestamp = %u\n\n", milliseconds_since_epoch);
    return milliseconds_since_epoch;
}
The Log output is 119682234, which is only nine characters when it should be ten. Is this as simple as the %u in the Log statement being incorrect?
@Igor Tandetnik and @John Zwinck were on the right track.
I changed everything to unsigned long long and then used %llu (not %lu) for the logging.
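For reference, a sketch of the function after that change (NQCTestInstance and Log are as in the question; <chrono> must be included):

unsigned long long int NQCTestInstance::getCurrentTimeInMs() {
    // Keep the full 64-bit value; a 32-bit unsigned long would truncate it.
    unsigned long long milliseconds_since_epoch =
        std::chrono::duration_cast<std::chrono::milliseconds>(
            std::chrono::system_clock::now().time_since_epoch()).count();

    Log("Timestamp = %llu\n\n", milliseconds_since_epoch);  // %llu matches unsigned long long
    return milliseconds_since_epoch;
}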
I still have some trouble understanding UNICODE and ANSI in the Win32 API.
For example, I have this code:
SYSTEMTIME LocalTime = { 0 };
GetSystemTime (&LocalTime);
SetDlgItemText(hWnd, 1003, LocalTime);
That generates the error in the title.
Also, I should mention that it automatically adds a W after "SetDlgItemText". Some macro in VS, probably.
Could someone clarify this for me?
In C or C++ you can't just take an arbitrary structure and pass it to a function that expects a string. You have to convert that structure to a string first.
The Win32 functions GetDateFormat() and GetTimeFormat() can be used to convert a SYSTEMTIME to a string (the first one does the "date" part and the second one does the "time" part) according to the current system locale rules. As for the W: when UNICODE is defined, SetDlgItemText expands to the wide-character SetDlgItemTextW, which takes a wchar_t string, which is why the buffer below is a wchar_t array.
For example,
SYSTEMTIME LocalTime = { 0 };
GetSystemTime(&LocalTime);

wchar_t wchBuf[80];
GetDateFormat(LOCALE_USER_DEFAULT, DATE_SHORTDATE, &LocalTime, NULL,
              wchBuf, sizeof(wchBuf) / sizeof(wchBuf[0]));
SetDlgItemText(hWnd, 1003, wchBuf);
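The time part can be appended in the same way. A rough sketch building on the wchBuf date string above, for a UNICODE build; the extra buffers and the use of swprintf_s are illustrative, not part of the original answer:

wchar_t wchTime[80], wchLine[170];
GetTimeFormat(LOCALE_USER_DEFAULT, 0, &LocalTime, NULL,
              wchTime, sizeof(wchTime) / sizeof(wchTime[0]));

// Combine "date time" into one string and put it in the dialog item.
swprintf_s(wchLine, L"%s %s", wchBuf, wchTime);
SetDlgItemText(hWnd, 1003, wchLine);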
I cannot write an integer to the LCD using these functions; it shows something weird on the screen.
I just added the function below. Please check it for me.
I added everything needed:
my_delay(1000);
LCDWriteStringXY(0,0,"Welcome..");
my_delay(1000);
LCDWriteStringXY(0,0,"Welcome...");
my_delay(1000);
LCDClear();
LCDWriteStringXY(4,0,"Testing");
LCDGotoXY(2,1);
int m=952520;
LCDWriteInt(m,6); // I cannot write it!
void LCDWriteInt(int val, unsigned int field_length)
{
    char str[5] = {0, 0, 0, 0, 0};
    int i = 4, j = 0;

    while (val)
    {
        str[i] = val % 10;
        val = val / 10;
        i--;
    }

    if (field_length == -1)
        while (str[j] == 0) j++;
    else
        j = 5 - field_length;

    if (val < 0) LCDData('-');

    for (i = j; i < 5; i++)
    {
        LCDData(48 + str[i]);
    }
}
I think the function is written for 16-bit integers, for which the maximum value would be 65535 (5 digits, the same as the length of str[]). You are giving it a 6-digit value, which first overruns the buffer (the digit loop decrements i past 0 and writes to str[-1]) and, with field_length == 6, also produces j = 5 - 6 = -1.
My suggestion is to either use smaller integers (16-bit only), or write another function like the one you showed us to do the same thing for larger values.
Lastly, I don't think the if (val < 0) LCDData('-') check can ever work, since the first while loop has already consumed 'val' (it is 0 by the time the check runs).
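As a rough illustration of the "write another function" suggestion, here is an untested sketch for larger values. It assumes LCDData() writes a single character to the display, as in the code above, and uses a hypothetical name LCDWriteLong:

void LCDWriteLong(long val)
{
    char str[10];                /* enough digits for a 32-bit long */
    int i = 0;

    if (val < 0)
    {
        LCDData('-');            /* print the sign before consuming val */
        val = -val;
    }
    if (val == 0)
    {
        LCDData('0');
        return;
    }
    while (val)                  /* collect digits, least significant first */
    {
        str[i++] = (char)('0' + (val % 10));
        val /= 10;
    }
    while (i--)                  /* emit them most significant first */
    {
        LCDData(str[i]);
    }
}

Called as LCDWriteLong(952520L) in place of LCDWriteInt(m, 6) above.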
Use the itoa() function. That will help you convert the integer to a string for display on the LCD. Best of luck!
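For example, a minimal sketch; itoa()/ltoa() are non-standard, but if this is an avr-gcc build (an assumption here) they are available in <stdlib.h>:

#include <stdlib.h>               /* itoa()/ltoa() in avr-libc; non-standard in ISO C */

char buf[12];                     /* room for a 32-bit value, sign, and terminator */
long m = 952520;                  /* a plain int is only 16 bits on AVR, so use long */
ltoa(m, buf, 10);                 /* convert to a decimal string */
LCDWriteStringXY(2, 1, buf);      /* reuse the string routine from the question */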
I am using random file access to write log records to a file. Later, I access the log by log number: using the log number I calculate the offset of the record and access it directly. The function SetFilePointerEx is used to set the current position in the file, and from there I can read the record directly.
The function expects a LARGE_INTEGER as a parameter. How do I use LARGE_INTEGER with the SetFilePointerEx function? The requirements note says that the program will target a 64-bit OS.
Assuming LARGE_INTEGER li;, just set li.QuadPart to the LONGLONG value you need for your file offset and use li for the offset argument in the call. Or did I miss something obvious?
LARGE_INTEGER li, lo={0};
li.QuadPart = yourOffsetValue;
SetFilePointerEx(hFile, li, &lo, FILE_BEGIN);
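Putting it together with the record-offset arithmetic from the question, a sketch where recordSize, logNumber, hFile, and LogRecord are hypothetical names for fixed-size records:

LARGE_INTEGER li, lo = {0};
li.QuadPart = (LONGLONG)logNumber * recordSize;   // 64-bit byte offset of the record

if (SetFilePointerEx(hFile, li, &lo, FILE_BEGIN))
{
    LogRecord rec;
    DWORD bytesRead = 0;
    if (ReadFile(hFile, &rec, sizeof(rec), &bytesRead, NULL) && bytesRead == sizeof(rec))
    {
        // use rec ...
    }
}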
I have a simple question:
Is there a way to do a strlen()-like count of characters in a zero-terminated char16_t array?
Use
char_traits<char16_t>::length(your_pointer)
See 21.2.3.2 struct char_traits<char16_t> and Table 62 of the C++11 standard.
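A minimal usage sketch:

#include <cstddef>     // std::size_t
#include <string>      // std::char_traits

int main()
{
    const char16_t *p = u"hello";                              // zero-terminated char16_t string
    std::size_t len = std::char_traits<char16_t>::length(p);   // 5; the terminator is not counted
    return static_cast<int>(len);
}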
Use pointers :). Create a duplicate pointer to the start, then loop through it with while (*endptr++);, and the length is given by endptr - startptr. You can actually template this; however, it's possible the compiler won't generate the same intrinsic code it does for strlen (for different sizes, of course).
Necrolis's answer includes the NUL terminator in the length, which is probably not what you want; strlen() does not include the terminator in the length.
Adapting their answer:
#include <stddef.h>   /* size_t */
#include <stdint.h>   /* uint16_t */

static size_t char16len(uint16_t *startptr)
{
    uint16_t *endptr = startptr;

    while (*endptr) {
        endptr++;
    }
    return endptr - startptr;
}
I realize there is an accepted C++ answer, but this answer is useful for anyone who stumbles on this post using C (like I did).