Getting surprising elapsed time on Windows and Linux

I have written a platform-independent function that works nicely on Windows as well as Linux, and I wanted to check its execution time. I am using QueryPerformanceCounter to measure the execution time on Windows and gettimeofday on Linux.
The problem is that on Windows the execution time is 60 milliseconds, while on Linux it shows 4 ms, which is a huge difference between them. Can anybody suggest what might have gone wrong? Or, if anybody knows other APIs better than these for calculating elapsed time, please let me know.
Here is the code I have written using gettimeofday:
#include <sys/time.h>
#include <iostream>
using std::cout;
using std::endl;

int main()
{
    timeval start_time;
    timeval end_time;
    gettimeofday(&start_time, NULL);
    function_invoke(........);
    gettimeofday(&end_time, NULL);
    timeval res;
    // timersub(a, b, res) computes a - b, so the later timestamp goes first
    timersub(&end_time, &start_time, &res);
    cout << "function_invoke took seconds = " << res.tv_sec << endl;
    cout << "function_invoke took microsec = " << res.tv_usec << endl;
    return 0;
}
OUTPUT :
function_invoke took seconds = 0
function_invoke took microsec = 4673 (4.673 milliseconds)
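For reference, a minimal sketch of the Windows-side timing with QueryPerformanceCounter might look like the following (my own illustration, not the poster's actual code; the measured call is elided just as in the question):
#include <windows.h>
#include <iostream>

int main()
{
    LARGE_INTEGER freq, start, stop;
    QueryPerformanceFrequency(&freq);   // ticks per second
    QueryPerformanceCounter(&start);
    // function_invoke(........);       // the function under test (elided in the question)
    QueryPerformanceCounter(&stop);
    double ms = (stop.QuadPart - start.QuadPart) * 1000.0 / freq.QuadPart;
    std::cout << "function_invoke took milliseconds = " << ms << std::endl;
    return 0;
}
Note that QueryPerformanceCounter and gettimeofday both measure wall-clock time, so a 60 ms vs 4 ms gap usually points at differences in the builds or environments rather than at the timers themselves.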

Related

Convert TMediaPlayer->Duration to min:sec (FMX)

I'm working with the TMediaPlayer1 control in an FMX app using C++ Builder 10.2 Version 25.0.29899.2631. The code below runs fine in Win32 and gives the expected result after loading an mp3 file that is 35 minutes, 16 seconds long.
When I run this same code targeting iOS I get the following error:
[bcciosarm64 Error] Unit1.cpp(337): use of overloaded operator '/' is ambiguous (with operand types 'Fmx::Media::TMediaTime' and 'int')
Here is my code that takes TMediaPlayer1->Duration and converts it to min:sec:
UnicodeString S = System::Ioutils::TPath::Combine(System::Ioutils::TPath::GetDocumentsPath(),"43506.mp3");
if (FileExists(S)) {
MediaPlayer1->FileName = S;
int sec = MediaPlayer1->Duration / 10000000; // <-- this is problem line
int min = sec / 60;
sec = sec - (60 * min);
lblEndTime->Text = IntToStr(min) + ":" + IntToStr(sec);
}
How should I be doing that division?
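One cast that might resolve the ambiguity (a sketch only, mirroring the __int64 cast used in UPDATE 1 below; not verified against the iOS compiler):
__int64 durationTicks = (__int64) MediaPlayer1->Duration; // force a single, unambiguous conversion first
int sec = (int)(durationTicks / 10000000);                // then the division is plain integer math
int min = sec / 60;
sec = sec - (60 * min);
lblEndTime->Text = IntToStr(min) + ":" + IntToStr(sec);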
UPDATE 1: I fumbled around and figured out how to see the values with the code below. When I run on Win32 I get 21169987500 for the Duration (35 min, 16 seconds) and I get 10000000 for MediaTimeScale - both correct. When I run on iOS I get 0 for Duration and 10000000 for MediaTimeScale. But, if I start the audio playing (e.g. MediaPlayer1->Play();) first and THEN run those 2 ShowMessages I get the correct result for Duration.
MediaPlayer1->FileName = S; // load the mp3
ShowMessage(IntToStr((__int64) Form1->MediaPlayer1->Media->Duration));
ShowMessage(IntToStr((__int64) MediaTimeScale));
It looks like the Duration does not get set on iOS until the audio actually starts playing. I tried a 5 second delay after setting MediaPlayer1->FileName but that doesn't work. I tried a MediaPlayer1->Play(); followed by MediaPlayer1->Stop(); but that didn't work either.
Why isn't Duration set when the FileName is assigned? I'd like to show the Duration before the user ever starts playing the audio.

Algorithm to calculate a date for complex occupation management

Hello fellow Stack Overflowers,
I have a situation where I need some help choosing the best way to make an algorithm work. The objective is to manage the occupation of a resource (let's consider resource A) that has multiple tasks, where each task takes a specified amount of time to complete. At this first stage I don't want to involve multiple variables, so let's keep it simple and consider only his schedule of working days.
For example:
1 - We have 1 resource, resource A
2 - Resource A works from 8 am to 4 pm, Monday to Friday; to keep it simple for now, he doesn't take lunch, so, 8 hours of work a day.
3 - Resource A has 5 tasks to complete; to avoid complexity at this level, let's suppose each one will take exactly 10 hours to complete.
4 - Resource A will start working on these tasks at 2018-05-16, exactly at 2 pm.
Problem:
Now, all I need to know is the correct finish date for all the 5 tasks, but considering all the previous limitations.
In this case he needs 50 hours in total: 2 hours on the first day (2 pm to 4 pm) plus six further full working days.
The expected result that I want would be: 2018-05-24 (at 4 pm).
Implementation:
I thought about 2 options, and would like to have feedback on these options, or on other options that I might not be considering.
Algorithm 1
1 - Create a list of "slots", where each "slot" would represent 1 hour, for x days.
2 - Cross this list of slots with the hour schedule of the resource, to remove all the slots where the resource isn't working. This would return a list with the slots in which he can actually work.
3 - Occupy the remaining slots with the tasks that I have for him.
4 - Finally, check the date/hour of the last occupied slot.
Disadvantage: I think this might be an overkill solution, considering that I don't want to track his occupation into the future; all I want to know is when the tasks will be completed.
Algorithm 2
1 - Add the task hours (50 hours) to the starting date, getting the expectedFinishDate. (Would get expectedFinishDate = 2018-05-18 (at 4 pm))
2 - Cross the hours, between starting date and expectedFinishDate with the schedule, to get the quantity of hours that he won't work. (would basically get the unavailable hours, 16 hours a day, would result in remainingHoursForCalc = 32 hours).
3 - Calculate the new expectedFinishDate with the unavailable hours; this would add those 32 hours to the previous 2018-05-18 (at 4 pm).
4 - Repeat points 2 and 3 with the new expectedFinishDate until remainingHoursForCalc = 0.
Disadvantage: This would result in a recursive method or in a very weird while loop; again, I think this might be overkill for calculating a simple date.
What would you suggest? Is there any other option that I might not be considering that would make this simpler? Or do you think there is a way to improve either of these 2 algorithms to make it work?
Improved version:
import java.util.Calendar;
import java.util.Date;
public class Main {
public static void main(String args[]) throws Exception
{
Date d=new Date();
System.out.println(d);
d.setMinutes(0);
d.setSeconds(0);
d.setHours(13);
Calendar c=Calendar.getInstance();
c.setTime(d);
c.set(Calendar.YEAR, 2018);
c.set(Calendar.MONTH, Calendar.MAY);
c.set(Calendar.DAY_OF_MONTH, 17);
//c.add(Calendar.HOUR, -24-5);
d=c.getTime();
//int workHours=11;
int hoursArray[] = {1,2,3,4,5, 10,11,12, 19,20, 40};
for(int workHours : hoursArray)
{
try
{
Date end=getEndOfTask(d, workHours);
System.out.println("a task starting at "+d+" and lasting "+workHours
+ " hours will end at " +end);
}
catch(Exception e)
{
System.out.println(e.getMessage());
}
}
}
public static Date getEndOfTask(Date startOfTask, int workingHours) throws Exception
{
int totalHours=0;//including non-working hours
//startOfTask +totalHours =endOfTask
int startHour=startOfTask.getHours();
if(startHour<8 || startHour>16)
throw new Exception("a task cannot start outside the working hours interval");
System.out.println("startHour="+startHour);
int startDayOfWeek=startOfTask.getDay();//start date's day of week; Wednesday=3
System.out.println("startDayOfWeek="+startDayOfWeek);
if(startDayOfWeek==6 || startDayOfWeek==0)
throw new Exception("a task cannot start on Saturdays on Sundays");
int remainingHoursUntilDayEnd=16-startHour;
System.out.println("remainingHoursUntilDayEnd="+remainingHoursUntilDayEnd);
/*some discussion here: if the task starts at 12:30, we have 3h30min
* until the end of the working day; however, getHours() will return 12, which
* subtracted from 16 will give 4h. It will work fine if the task starts at 12:00,
* or, generally, at the beginning of the hour; let's assume a task will start at HH:00*/
int remainingDaysUntilWeekEnd=5-startDayOfWeek;
System.out.println("remainingDaysUntilWeekEnd="+remainingDaysUntilWeekEnd);
int completeWorkDays = (workingHours-remainingHoursUntilDayEnd)/8;
System.out.println("completeWorkDays="+completeWorkDays);
//excluding both the start day, and the end day, if they are not fully occupied by the task
int workingHoursLastDay=(workingHours-remainingHoursUntilDayEnd)%8;
System.out.println("workingHoursLastDay="+workingHoursLastDay);
/* workingHours=remainingHoursUntilDayEnd+(8*completeWorkDays)+workingHoursLastDay */
int numberOfWeekends=(int)Math.ceil( (completeWorkDays-remainingDaysUntilWeekEnd)/5.0 );
if((completeWorkDays-remainingDaysUntilWeekEnd)%5==0)
{
if(workingHoursLastDay>0)
{
numberOfWeekends++;
}
}
System.out.println("numberOfWeekends="+numberOfWeekends);
totalHours+=(int)Math.min(remainingHoursUntilDayEnd, workingHours);//covers the case
//when the task lasts 1 or 2 hours, and we have maybe 4h until end of day; that's why I use Math.min
if(completeWorkDays>0 || workingHoursLastDay>0)
{
totalHours+=8;//the hours of the current day between 16:00 and 24:00
//it might be the case that completeWorkDays is 0, yet the task spans up to tomorrow
//so we still have to add these 8h
}
if(completeWorkDays>0)//redundant if, because 24*0=0
{
totalHours+=24*completeWorkDays;//for every 8 working h, we have a total of 24 h that have
//to be added to the date
}
if(workingHoursLastDay>0)
{
totalHours+=8;//the hours between 00.00 AM and 8 AM
totalHours+=workingHoursLastDay;
}
if(numberOfWeekends>0)
{
totalHours+=48*numberOfWeekends;//every weekend between start and end dates means two days
}
System.out.println("totalHours="+totalHours);
Calendar calendar=Calendar.getInstance();
calendar.setTime(startOfTask);
calendar.add(Calendar.HOUR, totalHours);
return calendar.getTime();
}
}
You may adjust hoursArray[], or d.setHours() along with c.set(Calendar.DAY_OF_MONTH, ...), to test various start dates along with various task lengths.
There is still a bug, due to the addition of the 8 hours between 16:00 and 24:00:
a task starting at Thu May 17 13:00:00 EEST 2018 and lasting 11 hours will end at Sat May 19 00:00:00 EEST 2018.
I've kept a lot of print statements; they are useful for debugging purposes.
I agree that algorithm 1 is overkill.
I think I would first make sure I had the conditions right: hours per day (8) and working days (Mon, Tue, Wed, Thu, Fri). I would then divide the hours required (5 * 10 = 50) by the hours per day so I know a minimum of how many working days are needed (50 / 8 = 6, plus a remainder). Slightly more advanced: divide by hours per week first (50 / 40 = 1 week, plus a remainder). Count working days from the start date to get a first shot at the end date. There was probably a remainder from the division, so use it to determine whether the tasks can end on that day or run into the next working day.
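A rough java.time sketch of that counting idea (my own illustration, assuming tasks start at a whole hour inside the 8:00-16:00 window on a weekday; not the code from the answer above):
import java.time.DayOfWeek;
import java.time.LocalDateTime;

public class FinishDate {
    static final int DAY_START = 8, DAY_END = 16;   // 8 am to 4 pm

    public static LocalDateTime finish(LocalDateTime start, int workingHours) {
        LocalDateTime cursor = start;
        int remaining = workingHours;
        while (remaining > 0) {
            int availableToday = DAY_END - cursor.getHour();   // hours left in this working day
            if (remaining <= availableToday) {
                return cursor.plusHours(remaining);             // finishes within this day
            }
            remaining -= availableToday;
            // jump to 8:00 on the next working day, skipping weekends
            cursor = cursor.plusDays(1).withHour(DAY_START);
            while (cursor.getDayOfWeek() == DayOfWeek.SATURDAY
                    || cursor.getDayOfWeek() == DayOfWeek.SUNDAY) {
                cursor = cursor.plusDays(1);
            }
        }
        return cursor;
    }

    public static void main(String[] args) {
        // 5 tasks * 10 h = 50 h starting 2018-05-16 14:00 -> expected 2018-05-24 16:00
        System.out.println(finish(LocalDateTime.of(2018, 5, 16, 14, 0), 50));
    }
}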

Get duration in microseconds

Considering the example:
final Duration twoSeconds = Duration.ofSeconds(2);
// final long microseconds = twoSeconds.get(ChronoUnit.MICROS); throws UnsupportedTemporalTypeException: Unsupported unit: Micros
final long microseconds = twoSeconds.toNanos() / 1000L;
System.out.println(microseconds);
I wonder if there is a nicer way to get a Duration in microseconds than converting manually from nanoseconds.
I wouldn’t use the java.time API for such a task, as you can simply use
long microseconds = TimeUnit.SECONDS.toMicros(2);
from the concurrency API which works since Java 5.
However, if you have an already existing Duration instance or any other reason to insist on using the java.time API, you can use
Duration existingDuration = Duration.ofSeconds(2);
// Since Java 8
long microseconds8_1 = existingDuration.toNanos() / 1000;
// More idiomatic way
long microseconds8_2 = TimeUnit.NANOSECONDS.toMicros(existingDuration.toNanos());
// Since Java 9
long microseconds9 = existingDuration.dividedBy(ChronoUnit.MICROS.getDuration());
// Since Java 11
long microseconds11 = TimeUnit.MICROSECONDS.convert(existingDuration);
Based on Holger's answer, my favorite would be:
final long microseconds = TimeUnit.NANOSECONDS.toMicros(twoSeconds.toNanos());

Optimizing Groovy Performance

I'm working on Groovy code performance optimization. I've used jvisualvm to connect to the running application and gather CPU samples. The samples say that org.codehaus.groovy.reflection.CachedMethod.invoke takes the most CPU time. I don't see any other application methods in the samples.
What is the right way to dig into CachedMethod.invoke and understand what code lines really give performance penalties?
Thanks.
UPD:
I do use indy; it didn't help me.
I didn't try to introduce @CompileStatic since I want to find my bottlenecks before rewriting Groovy to Java.
My problem is a bit similar to this thread: Call site caching faster than invokedynamic?
I have code that dynamically composes a Groovy script. The script template looks like this:
def evaluateExpression(Map context){
def user = context.user
%s
}
where %s is replaced with
user.attr1 == '1' || user.attr2 == '2' || user.attr3 == '3'
There is a set of replacements (20 in total) taken from the database.
The code gets the replacements from the DB, creates a Groovy script and evaluates it.
I suppose the bottleneck is in the script execution. What is the right way to fix it?
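One common way to attack that kind of bottleneck (a sketch only, not the poster's code) is to parse each generated expression once and cache the compiled Script, so repeated evaluations skip the per-call parse/compile step:
import java.util.concurrent.ConcurrentHashMap

class ExpressionCache {
    private final GroovyShell shell = new GroovyShell()
    private final Map<String, Script> cache = new ConcurrentHashMap<String, Script>()

    Object evaluate(String expression, Map context) {
        // compile the generated script only the first time this expression is seen
        Script script = cache.computeIfAbsent(expression) { String expr ->
            shell.parse("def user = context.user\n" + expr)
        }
        // note: sharing one Script instance across threads would need extra care
        script.binding = new Binding([context: context])
        return script.run()
    }
}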
So, I've tried various things:
groovy-indy: doesn't work.
groovy-indy with some code "optimization": doesn't work. BTW, I started to play around with try/catch and as a result I made my "hotspot" run 4 times faster. I'm not good at JVM internals, but the internet says try/catch prevents optimizations; I took that as ground truth. I need to dig deeper to understand how it really works.
I gave up, turned off invokedynamic and rewrote my "hottest" code with @CompileStatic. It took about 3-4 hours and my code runs 100 times faster now.
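A hypothetical sketch of what such a statically compiled variant might look like (the poster's actual rewrite is not shown; attr1/attr2/attr3 are the placeholder attributes from the template above):
import groovy.transform.CompileStatic

@CompileStatic
class StaticExpression {
    static boolean evaluate(Map context) {
        Map user = (Map) context.get('user')
        return user.get('attr1') == '1' || user.get('attr2') == '2' || user.get('attr3') == '3'
    }
}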
Here are initial metrics with "invokedynamic support"
count = 83043
mean rate = 395.52 calls/second
1-minute rate = 555.30 calls/second
5-minute rate = 217.78 calls/second
15-minute rate = 82.92 calls/second
min = 0.29 milliseconds
max = 12.98 milliseconds
mean = 1.59 milliseconds
stddev = 1.08 milliseconds
median = 1.39 milliseconds
75% <= 2.46 milliseconds
95% <= 3.14 milliseconds
98% <= 3.44 milliseconds
99% <= 3.76 milliseconds
99.9% <= 12.19 milliseconds
Here are the @CompileStatic metrics with indy turned off. BTW, there is no reason to use @CompileStatic if indy is turned on.
count = 139724
mean rate = 8950.43 calls/second
1-minute rate = 2011.54 calls/second
5-minute rate = 426.96 calls/second
15-minute rate = 143.76 calls/second
min = 0.02 milliseconds
max = 24.18 milliseconds
mean = 0.08 milliseconds
stddev = 0.72 milliseconds
median = 0.06 milliseconds
75% <= 0.08 milliseconds
95% <= 0.11 milliseconds
98% <= 0.15 milliseconds
99% <= 0.20 milliseconds
99.9% <= 1.27 milliseconds

How to get the current time in millisecond in Fortran?

I want to get the current system time in milliseconds in Fortran. I can't use system_clock, because I can't figure out how to get the current time from it.
This illustrates how to get the time since midnight in milliseconds using date_and_time:
program time
integer :: values(8)
real :: rTime
! Get the values
call date_and_time(values=values)
! From https://gcc.gnu.org/onlinedocs/gfortran/DATE_005fAND_005fTIME.html
! values(5) ... The hour of the day
! values(6) ... The minutes of the hour
! values(7) ... The seconds of the minute
! values(8) ... The milliseconds of the second
! Calculate time since midnight
rTime = ( values(5) )*60. ! Hours to minutes
rTime = ( rTime + values(6) )*60. ! Minutes to seconds
rTime = ( rTime + values(7) )*1e3 ! Seconds to milliseconds
rTime = rTime + values(8) ! Add milliseconds
! Time in milliseconds
print *, 'Time (ms) since midnight', rTime
end program
I think your question is: "how can I call the date_and_time subroutine and use it to calculate milliseconds?" Am I right?
Alexander's answer is correct. Also, you can use this code:
program time
integer :: values(8)
call date_and_time(values=values)
print *, values(5),":",values(6),":",values(7),":",values(8)
end program time
You can also use ITIME(), a nonstandard (GNU) extension subroutine. It fills a 3-element integer array with the current hour, minute and second, so it only has one-second resolution; calling it twice and subtracting the two readings gives an elapsed time to the nearest second rather than in milliseconds.
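A minimal sketch of that call under gfortran (this assumes the itime extension is available; it is not standard Fortran):
program itime_demo
integer :: t(3)
call itime(t) ! GNU extension: fills hour, minute, second
print *, t(1), ":", t(2), ":", t(3)
end program itime_demo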

Resources