Convert TMediaPlayer->Duration to min:sec (FMX) - firemonkey

I'm working with the TMediaPlayer control (MediaPlayer1) in an FMX app using C++Builder 10.2, Version 25.0.29899.2631. The code below runs fine in Win32 and gives the expected result after loading an mp3 file that is 35 minutes, 16 seconds long.
When I run this same code targeting iOS, I get the following error:
[bcciosarm64 Error] Unit1.cpp(337): use of overloaded operator '/' is ambiguous (with operand types 'Fmx::Media::TMediaTime' and 'int')
Here is my code that takes MediaPlayer1->Duration and converts it to min:sec:
UnicodeString S = System::Ioutils::TPath::Combine(System::Ioutils::TPath::GetDocumentsPath(), "43506.mp3");
if (FileExists(S)) {
    MediaPlayer1->FileName = S;
    int sec = MediaPlayer1->Duration / 10000000; // <-- this is problem line
    int min = sec / 60;
    sec = sec - (60 * min);
    lblEndTime->Text = IntToStr(min) + ":" + IntToStr(sec);
}
How should I be doing that division?
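For reference, a sketch of one way the division can be written so the compiler has nothing ambiguous to resolve: cast the TMediaTime to __int64 first (the same cast used in UPDATE 1 below) and divide by MediaTimeScale instead of the literal 10000000. I'm not claiming this is the official way, just that the cast removes the overload ambiguity:
__int64 ticks = (__int64) MediaPlayer1->Duration;  // plain 64-bit integer, no overloaded operator/
int sec = (int)(ticks / MediaTimeScale);           // MediaTimeScale = 10,000,000 ticks per second
int min = sec / 60;
sec = sec - (60 * min);
lblEndTime->Text = IntToStr(min) + ":" + IntToStr(sec);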
UPDATE 1: I fumbled around and figured out how to see the values with the code below. When I run on Win32 I get 21169987500 for the Duration (35 min, 16 sec) and 10000000 for MediaTimeScale - both correct. When I run on iOS I get 0 for the Duration and 10000000 for MediaTimeScale. But if I start the audio playing first (e.g. MediaPlayer1->Play();) and THEN run those two ShowMessage calls, I get the correct result for the Duration.
MediaPlayer1->FileName = S; // load the mp3
ShowMessage(IntToStr((__int64) Form1->MediaPlayer1->Media->Duration));
ShowMessage(IntToStr((__int64) MediaTimeScale));
It looks like the Duration does not get set on iOS until the audio actually starts playing. I tried a 5-second delay after setting MediaPlayer1->FileName, but that doesn't work. I tried a MediaPlayer1->Play(); followed by MediaPlayer1->Stop();, but that didn't work either.
Why isn't the Duration set when the FileName is assigned? I'd like to show the Duration before the user ever starts playing the audio.

Related

How to automatically check time?

Is it possible to automatically check the time and then execute certain code?
timer = os.date('%H:%M:%S', os.time() - 13 * 60 * 60)
if timer == "18:04:40" then
    print("hello")
end
I am trying to print hello at "18:04:40" every day (os.date's time) without setting up a timer (which counts how much time has passed since the program started), as I can't run the program 24 hours non-stop...
Thanks for reading.
This may not be the best solution, but when using a library like love2d, for example, you could run something like this:
function love.update(dt)
    timer = os.date('%H:%M:%S', os.time() - 13 * 60 * 60)
    if timer >= value then
        --stuff here
    end
end
Or, if you want to accumulate whole seconds and only run the check about once per second, something like:
tick = 0
function love.update(dt)
    tick = tick + dt
    if tick > 1 then
        tick = tick - 1 -- reset the accumulator so the check runs roughly once per second
        timer = os.date('%H:%M:%S', os.time() - 13 * 60 * 60)
        if timer >= value then
            --stuff here
        end
    end
end
Lua has to check the time in some way.
Without a loop, that can be done with debug.sethook().
Example with Lua 5.1 typed in an interactive Lua (lua -i)...
> print(_VERSION)
Lua 5.1
> debug.sethook() -- This clears a defined hook
> -- Next set up a hook function that fires on 'line' events
> debug.sethook(function() local hour, min, sec = 23, 59, 59 print(os.date('%H:%M:%S', os.time({year = 2021, month = 12, day = 11, hour = hour, min = min, sec = sec}))) end, 'l')
-- just hit return/enter or do other things
23:59:59
5.9 - The Debug Library
https://www.lua.org/manual/5.1/manual.html#5.9

Tibco Spotfire - time in seconds & milliseconds in Real, convert to a time of day

I have a list of time in a decimal format of seconds, and I know what time the series started. I would like to convert it to a time of day with the offset of the start time applied. There must be a simple way to do this that I am really missing!
Sample source data:
\Name of source file : 260521-11_58
\Recording from 26.05.2021 11:58
\Channels : 1
\Scan rate : 101 ms = 0.101 sec
\Variable 1: n1(rpm)
\Internal identifier: 63
\Information1:
\Information2:
\Information3:
\Information4:
0.00000 3722.35645
0.10100 3751.06445
0.20200 1868.33350
0.30300 1868.36487
0.40400 3722.39355
0.50500 3722.51831
0.60600 3722.50464
0.70700 3722.32446
0.80800 3722.34277
0.90900 3722.47729
1.01000 3722.74048
1.11100 3722.66650
1.21200 3722.39355
1.31300 3751.02710
1.41400 1868.27539
1.51500 3722.49097
1.61600 3750.93286
1.71700 1868.30334
1.81800 3722.29224
The start time & date is 26.05.2021 11:58, and the left-hand column is the elapsed time in seconds, with the column name [Time]. So I just want to convert the decimal/real value to a time or timespan and add the start time to it.
I have tried lots of ways that are really hacky and ultimately flawed; the expression below works, but it just ignores the milliseconds.
TimeSpan(0,0,0,Integer(Floor([Time])),[Time] - Integer(Floor([Time])))
The last part works to just get the milli/microseconds on its own, but not as part of the above.
Your formula isn't really ignoring the milliseconds; you are passing the decimal part of your time (which is in seconds) as milliseconds, so the value returned is smaller than the format mask.
You need to convert the seconds to milliseconds, so something like this should work:
TimeSpan(0,0,0,Integer(Floor([Time])),([Time] - Integer(Floor([Time]))) * 1000)
To add it to the time, this would work:
DateAdd(Date("26-May-2021"),TimeSpan(0,0,0,Integer([Time]),([Time] - Integer([Time])) * 1000))
You will need to set the column format to
dd-MMM-yyyy HH:mm:ss:fff
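The same arithmetic, sketched outside Spotfire in C++ purely to illustrate the seconds-to-milliseconds split (the sample value and the 26.05.2021 11:58 start time come from the data above; this is not Spotfire syntax):
#include <chrono>
#include <cmath>
#include <cstdio>
#include <ctime>

int main() {
    double elapsed = 1.313;  // one [Time] value from the sample, in seconds

    // Split into whole seconds plus the fractional remainder as milliseconds,
    // which is what the TimeSpan(...) expression above does.
    int whole_sec = static_cast<int>(std::floor(elapsed));
    int millis    = static_cast<int>(std::lround((elapsed - whole_sec) * 1000.0));

    // Recording start: 26.05.2021 11:58 (local time).
    std::tm start_tm = {};
    start_tm.tm_year = 2021 - 1900;
    start_tm.tm_mon  = 5 - 1;
    start_tm.tm_mday = 26;
    start_tm.tm_hour = 11;
    start_tm.tm_min  = 58;
    auto start = std::chrono::system_clock::from_time_t(std::mktime(&start_tm));

    // Start time plus the elapsed offset, as done by DateAdd(...) above.
    auto stamp = start + std::chrono::seconds(whole_sec) + std::chrono::milliseconds(millis);
    (void)stamp;

    std::printf("%.3f s -> %d s + %d ms after 11:58\n", elapsed, whole_sec, millis);
    return 0;
}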

how to interpret sinfo cpu load %O?

sinfo --format "%O" gives the load of nodes.
Is this an average value of a specific time period?
And how is this value related with the load averages (1m,5m,15m) of uptime command?
Thanks
Yes, it returns the 5-minute load average value.
Slurm uses sysinfo to measure the CPU load value (I am using Slurm 15.08.5).
In the source code of Slurm, the following lines measure the CPU load value:
float shift_float = (float) (1 << SI_LOAD_SHIFT);
if (sysinfo(&info) < 0) {
    *cpu_load = 0;
    return errno;
}
*cpu_load = (info.loads[1] / shift_float) * 100.0;
From the sysinfo man page:
unsigned long loads[3]; /* 1, 5, and 15 minute load averages */
info.loads[1] is the 5-minute average; it is the same value exposed in /proc/loadavg.
To understand why SI_LOAD_SHIFT is used, please read the reference.
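For completeness, a minimal standalone sketch (Linux only, assuming glibc's <sys/sysinfo.h>; the SI_LOAD_SHIFT fallback is only there in case the header does not expose it) that prints the same 5-minute value:
#include <sys/sysinfo.h>
#include <cstdio>

#ifndef SI_LOAD_SHIFT
#define SI_LOAD_SHIFT 16  /* fixed-point shift the kernel uses for loads[] */
#endif

int main() {
    struct sysinfo info;
    if (sysinfo(&info) < 0) {
        std::perror("sysinfo");
        return 1;
    }
    /* loads[0], loads[1], loads[2] = 1, 5 and 15 minute averages (fixed point) */
    float shift = (float) (1 << SI_LOAD_SHIFT);
    std::printf("5-minute load: %.2f%%\n", (info.loads[1] / shift) * 100.0f);
    return 0;
}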

ExSetTimerResolution does not work

I have got Windows XP installed on my computer.
I want my DPC routine to be called every 10 ms.
That is why I wrote this code:
ASSERT( KeGetCurrentIrql() <= APC_LEVEL );
KeRaiseIrql( APC_LEVEL, &level );
resolution = ExSetTimerResolution( 100000, TRUE );      // DesiredTime is in 100-ns units: 100000 = 10 ms
KdPrint((DRIVERNAME " - RESOLUTION = %d\n", resolution));
KeLowerIrql( level );
KeSetTimerEx( &pExt->timer, duetime, 10, &pExt->dpc );  // Period is in milliseconds: fire every 10 ms
DebugView shows me that return value (RESOLUTION) equals 156250.
As a result my DPC routine is called every 15.5 ms
What am I doing wrong?
Out of curiosity I tried ExSetTimerResolution with different values.
Here is what I got:
10000 -> 9766
50000 -> 39063
75000 -> 39063
90000 -> 156250
The left column contains the values I used as the DesiredTime parameter; the right column contains the return values. Both are in 100-ns units, so 9766 ≈ 0.98 ms, 39063 ≈ 3.9 ms and 156250 = 15.625 ms.
As you can see, it looks like Windows cannot change the global timer resolution to an arbitrary value.

Getting surprising elapsed time in Windows and Linux

I have written a function which is platform independent and works nicely in Windows as well as Linux. I wanted to check the execution time of that function. I am using QueryPerformanceCounter to calculate the execution time in Windows and gettimeofday in Linux.
The problem is that in Windows the execution time is 60 milliseconds and in Linux it shows 4 ms. That is a huge difference between them. Can anybody suggest what might have gone wrong, or if anybody knows some other APIs better than these to calculate elapsed time, please let me know...
Here is the code I have written using gettimeofday:
#include <sys/time.h>
#include <iostream>
using namespace std;

int main()
{
    timeval start_time;
    timeval end_time;
    gettimeofday(&start_time, NULL);
    function_invoke(........);
    gettimeofday(&end_time, NULL);
    timeval res;
    timersub(&end_time, &start_time, &res); // end - start (not start - end)
    cout << "function_invoke took seconds = " << res.tv_sec << endl;
    cout << "function_invoke took microsec = " << res.tv_usec << endl;
    return 0;
}
OUTPUT :
function_invoke took seconds = 0
function_invoke took microsec = 4673 (4.673 milliseconds)
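For comparison, a minimal sketch of the Windows-side measurement with QueryPerformanceCounter (the call to the function being timed is only a placeholder comment here, since its arguments are not shown above):
#include <windows.h>
#include <iostream>

int main()
{
    LARGE_INTEGER freq, start, end;
    QueryPerformanceFrequency(&freq);   // counter ticks per second
    QueryPerformanceCounter(&start);
    // function_invoke(...) would be called here
    QueryPerformanceCounter(&end);
    double elapsed_ms = (end.QuadPart - start.QuadPart) * 1000.0 / (double) freq.QuadPart;
    std::cout << "function_invoke took milliseconds = " << elapsed_ms << std::endl;
    return 0;
}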
