I need to detect idle time: the time since the user last provided any input to their machine. I've written this application before for Windows only, and this function worked brilliantly:
function IdleTime: DWord;
var
  LastInput: TLastInputInfo;
begin
  LastInput.cbSize := SizeOf(TLastInputInfo);
  GetLastInputInfo(LastInput);
  Result := (GetTickCount - LastInput.dwTime) div 1000;
end;
However, as far as I can tell, this function doesn't work in a Multi-Device Application. I've experimented with this for a while now and done a lot of googling, to no avail.
The target OSes are OS X and Windows.
The equivalent of GetLastInputInfo on OS X is CGEventSourceCounterForEventType.
See: https://developer.apple.com/library/mac/documentation/Carbon/Reference/QuartzEventServicesRef/index.html#//apple_ref/c/func/CGEventSourceCounterForEventType
See here: Detecting user-activity on mac os x
The API interface for this call is in: Macapi.CoreGraphics
So you'll need to add that unit to your uses clause.
If you're not familiar with OS X programming under Delphi, have a look at: https://delphihaven.wordpress.com
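If it helps as a starting point, here is a rough sketch of what the OS X side might look like. It assumes Macapi.CoreGraphics exposes CGEventSourceSecondsSinceLastEventType with its C signature (it belongs to the same Quartz Event Services family as CGEventSourceCounterForEventType); the constant values below are taken from the C headers and may already be declared in your Delphi version, so treat this as an outline rather than drop-in code:

```pascal
uses
  Macapi.CoreGraphics;

const
  kCGEventSourceStateHIDSystemState = 1;  // system-wide hardware input state
  kCGAnyInputEventType = $FFFFFFFF;       // matches any input event type

// Seconds elapsed since the last hardware input event (keyboard, mouse, ...)
function IdleSeconds: Double;
begin
  Result := CGEventSourceSecondsSinceLastEventType(
    kCGEventSourceStateHIDSystemState, kCGAnyInputEventType);
end;
```

On Windows you would keep the GetLastInputInfo version and select between the two implementations with {$IFDEF MACOS} / {$IFDEF MSWINDOWS} blocks.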
I have a problem and have searched for a solution without finding what I want. Sorry if it's actually simple; please just point me in the right direction.
So! I have a C program that acts as a loader. It must call my DLL written in Delphi or Lazarus (Free Pascal). The DLL is actually a standalone GUI application: during debugging I conditionally compile it as an EXE, and it works.
My build script compiles it as a DLL with one entry point that must run it exactly as it runs standalone. I expect exactly the same behavior, though I can do some things differently (especially setting the Application icon) if needed.
The loader is a console-style program compiled without a console: no windows, nothing. It just loads the DLL and calls a function.
The problem is that when I build even an empty default project with one form as an EXE, it actually has a "master" Application window (.Handle <> 0) in the taskbar, so I can set its title independently from the main form's caption.
But when the same thing is inside a DLL, there is no Application window (.Handle = 0), the taskbar title is the form caption, and, most importantly, the form cannot be minimized!
In Delphi 7 it goes into the background behind other windows (but the taskbar button stays!); in Lazarus it just minimizes to nowhere (hidden, with no way to restore it); in both cases without any minimize animation.
Other than that, my application seems to behave normally. This is the only issue I have.
OK, I know that forms in libraries are a bad thing to do, but:
I'm fine with instantiating "another" VCL completely independent of the host's instance, maybe even in a different thread.
There is no VCL in my particular host application! For me, it must work exactly as it does in a standalone EXE…
I searched for information about Application.Handle in a DLL, and now understand that I need to pass a handle to the host's Application object so the DLL joins the host's other forms, but I have none! The host isn't even Delphi… (and Application := TApplication.Create(nil); didn't help either)
Any of the following would probably help me:
A) How can I instruct the VCL to create a normal Application object for me? How does it do it in an EXE? Maybe I can copy that code.
B) How can I create a suitable master window from C (proper styles, etc.) and pass its handle to the DLL? Also, I believe Free Pascal has no direct access to the TApplication handle value, so I probably couldn't assign it.
C) How can I live without a taskbar window but have my form (good news: my program has only one form!) minimize correctly (or at least somehow…)?
I know you all love to see some code, so here it is:
// default empty project code, produces a valid working EXE:
program Project1;

uses
  Forms, Unit1 in 'Unit1.pas' {Form1};

{$R *.res}

begin
  Application.Initialize;
  Application.CreateForm(TForm1, Form1);
  Application.Run;
end.
// this is how I tried to put it in a DLL:
library Project1;

uses
  Forms, Unit1 in 'Unit1.pas' {Form1};

{$R *.res}

function entry(a, b, c, d: Integer): Integer; stdcall;
begin
  Application.Initialize;
  Application.CreateForm(TForm1, Form1);
  Application.Run;
  Result := 0;
end;

exports
  entry;

begin
end.
I crafted the entry() function specifically so it is callable with rundll32, just for testing.
I also tried putting the body directly into the "begin end." initialization section; same wrong behavior.
// To call the DLL, this can be used:
program Project1;

function entry(a, b, c, d: Integer): Integer; stdcall; external 'Project1.dll';

begin
  entry(0, 0, 0, 0);
end.
Also, the CMD command "rundll32 project1.dll entry" will run it directly. (Yes, that way I might get the handle that rundll32 gives me, but that isn't what I want anyway.)
Last notes: (a) the DLL must be compiled in Lazarus; at first I thought this was a bug in the LCL, but having now tested in Delphi 7 I see the same behavior, and since the Delphi case is simpler and more robust, I decided to present that here; (b) my C loader doesn't call LoadLibrary; it uses the TFakeDLL hack (the OBJ file was tweaked to work without the Delphi wrapper) and loads my DLL from memory (so I don't have a handle to the DLL itself), but otherwise the behavior is the same.
Okay, thanks to @Sertac Akyuz, I tried .ShowModal:
// working Delphi solution:
library Project1;

uses
  Forms, Dialogs, SysUtils, Unit1 in 'Unit1.pas' {Form1};

{$R *.res}

function entry(a, b, c, d: Integer): Integer; stdcall;
begin
  Result := 0;
  Application.Initialize;
  Form1 := TForm1.Create(nil);
  try
    Form1.ShowModal;
  except
    on e: Exception do
      ShowMessage(e.Message);
  end;
  Form1.Free;
end;

exports
  entry;

begin
end.
There is still no Application window (the taskbar title equals the form caption), but now my form can be minimized successfully (with the system animation). Note that for the EXE compilation I have to go the default route with Application, because when I tried to create the form this way it started minimizing to the desktop (iconifying) instead of to the taskbar.
It works perfectly in an empty default Lazarus project too. But when I tried to implement it in my production program, it gave me a "Disk Full" exception at .ShowModal!
That had frustrated me a little earlier (which is why I got rid of modality altogether and stopped trying it), but this time I was determined to get to the bottom of it.
And I found the problem! My build script didn't pass the "-WG" ("Specify graphic type application") compiler option. It looks like something in the LCL was running in a console-like mode, and the modal loop failed for some reason.
Then I hit another very confusing issue that I want to share. My form's OnCreate was rather big and complex (even starting other threads), and some internal function gave me an access violation when it tried to do something with one of the controls on the form. It looked as if the control, or the form itself, was not constructed yet…
It turns out that with the call Form1 := TForm1.Create(nil); the global variable "Form1" is, of course, still unassigned while the FormCreate event runs. The fix was simple: add Form1 := Self; at the beginning of TForm1.FormCreate(Sender: TObject);
Now everything works without any problems. I can even use other forms with a normal Form2.Show(); if I first create them in my entry() function, like Form2 := TForm2.Create(Form1);
(Edit: a minor note: if you use Lazarus and run the entry() function from a different thread than the one that loaded the DLL itself, you should put MainThreadID := GetCurrentThreadId(); just above Application.Initialize;)
Yay, this question is solved!
I am currently looking for a way around an apparent memory leak in the Mac implementation of the REST client. The code that produces the leak is the following (running XE8, Update 1):
program mac_REST_leak_test;

{$APPTYPE CONSOLE}

{$R *.res}

uses
  System.SysUtils, REST.Client, REST.Types, IPPeerClient;

var
  request: TRestRequest;
  ii, iMax: Integer;

begin
  iMax := 1;
  for ii := 0 to iMax do
  begin
    request := TRestRequest.Create(nil);
    // Fake online REST API for testing and prototyping
    request.Client := TRestClient.Create('http://jsonplaceholder.typicode.com/');
    request.Method := rmPOST;
    request.Execute();
    request.Client.Free();
    request.Free();
  end;
end.
This is the smallest block of code that demonstrates the leak. Essentially, I have a syncing service that makes REST requests every so often.
When I run this on Windows using MadExcept, no leaks are found, and examining the running process in Process Monitor shows no increase in memory use.
When run on a Mac, however, Activity Monitor shows the memory allocated to the app continually rising. Furthermore, when run under Instruments, there appear to be leaks involving several URL and HTTP classes on the Mac.
Does anybody know how to resolve this leak?
(As an aside, it would be really helpful to know exactly where the leak comes from on the Mac, but the only Delphi class listed is TMethodImplementationIntercept. I'm led to believe this is because Delphi doesn't generate a dSYM file for the Mac. If anybody knows a way around that, that would be awesome too!)
UPDATE
By varying iMax from 1 to 10 and comparing the FastMM4 output, the leak appears to be in the class Macapi.ObjectiveC.TConvObjID.XForm: the 10-iteration output contains 9 more leaks with this stack trace than the 1-iteration output. I have reported this to Embarcadero as RSP-12242.
Yes, FastMM4 has OS X leak reporting support in the latest SVN revision. Unfortunately, the "global" leaks from even a simple empty Delphi FMX application make it difficult to analyze the memory log file. A few leaks were fixed in XE10, but some objects in the Macapi.ObjectiveC bridge still generate leaks; I have reported that in Quality Central and Quality Portal (QC & QP). So it's difficult to use FastMM4 for leak hunting.
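For reference, a minimal sketch of a leak-reporting test harness. This uses Delphi's built-in ReportMemoryLeaksOnShutdown flag, which works with the default FastMM-based memory manager; with the full FastMM4 sources (and FullDebugMode) you get a more detailed log, subject to the noise described above:

```pascal
program LeakCheck;

{$APPTYPE CONSOLE}

uses
  System.SysUtils;

begin
  // Ask the memory manager to report unreleased Delphi objects at shutdown
  ReportMemoryLeaksOnShutdown := True;
  TObject.Create; // deliberately leaked: should appear in the report
end.
```

Note that this only catches Delphi-side allocations; leaks created inside the Objective-C bridge have to be chased with Instruments.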
Please separate Delphi object leaks from Objective-C leaks; the latter you can find with Instruments.
I am converting a threaded timer pool unit for cross platform use.
The current unit uses timeGetTime to ensure high accuracy and to report the actual elapsed interval when the timer event is called.
I have used gettimeofday on OS X before to get a high-resolution timer, but cannot find any reference to it for use in Delphi XE3.
I'm looking for help on how to call this from Delphi, or for an alternative cross-platform way to get a high-resolution timer. I want millisecond accuracy (I know it's OS-dependent).
Thanks in advance, Martin
A better, multi-platform-ready option may be to use the TStopwatch record from the System.Diagnostics unit.
TStopwatch is a true high-resolution timer where one is available, in which case it has close to nanosecond precision (depending on the OS and hardware); where one is not available (on Windows), it falls back to standard timer functions providing millisecond precision.
If you only want millisecond precision, use the ElapsedMilliseconds property, like this:
var
  sw: TStopwatch;
  ElapsedMilliseconds: Int64;
begin
  // TStopwatch is a record, so there is nothing to create or free explicitly
  sw := TStopwatch.StartNew;
  Whatever();
  sw.Stop;
  ElapsedMilliseconds := sw.ElapsedMilliseconds;
end;
TStopwatch relies on the QueryPerformanceFrequency/QueryPerformanceCounter functions on Windows and on mach_absolute_time on OS X.
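As a quick sanity check, TStopwatch also exposes class properties describing the underlying timer source, so you can verify at runtime that the high-resolution path is in use; a small sketch:

```pascal
uses
  System.Diagnostics, System.SysUtils;

var
  sw: TStopwatch;
begin
  // True when QueryPerformanceCounter / mach_absolute_time is available
  WriteLn('High resolution: ', TStopwatch.IsHighResolution);
  WriteLn('Ticks per second: ', TStopwatch.Frequency);

  sw := TStopwatch.StartNew;
  Sleep(50);
  sw.Stop;
  WriteLn('Elapsed: ', sw.ElapsedMilliseconds, ' ms');
end.
```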
The following code is to fade my application on close.
procedure TfrmMain.btnClose1Click(Sender: TObject);
var
  i: Integer;
begin
  for i := 255 downto 0 do
  begin
    frmMain.AlphaBlendValue := i;
    Application.ProcessMessages;
  end;
  Close;
end;
With Windows performance set to "Let Windows choose…", closing my Delphi app with the above code gives an almost instantaneous fade (maybe ¼ second at most; if I blink I miss the transition).
If I set the performance option to "Adjust for best performance", exiting the same app takes over 12 seconds to fade.
Using the same code but commenting out the AlphaBlendValue change removes the delay.
I tested this out on both Delphi 2010 and DelphiXE2 and the results are the same.
This was tested on Windows 7 Ultimate 64bit if that makes any difference.
To say the least this behavior puzzles me.
I thought the form's Alpha property was handled by the GPU and would therefore not be affected by Windows performance settings, which I assumed targeted maximizing CPU performance.
So I'm not sure whether this is a Windows 7 bug, a Delphi bug, or just my lack of knowledge.
As far as a fix...
Is there a way to tell whether Windows is running in low-quality/max-performance graphics mode, so that I can disable alpha fade effects in my apps?
Edit for clarity:
While I would like to fix the fade, what I am really looking for is a way to determine the Windows performance setting.
Specifically: when you go into Windows Performance Options there are three tabs. On the first tab, "Visual Effects", there are three canned options and a fourth, "Custom". Minimally I'd like to determine whether "Adjust for best performance" is chosen; being able to read the individual settings on this tab would be even better.
Appreciate any help.
The fundamental problem with your code is that you are forcing 256 distinct updates irrespective of the performance characteristics of the machine. You don't have to use every single alpha blend value between 255 and 0. You can skip some values and still have a smooth fade.
You need to account for the actual graphics performance of the machine. Since you cannot predict that, you should account for real time in your fade code. Doing so will give you a consistent rate of fade irrespective of the performance characteristics of your machine.
So, here's a simple example to demonstrate tying the fade rate to real time:
procedure TfrmMain.btnClose1Click(Sender: TObject);
var
  Stopwatch: TStopwatch;
  NewAlphaBlendValue: Integer;
begin
  Stopwatch := TStopwatch.StartNew;
  while True do
  begin
    NewAlphaBlendValue := 255 - (Stopwatch.ElapsedMilliseconds div 4);
    if NewAlphaBlendValue > 0 then
      AlphaBlendValue := NewAlphaBlendValue
    else
      Break;
  end;
  Close;
end;
The fade has a duration of 1 second. You can readily adjust the arithmetic to meet your requirements. This code will produce a smooth fade even on your low-performing machine.
I would also comment that you should not use the global variable frmMain in a TfrmMain method. The TfrmMain method already has access to the instance: it is Self, and of course you can omit Self. What's more, the call to ProcessMessages is bad: it allows re-entrant handling of queued input messages, which you don't want to happen. So remove the call to ProcessMessages.
You actually ask about detecting the Adjust for best performance setting. But I think that's the wrong question. For a start you should fix your fade code so that the fade duration is independent of graphics performance.
Having done that you may still wish to disable the fade if the user has asked for lower quality appearance settings. I don't think you should look for one of the 3 canned options that you mention. They are quite possibly Windows version specific. Personally I would base the behaviour on the Animate windows when minimizing and maximizing setting. My rationale is that if the user does not want minimize and maximize to be animated, then presumably they don't want window close to be faded.
Here's how to read that setting:
function GetWindowAnimation: Boolean;
var
  AnimationInfo: TAnimationInfo;
begin
  AnimationInfo.cbSize := SizeOf(AnimationInfo);
  if not SystemParametersInfo(SPI_GETANIMATION, AnimationInfo.cbSize,
    @AnimationInfo, 0) then
    RaiseLastOSError;
  Result := AnimationInfo.iMinAnimate <> 0;
end;
I think that most of the other settings that you may be concerned with can also be read using SystemParametersInfo. You should be able to work out how to do so by following the documentation.
Sorry for the tardy follow-up, but it took me a while to figure out a working answer to my question and some of the issues behind it.
First, a thank-you to David Heffernan for his insight on a better way to handle the fade loop and for pointing out TStopwatch from Delphi's Diagnostics unit; much appreciated.
Regarding being able to determine the Windows performance settings…
When using the following un-optimized fade loop,
procedure TfrmMain.btnFadeNCloseClick(Sender: TObject);
var
  i: Integer;
begin
  for i := 255 downto 0 do
    frmMain.AlphaBlendValue := i;
  Close;
end;
the actual Windows Performance Options settings causing the performance issue are "Enable desktop composition" and "Use visual styles on windows and buttons". If both options are enabled there is no issue; if either is disabled, the loop crawls** (about 12 seconds on my system if the form is maximized).
It turns out that turning Aero Glass on or off toggles these same two settings. So detecting whether Aero Glass is on lets me decide whether to enable form effects, such as transition fades and other eye candy, in my apps. Plus, I can now capture that information in my bug reports.
**Note: this appears to be an NVidia issue/bug, or at least an issue that is much more severe on systems with NVidia graphics cards. On two different NVidia systems (with recent, if not the latest, drivers) I got similar results for a maximized form fade: less than 0.001 seconds with Aero Glass on, around 12 seconds with it off. On a system with an Intel graphics card: less than 0.001 seconds with Aero Glass on, about 3.7 seconds with it off. Granted, my test sample is small, 3 NVidia systems (counting the customer who initially reported the issue) and one non-NVidia system, but if I were using a decent NVidia graphics card I would not bother turning Aero Glass off.
Below is the working Delphi code to detect whether Aero Glass is enabled:
This function has been tested on a Windows 7 64-bit system and works with Delphi 2007, 2010 and XE2 (32- & 64-bit).
All the variants of this function that I found on the net were broken, with comments from people complaining about Access Violation errors.
What finally shed light on fixing the bad code was Gerry Coll's response to AccessViolationException in Delphi - impossible (check it, unbelievable...), which was about fixing AV errors in a function of the same type.
function ISAeroEnabled: Boolean;
type
  _DwmIsCompositionEnabledFunc = function(var IsEnabled: BOOL): HRESULT; stdcall;
var
  Flag: BOOL;
  DllHandle: THandle;
  OsVersion: TOSVersionInfo;
  DwmIsCompositionEnabledFunc: _DwmIsCompositionEnabledFunc;
begin
  Result := False;
  ZeroMemory(@OsVersion, SizeOf(OsVersion));
  OsVersion.dwOSVersionInfoSize := SizeOf(TOSVersionInfo);
  if GetVersionEx(OsVersion) and (OsVersion.dwPlatformId = VER_PLATFORM_WIN32_NT) and
    (OsVersion.dwMajorVersion = 6) and (OsVersion.dwMinorVersion < 2) then // Vista & Win7 only (no Win8)
  begin
    DllHandle := LoadLibrary('dwmapi.dll');
    if DllHandle <> 0 then
    try
      @DwmIsCompositionEnabledFunc := GetProcAddress(DllHandle, 'DwmIsCompositionEnabled');
      if @DwmIsCompositionEnabledFunc <> nil then
      begin
        if DwmIsCompositionEnabledFunc(Flag) = S_OK then
          Result := Flag;
      end;
    finally
      FreeLibrary(DllHandle);
    end;
  end;
end;
Why might SCardEstablishContext hang, never to return, when called from a service?
I have code that works fine on lots of Windows installations. It accesses a Cherry keyboard's smart card reader (6x44) to read data from a smart card. It works fine on most PCs it has been tried on. However, on some PCs running Spanish Windows in Spain, the SCardEstablishContext function never returns, and I cannot work out why. I have logging on either side of it, but the log entry after it never appears. I then cannot shut the service down (the worker thread is stuck) and have to kill it.
Exactly the same thread code works fine when run from an application rather than a service. Giving the service the login settings of a user instead of SYSTEM makes no difference.
I've installed Spanish XP on a machine here, but it works just fine. The remote machines have the same Winscard.dll version as I have here (both at XP SP3 status). No errors are shown in the event log.
How might I work out what is going wrong, and what might fix it? (Delphi code below.)
// based on code by Norbert Huettisch
function TPCSCConnector.Init: Boolean;
var
  RetVar: LongInt;
  ReaderList: string;
  ReaderListSize: Integer;
  v: array[0..MAXIMUM_SMARTCARD_READERS] of string;
  i: Integer;
begin
  Result := False;
  FNumReaders := 0;
{$IFDEF MJ_ONLY}
  LogReport(leInformation, 'About to call SCardEstablishContext');
{$ENDIF}
  RetVar := SCardEstablishContext(SCARD_SCOPE_USER, nil, nil, @FContext);
{$IFDEF MJ_ONLY}
  // never gets to report this (and logging is known good, etc.)
  LogReport(leInformation, 'SCardEstablishContext result = ' + IntToStr(RetVar));
{$ENDIF}
  if RetVar = SCARD_S_SUCCESS then
  begin
There may be different reasons why the API function appears to hang: a deadlock, for example, or an invisible message box or dialog waiting for user input. You should try to get a stack trace using WinDbg.
You should also make sure you are reproducing the bug in the same environment. Important points include whether Fast User Switching is active, whether other users are logged on, and whether the same device drivers and services are running.