Disable F7 caret browsing prompt in UWP development - xamarin

We are developing a Xamarin application targeting mobile and desktop (UWP). We found that the F7 key is bound to a prompt dialog about "caret browsing".
It looks like the implementation of this functionality is application-dependent and the dialog only provides a setting to toggle it (please correct me if I am wrong).
We do not intend to implement this behavior and want to use the key for another purpose (our web app uses it and we are replicating the same functionality; long-time customers are familiar with it).
After hours of searching, I could not find a way to permanently disable the prompt. It does not make sense for MS to show this dialog when we have no intention of implementing the feature!
Has anyone managed to solve this problem and can help us?

Related

Intercepting keyboard and mouse events from focused applications on OS X

Soon I will have to work with OS X and tools like hammerspoon are missing some important capabilities for me. I need to be able to intercept keyboard and mouse events completely from the focused application. Say I ctrl+alt+apple+left_click on an application, I don't want the application to know about that left click. So far the only thing I came up with was to build a transparent fullscreen application, though I'm not sure how feasible that is yet.
Any better idea or hints how to go about this in a language of your choice?
Thanks!
You will need to create an event tap. However, the application will have to run as the root user, or the user will have to grant the application access to the accessibility features.
Apple's documentation can be found here.
Interestingly enough, I am in the process of writing a blog post about how to use event taps (including an Objective-C API that I wrote for my own use), but the post won't be made available for another week or so.
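In the meantime, here is a minimal sketch of an event tap driven from Python via the Quartz bindings in pyobjc (pip install pyobjc-framework-Quartz). It assumes the goal is just to swallow ctrl+alt+cmd+left-click; the process still needs to run as root or be granted Accessibility rights, as described above.

    import Quartz

    def tap_callback(proxy, event_type, event, refcon):
        # Swallow ctrl+alt+cmd+left-click; pass every other event through.
        if event_type == Quartz.kCGEventLeftMouseDown:
            wanted = (Quartz.kCGEventFlagMaskControl
                      | Quartz.kCGEventFlagMaskAlternate
                      | Quartz.kCGEventFlagMaskCommand)
            if (Quartz.CGEventGetFlags(event) & wanted) == wanted:
                return None                 # returning None drops the event
        return event                        # unchanged events reach the app

    tap = Quartz.CGEventTapCreate(
        Quartz.kCGSessionEventTap,          # tap the current login session
        Quartz.kCGHeadInsertEventTap,       # insert ahead of other taps
        Quartz.kCGEventTapOptionDefault,    # active filter: may modify or drop
        Quartz.CGEventMaskBit(Quartz.kCGEventLeftMouseDown),
        tap_callback, None)

    source = Quartz.CFMachPortCreateRunLoopSource(None, tap, 0)
    Quartz.CFRunLoopAddSource(Quartz.CFRunLoopGetCurrent(), source,
                              Quartz.kCFRunLoopCommonModes)
    Quartz.CGEventTapEnable(tap, True)
    Quartz.CFRunLoopRun()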

Is it possible to call a function in a different, but currently executing process?

I have a friend who's working at a company that offers pretty poor support for its developers (scoring a 1/12 on the Joel Test).
Their build process is locked down pretty tight, and depending on the size of the project it could take 40+ (x2) mouse clicks to deploy. So I thought, "Hey, why not automate the clicks using the win32api?" (specifically using Python). I've built him a really nice tool that works just fine except for one issue: the tool they use has a navigation pane that may or may not be open.
You can open and close it with a button press, but I'm not sure how I could make sure it is either open or closed. It's irrelevant to the build process; the only problem is that it changes where the mouse needs to click on the screen depending on whether it is open. The application is written in .NET and exposes a function call that other applications can use to toggle the panel, so I've been looking around for ideas and so far I've got two of them:
Attach to the process via a debugger and execute the function call somehow.
Take a screenshot at the location of the panel's title bar (whose position I've got through the Win32 API and which doesn't appear to change regardless of whether the panel is hidden).
Is there an easier way to figure out the state of this panel? The developers are given an admin account on their machine in addition to their regular account, so I can entertain ideas that require admin access, though I don't think that should be necessary?
UPDATE:
It looks like there's a button that can close the pane. In UIAVerify something shows up as "text" "Navigation" "btnClose". It says its AutomationId is btnClose, but it's a ControlType.Text.
What technology is this panel built with? Is it standard GDI or WPF? If it's GDI, it should have an HWND. You should be able to find this HWND through either a class name or a window title. Once you have the HWND, you can get its width.
If it's built with WPF, er, I have no idea, but Snoop does this kind of thing, so I know it's possible.
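Since the clicking is already being driven from Python with the win32 API, one way to act on this answer is a quick pywin32 probe: find the pane's HWND and read its width. This only works if the pane is a classic GDI/Win32 child with its own HWND; WPF content all lives inside a single HWND. The window captions below are invented; Spy++ or UIAVerify will show the real class name or title on the target machine.

    import win32gui

    def find_child_by_title(parent_hwnd, wanted_title):
        # Return the HWND of the first child whose caption matches, else None.
        matches = []
        def on_child(hwnd, _):
            if win32gui.GetWindowText(hwnd) == wanted_title:
                matches.append(hwnd)
            return True
        win32gui.EnumChildWindows(parent_hwnd, on_child, None)
        return matches[0] if matches else None

    main_hwnd = win32gui.FindWindow(None, "Build Tool")      # hypothetical caption
    pane_hwnd = (find_child_by_title(main_hwnd, "Navigation")  # hypothetical pane title
                 if main_hwnd else None)

    pane_is_open = False
    if pane_hwnd and win32gui.IsWindowVisible(pane_hwnd):
        left, top, right, bottom = win32gui.GetWindowRect(pane_hwnd)
        pane_is_open = (right - left) > 0
    print("Navigation pane open:", pane_is_open)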

Detecting GUI state and generating user input programmatically in Windows XP

I am looking for a lightweight solution that would allow me to detect which form/dialog is open in an application, then emit some keystrokes / mouse moves and clicks. I do not have control over (nor the source code for) the application.
I am familiar with MacroMaker, and testing products like SQA / Mercury offer similar functionality. The last time I had hands-on exposure in this area was late 2004, so I welcome any pointers to bring my knowledge up to date.
AutoIt is a scripting environment for Windows with a long history. It's quite easy to use and flexible to do things like detect the open window or dialog, change to another one, type something, etc. I would definitely recommend it.
In case anybody is curious, in the end I decided to use Microsoft UI Automation. Here is a nice intro:
http://msdn.microsoft.com/en-us/magazine/cc163288.aspx
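If you would rather drive UI Automation from Python than from AutoIt or .NET, pywinauto's "uia" backend (pip install pywinauto) wraps the same API. A rough sketch, with invented window and control names:

    from pywinauto import Desktop

    # Is a particular dialog currently open?
    dlg = Desktop(backend="uia").window(title="Print Setup")      # hypothetical title
    if dlg.exists(timeout=1):
        dlg.set_focus()
        dlg.child_window(title="OK", control_type="Button").click_input()
    else:
        # Otherwise drive the main form with keystrokes instead.
        main = Desktop(backend="uia").window(title="Legacy App")  # hypothetical title
        main.set_focus()
        main.type_keys("%fo")   # '%' is Alt in pywinauto, i.e. Alt+F then O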

How to hide an application based on the NetBeans Platform?

I'm not referring to a GUI-less application. I'm trying to have an application based on the NetBeans Platform run as a System Tray application. I was able to do the System Tray part quite easily, but I'm having trouble figuring out how to hide/show the GUI. I'll keep looking in the API meanwhile.
Any ideas?
WindowManager.getDefault().getMainWindow().setVisible(true/false) should work to hide and show the entire GUI, unless it has multiple windows (pure Swing Frame.getFrames() should give you all JFrame-based windows, if that helps).
Not sure if that will solve the problem if you want the main window hidden on startup (but if it is a very simple UI, as is true of many tray apps, you might be able to just work with a dead-simple implementation of WindowManager such as WindowManager.Trivial and leave out the standard NetBeans windowing system entirely).

new Windows 7 systray - how to show information to users now

Windows 7 hides systray icons by default.
What is the recommended way to show information to users now?
I need to have a small clickable icon visible to the user so they can access my "tool" anytime. Should I use a gadget to show my GUI instead? Can it communicate with my Delphi app somehow?
Without more information it's a little difficult to provide a recommendation.
However, I would imagine that the user would simply keep a sufficiently important tool minimized. They could then use Jump Lists to access quick functionality.
For example, Live Messenger uses this setup on Win7.
If your users really like your icon/application, they can always choose not to hide it.
The only difference is that only the user can choose which icon is shown, instead of every application claiming its own "real estate".
In my opinion this is good functionality, and if I were you I wouldn't change the application; just provide a first-run GUI that explains how to make your tray icon visible in Windows 7.
The entire reason the change was made was to stop programs like yours. If you need to show information, go ahead and do so. But the notification area ("systray") is not where shortcuts go. For that, you've got the Start menu, the desktop, and/or the Quick Launch bar (and please let the user decide).