I have been using ADB to simulate touch commands in my application. The plan is to develop a sort of keymapper application that simulates touches in external applications (I don't feel comfortable sharing the app idea in public, despite there already being some similar applications on the Play Store).
Essentially, I am using adb to issue tap commands from my C# code like so:
// Run "input tap x y" through a shell process owned by this app.
Java.Lang.Runtime r = Java.Lang.Runtime.GetRuntime();
Java.Lang.Process p = null;
try
{
    p = r.Exec("input tap 900 1900");
}
catch (System.Exception e)
{
    Console.WriteLine(e.Message);
}
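When the tap fails silently, it can help to read the exit code and stderr of the spawned command. Here is a minimal diagnostic sketch under the same Xamarin.Android setup (the variable names are mine):
try
{
    Java.Lang.Process p = Java.Lang.Runtime.GetRuntime().Exec("input tap 900 1900");
    int exitCode = p.WaitFor(); // blocks until the shell command finishes
    using (var reader = new System.IO.StreamReader(p.ErrorStream))
    {
        // A non-zero exit code with a permission error here usually means the
        // command ran under the app's own uid rather than the adb shell user.
        Console.WriteLine($"input tap exited {exitCode}: {reader.ReadToEnd()}");
    }
}
catch (System.Exception e)
{
    Console.WriteLine(e.Message);
}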
However, it only seems to work inside my own application (e.g. on views in my application or the screen area my app occupies).
It does not work on other apps, or even on the three navigation buttons on the bottom bar.
If I use the terminal provided by Visual Studio (Open Android Adb Command Prompt), then I am able to fire tap commands (adb shell input tap x y) and they work everywhere, including on other apps.
However, when I fire these tap commands from my application's code, they do not work.
Could you tell me of a way to do this?
P.S. I am using a Xamarin.Android foreground service to execute these taps on external apps.
Related
I want to open an Electron JS app when I click or touch the screen.
Another piece of software will be running on the main screen, but on any click/touch my app is supposed to open.
Is there some way to listen for Windows clicks/touches anywhere and open a certain app?
Note: The system is being developed to run on a "Midia Totem" (an interactive media kiosk), which is why I have to do all of this.
A similar question is answered here: https://stackoverflow.com/a/41836178/4778613
"Once the OS is handling touch events, those are piped through to Blink via the Electron wrapper. However, you need to set the touch-events command-line switch to enable it."
For example, I try to print out to the console and it doesn't work. My script is attached to my main camera, so we can rule that out, and yes, the script is active. Any help will be appreciated. Right now I am working on clicking a certain part of my GameObject in AR; while my phone is connected to the computer, I want to see the name of the part I clicked on in the console.
When you run the app on an Android device, the log does not go to the Unity Editor console. In order to see it, you need to open Android Studio and then use the Logcat tab to see the device log.
Alternatively, you can run logcat from the command line:
adb logcat
Unity tags its Android log messages with Unity, so adb logcat -s Unity narrows the output to just those.
More information: https://developer.android.com/studio/command-line/logcat.html
I have worked with ARCore and had the same frustrating experience when it comes to debugging.
If you just want console prints, then use Log Viewer, which can catch and show them on Android too. (https://assetstore.unity.com/packages/tools/log-viewer-12047)
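If an asset is more than you need, the same idea can be sketched by hand with Unity's Application.logMessageReceived callback (the OnScreenLog class name and the 20-line limit are my own choices):
using System.Collections.Generic;
using UnityEngine;

public class OnScreenLog : MonoBehaviour
{
    private readonly List<string> lines = new List<string>();

    private void OnEnable()
    {
        // Receive every Debug.Log/LogWarning/LogError emitted by the app.
        Application.logMessageReceived += HandleLog;
    }

    private void OnDisable()
    {
        Application.logMessageReceived -= HandleLog;
    }

    private void HandleLog(string condition, string stackTrace, LogType type)
    {
        lines.Add(condition);
        if (lines.Count > 20) lines.RemoveAt(0); // keep only the newest lines
    }

    private void OnGUI()
    {
        // Draw the collected lines on the device screen.
        GUI.Label(new Rect(10, 10, Screen.width - 20, Screen.height - 20),
            string.Join("\n", lines));
    }
}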
However, because I wanted more control and the ability to test and debug my game logic right inside the Editor (without deploying to the phone all the time), I wrote a little plugin that allows me to do just that. This plugin simulates the operation of ARCore inside the Unity Editor. You can just hit play and ARCore will be simulated for you, so you can freely develop and debug your game logic. Moreover, you can then build and deploy the project without changing anything, and ARCore will work like normal on your phone.
Using it is very similar to native ARCore, so you will not have much difficulty getting into it. It does not cover ALL features of ARCore yet, but it covers the basics. You can still use native ARCore for the rest.
You can find it here: https://github.com/VR-House/Eazy-ARCore-Interface
In current versions of Unity you can output logs from connected devices to the Unity Editor. To do this, build your project with Development Build enabled and connect the Console to your device.
But I would recommend a more advanced way of testing AR in the Editor, using a plugin I wrote. I wrote it for my project and decided to make it into a plugin so everyone else can benefit from it.
AR Foundation Editor Remote plugin:
https://forum.unity.com/threads/ar-foundation-editor-remote-test-and-debug-your-ar-project-in-the-editor.898433/
I use debugging like this in a C# script:
Debug.Log("Debug message and image name "+Image.Name);
To see this in real time, I use Android Device Monitor (it's in the Android SDK folder, usually \Users\<user>\AppData\Local\Android\sdk\tools\monitor.bat). If a device is connected, it appears in the devices list and can be selected.
Or, if I want to see the debug log on the device, I put these lines in the script:
private void OnGUI()
{
    // Draw Image.Name in a large font so it is readable on the device screen.
    GUIStyle style = new GUIStyle();
    style.fontSize = 50;
    GUI.Box(new Rect(350, 0, 500, 500), Image.Name, style);
}
In new Rect(350, 0, 500, 500), the 350, 0 is the position and the 500, 500 is the width and height of the rectangle.
I have created a UWP app for Windows 10 using the Desktop Bridge. Mostly it works just fine; however, my app needs to re-launch its own executable (with different command-line arguments). The two processes work together.
This works just fine for the non-UWP app, but when run as a UWP app, I can't seem to re-launch my own executable (as derived from the process command line). Should this be possible? Is there a particular way I need to do it in a UWP app?
Currently I get the error: Access is denied.
To launch your app the same way it would be launched when the user taps the app list entry, you can do this:
private async void StartMyApp()
{
    // Launch this package's app list entry, exactly as if the user tapped it.
    var appListEntries = await Windows.ApplicationModel.Package.Current.GetAppListEntriesAsync();
    await appListEntries.First().LaunchAsync();
}
This code assumes your package manifest contains only one application node. In case you have multiple, you need to pick the right one to call LaunchAsync on, as sketched below.
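For the multiple-application case, here is a hedged sketch that picks an entry by its display name (the method and parameter names are mine; FirstOrDefault requires using System.Linq):
private async void StartMyApp(string displayName)
{
    var appListEntries = await Windows.ApplicationModel.Package.Current.GetAppListEntriesAsync();
    // Each entry corresponds to one Application node in the manifest;
    // DisplayInfo.DisplayName carries that entry's display name.
    var entry = appListEntries.FirstOrDefault(e => e.DisplayInfo.DisplayName == displayName);
    if (entry != null)
    {
        await entry.LaunchAsync();
    }
}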
My app can run with a UI or run headless, chosen via a command-line argument. (I don't want to build two separate apps.) When running headless, I'm calling:
[NSApp setActivationPolicy:NSApplicationActivationPolicyProhibited];
during app initialization, and don't show any windows.
This all works fine, and I can run the app headed (via the dock) and headless (via another app passing the command-line argument) at the same time. The little dot shows up next to the dock icon only for the headed app. Good.
The problem happens if the headless app is launched first. In that state (no dot in the dock), clicking on the app icon (or launching via Spotlight or the Finder) does nothing, or sometimes brings up "You can’t open the application “Foo” because it is not responding." Presumably the OS thinks the app is already running headed, and is trying to activate it. Is there a way to convince it to ignore that background app and launch a new instance?
Quitting the background app allows the app to launch normally again from the dock.
I don't want to set LSBackgroundOnly in the plist, because that would require two separate apps unless I changed the setActivationPolicy to NSApplicationActivationPolicyRegular in the case of the headed app. But I've read that doing that causes various other bugs, like the menu bar not always showing up.
Any ideas?
I'm on OS 10.10.5 (the oldest OS I need to support). Thanks for any help!
Update: I just noticed that when clicking on the dock and it appears to do nothing, if there are no app windows open I can see that it actually unhides my app's main window (which was created hidden) but doesn't bring it to the front (presumably it notices that NSApplicationActivationPolicyProhibited is set, though that wasn't enough to prevent it from showing the window!).
Update 2: At this point, I'd settle for a way to notify the user that the reason the app isn't launching is that they need to quit the other app that has sublaunched the headless process. Of course, this code would need to be in the headless app, since the headed one doesn't even launch.
It looks like you can call setActivationPolicy from applicationShouldHandleReopen and set the policy back to NSApplicationActivationPolicyRegular at that time. I made a test app with the following code and it seems to behave as you describe you would like it to above (although I've only run it on 10.11.5):
@NSApplicationMain
class AppDelegate: NSObject, NSApplicationDelegate {
    func applicationDidFinishLaunching(aNotification: NSNotification) {
        // Launched with "-background true" on the command line? Hide from the dock.
        if NSUserDefaults.standardUserDefaults().stringForKey("background") == "true" {
            NSApp.setActivationPolicy(.Prohibited)
            print("launched in background")
        }
    }

    func applicationShouldHandleReopen(sender: NSApplication, hasVisibleWindows flag: Bool) -> Bool {
        // The user relaunched us (dock, Spotlight, Finder): become a regular app again.
        NSApp.setActivationPolicy(.Regular)
        return true
    }
}
Is there a way to close running applications in Swift? For instance, if the application I create needs to close Safari.
Here's a Swift 5 version for closing running applications without using AppleScript (AppleScript is a perfectly good way, but it isn't the only one). Safari is used as the example in this case:
let runningApplications = NSWorkspace.shared.runningApplications
if let safari = runningApplications.first(where: { (application) in
    // Match on both the bundle identifier and the bundle URL, so another
    // app that reuses Safari's identifier is not terminated by mistake.
    return application.bundleIdentifier == "com.apple.Safari"
        && application.bundleURL == URL(fileURLWithPath: NSWorkspace.shared.fullPath(forApplication: "Safari")!)
}) {
    // Option 1: politely ask the app to quit
    safari.terminate()
    // Option 2: send SIGTERM to the process directly
    kill(safari.processIdentifier, SIGTERM)
}
SIGTERM is used instead of SIGKILL, as referenced here.
Of course, make sure you notify the user of this activity, since it may negatively impact the user experience (for example, user-generated content in the targeted application is not saved before terminating).
It is certainly possible via AppleScript directly IF:
your app is not running sandboxed (note that if you plan to distribute it via the App Store, your app will be sandboxed)
OR
your app has the necessary entitlement to use AppleScript: com.apple.security.scripting-targets (Apple needs to approve that, AND you need to know which apps to target; this isn't a blanket permission)
then
https://apple.stackexchange.com/questions/60401/how-do-i-create-an-applescript-that-will-quit-an-application-at-a-specific-time
Can you execute an Applescript script from a Swift Application
if you aren't going for App Store compliance anyway, you might also use NSTask directly
scripts / code snippets:
How to force kill another application in cocoa Mac OS X 10.5
Can you execute an Applescript script from a Swift Application
short & sweet: technically yes, 'politically' maybe :D