Citrix Receiver Resolution 4K on MacOS

I have a Retina screen MacBook Pro with a 15.6" 4K external monitor attached. I've discovered the resolution of the Citrix session is terrible unless I set the (MacOS) display resolution to 3840x2160, in which case the session resolution is perfect.
The trouble is that at this setting, the resolution outside of the session in macOS is tiny - I usually have it set scaled slightly larger. Without the external screen attached the situation is the same: the default native display resolution results in very poor quality in the session.
Is there any solution where both the Citrix session and macOS look good?


According to Citrix they don't have a solution yet.
Prerequisites
To try it out on individual MacBooks without enabling it for the entire organization/store, or if you do not want to whitelist LaunchDarkly on your internal network, you can run the following command in the macOS Terminal:
defaults write com.citrix.receiver.nomas EnableHighDPI -bool YES
Limitations
Supported devices: MacBook, MacBook Pro. Devices not supported: Mac mini, iMac, Mac Pro.
No multi-monitor support. This feature will not work if you have a secondary monitor connected. Even if you don't use the secondary monitor with the Citrix Workspace app, simply connecting it makes the feature unavailable; it works only when you use a single monitor.
Source: https://support.citrix.com/article/CTX270704/citrix-workspace-app-for-macos-and-retina-displays
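If you want to verify the flag took effect, or revert it later, the standard `defaults` commands work on the same preference domain (a sketch; only the domain and key from the command above are taken from the source):

```shell
# Check the current value of the flag (prints 1 if enabled)
defaults read com.citrix.receiver.nomas EnableHighDPI

# Revert to the default behavior by removing the key entirely
defaults delete com.citrix.receiver.nomas EnableHighDPI
```

Restart the Citrix Workspace app after changing the key so it re-reads its preferences.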
I tried it on my M1 Mac. After I disconnected the Thunderbolt dock with all peripherals, the running session changed its resolution to the higher DPI by itself. As soon as I connected an external monitor, I was back to 1080p.


Weird Android Emulator and Mac tap-to-click sensitivity issue

I'm experiencing a really weird and frustrating issue with the Android Emulator on macOS Monterey.
I have "tap to click" enabled on my Macbook Pro (Mid 2015 15"), and it works fine in all other apps. But somehow, when the emulator window is active it seems to miss almost every other tap. If I click hard instead of tapping, it catches every click. The tap sensitivity in the Trackpad settings is set to "light".
So, it seems that the emulator window is somehow less sensitive to tapping than all other apps. I don't even know how this is possible, is there even such a thing as app-specific tap-sensitivity??
What's more, it's not only the emulator window itself that has this issue, but the emulator settings window as well. If I tap the "Enable clipboard sharing" toggle, it misses about 50% of the taps. If I click hard, it catches them 100%. If I try the same in some other app (tested with the "System Preferences" window), it catches 100% of the taps.
I have tested and tested this again to make sure I'm not biasing the results, but there really is a difference, and it's driving me nuts. I think it appeared after updating to Monterey, but not 100% sure of the exact timing correlation.
Any ideas??
My problem was really similar. I am using a Mac with an Apple mouse, and I was able to fix it by disabling the mouse wheel in the Android Emulator's Extended Controls.
Hope that helps.
I've noticed the same issue some time ago. Unfortunately, I didn't find any solutions.
However, there are a couple of good enough workarounds:
Launch the emulator in a tool window. This is the default approach in modern versions of Android Studio; to enable/disable it, check Preferences -> Tools -> Emulator -> Launch in a tool window.
Use alternative emulators. For instance, Genymotion doesn't have this issue.
I ran into a similar issue, and the solution I found was enabling "Tap to click" in System Preferences -> Trackpad.
I am new to the Android Emulator, but am experiencing the same issue in Ubuntu, even though I have tap-to-click disabled in the OS. I hate tap-to-click, so having an ultra-sensitive-to-touch Android screen emulated on my laptop is beyond frustrating.
Looking at the documentation, I came across the SOURCE_CLASS_POINTER input source class, whose description states:
The input source is a pointing device associated with a display. Examples: SOURCE_TOUCHSCREEN, SOURCE_MOUSE. A MotionEvent should be interpreted as absolute coordinates in display units according to the View hierarchy. Pointer down/up indicated when the finger touches the display or when the selection button is pressed/released. Use getMotionRange(int) to query the range of the pointing device. Some devices permit touches outside the display area so the effective range may be somewhat smaller or larger than the actual display size.
In reading that, I've come to believe this may actually be the default behavior, due to touchpad events being interpreted as SOURCE_TOUCHSCREEN events rather than SOURCE_TOUCHPAD or SOURCE_MOUSE.
Unfortunately, I don't have a solution as much as a workaround:
I plugged in a mouse and tested pointer up/down movements over the screen, which this part of the documentation suggests should register as a press. However, with the mouse it only responds to clicks. This suggests the mouse is indeed properly interpreted as a SOURCE_MOUSE pointer and not a SOURCE_TOUCHSCREEN one.
So unless we can find out how to make the AVD properly interpret a touchpad as a touchpad, and not a touchscreen, using a mouse seems like the best solution.
For reference, I'm including this link to the AVD manual: https://developer.android.com/studio/run/emulator
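If you want to see how the emulated device actually classifies its input hardware, you can dump the input device properties over adb (a diagnostic sketch; the exact device names vary by emulator image):

```shell
# List the emulator's input devices and their properties;
# the virtual touch device typically reports a touchscreen class
adb shell getevent -p

# Cross-check against the higher-level input subsystem's view
adb shell dumpsys input
```

If the virtual device shows up as a touchscreen rather than a touchpad, that would support the interpretation above.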
UPDATE: Somehow over the period of about 18 hours and several restarts, my AVD no longer does tap-to-click on its virtual screen. It would be very hard to pinpoint exactly what changed because I've been updating packages frequently since I'm running a pre-alpha release of Ubuntu, but I think it's from using X11 instead of Wayland.
Which got me thinking: you could try changing your display server from Cocoa to X11. Thankfully MacPorts, a macOS package system modeled on the FreeBSD Ports Tree, makes it fairly easy to build such software. It contains build recipes for multi-platform Unix-like software, much like Homebrew but often allowing for more customization.
That tap issue was annoying enough it's probably worth giving a shot.
(from macports website) The X11 windowing environment, for ports that depend on the functionality it provides to run. You have multiple choices for an X11 server: https://www.macports.org/install.php
I would build them in this order:
MacPorts: X11 - If you build it, you'll have a bunch of libraries already
MacPorts: QEMU - select GTK3+ in the configure menu; if there's no option for X11, try passing the linker flags to make after you install X11 (pointing it at your X11 lib dir):
make LDFLAGS="-L/opt/X11/lib -lX11"
Lastly, MacPorts: Android Platform tools
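With MacPorts installed, that build order would look roughly like this (a sketch; the port names `xorg-server`, `qemu`, and `android-platform-tools` and the available variants may differ in your MacPorts tree, so check with `port variants` first):

```shell
# 1. Install an X11 server first, so later ports can link against its libraries
sudo port install xorg-server

# 2. See which build variants your MacPorts tree offers for QEMU,
#    then install it with whatever GUI variant is listed
port variants qemu
sudo port install qemu

# 3. Finally, the Android platform tools (adb, fastboot)
sudo port install android-platform-tools
```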
Related StackOverflow Q/As:
Compiling a C program that uses OpenGl in Mac OS X
Running x11 on Mac OS

Low FPS when naming application explorer.exe

I have a Unity application, let's call it X, and it usually runs at about 60 FPS. But when I rename the executable to explorer.exe (the same as Windows Explorer), my FPS drops to 7-8.
This is a screenshot from the profiler when the app is named X when the FPS is about 60:
This is another screenshot from the profiler with the same app but named explorer.exe when the FPS is about 8:
This is a screenshot of the GPU usage when named X.exe:
The same app named explorer.exe:
I can and will rename the app to something else, but I'd like to know what causes this and how I could figure it out on my own.
Things I've tried:
Disabling my AV (Windows Defender) and restarting, with no effect.
Trying to reproduce it on a colleagues PC, with no success.
This makes me think that it might be specific to my machine, and that maybe some process is trying to make API calls on the real explorer.exe and somehow affects my app.
In case it's relevant
I'm using Unity 2019.3.5f1
It's happening in the built app both debug and release
OS Name: Microsoft Windows 10 Pro
OS Version: 10.0.18362 N/A Build 18362
System Model: Alienware 17 R4
System Type: x64-based PC
Disappointingly, the issue was caused by the fact that Windows was using the integrated graphics card instead of the dedicated one when the app was named explorer.exe.
To change this, I had to:
Go to Settings > System > Display > Graphics Settings.
Browse and pick my app.
Select my app from the list.
Select Options.
Select High performance.
Select Save.
Original source
This made my app use dedicated GPU instead of the integrated one.
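The steps above can also be scripted by writing the registry value that the Settings page manages (a sketch; the key `HKCU\Software\Microsoft\DirectX\UserGpuPreferences` is what recent Windows 10 builds use for per-app GPU preference, and the executable path here is a made-up example):

```shell
REM GpuPreference=2 selects "High performance" (the dedicated GPU);
REM GpuPreference=1 would select "Power saving" (the integrated GPU)
reg add "HKCU\Software\Microsoft\DirectX\UserGpuPreferences" ^
    /v "C:\Builds\X\explorer.exe" /t REG_SZ /d "GpuPreference=2;" /f
```

Restart the app after setting the value; the preference is keyed on the full path to the executable, so it must match exactly.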

Why does macOS handle screen resolution differently from Windows?

Apple's iMac and MacBook lineup use high-resolution displays branded as "Retina Display", and by default macOS sets the screen resolution below the native resolution. For example, on the 13-inch Retina MacBook Pro, which has a native resolution of 2560-by-1600, macOS sets the default resolution to a 1280-by-800 equivalent, with the option to scale to other resolutions (1024-by-640, 1440-by-900, and 1680-by-1050) in the Settings app.
On Windows (including the latest version), however, the screen resolution is by default set to the native resolution, with a "Scaling" function to increase element sizes. For example, on a 15-inch laptop with a Full HD display, Windows sets the default resolution to 1920-by-1080 with scaling at 100%, while recommending 125%. Setting a higher scaling in turn results in certain elements being displayed blurry.
Why is this the case?
An operating system can work without screens. You could, for example, start your computer (running Windows, macOS, or Linux) and run some application app (or app.exe on Windows) from a command line.
Now, imagine you type app (without the ENTER key) in some command window, unplug your screen, and then press ENTER. Your app has still started (and has perhaps detected that no screen is available, but only if it is a GUI application opening a window using some widget toolkit). If your app is not a GUI application but a command-line one (e.g. cp, which copies files), it can execute successfully.
In practice, your screen is today managed by a display server. Your application doesn't draw pixels directly on the screen: it interacts with the display server, which is generally the only process accessing the screen (more precisely, your graphics card).
So you need to learn how to tune or configure your display server. That, of course, is operating system and display server specific. macOS, Windows, and Linux are very different in this respect (and Linux even has several display servers, e.g. Xorg or Wayland). On macOS it is Quartz.

iMac developing Bluetooth 4.0 application with external dongle

I have a Late 2009 iMac which doesn't support Bluetooth 4.0, and a USB Bluetooth 4.0 dongle (CSR).
I need to write an OSX application which transfers data with a Bluetooth 4.0 device.
If I plug in the dongle, the CBCentralManager returns the state CBCentralManagerStateUnsupported, since OS X has loaded the driver for the internal Bluetooth device.
If I issue the following command:
sudo nvram bluetoothHostControllerSwitchBehavior="always"
OS X loads the driver for the dongle and the CBCentralManager recognizes the device, but unfortunately both the wireless keyboard and Magic Mouse stop working, presumably because they use an older version of Bluetooth.
Is there any way to have both internal Bluetooth and external dongle working together ?
Aside from development, which I can manage with a USB keyboard and mouse, this issue prevents older iMacs from using my app.
Thank you in advance.
...unfortunately both wireless keyboard and magic mouse stop working. Presumably because they use a previous version of Bluetooth.
This is not true. The iMac takes some time to recognize that it has no keyboard and mouse configured. After waiting a while, the iMac asks to pair the keyboard, and everything works correctly with the external dongle.
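For anyone who tries the nvram switch and wants to go back to the internal controller, the variable can be deleted again with the same mechanism as the command in the question (takes effect after a reboot):

```shell
# Remove the override so macOS uses the internal Bluetooth controller again
sudo nvram -d bluetoothHostControllerSwitchBehavior
```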

Flash Player interface problem

I am facing an interface freeze issue with Flash Projector running a Flex state-based application. The Flash Projector exe was generated from standalone Flash Player ver 10.2. The target machine on which the problem occurs has 10.3.
Basically, "screen freeze" means that the user interface keeps running in Flash Player, but it does not respond to any user input (like button presses). Yet if we alt-tab to another application, the state changes in the Flash Player. There is a display with buttons on the screen, but touching the buttons or doing anything else gets no response. Rebooting the computer fixes the problem.
Can you suggest why this is happening? Is there any known bug in Flash Player?
The problem is that this is hard to reproduce on the developer workstation, as it doesn't always happen. But it happens quite often on the target machine, which runs an Intel Atom N270. What debugging steps can you suggest?
Problem : http://www.youtube.com/watch?v=z25oV9QWRyk
Have you tried publishing the projector in 10.1, or in a version of Flash newer than 10.2? If you are able to publish it as a SWF first, you can use the stand-alone projector exe (downloadable here) to load it and create a projector from it.
According to this Adobe bug issue (registration required), version 10.1 was supposed to have resolved this, but it sounds like it may have reappeared in 10.2.
