QtWebEngine (tried 5.9.2, 5.9.3, 5.9.10) does not obey the macOS scrollbar setting to auto-show scrollbars: the scroll indicator is always visible. We're seeing it in a C++ Qt app, but it happens elsewhere too, such as in the Qt sample quicknanobrowser. Non-Qt Chromium-based apps such as Electron (and Chrome itself) behave correctly.
Anybody found a way to work around this with Qt-supplied binaries?
Quicknanobrowser example:
And in our product, where it's particularly ugly:
I opened a Qt issue: https://bugreports.qt.io/browse/QTBUG-65745
More detail on expected behavior: with the macOS system preference "Show scroll bars" set to "Automatically based on mouse or trackpad", a scrollable control should hide its scrollbar until you begin scrolling with a trackpad or a mouse with scrolling capability. It then shows a scroll indicator, and if you mouse over that it turns into a scrollbar.
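Edit: the closest thing to a workaround we've tried is purely cosmetic, and I'm flagging it as an assumption on our part rather than anything confirmed in the bug report: use QWebEngineScript to inject CSS that restyles (here, hides) the WebKit scrollbars, so the permanently visible native ones at least disappear. A minimal sketch, with all names ours:

```cpp
#include <QWebEngineProfile>
#include <QWebEngineScript>
#include <QWebEngineScriptCollection>

// Hypothetical mitigation: hide WebKit scrollbars via injected CSS.
// This does NOT restore macOS auto-hiding overlay behavior; it simply
// removes the scrollbars, so use it only where that is acceptable.
static void installScrollbarCss(QWebEngineProfile *profile)
{
    const QString css = QStringLiteral(
        "::-webkit-scrollbar { width: 0px; height: 0px; }");
    const QString js = QStringLiteral(
        "var s = document.createElement('style');"
        "s.textContent = '%1';"
        "document.head.appendChild(s);").arg(css);

    QWebEngineScript script;
    script.setName(QStringLiteral("hideScrollbars"));
    script.setInjectionPoint(QWebEngineScript::DocumentReady);
    script.setWorldId(QWebEngineScript::ApplicationWorld);
    script.setSourceCode(js);
    profile->scripts()->insert(script);
}
```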
Related
I use a SkiaSharp canvas to draw the main game screen, and then there are various Xamarin.Forms Buttons around the UI. This all works fine when used directly on an iPhone or iPad with a finger. However, when I connect a mouse (e.g., through a MacBook or otherwise), after mouse-clicking on the SkiaSharp canvas the buttons only respond to mouse clicks about 10% of the time (the other 90% of the time the mouse click events are never received). The SkiaSharp canvas itself works just fine.
If I bring up the iOS app launch menu from the bottom (which probably somehow temporarily exits mouse navigation in the app), the buttons start working again with the mouse. But if I click the SkiaSharp canvas again with the mouse, the buttons have a high chance of becoming disabled again. If I switch to using a finger, everything works fine (even if mouse clicks were not being registered immediately before). However, mouse clicks are still not registered even after touching with a finger, so finger-touching does not reset the issue with the mouse (but bringing up the menu from the bottom does).
We found this bug by testing the iOS game on a MacBook Pro (iOS apps recently became available on the App Store), but the same issue also occurs directly with an iPad/mouse combination. It seems to be some sort of interaction issue between mouse input (on iPad or MacBook Pro), the SkiaSharp canvas, and Xamarin.Forms buttons.
Does anyone know what the root cause of the problem is, and is there a workaround?
Not an answer as such, but some more information about reproducing the issue: a simpler repro case may be this small project: https://github.com/jrc14/TraceMatching/.
Don't worry too much about what it's doing, but note that you're meant to click in the grey Skia canvas in the middle to create 'targets' - and that after you've done that, mouse clicks are getting lost.
If you run it on a Mac, you'll see that, though the clicks get lost after you've clicked on the Skia canvas, they will start being received again if you click on something else (another app, or the Mac background).
(Further edit) After some noodling around I did find a workaround: once you've finished processing the touch action on the SKCanvasView, reset its EnableTouchEvents property (i.e., set it to false, then back to true again). After that, the clicks don't seem to get lost any more.
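In code, the workaround looks something like the sketch below. The page and handler names are made up for illustration; only the EnableTouchEvents toggle is the actual fix:

```csharp
using SkiaSharp.Views.Forms;
using Xamarin.Forms;

public class GamePage : ContentPage
{
    public GamePage()
    {
        var canvas = new SKCanvasView { EnableTouchEvents = true };
        canvas.Touch += OnCanvasTouch;
        Content = canvas;
    }

    void OnCanvasTouch(object sender, SKTouchEventArgs e)
    {
        // ... normal touch handling for the game goes here ...
        e.Handled = true;

        if (e.ActionType == SKTouchAction.Released)
        {
            // Workaround: toggle EnableTouchEvents off and back on so that
            // subsequent mouse clicks on Xamarin.Forms buttons are not lost.
            var view = (SKCanvasView)sender;
            view.EnableTouchEvents = false;
            view.EnableTouchEvents = true;
        }
    }
}
```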
On the iPad, if a QML TextInput or TextEdit gains focus, the soft keyboard appears and the app content slides up as necessary so that both the text input field and the soft keyboard are visible.
On a Windows Surface Tablet, in tablet mode, Qt supports automatic showing of the soft keyboard, but the app content does not automatically slide up and the input field can be hidden behind the keyboard.
Is there a way to make the app content slide up automatically as it does on the iPad? Or is there a way to detect the presence and dimensions of the soft keyboard so that I can handle the slide up manually in code?
NB: the automatic showing of the soft keyboard on Windows in tablet mode was reported broken in this bug report (reported as early as Qt 5.3.2) but has been fixed as of Qt 5.11.2. Note also: these bugs affected the Qt Widgets class QLineEdit as well, and my issue may be related to that, but I have not tested it with Widgets.
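For the manual approach, here is a sketch of what I have been considering: watch Qt.inputMethod.keyboardRectangle and scroll a Flickable by the keyboard height. This assumes keyboardRectangle is actually populated on Windows, which I have not verified (it may be an empty rect on some platforms); all ids below are illustrative.

```qml
import QtQuick 2.9

Flickable {
    id: flick
    anchors.fill: parent
    contentHeight: column.height
    clip: true

    // Scroll so that 'item' sits above the soft keyboard.
    // Assumes Qt.inputMethod.keyboardRectangle is reported in the same
    // pixel space as the Flickable.
    function ensureVisible(item) {
        var r = item.mapToItem(flick.contentItem, 0, 0, item.width, item.height)
        var visibleBottom = flick.height - Qt.inputMethod.keyboardRectangle.height
        if (r.y + r.height - flick.contentY > visibleBottom)
            flick.contentY = r.y + r.height - visibleBottom
    }

    Column {
        id: column
        width: parent.width

        TextInput {
            id: input
            width: parent.width
            onActiveFocusChanged: if (activeFocus) flick.ensureVisible(input)
        }
    }

    Connections {
        target: Qt.inputMethod
        onKeyboardRectangleChanged: if (input.activeFocus) flick.ensureVisible(input)
    }
}
```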
This worked up until yesterday.
Updating to macOS Sierra 10.12.5 did not solve the problem; it occurs both before and after the update.
Config / Specs:
MacBook Pro (macOS)
Dell monitor
iPad using Duet app
Previous functionality
I could make any app/program fullscreen on the Mac (the green button, top right) and use a three-finger swipe on the trackpad to toggle between 'screens' (Spaces) on that individual physical display.
Current bug
When I fullscreen an app on one display and swipe with three fingers, the swipe controls all 3 physical displays. The display in question swipes to the app, but the other two displays also swipe to the right, showing blank black or white backgrounds that the mouse can still move over.
Desired functionality -- the Previous Functionality
Swiping on a screen only controls that display.
I had the same issue. There is a setting under Mission Control -> "Displays have separate Spaces"; checking this solved the problem for me.
I have an application built with JSF and PrimeFaces. I am using a layoutPane, and within it are two panels. I have set up CSS to scroll the content sections of the panels; however, the scroll wheel will not work on OS X using Chrome version 51. I can, however, use the arrow keys to scroll the section. The scrolling works as expected in Safari and Firefox, but not in Chrome.
I should note also that I am using a MacBook Pro with Retina display. I also have a second monitor attached, an HP w2207. To make things even more interesting, if I drag the Chrome window to the HP monitor, the scrolling works as expected. Dragging the window back to the laptop's Retina display, the scrolling no longer works.
I have tried various system settings and nothing has worked. I have also tried altering the HTML/CSS, thinking maybe there is some kind of collision between the parent panel and the child panels, but I have not been able to come up with a fix.
Has anyone experienced this issue before or could point me in the right direction?
Upgrade Chrome to version 52.
I'm working on a different stack, but the issue seems to be exactly the same: in some cases scrolling doesn't work, and it happens only when using Chrome 51 on a Retina display. I wasn't able to find the cause, and the only solution I know of is upgrading Chrome.
I'm having some undesired behavior with movable panels in wxPython. I'm using the wxPython Cocoa build 2.9.2.3 for Python 2.7 on Mac OS X 10.6.7. I'm importing wx.aui and trying to create dockable panels.
I have a panel on which I've created a wx.aui.AuiManager and have added two panels, one on top and one below. For both of them I have disabled the close button. Right now, the panels can be dragged into different dockable positions on the frame, or off of the frame to create a floating window. This window shows up as the Mac-native MiniFrame with a disabled close button. I do not want users to be able to separate the panels from the main frame.
I have passed .Floatable(False) to each pane's PaneInfo, but this won't allow the panels to be moved around at all, even if I also pass .Dockable(True).
Can I have panes in AUI that are dockable and movable, but not floatable?
I don't know if there's a way to do that or not; it may be a limitation of wx.aui. You should ask on the wxPython mailing list. Or you could try the mostly drop-in replacement, wx.lib.agw.aui (http://xoomer.virgilio.it/infinity77/AGW_Docs/aui_module.html#aui). It fixes a bunch of bugs in the default wx.aui and is written in pure Python.
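If you try the AGW version, a minimal sketch of the intended setup might look like this. It's untested against your layout; the pane names and panels are placeholders, and I'm assuming AGW honors Dockable(True) combined with Floatable(False):

```python
import wx
import wx.lib.agw.aui as aui  # pure-Python AUI, mostly drop-in for wx.aui

class MainFrame(wx.Frame):
    def __init__(self):
        wx.Frame.__init__(self, None, title="AUI demo", size=(600, 400))
        self._mgr = aui.AuiManager(self)

        top = wx.Panel(self)
        bottom = wx.Panel(self)

        # Dockable (draggable between dock positions) but never floatable.
        self._mgr.AddPane(top, aui.AuiPaneInfo().Name("top").Top()
                          .CloseButton(False).Floatable(False).Dockable(True))
        self._mgr.AddPane(bottom, aui.AuiPaneInfo().Name("bottom").Center()
                          .CloseButton(False).Floatable(False).Dockable(True))
        self._mgr.Update()

if __name__ == "__main__":
    app = wx.App(False)
    MainFrame().Show()
    app.MainLoop()
```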