Does anyone else have infuriating usability issues when working with MonoDevelop on OS X Snow Leopard? The ones that interrupt my flow the most involve button clicks not responding until I move the window around a little. After that, I can get maybe one or two button presses in before I have to move the window around again.
I've heard in the past that this is a GTK problem that has nothing to do with MonoDevelop itself. Does anyone else experience this, or has anyone found a way to fix it?
Yes, there are quite a few minor Mac-specific issues in MD, mostly due to the GTK toolkit. You will find some listed in Bugzilla, and others on the known issues page. The best place to ask about this stuff is on the monodevelop mailing list or, better, in the bug reports.
FWIW, I don't think you have to actually move the window to reset GTK's tracking; you just have to click on the window's title bar.
I was wondering: would it be possible to create a small piece of software that lets the user minimize a window by scrolling down on it (on the title bar, i.e. the part used to drag the window around)?
Following the same idea, it would be cool to be able to scroll up on a taskbar icon to restore a minimized window. Since the user scrolls instead of clicking, this would also prevent accidentally opening a nearby program when trying to restore a window!
I am a total newbie when it comes to things like these. Could you please tell me:
whether Windows would even let me do that (though I have my doubts)?
how to code something like that (what language, and so on...)?
One way to do this for restoring the windows open in the taskbar would be to create a custom taskbar that mirrors the main one; the second taskbar from Dual Monitor Taskbar works this way. There are also libraries in languages such as C++ that can detect scroll wheel movements.
Although this isn't the most helpful answer, hopefully it gives you some ideas.
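To make the C++ suggestion concrete, here is a minimal, untested sketch of the minimize-on-scroll idea using a Win32 low-level mouse hook. Treating a downward wheel movement over the caption as "minimize" and swallowing the event are my assumptions about the desired behavior, not an established recipe:

```cpp
#include <windows.h>

// Low-level mouse hook: minimize a window when the user scrolls down
// over its title bar.
static LRESULT CALLBACK MouseProc(int code, WPARAM wParam, LPARAM lParam) {
    if (code == HC_ACTION && wParam == WM_MOUSEWHEEL) {
        auto *info = reinterpret_cast<MSLLHOOKSTRUCT *>(lParam);
        short delta = GET_WHEEL_DELTA_WPARAM(info->mouseData);
        HWND hwnd = GetAncestor(WindowFromPoint(info->pt), GA_ROOT);
        // Ask the window which part of itself is under the cursor.
        // (SendMessage can block if the target window is hung.)
        LRESULT hit = SendMessage(hwnd, WM_NCHITTEST, 0,
                                  MAKELPARAM(info->pt.x, info->pt.y));
        if (hit == HTCAPTION && delta < 0) {  // scrolled down on the title bar
            ShowWindow(hwnd, SW_MINIMIZE);
            return 1;  // swallow the wheel event
        }
    }
    return CallNextHookEx(nullptr, code, wParam, lParam);
}

int main() {
    HHOOK hook = SetWindowsHookExW(WH_MOUSE_LL, MouseProc,
                                   GetModuleHandleW(nullptr), 0);
    if (!hook) return 1;
    // A low-level hook needs a message loop on the installing thread.
    MSG msg;
    while (GetMessageW(&msg, nullptr, 0, 0) > 0) {
        TranslateMessage(&msg);
        DispatchMessageW(&msg);
    }
    UnhookWindowsHookEx(hook);
    return 0;
}
```

The restore-by-scrolling-the-taskbar half would follow the same pattern, except the hit test would have to target individual taskbar buttons, which is considerably messier.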
It is definitely possible. I found a piece of software that does exactly that about six years ago and have been using it ever since: Preme for Windows (http://www.premeforwindows.com/).
I can no longer imagine using Windows without it. It also lets you close a window by clicking the scroll wheel and maximize a window by scrolling up.
I hope this helps. I always wonder why Microsoft doesn't have these options built into the OS!
I was thinking it would be cool to write a little OS X app that looks at the app currently in focus and displays all of its available keyboard shortcuts in a small window. That way, if you're working in a new program, you could see at a glance what's available, which would be helpful for learning the commands.
My question is: does OS X provide a way to inspect menu items and look for keyboard shortcuts within the app that is "in focus"?
Anyone's thoughts on how to accomplish this or if it's even worth exploring are much appreciated.
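One avenue that might be worth exploring is the Accessibility API: the AXUIElement C interface can walk another application's menu bar, and menu items expose their shortcut through attributes such as kAXMenuItemCmdCharAttribute. Here is a rough C++ sketch; it assumes the tool has been granted Accessibility permission and, to keep the example short, takes the target app's pid on the command line (a real app would look up the frontmost application instead):

```cpp
#include <ApplicationServices/ApplicationServices.h>
#include <cstdio>
#include <cstdlib>

// Recursively walk an accessibility element, printing each menu item
// that carries a command character (i.e. a keyboard shortcut).
static void walk(AXUIElementRef element, int depth) {
    CFArrayRef children = nullptr;
    if (AXUIElementCopyAttributeValue(element, kAXChildrenAttribute,
                                      (CFTypeRef *)&children) != kAXErrorSuccess)
        return;
    for (CFIndex i = 0; i < CFArrayGetCount(children); ++i) {
        AXUIElementRef child =
            (AXUIElementRef)CFArrayGetValueAtIndex(children, i);
        CFStringRef title = nullptr, cmdChar = nullptr;
        AXUIElementCopyAttributeValue(child, kAXTitleAttribute,
                                      (CFTypeRef *)&title);
        AXUIElementCopyAttributeValue(child, kAXMenuItemCmdCharAttribute,
                                      (CFTypeRef *)&cmdChar);
        char t[256] = "", c[16] = "";
        if (title) CFStringGetCString(title, t, sizeof t, kCFStringEncodingUTF8);
        if (cmdChar) CFStringGetCString(cmdChar, c, sizeof c, kCFStringEncodingUTF8);
        if (c[0]) printf("%*s%s (Cmd+%s)\n", depth * 2, "", t, c);
        walk(child, depth + 1);
        if (title) CFRelease(title);
        if (cmdChar) CFRelease(cmdChar);
    }
    CFRelease(children);
}

// Build: clang++ shortcuts.cpp -framework ApplicationServices
int main(int argc, char **argv) {
    if (argc < 2) { fprintf(stderr, "usage: %s <pid>\n", argv[0]); return 1; }
    AXUIElementRef app = AXUIElementCreateApplication(atoi(argv[1]));
    AXUIElementRef menuBar = nullptr;
    if (AXUIElementCopyAttributeValue(app, kAXMenuBarAttribute,
                                      (CFTypeRef *)&menuBar) == kAXErrorSuccess) {
        walk(menuBar, 0);
        CFRelease(menuBar);
    }
    CFRelease(app);
    return 0;
}
```

Note that this only surfaces shortcuts bound to menu items; shortcuts an app handles internally without a menu equivalent won't show up this way. It also doesn't decode the modifier flags (kAXMenuItemCmdModifiersAttribute), which you'd want for anything beyond plain Cmd shortcuts.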
I have a cross-platform Qt application that's running into some trouble on OS X. There's a feature of OS X that I didn't even know existed: the 'Help' key. My MBP doesn't have one, and neither does the Apple wired keyboard I purchased a year ago. It seems to be mostly something that older Macs have. Apparently it generates the same scan code as the Insert key on PC keyboards.
Anyway, when the Help key is pressed, the cursor over our application (or any application that receives the Help key event) turns into a little question mark. This seems to be part of what's called 'context-sensitive help mode', as documented in NSHelpManager's setContextHelpModeActive: method and NSApplication's activateContextHelpMode: method. From the docs:
In this mode, the cursor becomes a question mark, and help appears for any user interface item the user clicks. Most applications don’t use this method. Instead, applications enter context-sensitive mode when the user presses the Help key. Applications exit context-sensitive help mode upon the first event after a help window is displayed.
How many Cocoa developers actually know about this? I assume that clicking on something in the application with this question-mark cursor is supposed to bring up a help message, but I haven't found a single Cocoa application where it actually does anything at all; not even Apple's own apps respond. In fact, it even seems to put a lot of applications into a strange mode where text selection with the cursor is enabled.
The problem is that when we programmatically change the application cursor in Qt while in this help-question-cursor mode, bad things happen. Specifically, our application crashes, deep inside Cocoa in NSApplication's NSHelpManager. I'd like to find out why we're seeing this crash, but I'm actually more interested in how we can suppress this 'help' mode entirely. There's nothing in Qt or Cocoa that I can see that would stop it, other than perhaps intercepting and squashing an event, which I haven't tried yet.
Does anyone know any more about this?
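On the "intercepting and squashing an event" idea, here is what that might look like on the Qt side: an application-wide event filter that consumes Help-key presses before they reach any widget. Whether Qt actually sees the key before Cocoa flips into context-help mode is an open question, so treat this as an experiment, not a confirmed fix:

```cpp
#include <QApplication>
#include <QKeyEvent>

// Swallow all Help-key events application-wide.
class HelpKeyFilter : public QObject {
protected:
    bool eventFilter(QObject *watched, QEvent *event) override {
        if (event->type() == QEvent::KeyPress ||
            event->type() == QEvent::KeyRelease) {
            auto *key = static_cast<QKeyEvent *>(event);
            if (key->key() == Qt::Key_Help)
                return true;  // consume the event entirely
        }
        return QObject::eventFilter(watched, event);
    }
};

int main(int argc, char **argv) {
    QApplication app(argc, argv);
    HelpKeyFilter filter;
    app.installEventFilter(&filter);
    // ... create and show the main window here ...
    return app.exec();
}
```

If Cocoa enters the mode before Qt gets a look at the event, the fallback would be a native event filter (QAbstractNativeEventFilter) or handling it in an NSApplication subclass, which is more invasive.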
I'm a total rookie when it comes to Objective-C, so please bear with me...
I've been thinking of learning the basics and trying to create some software of my own. One thing that bothers me (and never seems to show up as an option in any update) is the ability to require a double-click to start an app in the Dock. I always seem to manage to click in the wrong place when switching between apps...
Yes, I am very well aware of Cmd+Tab, thank you :) I really want this feature, and it shouldn't be too hard to set up as long as overriding the Dock's default behavior is possible. Thoughts/suggestions? Perhaps just a Terminal command is enough...
My manager thinks he's seen other people "lock" the Windows on-screen keyboard to the bottom of their applications, effectively docking it with their window, and wants me to reproduce this. They're using VB6 and occasionally VB.NET.
I've done a good amount of googling on the subject and am currently resorting to digging through the Windows SDK, but if someone out there can save me a few days of pain by either confirming that it's not possible, or pointing me in the right direction if it is, I'd appreciate it.
I find that the keyboard locks if I open it from the taskbar icon. It will then stay at the bottom of the screen, even when I'm not in a text field.
No idea how to achieve that programmatically, though.
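Building on that observation, one programmatic angle, sketched below in plain Win32 C++ since the poster is already digging through the Windows SDK, is to find the keyboard's window and pin it beneath your own. The "OSKMainClass" class name is an assumption observed on some Windows versions rather than a documented contract, and osk.exe runs with UIAccess elevation, so repositioning it from a normal process may silently fail:

```cpp
#include <windows.h>

// Dock osk.exe's window to the bottom edge of our own top-level
// window. Verify the class name with Spy++ on the target OS first.
void DockOskBelow(HWND ourWindow) {
    HWND osk = FindWindowW(L"OSKMainClass", nullptr);
    if (!osk) return;  // osk.exe isn't running

    RECT host{}, kb{};
    GetWindowRect(ourWindow, &host);
    GetWindowRect(osk, &kb);

    // Keep the keyboard's height, match our window's width, and glue
    // it to our bottom edge.
    SetWindowPos(osk, HWND_TOPMOST,
                 host.left, host.bottom,
                 host.right - host.left, kb.bottom - kb.top,
                 SWP_NOACTIVATE | SWP_SHOWWINDOW);
}
```

Calling something like this from the form's move/resize handlers (via Declare statements in VB6 or P/Invoke in VB.NET) would keep the keyboard glued to the window as it moves.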