I am trying to create a text field on Mac OS X that allows a user to select a number and drag horizontally to adjust that number up and down.
I know this can be done because Apple has implemented it in the inspector panel of the SpriteKit emitter editor: see the image.
I have tried subclassing NSTextField to capture mouse-drag events and doing the math, but they don't seem to get passed through. The mouse-down event does, though...
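The subclass attempt looks roughly like this (a minimal sketch; the DraggableNumberField name and the delta math are illustrative, not my exact code):

```swift
import AppKit

// Sketch of the subclass attempt: track the drag delta and nudge the value.
class DraggableNumberField: NSTextField {

    private var lastDragX: CGFloat = 0

    override func mouseDown(with event: NSEvent) {
        lastDragX = event.locationInWindow.x
        super.mouseDown(with: event)   // this event is delivered as expected
    }

    // In practice this never fires once the field editor takes over the click.
    override func mouseDragged(with event: NSEvent) {
        let deltaX = event.locationInWindow.x - lastDragX
        lastDragX = event.locationInWindow.x
        doubleValue += Double(deltaX) * 0.1   // arbitrary drag-to-value scale
    }
}
```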
I have also tried placing a dummy view over the top and catching events as they come through. This works for mouse down, but again, the mouse dragged is never called. I know that the view can receive the mouse dragged event though, because if I place that view somewhere that a text field isn't under it, everything works fine.
What are my options here? I feel like I have tried everything, and the only option left is to create a new control from scratch. I don't mind that too much, but I also want the user to be able to add equations in the text field and drag individual values, which means the full functionality of NSTextField would need to be rebuilt just to add this dragging feature... Is there anything else?
I'm using FabricJS for a project, and my goal is to save the new coordinates when the user drags an object. My issue is that when I select an object and move it, the selection box moves but not the object (see the image).
I have some code which deals with object selection and mouse click and move events, so I tried removing it, but the issue persists.
Edit: it looks like it's only a graphical issue, because the object:moved event is triggered correctly.
Edit 2: after some further development, this bug has changed: now the object and the selection box move together, but I have to click the object twice to make the selection box appear (even though the selection events are triggered on the first click).
I have an NSToolbar with NSToolbarItem instances. One of the toolbar buttons is in one of two modes, depending on whether it is currently operating (has been clicked) or not. I am handling this in code by changing the button's icon to one with a background rectangle when the command it represents is operating, but I can't help thinking there must be another way.
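What I'm doing now is essentially just swapping the item's image in code, something like this (a sketch; the isOperating flag and the image names are placeholders):

```swift
import AppKit

// Sketch of the current approach: swap the toolbar item's icon based on state.
// "isOperating", "CommandActive" and "CommandIdle" are illustrative names.
func updateToolbarItem(_ item: NSToolbarItem, isOperating: Bool) {
    item.image = NSImage(named: isOperating ? "CommandActive" : "CommandIdle")
}
```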
I've tried using the Selectable checkbox in Xcode Interface Builder's Attributes inspector, and it sort of gives the result I want, except that when the item is selected I can't click any of the other toolbar items. I also can't see how to deselect it.
I'm a bit of a Cocoa noob, so I expect the two-state toggle thing is just waiting for me to find it; so far, though, I haven't been able to.
This seems like a common thing to want to do, so the question is: how?
I am a rookie Cocoa guy. I need to design and implement a view that shows a collection of labels on macOS using Xamarin. These labels have text and a color associated with them. When shown inside the view, each label should expand until it covers its whole text, and it should be drawn with the given background and foreground colors.
I have attached a picture of this user control on Windows; you can see that the labels inside the StackPanel expand until they cover their whole text. I hope this gives a better idea of what I'm asking for.
The $64,000 question is "are these labels controls?" In other words, do you expect the user to click on these to do something, or are they just for display?
If your answer is "just for display", the solution is super simple: Use an NSTextField and programmatically add attributed text (NSAttributedString) to it. Attributed text attaches display properties to runs of text within the field; properties like "background color".
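A minimal sketch of that approach (the label texts and colors here are just placeholders, and the labelWithAttributedString convenience initializer requires macOS 10.12+):

```swift
import AppKit

// Build one attributed string containing every "label", each run carrying its
// own background and foreground color, then hand it to an NSTextField.
let labels: [(text: String, color: NSColor)] = [
    ("Bug", .systemRed),
    ("Feature", .systemGreen),
    ("Docs", .systemBlue)
]

let result = NSMutableAttributedString()
for label in labels {
    let run = NSAttributedString(
        string: " \(label.text) ",
        attributes: [
            .backgroundColor: label.color,
            .foregroundColor: NSColor.white
        ]
    )
    result.append(run)
    result.append(NSAttributedString(string: " "))   // spacing between labels
}

let field = NSTextField(labelWithAttributedString: result)
// field can now be added to the view hierarchy like any other label.
```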
If you want these to be buttons that you can click on, then things get a lot more complicated.
Since you apparently want the button layout to "flow", you might look into embedding buttons (well, button cells) into an NSTextField using attachments. This is normally how non-text content (say, an image) gets inserted, but with some fiddling it can actually be anything a control cell can draw. See How to insert a NSButton into a NSTextView? (inline).
Warning: this is not a "rookie" topic and will involve control cells and custom event handling.
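If you do go that route, the rough shape of it is a custom NSTextAttachmentCell that draws the tag-like content, wrapped in an NSTextAttachment. The sketch below only covers drawing (the TagAttachmentCell name and its properties are illustrative); click handling is extra work on top, as warned above:

```swift
import AppKit

// Sketch: a custom attachment cell that draws a rounded "tag" inline with text.
class TagAttachmentCell: NSTextAttachmentCell {
    var text = ""
    var color = NSColor.controlAccentColor

    override func cellSize() -> NSSize {
        let size = (text as NSString).size(withAttributes: [.font: NSFont.systemFont(ofSize: 12)])
        return NSSize(width: size.width + 12, height: size.height + 4)
    }

    override func draw(withFrame cellFrame: NSRect, in controlView: NSView?) {
        color.setFill()
        NSBezierPath(roundedRect: cellFrame, xRadius: 4, yRadius: 4).fill()
        (text as NSString).draw(
            in: cellFrame.insetBy(dx: 6, dy: 2),
            withAttributes: [.font: NSFont.systemFont(ofSize: 12),
                             .foregroundColor: NSColor.white]
        )
    }
}

// Wrap the cell in an attachment and splice it into an attributed string.
let cell = TagAttachmentCell()
cell.text = "Feature"
cell.color = .systemGreen

let attachment = NSTextAttachment()
attachment.attachmentCell = cell
let tagString = NSAttributedString(attachment: attachment)
```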
If I were doing this, I'd probably just create NSButton objects for each label (choosing an appropriate style/look like NSRecessedBezelStyle), create a custom subclass of NSView to contain them, and then override the layout method to position all of the buttons the way I want.
To be thorough, I'd also override the intrinsic size methods so the whole thing could participate in auto-layout, based on the number and size of buttons it contained.
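A rough sketch of that idea (the LabelFlowView name, the spacing value, and the wrapping logic are my own illustration, not a drop-in implementation):

```swift
import AppKit

// Container view that lays its label-buttons out in a flowing row,
// wrapping to the next line when it runs out of width.
class LabelFlowView: NSView {
    private var buttons: [NSButton] = []
    private let spacing: CGFloat = 6

    func setLabels(_ titles: [String]) {
        buttons.forEach { $0.removeFromSuperview() }
        buttons = titles.map { title in
            let button = NSButton(title: title, target: nil, action: nil)
            button.bezelStyle = .recessed   // the NSRecessedBezelStyle look
            button.sizeToFit()
            addSubview(button)
            return button
        }
        needsLayout = true
        invalidateIntrinsicContentSize()
    }

    override var isFlipped: Bool { true }   // lay out top-to-bottom

    override func layout() {
        super.layout()
        var origin = NSPoint(x: 0, y: 0)
        var rowHeight: CGFloat = 0
        for button in buttons {
            let size = button.fittingSize
            if origin.x + size.width > bounds.width, origin.x > 0 {
                origin.x = 0
                origin.y += rowHeight + spacing   // wrap to the next row
                rowHeight = 0
            }
            button.frame = NSRect(origin: origin, size: size)
            origin.x += size.width + spacing
            rowHeight = max(rowHeight, size.height)
        }
    }

    override var intrinsicContentSize: NSSize {
        // Report a height based on the current flow so auto layout can size us.
        layoutSubtreeIfNeeded()
        let maxY = buttons.map { $0.frame.maxY }.max() ?? 0
        return NSSize(width: NSView.noIntrinsicMetric, height: maxY)
    }
}
```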
I'm using NSTrackingArea to define two areas in an NSView subclass. Then I'm using mouseEntered/mouseExited to change the cursor to a custom one.
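The setup is roughly the following (a simplified sketch; the SplitHandleView name, the two rects, and the stand-in system cursor are placeholders for my actual code):

```swift
import AppKit

// Simplified sketch of the tracking-area setup described above.
class SplitHandleView: NSView {

    override func updateTrackingAreas() {
        super.updateTrackingAreas()
        for area in trackingAreas { removeTrackingArea(area) }

        let options: NSTrackingArea.Options = [.mouseEnteredAndExited, .cursorUpdate, .activeInKeyWindow]
        let topRect = NSRect(x: 0, y: bounds.midY, width: bounds.width, height: bounds.height / 2)
        let bottomRect = NSRect(x: 0, y: 0, width: bounds.width, height: bounds.height / 2)

        addTrackingArea(NSTrackingArea(rect: topRect, options: options, owner: self, userInfo: ["area": "top"]))
        addTrackingArea(NSTrackingArea(rect: bottomRect, options: options, owner: self, userInfo: ["area": "bottom"]))
    }

    override func mouseEntered(with event: NSEvent) {
        NSCursor.resizeUpDown.set()   // stand-in for the custom cursor
    }

    override func mouseExited(with event: NSEvent) {
        NSCursor.arrow.set()
    }
}
```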
So everything works fine when the mouse enters the top tracking area, and the custom cursor gets set as expected. All is still good when I mouse down and drag in the top tracking area. But I have another part of the UI that updates while the mouse is dragged, and it adds subviews to a view elsewhere in the same window.
As soon as the first subview is added elsewhere, my custom cursor disappears and it reverts back to the arrow cursor. I thought I might be able to force the cursor back to the custom one using cursorUpdate for my view but for some reason it never gets called, even when set as an option in the NSTrackingArea.
I'm a bit stumped with this one...
I solved it by overriding cursorUpdate in the window's custom contentView. An empty cursorUpdate method stopped the update from getting passed up the chain and the custom cursor now remains as I've set it.
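Concretely, something along these lines in the content view subclass (a minimal sketch):

```swift
import AppKit

class ContentView: NSView {
    // Intentionally empty: swallowing cursorUpdate here stops AppKit from
    // resetting the cursor when subviews are added elsewhere in the window,
    // so the custom cursor set in mouseEntered stays in place.
    override func cursorUpdate(with event: NSEvent) {
        // do not call super and do not set a cursor
    }
}
```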
What I am trying to accomplish with Cocos2d is to create a horizontal menu which can be swiped from left to right. I have posted an image to show my idea.
The image below has a white bar where I want to show MenuItem objects. I want to be able to swipe in the white region so that the next menu item is centered.
Example http://www.wimhaanstra.com/images/MenuExample.png
The problem I am facing is that I would really like to use the Menu (and MenuItem) functionality of Cocos2d, but it seems the MenuItem object does not accept touches other than plain taps. Also, I want the swipe to be detected not only on the MenuItem but on the whole white bar.
What would be the best approach for this?
Somehow incorporate a UIScrollView? That would be a shame, because I would like to use OpenGL for everything.
Subclass the MenuItem class to create one where ccTouchBegan is handled, and somehow move the whole Menu?
Just leave the whole Menu idea behind, and replace the Menu and MenuItems with sprites which support touching?
I read somewhere that MenuItems shouldn't really be used for this kind of work, but why not?
A Layer can handle touch events, so you can put your MenuItems into one layer, detect the swipe there, and move the MenuItems accordingly.