Cocoa: Getting the current mouse position on the screen

I need to get the mouse position on the screen on a Mac using Xcode. I have some code that supposedly does that, but it always returns x and y as 0:
void queryPointer()
{
NSPoint mouseLoc;
mouseLoc = [NSEvent mouseLocation]; //get current mouse position
NSLog(#"Mouse location:");
NSLog(#"x = %d", mouseLoc.x);
NSLog(#"y = %d", mouseLoc.y);
}
What am I doing wrong? How do you get the current position on the screen?
Also, ultimately that position (saved in an NSPoint) needs to be copied into a CGPoint to be used with another function, so I need to get this either as x,y coordinates or translate it.

The author's original code does not work because s/he is attempting to print floats out as %d. The correct code would be:
NSPoint mouseLoc = [NSEvent mouseLocation]; //get current mouse position
NSLog(#"Mouse location: %f %f", mouseLoc.x, mouseLoc.y);
You don't need to go to Carbon to do this.

CGEventRef ourEvent = CGEventCreate(NULL);
CGPoint point = CGEventGetLocation(ourEvent);
CFRelease(ourEvent);
NSLog(@"Location? x = %f, y = %f", (float)point.x, (float)point.y);

Beware mixing the NS environment with the CG environment. If you get the mouse location with the NS mouseLocation method and then use CGWarpMouseCursorPosition(cgPoint), you will not be sent to the point on the screen you expected. The problem is that CG uses the top left as (0,0) while NS uses the bottom left as (0,0).
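A minimal sketch of the conversion (assuming the point is on the primary display; the flip uses that display's height):
NSPoint nsLoc = [NSEvent mouseLocation];                       // bottom-left origin
CGFloat screenHeight = [[[NSScreen screens] firstObject] frame].size.height;
CGPoint cgLoc = CGPointMake(nsLoc.x, screenHeight - nsLoc.y);  // top-left origin
CGWarpMouseCursorPosition(cgLoc);                              // now lands where expected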

The answer to this question in Swift:
let currentMouseLocation = NSEvent.mouseLocation()
let xPosition = currentMouseLocation.x
let yPosition = currentMouseLocation.y

NSLog(#"%#", NSStringFromPoint(point));
NSLog is true;

Related

Simulator vs Device Contact Point Conversion Differences?

I have been struggling with this for a while now and can't seem to find the issue.
I have an SKScene, which I will refer to as self, and an SKNode called chapterScene that gets added to self. I have a boundary set up that contains a movable character. Here is how I have set this up.
ViewController.m (the controller that presents the SKScene subclass OLevel):
- (void)viewDidLoad {
[super viewDidLoad];
// Configure the view.
SKView *skView = (SKView *)self.view;
skView.showsFPS = YES;
skView.showsNodeCount = YES;
// Create and configure the scene.
scene = [OLevel sceneWithSize:self.view.frame.size];
scene.scaleMode = SKSceneScaleModeAspectFit;
// Present the scene.
[skView presentScene:scene];
// Do things after here pertaining to initial loading of view.
}
Here is my OLevel.m
- (id)initWithSize:(CGSize)size {
if (self = [super initWithSize:size]) {
NSLog(#"Creating scene");
[self setUpScene];
}
return self;
}
- (void)setUpScene {
NSLog(#"Setting up scene");
//self.speed = 0.9f;
#pragma 1 Set up scene
// Set up main chapter scene
self.anchorPoint = CGPointMake(0.5, 0.5); //0,0 to 1,1
chapterScene = [SKNode node];
chapterScene.position = CGPointZero;
chapterScene.name = @"chapterScene";
[self addChild:chapterScene];
// Set up physics boundary
self.physicsWorld.gravity = CGVectorMake(0.0, 0.0);
self.physicsWorld.contactDelegate = self;
.
.
.
}
The main point here is that ultimately I have set up my scene and its child nodes correctly (or so I believed until recently). When I run on the simulator (iPhone 6), I am using the - (void)didBeginContact:(SKPhysicsContact *)contact method to monitor collisions. Whenever contact does begin, I log the following:
CGPoint contactPoint = contact.contactPoint;
NSLog(@"non conv: %f, %f", contactPoint.x, contactPoint.y);
CGPoint sceneContactPoint = [self convertPoint:contactPoint toNode:chapterScene];
NSLog(@"1 conv pt: %f, %f", sceneContactPoint.x, sceneContactPoint.y);
I am also logging the character's position to make sure that this converted point is correct.
When I run this on the simulator, and the moving node character hits the wall, I get this:
2016-02-25 20:02:31.102 testGame[43851:14374676] non converted point: 0.143219, 29.747963
2016-02-25 20:02:31.102 testGame[43851:14374676] 1 conv pt: -140.206223, 615.699341
2016-02-25 20:02:31.102 testGame[43851:14374676] Player hit the wall
2016-02-25 20:02:31.103 testGame[43851:14374676] player pos: -140.206238, 590.749268
which is quite correct.
HOWEVER, for a reason I cannot seem to find, this is the exact same code running on my iPhone 5C...
2016-02-25 20:04:50.062 testGame[2907:1259447] non converted point: 160.337631, 310.808350
2016-02-25 20:04:50.063 testGame[2907:1259447] 1 conv pt: 70.996162, 900.004272
2016-02-25 20:04:50.064 testGame[2907:1259447] Player hit the wall
2016-02-25 20:04:50.065 testGame[2907:1259447] player pos: -89.003845, 593.984009
I really hope this is a simple fix that I am missing. If anyone can help me I would greatly appreciate it. Thanks.
UPDATE
It appears that all that is happening is that when I run it on the simulator, the point is being referenced from the center of the screen (0,0) whereas on the device, the reference point is the true origin, the top left corner being (0,0), with the center being, in the iPhone 5c's case, (160, 284). Still not sure how to correct this...or why it is even happening.
So far this is the only solution I can think of...
if (!TARGET_OS_SIMULATOR) {
    contactPoint = CGPointMake(sceneContactPoint.x - screenBounds.size.width/2.0f,
                               sceneContactPoint.y - screenBounds.size.height/2.0f);
} else {
    contactPoint = CGPointMake(sceneContactPoint.x, sceneContactPoint.y);
}
But this is embarrassing. Either this is a bug with Xcode or Apple, or there is a reason this is happening and a different solution.

ZBar not cropping scan region

I'm cropping the scanning region of Zbar via the following code:
- (void)startScanning
{
NSLog(#"Scanning..");
reader = [AACZBarViewController new];
reader.readerDelegate=self;
reader.supportedOrientationsMask = ZBarOrientationMask(UIInterfaceOrientationPortrait);
reader.showsZBarControls = NO;
CGFloat x,y,w,h;
x =0;
y =0.25;
w=1;
h=0.50;
reader.scanCrop = CGRectMake(x,y,w,h); //Crop scan region
reader.cameraOverlayView = [self myOverlay];
ZBarImageScanner *scanner = reader.scanner;
[scanner setSymbology: ZBAR_I25 config: ZBAR_CFG_ENABLE to: 0];
[self presentViewController:reader animated:YES completion:nil];
}
The problem, however, is that the program still uses the entire screen area to find a barcode, not the middle 50%. I don't think the issue is the reader.scanCrop setting itself, but I can't fathom what the real culprit is.
Edit:
Anyone?
I had a look at the ZBar documentation again and noticed it says the x axis on the camera is vertical, not horizontal. I had set the reader to be portrait only, but apparently this does not affect the camera in any way. I didn't find a way to change this, but I did manage to crop to the scanning region I wanted.
The solution:
If you want a scan region of (x, y, w, h), set the rectangle with the x/y and width/height pairs swapped, i.e. (y, x, h, w). It doesn't seem to crop to the bounding box exactly, but it's close enough for my purposes.
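Applied to the code in the question, the crop would become something like this (a sketch of the swap just described, using the same values):
// scanCrop components are swapped because the camera image's x axis is vertical:
// the desired region (x=0, y=0.25, w=1, h=0.5) becomes (y, x, h, w).
reader.scanCrop = CGRectMake(0.25, 0.0, 0.5, 1.0);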

How to move a Core Plot label to the right?

I use the following method to display the labels for my plot:
-(CPTLayer *)dataLabelForPlot:(CPTPlot *)plot recordIndex:(NSUInteger)index{
...
CPTTextLayer *label = [[CPTTextLayer alloc] initWithText:stringValue style:textStyle];
return label;
}
which should return the label for each index.
I know that it's possible to move the label up or down using:
plot.labelOffset=10;
The question is: how can I move the label a bit to the right?
I tried to use
label.paddingLeft=50.0f;
but it doesn't work.
Adding padding as in your example does work, but maybe not in the way you expect. Scatter and bar plots will center the label above each data point (with a positive offset). The padding makes the whole label wider, so when centered, the text appears off to the side. It's hard to control, especially if the label texts are different lengths.
There is an outstanding issue to address this (issue 266). No guarantees when it will be fixed, but it is something we're looking at.
I ran into the same problem and came up with a different solution.
What I decided to do was to create the label using the CPTAxisLabel method initWithContentLayer:
CPTTextLayer *textLayer = [[CPTTextLayer alloc] initWithText:labelStr style:axisTextStyle];
CGSize textSize = [textLayer sizeThatFits];
// Calculate the padding needed to center the label under the plot record.
textLayer.paddingLeft = barCenterLeftOffset - textSize.width/2.0;
CPTAxisLabel *label = [[CPTAxisLabel alloc] initWithContentLayer:textLayer];
Here barCenterLeftOffset is the offset of the center of the plot record.
I wrote an article about this:
http://finalize.com/2014/09/18/horizontal-label-positioning-in-core-plot-and-other-miscellaneous-topics/
A demo project I created that uses this solution can be found at:
https://github.com/scottcarter/algorithms
You can subclass CPTTextLayer and include an offset.
@interface WPTextLayer : CPTTextLayer
@property (nonatomic) CGPoint offset;
@end
@implementation WPTextLayer
-(void)setPosition:(CGPoint)position
{
    CGPoint p = CGPointMake(position.x + self.offset.x, position.y + self.offset.y);
    [super setPosition:p];
}
@end
Then Use:
WPTextLayer *tLayer = [[WPTextLayer alloc] initWithText:@"blah" style:textStyle];
tLayer.offset = CGPointMake(3, -3);
return tLayer;
There may be consequences of this that I'm not aware of, but it seems to be working so far.

Flipped NSView mouse coordinates

I have a subclass of NSView that re-implements a number of the mouse event functions. For instance in mouseDown to get the point from the NSEvent I use:
NSEvent *theEvent; // <- argument to function
NSPoint p = [theEvent locationInWindow];
p = [self convertPoint:p fromView:nil];
However, the coordinates seem to be flipped: (0, 0) is in the bottom left of the window?
EDIT: I have already overridden the isFlipped method to return TRUE, but it has only affected drawing. Sorry, can't believe I forgot to put that straight away.
What do you mean by flipped? The Mac uses an LLO (lower-left-origin) coordinate system for everything.
EDIT I can't reproduce this with a simple project. I created a single NSView implemented like this:
@implementation FlipView
- (BOOL)isFlipped {
    return YES;
}
- (void)mouseDown:(NSEvent *)theEvent {
    NSPoint p = [theEvent locationInWindow];
    p = [self convertPoint:p fromView:nil];
    NSLog(@"%@", NSStringFromPoint(p));
}
@end
I received the coordinates I would expect. Removing the isFlipped switched the orientation as expected. Do you have a simple project that demonstrates your problem?
I found this so obnoxious - until one day I just sat down and refused to get up until I had something that worked perfectly. Here it is, called via...
-(void) mouseDown:(NSEvent *)click{
NSPoint mD = [NSScreen wtfIsTheMouse:click
relativeToView:self];
}
invokes a Category on NSScreen....
@implementation NSScreen (FlippingOut)
+ (NSPoint)wtfIsTheMouse:(NSEvent *)event
          relativeToView:(NSView *)view {
    NSScreen *now = [NSScreen currentScreenForMouseLocation];
    return [now flipPoint:[now convertToScreenFromLocalPoint:event.locationInWindow relativeToView:view]];
}
- (NSPoint)flipPoint:(NSPoint)aPoint {
    return (NSPoint) { aPoint.x,
                       self.frame.size.height - aPoint.y };
}
- (NSPoint)convertToScreenFromLocalPoint:(NSPoint)point
                          relativeToView:(NSView *)view {
    NSPoint winP, scrnP, flipScrnP;
    if (self) {
        winP = [view convertPoint:point toView:nil];
        scrnP = [[view window] convertBaseToScreen:winP];
        flipScrnP = [self flipPoint:scrnP];
        flipScrnP.y += [self frame].origin.y;
        return flipScrnP;
    }
    return NSZeroPoint;
}
@end
Hope this can prevent just one minor freakout.. somewhere, someday. For the children, damnit. I beg of you.. for the children.
This code worked for me:
NSPoint location = [self convertPoint:theEvent.locationInWindow fromView:nil];
location.y = self.frame.size.height - location.y;
This isn't "flipped", necessarily, that's just how Quartz does coordinates. An excerpt from the documentation on Quartz 2D:
A point in user space is represented by a coordinate pair (x,y), where x represents the location along the horizontal axis (left and right) and y represents the vertical axis (up and down). The origin of the user coordinate space is the point (0,0). The origin is located at the lower-left corner of the page, as shown in Figure 1-4. In the default coordinate system for Quartz, the x-axis increases as it moves from the left toward the right of the page. The y-axis increases in value as it moves from the bottom toward the top of the page.
I'm not sure what your question is, though. Are you looking for a way to get the "flipped" coordinates? If so, you can subclass your NSView, overriding the -(BOOL)isFlipped method to return YES.

Cocoa Move Mouse

I'm writing a Mac OS X application on Snow Leopard. I have a step method which is fired at a regular interval by an NSTimer. In this method I would like to move the mouse to the center of the screen, with no buttons being pressed or released. Here's what I have:
-(void) step: (NSTimer *) timer
{
NSRect bounds = [self bounds];
CGPoint point = CGPointMake(bounds.origin.x + bounds.size.width / 2.0f, bounds.origin.y + bounds.size.height / 2.0f);
CGEventCreateMouseEvent(NULL, kCGEventLeftMouseDragged, point, 0);
}
This doesn't do anything. Can somebody tell me what's wrong?
It sounds like CGWarpMouseCursorPosition is precisely what you're after (it moves the pointer without generating events - see the Quartz Display Services Reference for more info).
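A minimal sketch of what the step: method might look like with that call, warping to the center of the main display (the display-centering math here is an assumption; the question's code centered on the view's bounds instead):
- (void)step:(NSTimer *)timer
{
    // Center of the main display in CG (top-left-origin) global coordinates.
    CGDirectDisplayID display = CGMainDisplayID();
    CGPoint center = CGPointMake(CGDisplayPixelsWide(display) / 2.0,
                                 CGDisplayPixelsHigh(display) / 2.0);
    // Moves the pointer without generating mouse-down/up events.
    CGWarpMouseCursorPosition(center);
}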
