I have an animation of an arm moving from above a character's head to the centre of his body. The distance from the tip of the arm to the midpoint of the body is the radius of a circle, so that distance is exactly the same whether the arm starts to his right, to his left or above him. If there were just 4 attack directions I could simply make 4 different animations, but in my game the arm can extend in infinitely many directions, because the user swipes and the arm extends in exactly that direction.
No matter what direction the arm extends in, the same animation works, just at a different angle (the animation was made with the arm directly above the character's head).
I can calculate the angle at which the arm is extended, but how can I play an animation at this angle, if you know what I mean?
Well, if you set the rotation of the sprite that plays the animation, the entire sequence of frames is tilted, as in this example from one of my games:
CCAnimation *anim = [self getAnimationFor:mapAnimationTypeIdle];
NSString *animName = [self getAnimationNameFor:mapAnimationTypeIdle];
NSString *frameName = [self getFrameNameForAnimationNamed:animName andFrame:1];

CCAction *forever = [CCRepeatForever actionWithAction:
                        [CCAnimate actionWithAnimation:anim]];

_soldierAnim = [CCSprite spriteWithSpriteFrame:
                   [[CCSpriteFrameCache sharedSpriteFrameCache] spriteFrameByName:frameName]];
[self addChild:_soldierAnim z:0 tag:_tagForSoldierAnimation];

_soldierAnim.rotation = 22;
[_soldierAnim runAction:forever];
I just tested this; it tilts the entire animation by 22 degrees.
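To tie that back to the swipe, here is a minimal sketch of turning the swipe direction into that rotation value. It assumes the animation was authored with the arm pointing straight up, reuses the _soldierAnim sprite from the example above, and takes hypothetical swipe points from your own touch handling (in the same y-up node space):

// swipeStart / swipeEnd are hypothetical points from your own touch handling.
- (void)pointArmAlongSwipeFrom:(CGPoint)swipeStart to:(CGPoint)swipeEnd
{
    // Angle of the swipe, counter-clockwise from the +X axis, in radians.
    float swipeAngle = atan2f(swipeEnd.y - swipeStart.y, swipeEnd.x - swipeStart.x);

    // cocos2d's rotation property is clockwise degrees, and the animation was
    // authored with the arm pointing straight up (+Y), so convert accordingly.
    _soldierAnim.rotation = 90.0f - CC_RADIANS_TO_DEGREES(swipeAngle);
}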
As explained below, I believe that I need to convert SKScene coordinates to SKView coordinates. So my question reduces to "How do I do that?"
Specifics:
I have a .sks file from which I manually extract size and position data for an SKSpriteNode. The node is supposed to move, and its movement is constrained only by the surrounding wall, off which it bounces whenever it hits it.
This SKSpriteNode's changing position within the wall is based on its anchorPoint of (0.5, 0.5).
Every time the object moves, I call this, for example:
func drawBall() {
    newPosition = CGPoint(x: ballPosX, y: ballPosY)
    moveTO = SKAction.move(to: newPosition, duration: TimeInterval(0))
    myBall!.run(moveTO)
}
The fact that I do not see physical movement indicates that I may have a coordinate problem.
Specifically, the fact that the position of the SKSpriteNode is based on its anchorPoint of (0.5, 0.5) suggests that I am dealing with SKScene coordinates and need to convert them to SKView coordinates.
In short, how do I do that? Or, if I have some other error, how do I correct it?
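For the conversion itself, SpriteKit provides explicit scene/view conversion methods. A minimal sketch in Objective-C, assuming scene is the presented SKScene and the point values are placeholders:

// scenePoint: a position expressed in the scene's (y-up, bottom-left origin) space.
CGPoint scenePoint = CGPointMake(ballPosX, ballPosY);   // placeholder values

// SKScene -> SKView (the view's origin is top-left, so this also flips Y).
CGPoint viewPoint = [scene convertPointToView:scenePoint];

// SKView -> SKScene, the inverse conversion.
CGPoint backInScene = [scene convertPointFromView:viewPoint];

That said, SKAction's move(to:) takes a position in the node's parent's coordinate space (normally the scene), so a conversion to view coordinates is only needed if the positions you extracted really are view coordinates.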
In macOS programming, we know that:
Quartz uses a coordinate space where the origin (0, 0) is at the top-left of the primary display. Increasing y goes down.
Cocoa uses a coordinate space where the origin (0, 0) is the bottom-left of the primary display and increasing y goes up.
Now I am using a Quartz API, CGImageCreateWithImageInRect, to crop an image; it takes a rectangle as a parameter. The rect's Y origin comes from Cocoa's mouseDown events.
Thus I get crops at inverted locations...
I tried this code to flip the Y coordinate of my cropRect:
// Get the point in the mouseDragged event
NSPoint currentPoint = [self.view convertPoint:[theEvent locationInWindow] fromView:nil];
CGRect nsRect = CGRectMake(currentPoint.x, currentPoint.y,
                           circleSizeW, circleSizeH);

// Now flip the Y, please!
CGFloat flippedY = self.imageView.frame.size.height - NSMaxY(nsRect);
CGRect cropRect = CGRectMake(currentPoint.x, flippedY, circleSizeW, circleSizeH);
But for areas near the top, I get wrong flippedY coordinates.
If I click near the top edge of the view, I get flippedY = 510 to 515.
At the top edge it should be between 0 and 10 :-|
Can someone point me to the correct and reliable way to flip the Y coordinate in these circumstances? Thank you!
Here is a sample project on GitHub highlighting the issue:
https://github.com/kamleshgk/SampleMacOSApp
As Charles mentioned, the Core Graphics API you are using requires coordinates relative to the image (not the screen). The important thing is to convert the event location from window coordinates to the view which most closely corresponds to the image's location and then flip it relative to that same view's bounds (not frame). So:
NSView *relevantView = /* only you know which view */;
NSPoint currentPoint = [relevantView convertPoint:[theEvent locationInWindow] fromView:nil];
// currentPoint is in Cocoa's y-up coordinate system, relative to relevantView, which hopefully corresponds to your image's location
currentPoint.y = NSMaxY(relevantView.bounds) - currentPoint.y;
// currentPoint is now flipped to be in Quartz's y-down coordinate system, still relative to relevantView/your image
The rect you pass to CGImageCreateWithImageInRect should be in coordinates relative to the input image's size, not screen coordinates. Assuming the size of the input image matches the size of the view to which you've converted your point, you should be able to achieve this by subtracting the rect's corner from the image's height, rather than the screen height.
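As an illustration of that last point, a hedged sketch of the crop itself, assuming relevantView is the view backing the image, its bounds match the image's pixel size, imageRef is the CGImage you are cropping, and circleSizeW/circleSizeH are the crop dimensions from the question:

// currentPoint has already been flipped into Quartz's y-down space (see above).
CGRect cropRect = CGRectMake(currentPoint.x, currentPoint.y,
                             circleSizeW, circleSizeH);

// Crop in image coordinates; if the image is larger than the view, the rect
// would also need to be scaled accordingly.
CGImageRef croppedRef = CGImageCreateWithImageInRect(imageRef, cropRect);
NSImage *cropped = [[NSImage alloc] initWithCGImage:croppedRef
                                               size:cropRect.size];
CGImageRelease(croppedRef);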
I'm trying to get the real-life angle of the point of view in an ARKit scene (0-360 degrees). I'm using the Euler angles of the pointOfView SCNNode.
print("\(pointOfView.eulerAngles.y.radiansToDegrees)")
The problem is that when looking north I get 0 as a result, and when looking south I also get 0. When looking NE I get -45 degrees, and when looking SE I also get -45 degrees. It seems the value cannot distinguish between north and south, only between west and east. Any advice?
I basically need to implement a radar view in my ARKit real-world scene, and the expected behavior is North: 0, East: 90, South: 180, West: 270.
Thanks in advance!
I've just been working on a similar situation. What you are after I call the "heading", which isn't as easy to define cleanly as you might think.
Quick background: there are two kinds of rotation. There are the "Euler" angles, which are relative to real-world space but which suffer from what they call gimbal lock at the "pitch" extremes. And then there are the rotation angles relative to the device's axes, held in the transform property of ARCamera.
To illustrate the difference: euler.y always means the way the device is facing (except when it is flat, in which case gimbal lock mucks it up, hence our problem), whereas the transform's y always means rotation around the vertical axis through the phone (which, just to make things extra confusing, is based on the device held in landscape in ARKit).
(Side note: if you are used to CoreMotion, you may have noticed that in ARKit gimbal lock occurs when the device is held flat, whereas in CM it occurs when it is upright.)
So how do we get a "heading" that works whether the device is flat or upright? The solution below (sorry, it's Objective-C!) does the following:
Take two normal vectors, one along the phone's Z axis (straight out of the screen) and one that sticks out of the bottom of the phone, which I call the -Y axis (though it's actually the +X axis when the phone is held landscape).
Rotate these vectors by the device's transform (not the Eulers), project them onto the XZ plane, and get the angles of the projected vectors with respect to the Z axis.
When the phone is upright, the Z normal gives the perfect heading, but when the phone is flat, the Y normal is the one to use. In between we'll "crossfade" based on the phone's "tilt", i.e. euler.x.
One small issue is that when the user tilts the phone slightly down past flat, the heading given by the Z normal flips. We don't really want that (more from a UX perspective than a mathematical one), so let's detect this "downward tilt" and flip the zHeading by 180˚ when it happens.
The end result is a consistent and smooth heading regardless of the device orientation. It even works when the device is moved between portrait and landscape... huzzah!
// Create a quaternion representing the device's current rotation (NOT the same as the Euler angles!)
GLKMatrix3 deviceRotM = GLKMatrix4GetMatrix3(SCNMatrix4ToGLKMatrix4(SCNMatrix4FromMat4(camera.transform)));
GLKQuaternion Q = GLKQuaternionMakeWithMatrix3(deviceRotM);

// We want to use the phone's Z normal (in the phone's reference frame) projected onto XZ to get the
// angle when the phone is upright, BUT the Y normal when it's horizontal. We'll crossfade between the
// two based on the phone tilt (euler x)...
GLKVector3 phoneZNormal = GLKQuaternionRotateVector3(Q, GLKVector3Make(0, 0, 1));
GLKVector3 phoneYNormal = GLKQuaternionRotateVector3(Q, GLKVector3Make(1, 0, 0)); // why (1,0,0)? Rotation = (0,0,0) is when the phone is landscape and upright. We want the vector that will point to +Z when the phone is portrait and flat.

float zHeading = atan2f(phoneZNormal.x, phoneZNormal.z);
float yHeading = atan2f(phoneYNormal.x, phoneYNormal.z);

// Flip the zHeading if the phone is tilting down, i.e. the normal pointing down the device suddenly has a +y component
BOOL isDownTilt = phoneYNormal.y > 0;
if (isDownTilt) {
    zHeading = zHeading + M_PI;
    if (zHeading > M_PI) {
        zHeading -= 2 * M_PI;
    }
}

float a = fabs(camera.eulerAngles.x / M_PI_2);
float heading = a * yHeading + (1 - a) * zHeading;

NSLog(@"euler: %3.1f˚ %3.1f˚ %3.1f˚ zHeading=%3.1f˚ yHeading=%3.1f˚ heading=%3.1f˚ a=%.2f status:%li:%li zNorm=(%3.2f, %3.2f, %3.2f) yNorm=(%3.2f, %3.2f, %3.2f)", GLKMathRadiansToDegrees(camera.eulerAngles.x), GLKMathRadiansToDegrees(camera.eulerAngles.y), GLKMathRadiansToDegrees(camera.eulerAngles.z), GLKMathRadiansToDegrees(zHeading), GLKMathRadiansToDegrees(yHeading), GLKMathRadiansToDegrees(heading), a, camera.trackingState, camera.trackingStateReason, phoneZNormal.x, phoneZNormal.y, phoneZNormal.z, phoneYNormal.x, phoneYNormal.y, phoneYNormal.z);
I am using CGPathAddEllipseInRect to draw a circle and then using that in a CAKeyframeAnimation. My issue is that the animation always starts in the same spot. I thought that I could do the following with a CGAffineTransform to make it start at a different point:
CGAffineTransform temp = CGAffineTransformMakeRotation(M_PI / 2);
CGPathAddEllipseInRect(animationPath, &temp, rect);
I do not know what this is doing. When it runs, I don't even see this portion of the animation. It is doing something offscreen. Any help understanding this would be great.
The rotation happens around the origin (0,0) by default, but you want to rotate around the center of the circle, so you have to do additional transformations:
float midX = CGRectGetMidX(rect);
float midY = CGRectGetMidY(rect);
CGAffineTransform t =
    CGAffineTransformConcat(
        CGAffineTransformConcat(
            CGAffineTransformMakeTranslation(-midX, -midY),
            CGAffineTransformMakeRotation(angle)),
        CGAffineTransformMakeTranslation(midX, midY));
CGPathAddEllipseInRect(animationPath, &t, rect);
Essentially, this chains three transformations: First, the circle is moved to the origin (0,0), then the rotation is applied and afterwards it is moved back to its original position. I've made a little visualization to illustrate the effect:
I chose a square instead of a circle and 45° instead of 90° to make the rotation easier to see, but the principle is the same.
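If the nested CGAffineTransformConcat calls are hard to read, the same transform can also be built step by step; this is just a stylistic alternative with identical math:

// Each call prepends a step, so the steps are applied to the path in the
// reverse order of the calls: translate to the origin, rotate, translate back.
CGAffineTransform t = CGAffineTransformMakeTranslation(midX, midY); // applied last
t = CGAffineTransformRotate(t, angle);                              // applied second
t = CGAffineTransformTranslate(t, -midX, -midY);                    // applied first
CGPathAddEllipseInRect(animationPath, &t, rect);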
The use case: I am subclassing UIView to create a custom view that "mattes" a UIImage with a rounded rectangle (clips the image to a rounded rect). The code is working; I've used a method similar to this question.
However, I want to stroke the clipping path to create a "frame". This works, but the arc strokes look markedly different than the line strokes. I've tried adjusting the stroke widths to greater values (I thought it was pixelation at first), but the anti-aliasing seems to handle arcs and lines differently.
Here's what I see on the simulator:
This is the code that draws it:
CGContextSetRGBStrokeColor(context, 0, 0, 0, STROKE_OPACITY);
CGContextSetLineWidth(context, 2.0f);
CGContextAddPath(context, roundRectPath);
CGContextStrokePath(context);
Anyone know how to make these line up smoothly?
… but the anti-aliasing seems to handle arcs and lines differently.
No, it doesn't.
Your stroke width is consistent—it's 2 pt all the way around.
What's wrong is that you have clipped to a rectangle, and your shape's sides are right on top of the edges of this rectangle, so only the halves of the sides that are inside the rectangle are getting drawn. That's why the edges appear only 1 px wide.
The solution is either not to clip, to grow your clipping rectangle by 2 pt on each axis before clipping to it, or to move your shape's edges inward by 1 pt on each side. (ETA: Or, yeah, do an inner stroke.)
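A hedged sketch of that last option, insetting the stroked path by half the line width so the whole stroke stays inside the clip; it reuses context and STROKE_OPACITY from the snippet above, and the corner radius and self.bounds are placeholders for whatever the matting code already uses:

CGFloat strokeWidth = 2.0f;

// Pull the path in by half the line width so the full stroke lands inside the clip.
CGRect strokeRect = CGRectInset(self.bounds, strokeWidth / 2.0f, strokeWidth / 2.0f);
UIBezierPath *strokePath = [UIBezierPath bezierPathWithRoundedRect:strokeRect
                                                      cornerRadius:8.0f]; // placeholder radius

CGContextSetRGBStrokeColor(context, 0, 0, 0, STROKE_OPACITY);
CGContextSetLineWidth(context, strokeWidth);
CGContextAddPath(context, strokePath.CGPath);
CGContextStrokePath(context);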
Just in case anyone is trying to do the same thing I am (round rect an image):
The UIImageView class has a property, layer, of type CALayer. CALayer already has this functionality built in (it WAS a little surprising to me that I couldn't find it documented anywhere):
UIImageView *thumbnailView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"foo.png"]];
thumbnailView.layer.masksToBounds = YES;
thumbnailView.layer.cornerRadius = 15.0f;
thumbnailView.layer.borderWidth = 2.0f;
[self.view addSubview:thumbnailView];
Also does the trick.