How to enable MKMapView 3D view? - macos

I have an MKMapView in a window, and pitchEnabled is true (and I've confirmed this in the debugger). The "3D" thingy in the middle of the compass is grayed out, and clicking or dragging it does nothing. Option-dragging the map (like I do in Maps.app) doesn't do anything, either.
From my interpretation of the docs, setting pitchEnabled should let me use the 3D view, like Maps.app does. Am I mistaken? Is there something else I need to do to allow my users to get a 3D map view?

You can get close to the experience of 3D mode by adjusting the camera angle from which you view the map and making buildings visible. The example below allows you to view the Eiffel Tower in 3D:
override func viewDidLoad() {
    super.viewDidLoad()
    mapView.mapType = .standard
    mapView.showsBuildings = true // displays buildings
    let eiffelTowerCoordinates = CLLocationCoordinate2DMake(48.85815, 2.29452)
    mapView.region = MKCoordinateRegion(center: eiffelTowerCoordinates, latitudinalMeters: 1000, longitudinalMeters: 100) // sets the visible region of the map

    // create a 3D camera
    let mapCamera = MKMapCamera()
    mapCamera.centerCoordinate = eiffelTowerCoordinates
    mapCamera.pitch = 45
    mapCamera.altitude = 500 // example altitude
    mapCamera.heading = 45

    // set the camera property
    mapView.camera = mapCamera
}
(Example adapted from another Stack Overflow question.)

OS X El Capitan (10.11) added a new map type: 3D flyover mode.
For some reason this option doesn't show up in Xcode's Attributes inspector for the map view; you have to set it programmatically. It makes the map look and behave like the one in the Maps app.
self.mapView.mapType = MKMapTypeSatelliteFlyover;

I was able to do this in Swift on iOS 11:
mapView.mapType = .hybridFlyover
That is giving me the 3D view.

Use this setup method to configure your map view in viewDidLoad.
func setup() {
    objMapView.showsUserLocation = true
    objMapView.delegate = self
    objMapView.showsBuildings = true
    objMapView.mapType = .hybridFlyover
    objLocationManager.startUpdatingLocation()
    if let center = self.objLocationManager.location?.coordinate {
        let currentLocationCoordinates = CLLocationCoordinate2DMake(center.latitude, center.longitude)
        objMapView.region = MKCoordinateRegion(center: currentLocationCoordinates, latitudinalMeters: 1000, longitudinalMeters: 100)

        // create a 3D camera
        let mapCamera = MKMapCamera()
        mapCamera.centerCoordinate = currentLocationCoordinates
        mapCamera.pitch = 45
        mapCamera.altitude = 100
        mapCamera.heading = 45

        // set the camera property
        objMapView.camera = mapCamera
    }
}
In the snippet above:
The .hybridFlyover map type displays a satellite map along with place names where available.
The MKMapCamera instance creates the 3D view of the map; its altitude property determines the height from which the location is viewed.
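As an alternative to setting the camera's properties one by one, MKMapCamera has a convenience initializer that derives the altitude from a viewing distance. A minimal sketch (the coordinates are the Eiffel Tower values from the first answer, and `objMapView` is assumed to be the map view from the snippet above):

```swift
import MapKit

// Build the camera in one call; the altitude is computed from the
// ground distance and pitch.
let target = CLLocationCoordinate2D(latitude: 48.85815, longitude: 2.29452)
let camera = MKMapCamera(lookingAtCenter: target,
                         fromDistance: 500,  // meters from the target point
                         pitch: 45,          // degrees tilted from straight down
                         heading: 45)        // compass heading in degrees
objMapView.camera = camera
```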

Related

macOS: how to resize a window across screens?

I'm trying to programmatically resize macOS windows, similar to Rectangle.
I have the basic resizing code working (for example, moving the window to the right half), and with a single screen it works fine. However, when I try to resize with two screens in a vertical layout, the math does not work:
public func moveRight() {
    guard let frontmostWindowElement = AccessibilityElement.frontmostWindow() else {
        NSSound.beep()
        return
    }
    let screens = screenDetector.detectScreens(using: frontmostWindowElement)
    guard let usableScreens = screens else {
        NSSound.beep()
        print("Unable to obtain usable screens")
        return
    }
    let screenFrame = usableScreens.currentScreen.adjustedVisibleFrame
    print("Visible frame of current screen \(usableScreens.visibleFrameOfCurrentScreen)")
    let halfPosition = CGPoint(x: screenFrame.origin.x + screenFrame.width / 2, y: -screenFrame.origin.y)
    let halfSize = CGSize(width: screenFrame.width / 2, height: screenFrame.height)
    frontmostWindowElement.set(size: halfSize)
    frontmostWindowElement.set(position: halfPosition)
    frontmostWindowElement.set(size: halfSize)
    print("movedWindowRect \(frontmostWindowElement.rectOfElement())")
}
If my window is on the main screen, the resizing works correctly. However, if it is on a screen below (#3 in the diagram below), the Y coordinate ends up on the top monitor (#2 or #1, depending on the X coordinate) instead of the original one.
The output of the code:
Visible frame of current screen (679.0, -800.0, 1280.0, 775.0)
Raw Frame (679.0, -800.0, 1280.0, 800.0)
movedWindowRect (1319.0, 25.0, 640.0, 775.0)
As far as I can see, the problem lies in how screens and windows are positioned.
I'm trying to understand how I should position the window so that it remains on the correct screen (#3), but I've had no luck so far; there doesn't seem to be any method that returns absolute screen dimensions so the window can be placed at the correct origin.
Any idea how this can be solved?
I figured it out; I had completely missed one of the functions used in the AccessibilityElement class:
static func normalizeCoordinatesOf(_ rect: CGRect) -> CGRect {
    var normalizedRect = rect
    let frameOfScreenWithMenuBar = NSScreen.screens[0].frame as CGRect
    normalizedRect.origin.y = frameOfScreenWithMenuBar.height - rect.maxY
    return normalizedRect
}
Basically, since everything is calculated relative to the main screen, the only option is to take that screen's coordinates and offset against them to get the real position of the element.
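The flip itself is just arithmetic against the main screen's height. A self-contained sketch with made-up numbers (the 1080-point screen height and the window frame are hypothetical):

```swift
import Foundation

// Flip a rect between Cocoa's bottom-left-origin space and the
// Accessibility API's top-left-origin space. The pivot is the height
// of the screen that owns the menu bar (NSScreen.screens[0]).
func normalize(_ rect: CGRect, mainScreenHeight: CGFloat) -> CGRect {
    var flipped = rect
    flipped.origin.y = mainScreenHeight - rect.maxY
    return flipped
}

// Cocoa frame (x: 100, y: 200, w: 640, h: 480) on a 1080-point-tall screen:
let cocoaRect = CGRect(x: 100, y: 200, width: 640, height: 480)
let axRect = normalize(cocoaRect, mainScreenHeight: 1080)
print(axRect.origin.y) // 1080 - (200 + 480) = 400.0

// The flip is its own inverse: applying it twice restores the original rect.
print(normalize(axRect, mainScreenHeight: 1080) == cocoaRect) // true
```

Because the flip is an involution, the same helper converts in both directions, which is why a single normalize function is enough for a window manager.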

Animate CALayer - zoom and scroll

I want to create an animation with CALayers.
I have a parent layer with multiple sublayers and I would like to zoom in and scroll.
First, I tried zooming in on the parent layer, as follows:
let transformAnimation = CABasicAnimation(keyPath: "bounds.size.width")
transformAnimation.duration = 2.3
transformAnimation.timingFunction = CAMediaTimingFunction(name: .easeInEaseOut)
transformAnimation.toValue = 650 * 2
transformAnimation.beginTime = CACurrentMediaTime() + 4
transformAnimation.autoreverses = false
transformAnimation.isRemovedOnCompletion = false
transformAnimation.fillMode = .forwards
parentLayer.add(transformAnimation, forKey: "transformAnimation")

let transformAnimation2 = CABasicAnimation(keyPath: "bounds.size.height")
transformAnimation2.duration = 2.3
transformAnimation2.timingFunction = CAMediaTimingFunction(name: .easeInEaseOut)
transformAnimation2.toValue = 650 * 2
transformAnimation2.beginTime = CACurrentMediaTime() + 4
transformAnimation2.autoreverses = false
transformAnimation2.isRemovedOnCompletion = false
transformAnimation2.fillMode = .forwards
parentLayer.add(transformAnimation2, forKey: "transformAnimation2")
When the animation is applied, the sublayers are left with the wrong position and size. Should I also apply the animation to them?
How can I do this?
Thanks!
I'm guessing that you're updating the width and height of one layer and wondering whether the bounds of its sublayers will also change. No, you'll probably have to animate those separately, too. But rather than initiating separate animations, you can group them with a CAAnimationGroup.
If these were views, you could define constraints that coordinate the resizing of subviews more gracefully. But with layers, you're going to have to do this yourself. (There might be reasons why you're doing it the way you are, but they're not clear from the question.)
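For instance, the two bounds animations from the question could be combined like this (a sketch; `parentLayer`, the timing values, and the target size all come from the original snippet):

```swift
import QuartzCore

// Animate width and height together as one grouped animation, so they
// share a single duration, start time, and timing curve.
let widthAnimation = CABasicAnimation(keyPath: "bounds.size.width")
widthAnimation.toValue = 650 * 2

let heightAnimation = CABasicAnimation(keyPath: "bounds.size.height")
heightAnimation.toValue = 650 * 2

let group = CAAnimationGroup()
group.animations = [widthAnimation, heightAnimation]
group.duration = 2.3
group.beginTime = CACurrentMediaTime() + 4
group.timingFunction = CAMediaTimingFunction(name: .easeInEaseOut)
group.fillMode = .forwards
group.isRemovedOnCompletion = false
parentLayer.add(group, forKey: "zoom")
```

Sub-animations inside a group inherit the group's duration when they don't set their own, which keeps the two keyPaths in lockstep.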

Sprite Particle System animation in viewController

I created a macOS single-window application and added a Sprite Particle System file with the Stars template. The visual effect looks like this:
I want to add it to my view controller. Following the guidance of a related answer, I got the result below, which was not what I wanted:
override func viewDidLoad() {
    super.viewDidLoad()
    let scene = SCNScene()
    let particlesNode = SCNNode()
    let particleSystem = SCNParticleSystem(named: "Welcome", inDirectory: "")
    particlesNode.addParticleSystem(particleSystem!)
    scene.rootNode.addChildNode(particlesNode)
    skView.backgroundColor = .black
    skView.scene = scene
}
So, I'm wondering what's wrong and what should I do?
The particle system itself is the standard "star" SceneKit particle system available in Xcode, with no changes.
Well, I made a little progress. If I swivel the camera around 180 degrees, I can see the stars receding, so the particle system is running fine. In the default orientation, though, all I saw was blinking lights. So I think the particles are being generated at a Z position of 0, the same as the camera's.
If I move the system's node away from the camera
particlesNode.position = SCNVector3(0, 0, -20)
I still just see blinking lights, but if I click on the SCNView, the animation works correctly: I see stars coming at me.
I don't understand why I have to click the view to get it to work right. I tried isPlaying = true, but that made no difference.

Physics body as an SKTexture does not rotate

I am making a game where the main character rotates. I want to use an SKTexture to set custom boundaries for the character sprite node; however, when I do this, my character does not rotate. When I use a regular circleOfRadius (my sprite is round, but not perfectly round), it rotates but collisions are not accurate. Here is my code:
var mainSpriteTexture = SKTexture(imageNamed: "Main")
mainSprite = SKSpriteNode(texture: mainSpriteTexture)
mainSprite.setScale(0.2)
mainSprite.position = CGPoint(x: self.frame.width / 2, y: 100)
mainSprite.physicsBody = SKPhysicsBody(texture: mainSpriteTexture, size: mainSprite.size)
mainSprite.physicsBody?.categoryBitMask = PhysicsCatagory.player
mainSprite.physicsBody?.collisionBitMask = PhysicsCatagory.platform | PhysicsCatagory.ground
mainSprite.physicsBody?.contactTestBitMask = PhysicsCatagory.platform | PhysicsCatagory.ground | PhysicsCatagory.score
mainSprite.physicsBody?.affectedByGravity = true
mainSprite.physicsBody?.isDynamic = true
mainSprite.physicsBody?.allowsRotation = true
self.addChild(mainSprite)

override func update(_ currentTime: TimeInterval) {
    /* Called before each frame is rendered */
    updateSpritePosition()
    if gameStarted == true {
        if died == false {
            mainSprite.physicsBody?.velocity = CGVector(dx: direction, dy: (mainSprite.physicsBody?.velocity.dy)!)
        } else if died == true {
            mainSprite.physicsBody?.velocity = CGVector(dx: 0, dy: (mainSprite.physicsBody?.velocity.dy)!)
        }
    }
}
Here is the shape I am using: http://imgur.com/JCeEAbv
I'm not really sure how you rotate or move your sprite, but if I give it a certain default rotation and drop it on a surface, it rotates for me as it should (I used the picture you provided). That way, it will not land perfectly flat on the ground (zRotation 0.0).
Try to use this:
mainSprite.zRotation = 0.1 //note that these are radians
and drop it on a flat surface.
You could probably achieve the same thing by applying a small force or impulse, which affects the sprite's zRotation property. Which one to use depends on how you actually move your node: using physics, or by changing its position property directly (see my comment about mixing physics and manual node movement for important details).
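As a sketch, the impulse-based version might look like this (`mainSprite` is the node from the question, and the impulse magnitude is a made-up value you would tune):

```swift
// A small counterclockwise angular impulse; the physics engine then
// updates zRotation on its own instead of you setting it directly.
mainSprite.physicsBody?.applyAngularImpulse(0.01)
```

This only has an effect while `allowsRotation` is true, which the question's setup code already enables.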

MKMapView is misaligned to its region property

I want to display a certain map region in an MKMapView, but when I put a rectangular overlay on the map with the very same parameters, it is displayed misaligned vertically. It looks good enough close to the equator, but the misalignment increases with latitude and span.
This is for a Mac app, but it should be the same for iOS.
This is my relevant code:
MKCoordinateRegion mapRegion = MKCoordinateRegionMake(CLLocationCoordinate2DMake(latCenter, lonCenter), MKCoordinateSpanMake(mapWidthY, mapWidthX));
self.radarMap.region = mapRegion;
CLLocationCoordinate2D coordinates[4];
coordinates[0] = CLLocationCoordinate2DMake(latCenter+mapWidthY/2, lonCenter+mapWidthX/2);
coordinates[1] = CLLocationCoordinate2DMake(latCenter+mapWidthY/2, lonCenter-mapWidthX/2);
coordinates[2] = CLLocationCoordinate2DMake(latCenter-mapWidthY/2, lonCenter-mapWidthX/2);
coordinates[3] = CLLocationCoordinate2DMake(latCenter-mapWidthY/2, lonCenter+mapWidthX/2);
self.boundaryOverlay = [MKPolygon polygonWithCoordinates:coordinates count:4];
[self.radarMap addOverlay:self.boundaryOverlay];
It shows this (notice the blue rectangular overlay is shifted up, so the upper region is not displayed):
Instead of something like this (I'm aware that it is displayed in aspect fill):
When you set the region property of an MKMapView object, MapKit adjusts the value of the region property so that it matches the actual region that's visible. That means that the actual value of region isn't going to be exactly what you assigned to it. So instead of creating the polygon using the region that you assigned to the map, you should get the updated value of region from the MKMapView object and use that to create the polygon.
MKCoordinateRegion mapRegion = MKCoordinateRegionMake(CLLocationCoordinate2DMake(latCenter, lonCenter), MKCoordinateSpanMake(mapWidthY, mapWidthX));
self.radarMap.region = mapRegion;
CLLocationCoordinate2D coordinates[4];
// Get the actual region that MapKit is using
MKCoordinateRegion actualMapRegion = self.radarMap.region;
CLLocationDegrees actualLatCenter = actualMapRegion.center.latitude;
CLLocationDegrees actualLonCenter = actualMapRegion.center.longitude;
CLLocationDegrees actualLatSpan = actualMapRegion.span.latitudeDelta;
CLLocationDegrees actualLonSpan = actualMapRegion.span.longitudeDelta;
// And use that to create the polygon
coordinates[0] = CLLocationCoordinate2DMake(actualLatCenter + actualLatSpan / 2, actualLonCenter + actualLonSpan / 2);
coordinates[1] = CLLocationCoordinate2DMake(actualLatCenter + actualLatSpan / 2, actualLonCenter - actualLonSpan / 2);
coordinates[2] = CLLocationCoordinate2DMake(actualLatCenter - actualLatSpan / 2, actualLonCenter - actualLonSpan / 2);
coordinates[3] = CLLocationCoordinate2DMake(actualLatCenter - actualLatSpan / 2, actualLonCenter + actualLonSpan / 2);
self.boundaryOverlay = [MKPolygon polygonWithCoordinates:coordinates count:4];
[self.radarMap addOverlay:self.boundaryOverlay];
I was curious about the increasing misalignment you were seeing as you moved north. It occurred to me that you were probably using a fixed ratio for mapWidthX and mapWidthY. MapKit uses a Mercator projection, and one consequence is that spans measured in degrees get stretched in the north-south direction: the farther you are from the equator, the more vertical map space a degree of latitude occupies relative to a degree of longitude.
If you create your region using a ratio that's correct at the equator, it will be incorrect as you move toward the poles. MKMapView takes the region you give it and displays something close to it, but the farther you get from the equator, the bigger the adjustment it needs to make, and the bigger the difference between the region you give it and the actual region it uses.
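If you'd rather know the adjusted region up front, MKMapView can compute it without assigning anything. A sketch in Swift (the center and span values are hypothetical, and `radarMap` is the map view from the answer):

```swift
import MapKit

// regionThatFits(_:) returns the region the map view would actually
// display for a requested region, adjusted for the view's aspect ratio.
let requested = MKCoordinateRegion(
    center: CLLocationCoordinate2D(latitude: 60.0, longitude: 10.0),
    span: MKCoordinateSpan(latitudeDelta: 5.0, longitudeDelta: 5.0))
let fitted = radarMap.regionThatFits(requested)
// Build the overlay polygon from `fitted` so it matches what is shown.
```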
