I have a node in the middle of my screen and currently have this code that moves the node to the left with each tap on the left side of the screen and to the right with each tap on the right side of the screen. I want to change this to a long press instead of a tap. How would I do this? Thanks! This is the code for the tap.
override func touchesBegan(touches: NSSet, withEvent event: UIEvent) {
    var touch: UITouch = touches.anyObject() as UITouch
    var location = touch.locationInNode(self)
    var node = self.nodeAtPoint(location)
    var speedOfTouch: CGFloat = 30
    // Moves the hero to the left or right by touch and plays a sound effect.
    for touch: AnyObject in touches {
        let location = touch.locationInNode(self)
        if location.x < CGRectGetMidX(self.frame) {
            hero.position.x -= speedOfTouch
            AudioPlayer.play()
        } else {
            hero.position.x += speedOfTouch
            AudioPlayer.play()
        }
    }
}
This should be very straightforward. Initialize a UILongPressGestureRecognizer in your view's initializer. You can optionally adjust the minimumPressDuration and allowableMovement properties before adding the gesture recognizer to the view.
override init(frame aRect: CGRect) {
    // ...
    // Create the long press gesture recognizer, and configure it
    // to call a method named viewLongPressed: on self
    let longPress = UILongPressGestureRecognizer(target: self, action: "viewLongPressed:")
    // Optionally configure it - see the documentation for explanations
    // longPress.minimumPressDuration = 1.0
    // longPress.allowableMovement = 15
    // Add the gesture recognizer to this view (self)
    addGestureRecognizer(longPress)
    // ...
}
Then just implement the callback method:
func viewLongPressed(gestureRecognizer: UIGestureRecognizer) {
    // Handle the long press event by examining the passed in
    // gesture recognizer
}
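Since the question's code lives in an SKScene rather than a plain UIView, another option is to attach the recognizer to the SKView presenting the scene in didMoveToView(_:) and convert the gesture's location into scene coordinates. Here is a minimal sketch in the same Swift style as the question; it assumes the hero node, the speedOfTouch value, and the AudioPlayer helper from the original code:
import SpriteKit

class GameScene: SKScene {
    let hero = SKSpriteNode(imageNamed: "hero") // stand-in for the question's hero node
    let speedOfTouch: CGFloat = 30

    override func didMoveToView(view: SKView) {
        addChild(hero)
        // Attach the long press recognizer to the SKView presenting the scene
        let longPress = UILongPressGestureRecognizer(target: self, action: "sceneLongPressed:")
        longPress.minimumPressDuration = 0.5 // tweak as desired
        view.addGestureRecognizer(longPress)
    }

    func sceneLongPressed(recognizer: UILongPressGestureRecognizer) {
        // React once, when the long press is first recognized
        if recognizer.state == .Began {
            // Convert the press location from view coordinates into scene coordinates
            let location = convertPointFromView(recognizer.locationInView(recognizer.view))
            if location.x < CGRectGetMidX(frame) {
                hero.position.x -= speedOfTouch
            } else {
                hero.position.x += speedOfTouch
            }
            AudioPlayer.play() // the question's existing sound helper
        }
    }
}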
Essentially I am trying to incorporate two GameScene buttons that do the following:
1) Tap to fly (this I have working)
2) Tap to shoot a projectile from the Player position (this I do not have working).
My problem is that I currently have the fly func triggered by a touch anywhere on the screen. I have tried the following:
This is in reference to my GameScene: I thought that in order to split this out I would need a node on the screen to reference this function. It does not produce an error in the console, but it does not appear in the GameScene.
// Button to trigger shooting :
let btnTest = SKSpriteNode(imageNamed: "Crater")
btnTest.setScale(0.2)
btnTest.name = "Button"
btnTest.zPosition = 10
btnTest.position = CGPoint(x: 100, y: 200)
self.addChild(btnTest)
Next in the Player class I have the following broken down:
var shooting = false
var shootAnimation = SKAction()
var noshootAnimation = SKAction()
Init func:
self.run(noshootAnimation, withKey: "noshootAnimation")
let projectile = SKSpriteNode(imageNamed: "Crater")
projectile.position = CGPoint(x: 100, y: 50)
projectile.zPosition = 20
projectile.name = "projectile"
// Assigning categories to game objects:
self.physicsBody?.categoryBitMask = PhysicsCategory.plane.rawValue
self.physicsBody?.contactTestBitMask = PhysicsCategory.ground.rawValue
self.physicsBody?.collisionBitMask = PhysicsCategory.ground.rawValue
self.physicsBody?.applyImpulse(CGVector(dx: 300, dy: 0))
self.addChild(projectile)
// Start the shoot animation, set shooting to true:
func startShooting() {
    self.removeAction(forKey: "noshootAnimation")
    self.shooting = true
}
// Stop the shoot animation, set shooting to false:
func stopShooting() {
    self.removeAction(forKey: "shootAnimation")
    self.shooting = false
}
The node appears in the GameScene, which looks promising. Finally, here is the last bit of code in the GameScene:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    for touch in touches {
        let location = touch.location(in: self)
        let nodeTouched = atPoint(location)
        if let gameSprite = nodeTouched as? GameSprite {
            gameSprite.onTap()
        }
        // Check the HUD buttons which I have appearing when the game is over…
        if nodeTouched.name == "restartGame" {
            // Transition to a new version of the GameScene
            // to restart the game
            self.view?.presentScene(GameScene(size: self.size), transition: .crossFade(withDuration: 0.6))
        } else if nodeTouched.name == "returnToMenu" {
            // Transition to the main menu scene
            self.view?.presentScene(MenuScene(size: self.size), transition: .crossFade(withDuration: 0.6))
        }
    }
    Player.startFly()
    player.startShooting()
}
override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    Player.stopFly()
    player.stopShooting()
}
override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
    Player.stopFly()
    player.stopShooting()
}
override func update(_ currentTime: TimeInterval) {
    player.update()
}
}
Unfortunately nothing happens in the GameScene and the node doesn't fire when the screen is pressed. With the code above, is there any way I can amend this to allow for both 'tap to fly' and 'tap to shoot' functions? I still can't figure out how to get the button I declared earlier in the GameScene to appear, so that my gesture / touch position can be localised to that node on the screen as opposed to the whole screen, which is what I currently have.
I can say this sounded simpler in my head than it was to actually code.
Firstly, the above code never runs an action with the key "shootAnimation": startShooting() is missing the call to run the shooting animation, and stopShooting() is missing the call to re-run the not-shooting animation. The methods should include these, as shown below:
func startShooting() {
    self.removeAction(forKey: "noshootAnimation")
    // Add this call (with a key so it can be removed again later)
    self.run(shootAnimation, withKey: "shootAnimation")
    self.shooting = true
}
func stopShooting() {
    self.removeAction(forKey: "shootAnimation")
    // Add this call
    self.run(noshootAnimation, withKey: "noshootAnimation")
    self.shooting = false
}
Secondly, the noshootAnimation and shootAnimation animations are empty actions: they will not do anything if initialized as SKAction(). Depending on what you are trying to do, there are a number of SKAction class methods that will create real actions for you.
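For example, if each state is represented by a set of texture frames, the two animations could be built with SKAction.animate(with:timePerFrame:) wrapped in SKAction.repeatForever(_:). A rough sketch for the Player's init; the texture names here are made up for illustration and are not from the question:
// Hypothetical frame textures - substitute your own image names.
let shootFrames = [SKTexture(imageNamed: "plane_shoot_1"),
                   SKTexture(imageNamed: "plane_shoot_2")]
let idleFrames = [SKTexture(imageNamed: "plane_idle_1"),
                  SKTexture(imageNamed: "plane_idle_2")]
// Loop the frames forever until the action is removed by its key.
shootAnimation = SKAction.repeatForever(SKAction.animate(with: shootFrames, timePerFrame: 0.1))
noshootAnimation = SKAction.repeatForever(SKAction.animate(with: idleFrames, timePerFrame: 0.1))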
Thirdly, an SKNode (recall that a scene is an SKNode subclass) will not receive touch calls if its isUserInteractionEnabled property is false (the default for ordinary nodes); instead, the touch is treated as if its parent received it. If a node's isUserInteractionEnabled is true, however, that node receives the UIResponder calls rather than its parent. Make sure this property is set to true for any node that needs to receive its own touches (you can do this in the scene's didMove(to:) or elsewhere).
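To illustrate the rule with the btnTest node from the question (a sketch of the behaviour described above, not a complete fix):
// isUserInteractionEnabled defaults to false for a plain SKSpriteNode, so a
// touch on btnTest "falls through" and the scene's touchesBegan(_:with:) runs.
btnTest.isUserInteractionEnabled = false
// If you set it to true instead, btnTest itself must override the touch
// methods, because the scene will no longer be handed those touches.
// btnTest.isUserInteractionEnabled = true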
I will now propose an improvement. I frequently use buttons in SpriteKit, but they are all subclasses of a custom button class (itself a subclass of SKSpriteNode) that uses protocol delegation to notify interested objects of touch events. The scheme looks like this:
class Button: SKSpriteNode {
    weak var responder: ButtonResponder?
    // Custom code - remember to set isUserInteractionEnabled = true
    // (e.g. in an initializer) so the button receives its own touches.
    // MARK: UIResponder
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesBegan(touches, with: event)
        // Respond here
        responder?.respond(to: self)
    }
}
protocol ButtonResponder: AnyObject {
    func respond(to button: Button)
}
class MyScene: SKScene, ButtonResponder {
    func respond(to button: Button) {
        // Do something on touch, but typically check the name
        switch button.name {
        case "myButton"?:
            // Do something specific
            break
        default:
            break
        }
    }
    override func didMove(to view: SKView) {
        // Set the scene as the button's responder
        let buttonInScene = childNode(withName: "myButton") as! Button
        buttonInScene.responder = self
    }
}
For more on delegation, look here. Hopefully this helped a little bit.
EDIT: Convert touch location
You can get the touch location relative to a particular node on the screen by passing that node to UITouch's location(in:) method, instead of passing the scene as you did in your code:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesBegan(touches, with: event)
    let yourNode = childNode(withName: "yourNode")!
    for touch in touches {
        let touchLocationToYourNode = touch.location(in: yourNode)
        // Do something
    }
}
Alternatively, use SKNode's convert(_:from:) and convert(_:to:) methods
let nodeA = SKNode()
let nodeB = SKNode()
nodeA.xScale = 1.5
nodeA.yScale = 1.2
nodeA.position = .init(x: 100, y: 100)
nodeA.zRotation = .pi
let pointInNodeACoordinateSystem = CGPoint(x: 100, y: 100)
let thatSamePointInNodeBCoordinateSystem = nodeB.convert(pointInNodeACoordinateSystem, from: nodeA)
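Putting this together with the question's own code: a rough way to split 'tap to shoot' from 'tap to fly' without the Button subclass is to check whether the touched node is the sprite the question names "Button". This is only a sketch; it assumes a single player instance exposing the startFly() and startShooting() methods from the question:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    for touch in touches {
        let location = touch.location(in: self)
        let nodeTouched = atPoint(location)
        if nodeTouched.name == "Button" {
            // The btnTest sprite from the question was tapped: shoot.
            player.startShooting()
        } else {
            // Anywhere else on the screen: fly.
            player.startFly()
        }
    }
}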
I am having trouble getting custom SKSpriteNode buttons to work with the Xcode level editor on devices with 3D Touch.
I have a button subclass which is mostly based on the DemoBots sample from Apple.
The basic code is this:
enum ButtonIdentifier: String {
    case playButton
    case pauseButton
}
/// Button responder delegate
protocol ButtonDelegate: class {
    func pressed(button button: ButtonNode)
}
class ButtonNode: SKSpriteNode {
    public weak var delegate: ButtonDelegate? {
        return scene as? ButtonDelegate
    }
    var isHighlighted = false {
        didSet {
            // running skactions to colorise buttons and animate
        }
    }
    var identifier: ButtonIdentifier!
    /// Code init (when button is created in code)
    /// e.g let playButton = ButtonNode(imageNamed: "ButtonImage", identifier: playButton)
    init(imageNamed: String, identifier: ButtonIdentifier) {
        self.identifier = identifier
        let texture = SKTexture(imageNamed: imageNamed)
        super.init(texture: texture, color: SKColor.clearColor(), size: texture.size())
        name = identifier.rawValue
        setup()
    }
    /// Level editor init (when button is created in level editor)
    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        // Ensure that the node has a supported button identifier as its name.
        guard let nodeName = name, identifier = ButtonIdentifier(rawValue: nodeName) else {
            fatalError("Unsupported button name found.")
        }
        self.identifier = identifier
        setup()
    }
    private func setup() {
        // zPosition
        zPosition = 200
        // Enable user interaction on the button node to detect tap and click events.
        userInteractionEnabled = true
    }
    #if os(iOS)
    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
        super.touchesBegan(touches, withEvent: event)
        isHighlighted = true
    }
    override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
        super.touchesEnded(touches, withEvent: event)
        guard let scene = scene else { return }
        for touch in touches {
            let location = touch.locationInNode(scene)
            let node = scene.nodeAtPoint(location)
            if node === self || node.inParentHierarchy(self) {
                runPressedAction()
            } else {
                isHighlighted = false
            }
        }
    }
    override func touchesCancelled(touches: Set<UITouch>?, withEvent event: UIEvent?) {
        super.touchesCancelled(touches, withEvent: event)
        isHighlighted = false
    }
    #endif
    // MARK: - Pressed Action
    private func runPressedAction() {
        // SKAction that runs a press animation
        delegate?.pressed(button: self)
    }
}
If I create the button via the Xcode level editor everything works fine in the simulator and on my iPhone 6. However, when using TestFlight, my friend with a 6s Plus gets no touch input on the buttons; he cannot press them.
If I create the button entirely in code everything works, even on 3D Touch devices.
Why is it not working with buttons from the level editor on 3D Touch devices? I have been looking at Apple's DemoBots sample to see if I missed some setting, but I cannot figure it out. (DemoBots buttons do work on my friend's iPhone 6s Plus.)
I know that on 3D Touch devices the touchesMoved method gets called even though there are no changes in the x/y coordinates. However, I don't think this causes my issue, because when the buttons are created in code everything works as expected.
Is there some setting in the Xcode level editor I am missing to allow touches on 3D Touch devices?
After days of frustration it turned out to be an iOS 9 bug. My friend was on iOS 9.2.x, and after he updated to the latest iOS 9.3.x version everything worked; no code changes were required at all.
Swift newbie here.
I've been having trouble with a task that should be trivial. All I want to do is get the x,y coordinates of the mouse cursor on-demand. I would prefer not to wait for a mouse movement event to fire before I can grab the pointer's coords.
Would appreciate any help!
You should take a look at NSEvent's class property mouseLocation.
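For a one-off, on-demand read with no event monitor involved, something like this sketch is enough; the point is reported in screen coordinates (origin at the bottom-left):
import Cocoa

// Read the current pointer position on demand, in screen coordinates.
let screenPoint = NSEvent.mouseLocation
print(String(format: "%.1f, %.1f", screenPoint.x, screenPoint.y))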
edit/update: Xcode 11 • Swift 5.1
If you would like to monitor mouse-moved events on any window while your app is active, you can add a local monitor for events matching the mouseMoved mask, and a global monitor for when it is not active. Note that you need to set your window's acceptsMouseMovedEvents property to true.
import Cocoa

class ViewController: NSViewController {
    lazy var window: NSWindow = self.view.window!
    var mouseLocation: NSPoint { NSEvent.mouseLocation }
    var location: NSPoint { window.mouseLocationOutsideOfEventStream }
    override func viewDidLoad() {
        super.viewDidLoad()
        NSEvent.addLocalMonitorForEvents(matching: [.mouseMoved]) {
            print("mouseLocation:", String(format: "%.1f, %.1f", self.mouseLocation.x, self.mouseLocation.y))
            print("windowLocation:", String(format: "%.1f, %.1f", self.location.x, self.location.y))
            return $0
        }
        NSEvent.addGlobalMonitorForEvents(matching: [.mouseMoved]) { _ in
            print(String(format: "%.0f, %.0f", self.mouseLocation.x, self.mouseLocation.y))
        }
    }
    override func viewWillAppear() {
        super.viewWillAppear()
        window.acceptsMouseMovedEvents = true
    }
}
Sample project
You can get the current mouse location in this way:
Declare this in your view controller class:
var mouseLocation: NSPoint? { self.view.window?.mouseLocationOutsideOfEventStream }
Then, you can get the current mouse location and convert it to your desired view's coordinate system:
if let currentMouseLocation = self.mouseLocation {
    // mouseLocationOutsideOfEventStream is in window coordinates, so convert
    // from the window (nil) into targetView - whatever view you want coordinates in.
    let pointInTargetView = self.targetView.convert(currentMouseLocation, from: nil)
}
I am adding a sprite to the first screen in the basic SpriteKit template. However, whenever I go to the play scene and return to the menu scene, the image is enlarged. After checking size and scale mode everything seems to be normal. I am unsure why this is happening.
This is the code for the menu:
let playButton = SKSpriteNode(imageNamed: "play")
override func didMoveToView(view: SKView)
{
    /* Setup your scene here */
    // Place button in exact middle of the screen
    self.playButton.position = CGPointMake(CGRectGetMidX(self.frame), CGRectGetMidY(self.frame))
    addChild(self.playButton)
    self.backgroundColor = UIColor.blueColor()
}
override func touchesBegan(touches: NSSet, withEvent event: UIEvent)
{
    /* Called when a touch begins */
    for touch: AnyObject in touches
    {
        let location = touch.locationInNode(self)
        if self.nodeAtPoint(location) == self.playButton
        {
            var scene = PlayJumper()
            let sKView = self.view as SKView!
            sKView.ignoresSiblingOrder = true
            scene.scaleMode = .AspectFill
            scene.size = sKView.bounds.size
            sKView.presentScene(scene)
        }
    }
}
Screenshot of scene before moving to next scene:
This is the code to return to the menu from the next scene, followed by the image that is altered after returning to the screen.
import SpriteKit

class PlayJumper: SKScene
{
    override func didMoveToView(view: SKView) {
        self.backgroundColor = UIColor.orangeColor()
    }
    override func touchesBegan(touches: NSSet, withEvent event: UIEvent)
    {
        for touch: AnyObject in touches
        {
            endGame()
        }
    }
    func endGame()
    {
        if let scene = GameScene.unarchiveFromFile("GameScene") as? GameScene
        {
            let sKView = self.view as SKView!
            sKView.ignoresSiblingOrder = true
            scene.size = sKView.bounds.size
            scene.scaleMode = .AspectFill
            sKView.presentScene(scene)
        }
    }
}
Here is the image after moving back to the view; it is slightly larger than the previous image.
I need the image to stay the same no matter how many times the user goes from the game to the menu.
I can only think of two things that can cause this:
In your view controller, you may have changed the scene's anchorPoint for the menu, and when you go back to the menu you are not setting it to the same value it is set to in the view controller.
You may want to use .ResizeFill instead of .AspectFill (see the sketch below).
Every time I experienced this problem, it was because of one of these.
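As a sketch of that second suggestion, reusing the presentation code from the question (old Swift syntax to match; the commented anchorPoint line relates to the first suggestion):
// Configure the scene the same way every time it is presented,
// in both the menu and the PlayJumper transitions.
let scene = PlayJumper()
let sKView = self.view as SKView!
sKView.ignoresSiblingOrder = true
scene.size = sKView.bounds.size
// .ResizeFill resizes the scene to the view's bounds instead of scaling
// its content, which can avoid sprites appearing larger after a transition.
scene.scaleMode = .ResizeFill
// scene.anchorPoint = CGPoint(x: 0.5, y: 0.5) // keep this consistent if you change it anywhere
sKView.presentScene(scene)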
I'm trying to drag an SKSpriteNode around the screen by touching the screen, but I want a constant movement of the sprite. Currently my code only moves the sprite to the location of my touch; if I hold and move my finger, the sprite will not follow. Moreover, I don't want to "have" to touch the SKSpriteNode to activate the movement; I want to touch anywhere on the screen and have the SKSpriteNode respond with movement.
Here is my current code:
class GameScene: SKScene {
    override func didMoveToView(view: SKView) {
        // SpriteNode I want to drag around
        basket = SKSpriteNode(texture: basketTexture)
        self.addChild(basket)
    }
    override func touchesBegan(touches: NSSet, withEvent event: UIEvent) {
        /* Called when a touch begins */
        var nodeTouched = SKNode()
        var currentNodeTouched = SKNode()
        for touch: AnyObject in touches {
            let location = touch.locationInNode(self)
            nodeTouched = self.nodeAtPoint(location)
            basket.position = location
        }
    }
}
Thank you, any help is appreciated.
I solved this by using touchesMoved instead of touchesBegan, and it works perfectly and smoothly. Here is the final code:
class GameScene: SKScene {
    override func didMoveToView(view: SKView) {
        // SpriteNode I want to drag around
        basket = SKSpriteNode(texture: basketTexture)
        self.addChild(basket)
    }
    override func touchesMoved(touches: NSSet, withEvent event: UIEvent) {
        /* Called when a touch moves */
        var nodeTouched = SKNode()
        var currentNodeTouched = SKNode()
        for touch: AnyObject in touches {
            let location = touch.locationInNode(self)
            nodeTouched = self.nodeAtPoint(location)
            basket.position = location
        }
    }
}
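If the sprite should move toward the finger at a constant speed rather than jumping straight to it, one variation (not from the answer above, just a sketch in the same old Swift style) is to run a move action whose duration is derived from the remaining distance:
override func touchesMoved(touches: NSSet, withEvent event: UIEvent) {
    for touch: AnyObject in touches {
        let location = touch.locationInNode(self)
        // Move toward the touch at roughly 400 points per second instead of
        // teleporting the basket to the touch location.
        let dx = Double(location.x - basket.position.x)
        let dy = Double(location.y - basket.position.y)
        let duration = NSTimeInterval(sqrt(dx * dx + dy * dy) / 400.0)
        basket.removeActionForKey("drag")
        basket.runAction(SKAction.moveTo(location, duration: duration), withKey: "drag")
    }
}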