When I use two fingers to zoom the map, the rotation gesture often fires as well. How can I prevent this behavior?
On Android, for example, there is this method
mapboxMap.getUiSettings().setIncreaseRotateThresholdWhenScaling(true);
that helps prevent rotation while zooming, but I can't find an equivalent on iOS.
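For reference, this is roughly the behavior I'm after, sketched with plain UIKit recognizers on an ordinary view. The view, the threshold value, and the recognizer setup are only illustrative; I don't know whether the iOS map SDK exposes its built-in recognizers in a way that lets you hook the same logic onto them.

import UIKit

// A minimal sketch, not map-SDK-specific: let pinch and rotation run together,
// but only start applying rotation once the fingers have twisted past a
// threshold, so a plain pinch-to-zoom doesn't also spin the content.
final class RotatableZoomableView: UIView, UIGestureRecognizerDelegate {

    // Assumed value; tune to taste (0.2 rad is about 11 degrees).
    private let rotationThreshold: CGFloat = 0.2
    private var rotationUnlocked = false

    override init(frame: CGRect) {
        super.init(frame: frame)
        let pinch = UIPinchGestureRecognizer(target: self, action: #selector(handlePinch(_:)))
        let rotate = UIRotationGestureRecognizer(target: self, action: #selector(handleRotate(_:)))
        pinch.delegate = self
        rotate.delegate = self
        addGestureRecognizer(pinch)
        addGestureRecognizer(rotate)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    @objc private func handlePinch(_ gesture: UIPinchGestureRecognizer) {
        guard gesture.state == .changed else { return }
        transform = transform.scaledBy(x: gesture.scale, y: gesture.scale)
        gesture.scale = 1
    }

    @objc private func handleRotate(_ gesture: UIRotationGestureRecognizer) {
        switch gesture.state {
        case .changed:
            if !rotationUnlocked {
                // Swallow small twists; unlock once past the threshold.
                guard abs(gesture.rotation) >= rotationThreshold else { return }
                rotationUnlocked = true
                gesture.rotation = 0   // start from here so there is no jump
                return
            }
            transform = transform.rotated(by: gesture.rotation)
            gesture.rotation = 0
        case .ended, .cancelled, .failed:
            rotationUnlocked = false
        default:
            break
        }
    }

    // Allow pinch and rotation to be recognised at the same time, like a map view does.
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith other: UIGestureRecognizer) -> Bool {
        return true
    }
}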
Hi, I would like to implement a rotating image view where the user can rotate the image view by touching only the corners.
How could I implement that? Please help.
I would like to implement a rotating image view where the user can rotate the image view by touching only the corners.
There are two parts to this task, and I'm not sure which one you're asking about:
How to do something in response to a dragging gesture.
How to rotate a view.
Whether you're talking about iOS or macOS, there are at least two options for responding to a dragging operation. One is to track the touch or mouse event yourself. Touches and mouse interactions both have a beginning, when the finger touches the screen or the mouse button is depressed; a middle, when the location of the finger or cursor may change; and an end, when the finger leaves the screen or the mouse button is released. Both operating systems have events for these things that are sent through their respective event handling systems.
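On iOS, for example, the raw version of that looks roughly like this (a sketch only; on macOS the equivalents are mouseDown(with:), mouseDragged(with:) and mouseUp(with:) on NSResponder):

import UIKit

// The low-level option: track the touch yourself by overriding the
// UIResponder touch methods on your view.
final class TouchTrackingView: UIView {

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Beginning: a finger touched the screen.
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Middle: the finger's location changed; react to the drag here.
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        // End: the finger left the screen.
    }
}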
An easier method is to use a gesture recognizer, which is usually simpler to get right, and the existing gesture recognizers encourage you to implement the expected behavior by making that the easiest option. For example, UIKit and AppKit each have a rotation gesture recognizer that recognizes two touches moving in opposite directions as a rotation gesture, since that's a common way to rotate objects. But you can also implement your own gesture recognizer that notices a touch that happens a) for longer than some minimum time, b) within some minimum distance from a corner, and c) with movement in a direction that would cause rotation. So, if you want to handle dragging in order to rotate something, look into gesture recognizers.
UIView and NSView both provide ways to rotate a view in its superview's coordinate system. NSView provides a frameRotation property that you can set, and UIView has a transform property to which you can apply Core Graphics functions like CGAffineTransformRotate().
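For example (a small sketch; note that the UIKit rotation is in radians while frameRotation is in degrees):

#if canImport(UIKit)
import UIKit

// iOS: rotate a view by adjusting its transform.
func rotate(_ view: UIView, byRadians angle: CGFloat) {
    view.transform = view.transform.rotated(by: angle)
}
#else
import AppKit

// macOS: rotate a view by adjusting frameRotation (measured in degrees).
func rotate(_ view: NSView, byDegrees degrees: CGFloat) {
    view.frameRotation += degrees
}
#endif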
In summary, what you should do is to create a gesture recognizer subclass that recognizes the rotation gesture that you want and rotates the view that it's attached to. Then instantiate that gesture recognizer and apply it to the view that you want to rotate.
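As a concrete starting point, here is a rough sketch of that idea using a plain UIPanGestureRecognizer rather than a full recognizer subclass; the corner size, the class name, and the choice to measure the drag angle around the view's centre are all just illustrative:

import UIKit

// Sketch: rotate an image view by dragging near one of its corners.
// Keep a strong reference to this controller somewhere (e.g. a property),
// since the gesture recognizer does not retain its target.
final class CornerRotationController: NSObject {

    private let cornerSize: CGFloat = 44   // assumed "corner" hit area
    private var isRotating = false
    private var lastAngle: CGFloat = 0
    private weak var imageView: UIImageView?

    init(imageView: UIImageView) {
        self.imageView = imageView
        super.init()
        imageView.isUserInteractionEnabled = true
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        imageView.addGestureRecognizer(pan)
    }

    @objc private func handlePan(_ gesture: UIPanGestureRecognizer) {
        guard let view = imageView else { return }

        switch gesture.state {
        case .began:
            // Only rotate when the drag begins near one of the four corners.
            isRotating = isNearCorner(gesture.location(in: view), in: view.bounds)
            if isRotating { lastAngle = angle(of: gesture, around: view) }
        case .changed:
            guard isRotating else { return }
            let current = angle(of: gesture, around: view)
            // Normalise the delta so crossing the ±π boundary doesn't jump.
            var delta = current - lastAngle
            if delta > .pi { delta -= 2 * .pi }
            if delta < -.pi { delta += 2 * .pi }
            view.transform = view.transform.rotated(by: delta)
            lastAngle = current
        default:
            isRotating = false
        }
    }

    // Angle of the touch around the view's centre, measured in window
    // coordinates so it stays stable while the view itself rotates.
    private func angle(of gesture: UIPanGestureRecognizer, around view: UIView) -> CGFloat {
        let center = view.superview?.convert(view.center, to: nil) ?? view.center
        let point = gesture.location(in: nil)
        return atan2(point.y - center.y, point.x - center.x)
    }

    private func isNearCorner(_ point: CGPoint, in bounds: CGRect) -> Bool {
        let nearX = point.x < cornerSize || point.x > bounds.width - cornerSize
        let nearY = point.y < cornerSize || point.y > bounds.height - cornerSize
        return nearX && nearY
    }
}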
I'm trying to write a parallax element that has multiple panes which scroll at different rates.
To achieve this, I have multiple Views absolutely positioned and stacked in the z plane, with a ScrollView on top to capture the drag events, from which I'll animate the top position and opacity of the lower panes. (The reason for using a ScrollView is to benefit from the bouncing and momentum animation it gives us.)
However, the lower panes may contain elements that want to accept touches (as opposed to scrolls). The problem I'm having is that the ScrollView captures these touches, and there doesn't seem to be a mechanism to pass them on. Basically, I want the ScrollView to respond to drags, but the lower elements to respond to touches.
Is there a way to achieve that?
I've successfully used https://github.com/rome2rio/react-native-touch-through-view for that on iOS.
However, on Android, click events somehow don't get through, while swiping does work for the underlying view.
I have an NSScrollView with a large canvas that can be scrolled and zoomed etc. When I scroll diagonally at an angle of around 20 degrees from vertical it's really jerky (using a trackpad). This jerkiness even continues as the scrolling animates to rest.
All other angles of scroll are buttery smooth, which makes me think it's got something to do with a preference for vertical scrolling (predominant axis scrolling is disabled).
The effect only seems to happen when I'm using layer backed views.
Anyone know what's going on here?
Thanks
Craig
Confirmed bug in OS X Yosemite (by Apple DTS)
I am looking to replicate a Google Maps or Apple Maps style interface, with the ability to pinch, zoom, scroll right/left, up/down, or diagonally, and rotate in any direction, but without the actual maps. I can likely use UIScrollView, but I am not sure how to scroll diagonally or rotate in any direction based on touch. Any suggestions?
Basically, the UIView transform property allows you to apply any 2D transform. But to create something like Maps you need to know OpenGL well.
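For instance, a minimal sketch that lets you pinch, rotate and drag a view in any direction using only the transform and center properties; the view, sizes and class names are illustrative:

import UIKit

// Minimal sketch: pinch, rotate and drag a "canvas" view with plain UIKit
// gesture recognizers and the view's transform/center.
final class CanvasViewController: UIViewController, UIGestureRecognizerDelegate {

    private let canvas = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()
        canvas.frame = CGRect(x: 0, y: 0, width: 300, height: 300)
        canvas.center = view.center
        canvas.backgroundColor = .lightGray
        view.addSubview(canvas)

        let pinch = UIPinchGestureRecognizer(target: self, action: #selector(pinched(_:)))
        let rotate = UIRotationGestureRecognizer(target: self, action: #selector(rotated(_:)))
        let pan = UIPanGestureRecognizer(target: self, action: #selector(panned(_:)))
        [pinch, rotate, pan].forEach {
            $0.delegate = self
            canvas.addGestureRecognizer($0)
        }
    }

    @objc private func pinched(_ g: UIPinchGestureRecognizer) {
        canvas.transform = canvas.transform.scaledBy(x: g.scale, y: g.scale)
        g.scale = 1
    }

    @objc private func rotated(_ g: UIRotationGestureRecognizer) {
        canvas.transform = canvas.transform.rotated(by: g.rotation)
        g.rotation = 0
    }

    @objc private func panned(_ g: UIPanGestureRecognizer) {
        // Move in any direction, including diagonally.
        let t = g.translation(in: view)
        canvas.center = CGPoint(x: canvas.center.x + t.x, y: canvas.center.y + t.y)
        g.setTranslation(.zero, in: view)
    }

    // Allow pinch, rotation and pan to be recognised together, as Maps does.
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith other: UIGestureRecognizer) -> Bool {
        return true
    }
}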
In my opinion the iPhone has a big advantage over other smartphones because of the intuitive, smooth and good-feeling scrolling components in UIKit. I want to implement a simple game with UIKit which uses this kind of scrolling, but I can't use UIScrollView because it isn't customizable enough.
I tried to implement this scrolling myself and tried two different approaches:
I used a UIPanGestureRecognizer and moved the bounds of my custom control according to the translation the recognizer delivers. In order to get this smooth scrolling after lifting my finger during the movement, I start an animation. I use the velocity the recognizer gives me and a fixed time in order to calculate how far it should scroll after I lift my finger. I tried a linear movement and an ease-out movement, but both look strange. (more on that later)
I use OnTouchMoved and OnTouchEnded to implement the scrolling. In OnTouchMoved I move the bounds according to the movement of the finger. While the finger moves I calculate the difference in location and time between the current and last touch in order to calculate a velocity myself. When the finger lifts I start an animation in OnTouchEnded like in 1., but I use my self-calculated velocity instead.
Neither approach gives me the results I want. After lifting my finger, the scrolling does not continue smoothly. There seems to be a (sharp) bend in the velocity curve.
Does anyone have an idea how Apple does this smooth scrolling? My current guess is that my interpolation with two really close points is too inaccurate and doesn't take the current acceleration into account.
Thx for your thoughts!
Kie
Why don't you add a UIScrollView as a handle on top of your view, without visible content? Then you can use the UIScrollViewDelegate methods to update your real view on certain actions. For example: if you capture the scrolling of the UIScrollView using scrollViewDidScroll:, you can update your visible view with the offset of the scroll view. You can use scrollViewWillEndDragging: to start synchronizing the main view with the scroll view, and scrollViewDidEndDecelerating: to stop synchronizing.
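A rough sketch of that setup follows; the content size, the contentView name, and the idea of shifting the content view's bounds are placeholders for whatever your real view needs:

import UIKit

// Sketch of the "invisible UIScrollView as a scrolling engine" idea: the
// scroll view has no visible content, it only produces offsets that we copy
// onto the view we actually draw.
final class ScrollingGameViewController: UIViewController, UIScrollViewDelegate {

    private let scrollView = UIScrollView()
    private let contentView = UIView()   // the view you really render

    override func viewDidLoad() {
        super.viewDidLoad()

        contentView.frame = view.bounds
        view.addSubview(contentView)

        scrollView.frame = view.bounds
        scrollView.contentSize = CGSize(width: 4000, height: view.bounds.height)
        scrollView.backgroundColor = .clear
        scrollView.showsHorizontalScrollIndicator = false
        scrollView.showsVerticalScrollIndicator = false
        scrollView.delegate = self
        view.addSubview(scrollView)   // on top, transparent, captures the drags
    }

    // Called continuously while dragging and during the deceleration/bounce,
    // so you inherit UIScrollView's momentum curve for free.
    func scrollViewDidScroll(_ scrollView: UIScrollView) {
        let offset = scrollView.contentOffset
        // Move your own content however you like; here we just shift its bounds.
        contentView.bounds.origin = CGPoint(x: offset.x, y: 0)
    }

    // Optional hooks if you only want to synchronise at certain moments,
    // as described above.
    func scrollViewWillEndDragging(_ scrollView: UIScrollView,
                                   withVelocity velocity: CGPoint,
                                   targetContentOffset: UnsafeMutablePointer<CGPoint>) {
        // The user let go; deceleration is about to start.
    }

    func scrollViewDidEndDecelerating(_ scrollView: UIScrollView) {
        // Momentum finished; stop synchronising if you were doing it manually.
    }
}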
I use the same approach to scroll the background of my Animix app. It has an invisible scroll view at the bottom, to give the user the feeling that they can drag the grass left and right to move the background.