I have a map with events associated with "click" that get fired when using a mouse, but if I use the touch screen they don't fire.
Example:
map.on("click", () => console.log("blarg"));
blarg gets logged if I click on the map with a mouse, but not if I touch the map with a touchscreen. This is on a Windows computer, in an Electron app (latest version of mapbox-gl-js).
There are already similar questions answered on Stack Overflow:
Would onClick event work on touch on touch-screen devices?
document .click function for touch device
You can use the touchstart and touchend events, which are similar to mousedown and mouseup, or use jQuery to detect taps.
Supporting both TouchEvent and MouseEvent - MDN
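For the Mapbox GL JS case above, a minimal sketch of that idea: Mapbox GL JS exposes touchstart/touchend as map events, so the same handler can be bound to both click and touchend. Note that touchend also fires at the end of a touch pan, so you may need to filter out drags.

import mapboxgl from "mapbox-gl";

declare const map: mapboxgl.Map; // assumes an already-initialized map

function handleSelect(e: mapboxgl.MapMouseEvent | mapboxgl.MapTouchEvent) {
  console.log("blarg", e.lngLat); // coordinates of the click/tap
}

map.on("click", handleSelect);    // fires for mouse clicks
map.on("touchend", handleSelect); // fires when a finger lifts off the map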
I'm working on an app using Angular 2+ and NativeScript 5. On a normal button I listen for "taps" with the tap event:
<Button text="Tap me" (tap)="onTap($event)"></Button>
But the tap event triggers only after the finger releases the button.
Now I want to listen for the "ButtonDown" event, fired when the finger first touches the button. But I can't find that event in the docs!
Is there any workaround that works in this case?
You may have to listen for the touch event and check that the action is "down".
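A minimal sketch of that approach for NativeScript 5 with Angular (the onTouch handler name and the component are mine; the import path is the tns-core-modules layout used in NativeScript 5):

<Button text="Tap me" (touch)="onTouch($event)"></Button>

import { TouchGestureEventData } from "tns-core-modules/ui/gestures";

export class MyButtonComponent {
  onTouch(args: TouchGestureEventData) {
    // args.action is "down", "up", "move" or "cancel"
    if (args.action === "down") {
      console.log("finger touched the button"); // the "ButtonDown" moment
    }
  }
}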
Dear Stack Overflow folks! To this day I never saw the need to ask a question, because all of you have done a great job of asking and answering nearly all of the code-related problems I have encountered. So, thank you for that!
At the moment I am working on an iOS application that processes raw touch events. These are then sent to an iMac over a WiFi network (the protocol I use is OSC). On the OS X side, a server application listens for these OSC messages and converts them to mouse pointer movement, mouse button presses, and multi-touch gestures. So basically I want to build a (of course much more basic) software bundle like Mobile Mouse (http://mobilemouse.com/) that I can adapt to the needs of our customers (by means of customized colors, additional buttons, gestures, and so on) for small remote-control projects.
Right now, everything works except the multitouch gestures (pinch, rotate, two-finger scroll). So my question is: how can I programmatically create and post a multitouch gesture event?
I searched a lot and found some threads about it here on Stack Overflow, but none of them could help me:
Is there a way to trigger gesture events on Mac OS X?
Is there a way to change rotation of gesture events?
Generate and post Multitouch-Events in OS X to control the mac using an external camera
...
Update 1:
The last thing I tried was:
CGEventSourceRef eventSource = CGEventSourceCreate(kCGEventSourceStateCombinedSessionState);
CGEventRef event = CGEventCreate(eventSource);  // creates an empty (kCGEventNull) event
CGEventSetType(event, NSEventTypeMagnify);      // NSEventTypeMagnify is an NSEventType, not a CGEventType
CGEventPost(kCGHIDEventTap, event);             // post the event at the HID level
CFRelease(event);                               // also release the event, to avoid a leak
CFRelease(eventSource);
In the latest versions of OS X and Safari you can use mouse swipe gestures to go forwards and backwards through your browser history. My problem: I have a page with a horizontally scrolling image gallery. If you're using the mouse swipe gesture to scroll through the images, it's very easy to swipe into the next or previous page when you get to the end of the images.
Does anyone know a way to explicitly disable this using CSS or any other method?
Perhaps the following documents can help:
Handling Gesture Events
Preventing Default Behaviour (i.e. with event.preventDefault();)
These might only apply to iOS though.
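For illustration, a rough sketch of that suggestion (the #gallery id is hypothetical, and the gesture* events are WebKit-specific, which is why this may only help on iOS):

const gallery = document.getElementById("gallery");
if (gallery) {
  for (const type of ["gesturestart", "gesturechange", "gestureend"]) {
    // cancel the browser's default handling of the gesture on the gallery
    gallery.addEventListener(type, (e) => e.preventDefault());
  }
}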
Aside from that, I would say it's just how the browser handles the gestures, similar to how mouse-wheel scrolling will scroll a DIV block until it reaches the end and then start to scroll the page as a whole.
Unfortunately, there's no way to prevent this behaviour, since it's a browser gesture (event.preventDefault() on the touchstart event won't work).
I'm developing a WP7 music player and use a slider to track playback progress.
I want to allow the user to drag the slider to seek to a certain position within the track, but I can't find a drag-end event.
The slider_ValueChanged event does not satisfy my need.
I followed the instructions here: WPF: Slider with an event that triggers after a user drags, but it does not work on WP7.
Please help.
I followed the instructions here, using the ManipulationCompleted event, and it works.
Can anybody give me more details on this?
I know that tap comes from gestures, but I'd like to know more about the WP7 event model.
For instance, how do the following scenarios work:
a StackPanel with a Tap event and a CheckBox inside that handles Click
a container with Tap and any framework element inside it, also with Tap, etc.
Which events have higher priority, and is it possible to mask events?
What kind of event bubbling is used here: from top to bottom, or vice versa?
If you're using the WP7.1 SDK, the easiest way to make things consistent would be to use the Tap event everywhere (it's available on all UIElement-derived controls). Tap events bubble up from the lowest control in the visual tree until a handler is found.
Then you don't need to worry about if/when Click and Tap will clash.