Key bindings in NativeScript

I am building an Android app using NativeScript.
My APK will be installed on an Android TV.
How do I configure my NativeScript app to be navigable with the remote/air mouse keys?
For example, I have a remote for my Android TV like this one.
I want to bind UI actions to the UP, DOWN, LEFT, RIGHT and OK keys.
I also want the app to accept input from the numeric keys 0 to 9,
just like navigating to a channel on a regular TV.
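No answer is recorded here, but one way to get at the raw key codes is to drop down to the Android layer: D-pad, OK and digit presses arrive as android.view.KeyEvent objects on whichever native view currently has focus, so you can attach an android.view.View.OnKeyListener to a focusable NativeScript view in its loaded handler. The sketch below only illustrates that idea and is not a verified solution; the onButtonLoaded handler name is made up, and the typings for the android global are assumed to come from @nativescript/types.

// main-page.ts -- minimal sketch; wire onButtonLoaded to a Button's "loaded" event in the page XML
import { Button, EventData } from "@nativescript/core";

export function onButtonLoaded(args: EventData) {
  const btn = args.object as Button;
  const KeyEvent = android.view.KeyEvent;

  // The listener only fires while this native view has focus, so the view
  // must be focusable (Buttons are focusable by default on Android TV).
  btn.android.setOnKeyListener(
    new android.view.View.OnKeyListener({
      onKey: (view: android.view.View, keyCode: number, event: android.view.KeyEvent): boolean => {
        if (event.getAction() !== KeyEvent.ACTION_DOWN) {
          return false; // ignore key-up events
        }
        switch (keyCode) {
          case KeyEvent.KEYCODE_DPAD_UP:
          case KeyEvent.KEYCODE_DPAD_DOWN:
          case KeyEvent.KEYCODE_DPAD_LEFT:
          case KeyEvent.KEYCODE_DPAD_RIGHT:
            console.log("d-pad key:", keyCode);
            return false; // let Android keep moving focus as usual
          case KeyEvent.KEYCODE_DPAD_CENTER:
          case KeyEvent.KEYCODE_ENTER:
            console.log("OK pressed");
            return true; // consumed
          default:
            // Digits are the contiguous range KEYCODE_0..KEYCODE_9.
            if (keyCode >= KeyEvent.KEYCODE_0 && keyCode <= KeyEvent.KEYCODE_9) {
              console.log("digit:", keyCode - KeyEvent.KEYCODE_0);
              return true;
            }
            return false;
        }
      },
    })
  );
}

For plain UP/DOWN/LEFT/RIGHT movement you may not need any of this: Android's built-in focus navigation already moves between focusable views on D-pad input, so explicit handling like the above is mainly useful for the OK key and channel-style digit entry.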

Related

Is it possible to add a mobile operator to Xcode iPhone simulator?

So I need to add a mobile operator to the Xcode simulator. Currently it looks like this.
And I want it to look like this.
So my question is whether there's a way to do this in the Xcode simulator, and if so, how?
The status icons you see depend on the model of the iPhone. According to this page, phones that use Face ID do not display the carrier in the status bar, and phones that use Touch ID do.
iPhone models with Face ID
iPhone models with Touch ID
So just go to File -> Open Simulator, and select a different simulator that uses Touch ID, such as iPhone SE.
Also,
If you can't see an icon, check Control Centre by swiping down from the top right-hand corner.
I don't think you can achieve exactly what you posted in the question, but on iOS there is some flexibility to change the simulator status bar: battery level, time, Wi-Fi state, and cellular state.
This feature is exposed through simctl; you can Google more details about it.
To change the status bar specifically, use simctl status_bar.
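For example, with Xcode 11 or later and a booted simulator, a command along these lines should set a placeholder carrier name and an active cellular indicator (check xcrun simctl status_bar --help for the exact options your Xcode version supports):

xcrun simctl status_bar booted override --operatorName "MyCarrier" --cellularMode active --cellularBars 3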
I don't think you can add a mobile operator, because the simulator doesn't have a SIM card. The simulator can behave like a real device as far as your code is concerned, but it does not have all the features of a real device; it is just a program running on another device.
If you want to work with a feature that only exists on a real device, such as mobile operator functionality, try it on a real device.
It's similar to taking a photo with the phone and uploading it to the app: the simulator does not have a camera, so if you try to open the camera in the simulator, the application will crash. You need to test this on a real device.

How to force an app into iPhone simulation mode on an iPad using Expo?

I'm currently using Expo to build an iPhone app.
I'm not supporting iPad, and I have it set up so that when it is built on an iPad it runs in iPhone simulation mode.
The problem is I can't develop with Expo using this approach.
The app shows up as an iPad app, but this is not the mode being shipped to users.
According to this, I might have some luck adding ios.supportsTablet to the app.json file and setting it to false, but it didn't change anything.
Is there another configuration value I'm missing to force iPhone simulation mode on an iPad?
I'd rather not eject if I don't have to.
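For reference, that flag normally sits under the expo.ios key in app.json; a minimal sketch of the relevant part looks like this:

{
  "expo": {
    "ios": {
      "supportsTablet": false
    }
  }
}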
According to this, there currently isn't a way to live develop using "iPhone mode" on an iPad.
The Expo Client app can’t change its tablet support on the fly, unfortunately, so it will always adapt your project to the iPad viewport.
So, following the forum thread above, here is how you get around it:
Run exp build:ios -t simulator
Open Simulator
Select Hardware/Device/iOS 11.x/iPad x generation
Unpack the generated build from the first command
You should have a file named yourApp.app
Drag that file into the iPad you are running in Simulator
It will install the app on the device and you can then view your creation
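If dragging the .app bundle doesn't take, the same install can be done from the command line with simctl, which ships with Xcode (the path below is just a placeholder):

xcrun simctl install booted path/to/yourApp.app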
This is faster than doing a whole build cycle with TestFlight just to see your changes.
But it still leaves a bit to be desired.

Pebble: What does the onlyShownOnCommunication metadata in appinfo.json do?

In CloudPebble, there is a setting for App Visibility:
What does "Only Visible When Companion App Running" mean exactly? Does it mean when the iOS/Android companion app isn't running, then the Pebble watchapp doesn't show up in the app list on the Pebble?
I tried to look for information on the App Metadata docs page, https://developer.pebble.com/guides/pebble-apps/app-structure/app-metadata/, but it's broken...
I tried the setting on a watchapp I have that has an iOS companion app, and the setting doesn't seem to do anything at all. When I force-quit my iOS app, the watchapp is still listed in the Pebble app list.
This metadata item causes the watchapp to be hidden when the phone app can't communicate with the watch, and to be shown when it can. The reason it stays listed is that apps that use the new PebbleKit on iOS are allowed to continue communicating over BLE, even when force-quit.

How to make a tvOS app capture a swipe from the iPhone Remote app?

I can now (since Dec 2015) use Apple's Remote app for iOS as a remote control for the Apple TV 4!
I am developing a tvOS app. I can swipe in the Remote app to navigate between controls on my settings screen. So far so good.
But the gesture recognisers on my game screen do not recognise swipes from the Remote app (though they do recognise swipes on the Siri Remote).
Question: What should I do to capture swipes from the iOS Remote app?
You can't.
With tvOS 9.1 and v4.2.3 of the Remote app for iOS, the Control screen on the iOS app works the same as a generic remote (e.g. the Apple Remote that comes with the 3rd-generation Apple TV or a third party IR universal remote). Those remotes only support 4-way directional control, not gesture control. More generally, Remote v4.2.3 can only do with a 4th-gen Apple TV the same things it does with 3rd-gen Apple TV.
The Remote app translates the gestures you make on your iOS device into a 4-way directional command (or select/play/pause, fast-forward, or another of the few commands generic remotes support), then sends that command to the Apple TV. It doesn't pass touch inputs along to the Apple TV the way the Siri Remote does.
However, Apple has been talking to the press about possible future changes.
Directional "button" inputs, whether from an Apple Remote, third-party hardware remote, or the Remote iOS app, are UIPress events, just like the Select, Menu, and Play/Pause buttons on the Siri Remote. As such, you can handle them in pressesBegan:withEvent: or with the allowedPressTypes property of a tap gesture recognizer. You can even set up the latter in Interface Builder:
Note that swipes on the Siri remote don't count as directional button presses, so if you're doing swipe gesture recognizers and want a directional button press to do the same thing, you need to recognize both separately.

How can I test my already created iPhone application with the Apple Watch simulator?

I have used the simulator while testing my app. Now I would like to test my app on the Apple Watch simulator. I followed these steps to test this:
Created a project named 'MyfirstApp' (iOS > Application > Single View Application) and clicked Next
Entered the project name, organization name, organization identifier, bundle identifier, language (Objective-C) and device (Universal), and saved it to my desktop
Opened Xcode's File > New > Target > WatchKit App > Next, entered the project name, organization name, organization identifier, bundle identifier, language (Objective-C), checked both checkboxes, set Project to my project and Embed in Application to None, and clicked 'Finish'
Clicked Activate in the pop-up message.
Selected the MyfirstApp WatchKit App scheme and the iPhone 5s
Built the application
My app is displayed in the iPhone simulator and I performed a few actions in the app, but I couldn't see anything on the Apple Watch simulator except a blank black screen.
iPhone Applications do not run on the watch. You need to create a WatchKit extension and bundle it in your iPhone application. The WatchKit extension (running on the phone) will communicate with the watch to display the content that you want drawn on the watch.
