NativeScript Android API Version check

I added the Android platform to my NativeScript app using the following command:
tns platform add android
Now I cannot figure out which API version of the platform was added.
How can I find out?

The platform add android command fetches all the files necessary to start building apps for Android. I'll assume that you are asking about the compileSdk version of Android apps; that is determined at build time.
When you execute tns build android or tns run android, the latest SDK version available on your system is used unless the --compileSdk flag (e.g. --compileSdk 21/22/23/24/25) is specified.
So, for example, if I just recently downloaded Android SDK Build-Tools and SDK Platform 25 from the Android SDK Manager, the application package uploaded to the device will be built against platform 25.
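For example, to pin the compile SDK explicitly rather than rely on what the SDK Manager has installed (the version number here is illustrative):
tns build android --compileSdk 25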
There is a good Medium article about compileSdk, targetSdk and minSdk that I recommend you read: https://medium.com/google-developers/picking-your-compilesdkversion-minsdkversion-targetsdkversion-a098a0341ebd#.eoe0x9isx
Good luck!

The latest docs cover the platform module here:
https://docs.nativescript.org/angular/ng-framework-modules/platform
import { Component } from "@angular/core";
import { isAndroid, isIOS, device, screen } from "tns-core-modules/platform";

class DeviceInfo {
    constructor(
        public model: string,
        public deviceType: string,
        public os: string,
        public osVersion: string,
        public sdkVersion: string,
        public language: string,
        public manufacturer: string,
        public uuid: string
    ) { }
}

class ScreenInfo {
    constructor(
        public heightDIPs: number,
        public heightPixels: number,
        public scale: number,
        public widthDIPs: number,
        public widthPixels: number
    ) { }
}

@Component({
    moduleId: module.id,
    templateUrl: "./platform-module-example.html"
})
export class PlatformModuleExampleComponent {
    public isItemVisible: boolean = false;
    public deviceInformation: DeviceInfo;
    public isItemVisibleScreenInfo: boolean = false;
    public screenInformation: ScreenInfo;
    public deviceInfoButton: string = "Show device info";
    public screenInfoButton: string = "Show/Hide screen info";

    constructor() {
        this.deviceInformation = new DeviceInfo(
            device.model,
            device.deviceType,
            device.os,
            device.osVersion,
            device.sdkVersion,
            device.language,
            device.manufacturer,
            device.uuid);
        this.screenInformation = new ScreenInfo(
            screen.mainScreen.heightDIPs,
            screen.mainScreen.heightPixels,
            screen.mainScreen.scale,
            screen.mainScreen.widthDIPs,
            screen.mainScreen.widthPixels);
    }

    public checkPlatformType(args) {
        let message = "";
        if (isAndroid) {
            message = "You are using an Android device";
        } else if (isIOS) {
            message = "You are using an iOS device";
        }
        alert(message);
    }

    public deviceInfo(args) {
        if (this.isItemVisible) {
            this.isItemVisible = false;
            this.deviceInfoButton = "Show device info";
        } else {
            this.isItemVisible = true;
            this.deviceInfoButton = "Hide device info";
        }
    }

    public screenInfo(args) {
        if (this.isItemVisibleScreenInfo) {
            this.isItemVisibleScreenInfo = false;
            this.screenInfoButton = "Show screen info";
        } else {
            this.isItemVisibleScreenInfo = true;
            this.screenInfoButton = "Hide screen info";
        }
    }
}
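To answer the original question at runtime: on Android, device.sdkVersion reports the API level of the device the app is running on, so a quick check looks like this (a minimal sketch using the same platform module as above):

import { isAndroid, device } from "tns-core-modules/platform";

if (isAndroid) {
    // On Android, sdkVersion is the API level as a string, e.g. "25"
    console.log("Android API level: " + device.sdkVersion);
}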

Related

Broadcast Extension with ReplayKit Xamarin iOS not displaying device screen

I created a Xamarin.Forms project, added a Broadcast Upload Extension and an Extension UI, and referenced them in the iOS project. I also followed the instructions in the doc here:
https://learn.microsoft.com/en-us/xamarin/ios/platform/extensions#container-app-project-requirements
I use Agora for screen sharing, but the device shares its camera instead of its screen.
I followed this sample:
https://github.com/DreamTeamMobile/Xamarin.Agora.Samples/tree/master/ScreenSharing
and added a dependency-service interface to invoke it from Xamarin.Forms. Everything works except that the device shares its camera, not its screen.
public class AgoraServiceImplementation : IAgoraService
{
    // Fields used below (their declarations were not shown in the original snippet)
    private AgoraRtcEngineKit _agoraEngine;
    private AgoraRtcDelegate _agoraDelegate;
    private nuint _myId;

    private void JoiningCompleted(Foundation.NSString arg1, nuint arg2, nint arg3)
    {
        _myId = arg2;
        _agoraEngine.SetEnableSpeakerphone(true);
        // JoinChannelSuccess is presumably defined elsewhere in the service
        JoinChannelSuccess((uint)_myId);

        // Present the system broadcast picker, pointed at the upload extension
        var bundle = NSBundle.MainBundle.GetUrlForResource("ScreenSharingIOSExtension", "appex", "PlugIns");
        var frame = new CGRect(100, 100, 60, 60);
        var broadcastPicker = new RPSystemBroadcastPickerView(frame);
        var bundle2 = new NSBundle(bundle);
        broadcastPicker.PreferredExtension = bundle2.BundleIdentifier;
        var vc = Platform.GetCurrentUIViewController();
        vc.Add(broadcastPicker);
    }

    public void StartShareScreen(string sessionId, string agoraAPI, string token, VideoAgoraProfile profile = VideoAgoraProfile.Portrait360P, bool swapWidthAndHeight = false, bool webSdkInteroperability = false)
    {
        _agoraDelegate = new AgoraRtcDelegate(this);
        _agoraEngine = AgoraRtcEngineKit.SharedEngineWithAppIdAndDelegate(agoraAPI, _agoraDelegate);
        _agoraEngine.EnableWebSdkInteroperability(webSdkInteroperability);
        _agoraEngine.SetChannelProfile(ChannelProfile.LiveBroadcasting);
        _agoraEngine.SetClientRole(ClientRole.Broadcaster);
        _agoraEngine.EnableVideo();
        if (!string.IsNullOrEmpty(AgoraSettings.Current.EncryptionPhrase))
        {
            _agoraEngine.SetEncryptionMode(AgoraSettings.Current.EncryptionType.GetModeString());
            _agoraEngine.SetEncryptionSecret(helper.AgoraSettings.Current.EncryptionPhrase);
        }
        _agoraEngine.StartPreview();
        Join();
    }

    private async void Join()
    {
        var token = await AgoraTokenService.GetRtcToken(helper.AgoraSettings.Current.RoomName);
        if (string.IsNullOrEmpty(token))
        {
            // something went wrong
        }
        else
        {
            _agoraEngine.JoinChannelByToken(token, AgoraSettings.Current.RoomName, null, 0, JoiningCompleted);
        }
    }
}
Any help?

Custom renderer for LibVLCSharp VideoView in Mac and UWP (Xamarin.Forms)

I'm trying to use the VideoView from LibVLCSharp for Mac to create a custom renderer in Xamarin.Forms, in order to play a video in a Xamarin.Forms Mac application. So far I only get audio but no video.
This is my VideoPlayerRenderer implementation for Mac:
[assembly: ExportRenderer(typeof(Player.VideoPlayer), typeof(Player.Mac.VideoPlayerRenderer))]
namespace Player.Mac
{
    public class VideoPlayerRenderer : ViewRenderer<VideoPlayer, LibVLCSharp.Platforms.Mac.VideoView>
    {
        LibVLCSharp.Platforms.Mac.VideoView video_view;

        public VideoPlayerRenderer()
        {
        }

        protected override void OnElementChanged(ElementChangedEventArgs<VideoPlayer> e)
        {
            base.OnElementChanged(e);
            if (e.OldElement != null)
            {
            }
            if (e.NewElement != null)
            {
                if (Control == null)
                {
                    video_view = new LibVLCSharp.Platforms.Mac.VideoView();
                    video_view.MediaPlayer = e.NewElement.get_media_player();
                    SetNativeControl(video_view);
                }
            }
        }
    }
}
And the VideoPlayer Xamarin.Forms view:
public class VideoPlayer : View
{
    LibVLC lib_vlc;
    MediaPlayer media_player;

    public VideoPlayer()
    {
    }

    public void init(LibVLC lib_vlc, MediaPlayer media_player)
    {
        this.lib_vlc = lib_vlc;
        this.media_player = media_player;
    }

    public void play()
    {
        this.media_player.Play();
    }

    public MediaPlayer get_media_player()
    {
        return this.media_player;
    }
}
I've tried the same approach on UWP, and there I get neither audio nor video. So I'm wondering whether this is the wrong direction and, if so, how you are supposed to use LibVLCSharp on Mac/UWP.
You don't have to create your own renderer, since there already is one.
From the LibVLCSharp.Forms documentation:
This package also contains the views for the following platforms:
Android
iOS
Mac
UWP support for Xamarin.Forms currently has blockers that are expected to be solved by the LVS 4 / libvlc 4 release. See this issue for a detailed explanation.
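On Mac, using the packaged view looks roughly like this (a minimal sketch against the LibVLCSharp 3.x API; the page class and stream URL are illustrative, and Core.Initialize() must run before any LibVLC object is created):

using System;
using LibVLCSharp.Shared;
using Xamarin.Forms;

public class PlayerPage : ContentPage
{
    readonly LibVLC _libVLC;
    readonly MediaPlayer _mediaPlayer;

    public PlayerPage()
    {
        Core.Initialize(); // load the native libvlc libraries

        _libVLC = new LibVLC();
        _mediaPlayer = new MediaPlayer(_libVLC);

        // The view shipped in LibVLCSharp.Forms; no custom renderer needed
        var videoView = new LibVLCSharp.Forms.Shared.VideoView
        {
            MediaPlayer = _mediaPlayer,
            HorizontalOptions = LayoutOptions.FillAndExpand,
            VerticalOptions = LayoutOptions.FillAndExpand
        };
        Content = videoView;

        // Illustrative sample stream
        _mediaPlayer.Play(new Media(_libVLC, new Uri("http://example.com/sample.mp4")));
    }
}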

Xamarin UI Testing

I would like to add UI tests (Xamarin.UITest) to my Xamarin app. I get the splash screen displaying on the Android emulator and then the landing page loads, but on the next page the app seems to close. I have a breakpoint around the Platform.Android branch and it hits, but execution never reaches the end of that method. I added WaitTimes of 5 minutes and I get the timeout exception.
Any idea why the app closes?
static class AppManager
{
    private const string ApkPath = @"C:\pathtoapkfile.apk";

    static IApp app;
    public static IApp App
    {
        get
        {
            if (app == null)
                throw new NullReferenceException("'AppManager.App' not set. Call 'AppManager.StartApp()' before trying to access it.");
            return app;
        }
    }

    static Platform? platform;
    public static Platform Platform
    {
        get
        {
            if (platform == null)
                throw new NullReferenceException("'AppManager.Platform' not set.");
            return platform.Value;
        }
        set { platform = value; }
    }

    public static void StartApp()
    {
        if (Platform == Platform.Android)
        {
            app = ConfigureApp
                .Android
                .ApkFile(ApkPath)
                .WaitTimes(new WaitTimes())
                .StartApp(AppDataMode.Clear);
        }
        if (Platform == Platform.iOS)
        {
            app = ConfigureApp
                .iOS
                .StartApp(AppDataMode.Clear);
        }
    }
}
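For context, the standard Xamarin.UITest setup drives this AppManager from an NUnit fixture like the following (a sketch based on the usual template; the class and test names are illustrative):

using NUnit.Framework;
using Xamarin.UITest;

[TestFixture(Platform.Android)]
[TestFixture(Platform.iOS)]
public class Tests
{
    readonly Platform platform;

    public Tests(Platform platform)
    {
        this.platform = platform;
    }

    [SetUp]
    public void BeforeEachTest()
    {
        // StartApp() installs and launches the app before every test
        AppManager.Platform = platform;
        AppManager.StartApp();
    }

    [Test]
    public void AppLaunches()
    {
        // Capture whatever screen the app lands on after launch
        AppManager.App.Screenshot("First screen");
    }
}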

AIR NativeProcess on Mac gives Error:3219, all solutions failing

I've read most of the solutions for this error and none seem to apply.
I'm running a basic AS3 app in Flash Builder, on OS X.
The descriptor is set to extendedDesktop.
I have set the profile in FB to 'extendedDesktop'.
I am publishing as a 'signed native installer'.
I've tried launching the file from both:
app:/demo.sh
file:///Users/visualife/Desktop/AE/demo.sh
The target file is set to 777 (executable).
The target file runs fine when targeted directly.
I'm running the app on the same OS and machine it was created on.
Changing the 'demo.sh' file to a jpg etc. doesn't change anything.
No matter what I try, I'm told native process is supported and everything runs fine until start() is called; then an Error 3219 is thrown with no further information.
All help greatly appreciated!
I've included my code below:
package {
    import flash.desktop.NativeProcess;
    import flash.desktop.NativeProcessStartupInfo;
    import flash.display.Sprite;
    import flash.errors.IllegalOperationError;
    import flash.events.Event;
    import flash.events.IOErrorEvent;
    import flash.events.NativeProcessExitEvent;
    import flash.events.ProgressEvent;
    import flash.filesystem.File;
    import flash.text.TextField;

    public class VauxhallController extends Sprite {
        private var debug_txt:TextField;
        public var process:NativeProcess;
        private var sh:File;

        public function VauxhallController() {
            if (stage) {
                init();
            } else {
                this.addEventListener(Event.ADDED_TO_STAGE, init);
            }
        }

        private function init($e:Event=null):void {
            this.removeEventListener(Event.ADDED_TO_STAGE, init);
            build();
            if (NativeProcess.isSupported) {
                initListeners();
                debugMe("Native process supported");
                go();
            } else {
                debugMe("Native not supported");
            }
        }

        private function build():void {
            // debug output field
            debug_txt = new TextField();
            debug_txt.width = 300;
            debug_txt.height = 600;
            this.addChild(debug_txt);
        }

        private function initListeners():void { }

        private function go():void {
            runShellFile();
        }

        private function runShellFile():void {
            debugMe("runShellFile");
            var nativeProcessStartupInfo:NativeProcessStartupInfo = new NativeProcessStartupInfo();
            var file:File = File.desktopDirectory.resolvePath("AE/demo.sh");
            debugMe("path|" + file.url);
            nativeProcessStartupInfo.executable = file;
            nativeProcessStartupInfo.workingDirectory = File.desktopDirectory;

            process = new NativeProcess();
            process.addEventListener(ProgressEvent.STANDARD_OUTPUT_DATA, onOutputData);
            process.addEventListener(ProgressEvent.STANDARD_ERROR_DATA, onErrorData);
            process.addEventListener(NativeProcessExitEvent.EXIT, onExit);
            process.addEventListener(IOErrorEvent.STANDARD_OUTPUT_IO_ERROR, onIOError);
            process.addEventListener(IOErrorEvent.STANDARD_ERROR_IO_ERROR, onIOError);
            try {
                process.start(nativeProcessStartupInfo);
            } catch (error:IllegalOperationError) {
                debugMe(error.toString());
            } catch (error:ArgumentError) {
                debugMe(error.toString());
            } catch (error:Error) {
                debugMe(error.toString());
            }
            debugMe("# DONE");
        }

        public function onOutputData(event:ProgressEvent):void { debugMe("Got: " + process.standardOutput.readUTFBytes(process.standardOutput.bytesAvailable)); }
        public function onErrorData(event:ProgressEvent):void { debugMe("ERROR: " + process.standardError.readUTFBytes(process.standardError.bytesAvailable)); }
        public function onExit(event:NativeProcessExitEvent):void { debugMe("Process exited with: " + event.exitCode); }
        public function onIOError(event:IOErrorEvent):void { debugMe("IOError: " + event.toString()); }
        private function debugMe(_str:String):void { debug_txt.appendText(_str + "\n"); }
    }
}
Have you read this article?
http://www.actionscripterrors.com/?p=2527
It suggests declaring the extended desktop profile in your application descriptor:
<supportedProfiles>extendedDesktop desktop</supportedProfiles>
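For reference, that element lives in the AIR application descriptor as a direct child of <application> (a sketch; the namespace version, id, and file names are illustrative):

<?xml version="1.0" encoding="utf-8"?>
<application xmlns="http://ns.adobe.com/air/application/3.1">
    <id>com.example.demo</id>
    <versionNumber>1.0.0</versionNumber>
    <filename>Demo</filename>
    <!-- extendedDesktop is required for NativeProcess -->
    <supportedProfiles>extendedDesktop desktop</supportedProfiles>
    <initialWindow>
        <content>Demo.swf</content>
    </initialWindow>
</application>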
I have the same error, and in my case it is because I'm trying to open an .exe on macOS. Verify whether your demo.sh script interacts with .exe files.

How to get the Device Resolution in WP7

I want to set some icons in my app according to the device screen resolution, such as WXGA or WVGA. Many links suggest App.Current.Host.Content.ScaleFactor or Application.Current.RootVisual.RenderSize, but I can only access App.Current.Host.Content.ActualWidth or ActualHeight, and these always report 480x800 even when I run the app on a WXGA device. How do I determine the resolution correctly?
Windows Phone 7 only supports one resolution (800x480). Are you asking about Windows Phone 8? Please take a look at Multi-resolution apps for Windows Phone 8. Here is a ResolutionHelper class you can use.
public enum Resolutions { WVGA, WXGA, HD720p };

public static class ResolutionHelper
{
    private static bool IsWvga
    {
        get { return App.Current.Host.Content.ScaleFactor == 100; }
    }

    private static bool IsWxga
    {
        get { return App.Current.Host.Content.ScaleFactor == 160; }
    }

    private static bool Is720p
    {
        get { return App.Current.Host.Content.ScaleFactor == 150; }
    }

    public static Resolutions CurrentResolution
    {
        get
        {
            if (IsWvga) return Resolutions.WVGA;
            else if (IsWxga) return Resolutions.WXGA;
            else if (Is720p) return Resolutions.HD720p;
            else throw new InvalidOperationException("Unknown resolution");
        }
    }
}
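Picking a per-resolution icon with this helper then looks like the following (a short sketch; the icon file names are illustrative):

string iconPath;
switch (ResolutionHelper.CurrentResolution)
{
    case Resolutions.WVGA:
        iconPath = "Assets/icon.wvga.png";
        break;
    case Resolutions.WXGA:
        iconPath = "Assets/icon.wxga.png";
        break;
    default: // Resolutions.HD720p
        iconPath = "Assets/icon.720p.png";
        break;
}
// e.g. myImage.Source = new BitmapImage(new Uri(iconPath, UriKind.Relative));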
Your app targets Windows Phone 7.1, so you should update it to target Windows Phone 8.0 by right-clicking your project and choosing Upgrade to Windows Phone 8.0.
