NativeScript imagepicker not working on iOS :: not picking up image path?

I am using NativeScript with Angular and have a page where I photograph a receipt (or add one from the gallery), add a couple of text inputs, and send it all to the server.
Adding from the gallery works fine on Android but not on iOS.
Here is the template code:
<Image *ngIf="imageSrc" [src]="imageSrc" [width]="previewSize" [height]="previewSize" stretch="aspectFit"></Image>
<Button text="Pick from Gallery" (tap)="onSelectGalleryTap()" class="btn-outline btn-photo"> </Button>
and the component:
public onSelectGalleryTap() {
    let context = imagepicker.create({
        mode: "single"
    });
    let that = this;
    context
        .authorize()
        .then(() => {
            that.imageAssets = [];
            that.imageSrc = null;
            return context.present();
        })
        .then((selection) => {
            alert("Selection done: " + JSON.stringify(selection));
            that.imageSrc = selection.length > 0 ? selection[0] : null;
            // convert ImageAsset to ImageSource
            fromAsset(that.imageSrc).then(res => {
                var myImageSource = res;
                var base64 = myImageSource.toBase64String("jpeg", 20);
                this.expense.receipt_data = base64;
            })
            that.cameraImage = null;
            that.imageAssets = selection;
            that.galleryProvided = true;
            // set the images to be loaded from the assets with optimal sizes (optimize memory usage)
            selection.forEach(function (element) {
                element.options.width = that.previewSize;
                element.options.height = that.previewSize;
            });
        }).catch(function (e) {
            console.log(e);
        });
}
I have posted below the Android and iOS screenshots of the line:
alert("Selection done: " + JSON.stringify(selection));
On Android there is a path to the image's location in the file system, but on iOS there are just empty curly brackets where I'd expect to see the path. Then, when the form is submitted, the message that comes back is "Unable to save image", even though the image preview displays fine in the Image element on the page.
Here are the screenshots (Android first, then iOS; screenshots not reproduced here).
Any ideas why it is failing in iOS?
Thanks
==========
UPDATE
I am now saving the image to a temporary location and it is still not working on iOS. It works on Android.
Here is my code now:
import { ImageAsset } from 'tns-core-modules/image-asset';
import { ImageSource, fromAsset, fromFile } from 'tns-core-modules/image-source';
import * as fileSystem from "tns-core-modules/file-system";
...
...
public onSelectGalleryTap() {
    alert("in onSelectGalleryTap");
    var milliseconds = (new Date).getTime();
    let context = imagepicker.create({
        mode: "single"
    });
    let that = this;
    context
        .authorize()
        .then(() => {
            that.imageAssets = [];
            that.previewSrc = null;
            that.imageSrc = null;
            return context.present();
        })
        .then((selection) => {
            that.imageSrc = selection.length > 0 ? selection[0] : null;
            // convert ImageAsset to ImageSource
            fromAsset(that.imageSrc)
                .then(res => {
                    var myImageSource = res;
                    let folder = fileSystem.knownFolders.documents();
                    var path = fileSystem.path.join(folder.path, milliseconds + ".jpg");
                    var saved = myImageSource.saveToFile(path, "jpg");
                    that.previewSrc = path;
                    const imageFromLocalFile: ImageSource = <ImageSource>fromFile(path);
                    var base64 = imageFromLocalFile.toBase64String("jpeg", 20);
                    this.expense.receipt_data = base64;
                })
            that.cameraImage = null;
            that.imageAssets = selection;
            that.galleryProvided = true;
            // set the images to be loaded from the assets with optimal sizes (optimize memory usage)
            selection.forEach(function (element) {
                element.options.width = that.previewSize;
                element.options.height = that.previewSize;
            });
        }).catch(function (e) {
            console.log(e);
        });
}
Any ideas? Thanks.

This is an already-reported issue that several of us have subscribed to; check issue #321 for updates.
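For what it's worth, the empty {} in the iOS alert is expected: on iOS each selection item is an ImageAsset wrapping a native PHAsset, which has no file path and nothing enumerable for JSON.stringify to print. Separately, in the code above the fromAsset(...) promise is never awaited, so expense.receipt_data may still be empty at the moment the form is submitted. Below is a minimal sketch of how the conversion could be sequenced before anything depends on it, using the same imagepicker/fromAsset imports and component properties as in the question. This is an assumption on my part, not a confirmed fix for issue #321:
// Sketch only: resolve the ImageSource (and base64) before anything depends on it.
public onSelectGalleryTap() {
    const context = imagepicker.create({ mode: "single" });
    context.authorize()
        .then(() => context.present())
        .then((selection) => {
            if (selection.length === 0) { return; }
            this.imageSrc = selection[0];
            // Wait for the asset-to-ImageSource conversion before using the result.
            return fromAsset(selection[0]).then((imageSource) => {
                this.expense.receipt_data = imageSource.toBase64String("jpeg", 20);
                this.galleryProvided = true;
            });
        })
        .catch((e) => console.log(e));
}
If receipt_data is reliably populated at this point and the server still answers "Unable to save image", the problem is more likely in how the payload is uploaded than in the picker itself.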

Related

Copy or download a generated QR code (vue-qrcode) with Vue.js

I use the plugin "vue-qrcode" to generate a QR code for my users that links to their public profile, so they can share it e.g. on their business card.
The idea is to give my users a button to download the QR code and another button to copy the QR code to the clipboard, to make it easier to send it e.g. via mail (especially for smartphone users).
Problem: I don't know how I can download or copy the QR code. Does anybody know how to copy or download it? Currently I use 'vue-clipboard2' to copy links etc., but it seems it can't copy images.
I use the code below to display the QR code on our site:
<template>
  <qrcode-vue
    style="cursor: pointer;"
    :value="linkToProfile"
    :size="160"
    level="M"
    :background="backgroundcolor_qrcode"
    :foreground="color_qrcode"
  ></qrcode-vue>
</template>
<script>
import Vue from 'vue'
import QrcodeVue from 'qrcode.vue'
Vue.component('qrcode-vue', QrcodeVue)
export default {
  data: () => ({
    linkToProfile: "http://www.example.com/johnDoe",
  }),
}
</script>
Thanks -
Christian
I figured it out. The solution looks like this:
TEMPLATE AREA:
<qrcode-vue
  id="qrblock"
  :value="linkToSki"
  :size="220"
  level="M"
  ref="qrcode"
></qrcode-vue>
FUNCTIONS AREA:
// -- FUNCTIONS AREA TO COPY / DOWNLOAD QR CODE - START ---
function selectText(element) {
    if (document.body.createTextRange) {
        const range = document.body.createTextRange();
        range.moveToElementText(element);
        range.select();
    } else if (window.getSelection) {
        const selection = window.getSelection();
        const range = document.createRange();
        range.selectNodeContents(element);
        selection.removeAllRanges();
        selection.addRange(range);
    }
}
function copyBlobToClipboardFirefox(href) {
    const img = document.createElement('img');
    img.src = href;
    const div = document.createElement('div');
    div.contentEditable = true;
    div.appendChild(img);
    document.body.appendChild(div);
    selectText(div);
    const done = document.execCommand('Copy');
    document.body.removeChild(div);
    return done;
}
function copyBlobToClipboard(blob) {
    // eslint-disable-next-line no-undef
    const clipboardItemInput = new ClipboardItem({
        'image/png': blob
    });
    return navigator.clipboard
        .write([clipboardItemInput])
        .then(() => true)
        .catch(() => false);
}
function downloadLink(name, href) {
    const a = document.createElement('a');
    a.download = name;
    a.href = href;
    document.body.append(a); // the anchor must be appended before clicking it
    a.click();
    a.remove();
}
// -- FUNCTIONS AREA TO COPY / DOWNLOAD QR CODE - END ---
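The helpers above don't show how they get hold of the QR image itself. Below is a small sketch of two button handlers that could sit in the component's methods, assuming qrcode.vue renders a <canvas> inside the element referenced by ref="qrcode" and that the helper functions above are in scope; the method names and the "qrcode.png" file name are just illustrative:
// Sketch: methods that feed the QR canvas into the helpers above.
methods: {
    downloadQrCode() {
        // The rendered canvas sits inside the qrcode-vue component's root element.
        const canvas = this.$refs.qrcode.$el.querySelector('canvas');
        downloadLink('qrcode.png', canvas.toDataURL('image/png'));
    },
    copyQrCode() {
        const canvas = this.$refs.qrcode.$el.querySelector('canvas');
        canvas.toBlob((blob) => {
            copyBlobToClipboard(blob).then((ok) => {
                if (!ok) {
                    // Fallback for browsers without ClipboardItem support.
                    copyBlobToClipboardFirefox(canvas.toDataURL('image/png'));
                }
            });
        }, 'image/png');
    },
},
ClipboardItem is not available in every browser, which is presumably why the execCommand-based Firefox fallback is kept around.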

How to use the MediaFilePicker and PhotoEditor plugins in NativeScript

I am trying to use MediaFilePicker in NativeScript and, at the same time, use the PhotoEditor plugin to crop/edit the photo taken with the camera, but I can't make it work... Here is part of my code:
let options: ImagePickerOptions = {
    android: {
        isCaptureMood: true, // if true then camera will open directly.
        isNeedCamera: true,
        maxNumberFiles: 1,
        isNeedFolderList: true
    },
    ios: {
        isCaptureMood: true, // if true then camera will open directly.
        maxNumberFiles: 1
    }
};
let mediafilepicker = new Mediafilepicker();
mediafilepicker.openImagePicker(options);
mediafilepicker.on("getFiles", function (res) {
    let results = res.object.get('results');
    let result = results[0];
    let source = new imageSourceModule.ImageSource();
    source.fromAsset(result.rawData).then((source) => {
        const photoEditor = new PhotoEditor();
        photoEditor.editPhoto({
            imageSource: source,
            hiddenControls: [],
        }).then((newImage) => {
        }).catch((e) => {
            reject();
        });
    });
});
The result object from the file picker looks like this:
{
    "type": "capturedImage",
    "file": {},
    "rawData": "[Circular]"
}
I believe that if the picture was taken with the camera I should use the rawData field, but I don't know what format it comes in or how to hand it to the PhotoEditor plugin to work with it.
Any suggestions?
Thanks!
The issue is on this line: source.fromAsset(result.rawData). Here, result.rawData is not an ImageAsset but a PHAsset. You have to create an ImageAsset from the PHAsset and pass that to fromAsset. It would look like this:
import { ImageAsset } from "tns-core-modules/image-asset";
....
....
imgSource.fromAsset(new ImageAsset(img)).then((source) => {
    const photoEditor = new PhotoEditor();
    console.log(source === imgSource);
    photoEditor.editPhoto({
        imageSource: source,
        hiddenControls: [],
    }).then((newImage: ImageSource) => {
        console.log('Get files...');
        // Here you can save newImage, send it to your backend or simply display it in your app
    }).catch((e) => {
        //reject();
    });
});
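As a follow-up to the "Here you can save newImage..." comment, here is a small sketch of what persisting the edited result to a file could look like, assuming the usual tns-core-modules file-system and image-source modules; the function and the "edited-photo.jpg" file name are made up for illustration:
import * as fs from "tns-core-modules/file-system";
import { ImageSource } from "tns-core-modules/image-source";

// Sketch: persist the edited ImageSource returned by the PhotoEditor promise.
function saveEditedImage(newImage: ImageSource): string {
    const folder = fs.knownFolders.documents();
    const filePath = fs.path.join(folder.path, "edited-photo.jpg"); // hypothetical file name
    const saved = newImage.saveToFile(filePath, "jpg");
    console.log("saved:", saved, "->", filePath);
    return filePath; // can be used as the src of an Image view or uploaded to a backend
}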

Spot the difference between these two images

Programmatically, my code is detecting a difference between two classes of images, and always rejecting one class, while always allowing the other.
I have yet to find any difference between the images that yield the error and the ones that don't. But there has to be some difference, because the ones that yield an error do so 100% of the time, and the others work as expected 100% of the time.
In particular, I have inspected the color format (RGB in both groups), the size (no notable difference), the datatype (uint8 in both), and the magnitude of the pixel values (similar in both).
Below are two images that never work, followed by two images that always work:
This image never works: https://www.colourbox.com/preview/11906131-maple-tree-and-grass-silhouette.jpg
This image never works: http://feldmanphoto.com/wp-content/uploads/awe-inspiring-house-clipart-black-and-white-disney-coloring-pages-big-clipartxtras-illistration-background-housewives-bouncy.jpeg
This image always works: http://www.spacedesign.us/wp-content/uploads/landscape-with-old-tree-and-grass-over-white-background-black-and-black-and-white-trees.jpg
This image always works: http://www.modernhouse.co/wp-content/uploads/2017/07/1024px-RoseSeidlerHouseSulmanPrize.jpg
How can I spot the difference?
The scenario is that I am using Firebase with a Swift iOS front end to send these images to a convnet hosted on Google Cloud ML Engine. Some images work all the time and certain others never work, as above. Further, all images work when I use the gcloud versions predict CLI. To me the issue is necessarily something in the images, hence I am posting here for the imaging group. The code is included, as requested, for completeness.
Here is the code of the index.js file:
'use strict';
const functions = require('firebase-functions');
const gcs = require('@google-cloud/storage');
const admin = require('firebase-admin');
const exec = require('child_process').exec;
const path = require('path');
const fs = require('fs');
const google = require('googleapis');
const sizeOf = require('image-size');

admin.initializeApp(functions.config().firebase);
const db = admin.firestore();
const rtdb = admin.database();
const dbRef = rtdb.ref();

function cmlePredict(b64img) {
    return new Promise((resolve, reject) => {
        google.auth.getApplicationDefault(function (err, authClient) {
            if (err) {
                reject(err);
            }
            if (authClient.createScopedRequired && authClient.createScopedRequired()) {
                authClient = authClient.createScoped([
                    'https://www.googleapis.com/auth/cloud-platform'
                ]);
            }
            var ml = google.ml({
                version: 'v1'
            });
            const params = {
                auth: authClient,
                name: 'projects/myproject-18865/models/my_model',
                resource: {
                    instances: [
                        {
                            "image_bytes": {
                                "b64": b64img
                            }
                        }
                    ]
                }
            };
            ml.projects.predict(params, (err, result) => {
                if (err) {
                    reject(err);
                } else {
                    resolve(result);
                }
            });
        });
    });
}

function resizeImg(filepath) {
    return new Promise((resolve, reject) => {
        exec(`convert ${filepath} -resize 224x ${filepath}`, (err) => {
            if (err) {
                console.error('Failed to resize image', err);
                reject(err);
            } else {
                console.log('resized image successfully');
                resolve(filepath);
            }
        });
    });
}

exports.runPrediction = functions.storage.object().onChange((event) => {
    fs.rmdir('./tmp/', (err) => {
        if (err) {
            console.log('error deleting tmp/ dir');
        }
    });
    const object = event.data;
    const fileBucket = object.bucket;
    const filePath = object.name;
    const bucket = gcs().bucket(fileBucket);
    const fileName = path.basename(filePath);
    const file = bucket.file(filePath);
    if (filePath.startsWith('images/')) {
        const destination = '/tmp/' + fileName;
        console.log('got a new image', filePath);
        return file.download({
            destination: destination
        }).then(() => {
            if (sizeOf(destination).width > 224) {
                console.log('scaling image down...');
                return resizeImg(destination);
            } else {
                return destination;
            }
        }).then(() => {
            console.log('base64 encoding image...');
            let bitmap = fs.readFileSync(destination);
            return new Buffer(bitmap).toString('base64');
        }).then((b64string) => {
            console.log('sending image to CMLE...');
            return cmlePredict(b64string);
        }).then((result) => {
            console.log(`results just returned and is: ${result}`);
            let predict_proba = result.predictions[0]
            const res_pred_val = Object.keys(predict_proba).map(k => predict_proba[k])
            const res_val = Object.keys(result).map(k => result[k])
            const class_proba = [1 - res_pred_val, res_pred_val]
            const opera_proba_init = 1 - res_pred_val
            const capitol_proba_init = res_pred_val - 0
            // convert fraction double to percentage int
            let opera_proba = (Math.floor((opera_proba_init.toFixed(2)) * 100)) | 0
            let capitol_proba = (Math.floor((capitol_proba_init.toFixed(2)) * 100)) | 0
            let feature_list = ["houses", "trees"]
            let outlinedImgPath = '';
            let imageRef = db.collection('predicted_images').doc(filePath.slice(7));
            outlinedImgPath = `outlined_img/${filePath.slice(7)}`;
            imageRef.set({
                image_path: outlinedImgPath,
                opera_proba: opera_proba,
                capitol_proba: capitol_proba
            });
            let predRef = dbRef.child("prediction_categories");
            let arrayRef = dbRef.child("prediction_array");
            predRef.set({
                opera_proba: opera_proba,
                capitol_proba: capitol_proba,
            });
            arrayRef.set({
                first: {
                    array_proba: [opera_proba, capitol_proba],
                    brief_description: ["a", "b"],
                    more_details: ["aaaa", "bbbb"],
                    feature_list: feature_list
                },
                zummy1: "",
                zummy2: ""
            });
            return bucket.upload(destination, {destination: outlinedImgPath});
        });
    } else {
        return 'not a new image';
    }
});
The issue was that the bad images were grayscale, not RGB as expected by my model. I had initially checked this by looking at the shape, but the 'bad' images do have 3 color channels; each of the 3 channels just stores the same number, so my model was refusing to accept them. Also, as expected, and contrary to what I initially thought I had observed, it turns out the gcloud ML-engine predict CLI also failed for these images. Took me 2 days to figure this out!
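For anyone wanting to detect this case up front, here is a small sketch of a channel-equality check that could run before sending an image to the model. It assumes you already have the decoded pixels as an interleaved RGB (or RGBA) byte buffer, however you obtain them; the function name and sampling step are arbitrary:
// Sketch: returns true if every sampled pixel has R === G === B,
// i.e. the image is effectively grayscale even though it has 3 channels.
function looksGrayscale(pixels, channels = 3, sampleStep = 10) {
    for (let i = 0; i + 2 < pixels.length; i += channels * sampleStep) {
        const r = pixels[i];
        const g = pixels[i + 1];
        const b = pixels[i + 2];
        if (r !== g || g !== b) {
            return false; // found a genuinely colored pixel
        }
    }
    return true;
}
If the check fires, the image can either be rejected early with a clear error or converted to whatever channel layout the model was trained on.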

NativeScript: how to save an image to a file in a custom component

I have created a custom component that accesses the device's camera to snap a picture, sets it as the source of an Image view, and then saves it to a file.
Here is the JavaScript code.
CAMERA.JS
Here is the initialization of the imageView
export function cameraLoaded(args): void {
    cameraPage = <Page>args.object;
    imageView = <Image>cameraPage.getViewById("img_upload"); ...
}
Here I set the imageView's source to the picture that was just taken:
export function takePicture(): void {
    camera.takePicture({})
        .then(function (picture) {
            imageView.imageSource = picture;
        }, function (error) {
        });
}
This works perfectly.
Now I try to save the picture to a file.
export function saveToFile(): void {
    try {
        let saved = imageView.imageSource.saveToFile(path, enums.ImageFormat.png);
        HalooseLogger.log(saved, "upload");
    } catch (e) {
        ...
    }
}
Here I get the error "cannot read property saveToFile of undefined".
This is very unusual; in fact, if I console.log(imageView), the output is:
Image<img_upload>#file:///app/views/camera/camera.xml:4:5;
but if I console.log(imageView.imageSource) I see it is undefined.
How is this possible? What am I doing wrong?
ADDITIONAL INFO
The previous code and related XML are included in another view as follows:
MAIN.XML
<Page
    xmlns="http://schemas.nativescript.org/tns.xsd"
    xmlns:cameraPage="views/camera"
    loaded="loaded">
    <StackLayout orientation="vertical">
        <StackLayout id="mainContainer">
            <!-- DYNAMIC CONTENT GOES HERE -->
        </StackLayout>
    </StackLayout>
</Page>
MAIN.JS
This is where the camera view gets loaded dynamically:
export function loaded(args): void {
    mainPage = <Page>args.object;
    contentWrapper = mainPage.getViewById("mainContainer");
    DynamicLoaderService.loadPage(mainPage, contentWrapper, mainViewModel.currentActive);
}
The loadPage method does the following:
public static loadPage(pageElement, parentElement, currentActive): void {
    let component = Builder.load({
        path: "views/camera",
        name: "camera",
        page: pageElement
    });
    parentElement.addChild(component);
}
The thing is that, as of NativeScript 2.4.0, the Image created for Android will always return null for its imageSource property. Optimisations are under way to prevent out-of-memory issues when working with multiple large images, and that is why image-asset was introduced in NativeScript 2.4.0.
Now, I am not sure if you are using the latest nativescript-camera (highly recommended), but if so, you should consider that the promise from takePicture() returns an ImageAsset. Due to the memory optimisation, imageSource will always return undefined (for Android) unless you specifically create one. You can do that with the fromAsset() method, providing the ImageAsset returned from the camera callback.
Example:
import { EventData } from 'data/observable';
import { Page } from 'ui/page';
import { Image } from "ui/image";
import { ImageSource, fromAsset } from "image-source";
import { ImageAsset } from "image-asset";
import * as camera from "nativescript-camera";
import * as fs from "file-system";

var imageModule = require("ui/image");
var img;
var myImageSource: ImageSource;

// Event handler for Page "navigatingTo" event attached in main-page.xml
export function onLoaded(args: EventData) {
    // Get the event sender
    let page = <Page>args.object;
    img = <Image>page.getViewById("img");
    camera.requestPermissions();
}

export function takePhoto() {
    camera.takePicture()
        .then(imageAsset => {
            console.log("Result is an image asset instance");
            img.src = imageAsset;
            fromAsset(imageAsset).then(res => {
                myImageSource = res;
                console.log(myImageSource);
            })
        }).catch(function (err) {
            console.log("Error -> " + err.message);
        });
}

export function saveToFile(): void {
    var knownPath = fs.knownFolders.documents();
    var folderPath = fs.path.join(knownPath.path, "CosmosDataBank");
    var folder = fs.Folder.fromPath(folderPath);
    var path = fs.path.join(folderPath, "Test.png");
    var saved = myImageSource.saveToFile(path, "png");
    console.log(saved);
}
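One caveat worth adding to the example: myImageSource is only set once the fromAsset() promise inside takePhoto() has resolved, so calling saveToFile() before that point hits exactly the "saveToFile of undefined" error from the question. A small guard sketch, reusing the same fs import, folder and file names as the example above:
export function saveToFile(): void {
    // Sketch: bail out if the photo has not been converted to an ImageSource yet.
    if (!myImageSource) {
        console.log("No ImageSource yet - take a photo and wait for fromAsset() to resolve first.");
        return;
    }
    var folderPath = fs.path.join(fs.knownFolders.documents().path, "CosmosDataBank");
    fs.Folder.fromPath(folderPath); // ensures the folder exists
    var filePath = fs.path.join(folderPath, "Test.png");
    console.log("saved:", myImageSource.saveToFile(filePath, "png"));
}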

How to exit the app by double-tapping the back button in NativeScript apps

How do I exit the app on a double tap of the back button in NativeScript?
Please help me with a snippet of code.
Here is the solution that I have found:
var frameModule = require("ui/frame");
var application = require("application");

var activity = application.android.startActivity ||
    application.android.foregroundActivity ||
    frameModule.topmost().android.currentActivity ||
    frameModule.topmost().android.activity;

var lastPress;
activity.onBackPressed = function () {
    var timeDelay = 500;
    if (lastPress + timeDelay > java.lang.System.currentTimeMillis()) {
        var startMain = new android.content.Intent(android.content.Intent.ACTION_MAIN);
        startMain.addCategory(android.content.Intent.CATEGORY_HOME);
        startMain.setFlags(android.content.Intent.FLAG_ACTIVITY_NEW_TASK);
        activity.startActivity(startMain);
        // If you want to kill the app totally, use these codes instead of above
        // activity.finish();
        // java.lang.System.exit(0);
    } else {
        frameModule.topmost().goBack();
    }
    lastPress = java.lang.System.currentTimeMillis();
}
Hope this helps.
For Angular:
import * as application from 'tns-core-modules/application';

exitapp = 0;

ngOnInit() {
    application.android.on(application.AndroidApplication.activityBackPressedEvent, this.handleBackButton, this);
}

handleBackButton(args: any) {
    this.ngZone.run(() => {
        args.cancel = true;
        if (this.routerExtensions.canGoBack()) {
            this.routerExtensions.back();
        } else {
            this.exitapp++;
            this.getData.toast('Press again to exit');
            if (this.exitapp == 2) {
                application.android.foregroundActivity.finish();
            }
            setTimeout(() => {
                this.exitapp = 0;
            }, 2000);
        }
    });
}
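A variation on the Angular handler above uses a timestamp window instead of a counter, so the "double tap" has to happen within a fixed interval. The 2000 ms window, the toast helper and the injected services are the same assumptions as in the snippet above; this is a sketch, not a drop-in replacement:
// Sketch: exit only if the second back press arrives within EXIT_WINDOW_MS.
private lastBackPress = 0;
private readonly EXIT_WINDOW_MS = 2000;

handleBackButton(args: any) {
    this.ngZone.run(() => {
        args.cancel = true;
        if (this.routerExtensions.canGoBack()) {
            this.routerExtensions.back();
            return;
        }
        const now = Date.now();
        if (now - this.lastBackPress < this.EXIT_WINDOW_MS) {
            application.android.foregroundActivity.finish(); // second press: close the activity
        } else {
            this.lastBackPress = now;
            this.getData.toast('Press again to exit'); // same toast helper as above
        }
    });
}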
