In my NS 6.2 Core app I am trying to load an image that comes from the backend as a base64 string. This string needs to be converted to an imageSource and displayed in an image element. I am testing on an Android physical device.
So far I tried:
const img = <Image>e.object;
let imageSource = new ImageSource();
var isLoaded = imageSource.loadFromBase64('data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0i...');
if (isLoaded) {
    img.imageSource = imageSource;
}
The base64 string is confirmed to be valid, but isLoaded is always false.
I also tried:
const s: string = "data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiIHN0YW5kYWxvbmU9Im5vIj8...';
imageSource.fromBase64(s).then((loaded: boolean) => {
    img.imageSource = imageSource;
});
This did not work either: no errors, but nothing is loaded into the Image.
Last thing I tried was:
import { fromBase64, ImageSource } from "tns-core-modules/image-source";
const s: string = "data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiIHN0YW5kYWxvbmU9Im5vIj8...';
img.imageSource = fromBase64(s);
This produces an error:
System.err: An uncaught Exception occurred on "main" thread.
System.err: Calling js method onCreateView failed
System.err: Error: java.lang.IllegalArgumentException: bad base-64
I am out of ideas. Any help would be greatly appreciated.
You can try splitting your base64 string to remove the data:image/svg+xml;base64, prefix.
let base64 = 'data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0i...';
// Keep only the part after the comma, i.e. the bare base64 payload.
let data = base64.split(",");
let isLoaded = imageSource.loadFromBase64(data[1]);
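For reference, a minimal sketch of the full flow using the module-level fromBase64 import already shown in the question; the handler name and the placeholder data URI are assumptions for illustration only:
import { fromBase64 } from "tns-core-modules/image-source";
import { Image } from "tns-core-modules/ui/image";

// Assumed handler wired to the Image's loaded event; not from the original post.
export function onImageLoaded(e) {
    const img = <Image>e.object;
    // Placeholder data URI standing in for the string coming from the backend.
    const dataUri = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUg...";
    // Strip the "data:...;base64," prefix so only the bare base64 payload remains.
    const base64Payload = dataUri.split(",")[1];
    const imageSource = fromBase64(base64Payload);
    if (imageSource) {
        img.imageSource = imageSource;
    }
}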
Related
Having trouble with using the Image widget from the pdf library
Current Version:
pdf: ^3.3.0
code:
final imageA = PdfImage.file(
  pdf.document,
  bytes: File('lib/Images/rocket14.jpg').readAsBytesSync(),
);
child: pw.Image(pw.ImageImage(pdfimageA)),
Error:
Unhandled Exception: type 'PdfImage' is not a subtype of type 'Image'
I have absolutely no idea how to parse a PdfImage into an Image; initially I had
UPDATE:
Using
Future<MemoryImage> convtoImage(String name) async => pw.MemoryImage(
      (await rootBundle.load('assets/img/$name.jpg')).buffer.asUint8List(),
    );
results in a catch-22, where
error: A value of type 'MemoryImage' can't be returned from the function 'convtoImage' because it has a return type of 'Future<MemoryImage>'. (return_of_invalid_type at [scanrecopdfmaker] lib\main.dart:490)
but if I change the return type to MemoryImage, it prompts me to turn it back into a Future.
I also tried this, but it never updates the value, so it remains null:
Future<Uint8List> convtoImage(String name) async {
  ///1
  var a = await rootBundle.load('lib/Images/$name.jpg');
  Uint8List data = (await rootBundle.load('lib/Images/rocket14.jpg'))
      .buffer
      .asUint8List();
  // print("data IS $data");
  // var b = pw.MemoryImage(a.buffer.asUint8List());
  return data;
}
var done;
var tempa = convtoImage('rocket14').whenComplete(() {
  print("done");
}).then((value) => done = value);
print("AA IS $tempa");
print("BIS $done");
while (done == null) {
  print(done);
}
print("Finito");
Use rootBundle to load the image from assets:
pw.MemoryImage(
  (await rootBundle.load('assets/img/your_image.jpg')).buffer.asUint8List(),
);
I have functionality in react-native to pick an image of a QR code from the CameraRoll on Android and iOS. Once the user has picked an image, I want to use something like jsQR to decode it and validate whether it is a real QR code or not.
The jsQR library says it needs a Uint8ClampedArray to decode the image and read the QR code. I already have a function to get the base64 image, but I can't find how to convert it properly to a Uint8ClampedArray.
Here is my code below:
const handleImportScan = useCallback(async () => {
  try {
    const base64Image = await RNFS.readFile(
      photos[selected].node.image.uri,
      'base64',
    );
    console.log('base64img:', base64Image);
    // First argument below should be a 'Uint8ClampedArray'
    const code = jsQR(base64Image, width, height);
    if (code) {
      console.log('Found QR code', code);
    }
  } catch (error) {
    console.log('err:', error);
  }
}, [photos, selected]);
I'm trying to find a library or third-party tool to convert my base64 image to a Uint8ClampedArray.
Mostly I save the user-generated QR images as PNG.
Appreciate it if someone could help.
Thanks
Note that base64data should be the base64-encoded image, not a data URI (e.g. without the 'data:image/png;base64,' prefix, if you have a URI).
const byteCharacters = atob(base64data);
const byteNumbers = new Array(byteCharacters.length);
for (let i = 0; i < byteCharacters.length; i++) {
  byteNumbers[i] = byteCharacters.charCodeAt(i);
}
const byteArray = new Uint8ClampedArray(byteNumbers);
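A minimal sketch wrapping that conversion into a reusable helper; the function name is an assumption for illustration, and atob may need a polyfill in some react-native environments:
// Assumed helper name, not part of jsQR or the original answer.
function base64ToUint8ClampedArray(base64data: string): Uint8ClampedArray {
  // atob decodes base64 into a binary string with one character per byte.
  const byteCharacters = atob(base64data);
  const bytes = new Uint8ClampedArray(byteCharacters.length);
  for (let i = 0; i < byteCharacters.length; i++) {
    bytes[i] = byteCharacters.charCodeAt(i);
  }
  return bytes;
}

// Hypothetical usage with the base64 string read via RNFS in the question:
// const bytes = base64ToUint8ClampedArray(base64Image);
Keep in mind that jsQR documents its first argument as raw RGBA pixel data, so if these bytes are still an encoded PNG they may need to be decoded into pixels before being passed to jsQR.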
I am trying to convert the image into base64 but I get a null result; does anyone have a clue? I found a few examples of how to take a picture with the camera and convert it into base64, but I couldn't find anything on how to convert an image tag into base64.
<image #loaded="loaded" src="myimg.jpg">
var ImageSourceModule = require("tns-core-modules/image-source");
function loaded(args) {
    let imageSource = ImageSourceModule.fromNativeSource(args.object.nativeElement);
    console.log(imageSource.toBase64String('jpeg'));
}
You might be getting null data because you are converting the image before it is fully loaded. Try converting the image after it has fully loaded, as shown in the snippet below.
To convert the image after it loads, we use the loaded event and the isLoading property of the Image.
loaded(args) {
    console.log(args.eventName);
    console.log(args.object);
    var img = args.object;
    console.log("img.isLoading: " + img.isLoading);
    console.log("img.isLoaded: " + img.isLoaded);
    if (img.isLoading) {
        img.on("isLoadingChange", function (args) {
            console.log("isloading", args.value);
            console.log("isloaded", args.object.isLoaded);
            let img = args.object;
            let imageSource = ImageSource.fromNativeSource(img);
            console.log(imageSource.toBase64String('jpeg'));
        });
    } else if (!img.isLoading && img.isLoaded) {
        let imageSource = ImageSource.fromNativeSource(img);
        console.log(imageSource.toBase64String('jpeg'));
    } else {
        console.log("image loading failed");
    }
}
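As an untested alternative sketch, once isLoaded is true it may also be possible to read the Image's imageSource property directly (the property and method names are taken from the snippets above, but this exact shortcut is an assumption):
loaded(args) {
    const img = args.object;
    // imageSource is the decoded ImageSource backing the Image, if populated.
    if (img.isLoaded && img.imageSource) {
        console.log(img.imageSource.toBase64String("jpeg"));
    }
}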
After downloading the Xamarin.SignaturePad sample, I just want to get the image from the signature pad into a memory stream and show it in my image view. Here is my code; it works fine on iOS, but on Android it shows an empty stream.
var image = await padView.GetImageStreamAsync(SignatureImageFormat.Png);
var stream = new MemoryStream();
image.CopyToAsync(stream);
var imageByteArray = stream.ToArray();
img_result.Source = ImageSource.FromStream(() => new MemoryStream(imageByteArray));
Just cast your image stream to a MemoryStream. It should be valid:
var imageStream = await padView.GetImageStreamAsync(SignatureImageFormat.Png);
// this is actually a memory stream, so it is convertible to it
var mstream = (MemoryStream)imageStream;
// Unfortunately the above mstream is not valid until you take it as a byte array
mstream = new MemoryStream(mstream.ToArray());
// Now you can
img_result.Source = ImageSource.FromStream(() => mstream);
I'm creating a project on CloudPebble using JavaScript.
I have a "Constants.js" which hosts a variable that I would like to access using "app.js", which is the main contents of the app. However, running the app I receive the following error:
[PHONE] pebble-app.js:?: JavaScript Error:
TypeError: Cannot read property 'length' of undefined
Here is my code:
Constants.js
var mainMenuOptions = ["MenuOption1", "MenuOption2", "MenuOption3"];
app.js
var UI = require('ui');
var Vector2 = require('vector2');
var constants = require('Constants.js');
var mainMenu = new UI.Menu({
});
for (var i = 0; i < constants.mainMenuOptions.length; i++) { // Error occurs here
    mainMenu.item(0, i, { title: constants.mainMenuOptions[i] });
}
...
Any help is appreciated. Thanks!
I believe your Constants.js should have this format:
var Constants = {
    mainMenuOptions: ["MenuOption1", "MenuOption2", "MenuOption3"]
};
this.exports = Constants;
And then in app.js do
var constants = require('Constants');
to access it.
I used this approach in my very first Pebble.js app, Autoinsult, and it worked.