Flutter/Dart: resizing an image on an actual device took > 10 mins

I discovered a behaviour with Flutter/Dart while trying to resize an image from the ImagePicker.
The simulator works fine, but on an actual device, an iPhone 6 Plus, the whole process took more than 10 minutes and ended with a crash.
On the device, I tapped the button that brings up the image picker and selected a photo, and the device just hung. After about 10 minutes the image picker dismissed and the image resizing continued; after another 5 minutes or so, the app crashed.
Here is the code:
ImagePicker.pickImage(source: source)
    .then((_imageFile2) => _uploadFile(_imageFile2)
        .then((downloadURL) {
      if (downloadURL != null) {
        createCloudStoreRecord(fireBaseUser, downloadURL, true);
        setState(() {
          profileImage = new DecorationImage(
            image: getProfileImage(downloadURL),
            fit: BoxFit.cover,
          );
        });
        Navigator.pop(context);
        showInSnackBar("Image Updated");
      } else {
        Navigator.pop(context);
        showInSnackBar("Image Update Error!");
      }
    }));
Future<String> _uploadFile(_imageFile2) async {
  print("in upload image");
  if (_imageFile2 == null) {
    print("imagePicker image is null");
    Navigator.pop(context);
    return null;
  } else {
    onLoading(context, "Updating ...");
    try {
      // resize image
      Im.Image image = Im.decodeImage(_imageFile2.readAsBytesSync());
      Im.Image smallerImage = Im.copyResize(image, 500); // choose the size here, it will maintain aspect ratio
      final tempDir = await getTemporaryDirectory();
      final path = tempDir.path;
      var filename = user.uid.toString() + ".png";
      var newPath = '$path/' + filename;
      print("start compressed");
      var compressedImage = new File(newPath)..writeAsBytesSync(Im.encodePng(smallerImage));
      print("end compressed");
      //final Directory systemTempDir = Directory.systemTemp;
      final StorageReference ref = FirebaseStorage.instance.ref().child('users/' + filename);
      final StorageUploadTask uploadTask = ref.putFile(
        compressedImage,
        new StorageMetadata(
          contentLanguage: 'en',
          customMetadata: <String, String>{'renalbase': 'user_photo'},
        ),
      );
      print("Start upload");
      UploadTaskSnapshot uploadSnapshot = await uploadTask.future;
      print("image uploaded");
      Map<String, dynamic> pictureData = new Map<String, dynamic>();
      pictureData["url"] = uploadSnapshot.downloadUrl.toString();
      print("Before url = ${pictureData["url"]}");
      final RegExp regExp = RegExp('.*(?=\\?)');
      pictureData["url"] = Uri.decodeFull(regExp.stringMatch(pictureData["url"]));
      print("url = ${pictureData["url"]}");
      return pictureData["url"];
    } catch (e) {
      print("Upload error: $e");
      showInSnackBar("Upload error: $e");
      return null;
    }
  }
}

I had similar issues with image resizing taking too long. I switched to using the maxHeight and maxWidth parameters in ImagePicker.pickImage and have had far better results.
_imageFile = await ImagePicker.pickImage(
    source: ImageSource.gallery,
    maxHeight: 450.0,
    maxWidth: 450.0);
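If you do still need the image package for the resize itself, another option is to move the decode and resize off the UI isolate with compute(), since Im.decodeImage and Im.copyResize are plain Dart and block the UI thread while they run. A rough sketch, assuming a null-safe setup (the helper names and the 500 px target are only for illustration; older versions of the image package take the width as a positional argument rather than width:):

import 'dart:io';
import 'package:flutter/foundation.dart' show compute;
import 'package:image/image.dart' as Im;

// Runs in a background isolate, so the heavy decode/resize cannot jank the UI.
List<int> _resizeAndEncode(List<int> originalBytes) {
  final image = Im.decodeImage(originalBytes); // may be null for unreadable input
  final smaller = Im.copyResize(image!, width: 500);
  return Im.encodePng(smaller);
}

Future<List<int>> resizeInBackground(File imageFile) async {
  final bytes = await imageFile.readAsBytes();
  return compute(_resizeAndEncode, bytes);
}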

Gave up on the image plugin and used flutter_native_image instead
(https://github.com/btastic/flutter_native_image).
Works like a charm.
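For reference, a minimal sketch of what that call can look like with flutter_native_image; the quality and target dimensions below are placeholders rather than recommendations:

import 'dart:io';
import 'package:flutter_native_image/flutter_native_image.dart';

// Resizes/compresses in native code (off the Dart UI isolate) and returns a new file.
Future<File> resizeForUpload(File original) {
  return FlutterNativeImage.compressImage(
    original.path,
    quality: 80,
    targetWidth: 500,
    targetHeight: 500,
  );
}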

Neither flutter_native_image nor image_picker works well in all cases...
flutter_native_image resizes the picture by a percentage when you only pass the quality parameter, which is not its intended purpose...
and image_picker on Android does not resize the picture when you pass both max dimensions and one of the image's dimensions is already smaller than the corresponding maximum...

Related

Xamarin Forms - reduce byte[] size to target size

I have seen a couple of examples, but nothing has helped me so far with this issue.
I have an image as a byte[] whose size must be reduced to under 2 MB. I have tried a couple of things, but nothing has worked so far.
I used the following code, which can be found in many questions like this one:
public static byte[] Compress(byte[] data)
{
    MemoryStream output = new MemoryStream();
    using (DeflateStream dstream = new DeflateStream(output, CompressionLevel.Optimal))
    {
        dstream.Write(data, 0, data.Length);
    }
    return output.ToArray();
}
This does reduce the size of the image, but not below 2 MB, and I have no idea how to specify that target here.
Other examples focused on saving the image to the phone at a reduced size, but my image should remain a byte[], so that question does not help.
I hope someone can help.
You can try the NuGet package Xam.Plugin.Media, which lets you set the compression quality when taking photos so the image is compressed as it is captured.
Please refer to the following code:
private async void cmdCameraPhotograph_Clicked(object sender, EventArgs e)
{
    if (CrossMedia.Current.IsTakePhotoSupported)
    {
        var file = await CrossMedia.Current.TakePhotoAsync(new StoreCameraMediaOptions
        {
            Directory = "Photographs",
            SaveToAlbum = true,
            CompressionQuality = 40,
            CustomPhotoSize = 35,
            PhotoSize = PhotoSize.MaxWidthHeight,
            MaxWidthHeight = 2000,
            DefaultCamera = CameraDevice.Rear
        }).ConfigureAwait(true);

        if (file != null)
        {
        }
    }
    else
    {
        await DisplayAlert("Not Supported", "Your device does not support this feature.", "OK. Understood")
            .ConfigureAwait(true);
    }
}
You can also get the file from the gallery:
var file = await CrossMedia.Current.PickPhotoAsync(new PickMediaOptions
{
    CompressionQuality = 40,
    CustomPhotoSize = 35,
    PhotoSize = PhotoSize.MaxWidthHeight,
    MaxWidthHeight = 2000
}).ConfigureAwait(true);
You can also compress images on each platform individually.
For more information, you can check the thread Compress images.

How to print to a thermal printer from Canvas to image?

I'm using the blue_thermal_printer package with Flutter on Android to create an image from a Canvas recording, but the image prints as a solid block instead of the expected picture:
This class is responsible for creating the byte data of the image:
import 'dart:typed_data';
import 'dart:ui' as ui;

import 'package:flutter/material.dart';

class LabelPainter {
  Future<ByteData> getImageByteData() async {
    int _width = 60;
    int _height = 60;
    ui.PictureRecorder recorder = new ui.PictureRecorder();
    Paint _paint = Paint()
      ..style = PaintingStyle.stroke
      ..strokeWidth = 4.0;
    Canvas c = new Canvas(recorder);
    c.drawRRect(RRect.fromLTRBAndCorners(20, 30, 40, 50), _paint);
    _paint.color = Colors.red;
    c.drawRect(Rect.fromLTWH(10, 10, 10, 10), _paint);
    _paint.color = Colors.blue;
    c.drawRect(
        Rect.fromCenter(center: Offset(50, 50), height: 50, width: 50), _paint);
    _paint.color = Colors.black;
    c.drawRect(
        Rect.fromPoints(
            Offset(0, 0), Offset(_width.toDouble(), _height.toDouble())),
        _paint);
    // c.drawPaint(Paint()); // etc
    ui.Picture p = recorder.endRecording();
    ui.Image _uiImg = await p.toImage(
        _width, _height); //.toByteData(format: ImageByteFormat.png);
    ByteData _byteData =
        await _uiImg.toByteData(format: ui.ImageByteFormat.png);
    return _byteData;
  }
}
This is part of the widget's State that gets the ByteData and then saves the image to the application documents directory:
class _PrinterState extends State<Printer> {
  String pathImage;
  LabelPainter _labelPainter = new LabelPainter();

  @override
  void initState() {
    super.initState();
    initSavetoPath();
  }

  initSavetoPath() async {
    //read and write
    //image max 300px X 300px
    final filename = 'yourlogo.png';
    // var bytes = await rootBundle.load("images/logo.png");
    ByteData bytes = await _labelPainter.getImageByteData();
    String dir = (await getApplicationDocumentsDirectory()).path;
    writeToFile(bytes, '$dir/$filename');
    setState(() {
      pathImage = '$dir/$filename';
    });
  }

  @override
  Widget build(BuildContext context) {
    return Container();
  }

  //write to app path
  Future<void> writeToFile(ByteData data, String path) {
    final buffer = data.buffer;
    return new File(path).writeAsBytes(
        buffer.asUint8List(data.offsetInBytes, data.lengthInBytes));
  }
}
This is the method I call when I want to print the image:
void _tesPrint() async {
  //SIZE
  // 0- normal size text
  // 1- only bold text
  // 2- bold with medium text
  // 3- bold with large text
  //ALIGN
  // 0- ESC_ALIGN_LEFT
  // 1- ESC_ALIGN_CENTER
  // 2- ESC_ALIGN_RIGHT
  bluetooth.isConnected.then((isConnected) async {
    if (isConnected) {
      // bluetooth.printImageBytes(await _labelPainter.getImageBytesUint());
      bluetooth.printImage(pathImage);
      // bluetooth.printNewLine();
      // bluetooth.printCustom("Terimakasih", 2, 1);
      // bluetooth.printNewLine();
      // bluetooth.printQRcode("Insert Your Own Text to Generate", 50, 50, 0);
      // bluetooth.paperCut();
    }
  });
}
I already had this problem.
The only difference is that I generate the image from a widget, like a screenshot.
The same problem occurred when sharing and printing: the image was totally black.
The solution was to provide a white background; once I did that, the image content was visible.
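A minimal sketch of that fix applied to the LabelPainter above; the only assumption is that filling the whole canvas with white before drawing is acceptable for your label:

// Inside getImageByteData(), immediately after creating the Canvas and before
// drawing any shapes: fill the full canvas with white so the transparent
// background is not rendered as a solid black block by the printer.
final Paint background = Paint()
  ..style = PaintingStyle.fill
  ..color = Colors.white;
c.drawRect(
    Rect.fromLTWH(0, 0, _width.toDouble(), _height.toDouble()), background);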

How to convert image to uint8list in flutter without using async?

PdfImage requires a Uint8List as a parameter, but I have an ImageProvider. How can I convert an image to a Uint8List in Flutter?
var imageProvider = AssetImage('assets/test.jpg');
final image = PdfImage(
  pdf.document,
  image: ???, /// Uint8List required
  width: img.width,
  height: img.height,
);
Using a FutureBuilder: use rootBundle.load() and convert the result, e.g.
(await rootBundle.load(/*YOUR IMAGE PATH HERE*/)).buffer.asUint8List()
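A minimal sketch of the FutureBuilder variant (assuming the usual material.dart, services.dart and dart:typed_data imports; the asset path is just an example):

FutureBuilder<ByteData>(
  future: rootBundle.load('assets/test.jpg'),
  builder: (context, snapshot) {
    if (!snapshot.hasData) {
      return const Center(child: CircularProgressIndicator());
    }
    // These bytes are what PdfImage expects for its image: parameter.
    final Uint8List bytes = snapshot.data!.buffer.asUint8List();
    return Image.memory(bytes);
  },
)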
UPDATE
As load() is an async operation, you need to wait until the data is fully loaded. Try substituting the UI with some loading indicator until then.
ByteData imageData;

@override
void initState() {
  super.initState();
  rootBundle.load('assets/test.jpg')
      .then((data) => setState(() => this.imageData = data));
}

@override
Widget build(BuildContext context) {
  if (imageData == null) {
    return Center(child: CircularProgressIndicator());
  }

  final image = PdfImage(
    pdf.document,
    image: imageData.buffer.asUint8List(),
    width: img.width,
    height: img.height,
  );
  ...
}
I tried different solutions to convert an image to a Uint8List and finally found one that worked for me:

XFile? image = await imagePicker.pickImage(
  source: ImageSource.gallery,
); // Pick a file from the gallery
final bytes = await image!.readAsBytes(); // Reads the file as a Uint8List

For the output, I used MemoryImage:

MemoryImage(bytes);
In Flutter, attaching a local image to a PDF file is actually simple.
Just copy and paste the following code and try it:
final ByteData bytes = await rootBundle.load('assets/logo.jpg');
final Uint8List list = bytes.buffer.asUint8List();

final image = PdfImage.file(
  pdf.document,
  bytes: list,
);

pdf.addPage(pw.Page(build: (pw.Context context) {
  return pw.Center(
    child: pw.Image(image),
  ); // Center
}));
You could split initState into two if you prefer:
@override
void initState() {
  super.initState();
  loadAsset('test.jpg');
}

void loadAsset(String name) async {
  var data = await rootBundle.load('assets/$name');
  setState(() => this.imageData = data);
}
Note that this will cause build() to run an extra time, but I find it easier on the eye. With Michael's circular indicator, this is a harmless extra cycle.

Flutter: How would one save a Canvas/CustomPainter to an image file?

I am trying to collect a signature from the user and save it to an image. I have made it far enough that I can draw on the screen, but now I'd like to click a button to save to an image and store in my database.
This is what I have so far:
import 'package:flutter/material.dart';

class SignaturePadPage extends StatefulWidget {
  SignaturePadPage({Key key}) : super(key: key);

  @override
  _SignaturePadPage createState() => new _SignaturePadPage();
}

class _SignaturePadPage extends State<SignaturePadPage> {
  List<Offset> _points = <Offset>[];

  @override
  Widget build(BuildContext context) {
    return Container(
      color: Colors.white,
      child: GestureDetector(
        onPanUpdate: (DragUpdateDetails details) {
          setState(() {
            RenderBox referenceBox = context.findRenderObject();
            Offset localPosition =
                referenceBox.globalToLocal(details.globalPosition);
            _points = new List.from(_points)..add(localPosition);
          });
        },
        onPanEnd: (DragEndDetails details) => _points.add(null),
        child: new CustomPaint(painter: new SignaturePainter(_points)),
      ),
    );
  }
}

class SignaturePainter extends CustomPainter {
  SignaturePainter(this.points);

  final List<Offset> points;

  void paint(Canvas canvas, Size size) {
    Paint paint = new Paint()
      ..color = Colors.black
      ..strokeCap = StrokeCap.round
      ..strokeWidth = 5.0;
    for (int i = 0; i < points.length - 1; i++) {
      if (points[i] != null && points[i + 1] != null)
        canvas.drawLine(points[i], points[i + 1], paint);
    }
  }

  bool shouldRepaint(SignaturePainter other) => other.points != points;
}
Not sure where to go from there...
You can capture the output of a CustomPainter with PictureRecorder. Pass your PictureRecorder instance to the constructor for your Canvas. The Picture returned by PictureRecorder.endRecording can then be converted to an Image with Picture.toImage. Finally, extract the image bytes using Image.toByteData.
Here's an example: https://github.com/rxlabz/flutter_canvas_to_image
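A minimal sketch of that flow, reusing the SignaturePainter from the question (assuming dart:ui is imported as ui and this runs inside an async method; the 300x300 size is a placeholder, and on recent Flutter versions Picture.toImage is asynchronous):

final recorder = ui.PictureRecorder();
final canvas = Canvas(recorder);
// Drive the painter manually with our recording canvas.
SignaturePainter(_points).paint(canvas, const Size(300, 300));
final picture = recorder.endRecording();
final ui.Image image = await picture.toImage(300, 300);
final pngBytes = await image.toByteData(format: ui.ImageByteFormat.png);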
Add the rendered method in your widget
ui.Image get rendered {
  // [CustomPainter] normally receives its own canvas, so to capture the image
  // we create a [ui.PictureRecorder], pass it to a new [Canvas], and then call
  // [SignaturePainter.paint] ourselves with that canvas.
  ui.PictureRecorder recorder = ui.PictureRecorder();
  Canvas canvas = Canvas(recorder);
  SignaturePainter painter = SignaturePainter(points: _points);
  var size = context.size;
  painter.paint(canvas, size);
  return recorder.endRecording()
      .toImage(size.width.floor(), size.height.floor());
}
Then, using the state, fetch the rendered image:

var image = signatureKey.currentState.rendered;

Now you can produce a PNG image using toByteData(format: ui.ImageByteFormat.png) and store it using asInt8List():

var pngBytes = await image.toByteData(format: ui.ImageByteFormat.png);
File('your-path/filename.png')
    .writeAsBytesSync(pngBytes.buffer.asInt8List());

For a complete example of how to export a canvas as a PNG, check out
https://github.com/vemarav/signature
The existing solutions worked for me, but the images I captured with PictureRecorder were always blurry compared to what was rendering on-screen. I eventually realized I could use some elementary Canvas tricks to pull this off. Basically, after you create the PictureRecorder's Canvas, set its size to a multiple of your desired size (here I have it set to 4x), then just canvas.scale it. Boom - your generated images are no longer blurry compared to what appears on screens with modern resolutions!
You may want to crank the _overSampleScale value higher for printing or for images that may be blown up/expanded, or lower if you're using this a ton and want to improve image preview loading performance. Using it on-screen, you'll need to constrain your Image.memory widget with a Container of the actual width and height, as with the other solutions. Ideally this number would be the ratio between Flutter's DPI in its fake "pixels" (i.e. what PictureRecorder captures) and the actual DPI of the screen.
static const double _overSampleScale = 4;

Future<ui.Image> get renderedScoreImage async {
  final recorder = ui.PictureRecorder();
  Canvas canvas = Canvas(recorder);
  final size = Size(widget.width * _overSampleScale, widget.height * _overSampleScale);
  final painter = SignaturePainter(points: _points);
  canvas.save();
  canvas.scale(_overSampleScale);
  painter.paint(canvas, size);
  canvas.restore();
  final data = recorder.endRecording()
      .toImage(size.width.floor(), size.height.floor());
  return data;
}
Given all the data that you need to paint your custom painter, this is all you need to do (in this example, "points" were needed for my custom painter; of course this will change based on your use case):
Future<void> _handleSavePressed() async {
  PictureRecorder recorder = PictureRecorder();
  Canvas canvas = Canvas(recorder);
  var painter = MyCustomPainter(points: points);
  var size = _containerKey.currentContext.size;
  painter.paint(canvas, size);
  ui.Image renderedImage = await recorder
      .endRecording()
      .toImage(size.width.floor(), size.height.floor());
  var pngBytes =
      await renderedImage.toByteData(format: ui.ImageByteFormat.png);

  Directory saveDir = await getApplicationDocumentsDirectory();
  String path = '${saveDir.path}/custom_image.jpg';
  File saveFile = File(path);
  if (!saveFile.existsSync()) {
    saveFile.createSync(recursive: true);
  }
  saveFile.writeAsBytesSync(pngBytes.buffer.asUint8List(), flush: true);
  await GallerySaver.saveImage(path, albumName: 'iDream');
  print('Image was saved!');
}
Answer based on https://gist.github.com/OPY-bbt/a5418127d8444393a2ef25ad2d966dc0.
Here is the complete class for drawing a PNG image using Flutter > 3.0.0:
import 'dart:typed_data';
import 'dart:ui';
import 'dart:ui' as ui;

import 'package:flutter/material.dart';

class BitmapUtils {
  Future<Uint8List> generateImagePngAsBytes(String text) async {
    ByteData? image = await generateSquareWithText(text);
    return image!.buffer.asUint8List();
  }

  Future<ByteData?> generateSquareWithText(String text) async {
    final recorder = PictureRecorder();
    final canvas = Canvas(
        recorder, Rect.fromPoints(Offset(0.0, 0.0), Offset(200.0, 200.0)));

    final stroke = Paint()
      ..color = Colors.grey
      ..style = PaintingStyle.stroke;

    canvas.drawRect(Rect.fromLTWH(0.0, 0.0, 200.0, 200.0), stroke);

    final textPainter = TextPainter(
        text: TextSpan(
          text: text,
          style: TextStyle(
            color: Colors.black,
            fontSize: 30,
          ),
        ),
        textDirection: TextDirection.ltr,
        textAlign: TextAlign.center);
    textPainter.layout();

    // Draw the text centered around the point (50, 100) for instance
    final offset =
        Offset(50 - (textPainter.width / 2), 100 - (textPainter.height / 2));
    textPainter.paint(canvas, offset);

    final picture = recorder.endRecording();
    ui.Image img = await picture.toImage(200, 200);
    final ByteData? pngBytes =
        await img.toByteData(format: ImageByteFormat.png);
    return pngBytes;
  }
}

Why does live camera capture control with Xamarin Forms on iOS freeze?

I downloaded the source for Xamarin Moments from GitHub and I'm now trying to convert the CameraPage renderer from a Page to a ContentView.
I refactored the code to make it a ContentView renderer. Most of the actual setup of the live preview and image capture comes from the Moments app, with some refactoring where needed/preferred.
The live preview shows up, but when I press the button to take the picture, the app freezes without an exception, not even in Xcode's console view.
// this is how it's called:
btnTakePicture.Clicked += (s, e) => { GetCameraImage().Wait(); };

// this method freezes
public async Task<byte[]> GetCameraImage()
{
    byte[] imageBuffer = null;

    if (captureDeviceInput != null)
    {
        var videoConnection = stillImageOutput.ConnectionFromMediaType(AVMediaType.Video);
        Console.WriteLine("[HASFIQWRPPOA] This message shows up");
        // this is where the app freezes, even though the live preview still moves.
        var sampleBuffer = await stillImageOutput.CaptureStillImageTaskAsync(videoConnection);
        Console.WriteLine("[CLKJFADSFQXW] THIS DOESN'T SHOW UP");
        // var jpegImageAsBytes = AVCaptureStillImageOutput.JpegStillToNSData (sampleBuffer).ToArray ();
        var jpegImageAsNsData = AVCaptureStillImageOutput.JpegStillToNSData(sampleBuffer);
        Console.WriteLine("[ROIAJDGNQWTG]");
        // var image = new UIImage (jpegImageAsNsData);
        // var image2 = new UIImage (image.CGImage, image.CurrentScale, UIImageOrientation.UpMirrored);
        // var data = image2.AsJPEG ().ToArray ();
        imageBuffer = jpegImageAsNsData.ToArray();
        Console.WriteLine("[FIOUJGAIDGUQ] Image buffer: " + imageBuffer.Length);
    }

    if (imageBuffer != null && imageBuffer.Length > 100)
    {
        using (var ms = new MemoryStream(imageBuffer))
        {
            var uiimg = UIImage.LoadFromData(NSData.FromStream(ms));
            this.Add(new UIImageView(uiimg));
        }
    }

    return imageBuffer;
}
Here is how I set up the live preview:

// This method runs fine and the camera preview is started as expected
public void SetupLiveCameraStream()
{
    try
    {
        // add a UIView to the renderer
        liveCameraStream = new UIView()
        {
            Frame = new CGRect(0f, 0f, Element.Width, Element.Height),
        };
        this.Add(liveCameraStream);

        // find a camera
        var captureDevice = AVCaptureDevice.DefaultDeviceWithMediaType(AVMediaType.Video);
        if (captureDevice != null)
        {
            Console.WriteLine("[ZKSDJGWEHSY] Capture device found"); // not the case on simulator
            captureSession = new AVCaptureSession();

            videoPreviewLayer = new AVCaptureVideoPreviewLayer(captureSession)
            {
                Frame = liveCameraStream.Bounds
            };
            liveCameraStream.Layer.AddSublayer(videoPreviewLayer);

            ConfigureCameraForDevice(captureDevice);
            captureDeviceInput = AVCaptureDeviceInput.FromDevice(captureDevice);

            var dictionary = new NSMutableDictionary();
            dictionary[AVVideo.CodecKey] = new NSNumber((int)AVVideoCodec.JPEG);
            stillImageOutput = new AVCaptureStillImageOutput()
            {
                OutputSettings = new NSDictionary()
            };

            captureSession.AddInput(captureDeviceInput);
            captureSession.AddOutput(stillImageOutput);
            captureSession.StartRunning();
            Console.WriteLine("[OIGAJGUWRJHWY] Camera session started");
        }
        else
        {
            Console.WriteLine("[OASDFUJGOR] Could not find a camera device");
        }
    }
    catch (Exception x)
    {
        Console.WriteLine("[QWKRIFQEAHJF] ERROR:" + x);
    }
}
I had this issue, and it turned out I was deadlocking because of a combination of using async/await with Task.Result. At a guess, you could be experiencing something similar with your usage of Task.Wait(): blocking the UI thread on the task prevents the awaited continuation from ever running. The usual fix is to await the task instead of blocking on it, for example by making the click handler async and awaiting GetCameraImage().
The two sections of code involved:

btnTakePicture.Clicked += (s, e) => { GetCameraImage().Wait(); };

And:

var sampleBuffer = await stillImageOutput.CaptureStillImageTaskAsync(videoConnection);
