I am trying to collect a signature from the user and save it to an image. I have made it far enough that I can draw on the screen, but now I'd like to click a button to save it as an image and store it in my database.
This is what I have so far:
import 'package:flutter/material.dart';
class SignaturePadPage extends StatefulWidget {
SignaturePadPage({Key key}) : super(key: key);
@override
_SignaturePadPage createState() => new _SignaturePadPage();
}
class _SignaturePadPage extends State<SignaturePadPage> {
List<Offset> _points = <Offset>[];
@override
Widget build(BuildContext context) {
return Container(
color: Colors.white,
child: GestureDetector(
onPanUpdate: (DragUpdateDetails details) {
setState(() {
RenderBox referenceBox = context.findRenderObject();
Offset localPosition =
referenceBox.globalToLocal(details.globalPosition);
_points = new List.from(_points)..add(localPosition);
});
},
onPanEnd: (DragEndDetails details) => _points.add(null),
child: new CustomPaint(painter: new SignaturePainter(_points)),
),
);
}
}
class SignaturePainter extends CustomPainter {
SignaturePainter(this.points);
final List<Offset> points;
void paint(Canvas canvas, Size size) {
Paint paint = new Paint()
..color = Colors.black
..strokeCap = StrokeCap.round
..strokeWidth = 5.0;
for (int i = 0; i < points.length - 1; i++) {
if (points[i] != null && points[i + 1] != null)
canvas.drawLine(points[i], points[i + 1], paint);
}
}
bool shouldRepaint(SignaturePainter other) => other.points != points;
}
Not sure where to go from there...
You can capture the output of a CustomPainter with PictureRecorder. Pass your PictureRecorder instance to the constructor for your Canvas. The Picture returned by PictureRecorder.endRecording can then be converted to an Image with Picture.toImage. Finally, extract the image bytes using Image.toByteData.
Here's an example: https://github.com/rxlabz/flutter_canvas_to_image
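In current Flutter versions both Picture.toImage and Image.toByteData are asynchronous, so a minimal sketch of that flow looks roughly like this (it reuses the SignaturePainter and points from the question; exportSignature is just a name I picked, and the function is assumed to live in the same file, which already imports material.dart):
import 'dart:typed_data';
import 'dart:ui' as ui;

Future<Uint8List> exportSignature(List<Offset> points, Size size) async {
  // Record the painter's output instead of letting the framework paint it.
  final recorder = ui.PictureRecorder();
  final canvas = Canvas(recorder);
  SignaturePainter(points).paint(canvas, size);
  // Turn the recorded Picture into an Image, then into PNG bytes.
  final picture = recorder.endRecording();
  final ui.Image image =
      await picture.toImage(size.width.floor(), size.height.floor());
  final byteData = await image.toByteData(format: ui.ImageByteFormat.png);
  return byteData!.buffer.asUint8List();
}
The resulting bytes can then be written to a file or stored in a database.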
Add the rendered getter below to your widget's State:
ui.Image get rendered {
// [CustomPainter] normally gets its own canvas from the framework. To
// capture the drawing as an image we create a [ui.PictureRecorder],
// pass it to the [Canvas] constructor, and then call
// [SignaturePainter.paint] ourselves with that recording canvas.
ui.PictureRecorder recorder = ui.PictureRecorder();
Canvas canvas = Canvas(recorder);
SignaturePainter painter = SignaturePainter(_points);
var size = context.size;
painter.paint(canvas, size);
return recorder.endRecording()
.toImage(size.width.floor(), size.height.floor());
}
Then fetch the rendered image through the widget's state:
var image = signatureKey.currentState.rendered;
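Here signatureKey is assumed to be a GlobalKey attached to the signature widget's State (called SignatureState below, matching the full example at the end of this page):
GlobalKey<SignatureState> signatureKey = new GlobalKey();
// ...
Signature(key: signatureKey),
Note that on recent Flutter versions Picture.toImage returns a Future, so the rendered getter above would need to become async and be awaited here.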
Now you can produce a PNG image using toByteData(format: ui.ImageByteFormat.png) and store it using asUint8List():
var pngBytes = await image.toByteData(format: ui.ImageByteFormat.png);
File('your-path/filename.png')
.writeAsBytesSync(pngBytes.buffer.asUint8List());
For a complete example of how to export a canvas as PNG, check out this repository:
https://github.com/vemarav/signature
The existing solutions worked for me, but the images I captured with PictureRecorder were always blurry compared to what was rendering on-screen. I eventually realized I could use some elementary Canvas tricks to pull this off. Basically, after you create the PictureRecorder's Canvas, render at a multiple of your desired size (here I have it set to 4x) and then canvas.scale it. Boom: your generated images are no longer blurry compared to what appears on screens with modern resolutions!
You may want to crank the _overSampleScale value higher for printed output or images that may be blown up/expanded, or lower if you're using this a ton and want to improve image preview loading performance. Using it on-screen, you'll need to constrain your Image.memory widget with a Container of the actual width and height, as with the other solutions (see the sketch after the code below). Ideally this number would be the ratio between Flutter's logical pixels (i.e. what PictureRecorder captures) and the actual DPI of the screen.
static const double _overSampleScale = 4;
Future<ui.Image> get renderedScoreImage async {
final recorder = ui.PictureRecorder();
Canvas canvas = Canvas(recorder);
final size = Size(widget.width * _overSampleScale, widget.height * _overSampleScale);
final painter = SignaturePainter(_points);
canvas.save();
canvas.scale(_overSampleScale);
painter.paint(canvas, size);
canvas.restore();
final data = recorder.endRecording()
.toImage(size.width.floor(), size.height.floor());
return data;
}
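For the on-screen usage mentioned above, a rough sketch (helper names are mine, and widget.width / widget.height are the same fields the snippet assumes) is to convert the oversampled image to PNG bytes once and then constrain the preview to the logical size:
Future<Uint8List> get renderedPngBytes async {
  final image = await renderedScoreImage;
  final data = await image.toByteData(format: ui.ImageByteFormat.png);
  return data!.buffer.asUint8List();
}

// Display the 4x bitmap scaled back down to its logical size.
Widget buildPreview(Uint8List pngBytes) {
  return Container(
    width: widget.width,
    height: widget.height,
    child: Image.memory(pngBytes),
  );
}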
Given all the data that you need to paint your custom painter, this is all you need to do (in this example, "points" were needed for my custom painter; of course this will change based on your use case):
Future<void> _handleSavePressed() async {
PictureRecorder recorder = PictureRecorder();
Canvas canvas = Canvas(recorder);
var painter = MyCustomPainter(points: points);
var size = _containerKey.currentContext.size;
painter.paint(canvas, size);
ui.Image renderedImage = await recorder
.endRecording()
.toImage(size.width.floor(), size.height.floor());
var pngBytes =
await renderedImage.toByteData(format: ui.ImageByteFormat.png);
Directory saveDir = await getApplicationDocumentsDirectory();
String path = '${saveDir.path}/custom_image.jpg';
File saveFile = File(path);
if (!saveFile.existsSync()) {
saveFile.createSync(recursive: true);
}
saveFile.writeAsBytesSync(pngBytes.buffer.asUint8List(), flush: true);
await GallerySaver.saveImage(path, albumName: 'iDream');
print('Image was saved!');
}
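For reference, this snippet appears to rely on the path_provider and gallery_saver packages plus dart:ui, so the imports would look roughly like this (note that the file gets PNG bytes even though its name ends in .jpg):
import 'dart:io';
import 'dart:ui';
import 'dart:ui' as ui;
import 'package:flutter/material.dart';
import 'package:gallery_saver/gallery_saver.dart'; // GallerySaver.saveImage
import 'package:path_provider/path_provider.dart'; // getApplicationDocumentsDirectory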
Answer based on https://gist.github.com/OPY-bbt/a5418127d8444393a2ef25ad2d966dc0
Below is the complete class to draw a PNG image, using Flutter > 3.0.0:
import 'dart:typed_data';
import 'dart:ui';
import 'dart:ui' as ui;
import 'package:flutter/material.dart';
class BitmapUtils {
Future<Uint8List> generateImagePngAsBytes(String text) async {
ByteData? image = await generateSquareWithText(text);
return image!.buffer.asUint8List();
}
Future<ByteData?> generateSquareWithText(String text) async {
final recorder = PictureRecorder();
final canvas = Canvas(
recorder, Rect.fromPoints(Offset(0.0, 0.0), Offset(200.0, 200.0)));
final stroke = Paint()
..color = Colors.grey
..style = PaintingStyle.stroke;
canvas.drawRect(Rect.fromLTWH(0.0, 0.0, 200.0, 200.0), stroke);
final textPainter = TextPainter(
text: TextSpan(
text: text,
style: TextStyle(
color: Colors.black,
fontSize: 30,
),
),
textDirection: TextDirection.ltr,
textAlign: TextAlign.center);
textPainter.layout();
// Draw the text centered around the point (50, 100) for instance
final offset =
Offset(50 - (textPainter.width / 2), 100 - (textPainter.height / 2));
textPainter.paint(canvas, offset);
final picture = recorder.endRecording();
ui.Image img = await picture.toImage(200, 200);
final ByteData? pngBytes =
await img.toByteData(format: ImageByteFormat.png);
return pngBytes;
}
}
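A minimal usage sketch (the text and widget wiring are just illustrative): generate the bytes and show them with Image.memory, for example inside a FutureBuilder:
FutureBuilder<Uint8List>(
  future: BitmapUtils().generateImagePngAsBytes('Hello canvas'),
  builder: (context, snapshot) {
    if (!snapshot.hasData) {
      return const CircularProgressIndicator();
    }
    // Render the generated PNG bytes straight from memory.
    return Image.memory(snapshot.data!);
  },
)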
I'm using the blue_thermal_printer package with Flutter on Android to try to create an image from a Canvas recording, but the image prints as a solid block instead of an image:
This class is responsible for creating the bytedata of the image:
import 'dart:typed_data';
import 'dart:ui' as ui;
import 'package:flutter/material.dart';
class LabelPainter {
Future<ByteData> getImageByteData() async {
int _width = 60;
int _height = 60;
ui.PictureRecorder recorder = new ui.PictureRecorder();
Paint _paint = Paint()
..style = PaintingStyle.stroke
..strokeWidth = 4.0;
Canvas c = new Canvas(recorder);
c.drawRRect(RRect.fromLTRBAndCorners(20, 30, 40, 50), _paint);
_paint.color = Colors.red;
c.drawRect(Rect.fromLTWH(10, 10, 10, 10), _paint);
_paint.color = Colors.blue;
c.drawRect(
Rect.fromCenter(center: Offset(50, 50), height: 50, width: 50), _paint);
_paint.color = Colors.black;
c.drawRect(
Rect.fromPoints(
Offset(0, 0), Offset(_width.toDouble(), _height.toDouble())),
_paint);
// c.drawPaint(Paint()); // etc
ui.Picture p = recorder.endRecording();
ui.Image _uiImg = await p.toImage(
_width, _height); //.toByteData(format: ImageByteFormat.png);
ByteData _byteData =
await _uiImg.toByteData(format: ui.ImageByteFormat.png);
return _byteData;
}
}
This is part of the State of the widget that gets the ByteData and then saves the image to the file directory:
class _PrinterState extends State<Printer> {
String pathImage;
LabelPainter _labelPainter = new LabelPainter();
@override
void initState() {
super.initState();
initSavetoPath();
}
initSavetoPath() async {
//read and write
//image max 300px X 300px
final filename = 'yourlogo.png';
// var bytes = await rootBundle.load("images/logo.png");
ByteData bytes = await _labelPainter.getImageByteData();
String dir = (await getApplicationDocumentsDirectory()).path;
writeToFile(bytes, '$dir/$filename');
setState(() {
pathImage = '$dir/$filename';
});
}
@override
Widget build(BuildContext context) {
return Container();
}
//write to app path
Future<void> writeToFile(ByteData data, String path) {
final buffer = data.buffer;
return new File(path).writeAsBytes(
buffer.asUint8List(data.offsetInBytes, data.lengthInBytes));
}
}
This is the method I call when I want to print the image:
void _tesPrint() async {
//SIZE
// 0- normal size text
// 1- only bold text
// 2- bold with medium text
// 3- bold with large text
//ALIGN
// 0- ESC_ALIGN_LEFT
// 1- ESC_ALIGN_CENTER
// 2- ESC_ALIGN_RIGHT
bluetooth.isConnected.then((isConnected) async {
if (isConnected) {
// bluetooth.printImageBytes(await _labelPainter.getImageBytesUint());
bluetooth.printImage(pathImage);
// bluetooth.printNewLine();
// bluetooth.printCustom("Terimakasih", 2, 1);
// bluetooth.printNewLine();
// bluetooth.printQRcode("Insert Your Own Text to Generate", 50, 50, 0);
// bluetooth.paperCut();
}
});
}
I already had this problem. The only difference is that I generate the image from a widget, like a screenshot.
So the same problem occurred when sharing and printing: the image was totally black.
The solution was to provide a white background; with that, the problem was solved and the image content can be seen.
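For example, in the LabelPainter above you could fill the whole canvas with white right after creating it, before drawing anything else (a sketch reusing its recorder, _width and _height):
Canvas c = new Canvas(recorder);
// Fill the background with white first; a transparent background can come
// out as a solid black block when printed.
Paint background = Paint()
  ..color = Colors.white
  ..style = PaintingStyle.fill;
c.drawRect(
    Rect.fromLTWH(0, 0, _width.toDouble(), _height.toDouble()), background);
// ...then draw the rest of the label as before.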
I'm trying to implement a UI where the user can edit and apply effects to an uploaded image, and I want to save the image with the BlendMode merged in. Is it possible to save the result of the blended image, or to apply it using the Canvas?
There are some packages that apply some specific filters, but I want something more customizable for the end user.
I have already seen some examples of how to use Canvas to draw images, but I can't figure out how to load an image and apply the blend described in the docs. Could anyone give an example?
UPDATED:
For anyone with the same question, below is the code for saving an image from a canvas to a file with a BlendMode applied.
But I still don't get the result I expected. The quality of the generated image isn't the same as the original image, nor does the blend look like the blend I applied. And I can't save it as a JPG, only as a PNG file.
So, how can I load an image, apply a blend with canvas, and save it as a JPG file without losing quality?
CODE:
const kCanvasSize = 200.0;
class CanvasImageToFile {
CanvasImageToFile._();
static final instance = CanvasImageToFile._();
ByteData _readFromFile(File file) {
// File file = getSomeCorrectFile();
Uint8List bytes = file.readAsBytesSync();
return ByteData.view(bytes.buffer);
}
Future<File> _writeToFile(ByteData data) async {
String dir = (await getTemporaryDirectory()).path;
String filePath = '$dir/tempImage.jpg';
final buffer = data.buffer;
return new File(filePath).writeAsBytes(
buffer.asUint8List(data.offsetInBytes, data.lengthInBytes));
}
Future<ui.Image> _loadImageSource(File imageSource) async {
// ByteData data = await rootBundle.load(asset);
ByteData data = _readFromFile(imageSource);
ui.Codec codec = await ui.instantiateImageCodec(data.buffer.asUint8List());
ui.FrameInfo fi = await codec.getNextFrame();
return fi.image;
}
Future<File> generateImage(File imageSource) async {
File imageResult;
ui.Image image;
await _loadImageSource(imageSource).then((value) {
image = value;
});
if (image != null) {
final recorder = ui.PictureRecorder();
var rect =
Rect.fromPoints(Offset(0.0, 0.0), Offset(kCanvasSize, kCanvasSize));
final canvas = Canvas(recorder, rect);
Size outputSize = rect.size;
Paint paint = new Paint();
//OVERLAY - BlendMode uses the previously drawn content as a mask
paint.blendMode = BlendMode.colorBurn;
paint.color = Colors.red;
// paint.colorFilter = ColorFilter.mode(Colors.blue, BlendMode.colorDodge);
// paint = Paint()..color = Colors.red;
// paint = Paint()..blendMode = BlendMode.multiply;
//Image
Size inputSize = Size(image.width.toDouble(), image.height.toDouble());
final FittedSizes fittedSizes =
applyBoxFit(BoxFit.cover, inputSize, outputSize);
final Size sourceSize = fittedSizes.source;
final Rect sourceRect =
Alignment.center.inscribe(sourceSize, Offset.zero & inputSize);
canvas.saveLayer(rect, paint);
canvas.drawImageRect(
image, sourceRect, rect, paint);
canvas.restore();
final picture = recorder.endRecording();
final img = await picture.toImage(200, 200);
final byteData = await img.toByteData(format: ImageByteFormat.png);
await _writeToFile(byteData).then((value) {
imageResult = value;
});
return imageResult;
}
return imageResult;
}
}
After some research and some adjustments to my previous code, decoding the image as rawUnmodified instead of PNG and using the Bitmap package, I could save the image in its original format (JPG) and achieve what I wanted. If anyone has the same question, below is the code to load an image with canvas, apply a blend, and write it to a file with the same quality:
Future<File> generateImage(
File imageSource, Color color, BlendMode blendMode) async {
File imageResult;
ui.Image image;
await _loadImageSource(imageSource).then((value) {
image = value;
});
if (image != null) {
final recorder = ui.PictureRecorder();
var rect = Rect.fromPoints(Offset(0.0, 0.0),
Offset(image.width.toDouble(), image.height.toDouble()));
final canvas = Canvas(recorder, rect);
Size outputSize = rect.size;
Paint paint = new Paint();
//OVERLAY - BlendMode uses the previously drawn content as a mask
// paint.blendMode = blendMode;
// paint.color = color;
paint.colorFilter = ColorFilter.mode(color, blendMode);
//Image
Size inputSize = Size(image.width.toDouble(), image.height.toDouble());
final FittedSizes fittedSizes =
applyBoxFit(BoxFit.contain, inputSize, outputSize);
final Size sourceSize = fittedSizes.source;
final Rect sourceRect =
Alignment.center.inscribe(sourceSize, Offset.zero & inputSize);
canvas.drawImageRect(image, sourceRect, rect, paint);
final picture = recorder.endRecording();
final img = await picture.toImage(image.width, image.height);
ByteData byteData =
await img.toByteData(format: ui.ImageByteFormat.rawUnmodified);
Bitmap bitmap = Bitmap.fromHeadless(
image.width, image.height, byteData.buffer.asUint8List());
Uint8List headedIntList = bitmap.buildHeaded();
await _writeToFile(headedIntList.buffer.asByteData()).then((value) {
imageResult = value;
});
return imageResult;
}
}
I'm trying to write an asynchronous function that saves a drawing made with canvas to a file. For some reason the function does not complete, meaning the file is not created. I am wondering if the process just takes a long while, since Flutter does not throw any error.
void _saveImage() async {
final recorder = new ui.PictureRecorder();
final canvas = new Canvas(recorder);
MyPainter mypainter = new MyPainter(
shapes: _shapes,
lineColor: Colors.amber,
width: 1.0
);
mypainter.paint(canvas,Size(1280, 720));
final picture = recorder.endRecording();
final img = await picture.toImage(1280, 720);
final pngBytes = await img.toByteData(format: ImageByteFormat.png); //This line will not work
File('/home/filename.png').writeAsBytesSync(await pngBytes.buffer.asInt8List());
}
Edit: currently toByteData is not supported by Flutter web, see https://github.com/flutter/flutter/issues/44908.
PdfImage requires a Uint8List as a parameter, but I have an ImageProvider. So how can we convert an image to a Uint8List in Flutter?
var imageProvider = AssetImage('assets/test.jpg');
final image = PdfImage(
pdf.document,
image:???, /// Uint8List required
width: img.width,
height: img.height,
);
Using FutureBuilder:
Use rootBundle.load()
(await rootBundle.load(/*YOUR IMAGE PATH HERE*/)).buffer.asUint8List()
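For example, wrapped in a FutureBuilder (a sketch assuming an asset registered as assets/test.jpg and import 'package:flutter/services.dart' for rootBundle):
FutureBuilder<Uint8List>(
  future: rootBundle
      .load('assets/test.jpg')
      .then((data) => data.buffer.asUint8List()),
  builder: (context, snapshot) {
    if (!snapshot.hasData) {
      return const CircularProgressIndicator();
    }
    // snapshot.data is the Uint8List you can hand to PdfImage, Image.memory, etc.
    return Image.memory(snapshot.data!);
  },
)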
UPDATE
As load() is an async operation, you need to wait until the data is fully loaded. Try substituting the UI with some loading indicator until then.
ByteData imageData;
@override
void initState() {
super.initState();
rootBundle.load('assets/test.jpg')
.then((data) => setState(() => this.imageData = data));
}
@override
Widget build(BuildContext context) {
if (imageData == null) {
return Center(child: CircularProgressIndicator());
}
final image = PdfImage(
pdf.document,
image: imageData.buffer.asUint8List(),
width: img.width,
height: img.height,
);
...
}
I tried different solutions to convert an image to a Uint8List and finally found one that worked for me.
XFile? image = await imagePicker.pickImage(
source: ImageSource.gallery,
); // Pick a file from the gallery
final bytes = await image!.readAsBytes(); // Converts the file to a Uint8List
For the output, I used MemoryImage:
MemoryImage(bytes);
In Flutter, attaching a local image to a PDF file is actually a simple task.
Just copy and paste the following code and try it:
final ByteData bytes = await rootBundle.load('assets/logo.jpg');
final Uint8List list = bytes.buffer.asUint8List();
final image = PdfImage.file(
pdf.document,
bytes: list,
);
pdf.addPage(pw.Page(build: (pw.Context context) {
return pw.Center(
child: pw.Image(image),
); // Center
}));
You could split initState into two if you prefer:
@override
void initState() {
super.initState();
loadAsset('test.jpg');
}
void loadAsset(String name) async {
var data = await rootBundle.load('assets/$name');
setState(() => this.imageData = data);
}
Note that this will cause build() to run an extra time, but I find it easier on the eye. With Michael's CircularProgressIndicator, this is a harmless extra cycle.
First things first: the goal of this program is to allow users to sign official documents with a phone or tablet.
The program has to save the image as a PNG.
I use Flutter (and Dart) and VS Code to develop this app.
What works:
-The user can draw on the canvas.
What doesn't work:
-The image can't be saved as a PNG.
What I found:
-The **Picture** obtained by ending the canvas's **PictureRecorder** is empty (I tried to display it, with no success).
-I tried to save it as a PNG using **PictureRecorder.endRecording().toImage(100.0, 100.0).toByteData(EncodingFormat.png())**, but the resulting size is really small and it can't be displayed.
If some of you could give me a hint about where the problem could be, it would be really nice.
FYI: Flutter is at the latest version on the dev channel.
Here's the full code:
import 'dart:ui' as ui;
import 'package:flutter/material.dart';
///This class is used just to check that the Image returned by
///the PictureRecorder of the first Canvas is not empty.
///FYI: the image is not displayed.
class CanvasImageDraw extends CustomPainter {
ui.Picture picture;
CanvasImageDraw(this.picture);
@override
void paint(ui.Canvas canvas, ui.Size size) {
canvas.drawPicture(picture);
}
@override
bool shouldRepaint(CustomPainter oldDelegate) {
return true;
}
}
///Class used to display the second canvas
class SecondCanvasView extends StatelessWidget {
ui.Picture picture;
var canvas;
var pictureRecorder= new ui.PictureRecorder();
SecondCanvasView(this.picture) {
canvas = new Canvas(pictureRecorder);
}
@override
Widget build(BuildContext context) {
return new Scaffold(
appBar: new AppBar(
title: new Text('Image Debug'),
),
body: new Column(
children: <Widget>[
new Text('Top'),
CustomPaint(
painter: new CanvasImageDraw(picture),
),
new Text('Bottom'),
],
));
}
}
///This is the CustomPainter of the first Canvas, which is used
///to display the user's drawing/signature.
class SignaturePainter extends CustomPainter {
final List<Offset> points;
Canvas canvas;
ui.PictureRecorder pictureRecorder;
SignaturePainter(this.points, this.canvas, this.pictureRecorder);
void paint(canvas, Size size) {
Paint paint = new Paint()
..color = Colors.black
..strokeCap = StrokeCap.round
..strokeWidth = 5.0;
for (int i = 0; i < points.length - 1; i++) {
if (points[i] != null && points[i + 1] != null)
canvas.drawLine(points[i], points[i + 1], paint);
}
}
bool shouldRepaint(SignaturePainter other) => other.points != points;
}
///Classes used to get the user's drawing/signature, send it to the first canvas,
///then display it.
///'points' is the list of positions returned by the GestureDetector.
class Signature extends StatefulWidget {
ui.PictureRecorder pictureRecorder = new ui.PictureRecorder();
Canvas canvas;
List<Offset> points = [];
Signature() {
canvas = new Canvas(pictureRecorder);
}
SignatureState createState() => new SignatureState(canvas, points);
}
class SignatureState extends State<Signature> {
List<Offset> points = <Offset>[];
Canvas canvas;
SignatureState(this.canvas, this.points);
Widget build(BuildContext context) {
return new Stack(
children: [
GestureDetector(
onPanUpdate: (DragUpdateDetails details) {
RenderBox referenceBox = context.findRenderObject();
Offset localPosition =
referenceBox.globalToLocal(details.globalPosition);
setState(() {
points = new List.from(points)..add(localPosition);
});
},
onPanEnd: (DragEndDetails details) => points.add(null),
),
new Text('data'),
CustomPaint(
painter:
new SignaturePainter(points, canvas, widget.pictureRecorder)),
new Text('data')
],
);
}
}
///The main class, which displays the first Canvas (the drawing/signing area)
///
///Floating action button used to stop the PictureRecorder's recording,
///then send the Picture to the next view/Canvas which should display it
class DemoApp extends StatelessWidget {
Signature sign = new Signature();
Widget build(BuildContext context) => new Scaffold(
body: sign,
floatingActionButton: new FloatingActionButton(
child: new Icon(Icons.save),
onPressed: () async {
ui.Picture picture = sign.pictureRecorder.endRecording();
Navigator.push(context, new MaterialPageRoute(builder: (context) => new SecondCanvasView(picture)));
},
),
);
}
void main() => runApp(new MaterialApp(home: new DemoApp()));
First off, thanks for the interesting question and the self-contained code. That is refreshing to see and a lot easier to help out with!
There are a few issues with your code. The first is that you shouldn't be passing the canvas & points from Signature to SignatureState; that's an antipattern in Flutter.
The code below works. I've also done a few little things to clean it up that weren't strictly necessary for the answer.
import 'dart:ui' as ui;
import 'package:flutter/material.dart';
///This class is used just to check that the Image returned by
///the PictureRecorder of the first Canvas is not empty.
class CanvasImageDraw extends CustomPainter {
ui.Image image;
CanvasImageDraw(this.image);
@override
void paint(ui.Canvas canvas, ui.Size size) {
// simple aspect fit for the image
var hr = size.height / image.height;
var wr = size.width / image.width;
double ratio;
double translateX;
double translateY;
if (hr < wr) {
ratio = hr;
translateX = (size.width - (ratio * image.width)) / 2;
translateY = 0.0;
} else {
ratio = wr;
translateX = 0.0;
translateY = (size.height - (ratio * image.height)) / 2;
}
canvas.translate(translateX, translateY);
canvas.scale(ratio, ratio);
canvas.drawImage(image, new Offset(0.0, 0.0), new Paint());
}
@override
bool shouldRepaint(CanvasImageDraw oldDelegate) {
return image != oldDelegate.image;
}
}
///Class used to display the second canvas
class SecondView extends StatelessWidget {
ui.Image image;
var pictureRecorder = new ui.PictureRecorder();
SecondView({this.image});
@override
Widget build(BuildContext context) {
return new Scaffold(
appBar: new AppBar(
title: new Text('Image Debug'),
),
body: new Column(
children: <Widget>[
new Text('Top'),
CustomPaint(
painter: new CanvasImageDraw(image),
size: new Size(200.0, 200.0),
),
new Text('Bottom'),
],
));
}
}
///This is the CustomPainter of the first Canvas, which is used
///to display the user's drawing/signature.
class SignaturePainter extends CustomPainter {
final List<Offset> points;
final int revision;
SignaturePainter(this.points, [this.revision = 0]);
void paint(canvas, Size size) {
if (points.length < 2) return;
Paint paint = new Paint()
..color = Colors.black
..strokeCap = StrokeCap.round
..strokeWidth = 5.0;
for (int i = 0; i < points.length - 1; i++) {
if (points[i] != null && points[i + 1] != null)
canvas.drawLine(points[i], points[i + 1], paint);
}
}
// simplified this, but if you ever modify points instead of changing length you'll
// have to revise.
bool shouldRepaint(SignaturePainter other) => other.revision != revision;
}
///Classes used to get the user's drawing/signature, send it to the first canvas,
///then display it.
///'points' is the list of positions returned by the GestureDetector.
class Signature extends StatefulWidget {
Signature({Key key}): super(key: key);
@override
State<StatefulWidget> createState()=> new SignatureState();
}
class SignatureState extends State<Signature> {
List<Offset> _points = <Offset>[];
int _revision = 0;
ui.Image get rendered {
var pictureRecorder = new ui.PictureRecorder();
Canvas canvas = new Canvas(pictureRecorder);
SignaturePainter painter = new SignaturePainter(_points);
var size = context.size;
// if you pass a smaller size here, it cuts off the lines
painter.paint(canvas, size);
// if you use a smaller size for toImage, it also cuts off the lines - so I've
// done that in here as well, as this is the only place it's easy to get the width & height.
return pictureRecorder.endRecording().toImage(size.width.floor(), size.height.floor());
}
void _addToPoints(Offset position) {
_revision++;
_points.add(position);
}
Widget build(BuildContext context) {
return new Stack(
children: [
GestureDetector(
onPanStart: (DragStartDetails details) {
RenderBox referenceBox = context.findRenderObject();
Offset localPosition = referenceBox.globalToLocal(details.globalPosition);
setState(() {
_addToPoints(localPosition);
});
},
onPanUpdate: (DragUpdateDetails details) {
RenderBox referenceBox = context.findRenderObject();
Offset localPosition = referenceBox.globalToLocal(details.globalPosition);
setState(() {
_addToPoints(localPosition);
});
},
onPanEnd: (DragEndDetails details) => setState(() => _addToPoints(null)),
),
CustomPaint(painter: new SignaturePainter(_points, _revision)),
],
);
}
}
///The main class, which displays the first Canvas (the drawing/signing area)
///
///Floating action button used to stop the PictureRecorder's recording,
///then send the Picture to the next view/Canvas which should display it
class DemoApp extends StatelessWidget {
GlobalKey<SignatureState> signatureKey = new GlobalKey();
Widget build(BuildContext context) => new Scaffold(
body: new Signature(key: signatureKey),
floatingActionButton: new FloatingActionButton(
child: new Icon(Icons.save),
onPressed: () async {
var image = signatureKey.currentState.rendered;
Navigator.push(context, new MaterialPageRoute(builder: (context) => new SecondView(image: image)));
},
),
);
}
void main() => runApp(new MaterialApp(home: new DemoApp()));
The biggest issue you had is that you were holding your own canvas objects all over the place - that's not something you should ever do really. CustomPainter provides its own canvas. When you were painting, I don't think it was actually painting to the right place.
Also, you were recreating the list each time you added a point. I'm going to assume that's because your canvas wouldn't draw otherwise due to the other.points != points. I've changed it to take a revision int which is incremented each time something is added to the list. Even better would be to use your own list subclass that did this on its own, but that's a bit much for this example. You could also just use other.points.length != points.length if you're for sure never going to modify any of the elements in the list.
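If you ever did want such a list, a sketch (hypothetical, keeping the pre-null-safety style of the code above) could wrap ListBase and bump a revision on every mutation:
import 'dart:collection';

class RevisionedPoints extends ListBase<Offset> {
  final List<Offset> _inner = <Offset>[];
  int revision = 0;

  @override
  int get length => _inner.length;

  @override
  set length(int newLength) {
    _inner.length = newLength;
    revision++;
  }

  @override
  Offset operator [](int index) => _inner[index];

  @override
  void operator []=(int index, Offset value) {
    _inner[index] = value;
    revision++;
  }

  @override
  void add(Offset element) {
    _inner.add(element);
    revision++;
  }
}
shouldRepaint could then just compare other.points.revision != points.revision.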
There were a few things to do considering the size of the image as well. I've gone the easiest route and made it so that the canvas is just the same size as its parent, so that the rendering is easier as it can just use that same size again (and therefore renders an image the size of the phone's screen). If you didn't want to do this but rather render your own size, it can be done. You'd have to pass in something to the SignaturePainter so it would know how to scale the points so they fit within the new size (you could adapt the aspect fit code in CanvasImageDraw for that if you so wished).
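If you wanted a fixed output size instead, a hypothetical variant of the rendered getter (note it simply stretches to the target size rather than aspect-fitting) could scale the canvas before painting:
ui.Image renderedAtSize(Size targetSize) {
  var pictureRecorder = new ui.PictureRecorder();
  Canvas canvas = new Canvas(pictureRecorder);
  var size = context.size;
  // Scale from the on-screen size to the requested output size so the
  // recorded points still land inside the target image.
  canvas.scale(targetSize.width / size.width, targetSize.height / size.height);
  SignaturePainter(_points, _revision).paint(canvas, size);
  return pictureRecorder.endRecording()
      .toImage(targetSize.width.floor(), targetSize.height.floor());
}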
If you have any other questions about what I did, feel free to ask.