I want to store and retrieve image data in SQLite. I'm using the code below:
var image = await ImagePicker.pickImage(source: imageSource);
List<int> bytes = await image.readAsBytes();
I want to take an image and then save it to SQLite. Is it possible to get and set an image in an SQLite database?
I found the solution to my question.
I'm getting the image from image_picker and encoding it to a Base64 string, as shown below:
Uint8List _bytesImage;
File _image;
String base64Image;

Future getImage() async {
  // Pick an image from the gallery.
  var image2 = await ImagePicker.pickImage(
    source: ImageSource.gallery,
  );
  // Read the file bytes and encode them as a Base64 string.
  List<int> imageBytes = image2.readAsBytesSync();
  print(imageBytes);
  base64Image = base64Encode(imageBytes);
  print('string is');
  print(base64Image);
  print("You selected gallery image : " + image2.path);
  // Decode back to bytes, e.g. to preview the image in memory.
  _bytesImage = Base64Decoder().convert(base64Image);
  setState(() {
    _image = image2;
  });
}
After that I created an SQLite database helper file, dbhelper.dart, to store and retrieve the String values, and a database model file, imagedata.dart, to get and set those values.
imagedata.dart
class Imagedata {
  int id;
  String image;

  Imagedata(this.id, this.image);

  Imagedata.fromMap(Map map) {
    id = map['id'];
    image = map['image'];
  }
}
dbhelper.dart
class DBHelper {
  static Database _db;

  Future<Database> get db async {
    if (_db != null) return _db;
    _db = await initDb();
    return _db;
  }

  initDb() async {
    io.Directory documentsDirectory = await getApplicationDocumentsDirectory();
    String path = join(documentsDirectory.path, "test.db");
    var theDb = await openDatabase(path, version: 1, onCreate: _onCreate);
    return theDb;
  }

  void _onCreate(Database db, int version) async {
    // When creating the db, create the table.
    await db.execute(
        "CREATE TABLE Imagedata(id INTEGER PRIMARY KEY, image TEXT)");
    print("Created tables");
  }

  void saveImage(Imagedata imagedata) async {
    var dbClient = await db;
    await dbClient.transaction((txn) async {
      // Use a parameterized insert; concatenating the int id into the SQL
      // string would fail and is unsafe anyway.
      return await txn.rawInsert(
          'INSERT INTO Imagedata(id, image) VALUES(?, ?)',
          [imagedata.id, imagedata.image]);
    });
  }

  Future<List<Imagedata>> getMyImage() async {
    var dbClient = await db;
    List<Map> list = await dbClient.rawQuery('SELECT * FROM Imagedata');
    List<Imagedata> images = new List();
    for (int i = 0; i < list.length; i++) {
      images.add(new Imagedata(list[i]["id"], list[i]["image"]));
    }
    print(images.length);
    return images;
  }

  Future<int> deleteMyImage(Imagedata imagedata) async {
    var dbClient = await db;
    // "DELETE * FROM ..." is not valid SQL; this removes all rows from the table.
    int res = await dbClient.rawDelete('DELETE FROM Imagedata');
    return res;
  }
}
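The answer never shows the call that actually persists the encoded string, so here is a minimal sketch of that wiring, assuming the getImage() and DBHelper above (the _saveToDb helper and the hard-coded id are hypothetical):
Future<void> _saveToDb() async {
  // getImage() above set base64Image; persist it through the helper.
  final dbHelper = DBHelper();
  dbHelper.saveImage(Imagedata(1, base64Image)); // id 1 is just an example value
}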
Finally, I get the String value back from the database and decode it into an image.
Getting the image from the database:
Future<List<Imagedata>> fetchImageFromDatabase() async {
  var dbHelper = DBHelper();
  Future<List<Imagedata>> images = dbHelper.getMyImage();
  return images;
}
After that, decode the String value back to image bytes:
String DecoImage;
Uint8List _bytesImage;
FutureBuilder<List<Imagedata>>(
  future: fetchImageFromDatabase(),
  builder: (context, snapshot) {
    if (snapshot.hasData) {
      return new ListView.builder(
        itemCount: snapshot.data.length,
        itemBuilder: (context, index) {
          DecoImage = snapshot.data[index].image;
          _bytesImage = Base64Decoder().convert(DecoImage);
          return new SingleChildScrollView(
            child: Container(
              child: _bytesImage == null
                  ? new Text('No image value.')
                  : Image.memory(_bytesImage),
            ),
          );
        },
      );
    }
    // The builder must return a widget for every state, including while loading.
    return new CircularProgressIndicator();
  },
),
I think this will be helpful for other Flutter/SQLite developers.
import 'dart:convert';
import 'dart:typed_data';
Uint8List bytesImage1;
bool bolWithImage1 = false;
try {
  bytesImage1 = base64Decode(base64StringFromSql);
  bolWithImage1 = true;
} catch (err) {
  // Decoding failed; bolWithImage1 stays false.
}
i.e. if bolWithImage1 is true, the conversion succeeded. You can then use Image.memory(bytesImage1, ...) to show the image in Flutter.
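As a minimal sketch of that display step (the surrounding widget and the placeholder text are hypothetical):
// Show the decoded bytes only when the Base64 conversion succeeded.
Widget buildStoredImage() {
  return bolWithImage1 ? Image.memory(bytesImage1) : Text('No image stored');
}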
You can also save the image as a BLOB (data type: Uint8List). Storing it either as a BLOB (Uint8List) or as a String (with the Base64 encoder) works in sqflite. The key was to use MemoryImage instead of Image.memory wherever an ImageProvider is expected; otherwise you get a "type 'Image' is not a subtype of type 'ImageProvider'" error.
// First create a column in the database to store the image as a BLOB.
await db.execute('CREATE TABLE $photoTable($colId INTEGER PRIMARY KEY AUTOINCREMENT, $colImage BLOB)');

// Use ImagePicker to get the image.
File imageFile = await ImagePicker.pickImage(source: ImageSource.camera, maxHeight: 200, maxWidth: 200, imageQuality: 70);

// Get the file contents in Uint8List format.
Uint8List imageInBytes = imageFile.readAsBytesSync();

// Write the bytes to the database as a BLOB.
db.rawUpdate('UPDATE $photoTable SET $colImage = ? WHERE $colId = ?', [imageInBytes, colId]);

// Retrieve from the database as a BLOB of Uint8List.
var result = await db.query(photoTable, orderBy: '$colId ASC');
List<Photo> photoList = List<Photo>();
for (int i = 0; i < result.length; i++) {
  photoList.add(Photo.fromMapObject(result[i]));
}

// Map constructor inside the Photo object.
Photo.fromMapObject(Map<String, dynamic> map) {
  this._id = map['id'];
  this._imageFile = map['image'];
}
// Display the image using MemoryImage (an ImageProvider) instead of
// Image.memory (which returns an Image widget).
return Row(
    mainAxisAlignment: MainAxisAlignment.center,
    children: <Widget>[
      CircleAvatar(
        // `photo` is a Photo instance taken from photoList; its image getter
        // is assumed to expose the stored bytes.
        backgroundImage: MemoryImage(photo.image),
        backgroundColor: Colors.blueGrey[50],
      ),
    ]);
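The snippet above only shows updating an existing row; as a minimal sketch, inserting a new BLOB row with sqflite's parameterized helper might look like this (same table and column names assumed, insertPhoto is hypothetical):
// Insert a freshly picked image as a new BLOB row.
Future<int> insertPhoto(Uint8List bytes) async {
  return await db.insert(photoTable, {colImage: bytes});
}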
Related
How do I upload images to a server using Flutter with a Laravel API? I tried using GetX, but it's returning null. I also have the image_picker and image_cropper packages in my pubspec.yaml.
Select Image from Gallery using File Picker
import 'dart:io';
import 'package:flutter/material.dart';
import 'package:file_picker/file_picker.dart';
class ImageScreen extends StatefulWidget {
ImageScreen();
@override
State<ImageScreen> createState() => _ImageScreenState();
}
class _ImageScreenState extends State<ImageScreen> {
File file;
Future<File> uploadImage() async {
FilePickerResult result = await FilePicker.platform.pickFiles();
if (result != null) {
setState(() {
file = File(result.files.single.path);
});
print(result.files.single.path);
} else {
// User canceled the picker
}
return file;
}
@override
Widget build(BuildContext context) {
return Scaffold(
body: GestureDetector(
onTap: () {
uploadImage();
},
child: Container(
color: Colors.green,
padding: EdgeInsets.all(5),
child: Text('Upload Image', style: TextStyle(fontSize: 16, color: Colors.white),)
),
),
);
}
}
Uploading the image to the server using http.MultipartFile:
static Future<dynamic> uploadImage({File file}) async {
try {
http.MultipartRequest request = new http.MultipartRequest("POST", _uri);
http.MultipartFile multipartFile = await http.MultipartFile.fromPath('file_name', file.path);
request.files.add(multipartFile);
var streamedResponse = await request.send();
var response = await http.Response.fromStream(streamedResponse);
if (response.statusCode == 200 ) {
return jsonDecode(response.body);
}
}
catch (e) {
return null;
}
}
final images = <File>[].obs;
Use this method for picking images:
Future pickImage(ImageSource source) async {
  ImagePicker imagePicker = ImagePicker();
  XFile pickedFile = await imagePicker.pickImage(source: source, imageQuality: 80);
  // Check the picker result itself; pickedFile.path would throw if the user
  // cancelled the picker, and a File object is never null.
  if (pickedFile != null) {
    File imageFile = File(pickedFile.path);
    print(imageFile);
    images.add(imageFile);
  } else {
    Get.showSnackbar(GetSnackBar(message: "Please select an image file"));
  }
}
Use this for uploading images to the server with your specific URL. I have used dio for uploading; you can use http as well.
Future<String> uploadImage(File file) async {
  String fileName = file.path.split('/').last;
  // You can edit these for your own needs.
  var _queryParameters = {
    'api_token': 'your token if required',
  };
  var _uri = 'Your base url';
  var formData = FormData.fromMap({
    "file": await MultipartFile.fromFile(file.path, filename: fileName),
  });
  var response =
      await dio.post(_uri, data: formData, queryParameters: _queryParameters);
  print(response.data);
  if (response.data['data'] != false) {
    return response.data['data'];
  } else {
    throw new Exception(response.data['message']);
  }
}
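A small usage sketch tying the two pieces together, assuming the `images` observable list and the `uploadImage` function above (uploadAllPicked is a hypothetical helper):
Future<List<String>> uploadAllPicked() async {
  final uploadedPaths = <String>[];
  for (final file in images) {
    // Upload each picked file in turn and collect whatever the server returns.
    uploadedPaths.add(await uploadImage(file));
  }
  return uploadedPaths;
}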
This works for me, so maybe others might need it as well.
uploadImage(imageFile) async {
var stream = new http.ByteStream(
DelegatingStream.typed(imageFile.openRead()));
var length = await imageFile.length();
var uri = Uri.parse(
'http://192.168.5.196/ApiFileUploadTest/public/api/uploading-file-api');
var request = new http.MultipartRequest('POST', uri);
var multipartFile = new http.MultipartFile('file', stream, length,
filename: basename(imageFile.path));
request.files.add(multipartFile);
var response = await request.send();
print(response.statusCode);
response.stream.transform(utf8.decoder).listen((value) {
print(value);
});
}
I cannot display the images selected from the gallery in a grid. In this code I am displaying the images in a list, and I want to turn them into a small grid in one row, but I don't know how. Can you please help?
Here's my code for selecting multiple images using file picker.
FileType fileType;
String imgName, _imgPath;
Map<String, String> imgPaths;
List<File> _imgList = List();
bool isLoadingPath = false;
_openFile() async {
setState(() => isLoadingPath = true);
try {
_imgPath = null;
imgPaths = await FilePicker.getMultiFilePath(
type: fileType != null ? fileType : FileType.custom,
allowedExtensions: ['jpg', 'png']);
_imgList.clear();
imgPaths.forEach((key, val) {
print('{ key: $key, value: $val}');
File file = File(val);
_imgList.add(file);
});
} on PlatformException catch (e) {
print("Unsupported operation" + e.toString());
}
if (!mounted) return;
setState(() {
isLoadingPath = false;
imgName = _imgPath != null
? _imgPath.split('/').last
: imgPaths != null
? imgPaths.keys.toString()
: '...';
});
}
Displaying the file names in a list (how do I display the images themselves?):
Widget _fileBuilder() {
return Builder(
builder: (BuildContext context) => isLoadingPath
? Padding(
padding: const EdgeInsets.only(bottom: 4.0))
: _imgPath != null || imgPaths != null && (imgPaths.length > 1 && imgPaths.length < 5)
? new Container(
height: imgPaths.length > 1
? MediaQuery.of(context).size.height * 0.15
: MediaQuery.of(context).size.height * 0.10,
width: MediaQuery.of(context).size.width,
child: new Scrollbar(
child: new ListView.separated(
itemCount: imgPaths != null && imgPaths.isNotEmpty
? imgPaths.length
: 1,
itemBuilder: (BuildContext context, int index) {
final bool isMultiPath = imgPaths != null && imgPaths.isNotEmpty;
final int fileNo = index + 1;
final String name = 'File $fileNo : ' + (isMultiPath
? imgPaths.keys.toList()[index]
: _imgPath ?? '...');
final filePath = isMultiPath
? imgPaths.values.toList()[index].toString()
: _imgPath;
return new ListTile(
title: Transform.translate(
offset: Offset(-25, 0),
child: new Text(
name,
),
),
leading: Icon(Icons.attach_file_outlined, color: Color(0xFFF3A494),),
dense: true,
);
},
separatorBuilder:
(BuildContext context, int index) =>
new Divider(),
)),
)
: new Container(child: Text('4 photos is the maximum'),),
);
}
Dependencies:
file_picker: ^1.4.2
path:
mime:
async:
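Not part of the original post, but as a minimal sketch, the picked files could be shown as thumbnails in a single-row grid like this, assuming _imgList holds the selected File objects (the _imageGrid helper is hypothetical):
Widget _imageGrid() {
  return GridView.count(
    crossAxisCount: 4, // up to 4 thumbnails in one row
    shrinkWrap: true,
    physics: NeverScrollableScrollPhysics(),
    children: _imgList
        .map((file) => Padding(
              padding: EdgeInsets.all(2),
              child: Image.file(file, fit: BoxFit.cover),
            ))
        .toList(),
  );
}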
What you can do is use the image_picker dependency; you can find its documentation on pub.dev. After installing it, pick the image and store it in a file on the device. With that file name, you can then access the image.
You can try the code below; it worked for me. Don't forget to import 'dart:io'; to use File.
var _storedImage;

Future<void> _takePictureByCamera() async {
  final picker = ImagePicker();
  final imageFile = await picker.getImage(
      source: ImageSource.camera, maxWidth: 600, imageQuality: 60);
  // Guard against the user cancelling the camera, as in the gallery version below.
  if (imageFile == null) {
    return;
  }
  setState(() {
    _storedImage = File(imageFile.path);
  });
  final appDir = await path_provider.getApplicationDocumentsDirectory();
  final fileName = path.basename(imageFile.path);
  final savedImage = File(imageFile.path).copy('${appDir.path}/$fileName');
  widget.onSelectImage(savedImage);
}
Future<void> _takePictureByGallery() async {
final picker = ImagePicker();
final imageFile =
await picker.getImage(source: ImageSource.gallery, maxWidth: 600);
if (imageFile == null) {
return;
}
setState(() {
_storedImage = File(imageFile.path);
});
final appDir = await path_provider.getApplicationDocumentsDirectory();
final fileName = path.basename(imageFile.path);
final savedImage = File(imageFile.path).copy('${appDir.path}/$fileName');
widget.onSelectImage(savedImage);
}
After selecting or capturing the image, you can do this to display it:
void getImage() async {
final pickedImage = await showModalBottomSheet(
context: accountTabScaffoldMessengerKey.currentContext!,
backgroundColor: Colors.transparent,
enableDrag: true,
// elevation: 0,
builder: (context) => AccountImageUpdateBottomSheet(_selectImage),
);
_selectImage(pickedImage);
}
void _selectImage(File pickedImage) {
setState(() {
_pickedImage = pickedImage;
});
}
The image you selected is stored in _pickedImage, and you can display it with Image.file(_pickedImage).
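A tiny sketch of that display step (the helper name and placeholder icon are just examples):
Widget _buildPickedImage() {
  // Show the picked file when available, otherwise a placeholder icon.
  return _pickedImage != null
      ? Image.file(_pickedImage, fit: BoxFit.cover)
      : Icon(Icons.person);
}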
How can I convert the Uint8List image data from the screenshot package so I can save it with the image_gallery_saver package's saveFile method, which needs a String path?
TextButton(
onPressed: () {
_imageFile = null;
screenshotController
.capture()
.then((Uint8List image) async {
//print("Capture Done");
setState(() {
_imageFile = image;
});
final result = await ImageGallerySaver.saveFile();
print("File Saved to Gallery");
}).catchError((onError) {
print(onError);
});
I found a solution:
TextButton(
onPressed: () {
_imageFile = null;
screenshotController
.capture()
.then((Uint8List image) async {
//print("Capture Done");
String dir =
(await getApplicationDocumentsDirectory()).path;
File file = File("$dir/" +
DateTime.now().millisecondsSinceEpoch.toString() +
".png");
await file.writeAsBytes(image);
setState(() {
_imageFile = image;
});
final result =
await ImageGallerySaver.saveFile(file.path);
print("File Saved to Gallery");
}).catchError((onError) {
print(onError);
});
},
child: Icon(Icons.change_history),
),
I have the following code where I fetch an image from Firebase Storage as raw bytes. Now I want to serve this image through CachedNetworkImage so that I don't have to fetch it from storage every time. Since CachedNetworkImage expects a URL and I am fetching the image data directly, how do I use CachedNetworkImage here?
Here's my code:
final FirebaseStorage storage = FirebaseStorage(
app: Firestore.instance.app,
storageBucket: 'gs://my-project.appspot.com');
Uint8List imageBytes;
String errorMsg;
_MyHomePageState() {
storage.ref().child('selfies/me2.jpg').getData(10000000).then((data) =>
setState(() {
imageBytes = data;
})
).catchError((e) =>
setState(() {
errorMsg = e.error;
})
);
}
@override
Widget build(BuildContext context) {
var img = imageBytes != null ? Image.memory(
imageBytes,
fit: BoxFit.cover,
) : Text(errorMsg != null ? errorMsg : "Loading...");
return new Scaffold(
appBar: new AppBar(
title: new Text(widget.title),
),
body: new ListView(
children: <Widget>[
img,
],
));
}
}
Cached network image and Flutter cache manager
The package Cached network image depends on another package called Flutter cache manager in order to store and retrieve image files.
Flutter cache manager
You need to download your image files and put them in the cache using the package. Here is example code that gets a file and its download URL from Firebase Storage and puts it in the cache:
// import the flutter_cache_manager package
import 'package:flutter_cache_manager/flutter_cache_manager.dart';
// ... other imports
class MyCacheManager {
Future<void> cacheImage() async {
final FirebaseStorage storage = FirebaseStorage(
app: Firestore.instance.app,
storageBucket: 'gs://my-project.appspot.com',
);
final Reference ref = storage.ref().child('selfies/me2.jpg');
// Get your image url
final imageUrl = await ref.getDownloadURL();
// Download your image data
final imageBytes = await ref.getData(10000000);
// Put the image file in the cache
await DefaultCacheManager().putFile(
imageUrl,
imageBytes,
fileExtension: "jpg",
);
}
}
Cached network image
Next, you will use the CachedNetworkImage widget as shown in the documentation.
// ... some code
@override
Widget build(BuildContext context) {
return Scaffold(
body: CachedNetworkImage(
imageUrl: "your_image_link_here",
placeholder: (context, url) => CircularProgressIndicator(),
errorWidget: (context, url, error) => Icon(Icons.error),
),
);
}
If you put your image files in the cache by using Flutter cache manager, Cached network image should retrieve them from the cache directly. If your image files expire or the cache is cleared somehow, it will download and put them in the cache for you.
Full Example
import 'package:cached_network_image/cached_network_image.dart';
import 'package:cloud_firestore/cloud_firestore.dart';
import 'package:firebase_storage/firebase_storage.dart';
import 'package:flutter/material.dart';
import 'package:flutter_cache_manager/flutter_cache_manager.dart';
class MyCacheManager {
final _storage = FirebaseStorage(
app: FirebaseFirestore.instance.app,
storageBucket: 'gs://my-project.appspot.com',
);
final defaultCacheManager = DefaultCacheManager();
Future<String> cacheImage(String imagePath) async {
final Reference ref = _storage.ref().child(imagePath);
// Get your image url
final imageUrl = await ref.getDownloadURL();
// Check if the image file is not in the cache
if ((await defaultCacheManager.getFileFromCache(imageUrl))?.file == null) {
// Download your image data
final imageBytes = await ref.getData(10000000);
// Put the image file in the cache
await defaultCacheManager.putFile(
imageUrl,
imageBytes,
fileExtension: "jpg",
);
}
// Return image download url
return imageUrl;
}
}
class MyApp extends StatefulWidget {
@override
_MyAppState createState() => _MyAppState();
}
class _MyAppState extends State<MyApp> {
String _imageUrl;
@override
void initState() {
final myCacheManager = MyCacheManager();
// Image path from Firebase Storage
var imagePath = 'selfies/me2.jpg';
// This will try to find image in the cache first
// If it can't find anything, it will download it from Firabase storage
myCacheManager.cacheImage(imagePath).then((String imageUrl) {
setState(() {
// Get image url
_imageUrl = imageUrl;
});
});
super.initState();
}
@override
Widget build(BuildContext context) {
return Scaffold(
body: Center(
child: _imageUrl != null
? CachedNetworkImage(
imageUrl: _imageUrl,
placeholder: (context, url) => CircularProgressIndicator(),
errorWidget: (context, url, error) => Icon(Icons.error),
)
: CircularProgressIndicator(),
),
);
}
}
Try this way using the firebase_image package. You need to combine the image path (selfies/me2.jpg) with the bucket URL (gs://my-project.appspot.com).
Image(
image: FirebaseImage('gs://bucket123/userIcon123.jpg'),
// Works with standard parameters, e.g.
fit: BoxFit.fitWidth,
width: 100,
// ... etc.
)
I'm not a Firestore user, but this should work.
It might need a little modification; please share in a comment so I can update my answer accordingly.
You can get a File object as follows:
import 'dart:io';
import 'dart:typed_data';

Uint8List readyData = imageBytes;
File('my_image.jpg').writeAsBytes(readyData);
and save it using image_gallery_saver, so the code should look like:
Future<String> _createFileFromString() async {
  final encodedStr = "...";
  Uint8List bytes = base64.decode(encodedStr);
  String dir = (await getApplicationDocumentsDirectory()).path;
  String fullPath = '$dir/abc.png';
  print("local file full path ${fullPath}");
  File file = File(fullPath);
  await file.writeAsBytes(bytes);
  print(file.path);
  final result = await ImageGallerySaver.saveImage(bytes);
  print(result);
  return file.path;
}
For your storage instance, use a method like this:
Future<void> downloadURLExample() async {
String downloadURL = await storage.ref('selfies/me2.jpg')
.getDownloadURL();
// Within your widgets:
// CachedNetworkImage(imageUrl: downloadURL);
}
To get it working with Firebase Storage, including offline functionality, I changed it this way:
Future<String> cacheImage(String imagePath) async {
var fileinfo = await defaultCacheManager.getFileFromCache(imagePath);
if(fileinfo != null)
{
return fileinfo.file.path;
} else{
final Reference ref = _storage.child(imagePath);
// Get your image url
final imageUrl = await ref.getDownloadURL();
// Check if the image file is not in the cache
// Download your image data
final imageBytes = await ref.getData(10000000);
// Put the image file in the cache
var file = await defaultCacheManager.putFile(
imageUrl,
imageBytes!,
key: imagePath,);
return file.path;
}
}
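A small usage sketch for the method above, assuming it is called from a widget and the returned path points at the cached file (the FutureBuilder wiring is just an example):
FutureBuilder<String>(
  future: myCacheManager.cacheImage('selfies/me2.jpg'),
  builder: (context, snapshot) {
    if (!snapshot.hasData) {
      return const CircularProgressIndicator();
    }
    // The method returns a local file path, so Image.file can render it
    // even when the device is offline (requires dart:io's File).
    return Image.file(File(snapshot.data!));
  },
),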
For anyone still stuck with this: try the following. It requires no hacks and uses CachedNetworkImageProvider's built-in retrieval.
first screen:
CachedNetworkImage(
imageUrl: "https://whereismyimage.com",
progressIndicatorBuilder:
(context, url, progress) {
return CircularProgressIndicator(
value: progress.progress,
);
},
errorWidget: (context, url, error) => const Icon(Icons.error),
),
then on the second screen:
Image(image: CachedNetworkImageProvider("https://whereismyimage.com")),
The CachedNetworkImageProvider knows how to retrieve the cached image using the url.
Check out cached_network_image: ^2.5.0 package.
How to use it?
CachedNetworkImage(
imageUrl: "http://via.placeholder.com/350x150",
placeholder: (context, url) => CircularProgressIndicator(),
errorWidget: (context, url, error) => Icon(Icons.error),
),
I'm using qr_flutter to create a QrImage. It works, but I would like to convert the QrImage into an image in order to create a PDF file to print on the printer. Please kindly help!
QrImage(
data: qrString,
size: 300.0,
version: 10,
backgroundColor: Colors.white,
),
Use a RepaintBoundary widget with a key to export the widget as PNG bytes (and optionally a Base64 string), which you can then use as an image.
Example:
Future<Uint8List> _getWidgetImage() async {
try {
RenderRepaintBoundary boundary =
_renderObjectKey.currentContext.findRenderObject();
ui.Image image = await boundary.toImage(pixelRatio: 3.0);
ByteData byteData =
await image.toByteData(format: ui.ImageByteFormat.png);
var pngBytes = byteData.buffer.asUint8List();
var bs64 = base64Encode(pngBytes);
debugPrint(bs64.length.toString());
return pngBytes;
} catch (exception) {}
}
@override
Widget build(BuildContext context) {
return Scaffold(
body: Column(children: [
RepaintBoundary(
key: _renderObjectKey,
child: QrImage(
data: "some text",
size: 300.0,
version: 10,
backgroundColor: Colors.white,
),
),
RaisedButton(onPressed: () {
_getWidgetImage();
})
]));
}
Future<Uint8List> toQrImageData(String text) async {
try {
final image = await QrPainter(
data: text,
version: QrVersions.auto,
gapless: false,
color: hexToColor('#000000'),
emptyColor: hexToColor('#ffffff'),
).toImage(300);
final a = await image.toByteData(format: ImageByteFormat.png);
return a.buffer.asUint8List();
} catch (e) {
throw e;
}
}
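Since the question asks about producing a PDF for printing, here is a minimal sketch of feeding those PNG bytes into a document, assuming the pdf and printing packages are added to pubspec.yaml (printQr is a hypothetical helper built on toQrImageData above):
import 'package:pdf/widgets.dart' as pw;
import 'package:printing/printing.dart';

Future<void> printQr(String text) async {
  final pngBytes = await toQrImageData(text);
  final doc = pw.Document();
  doc.addPage(
    pw.Page(
      // Center the QR code image on a single page.
      build: (pw.Context context) =>
          pw.Center(child: pw.Image(pw.MemoryImage(pngBytes))),
    ),
  );
  // Hand the generated PDF to the platform print dialog.
  await Printing.layoutPdf(onLayout: (format) async => doc.save());
}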
A more up-to-date, typed answer that adds responsibility segregation and null safety, extending the correct one from @Zroq, would be:
Future<Uint8List> createImageFromRenderKey({GlobalKey<State<StatefulWidget>>? renderKey}) async {
try {
final RenderRepaintBoundary boundary = renderKey?.currentContext?.findRenderObject()! as RenderRepaintBoundary;
final ui.Image image = await boundary.toImage(pixelRatio: 3);
final ByteData? byteData = await image.toByteData(format: ui.ImageByteFormat.png);
return byteData!.buffer.asUint8List();
} catch(_) {
rethrow;
}
}
The idea is based on the same principle: using the global render key to create the ByteData that lets you build the Uint8List buffer. However, newer versions of Flutter changed the return type of findRenderObject to a nullable RenderObject? rather than a RenderRepaintBoundary, hence the cast.
The rethrow is a (dirty) way of working around the limitation/small bug where the RepaintBoundary may still be scheduled to repaint (exposed as boundary.debugNeedsPaint), in which case the capture can throw an exception or produce a low-quality image buffer. So if the boundary is still being painted, I rethrow from the method.
More details about the stack trace: https://github.com/theyakka/qr.flutter/issues/112
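A sketch of a gentler alternative to rethrowing, which simply waits until the boundary has finished painting before capturing (the delay value and helper name are arbitrary assumptions):
Future<Uint8List> captureWhenPainted(GlobalKey renderKey) async {
  final boundary =
      renderKey.currentContext!.findRenderObject()! as RenderRepaintBoundary;
  // debugNeedsPaint is true while the boundary is still scheduled to paint;
  // retry shortly afterwards instead of failing.
  if (boundary.debugNeedsPaint) {
    await Future<void>.delayed(const Duration(milliseconds: 20));
    return captureWhenPainted(renderKey);
  }
  final image = await boundary.toImage(pixelRatio: 3);
  final byteData = await image.toByteData(format: ui.ImageByteFormat.png);
  return byteData!.buffer.asUint8List();
}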