I want to display an image stored in a MySQL database.
The problem is that I can't convert the BLOB format into a Uint8List; I searched and found solutions, but none of them works.
Grab the blob from the JSON:
var blob = yourJSONMapHere['yourJSONKeyHere'];
var image = base64.decode(blob); // image is a Uint8List
Now use image in an Image.memory:
Container(child: Image.memory(image));
This solution didn't work, because base64.decode needs a String source, not a BLOB, to convert.
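What works depends on how the backend serializes the BLOB column before it reaches Flutter. Here is a minimal sketch, assuming the row arrives as JSON and using a made-up key name imageBlob: if the server base64-encodes the bytes, base64Decode applies; if it sends them as a plain array of byte values, build the Uint8List directly.
import 'dart:convert';
import 'dart:typed_data';
import 'package:flutter/material.dart';
// 'imageBlob' is a hypothetical JSON key; adjust it to match your API response.
Uint8List blobToBytes(Map<String, dynamic> json) {
  final raw = json['imageBlob'];
  if (raw is String) {
    // The backend base64-encoded the BLOB bytes.
    return base64Decode(raw);
  }
  // The backend sent the BLOB as a JSON array of byte values.
  return Uint8List.fromList(List<int>.from(raw));
}
// Display the decoded bytes.
Widget buildImage(Map<String, dynamic> json) => Image.memory(blobToBytes(json));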
I don't know if it is relevant to this particular case, but I had a similar issue with Cloud Firestore.
I created the blob for storage this way:
Blob myBlob = Blob(await audioFile.readAsBytes());
and saved it to Firestore in one field as usual.
Then I tried to read it back and couldn't figure out how to get a Uint8List from the Blob I got back from Firestore.
My solution:
//extract blob from field of Firestore document
Blob audioBlob = audioDocFromDb.get("fieldName");
//use .bytes on the Blob you get back
//(defined in package:cloud_firestore_platform_interface/src/blob.dart)
Uint8List audioBytes = audioBlob.bytes;
This worked for me.
In my case, I was packing up recorded audio and trying to play it back.
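For completeness, a round-trip sketch with the cloud_firestore package; the collection, document and field names below are placeholders, and on older plugin versions the entry point is Firestore.instance rather than FirebaseFirestore.instance.
import 'dart:typed_data';
import 'package:cloud_firestore/cloud_firestore.dart';
// Save raw bytes into a Firestore field as a Blob ('clips', 'clip1', 'fieldName' are placeholders).
Future<void> saveAudio(Uint8List bytes) => FirebaseFirestore.instance
    .collection('clips')
    .doc('clip1')
    .set({'fieldName': Blob(bytes)});
// Read it back and extract the bytes with Blob.bytes.
Future<Uint8List> loadAudio() async {
  final doc =
      await FirebaseFirestore.instance.collection('clips').doc('clip1').get();
  final Blob audioBlob = doc.get('fieldName'); // same as in the answer above
  return audioBlob.bytes;                      // Uint8List
}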
I had this problem too; after many attempts, I now know the solution:
Uint8List image = Uint8List.fromList(blob.toBytes());
Image.memory(image);
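Putting that together with a direct MySQL query, here is a sketch assuming the mysql1 package and a made-up photos table; with mysql1, a BLOB column comes back as a Blob object, so toBytes() applies exactly as above.
import 'dart:typed_data';
import 'package:flutter/material.dart';
import 'package:mysql1/mysql1.dart';
Future<Widget> loadImageFromDb() async {
  // Connection settings are placeholders; replace them with your own.
  final conn = await MySqlConnection.connect(ConnectionSettings(
      host: 'localhost', port: 3306, user: 'root', password: 'secret', db: 'mydb'));
  final results = await conn.query('SELECT image FROM photos WHERE id = ?', [1]);
  final blob = results.first[0] as Blob;            // BLOB column -> Blob object
  final bytes = Uint8List.fromList(blob.toBytes()); // as in the answer above
  await conn.close();
  return Image.memory(bytes);
}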
This approach has always saved me when I need the bytes of a file that is hosted at a URL.
import 'package:http/http.dart' as http;
// urlImageBlob is the URL where the file is hosted.
Uint8List fileBytes = await http.readBytes(Uri.parse(urlImageBlob));
// Display it if it is an image.
Image.memory(fileBytes);
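If you need that inside a widget tree, a small FutureBuilder sketch (urlImageBlob is a placeholder for wherever your file is hosted):
import 'dart:typed_data';
import 'package:flutter/material.dart';
import 'package:http/http.dart' as http;
// Show a spinner until the bytes arrive, then render them as an image.
Widget networkBlobImage(String urlImageBlob) => FutureBuilder<Uint8List>(
      future: http.readBytes(Uri.parse(urlImageBlob)),
      builder: (context, snapshot) => snapshot.hasData
          ? Image.memory(snapshot.data!)
          : const CircularProgressIndicator(),
    );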
I am trying to use the image_picker plugin. I can get the image as a File using this plugin. I need to convert this image to bytes and send it to an API, so I tried to use dart:convert to convert the image to a byte string. When I decode that string, I get a Uint8List. How do I convert this to a file and display it in an Image.file()? I couldn't proceed from here. Can someone help me with this?
Assume the decodedBytes below come from an API response: how can I convert them for display in an Image widget?
This is the code I have tried so far.
var image = await ImagePicker.pickImage(source: ImageSource.camera);
setState(() {
  imageURI = image;
  final bytes = image.readAsBytesSync();
  String img64 = base64Encode(bytes);
  print(bytes);
  print(img64);
  final decodedBytes = base64Decode(img64);
  print(decodedBytes);
  // assume these decodedBytes come from an API response; how can I display them in an Image widget?
});
I am getting this error using writeAsBytesSync():
Unhandled Exception: FileSystemException: Cannot open file, path = 'decodedimg.png'
You get this error because you can't write to an arbitrary location inside an application sandbox. You can use path_provider to look up a temporary directory.
But in your case, just use the image object directly: pickImage already returns a File, so you can simply use Image.file(image).
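To spell that out with the variables from the question (imageURI is the picked File and decodedBytes is the Uint8List), a minimal sketch:
// imageURI and decodedBytes are assumed to be stored in your State.
Widget buildImages() {
  return Column(children: <Widget>[
    // The picked file can be shown directly.
    Image.file(imageURI),
    // Bytes decoded from an API response can be shown without touching disk.
    Image.memory(decodedBytes),
  ]);
}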
If you want to decode a base64 string into a temporary directory instead, you can use:
import 'dart:convert';
import 'dart:io';
import 'package:path_provider/path_provider.dart';
import 'package:path/path.dart' as path;
Future<File> writeImageTemp(String base64Image, String imageName) async {
  final dir = await getTemporaryDirectory();
  await dir.create(recursive: true);
  final tempFile = File(path.join(dir.path, imageName));
  await tempFile.writeAsBytes(base64.decode(base64Image));
  return tempFile;
}
with pubspec.yaml:
dependencies:
  path: ^1.6.0
  path_provider: ^1.6.7
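And a usage sketch for the helper above; img64 is the base64 string from the question, and the file name decodedimg.png is just the one from the error message.
// Write the decoded bytes to the temp directory, then display the resulting file.
Future<Widget> showDecodedImage(String img64) async {
  final tempFile = await writeImageTemp(img64, 'decodedimg.png');
  return Image.file(tempFile);
}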
I am trying to convert an image taken from the camera to a blob to pass to the Face API detect endpoint as [binary data] input
(https://[location].api.cognitive.microsoft.com/face/v1.0/detect[?returnFaceId][&returnFaceLandmarks][&returnFaceAttributes]).
However, I'm unable to convert the base64Image to a blob. Any idea?
Blob is a data object in JavaScript. Unless you send your data through a WebView, you cannot convert a base64 string to a Blob in NativeScript. In browser JavaScript you would simply decode the base64 string and create a new Blob from the bytes, e.g.:
// atob decodes the base64 string; the resulting bytes are wrapped in a Blob
var byteArray = Uint8Array.from(atob(_base64), function (c) { return c.charCodeAt(0); });
var mediaFile = new Blob([byteArray], { type: 'image/png' });
But you can just send the binary data to the Face API as NSData.
I'm using the Microsoft Bot Framework with Cognitive Services to generate images from a source image that the user uploads via the bot. I'm using C#.
The Cognitive Services API returns a byte[] or a Stream representing the processed image.
How can I send that image directly to my user? All the docs and samples seem to suggest that I have to host the image at a publicly addressable URL and send a link. I can do this, but I'd rather not.
Does anyone know how to simply return the image, kind of like the Caption Bot does?
You should be able to use something like this:
var message = activity.CreateReply("");
message.Type = "message";
message.Attachments = new List<Attachment>();
var webClient = new WebClient();
byte[] imageBytes = webClient.DownloadData("https://placeholdit.imgix.net/~text?txtsize=35&txt=image-data&w=120&h=120");
string url = "data:image/png;base64," + Convert.ToBase64String(imageBytes);
message.Attachments.Add(new Attachment { ContentUrl = url, ContentType = "image/png" });
await _client.Conversations.ReplyToActivityAsync(message);
The image source of an HTML image element can be a data URI that contains the image directly, rather than a URL for downloading it. The following overloaded functions take any valid image and encode it as a JPEG data URI string that can be assigned directly to the src property of an HTML image element to display the image. If you know the format of the returned image ahead of time, you may be able to save some processing by skipping the re-encode to JPEG and simply returning the image encoded as base64 with the appropriate image data URI prefix.
public string ImageToBase64(System.IO.Stream stream)
{
    // Create a bitmap from the stream
    using (System.Drawing.Bitmap bitmap = System.Drawing.Bitmap.FromStream(stream) as System.Drawing.Bitmap)
    {
        // Save to a memory stream as JPEG to set a known format. Could also use PNG with changes
        // to the bitmap save and the returned data prefix below
        byte[] outputBytes = null;
        using (System.IO.MemoryStream outputStream = new System.IO.MemoryStream())
        {
            bitmap.Save(outputStream, System.Drawing.Imaging.ImageFormat.Jpeg);
            outputBytes = outputStream.ToArray();
        }
        // Encode the image byte array and prepend the proper data prefix. The result can be used directly as an HTML image source
        string output = string.Format("data:image/jpeg;base64,{0}", Convert.ToBase64String(outputBytes));
        return output;
    }
}
public string ImageToBase64(byte[] bytes)
{
    // Wrap the bytes in a stream positioned at the start so Bitmap.FromStream can read them
    using (System.IO.MemoryStream inputStream = new System.IO.MemoryStream(bytes))
    {
        return ImageToBase64(inputStream);
    }
}
I'm retrieving an image from a database; it's stored as a binary field.
Here is the code I use:
public FileContentResult GetImage(int? id)
{
    byte[] img = ...; // get image from db
    string imgType = "image/jpeg";
    return File(img, imgType);
}
But the problem is that this code downloads the image, and I need to show the image, not download it.
Can anyone help me with this?
I was setting the content type incorrectly. Make sure it's image/jpeg, image/png, and so on.
After using the ImagesService to transform an uploaded image, I would like to store it back into a new Blob file and make it available through getServingUrl() as provided by the ImagesService.
Storing the image in a new AppEngineFile as described here works fine and I am able to open and view it locally using the dev server.
However, when passing the blobKey for the new AppEngineFile to ImagesService.getServingUrl(), a
java.lang.IllegalArgumentException: Could not read blob.
exception is thrown. Any ideas what the problem could be? This is the code I use to transform and store an uploaded image (blobKey and blobInfo correspond to the uploaded file, not the newly created one).
/* Transform image in existing Blob file */
Image originalImage = ImagesServiceFactory.makeImageFromBlob(blobKey);
Transform verticalFlip = ImagesServiceFactory.makeVerticalFlip();
ImagesService imagesService = ImagesServiceFactory.getImagesService();
Image newImage = imagesService.applyTransform(verticalFlip, originalImage);
/* Store newImage in an AppEngineFile */
FileService fileService = FileServiceFactory.getFileService();
AppEngineFile file = fileService.createNewBlobFile(blobInfo.getContentType());
FileWriteChannel writeChannel = fileService.openWriteChannel(file, true);
ByteBuffer buffer = ByteBuffer.wrap(newImage.getImageData());
writeChannel.write(buffer);
/* closeFinally assigns BlobKey to new file object */
writeChannel.closeFinally();
BlobKey newBlobKey = fileService.getBlobKey(file);
Edit:
The above code is correct; the problem was that I was storing a String representation of the new blob key using newBlobKey.toString() instead of newBlobKey.getKeyString().
Why would you want to do that? Once you transform an image it is cached, and it is always fast anyway. If you really want to save it, just use urlfetch to read the data and store it in the Blobstore ;-)
The following works fine when executed at the end of the code posted in the question:
String url = imagesService.getServingUrl(newBlobKey);
The URL can then be used to scale and crop the new image as described in the docs
http://code.google.com/appengine/docs/java/images/overview.html#Transforming_Images_from_the_Blobstore