I have a default Spring message listener running.
When onMessage fires, the message arrives as a TextMessage (not a BytesMessage).
How do I write its contents into a PDF file?
I think there is some issue with my code below: it does write to the file, but the resulting PDF will not open.
if (message instanceof TextMessage) {
    try {
        //System.out.println(((TextMessage) message).getText());
        TextMessage txtMessage = (TextMessage) message;
        ByteArrayInputStream bais = new ByteArrayInputStream(txtMessage.getText().getBytes("UTF8"));
        String outStr = bais.toString();
        File newFile = new File("D:\\document.pdf");
        FileOutputStream fos = new FileOutputStream(newFile);
        int data;
        while ((data = bais.read()) != -1) {
            char ch = (char) data;
            fos.write(ch);
        }
        fos.flush();
        fos.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Thanks for any suggestions.
Please consider using a PDF-specific API to create/update a PDF file. I would highly recommend iText. A PDF file is not simply a stream of bytes: a lot of things are involved, and you have to consider the font, page size, starting X and Y coordinates, direction of text, adding new pages, tabular versus free-style layout, and the list goes on.
There are a lot of code examples on the site that will get you started. Here is a simplified snippet that adds text to an existing PDF file using the iText API:
try {
    ...
    BufferedInputStream bis = new BufferedInputStream(new FileInputStream(pdfFile));
    ...
    PdfReader reader = new PdfReader(bis);
    /* outs could be any output stream */
    stamper = new PdfStamper(reader, outs);
    ... /* removed the code to get current page */
    PdfContentByte over = stamper.getOverContent(currentPage);
    over.beginText();
    over.setFontAndSize(myFont, myFontSize);
    over.setTextMatrix(xPoint, yPoint);
    over.showText("Add this text");
    over.endText();
    ... /* removed code to adjust x and y coordinate and add page if needed */
} catch (Exception ex) {
    ex.printStackTrace();
} finally {
    try {
        stamper.close();
    } catch (Exception ex) {/* handle exception */}
    try {
        outs.flush();
        outs.close();
    } catch (Exception ignored) {/* handle exception */}
}
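If you are creating a brand-new PDF rather than stamping an existing one, the high-level Document API is even simpler. A minimal sketch, assuming the iText 5.x package names (the output path and text are just examples):

import com.itextpdf.text.Document;
import com.itextpdf.text.Paragraph;
import com.itextpdf.text.pdf.PdfWriter;
import java.io.FileOutputStream;

public class CreatePdfExample {
    public static void main(String[] args) throws Exception {
        Document document = new Document();                    // default A4 page size
        PdfWriter.getInstance(document, new FileOutputStream("D:\\document.pdf"));
        document.open();
        document.add(new Paragraph("Text taken from the JMS message"));
        document.close();                                      // finalizes and flushes the PDF
    }
}

Either way, the key point is to let a PDF library produce the file structure instead of writing raw text bytes into a .pdf file yourself.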
UPDATE: The initial question has been answered as to why the crashes happen, but the lingering problem remains: why is the OnImageAvailable callback called so many times? When it is called, I want to do stuff with the image, but whatever method I run at that point gets called many times. Is this the wrong place to be using the resulting image?
I am using the sample code found here for a Xamarin Android implementation of the Android Camera2 API. My issue is that when the capture button is pressed a single time, the listener's OnImageAvailable callback gets called multiple times.
This is causing a problem because the image from AcquireNextImage needs to be closed before another can be used, but Close is not called until the Run method of the ImageSaver class, as seen below.
This causes these two errors:
Unable to acquire a buffer item, very likely client tried to acquire
more than maxImages buffers
AND
maxImages (2) has already been acquired, call #close before acquiring
more.
maxImages is set to 2 by default, but setting it to 1 does not help. How do I prevent the callback from being called twice?
public void OnImageAvailable(ImageReader reader)
{
    var image = reader.AcquireNextImage();
    owner.mBackgroundHandler.Post(new ImageSaver(image, file));
}
// Saves a JPEG {@link Image} into the specified {@link File}.
private class ImageSaver : Java.Lang.Object, IRunnable
{
    // The JPEG image
    private Image mImage;
    // The file we save the image into.
    private File mFile;

    public ImageSaver(Image image, File file)
    {
        if (image == null)
            throw new System.ArgumentNullException("image");
        if (file == null)
            throw new System.ArgumentNullException("file");
        mImage = image;
        mFile = file;
    }

    public void Run()
    {
        ByteBuffer buffer = mImage.GetPlanes()[0].Buffer;
        byte[] bytes = new byte[buffer.Remaining()];
        buffer.Get(bytes);
        using (var output = new FileOutputStream(mFile))
        {
            try
            {
                output.Write(bytes);
            }
            catch (IOException e)
            {
                e.PrintStackTrace();
            }
            finally
            {
                mImage.Close();
            }
        }
    }
}
The method OnImageAvailable can be called again as soon as you leave it if there is another picture in the pipeline.
I would recommend calling Close in the same method you are calling AcquireNextImage. So, if you choose to get the image directly from that callback, then you have to call Close in there as well.
One solution is to grab the image data in that method and close the image right away:
public void OnImageAvailable(ImageReader reader)
{
    var image = reader.AcquireNextImage();
    try
    {
        ByteBuffer buffer = image.GetPlanes()[0].Buffer;
        byte[] bytes = new byte[buffer.Remaining()];
        buffer.Get(bytes);
        // I am not sure where you get the file instance but it is not important.
        owner.mBackgroundHandler.Post(new ImageSaver(bytes, file));
    }
    finally
    {
        image.Close();
    }
}
The ImageSaver would be modified to accept the byte array as the first parameter of its constructor:
public ImageSaver(byte[] bytes, File file)
{
    if (bytes == null)
        throw new System.ArgumentNullException("bytes");
    if (file == null)
        throw new System.ArgumentNullException("file");
    mBytes = bytes;
    mFile = file;
}
The major downside of this solution is the risk of putting a lot of pressure on memory, as you basically keep the images in memory until they are processed, one after another.
Another solution consists of acquiring the image on the background thread instead.
public void OnImageAvailable(ImageReader reader)
{
    // Again, I am not sure where you get the file instance but it is not important.
    owner.mBackgroundHandler.Post(new ImageSaver(reader, file));
}
This solution is less memory-intensive, but you might have to increase the maximum number of images from 2 to something higher depending on your needs (see the note after the Run method below). Again, the ImageSaver constructor needs to be modified to accept an ImageReader as a parameter:
public ImageSaver(ImageReader imageReader, File file)
{
    if (imageReader == null)
        throw new System.ArgumentNullException("imageReader");
    if (file == null)
        throw new System.ArgumentNullException("file");
    mImageReader = imageReader;
    mFile = file;
}
Now the Run method would have the responsibility of acquiring and releasing the Image:
public void Run()
{
    Image image = mImageReader.AcquireNextImage();
    try
    {
        ByteBuffer buffer = image.GetPlanes()[0].Buffer;
        byte[] bytes = new byte[buffer.Remaining()];
        buffer.Get(bytes);
        using (var output = new FileOutputStream(mFile))
        {
            try
            {
                output.Write(bytes);
            }
            catch (IOException e)
            {
                e.PrintStackTrace();
            }
        }
    }
    finally
    {
        image?.Close();
    }
}
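Regarding the maximum number of images mentioned above: that limit is the last argument passed when the ImageReader is created, so that is where you would raise it. A sketch in native Android Java (the Xamarin binding takes the same arguments); the dimensions and the listener/handler variables below are just placeholders:

// Allow up to 4 images to be open at once instead of the 2 used in the sample.
ImageReader reader = ImageReader.newInstance(
        1920, 1080,           // capture width/height (example values)
        ImageFormat.JPEG,     // same format the JPEG saver expects
        4);                   // maxImages: how many acquired images may stay un-closed
reader.setOnImageAvailableListener(onImageAvailableListener, backgroundHandler);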
I too faced this issue for a long time and tried implementing @kzrytof's solution, but it didn't help as much as expected. However, I found a way to get onImageAvailable to execute only once.
Scenario: when an image is available, the onImageAvailable method is called, right?
So, what I did is: after closing the image with image.close(), I called imageReader.setOnImageAvailableListener() and set the listener to null. This way I stopped it from executing a second time.
I know your question is about Xamarin and my code below is native Android Java, but the methods and functionality are the same, so give it a try:
@Override
public void onImageAvailable(ImageReader reader) {
    final Image image = imageReader.acquireLatestImage();
    try {
        if (image != null) {
            Image.Plane[] planes = image.getPlanes();
            ByteBuffer buffer = planes[0].getBuffer();
            int pixelStride = planes[0].getPixelStride();
            int rowStride = planes[0].getRowStride();
            int rowPadding = rowStride - pixelStride * width;
            int bitmapWidth = width + rowPadding / pixelStride;
            if (latestBitmap == null ||
                    latestBitmap.getWidth() != bitmapWidth ||
                    latestBitmap.getHeight() != height) {
                if (latestBitmap != null) {
                    latestBitmap.recycle();
                }
                // recreate the bitmap at the new size before copying the pixels into it
                latestBitmap = Bitmap.createBitmap(bitmapWidth, height, Bitmap.Config.ARGB_8888);
            }
            latestBitmap.copyPixelsFromBuffer(buffer);
        }
    }
    catch (Exception e) {
        // log or handle the exception
    }
    finally {
        if (image != null) {
            image.close();
        }
        imageReader.setOnImageAvailableListener(null, svc.getHandler());
    }
    // next steps to save the image
}
I'm trying to use the PicturesService in Gluon Mobile to get a profile picture that can be stored in a MySQL table (Blob). What is the best way to get the byte array (and, if possible, the Base64-encoded string) from the Image object returned by the service? I tried SwingFXUtils, but it caused the app to crash on my Android device, so I am not sure SwingFXUtils is supported there.
Services.get(PicturesService.class).ifPresent(service -> {
    service.takePhoto(false).ifPresent(image -> {
        try {
            byte[] imageStream = PictureFactory.convertToByteArray(image);
            //String base64String = Base64.getEncoder().encodeToString(imageStream);
            //this.picModel.setByteArray(imageStream);
            //this.picModel.setBase64Str(base64String);
            //this.picModel.setPatientId(User.patient.getPatientId());
            av.setImage(image); // Avatar
            av.setRotate(-90);
            if (this.imageBox.getChildren().size() < 3) {
                this.imageBox.getChildren().add(0, av);
            }
        } catch (Exception ex) {
            Logger.getLogger(PictureFactory.class.getName()).log(Level.SEVERE, null, ex);
        }
    });
});
Byte Array Conversion Code:
public static byte[] convertToByteArray(Image img) throws IOException {
    BufferedImage byteImg = SwingFXUtils.fromFXImage(img, null);
    ByteArrayOutputStream stream = new ByteArrayOutputStream();
    ImageIO.write(byteImg, "png", stream);
    byte[] result = stream.toByteArray();
    stream.close();
    return result;
}
A completely unrelated issue I am also experiencing is that the image is rotated 90 degrees clockwise when taken in portrait mode. Is there any way to fix this? I'm hoping it can be fixed in the image itself that gets sent to the database, rather than just rotating the ImageView.
I split the Message msg into a Multipart: Multipart multi1 = (Multipart) msg.getContent().
A mail attachment sits in one BodyPart: Part part = multi1.getBodyPart(i).
Then I want to save the attachment.
private void saveFile(String fileName, InputStream in) throws IOException {
    File file = new File(fileName);
    if (!file.exists()) {
        OutputStream out = null;
        try {
            out = new BufferedOutputStream(new FileOutputStream(file));
            in = new BufferedInputStream(in);
            byte[] buf = new byte[BUFFSIZE];
            int len;
            while ((len = in.read(buf)) > 0) {
                out.write(buf, 0, len);
            }
        } catch (FileNotFoundException e) {
            LOG.error(e.toString());
        } finally {
            // close streams
            if (in != null) {
                in.close();
            }
            if (out != null) {
                out.close();
            }
        }
    }
}
But it takes too much time reading the input stream: for example, a 2.7 MB file needs almost 160 seconds to be saved to disk. I have already tried channels and some other IO streams, but nothing changed. Any solution for saving attachments faster using JavaMail?
For more of the code, see https://github.com/cainzhong/java-mail-demo/blob/master/src/main/java/com/java/mail/impl/ReceiveMailImpl.java
Actually, mail.imaps.partialfetch does take effect and speeds things up a lot. There was a mistake in my previous code:
props.put("mail.imap.partialfetch","false");
props.put("mail.imap.fetchsize", "1048576");
props.put("mail.imaps.partialfetch", "false");
props.put("mail.imaps.fetchsize", "1048576");
instead of
props.put("mail.imap.partialfetch",false);
props.put("mail.imap.fetchsize", "1048576");
props.put("mail.imaps.partialfetch", false);
props.put("mail.imaps.fetchsize", "1048576");
It is important to put quotation marks around "false"; otherwise the properties do not take effect, because JavaMail reads them via Properties.getProperty(), which ignores non-String values.
Anyway, thanks to Bill Shannon.
There are two key parts to this operation: reading the data from your mail server and writing the data to your filesystem. Most likely it is the speed of the server, and of the network connection to it, that controls the overall speed of the operation. You can try setting the mail.imap.fetchsize and mail.imap.partialfetch properties to see if that improves performance.
You can also try using something like a NullOutputStream instead of FileOutputStream to measure only the speed of reading the data.
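For example, you could time just the read side by draining the attachment's InputStream into a discarding stream. A rough sketch (the class and method names are my own; it assumes the javax.mail Part from the question, Java 11+ for OutputStream.nullOutputStream(), and the property values are only examples):

import java.io.InputStream;
import java.io.OutputStream;
import java.util.Properties;
import javax.mail.Part;

public final class MailSpeedCheck {

    /** IMAP tuning: fetch big chunks instead of many small partial fetches (values must be Strings). */
    public static Properties fetchTuning() {
        Properties props = new Properties();
        props.put("mail.imaps.partialfetch", "false");
        props.put("mail.imaps.fetchsize", "1048576"); // 1 MB
        return props;
    }

    /** Times how long it takes just to read an attachment; nothing is written to disk. */
    public static long timeRead(Part part) throws Exception {
        long start = System.currentTimeMillis();
        try (InputStream in = part.getInputStream()) {
            in.transferTo(OutputStream.nullOutputStream());
        }
        return System.currentTimeMillis() - start;
    }
}

If the read alone already accounts for most of the 160 seconds, the bottleneck is the server or the fetch settings, not your file-writing loop.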
I managed to create the functionality to upload an image with gwtupload to App Engine and store the image as a blob. My problem now is: the URL that I get from imagesService.getServingUrl(suo) links to an image which is smaller than the image I uploaded. If I check the image in the blob viewer in the App Engine console, it seems to have the correct size.
Why do I get a resized version of my image with the following code? Can I somehow retrieve the URL of the full-sized image?
My code looks like this:
public class MyUploadAction extends AppEngineUploadAction {

    @Override
    public String executeAction(HttpServletRequest request, List<FileItem> sessionFiles) throws UploadActionException {
        String imageUrls = "";
        // Image service is needed to get url from blob
        ImagesService imagesService = ImagesServiceFactory.getImagesService();
        // Get a file service -> write the blob
        FileService fileService = FileServiceFactory.getFileService();
        // Iterate over the files and upload each one
        for (FileItem myFile : sessionFiles) {
            InputStream imgStream = null;
            // construct our entity objects
            Blob imageBlob = null;
            // Create a new Blob file with mime-type "image/png"
            AppEngineFile file = null;
            FileWriteChannel writeChannel = null;
            try {
                // get input stream from file
                imgStream = myFile.getInputStream();
                imageBlob = new Blob(IOUtils.toByteArray(imgStream));
                // create empty app engine file with mime type of uploaded file e.g.: image/png, image/jpeg
                file = fileService.createNewBlobFile(myFile.getContentType());
                // Open a channel to write to it
                boolean lock = true;
                writeChannel = fileService.openWriteChannel(file, lock);
                // This time we write to the channel directly
                writeChannel.write(ByteBuffer.wrap(imageBlob.getBytes()));
            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                // Now finalize
                try {
                    writeChannel.closeFinally();
                } catch (IllegalStateException e) {
                    // TODO Auto-generated catch block
                    e.printStackTrace();
                } catch (IOException e) {
                    // TODO Auto-generated catch block
                    e.printStackTrace();
                }
            }
            // Get the url from the blob
            ServingUrlOptions suo = ServingUrlOptions.Builder.withBlobKey(fileService.getBlobKey(file)).secureUrl(true);
            imageUrls += imagesService.getServingUrl(suo);
            imageUrls = imageUrls.replaceFirst("0.0.0.0", "127.0.0.1");
            System.out.println(imageUrls);
        }
        return imageUrls;
    }
}
I tried this and found that the Images service has a maximum output size of 1600x1600 pixels.
See here and search for 1600: https://developers.google.com/appengine/docs/java/images/
That applies even if you don't request a resize.
To serve the image at its original size, you shouldn't use the Images service; just output it directly from the blobstore instead.
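For example, instead of getServingUrl() you could point the client at your own servlet and stream the blob unchanged with BlobstoreService.serve(). A minimal sketch (the servlet and its request parameter are made up; the BlobKey is the one returned by fileService.getBlobKey(file)):

import com.google.appengine.api.blobstore.BlobKey;
import com.google.appengine.api.blobstore.BlobstoreService;
import com.google.appengine.api.blobstore.BlobstoreServiceFactory;
import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical servlet that serves the stored blob at its original size.
public class OriginalImageServlet extends HttpServlet {
    private final BlobstoreService blobstoreService = BlobstoreServiceFactory.getBlobstoreService();

    @Override
    public void doGet(HttpServletRequest req, HttpServletResponse res) throws IOException {
        // e.g. /image?blob-key=...  (the key you got from fileService.getBlobKey(file))
        BlobKey blobKey = new BlobKey(req.getParameter("blob-key"));
        blobstoreService.serve(blobKey, res);   // streams the blob bytes untouched
    }
}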
Once again I need some help:
Yesterday I asked this question about how to use a large JPG image as a Bitmap (http://stackoverflow.com/questions/13511657/problems-with-big-drawable-jpg-image), and I resolved it myself (see my own answer on that question). But whenever I resume my activity, which uses that bitmap as the GLRenderer texture, it crashes. I've tried many things; the last attempt was to make that bitmap static in order to keep it as a member variable of the activity, but it crashes because, I suppose, it loses its mBuffer.
More details on the Activity code:
I declared it as singleInstance in the manifest:
android:launchMode="singleInstance"
in order to keep the tiles for the renderer.
And here is some code:
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    mGLSurfaceView = new GLSurfaceView(this);
    mGLSurfaceView.setEGLConfigChooser(true);
    mSimpleRenderer = new GLRenderer(this);
    getTextures();

    if (!mIsTileMapInitialized) {
        tileMap = new LandSquareGrid(1, 1, mHeightmap, mLightmap, false, true, true, 128, true);
        tileMap.setupSkybox(mSkyboxBitmap, true);
        mIsTileMapInitialized = true;
    }

    initializeRenderer();
    mGLSurfaceView.setRenderer(mSimpleRenderer);
    setContentView(R.layout.game_layout);
    setOnTouchListener();
    initializeGestureDetector();
    myCompassView = (MyCompassView) findViewById(R.id.mycompassview);

    // Once set the content view we can set the TextViews:
    coordinatesText = (TextView) findViewById(R.id.coordDynamicText);
    altitudeText = (TextView) findViewById(R.id.altDynamicText);
    directionText = (TextView) findViewById(R.id.dirDynamicText);

    //if (!mIsGLInitialized){
    mOpenGLLayout = (LinearLayout) findViewById(R.id.openGLLayout);
    mOpenGLLayout.addView(mGLSurfaceView);
    mVirtual3DMap = new Virtual3DMap(mSimpleRenderer, tileMap);
    if (mGameThread == null) {
        mGameThread = new Thread(mVirtual3DMap);
        mGameThread.start();
    }
}
In the getTextures() method I load a few small textures and the largest one, as in my self-answer to yesterday's question:
if (mTerrainBitmap == null) {
    InputStream is = getResources().openRawResource(R.drawable.terrain);
    try {
        // Set terrain bitmap options to 16-bit, 565 format.
        terrainBitmapOptions.inPreferredConfig = Bitmap.Config.RGB_565;
        Bitmap auxBitmap = BitmapFactory.decodeStream(is, null, terrainBitmapOptions);
        mTerrainBitmap = Bitmap.createBitmap(auxBitmap);
    }
    catch (Exception e) {
    }
    finally {
        try {
            is.close();
        }
        catch (IOException e) {
            // Ignore.
        }
    }
}
So, again, the first time it works great, but when I go back I do:
protected void onPause() {
    super.onPause();
    mGLSurfaceView.onPause();
}

@Override
protected void onStop() {
    // TODO Auto-generated method stub
    super.onStop();
    if (mVirtual3DMap != null) {
        try {
            mVirtual3DMap.cancel();
            mGameThread = null;
            mVirtual3DMap = null;
            mGLSurfaceView.destroyDrawingCache();
            mSimpleRenderer = null;
            System.gc();
        } catch (Throwable e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
And when I resume the activity:
@Override
protected void onResume() {
    super.onResume();
    mGLSurfaceView.onResume();
    if (mVirtual3DMap != null) {
        try {
            mVirtual3DMap.resume();
        } catch (Throwable e) {
            e.printStackTrace();
        }
    }
}
And it crashes.
Why? OK, here is the cause of the exception on the GLThread:
java.lang.IllegalArgumentException: bitmap is recycled...
I tried this messy stuff because launching the original activity more than two times crashes the application, either because of this or because of the amount of memory used, and now I don't know whether to revert all these changes or what to do with this.
Is there a good way to keep this bitmap in memory and usable by this or another activity of the application?
Please, I need your advice.
Thanks in advance.
Do not manage these resources manually, or your app's surface will break; you can't handle them yourself.
If you are worried about reloading resources and you target API level 11+, you can use setPreserveEGLContextOnPause(). It will preserve your textures and FBOs across pause/resume (see the snippet below).
If you can't use API 11+, you can port GLSurfaceView into your app. You can check my own GLSurfaceView, which is ported from ICS.
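A minimal sketch of where that one call would go, using the field names from the question's onCreate() (API level 11+ only):

// inside onCreate(), right after creating the GLSurfaceView
mGLSurfaceView = new GLSurfaceView(this);
mGLSurfaceView.setEGLConfigChooser(true);
// Keep the EGL context (and with it the uploaded textures and FBOs) alive across
// onPause()/onResume(). Available from API level 11; preservation is not guaranteed
// on devices that cannot keep multiple EGL contexts around.
mGLSurfaceView.setPreserveEGLContextOnPause(true);
mGLSurfaceView.setRenderer(mSimpleRenderer);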
PS: Sorry about my poor english.
No, let Android handle all the resources. You must handle the appropriate lifecycle events and reload the bitmap when the activity is resumed. You cannot expect any OpenGL handles to still be valid after the activity has been resumed.
Think of it as a laptop coming out of hibernation: although all memory has been restored, you cannot expect that any open socket still has a real active connection going.
I am an Android noobie, so please correct me if I am wrong.
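For illustration, a minimal sketch of that approach (my own example, not the poster's GLRenderer): decode the bitmap and upload the texture in onSurfaceCreated(), which runs again after every resume, so nothing has to survive onPause(). Only R.drawable.terrain is taken from the question; everything else is made up.

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.opengl.GLSurfaceView;
import android.opengl.GLUtils;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

public class ReloadingRenderer implements GLSurfaceView.Renderer {
    private final Context context;
    private int textureId;

    public ReloadingRenderer(Context context) {
        this.context = context;
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        // Decode the bitmap fresh each time the surface (and GL context) is created.
        BitmapFactory.Options opts = new BitmapFactory.Options();
        opts.inPreferredConfig = Bitmap.Config.RGB_565;
        Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), R.drawable.terrain, opts);

        // Upload it as a texture and let the Bitmap go; the GPU copy is what gets drawn.
        int[] ids = new int[1];
        gl.glGenTextures(1, ids, 0);
        textureId = ids[0];
        gl.glBindTexture(GL10.GL_TEXTURE_2D, textureId);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
        GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bitmap, 0);
        bitmap.recycle();
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        gl.glViewport(0, 0, width, height);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        // draw using textureId ...
    }
}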