I'm using a UIImagePickerController with an overlay of a transparent, white-rimmed rectangle as a guide for capturing a small section of the view that the user is interested in, centered in screen coordinates (320 x 480, portrait).
The problem is, I'm not entirely sure which part of the camera's view (the actual photo) the viewfinder is showing. It can't be showing all of it: the image resolution is 3264 x 2448, which is a different aspect ratio from the available screen space, and the preview is displayed fullscreen.
I need to be able to crop, from the actual produced image, exactly the area that is "under" the overlay on the UIImagePickerController view.
I've tried taking the ratio of the overlay's width to the width of the portrait camera view, multiplying the image's width by that ratio, and scaling the y-coordinates the same way, but the results are incorrect.
How could I try to solve this?
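For reference, here is a minimal sketch of the mapping that seems to be involved, assuming the camera preview aspect-fills the 320 x 480 screen (so the photo's overflow is clipped off-screen); the function name, parameters, and overlay rect are illustrative, not taken from any actual code:

```swift
import CoreGraphics

// Map an overlay rect in screen points to a crop rect in image pixels,
// ASSUMING the camera preview aspect-fills the screen. All names here
// are illustrative.
func cropRect(forOverlay overlay: CGRect,
              screenSize: CGSize,   // e.g. 320 x 480 (portrait points)
              imageSize: CGSize) -> CGRect {
    // Aspect-fill scales by the LARGER of the two ratios, so one
    // dimension overflows the screen and gets clipped.
    let scale = max(screenSize.width / imageSize.width,
                    screenSize.height / imageSize.height)
    // Size of the whole photo in screen points after that scaling.
    let displayed = CGSize(width: imageSize.width * scale,
                           height: imageSize.height * scale)
    // Amount clipped off each side of the screen.
    let xOffset = (displayed.width - screenSize.width) / 2
    let yOffset = (displayed.height - screenSize.height) / 2
    // Shift the overlay into the photo's coordinate space, then
    // divide by the scale to convert points back to image pixels.
    return CGRect(x: (overlay.origin.x + xOffset) / scale,
                  y: (overlay.origin.y + yOffset) / scale,
                  width: overlay.size.width / scale,
                  height: overlay.size.height / scale)
}
```

Note that the captured photo is typically delivered landscape (3264 x 2448) with an orientation flag, so the image's orientation may need to be normalized before a rect like this is applied.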
I am trying to render a few images in NSImageViews. These images are much larger than the size of the NSImageView (which I have set to scale proportionally up or down), but the rendered images don't look very good. For example, in this sample image, the white border seems to have jagged edges at the 12, 3, 6, and 9 o'clock positions. The source image seems to be fine, even when I zoom out all the way in Preview.app.
I have tried scaling the image myself (using MGCropExtensions, which sets the interpolation quality to High), but it doesn't seem to help. I would imagine NSImageView internally draws on device pixel boundaries automatically, so that shouldn't be an issue?
Any ideas on how to get the image rendered crisply in the NSImageView? Thanks
Source Image - It has a white border which doesn't show here (against the white background)
Rendered NSImageView screenshot
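For what it's worth, one thing worth trying is pre-scaling the bitmap yourself at high interpolation quality and handing NSImageView an image that is already the right pixel size. A minimal sketch, assuming AppKit and an illustrative target size:

```swift
import AppKit

// Downscale an NSImage with high interpolation quality, snapping the
// target size to whole points so drawing lands on pixel boundaries.
func downscaled(_ source: NSImage, to targetSize: NSSize) -> NSImage {
    let size = NSSize(width: targetSize.width.rounded(),
                      height: targetSize.height.rounded())
    let result = NSImage(size: size)
    result.lockFocus()
    // Force high-quality interpolation for this drawing pass.
    NSGraphicsContext.current?.imageInterpolation = .high
    source.draw(in: NSRect(origin: .zero, size: size),
                from: .zero,          // a zero source rect means the whole image
                operation: .copy,
                fraction: 1.0)
    result.unlockFocus()
    return result
}
```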
I've created an image in Photoshop to be used as a sprite in Unity and everything works fine while the sprite is scaled at X: 1; Y: 1.
The problem starts when I scale the image up: the border of the image stretches out with the rest of it. Is there any way to scale an image from its centre, or to ignore the image's border when it's scaled?
Here's the example now that I am able to show it:
The rectangle on top is the original image without being scaled up or down and the rectangle on the bottom is scaled at X:5, Y:0.5 but the borders are stretched.
I think the borders are stretched because they're part of the image: when it's scaled, the whole image, borders included, is simply stretched.
Is there any way to stretch the sprite image while ignoring the borders?
Are you trying to scale the image and keep the original ratio?
If so, here are the steps:
Hope this helps. Please let me know if you were trying to do something else.
You can use a sliced sprite. The center of the image is scaled to fit the control rectangle but the borders maintain their sizes regardless of the scaling. Check out the Unity doc here: Unity - Manual: Image
In my app I use the camera to capture an image for the profile picture section, but on Android it returns a rectangular image, and after resizing the image gets distorted.
To avoid the distortion I need a square cropped image, the same functionality WhatsApp provides.
Is there any plugin to get a square image, or to crop an image to a square?
I'd like to be able to set a lock screen image that is not scaled to take up the entire lock screen; that scaling cuts off parts of the image either horizontally or vertically. Suppose I have a WriteableBitmap version of my image, scaled so that its larger dimension matches the screen height or width. How might I overlay that image on top of another, page-sized image so that the result becomes a 'full screen' image? I am thinking this would mimic a screenshot, so the lock screen would not try to scale the image (even though the newly created image would have some blank background where the original did not reach).
As an example, say I'm trying to take a picture from PhotoChooserTask and overlay it on a default 768x1280 image, where the 768x1280 canvas always remains in portrait orientation, exactly as the lock screen does, regardless of the dimensions of the PhotoChooserTask result. How can I always make the picture fit inside the 768x1280?
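The fit math itself is platform-neutral; here is a sketch of it, written in Swift purely for illustration (the question's actual code would use a WriteableBitmap), computing where a source of arbitrary size lands when scaled to fit and centered inside a 768 x 1280 canvas:

```swift
import CoreGraphics

// Scale a source size to fit entirely inside the canvas and center it,
// leaving blank background on the remaining area.
func fitRect(source: CGSize,
             canvas: CGSize = CGSize(width: 768, height: 1280)) -> CGRect {
    // Aspect-fit scales by the SMALLER ratio so nothing is cut off.
    let scale = min(canvas.width / source.width,
                    canvas.height / source.height)
    let fitted = CGSize(width: source.width * scale,
                        height: source.height * scale)
    // Center the fitted image on the canvas.
    return CGRect(x: (canvas.width - fitted.width) / 2,
                  y: (canvas.height - fitted.height) / 2,
                  width: fitted.width,
                  height: fitted.height)
}
```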
I am taking photos from a webcam in my Cocoa application and I would like to zoom in on the centre of the image I receive. I start by receiving a CIImage and eventually save an NSImage.
How would I go about zooming in on either of these objects?
“Zoom” means a couple of things. You'll need at least to crop the image, and you may want to scale up. Or you may want to reserve scaling for display only.
CIImage
To crop it, use a CICrop filter.
To scale it, use either a CILanczosScaleTransform filter or a CIAffineTransform filter.
To crop and scale it, use both filters. Simply pass the output of the crop as the input of the scale.
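For example, a rough sketch of chaining the two filters in Swift (the crop rectangle and scale factor are illustrative values):

```swift
import CoreImage

// Crop a region out of a CIImage, then scale the crop up with Lanczos.
func zoom(_ image: CIImage, toRect cropRect: CGRect, by factor: CGFloat) -> CIImage? {
    guard let crop = CIFilter(name: "CICrop"),
          let scale = CIFilter(name: "CILanczosScaleTransform") else { return nil }
    crop.setValue(image, forKey: kCIInputImageKey)
    crop.setValue(CIVector(cgRect: cropRect), forKey: "inputRectangle")
    // Output of the crop becomes the input of the scale, as described above.
    scale.setValue(crop.outputImage, forKey: kCIInputImageKey)
    scale.setValue(factor, forKey: kCIInputScaleKey)
    scale.setValue(1.0, forKey: kCIInputAspectRatioKey)
    return scale.outputImage
}
```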
NSImage
Crop and scale are the same operation here. You'll need to create a new, empty NSImage of the desired size (the size of the source crop if you won't zoom, or an increased size if you will), lock focus on it, draw the crop rectangle from the source image into the bounding rectangle of the destination image, and unlock focus.
If the destination rectangle is not the same size as the source (crop) rectangle, it will scale; if they are the same size, it will simply copy or composite pixel-to-pixel.
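A minimal sketch of that sequence, with an illustrative crop rect and zoom factor:

```swift
import AppKit

// Draw a crop rectangle from the source into a new NSImage; if the
// destination size differs from the crop size, the drawing scales.
func cropAndZoom(_ source: NSImage, cropRect: NSRect, zoom: CGFloat = 2.0) -> NSImage {
    let destSize = NSSize(width: cropRect.width * zoom,
                          height: cropRect.height * zoom)
    let dest = NSImage(size: destSize)
    dest.lockFocus()
    // Drawing the source crop rect into the destination's bounds scales
    // when the sizes differ, and copies pixel-for-pixel when they match.
    source.draw(in: NSRect(origin: .zero, size: destSize),
                from: cropRect,
                operation: .copy,
                fraction: 1.0)
    dest.unlockFocus()
    return dest
}
```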