Draw a vertical line - macOS

I want to put an image (which I will create) in my status bar app. The image should be made of vertical lines, but I don't know how to draw them.

Found:
let graphWidth: CGFloat = 22.0
let graphHeight: CGFloat = 22.0
let size = NSMakeSize(graphWidth, graphHeight)
let image = NSImage(size: size)
image.lockFocus()
// Draw a single vertical line at x = 20, from the bottom up to y = 20
let path = NSBezierPath()
path.move(to: NSPoint(x: 20.0, y: 0))
path.line(to: NSPoint(x: 20.0, y: 20.0))
path.stroke()
image.unlockFocus()
statusItem.image = image
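If you need several vertical lines (for example a small bar graph in the status item) rather than a single one, the same lockFocus approach works. The sketch below is only an illustration; the bar heights, line width and the isTemplate setting are my own assumptions, not part of the snippet above:

let graphWidth: CGFloat = 22.0
let graphHeight: CGFloat = 22.0
let image = NSImage(size: NSMakeSize(graphWidth, graphHeight))
image.lockFocus()
NSColor.black.setStroke()                 // the stroke color must be set before stroking
let path = NSBezierPath()
path.lineWidth = 2.0
// Hypothetical bar heights, one vertical line every 4 points
let barHeights: [CGFloat] = [6, 10, 16, 20, 12]
for (i, height) in barHeights.enumerated() {
    let x = CGFloat(i) * 4.0 + 2.0
    path.move(to: NSPoint(x: x, y: 0))
    path.line(to: NSPoint(x: x, y: height))
}
path.stroke()
image.unlockFocus()
image.isTemplate = true                   // lets the status bar tint it for light/dark mode
statusItem.image = image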

Related

How to create a CGVector that "points" from one point (CGPoint) to another point

I want to use a CGVector for a SKAction.move, which moves one SKSpriteNode towards another SKSpriteNode.
I want to use this code:
let point: CGPoint = CGPoint(x: CGFloat(arc4random_uniform(UInt32(size.width))), y: CGFloat(arc4random_uniform(UInt32(size.height))))
let object: SKSpriteNode = SKSpriteNode(color: NSColor.red, size: CGSize(width: 10, height: 10))
object.position = CGPoint(x: 50, y: 50)
addChild(object)
object.run(SKAction.move(by: CGVector(), duration: 2.5)) // <- Vector from `object.position` to `point`
You need to use vector math, so looking at your code you want to use something like this:
let P1 = point
let P2 = object.position
// Find the offset (CGPoint has no built-in "-" operator; see the operator sketch below)
let offset = P1 - P2
// Set the X and Y distance to move
let distX = offset.x
let distY = offset.y
// Set the vector
let vec = CGVector(dx: distX, dy: distY)
// Run the action
object.run(SKAction.move(by: vec, duration: 2.4))
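As noted in the comment above, CGPoint does not ship with a "-" operator, so P1 - P2 only compiles if you define one yourself. A minimal sketch of such an overload (my addition, not part of SpriteKit or the original answer):

import CoreGraphics

// Component-wise subtraction so that `point - object.position` yields an offset CGPoint
func - (lhs: CGPoint, rhs: CGPoint) -> CGPoint {
    return CGPoint(x: lhs.x - rhs.x, y: lhs.y - rhs.y)
}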
Let me know how you get on

How to Animate a UIImage out of the Screen in Swift

I have searched around the web but nothing could answer my question.
I am trying to animate a UIImageView off the screen, set its alpha to 0, reset its position and then set the alpha back to 1.
I can't seem to work out the right position to animate the image to.
I have set up a gesture recognizer to drag the image around, and when the center of the image moves past a certain point, I want to animate it off screen.
This is the recognizer code:
@objc func imageDragged(gestureRecognizer: UIPanGestureRecognizer) {
    // Current point where the image is dragged
    let draggedLabelPoint = gestureRecognizer.translation(in: view)
    // Updating the center of the image: view center +/- current translation
    matchImageView.center = CGPoint(x: view.bounds.width/2 + draggedLabelPoint.x, y: view.bounds.height/2 + draggedLabelPoint.y)
    let xFromCenter = view.bounds.width/2 - matchImageView.center.x
    var rotation = CGAffineTransform(rotationAngle: xFromCenter / -500)
    let scale = min(100/abs(xFromCenter), 1)
    var scaledAndRotated = rotation.scaledBy(x: scale, y: scale)
    // Transforming the item relative to the distance from center
    matchImageView.transform = scaledAndRotated
    if gestureRecognizer.state == .ended {
        if matchImageView.center.x < (view.bounds.width/2 - 100) {
            animateImageToSide(target: CGPoint(x: view.bounds.width - 200, y: matchImageView.center.y))
            matchImageView.alpha = 0
        }
        if matchImageView.center.x > (view.bounds.width/2 + 100) {
            animateImageToSide(target: CGPoint(x: view.bounds.width + 200, y: matchImageView.center.y))
            matchImageView.alpha = 0
        }
        // Reset the scaling variables and recenter the item after the swipe
        rotation = CGAffineTransform(rotationAngle: 0)
        scaledAndRotated = rotation.scaledBy(x: 1, y: 1)
        matchImageView.transform = scaledAndRotated
        matchImageView.center = CGPoint(x: view.bounds.width/2, y: view.bounds.height/2)
    }
}
I am sorry if the resetting part is irrelevant for you, but I'm not sure whether it could cause this behavior.
And this is my current animation method:
func animateImageToSide(target: CGPoint) {
    UIImageView.animate(withDuration: 0.5, delay: 0, options: .curveLinear, animations: { self.matchImageView.center = target }) { (success: Bool) in
        self.updateImage()
        self.fadeInImage() // This is where I set the alpha back to 1
    }
}
I am not sure if a CGPoint is what I need to use.
The current behavior of the image is very weird. Most of the time it snaps to a specific place, regardless of the target position. Maybe my approach is entirely wrong.
Thanks for your help!
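For what it's worth, one way to express the flow described above (animate off screen while fading out, then recenter and fade back in) is to chain UIView.animate calls and reset the drag transform first. This is only a rough sketch of that idea, not a verified fix; updateImage() is the method from the question, everything else is an assumption:

func animateImageOffScreen(to target: CGPoint) {
    // Undo the drag rotation/scaling so the center math behaves predictably
    matchImageView.transform = .identity
    UIView.animate(withDuration: 0.5, delay: 0, options: .curveLinear, animations: {
        self.matchImageView.center = target
        self.matchImageView.alpha = 0
    }) { _ in
        // Recenter while invisible, swap the image, then fade back in
        self.matchImageView.center = CGPoint(x: self.view.bounds.width / 2,
                                             y: self.view.bounds.height / 2)
        self.updateImage()
        UIView.animate(withDuration: 0.3) {
            self.matchImageView.alpha = 1
        }
    }
}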

How to prepare a rectangle-type tab bar in Swift?

I have prepared a tab bar made of plain square rectangles, but I am not able to create the rectangle-type shape for a tab bar button, UIButton, or view. How can I prepare a tab bar, button, or view like the one in the following GIF?
You can try something like the following for each button:
let tabBar1Path = UIBezierPath()
let tabBar1Layer = CAShapeLayer()
tabBar1Path.move(to: CGPoint(x: 0, y: 0))
tabBar1Path.addLine(to: CGPoint(x: self.view!.frame.size.width * 0.25, y: 0))
tabBar1Path.addLine(to: CGPoint(x: self.view!.frame.size.width * 0.2, y: tabBarHeight))
tabBar1Path.addLine(to: CGPoint(x: 0, y: tabBarHeight))
tabBar1Path.addLine(to: CGPoint(x: 0, y: 0))
tabBar1Layer.path = tabBar1Path.cgPath
tabBar1Layer.fillColor = UIColor.black.cgColor
tabBarButton1.layer.addSublayer(tabBar1Layer)
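If every tab needs the same kind of shape, you could fold the four addLine calls into a small helper and reuse it per button. This is only a sketch; the helper name, the slant amount and the equal-width assumption are mine, not from the answer:

// Hypothetical helper: a trapezoid whose bottom-right corner is pulled in by `slant` points
func trapezoidPath(fromX leftX: CGFloat, width: CGFloat, height: CGFloat, slant: CGFloat) -> UIBezierPath {
    let path = UIBezierPath()
    path.move(to: CGPoint(x: leftX, y: 0))
    path.addLine(to: CGPoint(x: leftX + width, y: 0))
    path.addLine(to: CGPoint(x: leftX + width - slant, y: height))
    path.addLine(to: CGPoint(x: leftX, y: height))
    path.close()
    return path
}

// For example, the first button above:
// let path = trapezoidPath(fromX: 0, width: view.frame.width * 0.25,
//                          height: tabBarHeight, slant: view.frame.width * 0.05)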

Swift 3: get color of pixel in UIImage (better: UIImageView)

I tried different solutions (e.g. this one), but the color I get back looks a bit different than in the real image. I guess it's because the image is only RGB, not RGBA. Could that be the issue?
Related issue: if the UIImageView has contentMode = .scaleAspectFill, do I have to do a recalculation for the image, or can I just use imageView.image?
EDIT:
I tried with this extension:
extension CALayer {
    func getPixelColor(point: CGPoint) -> CGColor {
        var pixel: [CUnsignedChar] = [0, 0, 0, 0]
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
        let context = CGContext(data: &pixel, width: 1, height: 1, bitsPerComponent: 8, bytesPerRow: 4, space: colorSpace, bitmapInfo: bitmapInfo.rawValue)
        // Shift the layer so the requested point lands on the 1x1 context, then render into it
        context!.translateBy(x: -point.x, y: -point.y)
        self.render(in: context!)
        let red: CGFloat = CGFloat(pixel[0]) / 255.0
        let green: CGFloat = CGFloat(pixel[1]) / 255.0
        let blue: CGFloat = CGFloat(pixel[2]) / 255.0
        let alpha: CGFloat = CGFloat(pixel[3]) / 255.0
        let color = UIColor(red: red, green: green, blue: blue, alpha: alpha)
        return color.cgColor
    }
}
but for some images it seems as if the coordinate system is flipped, and for others I get really wrong values... what am I missing here?
EDIT 2:
I tried with these images:
https://dl.dropboxusercontent.com/u/119600/gradient.png
https://dl.dropboxusercontent.com/u/119600/gradient#2x.png
but I do get wrong values. They are embedded in a UIImageView but I convert the coordinates:
private func convertScreenPointToImage(point: CGPoint) -> CGPoint {
    let widthMultiplier = gradientImage.size.width / UIScreen.main.bounds.width
    let heightMultiplier = gradientImage.size.height / UIScreen.main.bounds.height
    return CGPoint(x: point.x * widthMultiplier, y: point.y * heightMultiplier)
}
This one gives me === Optional((51, 76, 184, 255)) when running on the iPhone 7 simulator, which is not correct...
I wrote this in a playground. I index into the image data with a pointer and grab the RGBA values:
func pixel(in image: UIImage, at point: CGPoint) -> (UInt8, UInt8, UInt8, UInt8)? {
    let width = Int(image.size.width)
    let height = Int(image.size.height)
    let x = Int(point.x)
    let y = Int(point.y)
    guard x < width && y < height else {
        return nil
    }
    guard let cfData: CFData = image.cgImage?.dataProvider?.data, let pointer = CFDataGetBytePtr(cfData) else {
        return nil
    }
    let bytesPerPixel = 4
    let offset = (x + y * width) * bytesPerPixel
    return (pointer[offset], pointer[offset + 1], pointer[offset + 2], pointer[offset + 3])
}

let image = UIImage(named: "t.png")!
if let (r, g, b, a) = pixel(in: image, at: CGPoint(x: 1, y: 2)) {
    print("Red: \(r), Green: \(g), Blue: \(b), Alpha: \(a)")
}
Note that if you use this on a UIImage that is a property of a UIImageView, the pixel coordinates are those of the actual image at its original resolution, not the screen coordinates of the scaled UIImageView. Also, I tried it with an RGB JPEG and an RGBA PNG, and both get imported as 32-bit RGBA images, so it works for either.
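Regarding the contentMode = .scaleAspectFill part of the question: before indexing into the pixel data you need to map the point from view coordinates to image coordinates, and that mapping depends on how the image is fitted. The sketch below is my own derivation for .scaleAspectFill, so treat it as an assumption to verify rather than a drop-in answer; also remember that image.size is in points, so for @2x/@3x assets you may still need to multiply by image.scale before indexing the CGImage.

import UIKit

// Maps a point in the image view's bounds to a coordinate in the underlying image,
// assuming contentMode == .scaleAspectFill (image scaled to fill and centered, overflow cropped).
func imagePoint(forViewPoint viewPoint: CGPoint, in imageView: UIImageView) -> CGPoint? {
    guard let image = imageView.image else { return nil }
    let viewSize = imageView.bounds.size
    let imageSize = image.size
    // .scaleAspectFill uses the larger of the two scale factors
    let scale = max(viewSize.width / imageSize.width, viewSize.height / imageSize.height)
    // How far the scaled image overflows the view on each axis (split evenly by centering)
    let xOffset = (imageSize.width * scale - viewSize.width) / 2
    let yOffset = (imageSize.height * scale - viewSize.height) / 2
    // Undo the centering offset, then undo the scaling
    return CGPoint(x: (viewPoint.x + xOffset) / scale,
                   y: (viewPoint.y + yOffset) / scale)
}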

Getting Pixel Color from an Image using CGPoint in Swift 3

I am trying this PixelExtractor class in Swift 3 and get an error:
Cannot invoke initializer for type 'UnsafePointer' with an argument list of type '(UnsafeMutableRawPointer?)'
class PixelExtractor: NSObject {
    let image: CGImage
    let context: CGContextRef?

    var width: Int {
        get {
            return CGImageGetWidth(image)
        }
    }

    var height: Int {
        get {
            return CGImageGetHeight(image)
        }
    }

    init(img: CGImage) {
        image = img
        context = PixelExtractor.createBitmapContext(img)
    }

    class func createBitmapContext(img: CGImage) -> CGContextRef {
        // Get image width, height
        let pixelsWide = CGImageGetWidth(img)
        let pixelsHigh = CGImageGetHeight(img)
        let bitmapBytesPerRow = pixelsWide * 4
        let bitmapByteCount = bitmapBytesPerRow * Int(pixelsHigh)
        // Use the generic RGB color space.
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        // Allocate memory for image data. This is the destination in memory
        // where any drawing to the bitmap context will be rendered.
        let bitmapData = malloc(bitmapByteCount)
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedFirst.rawValue)
        let size = CGSizeMake(CGFloat(pixelsWide), CGFloat(pixelsHigh))
        UIGraphicsBeginImageContextWithOptions(size, false, 0.0)
        // create bitmap
        let context = CGBitmapContextCreate(bitmapData, pixelsWide, pixelsHigh, 8,
            bitmapBytesPerRow, colorSpace, bitmapInfo.rawValue)
        // draw the image onto the context
        let rect = CGRect(x: 0, y: 0, width: pixelsWide, height: pixelsHigh)
        CGContextDrawImage(context, rect, img)
        return context!
    }

    func colorAt(x x: Int, y: Int) -> UIColor {
        assert(0<=x && x<width)
        assert(0<=y && y<height)
        let uncastedData = CGBitmapContextGetData(context)
        let data = UnsafePointer<UInt8>(uncastedData)
        let offset = 4 * (y * width + x)
        let alpha: UInt8 = data[offset]
        let red: UInt8 = data[offset+1]
        let green: UInt8 = data[offset+2]
        let blue: UInt8 = data[offset+3]
        let color = UIColor(red: CGFloat(red)/255.0, green: CGFloat(green)/255.0, blue: CGFloat(blue)/255.0, alpha: CGFloat(alpha)/255.0)
        return color
    }
}
To fix this error I changed
let data = UnsafePointer<UInt8>(uncastedData)
to
let data = UnsafeRawPointer(uncastedData)
but then I get another error: 'Type 'UnsafeRawPointer?' has no subscript members'
How can I fix this?
You can write something like this when you have an UnsafeRawPointer in your data:
let alpha = data.load(fromByteOffset: offset, as: UInt8.self)
let red = data.load(fromByteOffset: offset+1, as: UInt8.self)
let green = data.load(fromByteOffset: offset+2, as: UInt8.self)
let blue = data.load(fromByteOffset: offset+3, as: UInt8.self)
Alternatively, you can get an UnsafeMutablePointer<UInt8> from your uncastedData (assuming it's an UnsafeMutableRawPointer):
let data = uncastedData.assumingMemoryBound(to: UInt8.self)
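Putting the second variant together, the colorAt method from the class above could look roughly like this in Swift 3. This is only a sketch, assuming the rest of the class has already been migrated (CGContext instead of CGContextRef, context.data instead of CGBitmapContextGetData) and that the bitmap keeps the premultiplied-first byte order used when the context was created:

func colorAt(x: Int, y: Int) -> UIColor {
    assert(0 <= x && x < width)
    assert(0 <= y && y < height)
    // context?.data is an UnsafeMutableRawPointer?; bind it to UInt8 so it can be indexed
    guard let raw = context?.data else { return .clear }
    let data = raw.assumingMemoryBound(to: UInt8.self)
    let offset = 4 * (y * width + x)
    let alpha = data[offset]      // premultipliedFirst: bytes are A, R, G, B
    let red = data[offset + 1]
    let green = data[offset + 2]
    let blue = data[offset + 3]
    return UIColor(red: CGFloat(red) / 255.0,
                   green: CGFloat(green) / 255.0,
                   blue: CGFloat(blue) / 255.0,
                   alpha: CGFloat(alpha) / 255.0)
}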
Swift 3 (updated March 2017), Xcode 8 / iOS 10
Important: note that the parameters are passed as red: b, green: g and blue: r, because in the data the channels are stored in reverse order.
First, create the extension (you can copy & paste it somewhere in your code):
extension UIImage {
    func getPixelColor(pos: CGPoint) -> UIColor {
        if let pixelData = self.cgImage?.dataProvider?.data {
            let data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)
            let pixelInfo: Int = ((Int(self.size.width) * Int(pos.y)) + Int(pos.x)) * 4
            let r = CGFloat(data[pixelInfo+0]) / CGFloat(255.0)
            let g = CGFloat(data[pixelInfo+1]) / CGFloat(255.0)
            let b = CGFloat(data[pixelInfo+2]) / CGFloat(255.0)
            let a = CGFloat(data[pixelInfo+3]) / CGFloat(255.0)
            return UIColor(red: b, green: g, blue: r, alpha: a)
        } else {
            // If something is wrong I return white, but change as needed
            return UIColor.white
        }
    }
}
Then just call it as:
let colorAtPixel : UIColor = (theView.image?.getPixelColor(pos: CGPoint(x: 2, y: 2)))!
Although the code returns the exact color, it seems that it does not return the correct one for different CGPoints.
Might it be because of the screen resolution (@1x, @2x, @3x)?
It would be great if someone could shed some light on the mystery...
Swift 3 (iOS 10.3)
extension UIImage {
    func getPixelColor(atLocation location: CGPoint, withFrameSize size: CGSize) -> UIColor {
        let x: CGFloat = (self.size.width) * location.x / size.width
        let y: CGFloat = (self.size.height) * location.y / size.height
        let pixelPoint: CGPoint = CGPoint(x: x, y: y)
        let pixelData = self.cgImage!.dataProvider!.data
        let data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)
        let pixelIndex: Int = ((Int(self.size.width) * Int(pixelPoint.y)) + Int(pixelPoint.x)) * 4
        let r = CGFloat(data[pixelIndex]) / CGFloat(255.0)
        let g = CGFloat(data[pixelIndex+1]) / CGFloat(255.0)
        let b = CGFloat(data[pixelIndex+2]) / CGFloat(255.0)
        let a = CGFloat(data[pixelIndex+3]) / CGFloat(255.0)
        return UIColor(red: r, green: g, blue: b, alpha: a)
    }
}
Usage:
let color = yourImageView.image!.getPixelColor(atLocation: location, withFrameSize: yourImageView.frame.size)
location is a CGPoint and size is the size of your image view.
The following section is taken from some Swift 3 code I'm using to sample pixels from an image to get the predominant hue which I use to generate a background for tableView rows. The mechanics for the hue selection process don't apply to your question, so I'm just providing the relevant fragment.
let colorSpace = CGColorSpaceCreateDeviceRGB() // UIExtendedSRGBColorSpace
let newImage = image.cgImage?.copy(colorSpace: colorSpace)
let pixelData = newImage?.dataProvider!.data
let data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)
var hueFrequency = [Int: Double]()
hueFrequency[1] = 1 // Add one entry so this serves as a default if no hues from the image pass the filters
let nStart = 1
let mStart = 1
for n in nStart...Int(image.size.width / samplingFactor) {
    for m in mStart...Int(image.size.height / samplingFactor) {
        let pixelInfo: Int = ((Int(image.size.width) * m * Int(samplingFactor)) + n * Int(samplingFactor)) * 4 // bytesPerPixel
        let b = CGFloat(data[pixelInfo]) / CGFloat(255.0) // cgImage bitmapInfo = rawValue 8194 -> BGRA ordering
        let g = CGFloat(data[pixelInfo+1]) / CGFloat(255.0)
        let r = CGFloat(data[pixelInfo+2]) / CGFloat(255.0)
        let a = CGFloat(data[pixelInfo+3]) / CGFloat(255.0)
Also, note that I found the bitmapInfo value (image.cgImage!.bitmapInfo using my parameters) indicated a reordering of the RGBA sequence to BGRA, which I had to deal with in ordering the steps to pick out the data. If your colors are off, you may want to check this.
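If you'd rather detect the channel order than hard-code BGRA, you can inspect the CGImage's bitmapInfo and alphaInfo yourself. The helper below is just a diagnostic sketch I'm assuming for illustration; the rawValue 8194 mentioned above is byteOrder32Little (8192) combined with premultipliedFirst (2), which corresponds to the B, G, R, A byte layout:

// Rough diagnostic: print how the pixel bytes of a CGImage are laid out.
func describePixelLayout(of cgImage: CGImage) {
    let info = cgImage.bitmapInfo
    let alpha = cgImage.alphaInfo
    // 32-bit little-endian + premultipliedFirst is typically B, G, R, A in memory;
    // the default big-endian order + premultipliedLast is typically R, G, B, A.
    let littleEndian32 = info.contains(.byteOrder32Little)
    print("bitmapInfo rawValue:", info.rawValue,
          "alphaInfo:", alpha.rawValue,
          "32-bit little-endian:", littleEndian32)
}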
