I am trying to create an NSAttributedString that includes an NSImage for an OS X application.
I have tried a few different ways, but with this basic code:
let image = NSImage(named: "super-graphic")!
let attachment = NSTextAttachment()
attachment.image = image
let imageString = NSAttributedString(attachment: attachment)
When I set this as the attributed string of an NSTextField (there is no NSLabel on OS X), the image doesn't render.
Is it possible to combine NSImage and NSAttributedString to embed an image in an attributed string on OS X?
Well, I feel silly, this code works:
let image = NSImage(named: "super-graphic")!
let attachment = NSTextAttachment()
let cell = NSTextAttachmentCell(imageCell: image)
attachment.attachmentCell = cell
let imageString = NSAttributedString(attachment: attachment)
The key difference was to avoid the `image` property of the NSTextAttachment and set an NSTextAttachmentCell on `attachmentCell` instead.
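For context, a sketch of how the attachment-cell approach can be combined with ordinary text in one attributed string (the image name "super-graphic" and the surrounding strings are just examples):

```swift
import AppKit

// Sketch: build a mixed text-plus-image attributed string for an NSTextField.
let text = NSMutableAttributedString(string: "Before the image ")

let image = NSImage(named: "super-graphic")!
let attachment = NSTextAttachment()
// On OS X, set an NSTextAttachmentCell rather than the `image` property.
attachment.attachmentCell = NSTextAttachmentCell(imageCell: image)
text.append(NSAttributedString(attachment: attachment))

text.append(NSAttributedString(string: " after the image"))
// textField.attributedStringValue = text
```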
Related
I am trying to convert a UIImage to a SwiftUI Image using the init(uiImage:) initializer. My UIImage itself is created from a CIImage generated by a CIQRCodeGenerator CIFilter. I am running my code on a Playground in Xcode 11.1 GM seed 1. Here is the entirety of my code:
import SwiftUI
import UIKit

func qrCodeImage(for string: String) -> Image? {
    let data = string.data(using: String.Encoding.utf8)
    guard let qrFilter = CIFilter(name: "CIQRCodeGenerator") else { return nil }
    qrFilter.setValue(data, forKey: "inputMessage")
    guard let ciImage = qrFilter.outputImage else { return nil }
    let uiImage = UIImage(ciImage: ciImage)
    let image = Image(uiImage: uiImage)
    return image
}
let image = qrCodeImage(for: "fdsa")
And here is the result:
Even when I transform the image with CGAffineTransform(scaleX: 10, y: 10), the resulting SwiftUI Image at the end is still the same size, but blank.
Following the solution provided in: Generating QR Code with SwiftUI shows empty picture
Here is the code:
var ciContext = CIContext()

func qrCodeImage(for string: String) -> Image? {
    let data = string.data(using: String.Encoding.utf8)
    guard let qrFilter = CIFilter(name: "CIQRCodeGenerator") else { return nil }
    qrFilter.setValue(data, forKey: "inputMessage")
    guard let ciImage = qrFilter.outputImage else { return nil }
    let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent)
    let uiImage = UIImage(cgImage: cgImage!)
    let image = Image(uiImage: uiImage)
    return image
}

let image = qrCodeImage(for: "fdsa")
Result: [screenshot in Swift playground]
I can confirm I encounter the same issue with a SwiftUI Image using a UIImage initialized from data. I can verify that the image is loaded when paused in the debugger, but it does not display in the SwiftUI Image.
This solution worked for me: explicitly specify the image rendering mode. In my case I added the following: .renderingMode(.original)
@Eugene's remark worked for me:
let image = Image(uiImage: uiImage).renderingMode(.original)
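Putting both fixes together (rendering the CIImage through a CIContext into a CGImage, and forcing the `.original` rendering mode), a sketch of the full pipeline might look like this; the scale transform is optional and only sharpens the output:

```swift
import SwiftUI
import UIKit

let ciContext = CIContext()

// Sketch: render the CIImage into a real CGImage via a CIContext,
// then force .original rendering mode on the resulting SwiftUI Image.
func qrCodeImage(for string: String) -> Image? {
    guard let data = string.data(using: .utf8),
          let qrFilter = CIFilter(name: "CIQRCodeGenerator") else { return nil }
    qrFilter.setValue(data, forKey: "inputMessage")
    guard let ciImage = qrFilter.outputImage else { return nil }
    // Optional: scale up so the QR code is not blurry when displayed.
    let scaled = ciImage.transformed(by: CGAffineTransform(scaleX: 10, y: 10))
    guard let cgImage = ciContext.createCGImage(scaled, from: scaled.extent) else { return nil }
    return Image(uiImage: UIImage(cgImage: cgImage)).renderingMode(.original)
}
```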
How can I display HTML content from an API (such as images and video) in a UILabel? I already have the plain text working. Is there a way to render all the content in a UILabel, without using a text view or WebKit?
My code kind of works for the images, but the images are huge/cut off, and it doesn't embed videos from YouTube or Facebook.
https://imgur.com/a/zdaMCHv
do {
    if let htmlMessageData = post?.content.data(using: String.Encoding.utf16) {
        let layoutMessage = try NSMutableAttributedString(
            data: htmlMessageData,
            options: [NSAttributedString.DocumentReadingOptionKey.documentType: NSAttributedString.DocumentType.html],
            documentAttributes: nil)
        let paragraphStyle = NSMutableParagraphStyle()
        paragraphStyle.alignment = .justified
        // NSRange works in UTF-16 units, so use the attributed string's
        // length rather than string.count
        layoutMessage.addAttribute(NSAttributedString.Key.paragraphStyle,
                                   value: paragraphStyle,
                                   range: NSRange(location: 0, length: layoutMessage.length))
        self.contentLabel.attributedText = layoutMessage
    }
} catch {
    // handle or log the error
}
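One possible way to tame the oversized images (a sketch; `layoutMessage` and `contentLabel` refer to the code above) is to walk the attachments in the attributed string and shrink each image to the label's width. Note that attachments produced by the HTML importer sometimes carry their data in `fileWrapper` rather than `image`, so this may need adjusting:

```swift
import UIKit

// Sketch: shrink every image attachment in an attributed string so it fits
// a maximum width, preserving aspect ratio.
func fitAttachments(in attributed: NSMutableAttributedString, toWidth maxWidth: CGFloat) {
    let fullRange = NSRange(location: 0, length: attributed.length)
    attributed.enumerateAttribute(.attachment, in: fullRange, options: []) { value, _, _ in
        guard let attachment = value as? NSTextAttachment,
              let image = attachment.image,
              image.size.width > maxWidth else { return }
        let scale = maxWidth / image.size.width
        attachment.bounds = CGRect(x: 0, y: 0,
                                   width: maxWidth,
                                   height: image.size.height * scale)
    }
}

// Usage: fitAttachments(in: layoutMessage, toWidth: contentLabel.bounds.width)
```

Embedded video playback, however, is not something a UILabel can do at all; that genuinely requires a web view or a dedicated player view.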
I just tried to add an image inside the text view using the native NSAttributedString and NSTextAttachment, getting some help from this article here.
However, I am unable to do it. I am using the nativescript-mediafilepicker library to pick the image from the photo library, then converting the PHImage to a UIImage using one of its built-in methods. But the text view does not get updated with the image. I can append more text through the NSAttributedString, but not an image.
Here's my code.
// create and initialize a new NSMutableAttributedString from the existing text
const attributedString = NSMutableAttributedString.alloc().initWithString(textview.ios.text);

// `value` is the UIImage object that convertPHImageToUIImage returns
const image = value;
console.log(image);
// the above log prints: <UIImage: 0x2817ac4d0> size {4032, 3024} orientation 0 scale 1.000000

// scale the image down to the text view's width
const oldWidth = image.size.width;
const scaleFactor = oldWidth / (textview.ios.frame.size.width - 10);
// the orientation must be a UIImageOrientation value, not the string "up"
const orientation = UIImageOrientation.Up;

// create an NSTextAttachment and give it the scaled UIImage
const textAttachment = NSTextAttachment.alloc().init();
textAttachment.image = UIImage.imageWithCGImageScaleOrientation(image.CGImage, scaleFactor, orientation);

// NSAttributedString is immutable, so `attrStringWithImage.attachment = ...`
// on a plain init() does nothing; build the string from the attachment instead
const attrStringWithImage = NSAttributedString.attributedStringWithAttachment(textAttachment);

// append the attachment string to the mutable string and set it back
attributedString.appendAttributedString(attrStringWithImage);
textview.ios.attributedText = attributedString;
Install tns-platform-declarations if you are using TypeScript; that will make your life easier when you want to access native APIs.
UIImage.imageWithCGImageScaleOrientation(cgImage, scale, orientation);
These docs will help you understand how Objective-C is marshalled to JavaScript / TypeScript.
How can I display an image in a UIImageView from
documentsDirectoryURL.URLByDeletingLastPathComponent!.URLByDeletingLastPathComponent!.URLByDeletingLastPathComponent!.URLByAppendingPathComponent("img"))
For example, something like this
imageView.image = UIImage(documentsDirectoryURL.URLByDeletingLastPathComponent!.URLByDeletingLastPathComponent!.URLByDeletingLastPathComponent!.URLByAppendingPathComponent("img")))
This is my documentDirectoryURL
let documentsDirectoryURL = NSFileManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask).first as! NSURL
Try this
imageView.image = UIImage(contentsOfFile: documentsDirectoryURL.URLByDeletingLastPathComponent!.URLByDeletingLastPathComponent!.URLByDeletingLastPathComponent!.URLByAppendingPathComponent("img").path!)
Good luck
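In current Swift, the same idea reads as follows (a sketch; it assumes a file named "img" actually exists three directory levels above the Documents directory):

```swift
import UIKit

// Sketch: load a UIImage from a file located three levels above the
// app's Documents directory.
let documentsURL = FileManager.default.urls(for: .documentDirectory,
                                            in: .userDomainMask).first!
let imageURL = documentsURL
    .deletingLastPathComponent()
    .deletingLastPathComponent()
    .deletingLastPathComponent()
    .appendingPathComponent("img")
let image = UIImage(contentsOfFile: imageURL.path)
// imageView.image = image
```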
I'm trying to save images retrieved from Parse.com like this:
let userImageFile = object["Image"] as PFFile
userImageFile.getDataInBackgroundWithBlock {
    (imageData: NSData!, error: NSError!) -> Void in
    if error == nil {
        let image = UIImage(data: imageData)!
        let imageToSave: NSData = UIImagePNGRepresentation(image)
        self.saveImage(intRandomNumb, retImage: imageToSave)
    }
}
where the saveImage-function looks like this:
func saveImage(imagepath: Int, retImage: NSData) {
    println("image is being saved")
    let defaults = NSUserDefaults.standardUserDefaults()
    let imagePathName = "\(imagepath)"
    defaults.setObject(retImage, forKey: imagePathName)
}
and later, I'm trying to display this image like this:
var point = gestureRecognizer.locationInView(self.tv)
if let indexPath = self.tv.indexPathForRowAtPoint(point) {
    let data = mainList[indexPath.row] as SecondModel
    let fileRef = data.fileReference
    let intFileRef = Int(fileRef)
    println(intFileRef)
    let defaults = NSUserDefaults.standardUserDefaults()
    let usedKeyName = "\(intFileRef)"
    if let photo = defaults.objectForKey(usedKeyName) as? UIImage {
        println("Image created")
        let photo = defaults.objectForKey(usedKeyName) as UIImage
        var imageView = UIImageView(frame: CGRect(x: 0, y: 0, width: self.view.frame.width, height: self.view.frame.height))
        imageView.image = photo
        self.view.addSubview(imageView)
    }
}
and "Image created" never gets printed, which means the retrieval somehow doesn't work.
I'm not quite sure if you're able to save images to the userdefaults like I've done here, but that was the best I could come up with, and I couldn't find any previous questions like this for Swift.
Any suggestions on how to proceed would be appreciated.
SOLUTION: The problem was that I tried to load the image directly as a UIImage. I also had to convert the NSData to a UIImage, this all happens in the last section of the code displayed above. Finally my code looks like this:
if let photo = defaults.objectForKey("\(intFileRef)") as? NSData {
    println("Image created")
    let imageToView: UIImage = UIImage(data: photo)!
    var imageView = UIImageView(frame: CGRect(x: 0, y: 0, width: self.view.frame.width, height: self.view.frame.height))
    imageView.image = imageToView
    self.view.addSubview(imageView)
}
I hope this can help others struggling with something similar to this.
Swift 3
Hey, try this code:
Convert your UIImage to Data.
PNG:
let yourDataImagePNG = UIImagePNGRepresentation(yourUIImageHere)
JPEG:
let yourDataImageJPG = UIImageJPEGRepresentation(yourUIImageHere, 1.0)
Save it in UserDefaults:
UserDefaults.standard.set(yourDataImagePNG, forKey: "image")
Recover it with:
let imageData = UserDefaults.standard.data(forKey: "image")
I hope this helps!
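Putting the pieces together, a round-trip sketch in Swift 3 (the key name "image" is just an example; storing large images in UserDefaults is generally discouraged, so prefer writing files to disk for anything beyond small thumbnails):

```swift
import UIKit

// Sketch: save a UIImage to UserDefaults as PNG data and read it back.
func saveImage(_ image: UIImage, forKey key: String) {
    guard let data = UIImagePNGRepresentation(image) else { return }
    UserDefaults.standard.set(data, forKey: key)
}

func loadImage(forKey key: String) -> UIImage? {
    guard let data = UserDefaults.standard.data(forKey: key) else { return nil }
    return UIImage(data: data)
}
```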
It seems like you never call defaults.synchronize(), so the data may not have been written to the defaults file yet.