SwiftUI - How does Apple Music create their background gradient?

I was taking a look at the background that Apple Music uses when displaying the currently playing song, like these:
I really like the way that these backgrounds echo the color of the album cover, but I'm not entirely sure how I could implement something similar, given an Image. At first I thought it was just a magnified and blurred copy of the album Image, like this:
Image("album cover")
    .resizable()
    .frame(width: 300, height: 300)
    .blur(radius: 20)
But looking at the two images above I don't think this is the case, as not all colors in the cover image are included. Instead, do they use a radial gradient? If so, how do they pick which colors to use, and how could I do something similar, given an Image?
Thanks for the help!

I went through a similar exercise recently, and here is what I used:
.background {
    ZStack {
        Rectangle()
            .fill(backgroundColor().gradient)
            .edgesIgnoringSafeArea(.all)
        Rectangle()
            .fill(.ultraThinMaterial)
            .edgesIgnoringSafeArea(.all)
    }
}
Note that .gradient is available since iOS 16. The background color can be any SwiftUI Color; in my case, I'm using the average color of the song artwork, which MusicKit has exposed since iOS 15.
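For reference, a minimal sketch of how this can fit together, assuming a song fetched via MusicKit (NowPlayingBackground is a made-up name, and I'm assuming Artwork's backgroundColor property is the artwork color the answer refers to):

import SwiftUI
import MusicKit

// A sketch, not Apple's actual implementation. Requires iOS 16 for .gradient.
struct NowPlayingBackground: View {
    let song: Song

    // MusicKit's Artwork exposes a pre-computed color for the artwork (iOS 15+).
    private var artworkColor: Color {
        guard let cgColor = song.artwork?.backgroundColor else {
            return .gray // fallback when no artwork color is available
        }
        return Color(cgColor: cgColor)
    }

    var body: some View {
        ZStack {
            Rectangle()
                .fill(artworkColor.gradient)
            Rectangle()
                .fill(.ultraThinMaterial)
        }
        .ignoresSafeArea()
    }
}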

Related

How to get the approximate background color of an image?

Is there a way to get the approximate background color from an image in Flutter? I am getting my image from a URL. I don't need an exact background color: just an approximation - for instance, getting the color of the pixel in the top left corner (0, 0) would be just fine.
There seems to be no easy way to do this - I have tried many imaging packages, but they only provide "primary color" and not background color.
Old question, but for people still needing this, see the ImagePixels widget from the https://pub.dev/packages/image_pixels package (I am the author of this package):
@override
Widget build(BuildContext context) {
  return ImagePixels(
    imageProvider: image,
    builder: (context, img) {
      Color topLeftColor = img.pixelColorAt(0, 0);
      return Text("Pixel color at top-left: $topLeftColor.");
    },
  );
}
Note you could also get a dozen pixels all around the image (or at the corners of the image), and then average them. This would have a better chance of getting a good representative color.
Also note, if all you want to do with this color is to extend it to a larger area, there is a constructor that does that for you: ImagePixels.container.
Have you tried the image package?
If you just want the top-left corner pixel, I believe you can read the image's pixels and get it.

Will frame() on an Image in SwiftUI have a different size on each device?

I have code for a View with an Image in SwiftUI, and I wonder if the frame() function will produce a different size on each device (like iPhone 8, Xs, Xs Max, etc.):
struct CircleImage : View {
    var body: some View {
        Image("jorge")
            .resizable()
            .frame(width: 100, height: 100)
            .clipShape(Circle())
            .overlay(
                Circle().stroke(Color.white, lineWidth: 4))
            .shadow(radius: 10)
    }
}
I have the Xcode 11 beta, but I don't have Catalina, since I have to do my work on Mojave and it doesn't run on Catalina. So I can't find any clues with the SwiftUI preview.
I also tried the code above on multiple simulators (iPhone 8 and Xs Max), and the images seem to have the same size, but I'm not sure about it.
And if the sizes are the same, what is the standard screen size? And if not, how can I find each screen's dimensions?
I'm not sure I understand your question, but I'll do my best to answer. When you specify frame(width: 100, height: 100), SwiftUI will make it 100x100 points, no matter which device. If you would like to adapt to the available space of a device, check the example I posted in this other answer: https://stackoverflow.com/a/56853211/7786555
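That linked answer relies on GeometryReader; here is a minimal sketch of the idea (AdaptiveCircleImage is a made-up name, and the half-width ratio is just an example):

struct AdaptiveCircleImage: View {
    var body: some View {
        GeometryReader { geometry in
            Image("jorge")
                .resizable()
                .scaledToFit()
                // Sized relative to the available space instead of a fixed
                // 100x100 points, so it adapts to each device's screen.
                .frame(width: geometry.size.width / 2)
        }
    }
}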
Additionally, here's the documentation to get the screen size for your current device:
The bounding rectangle of the screen, measured in points:
https://developer.apple.com/documentation/uikit/uiscreen/1617838-bounds
The bounding rectangle of the physical screen, measured in pixels:
https://developer.apple.com/documentation/uikit/uiscreen/1617810-nativebounds
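As a quick illustration of the difference (the values in the comments are examples for an iPhone Xs):

import UIKit

let points = UIScreen.main.bounds.size       // e.g. 375 x 812 (points)
let pixels = UIScreen.main.nativeBounds.size // e.g. 1125 x 2436 (pixels)
let scale  = UIScreen.main.nativeScale       // e.g. 3.0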

tvOS: Rounded corners for image view when focused

I have an image view that will get this awesome tvOS focus effect when the containing view gets focused.
The problem is - it should have rounded corners. Now this is easily done:
imageView.layer.cornerRadius = 5
imageView.layer.masksToBounds = true
I have to set either masksToBounds on the layer or clipsToBounds on the image view to true (which is basically the same) in order to clip the edges of the image. But as soon as I do this, the focus effect won't work anymore, because it gets clipped as well.
I had more or less the same problem with buttons, but since their focus effect is much simpler (only scaling and shadow), I just implemented it myself. That is not an option for the image view, with all the effects applied (moving, shimmering, and so on).
Is there an easier way? Did I miss something? I can't be the only one trying to figure out how this works!? :)
I have found an alternative solution. What one can do is actually draw the image, clipping the corners with an alpha channel; the image then gets scaled correctly when focused. Then, to have the alpha channel applied to the other layers (like the one for the glowing effect), we need to set masksFocusEffectToContents = true.
I made an extension for it, based on this answer:
Swift 4.2
extension UIImageView {
    func roundedImage(corners: UIRectCorner, radius: CGFloat) {
        let rect = CGRect(origin: .zero, size: self.frame.size)
        UIGraphicsBeginImageContextWithOptions(self.frame.size, false, 1)
        UIBezierPath(
            roundedRect: rect,
            byRoundingCorners: corners,
            cornerRadii: CGSize(width: radius, height: radius)
        ).addClip()
        // Draw the current image into the clipped context, so the
        // corners end up transparent in the resulting image.
        self.image?.draw(in: rect)
        self.image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        // Shadows - change shadowOpacity to a value > 0 to enable them.
        self.layer.shadowOpacity = 0
        self.layer.shadowColor = UIColor.black.cgColor
        self.layer.shadowOffset = CGSize(width: 10, height: 15)
        self.layer.shadowRadius = 3
        // This propagates the transparency to the overlay layers,
        // like the one for the glowing effect.
        self.masksFocusEffectToContents = true
    }
}
Then to apply the rounded corners call:
myImageView.adjustsImageWhenAncestorFocused = true
myImageView.clipsToBounds = false
// Masks all corners with a radius of 25 in myImageView
myImageView.roundedImage(corners: UIRectCorner.allCorners, radius: 25)
One can obviously modify roundedImage() to add the parameters to define the shadows at the calling time.
Downsides: borders behave like cornerRadius (they get drawn inside the image). I think I had it working at some point, but then, investigating further, I lost the changes.
I am not exactly sure this is the right way to do it; I am quite confident there must be some method out there doing it in a couple of lines. In tvOS 11, Apple introduced the round badges (animatable and all) shown at WWDC 2017; I just can't find a sample for them.
Otherwise, tvOS 12 (beta for now) introduced the lockup views. I managed to implement them programmatically, as shown in this answer:
https://forums.developer.apple.com/thread/20513
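For reference, a rough sketch of a programmatic lockup using TVUIKit (tvOS 12+); the asset name and labels are placeholders:

import TVUIKit

let posterView = TVPosterView(image: UIImage(named: "poster"))
posterView.title = "Movie Title"  // text shown under the poster
posterView.subtitle = "2018"
posterView.frame = CGRect(x: 0, y: 0, width: 250, height: 400)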
We are also facing this same issue. When you round the corners, you can see that the "shine" still has a rectangle shape.
I showed the issue to the Developer Evangelists at the Tech Talks in Toronto and they said it's a bug. It's reported and open: rdar://23846376
For 2022:
Note that you can just use TVCardView (from TVUIKit) on tvOS for the effect. Simply put the UIImageView inside the card view.
Don't forget to actually turn OFF adjustsImageWhenAncestorFocused and isUserInteractionEnabled on the image view, or else it will "doubly expand" when the card view expands!
There's also a weird issue where you have to add 20 points to the height of the card view to make it work neatly with an enclosed image view.
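A minimal sketch of that setup, assuming a hypothetical 300x300 artwork asset named "poster" (the extra 20 points of card height reflect the quirk mentioned above):

import TVUIKit

let cardView = TVCardView(frame: CGRect(x: 0, y: 0, width: 300, height: 320))
let imageView = UIImageView(image: UIImage(named: "poster"))
imageView.frame = CGRect(x: 0, y: 0, width: 300, height: 300)
imageView.adjustsImageWhenAncestorFocused = false // the card view handles the focus effect
imageView.isUserInteractionEnabled = false        // avoids the "double expand"
cardView.contentView.addSubview(imageView)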

How do I get my Cocoa app to draw exactly the color specified by my designer in Sketch?

My designer has specified a color to draw. When I try to draw that color in a Cocoa app, I get a resulting color that’s visibly different from the reference image as displayed by Sketch.app.
I made a small Cocoa app that draws a custom view. Here’s the interesting part of the code. Note that I am initializing the color in SRGB space.
class View: NSView {
    override func drawRect(dirtyRect: NSRect) {
        let components: [CGFloat] = [156.0/255.0, 0, 254.0/255.0, 1]
        let color = NSColor(SRGBRed: components[0], green: components[1], blue: components[2], alpha: components[3])
        color.setFill()
        NSRectFill(self.bounds)
    }
}
Here’s what it draws. (Never mind the part about the cursor. And I removed the window shadow so it would be easier to review this side by side with other windows.)
And here’s the Sketch file portion:
Putting it all together, here’s a side by side of the Sketch file and the custom view, as well as Xscope loupe displaying the color value under the mouse cursor.
When hovering over Sketch file, I see this:
When hovering over my custom view, I see this:
You can see that the color value of the pixel under the black mouse cursor, as read by Xscope, is significantly different. The colors also look significantly different on my Retina MacBook Pro display, though interestingly, not so different in the captured screenshot PNG.
HOWEVER: so far, this was all done with the default display settings and the “Color LCD” color profile (the hardware is a Retina MacBook Pro with its built-in display). When I manually change the display profile to “sRGB IEC61966-2.1” in System Preferences and then sample the colors again with Xscope, you can see these sampled values:
And when sampling the custom view:
Most interestingly, you can see that the values sampled by Xscope on my custom view exactly match the specified values, and the color is also visually correct. But of course, I can’t make my users change their display profile.
My question: how do I make my custom view color exactly match the color in Sketch (both for visual inspection and when sampled with the Xscope loupe) with the default Color LCD display profile?
Just worked through this issue myself. Here's my process, tested on a Retina MacBook Pro.
Open Sketch.
Open Digital Color Meter (preinstalled on OS X).
Switch to 'Display in Generic RGB'.
In the menu, ensure 'View -> Display Values -> As Decimal' is selected.
Mouse over your color in the Sketch artwork and note the values (e.g. 0, 150, 200).
Use that value in Cocoa:
- (void)drawRect:(NSRect)dirtyRect {
    [[NSColor colorWithCalibratedRed:0/255.0 green:150/255.0 blue:200/255.0 alpha:1] set];
    NSRectFill(self.bounds);
}
This should work, as 'Generic RGB' is a device-independent space equivalent to the 'calibrated' color space in Cocoa.
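For anyone writing Swift, a sketch of the equivalent (ColorView is a made-up name; calibratedRed uses the same Generic RGB space):

import AppKit

class ColorView: NSView {
    override func draw(_ dirtyRect: NSRect) {
        // Same calibrated (Generic RGB) components as the Objective-C example.
        NSColor(calibratedRed: 0/255.0, green: 150/255.0, blue: 200/255.0, alpha: 1).setFill()
        bounds.fill()
    }
}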

Image sprite has white background and wrong position

I am not very experienced with image sprites, so here is the question:
I made an image sprite on the web; this is the code:
.sprite-slidebutton {
    background-position: 0 0;
    width: 70px;
    height: 63px;
}

.sprite-slidecross {
    background-position: 0 -113px;
    width: 70px;
    height: 63px;
}
I downloaded the image I got to my page and referenced the .png wherever I wanted it. It does appear, and the sprite is working; the image switches like I want it to.
But the PNG is not showing a transparent background, and the image is not centered: I only see half of both images. Where and how do I adjust this?
If I had to guess, I would say there is a problem with the alpha values in the image. Download GIMP and see what they are.
The PNG probably has a white background instead of a transparent background. That's something you will need to edit in Photoshop, GIMP, or similar.
As far as only seeing half the image, it might be due to the element you are assigning the class to. If it's an inline element like <a> or <span>, try adding display: block; inside the sprite-slidebutton class.
