In Xcode 4, when working on an iOS project (maybe it was in Xcode 3 too and I just hadn't noticed it), there is a field under Build Settings called "Combine High Resolution Artwork", which can be set to Yes or No.
What exactly does this setting do?
From Xcode's quick help:
Combine High Resolution Artwork COMBINE_HIDPI_IMAGES
Combines image files at different resolutions into one multi-page TIFF file that is HiDPI compliant for Mac OS X 10.7 and later. Only image files in the same directory and with the same base name and extension are combined. The file names must conform to the naming convention used in HiDPI. [COMBINE_HIDPI_IMAGES]
In other words, the setting probably has no effect under iOS at the moment. It would combine abc.png and abc@2x.png into one multi-page TIFF file, which would be convenient under OS X because NSImage can handle such files and use the image representation that is best suited for the desired output size and device. If future Apple hardware has higher screen resolutions, this setting will probably play an important role in how developers deal with them.
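For illustration, here is a minimal sketch (assuming a hypothetical combined multi-page TIFF named "abc" has ended up in the app bundle) of how NSImage exposes the individual representations and picks the one best suited to a target rect on OS X:

    import AppKit

    // "abc" is a hypothetical asset name; it stands for a multi-page TIFF
    // produced by COMBINE_HIDPI_IMAGES from abc.png and abc@2x.png.
    if let image = NSImage(named: "abc") {
        // NSImage keeps one NSImageRep per resolution (1x, 2x, ...).
        for rep in image.representations {
            print("representation: \(rep.pixelsWide) x \(rep.pixelsHigh) pixels")
        }

        // When drawing, AppKit automatically chooses the representation that
        // best matches the destination rect, but you can also ask explicitly.
        let target = NSRect(x: 0, y: 0, width: 64, height: 64)
        if let best = image.bestRepresentation(for: target, context: nil, hints: nil) {
            print("best for \(target): \(best)")
        }
    }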
Related
We have a number of images used as app assets; for simplicity, let's say 10 images only.
As far as I understand, Xamarin.Forms will not cache these images, so if I have these 10 images showing in my toolbar (this is just to explain the issue; we would not really put 10 images in a tiny toolbar) and I have 5 pages showing the same toolbar, each of the 10 images will be loaded 5 times, resulting in 50 images loaded in total. I would like to load only 10 images rather than 5 pages x 10 images = 50 images.
In addition, if I have to add these 10 images to our repository, I would have to add at least 3 copies of the same image (1 for Android, 1 for iOS, 1 for UWP). This results in 30 images in the repository, but in reality I need only 10.
So, these 2 issues make me believe there should be a better cross-platform solution, so I can share the same image across all 3 platforms (Android, iOS, UWP), have only one image used by all 3 platforms, and load an image into memory only once regardless of how many times I show it in the UI.
You can look into FFImageLoading, which supports caching. To have images in your shared code instead of your platform-specific code, you could look into embedded images:
Embedded images are also shipped with an application (like local images) but instead of having a copy of the image in each application's file structure the image file is embedded in the assembly as a resource. This method of distributing images is particularly suited to creating components, as the image is bundled with the code.
https://developer.xamarin.com/guides/xamarin-forms/user-interface/images/#Embedded_Images
Keep in mind though that adding images per platform isn't a bad thing. Each platform has its own image versions due to different pixel densities etc. so to make it look good on each platform you might want to consider the platform-specific route.
Another alternative is adding your images as file-linked images in each platform-specific project. The image file itself can be saved in a single location and file-linked into the correct directories per platform.
There is a library for Xamarin.Forms called Forms9Patch, which supports different resolutions and devices just like the platform-specific projects for Android/iOS do.
Forms9Patch has an MIT license; here's the GitHub.
Also this is a video that should give a good general overview.
I dragged a PDF into Image Assets (universal) and built in Xcode.
There are no generated @2x/@3x PNGs (only the PDF).
I followed this tutorial:
https://icons8.com/articles/how-to-use-vectors-in-xcode-7/
When I start the app on the device, the images are really bad quality...
It was working in Xcode 6 and 7...
Maybe the PDF file needs optimisation?!
In my research I found that "the size of the SVG is not important", but in this case it is (I tested it).
Not sure if this helps, but from what I was reading about this, Xcode doesn't fully support vector graphics. Instead of letting you load a vector image (.pdf) that scales however necessary, it takes whatever the default size of your .pdf is, assumes that is the 1x size, and then scales the PDF to automatically create the 2x and 3x images when it's built. So it's not actually rescaling the original vector at draw time; it just creates 1x, 2x, and 3x PNGs based on the original size of your PDF (for example, a 44 x 44 pt PDF becomes 44 px, 88 px, and 132 px PNGs). From what I'm reading, people think this is done to maintain backwards compatibility. I couldn't find anything that says it has changed in Xcode 8, so I'm assuming it still works the same way.
This question seems to answer it well: How do vector images work in Xcode (i.e. pdf files)?
Hope I was able to help.
Possibly helpful support links; I would suggest updating to the latest.
Xcode 13
Xcode 8.1
Xcode 8
Using Xcode 6.3.1, I needed to simulate screen sizes to make the app compatible with older devices, but when I went to the menu I found that I can't select different screen sizes.
What am I missing?
EDIT
Another project, this time using size classes:
This option makes the entire storyboard use a specific screen size so I can see what is happening at design time.
Maybe this is intended? To make people use size classes?
So, after some time (almost a year) I still haven't found a solution. The new preview is supposed to replace the old "Simulated Screen".
This is how you enable it:
And this is how you add any number of devices side-by-side:
The reason they added this is that you can now see many resolutions at the same time and get a better view of what you build and how it will look on the various devices available.
I don't agree with that; I still miss the "Simulated Screen". I think it was a better solution, but they removed it. We have to deal with it.
Before this turns into a race to see who gets the bounty I'm going to close this.
@mcatach: That's not the point. Xcode before 6.3.1 allowed you to see your storyboard in Interface Builder at the device size you wanted, so we could use size classes and still see how it should look at a particular resolution.
@aman.sood: This won't do. Changing the deployment target will not do anything to Interface Builder; it only removes size classes on older targets. Preview helps, but it could be better. Using an actual device is required to publish, but it's not a solution for seeing different resolutions; that would mean you need at least 6 devices (7 with the iPad mini).
A couple of things you can do:
Use Xcode by setting the active scheme to different devices like iPhone 4, iPhone 6s, etc. But these drop-down options will only be available as per your deployment target setting.
The second option is to use Preview in Xcode. That will show you different configurations without running your app. Check out this video for more details.
And last is to use an actual device, which one usually only does when there is a device-specific issue.
When you're using size classes you will not see specific resolution sizes. You need to design your screens based on the generic sizes (Compact/Regular/Any). You can use Any to fit any possible height/width.
So, my advice is to decide if you want to support iPhone and/or iPad, and then start to draw your screens.
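If it helps, here is a small sketch (a hypothetical view controller, not code from the question) of reacting to the Compact/Regular size classes in code instead of designing for specific device resolutions:

    import UIKit

    class AdaptiveViewController: UIViewController {
        override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
            super.traitCollectionDidChange(previousTraitCollection)

            // Lay out for width classes instead of specific screen sizes.
            switch traitCollection.horizontalSizeClass {
            case .compact:
                useVerticalLayout()   // e.g. iPhone in portrait
            case .regular:
                useHorizontalLayout() // e.g. iPad, large iPhones in landscape
            default:
                break
            }
        }

        // Hypothetical layout helpers; replace with your own constraint changes.
        private func useVerticalLayout() { }
        private func useHorizontalLayout() { }
    }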
I have a PDF that seems to have some internal color profile attached. If I render this in iPhone simulator the colors come out as they look in Photoshop which apparently can parse this color profile. If I render the same PDF on Mac I get the same colors (less bright, muddy) as in Preview and Pixelmator.
Is there some way how I can achieve the same (correct) rendering result on Mac as I was getting in iPhone simulator?
On iOS Simulator, I used CGColorSpaceCreateDeviceRGB with a kCGImageAlphaPremultipliedFirst bitmap context. I also set rendering intent kCGRenderingIntentPerceptual, though I don't know if this makes any difference.
On Mac I tried the same settings, as well as all the different kinds of color spaces, but I'm never able to achieve the same result as in the Simulator.
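For reference, this is roughly the kind of setup I mean (a simplified sketch, not my actual code; the function name, sizes, and the use of CGPDFDocument instead of PDFKit are just for illustration):

    import CoreGraphics
    import Foundation

    // Render the first page of a PDF into a device-RGB bitmap context.
    func renderFirstPage(of url: URL, size: CGSize) -> CGImage? {
        guard let document = CGPDFDocument(url as CFURL),
              let page = document.page(at: 1) else { return nil }

        let colorSpace = CGColorSpaceCreateDeviceRGB()
        guard let context = CGContext(data: nil,
                                      width: Int(size.width),
                                      height: Int(size.height),
                                      bitsPerComponent: 8,
                                      bytesPerRow: 0,
                                      space: colorSpace,
                                      bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue) else { return nil }

        // The rendering intent mentioned above; I don't know if it changes anything.
        context.setRenderingIntent(.perceptual)

        // White background so transparent pages don't come out black.
        context.setFillColor(gray: 1, alpha: 1)
        context.fill(CGRect(origin: .zero, size: size))

        // Scale the page's media box to the requested output size and draw it.
        let mediaBox = page.getBoxRect(.mediaBox)
        context.scaleBy(x: size.width / mediaBox.width, y: size.height / mediaBox.height)
        context.drawPDFPage(page)

        return context.makeImage()
    }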
I also tried the two ICC based approaches mentioned here: http://developer.apple.com/library/mac/#qa/qa1396/_index.html
I'm getting desperate. What is different between the iOS Simulator, which gets the colors right, and the Mac? I thought iOS doesn't do color correction but the Mac does. Right now I am drawing the individual pages with PDFKit. Is there a difference in PDFKit on Mac versus iOS related to color correction that makes it work properly in the iOS Simulator but fail on the Mac?
I also took a JPG that was rendered from this PDF in the iOS Simulator and put it into a new PDF in Preview, and there both Preview and my Mac rendering yielded exactly the same colors as the input.
It looks to me like the iOS Simulator has a magical ability to use a color profile embedded in this PDF which Preview and Quartz on the Mac do not.
Please help!
kind regards
Oliver Drobnik
I don't know exactly how simulator works, but I can assure you that both Preview and iOS have very incorrect handling of certain features of PDF files; especially when it comes to color management, transparency, overprint, advanced compression of images etc...
Two tips:
On Mac, open the PDF file in Adobe Reader (free download from Adobe.com). The color you see in Reader should be very close to the actual truth. If your PDF file contains ICC profiles (for objects in the file or in the output intent, meaning for the whole file), they will be used correctly. On iOS, also look at Adobe Reader - it currently is the best (highest quality) display tool on that platform.
Secondly, if you want to know what the simulator or some other tool can or cannot do, have a look at the test patches from the GWG (http://gwg.org/ghentoutputsuite.phtml). These patches were designed to give very easy to interpret results on whether certain tools or printers can handle specific PDF features.
These two steps should at least tell you what works and where it works. That should make it easier to figure out what you need to correct.
Unfortunately this is a confirmed bug in CGPDF on Mac. It manifests itself if you have CMYK as a transparency color space. iOS correctly ignores this; the Mac messes up the colors.
I'm developing a screenshot application that works in fullscreen mode. I have a bug report about issues with the MBP Retina, but I have no idea how to test and fix them. It looks like Quartz Debug can switch displays to HiDPI mode, but I'm not sure that will do the trick. I can't find any "Retina Emulation" related topics in the Apple docs.
So my question is: how can I test my app (not just icons, but the whole fullscreen application) for compatibility with a Retina display without buying one?
It's actually all in the Apple docs, though slightly hard to find: Testing High Resolution Content.
I'll sum it up for you: you should always test on a real device (or go to the Apple Store and put your application onto one of their demo Retina machines). But as an intermediate step, emulating Retina works too.
Quartz Debug's HiDPI mode works for this, and is a method Apple delineates as one to test with. You can also have Quartz tint images that get 2x-scaled (so you can spot artwork that is missing a high-resolution version) using the command (in Terminal)
defaults write -g CGContextHighlight2xScaledImages YES
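To turn the tinting off again, you can delete the setting with defaults delete -g CGContextHighlight2xScaledImages.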