IKImageBrowserView on Retina screen - macOS

Has anyone successfully used an IKImageBrowserView on a Retina Mac? What I see is that the image size is wildly misinterpreted. Previously I was using CGImage images, which don't have a logical size, so it makes sense that the browser can't draw them at the right size. However, I've switched to NSImage, created using -initWithCGImage:size:, and that still doesn't work right.
My images are 244x184 pixels and should be drawn at a logical size of 122x92. When passing 122x92 as the size, they are drawn way too large, at about 180 pixels wide. If I pass exactly half this, 61x46, the size is correct, but the image looks downscaled and not sharp. If I pass 122x92 and run with NSHighResolutionCapable set to NO in Info.plist, everything works well.
My conclusion is that IKImageBrowserView is not Retina compatible even with the 10.10 SDK on a Retina MacBook Pro running OS X 10.11. Or am I missing something? Any pointers would be appreciated!

I discovered that I wasn't really thinking the right way. The browser is supposed to always scale its images, so that's why the Retina-sized images ended up larger. I just subclassed the browser to be able to use a custom cell and customize the image frame per cell. There are however some subtle bugs in the browser that cause it to scale the images just a little bit in Retina mode, but I was able to work around that by creating a custom foreground layer for each cell that contains the image without scaling. Problem solved. Hopefully this will help someone else in the future.
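For anyone hitting the same confusion, the size mismatch in the question is just point/pixel arithmetic. A minimal sketch (plain Python, illustrative only, not AppKit code) of the relationship between a bitmap's pixel dimensions and its logical size on a 2x Retina screen:

```python
def logical_size(pixel_w, pixel_h, backing_scale):
    """Convert a bitmap's pixel dimensions to logical points.

    On a Retina display the backing scale factor is 2.0, so a
    244x184-pixel image corresponds to 122x92 points.
    """
    return (pixel_w / backing_scale, pixel_h / backing_scale)

# The image from the question: 244x184 pixels on a 2x Retina screen.
print(logical_size(244, 184, 2.0))  # -> (122.0, 92.0)
```

This is exactly the size the question passes to -initWithCGImage:size:; the surprise was only that the browser then applies its own scaling on top of that logical size.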

Related

Custom antialiasing settings in three.js

I am trying to find a way to specify some antialiasing settings in three.js/WebGL to try and improve the results.
The thing is that with the exact same code, if I load a model on a Retina Display, the antialiasing works quite fine (even if I move it to my non-retina external monitor afterwards), but it's all pixelated if I load it first on a non-retina screen.
Here is a screenshot (both on Chrome, both displayed on a retina display). Left was loaded on a non-retina, right on a retina: https://i.imgur.com/krNavZU.png
What I get from this is that three.js somehow uses the pixel density when initializing the antialiasing. Is there any way to tweak this so I can force it to something better?
Thanks a lot in advance for your help :)
Side note: For the record, it seems that the antialiasing works much better on Firefox as well; does anyone know why?
Just in case someone is looking to do the same kind of tweaking I was trying, I'll answer my own question.
Based on WaclawJasper's comment, I found some documentation in Three.js related to my issue. From http://threejs.org/docs/#Reference/Renderers/WebGLRenderer:
.setPixelRatio ( value )
Sets device pixel ratio. This is usually used for HiDPI device to prevent blurring output canvas.
I was using renderer.setPixelRatio( window.devicePixelRatio ); in my initialization, which was the reason why the rendering was depending on where the page was first loaded.
Related to the antialiasing, I now artificially raise the pixel density using renderer.setPixelRatio(2); on non-retina screens. This results in a much more effective antialiasing, at the cost of increased computation.
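To make the cost trade-off concrete: the renderer sizes its drawing buffer at the CSS canvas size times the pixel ratio, so forcing a ratio of 2 roughly quadruples the fragment-shading work. A rough illustration of that arithmetic (plain Python, not three.js code):

```python
def drawing_buffer(css_width, css_height, pixel_ratio):
    # The internal buffer is the CSS size multiplied by the pixel ratio.
    return (int(css_width * pixel_ratio), int(css_height * pixel_ratio))

def fragment_cost_factor(old_ratio, new_ratio):
    # Fragment work grows with the square of the pixel ratio.
    return (new_ratio / old_ratio) ** 2

print(drawing_buffer(800, 600, 2))  # -> (1600, 1200)
print(fragment_cost_factor(1, 2))   # -> 4.0
```

That is why the same scene loaded on a Retina screen (ratio 2) looks so much cleaner than one loaded at ratio 1, and also why setPixelRatio(2) on a non-Retina screen is not free.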

Image resize when running the app on my device (Android Studio)

I'm just starting with Android Studio and I have an old device running Android Froyo (API 8) with 240x320px display resolution.
I have an image with 240px width and when I use an ImageView with wrap_content to display that image on my device, it doesn't use the whole width of my device screen.
When I set the ImageView width to 240px, it occupies the whole width of my device (so I know my device really is 240px wide), but I can see it's blurred.
Apparently the image is being resized to a lower resolution before being compiled and loaded to run on my device.
If anyone can explain why this happens, I would really appreciate it, because I couldn't find out by searching here or on Google.
Thanks!
As I didn't get an answer to my question, thinking overnight, I believe I figured it out.
My display is 2.8", so I thought its density would be around 143dpi ( SQRT(240^2+320^2)/2.8 ).
With that density, objects declared in 'dp' would take too much space on the screen.
So, after comparing the 'dp' value with the actual number of pixels on my screen, I found out its density is actually 120dpi (LDPI), so a 160dp object actually doesn't have one inch on my screen, but 3/4 of an inch.
Maybe this is done by the device, not the compiler (I don't know), but everything I declare in 'px' is multiplied by 3/4 before loading, due to the density.
I hope I'm right and that this helps anyone with the same question.

What algorithm does a browser like Chrome or Firefox use to zoom images?

I have noticed that when I view an image in a browser, using either the zoom provided in the settings or style attributes on a web page, the pixelation is either negligible or unnoticeable. But when you use programs such as Paint, Photoshop, or Windows Picture Viewer, you start to notice pixelation. Does anyone know how the browser zooms its image contents?
Here is a sample image; the one on the right is from Paint, while the one on the left is from viewing in Chrome. The zoom is set at 500%.
For fonts, I believe it has to do with font sizing. Say you are in a word processor and type something up; when you increase the font size, the text gets bigger. A similar thing happens in a web browser when you zoom in.
On the other hand, an image has a fixed resolution, so as you zoom in the pixels become larger and more noticeable; this blockiness is called aliasing. Many times a program or browser will try to smooth the edges in the image to make the pixels look less blocky to the eye; this is called anti-aliasing.
As far as the actual algorithms behind Paint or a web browser go, I am unsure. It may take some more research to find out.
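To make the distinction concrete, here is a toy one-dimensional comparison of two classic resampling strategies: nearest-neighbour (blocky, what simple editors often show at high zoom) versus linear interpolation (the kind of smoothing browsers typically apply). This is plain Python for illustration, not the actual algorithm of any particular browser:

```python
def nearest_neighbor(pixels, factor):
    # Each source pixel is simply repeated -> visible blocks.
    return [pixels[int(i / factor)] for i in range(len(pixels) * factor)]

def linear_interp(pixels, factor):
    # Samples between source pixels are blended -> smooth ramps.
    out = []
    n = len(pixels)
    for i in range(n * factor):
        pos = i / factor
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        t = pos - lo
        out.append(pixels[lo] * (1 - t) + pixels[hi] * t)
    return out

row = [0, 100]  # a hard black-to-white edge
print(nearest_neighbor(row, 4))  # -> [0, 0, 0, 0, 100, 100, 100, 100]
print(linear_interp(row, 4))     # -> [0.0, 25.0, 50.0, 75.0, 100.0, 100.0, 100.0, 100.0]
```

Real browsers use 2-D versions of this idea (bilinear, bicubic, or GPU texture filtering), but the visible difference is the same: repeated pixels look blocky, blended pixels look smooth.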

Xcode - an issue with UIWebView

I'm totally new to Xcode, but I've run into a problem:
I'm designing an iPad app (Retina display) in HTML/CSS with the standard Retina resolution of 2048x1536px... The problem is that when I open the app on the iPad, the page turns out to be far too large. If I read out the UIWebView resolution, I get 1024x768... Can I change this to get the real iPad dimensions?
thx for your help!
Best regards,
daft
Dimension values on iOS are described in points. Each point can map to a different number of pixels, depending on the screen's pixel density. UIWebView interprets HTML document size values as points, so 1 HTML pixel means 1 point.
I suggest two options to cope with that:
1. Design your web app for a 1024x768 resolution and insert images scaled to 50% of their pixel size, so they carry the extra pixel density.
2. Leave the page at the 2048x1536 resolution and use the UIWebView API to scale the content.
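The point/pixel relationship behind both options is simple arithmetic; a sketch in plain Python (illustrative only, not iOS API code):

```python
def points_for_pixels(pixels, scale_factor):
    # iOS layout is in points; on a Retina iPad the scale factor is 2.0.
    return pixels / scale_factor

def viewport_scale(design_width_px, screen_width_points):
    # Option 2: scale a 2048px-wide page down to the 1024pt viewport.
    return screen_width_points / design_width_px

print(points_for_pixels(2048, 2.0))  # -> 1024.0, the size UIWebView reports
print(viewport_scale(2048, 1024))    # -> 0.5, the scale to apply to the content
```

Either way the Retina hardware still renders at 2048x1536 physical pixels; the two options just differ in which coordinate space your HTML is authored in.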

Game on widescreen display

I'm new to OpenGL development for macOS.
I'm making a game at 1024x768 resolution. In fullscreen mode on widescreen monitors my game looks stretched, which is not good.
Is there any function in OpenGL to get the pixels-per-inch value? If I find one, I can decide whether to add bars to the sides of the screen.
OpenGL is a graphics library, which means it is not meant to perform such tasks; it is quite low level and only renders to the screen. You could use the Cocoa class NSScreen to get the correct information about the screens connected to your Mac.
I'm making a game at 1024x768 resolution.
That's the wrong approach. Never hardcode resolutions. If you want to make a fullscreen game, use the fullscreen resolution. If you want to adjust the rendering resolution, switch the screen resolution and let the display do the proper scaling. By using the resolutions offered to you by the display and OS, you'll always get the proper aspect ratio.
Note that it may still be necessary to take the pixel aspect ratio into account. However, neither switching the display resolution nor determining the pixel aspect ratio is part of OpenGL; those are facilities provided by the OS.
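If you do decide to keep the 4:3 content and add bars, the bar sizes fall out of straightforward aspect-ratio arithmetic once you know the real screen size; a small sketch in plain Python:

```python
def fit_with_bars(screen_w, screen_h, content_w, content_h):
    """Scale content to fit the screen while preserving its aspect ratio.

    Returns (scaled_w, scaled_h, bar_x, bar_y), where bar_x/bar_y are
    the widths of the side/top bars on each side, in pixels.
    """
    scale = min(screen_w / content_w, screen_h / content_h)
    w, h = content_w * scale, content_h * scale
    return (w, h, (screen_w - w) / 2, (screen_h - h) / 2)

# 1024x768 (4:3) content on a 1920x1080 (16:9) widescreen display:
print(fit_with_bars(1920, 1080, 1024, 768))  # -> (1440.0, 1080.0, 240.0, 0.0)
```

Here the content fills the full height and gets 240-pixel pillarbox bars on each side, instead of being stretched to 16:9.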
