Anchor detection issues in AR application - Xcode

I have an augmented reality app with a simple Reality Composer project.
It works fine on an iPad running iOS 14.4, but I'm having problems on later versions (14.7 and 15). Anchor detection is much more sensitive, which restarts my scenes with each new detection of the image. In addition, the scenes are interrupted as soon as the anchor image is no longer visible to the camera.
I am using Xcode 13.1.
I use this simple code:
import UIKit
import RealityKit

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        guard let anchor2 = try? Enigme1.loadDebut() else { return }
        arView.scene.anchors.append(anchor2)
    }
}
Thank you very much for the help you could give me.

Reality Composer's and RealityKit's AnchorEntity(.image) behaves the same way as ARKit's anchor in ARImageTrackingConfiguration: if the tracked image is no longer visible in the view, there is no ARImageAnchor, and therefore no 3D model.
Also, when using AnchorEntity(.image), if your 3D model has more than 100,000 polygons, every time it reappears on screen it will cause a slight freeze.
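If you need the scene to keep playing once the reference image leaves the frame, one possible workaround (a sketch of an idea, not a confirmed fix) is to re-anchor the loaded scene to a fixed world transform after the image has been detected, so its visibility is no longer tied to image tracking. Only Enigme1.loadDebut() comes from the question; the session-delegate wiring and property names below are assumptions.

import UIKit
import ARKit
import RealityKit

class ViewController: UIViewController, ARSessionDelegate {
    @IBOutlet var arView: ARView!
    private var debutScene: (Entity & HasAnchoring)?

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.session.delegate = self
        guard let anchor = try? Enigme1.loadDebut() else { return }
        debutScene = anchor
        arView.scene.anchors.append(anchor)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        // Re-anchor to a fixed world transform the first time the image is found,
        // so losing the image (or re-detecting it) no longer interrupts the scene.
        guard let imageAnchor = anchors.compactMap({ $0 as? ARImageAnchor }).first,
              let scene = debutScene else { return }
        scene.anchoring = AnchoringComponent(.world(transform: imageAnchor.transform))
    }
}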

Related

SCNView overlay causes tearing on resize

I'm using SceneKit to display a 3D scene (so far, a single quad), and the overlaySKScene to display a 2D overlay (which so far is just an SKNode with no geometry, though I had previously used a single SKLabelNode). It's a pretty simple view inside a bunch of nested NSSplitViews. During normal use it works brilliantly. The problem comes when I try to resize the window or split view: I get areas of red leaking through my nice background, which disappear shortly afterwards.
I'm running this on a 2016 MBP with a Radeon Pro 460, and captured this frame using QuickTime's screen capture:
Disabling the overlay removes the red areas, which makes me think the overlay is the problem. Disabling the statistics bar or the scroller (a child view of the SCNView) does not have any impact. My most minimal SKScene subclass is defined as:
@implementation TestOverlay

- (instancetype)initWithSize:(CGSize)size
{
    if( self = [super initWithSize:size] )
    {
        // Set up the default state
        self.scaleMode = SKSceneScaleModeResizeFill;
        self.userInteractionEnabled = NO;
        self.backgroundColor = [NSColor blackColor];
    }
    return self;
}

@end
Has anybody run into similar issues before? Annoyingly, the Apple Fox2 sample doesn't have similar problems...
For true enlightenment, one needs to read the documentation carefully, then comment everything out and restore functionality one step at a time. And then read the documentation again.
In the discussion section of -[SCNSceneRendererDelegate renderer:willRenderScene:atTime:], the solution is obvious (emphasis mine):
You should only execute Metal or OpenGL drawing commands (and any setup required to perform them) in this method—the results of modifying SceneKit objects during this method are undefined.
Which is exactly what I was doing. I had misread this as modifying geometry, so I thought that assigning textures would be reasonable to do here (after all, "will" render means it hasn't started rendering yet, right?), and it would therefore pick up the most recently created texture. And before I decided that I needed an overlay, this actually worked perfectly well! As soon as the overlay was added, however, the tearing appeared.
The correct place to update material properties seems to be -[SCNSceneRendererDelegate renderer:updateAtTime:]. Use that to avoid silly bugs like this one, folks!
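To illustrate that separation, here is a minimal Swift sketch with hypothetical property names (pendingTexture, quadMaterial) that are not from the original post: mutate SceneKit objects in renderer(_:updateAtTime:), and keep renderer(_:willRenderScene:atTime:) for GPU drawing commands only.

import SceneKit

final class RendererDelegate: NSObject, SCNSceneRendererDelegate {
    var pendingTexture: Any?          // e.g. an NSImage produced elsewhere (hypothetical)
    weak var quadMaterial: SCNMaterial?

    // Safe place to mutate SceneKit objects: before SceneKit evaluates
    // animations and renders the frame.
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        if let texture = pendingTexture {
            quadMaterial?.diffuse.contents = texture
            pendingTexture = nil
        }
    }

    // Only issue Metal/OpenGL drawing commands here; touching SceneKit
    // objects in this callback is what caused the tearing described above.
    func renderer(_ renderer: SCNSceneRenderer, willRenderScene scene: SCNScene, atTime time: TimeInterval) {
        // Intentionally left empty.
    }
}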
Try to reset SMC (System Management Controller). It helped me for solving a similar problem but with Autodesk Maya 2018 on MBP 2017 (Radeon 560).
So, shut down and unplug your MBP.
On the built-in keyboard, press and hold the Shift-Option-Control keys on the left side and press the Power Button and hold all of these down for 10 seconds, then release the keys.
Connect the power adapter and then turn the Mac on normally.
Hope this helps.
P.S. In case it doesn't help, try checking/unchecking the Automatic graphics switching option in System Preferences > Energy Saver to see if there's a difference.

SpriteKit unarchiveFromFile does not load textures from asset catalogues and atlases

I'm still new to SpriteKit, and wanted to see if I could construct a level using Xcode's SKS editor. When I added a couple of sprites with the "Spaceship.png" texture and built the template app, the textures didn't load.
Here's a screenshot of the vanilla OSX Game template using Swift as the language, and adding a supplied "Spaceship.png" sprite. The texture shows fine:
And here's the result of building and running the app with only that modification to the template:
The debug console displays this warning message:
I tried adding an .atlas folder and got the same result: the scene just displays the red X icon in place of the sprite. If the added sprites are just color sprites, they display fine. In an older app of mine (since scrapped), I used to load the SKScene and manually add the contained sprite assets to my SKScene subclass, and it worked fine.
If, however, I move the textures - "Spaceship.png" for example - to the root of the project, i.e. not inside an asset catalogue or .atlas folder, the scene loads with the textures displaying fine.
Here's the texture added to the root of the project:
And this is the desired result:
I tried to manually add the loaded assets from the SKS file to the scene via enumerateChildNodesWithName(_:usingBlock:), and I get the same result if the textures are not in the root of the project folder.
This is me trying to add the assets manually:
func applicationDidFinishLaunching(aNotification: NSNotification) {
    /* Pick a size for the scene */
    if let scene = GameScene.unarchiveFromFile("GameScene") as? GameScene {
        let sceneToBePresented = GameScene()
        /* Set the scale mode to scale to fit the window */
        sceneToBePresented.scaleMode = .AspectFill
        sceneToBePresented.size = scene.size
        /* Sprite Kit applies additional optimizations to improve rendering performance */
        self.skView!.ignoresSiblingOrder = true
        self.skView!.showsFPS = true
        self.skView!.showsNodeCount = true
        scene.enumerateChildNodesWithName("*") { node, stop in
            sceneToBePresented.addChild(node.copy() as SKSpriteNode)
        }
        self.skView!.presentScene(sceneToBePresented)
    }
}
I looked around the SpriteKit classes - SKNode, SKScene, SKTexture, SKSpriteNode - for any clue about paths, caching, preloading, or anything else, but was unable to find anything that could make this work.
I am running Xcode 6.1.1 and my target is 10.9. The language I'm using is Swift, although the same behavior holds true using ObjC. Is this a feature or a bug? Has anybody else run into a similar situation?
Thanks in advance for any help
[UPDATE]
It looks like this - loading textures from asset catalogues - is not implemented yet, as this post in the dev forums discusses the same issue, and the suggested workaround is to rename the asset catalogue to the same name as the texture. In essence, this matches what I found out about having the image files in the root folder: How do you load sprite textures from Images.xcassets when using SKS scene file. My earlier app, which I managed to roll back to a working state, does load textures from .atlas folders, but I can't seem to do it with a clean template!
[Answering my own question here]
As far as I can tell, at this moment, loading textures from asset catalogues - Images.xcassets - while unarchiving or deserializing an SKScene file does not work on OSX, based on my attempts and the dev forums post referenced above.
Loading textures from image atlases, however, works by forcing SKTexture to load the image. Forcing or 'touching' the SKTexture can be done via the preloadWithCompletionHandler(_:) or preloadTextures(_:withCompletionHandler:) methods, or simply by printing the description of the sprite's SKTexture, as I discovered.
For the benefit of anybody who might need further assistance, here is a code snippet that preloads the textures after unarchiving an SKS file:
if let scene = GameScene.unarchiveFromFile("GameScene") as? GameScene {
    for child in scene.children as [SKNode] {
        if child is SKSpriteNode {
            let sprite = child as SKSpriteNode
            sprite.texture?.preloadWithCompletionHandler({ })
            /* or even a simple debug log of the texture */
            /* println(sprite.texture) */
        }
    }
    /* Do your other stuff ... */
    ...
}
If I'm wrong please correct me. Until somebody does, or Apple fixes the discrepancy between SpriteKit's behavior on iOS and OSX, I will not be looking for another solution, and will follow Murphy's law:
If it works, don't fix it
Following Salam Horani's solution: in Swift 2.x you don't have "unarchiveFromFile" yet, so you can create the scene like this:
if let gameScene = GameScene(fileNamed: "GameScene") {
    // ...
}

UICollectionView scroll lag

I have set up a collection view with 10 subviews in a cell.
The subviews are:
- image view with a label on it
- text view
- image view
- UILabel
- image view
- UILabel
Initially the collection view has 15 cells displayed at a time on an iPad. Now when I scroll, the scroll pauses when it is time to replace the cells at the bottom or top (reusing cells). I removed the shadow, but the issue is still the same.
So the problem happens when an old cell is reused, causing a lag.
Btw, no images are loaded via the network.
I found the answer to this a long time ago, but I'm posting it for the benefit of others who may run into the same issue.
Apart from removing shadows, you also need to remove "clear color" backgrounds. Any additional drawing that requires extra processing should be removed or replaced with an alternative. Any heavy lifting, text formatting, or date formatting should be done before the collection or table view is even shown. Make sure your cell only does the presenting, not the processing; if you can't avoid it, do the processing on another thread (see the sketch below).
To measure the scroll frame rate, use the Instruments > Graphics > Core Animation tool.
Try it and you will notice a difference.
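To make the "present, don't process" advice concrete, here is a rough sketch with hypothetical types (ItemViewModel, FeedDataSource) that are not from the question: all formatting happens once, up front and off the main queue, so cellForItemAt only assigns ready-made values.

import UIKit

// Hypothetical view model: everything is preformatted, so the cell does no work.
struct ItemViewModel {
    let title: String
    let dateText: String
    let image: UIImage?
}

final class FeedDataSource {
    private(set) var viewModels: [ItemViewModel] = []
    private let dateFormatter: DateFormatter = {
        let formatter = DateFormatter()
        formatter.dateStyle = .medium
        return formatter
    }()

    // Do the heavy lifting before the collection view is shown (or on a
    // background queue), never inside cellForItemAt.
    func prepare(items: [(title: String, date: Date, image: UIImage?)],
                 completion: @escaping () -> Void) {
        DispatchQueue.global(qos: .userInitiated).async {
            let models = items.map { item in
                ItemViewModel(title: item.title,
                              dateText: self.dateFormatter.string(from: item.date),
                              image: item.image)
            }
            DispatchQueue.main.async {
                self.viewModels = models
                completion()
            }
        }
    }
}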
EDIT: No need to experiment with autoresizing masks, just read this short article about UICollectionView performance boost http://noxytrux.github.io/blog/2014/09/25/ios8-weirdness-part3-laggy-uicollectionview/
It is probably autolayout overhead. Consider trying autoresizing masks instead.
You can just commit everything and run an experiment:
1. Turn off autolayout in your cell's xib file.
2. Run the app to test performance (don't worry about the messed-up layout).
3. If the effect is noticeable, set up autoresizing masks (and do layout in code if needed) instead of autolayout.
I fixed my UICollectionView performance problems this way. It helps most when you have a lot of visible cells at a time.
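A rough sketch of that experiment, with a hypothetical cell and a single subview standing in for the real ten: opt the subview out of Auto Layout, give it an autoresizing mask, and lay it out with plain frame math.

import UIKit

final class PerformanceCell: UICollectionViewCell {
    // Hypothetical subview; the real cell has around ten of these.
    private let titleLabel = UILabel()

    override init(frame: CGRect) {
        super.init(frame: frame)
        // Use autoresizing masks instead of constraints for this subview.
        titleLabel.translatesAutoresizingMaskIntoConstraints = true
        titleLabel.autoresizingMask = [.flexibleWidth]
        contentView.addSubview(titleLabel)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    // Lay out in code; frame math is cheap compared to constraint solving.
    override func layoutSubviews() {
        super.layoutSubviews()
        titleLabel.frame = CGRect(x: 8, y: 8,
                                  width: contentView.bounds.width - 16, height: 20)
    }
}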
Also, if you have image views, please see this answer: Setting image property of UIImageView causes major lag.
Don't forget about Instruments: run the Time Profiler and see what eats your main-thread time the most.
I guess the issue can also be caused by the image views.
Using Kingfisher 5 for prefetching, loading, showing and caching images will solve the issue.
let url = URL(fileURLWithPath: path)
let provider = LocalFileImageDataProvider(fileURL: url)
imageView.kf.setImage(with: provider)
https://github.com/onevcat/Kingfisher/wiki/Cheat-Sheet#image-from-local-file
fileprivate func downloadForPerformance() {
    // Download every image up front and keep it in the global imageArray
    for i in urlToFetchImageArray {
        let image: UIImage = downloadImageFunc(link: i.urlString())
        imageArray.append(image)
    }
}

fileprivate func downloadImageFunc(link: String) -> UIImage {
    // Synchronous download; call this before the view appears (or off the main thread)
    guard let url = URL(string: link),
          let data = try? Data(contentsOf: url),
          let image = UIImage(data: data) else {
        return UIImage(named: "default.png")!
    }
    return image
}

// In cellForItem, just assign the already-downloaded image:
cell?.mainImage.image = self.imageArray[indexPath.row]
Usually the problem is a slow download as each cell is dequeued.
Call the downloadForPerformance func before the view has appeared.
This will fill an array called imageArray, kept as a global variable.
Then use that array in the cellForItem func.
Basically, have an array of the already-downloaded images you will need, make it a [UIImage] array, then use theImageArray[indexPath.row].

Pinch to zoom gesture working perfectly on ipad2 but not on retina iPad?

The title of the question speaks for itself. For more context, the app was developed using the Apple PhotoScroller approach (a modification of Apple's PhotoScroller sample code to load the UIPageViewController inside a UIViewController subclass), with multiple image galleries. The problem is that pinch-to-zoom works perfectly on the iPad 2 but not on Retina iPads. My images are 2048x1536.
Can anybody tell me why the zoom is not working on Retina iPads?
I would start by checking that the contentScaleFactor is set to 1.
From PhotoScroller's TilingView.m file:
// to handle the interaction between CATiledLayer and high resolution screens, we need to
// always keep the tiling view's contentScaleFactor at 1.0. UIKit will try to set it back
// to 2.0 on retina displays, which is the right call in most cases, but since we're backed
// by a CATiledLayer it will actually cause us to load the wrong sized tiles.
//
- (void)setContentScaleFactor:(CGFloat)contentScaleFactor
{
    [super setContentScaleFactor:1.f];
}
See these related questions and answers here and here for more info on contentScaleFactor.
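If your tiling view is written in Swift, the same override might look roughly like this (a sketch assuming a CATiledLayer-backed UIView like PhotoScroller's TilingView; the class name here is illustrative):

import UIKit

final class TilingView: UIView {
    // Back the view with a CATiledLayer, as in Apple's PhotoScroller sample.
    override class var layerClass: AnyClass { CATiledLayer.self }

    // Keep the scale factor pinned to 1.0 so the tiled layer loads
    // correctly sized tiles on Retina displays.
    override var contentScaleFactor: CGFloat {
        get { super.contentScaleFactor }
        set { super.contentScaleFactor = 1.0 }
    }
}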

NSScrollView Zooming of subviews

Apologies for the noob question - coming from an iOS background I'm struggling a little with OSX.
The good news: I have an NSScrollView with a large NSView as its documentView. I have been adjusting the bounds of the contentView to effectively zoom in on the documentView, and it all works well with respect to anything I do in drawRect (of the documentView).
The not-so-good news: I have now added another NSView as a child of the large documentView and expected it to simply zoom, just like it would in iOS land, but it doesn't. If anyone can help fill in the rather large gap in my understanding of all this, I'd be extremely grateful.
Thanks.
[UPDATE] Fixed it myself - the 'problem' was that autolayout (layout constraints) was enabled. Once I disabled it and set the autosizing appropriately, everything was OK. I guess I should learn about layout constraints...
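For context, a rough sketch of the bounds-based zoom described in the question (the setZoom name and the scrollView parameter are illustrative, not from the post): shrinking the clip view's bounds relative to its frame scales the documentView and its subviews, as long as those subviews are not pinned by layout constraints.

import Cocoa

// Zoom by giving the clip view a bounds size smaller (or larger) than its frame,
// which scales everything drawn inside the documentView.
func setZoom(_ zoom: CGFloat, in scrollView: NSScrollView) {
    let clipView = scrollView.contentView
    var bounds = clipView.bounds
    bounds.size = NSSize(width: scrollView.frame.width / zoom,
                         height: scrollView.frame.height / zoom)
    clipView.bounds = bounds
}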
I know this is very old, but I just implemented mouse-scroll zooming using the following, after spending days trying to figure it out with various solutions posted by others, all of which had fundamental issues. As background, I am using CALayers in an NSView subclass, with a large PDF building layout in the background and 100+ draggable CALayer objects overlaid on top of that.
The zooming is instant and smooth, and everything scales perfectly with none of the pixellation I was expecting from something called 'magnification'. I wasted many days on this.
override func scrollWheel(with event: NSEvent) {
    // Only zoom while the Option key is held; otherwise scroll normally
    guard event.modifierFlags.contains(.option) else {
        super.scrollWheel(with: event)
        return
    }
    let dy = event.deltaY
    if dy != 0.0 {
        let magnification = self.scrollView.magnification + dy / 30
        let point = self.scrollView.contentView.convert(event.locationInWindow, from: nil)
        self.scrollView.setMagnification(magnification, centeredAt: point)
    }
}
LOL, I had exactly the same problem. I lost about two days messing around with autolayout. After I read your update, I went in and just added another NSBox to the view, and it gets drawn correctly and zooms as well.
Though, does it work for NSImageViews as subviews as well?
