iOS 15 / iPhone 12 setContentOffset animation issue

This occurs on iOS 15 / iPhone 12 series devices.
Multiple collection views are paged, each by its own timer, but paging does not work correctly only on iPhone 12 devices running iOS 15.
The paging stutters and the animation does not run normally.
I tried implementing the timer with Rx, but the symptoms are the same.
It works correctly on other devices and other iOS versions.
Has anyone experienced the same issue?
// Each CollectionView has the following timer code and runs its timer independently.
// The paging intervals are 1.0, 1.3, and 1.6 seconds, respectively.
func startTimer(interval: TimeInterval) {
    // Keep a reference so the timer can be invalidated later.
    timer1 = Timer.scheduledTimer(timeInterval: interval,
                                  target: self,
                                  selector: #selector(rolling1),
                                  userInfo: nil,
                                  repeats: true)
}
@objc func rolling1() {
    DispatchQueue.main.async {
        self.offset1 = CGPoint(x: self.offset1.x + UIScreen.main.bounds.width, y: self.offset1.y)
        self.collectionView?.setContentOffset(self.offset1, animated: true)
    }
}
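The per-tick offset math can be factored into a pure helper. This is a sketch, not the code from the question: the function name is mine, and the wrap-around at the last page is an assumption (the original offset simply grows each tick).

```swift
import Foundation

// Sketch of the per-tick offset math (not the original code): advance by
// one page width and, as an assumed extra, wrap back to the first page.
func nextPageOffset(current: CGPoint, pageWidth: CGFloat, pageCount: Int) -> CGPoint {
    let maxX = pageWidth * CGFloat(pageCount - 1)
    let nextX = current.x + pageWidth
    return CGPoint(x: nextX > maxX ? 0 : nextX, y: current.y)
}
```

Keeping the math pure like this also makes it easy to unit-test the paging logic without a collection view.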
A screen recording of the motion is as follows.
It works normally on all iOS 14 devices, and on iOS 15 devices other than the iPhone 12 series; only the iPhone 12 series on iOS 15 has the problem.

It was fixed in iOS 15.1. Thanks, Apple.


Updating rotary knob in real time with SwiftUI

I'm currently trying to convert a slider into a rotary knob and having a tough time of it. The knob works visually, but I'm struggling to set the correct value within the knob and, as a result, change the value within the app in real time.
I'm using AVAudio to set up an engine for people to record with that has effects like Reverb and Delay.
The reverb value is set as follows within the audio class:
@Published var reverbValue: Float = 0.0
and later referenced in a function to change its value:
func changeReverbValue() {
    setReverb.wetDryMix = reverbValue
}
When I use a regular slider as follows the change works:
Slider(value: $recordingsettings.reverbValue, in: Float(0.0)...recordingsettings.reverbMaxValue, onEditingChanged: { _ in
    self.recordingsettings.changeReverbValue()
}).accentColor(Color.white)
As mentioned, the knob works fine in its design:
ZStack {
    Knobs(color: .orange)
        .rotationEffect(
            .degrees(max(0, initialCircleState()))
        )
        .gesture(DragGesture(minimumDistance: 0)
            .onEnded({ _ in
                startDragValue = -1.0
            })
            .onChanged { dragValue in
                let touchDifferential = touchDifference(dragValue)
                setInitialDragVal()
                let computedTouch = computeTouch(touchDifferential)
                print(computedTouch)
                baseValue = getBaseVal(computedTouch)
                let normalizeVal = baseValue / touchAmt
                value = Float(normalizeVal * rngOffset(range: bounds) + bounds.lowerBound)
                print("value is: \(value)")
            }
        )
    GrayCircle(bounds: bounds)
    OrangeCircle(baseValue: $value, bounds: bounds)
}
.rotationEffect(bounds.lowerBound < 0 ? .degrees(90) : .degrees(107))
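The drag-to-value mapping inside that onChanged closure is a linear normalization. Here is a pure-function sketch of it; the names (`knobValue`, `dragRange`) are hypothetical and only mirror the question's `normalizeVal * rngOffset(range:) + bounds.lowerBound` expression:

```swift
import Foundation

// Sketch of the drag-to-value mapping; names are hypothetical, mirroring
// the question's `normalizeVal * rngOffset(range:) + bounds.lowerBound`.
func knobValue(drag: CGFloat, dragRange: CGFloat, bounds: ClosedRange<CGFloat>) -> Float {
    let clamped = min(max(drag, 0), dragRange)       // keep within the gesture range
    let normalized = clamped / dragRange             // 0…1
    let span = bounds.upperBound - bounds.lowerBound // rngOffset(range:) analogue
    return Float(normalized * span + bounds.lowerBound)
}
```

Clamping before normalizing avoids values outside the bounds when the drag overshoots.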
I've had some success connecting the knob to the reverb value, to the point where the slider also moves when the rotary knob does; however, the changeReverbValue function doesn't get called.
The success comes from setting the value within the knob view as follows:
@Binding var value: AUValue
And then referencing the knob on the same struct of the main view as the slider:
Knob(value: $recordingsettings.reverbValue, bounds: 0...CGFloat(recordingsettings.reverbMaxValue))
    .onTapGesture {
        self.recordingsettings.changeReverbValue()
    }
The tap gesture was an attempt to call changeReverbValue when the knob was turned, but to no avail.
The binding value passed to the knob also has other challenges. For some reason, when I play back audio without headphones and then turn the knob, the audio starts to stutter. This doesn't happen with headphones, which I find pretty weird.
Does anyone know how I could reference the reverb value within the rotary knob and have the changeReverbValue function called at the same time?
I just want to replace the slider with something that looks better; otherwise I'm going to have to leave this for a bit and just use sliders throughout the app.
If I don't declare the knob's value as @Binding in the rotary knob view, the track doesn't stutter on playback, but then I don't know if it's possible to change the reverb value without a @Binding var.
I struggled to parse a single precise problem statement from the narrative, so this may be off-base commentary rather than a solution. I walked away thinking your problem is: a custom UI component is jumpy/stuttery during interaction and produces similarly jumpy effects on app state.
If that's fair, I worked around the same issue in my first SwiftUI app. The cause could be one of two things:
Not using the right async queue by accident.
Forcing a @State or @Published property to update for all global state changes, which pushes stale state from earlier back into an interaction, possibly creating a circular feedback loop.
The solution is fairly simple: publish and consume model updates with both a value and a source tag, then throttle and filter out your own tag so local state responds to only one just-in-time data stream.
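A minimal, Combine-free sketch of that value-plus-source-tag pattern (all names here are hypothetical, not from any real app):

```swift
import Foundation

// Sketch: each component publishes updates tagged with its own ID
// and never receives echoes of its own writes.
struct SourcedValue<T> {
    let value: T
    let source: String
}

final class SharedModel<T> {
    private var subscribers: [(id: String, handler: (T) -> Void)] = []

    func subscribe(id: String, handler: @escaping (T) -> Void) {
        subscribers.append((id, handler))
    }

    // Deliver an update to every subscriber except its originator.
    func send(_ update: SourcedValue<T>) {
        for sub in subscribers where sub.id != update.source {
            sub.handler(update.value)
        }
    }
}

// Example: the knob writes a value; the slider hears it, the knob does not.
let model = SharedModel<Float>()
var sliderSeen: [Float] = []
var knobSeen: [Float] = []
model.subscribe(id: "slider") { sliderSeen.append($0) }
model.subscribe(id: "knob") { knobSeen.append($0) }
model.send(SourcedValue(value: 0.7, source: "knob"))
```

Filtering on the source tag is what breaks the feedback loop: the component that originated a change never has stale state pushed back into its own gesture.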
I used that pattern in that first app (free, Inclusivity for Mac) to coordinate an HSV color wheel and custom color-channel slider components. The wheel, sliders, and other interactions feed and read a shared Combine pipeline (a CurrentValueSubject<SourcedColorVector, Never> erased to AnyPublisher).
Some sample gestures, which simply punt the gating work to a view model:
The HSVWheel drag-around or click gesture
private func touchUpInWheel() -> ExclusiveGesture<_ChangedGesture<DragGesture>, _EndedGesture<DragGesture>> {
    ExclusiveGesture(
        DragGesture(minimumDistance: 10, coordinateSpace: .named(wheel))
            .onChanged { change in
                let adjusted = CGPoint(x: change.translation.width - targetDiameter + change.startLocation.x / 2,
                                       y: change.translation.height - targetDiameter + change.startLocation.y / 2)
                vm.setHueSat(drag: adjusted)
            },
        DragGesture(minimumDistance: 0, coordinateSpace: .named(wheel))
            .onEnded { end in
                let click = CGPoint(x: end.location.x - vm.radius,
                                    y: end.location.y - vm.radius)
                vm.setHueSat(click: click)
            }
    )
}
A typical slider gesture (this is the vertical value slider)
private func tapAndDrag() -> _EndedGesture<_ChangedGesture<DragGesture>> {
    DragGesture(minimumDistance: 0,
                coordinateSpace: .named(valuePickerSpace))
        .onChanged { value in
            let location = value.location.y - .valueSliderGestureOffset
            vm.setValueKnobLocation(raw: location)
        }
        .onEnded { end in
            let location = end.location.y - .valueSliderGestureOffset
            vm.setValueKnobLocation(raw: location)
        }
}

Swift: Keep NSTimer Running While Computer is Asleep (OSX)

I am writing an OS X app that includes a generic stopwatch timer. I'm using NSTimer. I'd like the user to be able to start the timer and come back to it after a long time (say, 30 minutes), and the timer would still be running. The problem is that my timer does not continue running while the computer is closed or asleep, and I don't want to keep my computer open and on for really long periods of time. There are several threads about this problem concerning iOS apps, but none (at least that I've found) pertaining to OS X. Does anyone know of a workaround for this issue? As an example, I'm trying to mimic the "Stopwatch" functionality of the "Clock" app that comes with iOS, except with a laptop instead of a phone. The stopwatch in the "clock" app will continue running even when the phone is off for extended periods of time.
The way I figured out to do this was not to actually run the NSTimer in the background, but rather to find out how much time had elapsed between when the app goes into the background and when it comes back into focus. Using the delegate methods applicationWillResignActive: and applicationWillBecomeActive: of NSApplicationDelegate:
var timer = NSTimer.scheduledTimerWithTimeInterval(1, target: self, selector: #selector(update), userInfo: nil, repeats: true)
var resignDate: NSDate?
var stopwatch = 0

func update() {
    stopwatch += 1
}

func applicationWillResignActive(notification: NSNotification) {
    timer.invalidate()
    resignDate = NSDate() // save the current time
}

func applicationWillBecomeActive(notification: NSNotification) {
    if resignDate != nil {
        let timeSinceResign = NSDate().timeIntervalSinceDate(resignDate!)
        stopwatch += Int(timeSinceResign)
        timer = NSTimer.scheduledTimerWithTimeInterval(1, target: self, selector: #selector(update), userInfo: nil, repeats: true)
        resignDate = nil
    }
}
applicationWillResignActive: is called every time the app goes out of focus. When this happens, I save the current date (NSDate()) in a variable called resignDate. Then, when the application is reactivated (after who knows how long; it doesn't matter), applicationWillBecomeActive: is called. There, I take the current time as another NSDate and find the amount of time between it and resignDate. After adding this interval to my time value, I recreate the NSTimer so it keeps going.
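The bookkeeping described above can be isolated in a small value type. This is a sketch in modern Swift (the `Stopwatch` name and injectable dates are mine), but the arithmetic is exactly the resign/activate logic from the answer:

```swift
import Foundation

// Sketch: accumulate elapsed wall-clock time across resign/activate
// cycles instead of running a timer while the app is inactive.
struct Stopwatch {
    private(set) var accumulated: TimeInterval = 0
    private var resignDate: Date?

    mutating func resign(at date: Date = Date()) {
        resignDate = date // remember when we went inactive
    }

    mutating func becomeActive(at date: Date = Date()) {
        if let resigned = resignDate {
            accumulated += date.timeIntervalSince(resigned)
            resignDate = nil
        }
    }
}
```

Passing the dates in as parameters (defaulting to `Date()`) makes the elapsed-time math trivially testable without actually putting the machine to sleep.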

OS X agent app with NSTimer in the background is not working after deep sleep

I have an OS X agent app (it only runs from an icon in the menu bar). My app creates an NSTimer with random intervals to play a sound.
func setNewTimer(timeInterval: NSTimeInterval) {
    self.timer = NSTimer.scheduledTimerWithTimeInterval(timeInterval, target: self, selector: "playSound", userInfo: nil, repeats: false)
    NSRunLoop.currentRunLoop().addTimer(self.timer!, forMode: NSRunLoopCommonModes)
    NSLog("Timer created for interval: \(timeInterval)")
}
The app works fine after I start it and keeps playing sounds at random times as expected while I work in other apps.
If the computer goes to sleep for a short period and comes back, the app keeps playing sounds at random times as expected.
However, if the computer sleeps for a long time (e.g. overnight), the app no longer plays sounds.
Is it possible that the timer is disabled when the computer goes into deep sleep? Or (preferably) is there a way to detect that the computer woke from sleep so I can reset my timer?
Note: every time I call this function I first call self.timer.invalidate() and recalculate timeInterval. During sleep hours (e.g. 23:00 to 08:00) the timer should not fire; instead I create an interval spanning 23:00 to 08:00 so that it fires the next morning.
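The quiet-window arithmetic in that note can be sketched as a pure function. This is an illustration, not the asker's code: the function name is mine and the 23:00–08:00 window is just the example from the note.

```swift
import Foundation

// Sketch: if a proposed fire date falls in the quiet window (23:00–08:00),
// push it to 08:00 the following morning. Window bounds are example values.
func adjustedFireDate(_ proposed: Date, calendar: Calendar = .current) -> Date {
    let hour = calendar.component(.hour, from: proposed)
    guard hour >= 23 || hour < 8 else { return proposed }
    var day = calendar.startOfDay(for: proposed)
    if hour >= 23 {
        // After 23:00: aim for 08:00 tomorrow.
        day = calendar.date(byAdding: .day, value: 1, to: day)!
    }
    return calendar.date(byAdding: .hour, value: 8, to: day)!
}

// Example with a fixed UTC Gregorian calendar:
var utc = Calendar(identifier: .gregorian)
utc.timeZone = TimeZone(identifier: "UTC")!
let lateNight = utc.date(from: DateComponents(year: 2023, month: 5, day: 1, hour: 23, minute: 30))!
let adjusted = adjustedFireDate(lateNight, calendar: utc)
```

Going through `Calendar` rather than adding raw seconds keeps the computation correct across DST transitions when a real local calendar is used.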
I came up with a solution myself after finding no reply for some time. It was pretty simple: I only needed to register for sleep and wake notifications (code updated to Swift 3):
// The app gets notified when the machine goes to sleep
func receiveSleepNotification(_ notification: Notification) {
    NSLog("Sleep notification received: \(notification.name)")
    // do invalidation work
}

// The app gets notified when the machine wakes from sleep
func receiveWakeNotification(_ notification: Notification) {
    NSLog("Wake notification received: \(notification.name)")
    // reset/restart tasks
}

func registerForNotifications() {
    // These notifications are posted on NSWorkspace's notification center,
    // not the default notification center. You will not receive sleep/wake
    // notifications if you register with the default notification center.
    NSWorkspace.shared().notificationCenter.addObserver(self, selector: #selector(AppDelegate.receiveSleepNotification(_:)), name: NSNotification.Name.NSWorkspaceWillSleep, object: nil)
    NSWorkspace.shared().notificationCenter.addObserver(self, selector: #selector(AppDelegate.receiveWakeNotification(_:)), name: NSNotification.Name.NSWorkspaceDidWake, object: nil)
}

func deregisterFromNotifications() {
    NSWorkspace.shared().notificationCenter.removeObserver(self)
}

XCTest sleep() function?

I am testing an app which samples data. Part of the test I am setting up requires a few data points to be stored. I would like to do this by having XCTest execute the acquisition method followed by a sleep() function and yet another call to the acquisition method.
Though there are methods to wait for an expectation with a timeout, there doesn't seem to be a simple wait()/sleep() method that just pauses execution for a specified amount of time. Any idea how I can do this using Xcode 6 and Swift?
You can use NSTimer to space out your data calls instead of locking up the app with sleep:
func dataCall(timer: NSTimer) {
    // get data
}

let myTimer: NSTimer = NSTimer.scheduledTimerWithTimeInterval(4, target: self, selector: Selector("dataCall:"), userInfo: nil, repeats: false)
and of course you can alter those parameters to your liking and needs.
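If blocking the test really is acceptable, another common pattern is to spin the run loop instead of calling sleep(), so timers scheduled in the code under test keep firing while the test waits. A minimal sketch in modern Swift (the `spinWait` name is mine):

```swift
import Foundation

// Sketch: wait without freezing timers by spinning the run loop instead of
// sleeping. Timers on this run loop keep firing, unlike with Thread.sleep.
func spinWait(_ interval: TimeInterval) {
    RunLoop.current.run(until: Date(timeIntervalSinceNow: interval))
}

// Example: a repeating timer still fires while we wait.
var ticks = 0
let timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { _ in
    ticks += 1
}
spinWait(0.5)
timer.invalidate()
```

This is the same trick XCTest's expectation machinery relies on internally: the waiting thread continues to service its run loop sources.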

Empty UIView snapshots from view controller loaded from storyboard in XCTest tests (intermittent)

We've been tracking an issue on our project where we would have intermittently failing snapshot test cases. The gist of our approach is to render a view controller's view and compare that image to a reference image to see if they're different. There are several layers to our approach here:
FBSnapshotTestCases
Quick
Custom Matchers
Our issue is that sometimes the image created by the rendered view is empty: just a large, correctly sized, fully transparent image.
I've tested each one in isolation and determined that none of those is the problem. Instead, I've been able to reproduce this in a standalone, plain Xcode project.
By using the same approach that FBSnapshotTestCases uses to render a view, I've created a simple test. To reproduce, create a new project of the "Master-Detail" template and give the detail view controller a Storyboard ID of "Detail". Then create this simple unit test.
func testExample1() {
    let storyboard = UIStoryboard(name: "Main", bundle: NSBundle.mainBundle())
    let sut = storyboard.instantiateViewControllerWithIdentifier("Detail") as UIViewController
    sut.beginAppearanceTransition(true, animated: false)
    sut.endAppearanceTransition()
    UIGraphicsBeginImageContextWithOptions(sut.view.frame.size, false, 0)
    sut.view.drawViewHierarchyInRect(sut.view.bounds, afterScreenUpdates: true)
    let image = UIGraphicsGetImageFromCurrentImageContext()!
    UIGraphicsEndImageContext()
    let data = UIImagePNGRepresentation(image)
    println("byte length: \(data.length)")
}
Nothing too fancy, and it will most likely pass. But, if you duplicate the code a few more times:
func testExample1() { ... }
func testExample2() { ... }
func testExample3() { ... }
The output is very strange (truncated):
Test Suite 'All tests' started at 2014-10-02 07:46:52 +0000
byte length: 27760
byte length: 17645
byte length: 27760
Test Suite 'All tests' passed at 2014-10-02 07:55:29 +0000.
Executed 3 tests, with 0 failures (0 unexpected) in 517.778 (517.781) seconds
The byte lengths should be identical, but they're not. The second test (and sometimes the third) will have an empty view, just like our problem.
A sample project demonstrating the problem is available here.
I was able to reproduce the issue using an Objective-C test project, so it's unlikely that it's a Swift problem. In past projects, we haven't used Storyboards for our view controller UI, so it's possible that there is an extra step necessary in order to "force" the view to load. It's also possible that this is an Xcode 6.x or iOS 8 issue (I've reproduced the problem with Xcode 6.0.1).
Has anyone experienced an issue like this, where rendered images of views from controllers loaded from Storyboards have been transparent?
This seems to do the trick...
let storyboard = UIStoryboard(name: "Main", bundle: NSBundle.mainBundle())
let sut = storyboard.instantiateViewControllerWithIdentifier("Detail") as UIViewController
sut.beginAppearanceTransition(true, animated: false)
sut.endAppearanceTransition()
UIGraphicsBeginImageContextWithOptions(sut.view.frame.size, false, 0)
let context = UIGraphicsGetCurrentContext()
sut.view.layer.renderInContext(context)
let image: UIImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
let data = UIImagePNGRepresentation(image)
println("byte length: \(data.length)")
Taking the storyboards out of the equation by switching to a generated UIView:
let view = UIView(frame: CGRectMake(0, 0, 300, 300))
view.backgroundColor = UIColor.blueColor()
UIGraphicsBeginImageContextWithOptions(view.frame.size, false, 0)
view.drawViewHierarchyInRect(view.bounds, afterScreenUpdates: true)
let image = UIGraphicsGetImageFromCurrentImageContext()!
UIGraphicsEndImageContext()
let data = UIImagePNGRepresentation(image)
println("byte length: \(data.length)")
Gives similar results.
Test Case '-[TestTestTests.TestTestTests testExample1]' started.
byte length: 9663
Test Case '-[TestTestTests.TestTestTests testExample1]' passed (1.000 seconds).
Test Case '-[TestTestTests.TestTestTests testExample2]' started.
byte length: 9663
Test Case '-[TestTestTests.TestTestTests testExample2]' passed (0.112 seconds).
Test Case '-[TestTestTests.TestTestTests testExample3]' started.
byte length: 6469
As this topic suggests, try using "[self.view.layer renderInContext:UIGraphicsGetCurrentContext()]" instead:
drawViewHierarchyInRect:afterScreenUpdates: delays other animations
