We have an issue when screensharing from the iOS client. During a screenshare, the invitees are unable to view other apps or screens when the publisher navigates to another application.
Following is the code that we are using:
fileprivate func startScreenSharing() {
    self.isSharingScreen = true
    multipartyScreenSharer = OTMultiPartyCommunicator.init(view: UIApplication.shared.keyWindow)
    multipartyScreenSharer?.dataSource = self

    // publishOnly here is to avoid subscribing to those who already subscribed
    multipartyScreenSharer?.isPublishOnly = true
    publisherView?.isHidden = true

    multipartyScreenSharer?.connect { [unowned self] (signal, remote, error) in
        self.isSharingScreen = true
        guard error == nil else {
            self.dismiss(animated: true) {
                SVProgressHUD.showError(withStatus: error!.localizedDescription)
            }
            return
        }
        if signal == .publisherCreated {
            self.multipartyScreenSharer?.isPublishAudio = true
        }
    }
}
With multipartyScreenSharer = OTMultiPartyCommunicator.init(view: UIApplication.shared.keyWindow) we can only share the application window. Can someone explain how we can share anything besides the application window?
Thanks.
It is not straightforward to share other apps' screens or the home screen. You will need to implement a screen-sharing (ReplayKit Broadcast Upload) extension and run the OpenTok streaming inside that extension.
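Roughly, the extension side is an RPBroadcastSampleHandler. The sketch below is only an outline; forwardToOpenTok(_:) is a placeholder for however you hand the captured frames to an OpenTok custom video capturer, and the session credentials would have to be shared with the extension (for example via an app group):
import ReplayKit
import CoreMedia

class SampleHandler: RPBroadcastSampleHandler {

    override func broadcastStarted(withSetupInfo setupInfo: [String : NSObject]?) {
        // Connect to the OpenTok session here; the extension runs in its own
        // process, so credentials must be shared with it (e.g. via an app group).
    }

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        guard sampleBufferType == .video else { return }
        // Placeholder: hand the captured frame to an OpenTok custom video capturer.
        forwardToOpenTok(sampleBuffer)
    }

    override func broadcastFinished() {
        // Disconnect from the OpenTok session.
    }

    private func forwardToOpenTok(_ sampleBuffer: CMSampleBuffer) {
        // Implementation depends on how the OpenTok publisher is integrated.
    }
}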
I am working on a macOS app that can process images. The idea is that the app hides out of the way most of the time, but if you drag an image from another app or the Finder, my app will show and you can drag the image on top of it. Basically, a drop zone on standby for when you need it.
Everything is working perfectly except I can't figure out how to only show the app for certain types of draggable items (URLs, fileURLs, FilePromises and Images). What I have now shows the app for any kind of drag, even selecting text on a page or clicking and dragging through the menu bar.
I've tried looking at the NSPasteboard for dragging, but that doesn't seem to be updated at drag time. I've seen some posts about using accessibility to see what's under the mouse, but that seems brittle and I don't yet understand how to do it.
Here is the code I'm using to detect global drag and drop:
dragMonitor = NSEvent.addGlobalMonitorForEvents(matching: .leftMouseDragged) { event in
    if !self.isDragging {
        self.isDragging = true
        if let dropzoneViewController = self.dropzoneViewController,
           dropzoneViewController.shouldShowForDrag(event: event) {
            self.show()
        }
    }
}

upMonitor = NSEvent.addGlobalMonitorForEvents(matching: .leftMouseUp) { event in
    if self.isDragging {
        self.hide()
        self.isDragging = false
    }
}
That monitor, in turn, calls the following, which applies the app's logic for deciding whether to handle a drag or not.
func shouldShowForDrag(event: NSEvent) -> Bool {
    return self.dropTarget.canHandleDrop(NSPasteboard(name: .drag))
}
For clarity's sake, here's how the app handles drags once they are over the app's window:
override func performDragOperation(_ draggingInfo: NSDraggingInfo) -> Bool {
    isReceivingDrag = false
    if let dropTarget = dropTarget, dropTarget.canHandleDrop(draggingInfo.draggingPasteboard) {
        dropTarget.handleDrop(draggingInfo.draggingPasteboard)
        return true
    } else {
        return false
    }
}
The only difference between those two checks is that the global check (shouldShowForDrag(event:)) uses NSPasteboard(name: .drag), which is not current at the time NSEvent.addGlobalMonitorForEvents(matching:) fires. The logic when the draggable enters my window uses the provided pasteboard (draggingInfo.draggingPasteboard), which, of course, is accurate to what's being dragged.
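For what it's worth, one thing I've sketched (but not confirmed as a fix) is deferring the pasteboard check slightly after the first leftMouseDragged event, in case the drag pasteboard just hasn't been populated yet at that instant; the 0.1-second delay here is an arbitrary guess:
dragMonitor = NSEvent.addGlobalMonitorForEvents(matching: .leftMouseDragged) { event in
    guard !self.isDragging else { return }
    self.isDragging = true
    // Defer the check briefly, in case NSPasteboard(name: .drag) has not been
    // refreshed for this drag session yet (assumed 0.1 s delay).
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.1) {
        guard self.isDragging else { return }   // the mouse may already be up
        if let dropzoneViewController = self.dropzoneViewController,
           dropzoneViewController.shouldShowForDrag(event: event) {
            self.show()
        }
    }
}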
Finally, here's the basic logic for determining what drags to accept:
func canHandleDrop(_ pasteBoard: NSPasteboard) -> Bool {
    let urlFilteringOptions = [NSPasteboard.ReadingOptionKey.urlReadingContentsConformToTypes: NSImage.imageTypes]
    if let urls = pasteBoard.readObjects(forClasses: [NSURL.self], options: urlFilteringOptions) as? [URL], urls.count > 0 {
        return true
    } else if let filePromises = pasteBoard.readObjects(forClasses: [NSFilePromiseReceiver.self], options: nil) as? [NSFilePromiseReceiver], filePromises.count > 0 {
        return true
    } else if let images = pasteBoard.readObjects(forClasses: [NSImage.self], options: [:]) as? [NSImage], images.count > 0 {
        return true
    }
    return false
}
The first two clauses are the most important. Detecting NSImages is not strictly required.
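As an aside, here's a lighter-weight (untested) variant of the same check using canReadObject(forClasses:options:), which only inspects the available types rather than deserializing the dragged objects:
func canHandleDrop(_ pasteBoard: NSPasteboard) -> Bool {
    let urlFilteringOptions = [NSPasteboard.ReadingOptionKey.urlReadingContentsConformToTypes: NSImage.imageTypes]
    // canReadObject(forClasses:options:) is cheaper than readObjects(forClasses:options:)
    // because it does not actually read the pasteboard contents.
    return pasteBoard.canReadObject(forClasses: [NSURL.self], options: urlFilteringOptions)
        || pasteBoard.canReadObject(forClasses: [NSFilePromiseReceiver.self], options: nil)
        || pasteBoard.canReadObject(forClasses: [NSImage.self], options: [:])
}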
I know it can be done, because other apps I use (which do similar, but different, things) work exactly the way I'm trying to achieve. But so far I'm banging my head against the wall.
Thanks
I'm using Xcode to do UI testing on a sandboxed macOS app that has the com.apple.security.files.user-selected.read-write entitlement (i.e., can access files and folders explicitly selected by the user via an NSOpenPanel GUI).
I have noticed that code coverage stops right after the open panel is presented modally. This is my code:
@IBAction func go(_ sender: Any) {
    let panel = NSOpenPanel()
    panel.canCreateDirectories = true
    panel.canChooseDirectories = true
    panel.canChooseFiles = false
    panel.allowsMultipleSelection = false
    let response = panel.runModal()
    switch response {
    case NSApplication.ModalResponse.OK:
        openPanelDidSelectURL(panel.urls[0])
    default:
        return
    }
}
(I have recorded my UI tests so that the NSOpenPanel is accepted right away, choosing the folder where it was opened.)
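For reference, the recorded step that accepts the panel ends up as something roughly like the snippet below; the "Go" and "Open" identifiers are placeholders, and the exact element queries depend on what Xcode actually records:
func testGoButtonImportsSelectedFolder() {
    let app = XCUIApplication()
    // Trigger the action that presents the NSOpenPanel.
    app.buttons["Go"].click()
    // Accept the panel with whatever folder it opened on.
    app.buttons["Open"].click()
}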
Code coverage ends up highlighting everything after the runModal() call as not executed.
I have tried replacing the switch statement with a fatalError() call, but the UI test still completes successfully, suggesting that anything immediately after:
let response = panel.runModal()
...is not executed during the test.
Disabling sandboxing seems to have no effect, so I suspect it is running the open panel modally that causes trouble...
I tried all other available methods for presenting the open panel, namely:
panel.begin { (response) in
    switch response {
    case NSApplication.ModalResponse.OK:
        self.openPanelDidSelectURL(panel.urls[0])
    default:
        return
    }
}
...and also:
panel.beginSheetModal(for: view.window!) { (response) in
    switch response {
    case NSApplication.ModalResponse.OK:
        self.openPanelDidSelectURL(panel.urls[0])
    default:
        return
    }
}
...but the result is always the same: All code immediately after presenting the panel is not covered during tests.
In the end, I realized that my UI tests cannot rely on some user-selectable folder being present wherever the open panel lands (last visited directory?), so I opted for using mocking instead.
First, in my UI test classes, I adopted this setup logic:
override func setUp() {
    continueAfterFailure = false

    let app = XCUIApplication()
    app.launchArguments.append("-Testing")
    app.launch()
}
(the hyphen before "Testing" is mandatory, otherwise my document-based macOS app will think I am launching it to open a document named "Testing", and fail to do so)
Next, on the app side, I defined a global computed property to determine whether we are running under a test or not:
public var isTesting: Bool {
    return ProcessInfo().arguments.contains("-Testing")
}
Finally, also on the app side I wrapped all NSOpenPanel calls into two methods: One for prompting the user for input files to read, and another to prompt the user for an output directory into which to write the resulting files (this is all my app needs from NSOpenPanel):
public func promptImportInput(completionHandler: @escaping ([URL]) -> Void) {
    guard isTesting == false else {
        /*
         Always returns the URLs of the bundled resource files:
         - 01#2x.png,
         - 02#2x.png,
         - 03#2x.png,
         ...
         - 09#2x.png
         */
        let urls = (1 ... 9).compactMap { (index) -> URL? in
            let fileName = String(format: "%02d", index) + "#2x"
            return Bundle.main.url(forResource: fileName, withExtension: "png")
        }
        return completionHandler(urls)
    }

    // (The code below cannot be covered during automated testing)
    let panel = NSOpenPanel()
    panel.canChooseFiles = true
    panel.canChooseDirectories = true
    panel.canCreateDirectories = false
    panel.allowsMultipleSelection = true
    let response = panel.runModal()
    switch response {
    case NSApplication.ModalResponse.OK:
        completionHandler(panel.urls)
    default:
        completionHandler([])
    }
}
public func promptExportDestination(completionHandler: @escaping (URL?) -> Void) {
    guard isTesting == false else {
        // Testing: write output to the temp directory
        // (works even in sandboxed apps):
        let tempPath = NSTemporaryDirectory()
        return completionHandler(URL(fileURLWithPath: tempPath))
    }

    // (The code below cannot be covered during automated testing)
    let panel = NSOpenPanel()
    panel.canChooseFiles = false
    panel.canChooseDirectories = true
    panel.canCreateDirectories = true
    panel.allowsMultipleSelection = false
    let response = panel.runModal()
    switch response {
    case NSApplication.ModalResponse.OK:
        completionHandler(panel.urls.first)
    default:
        completionHandler(nil)
    }
}
The portions of these two functions that use the actual NSOpenPanel instead of mocking the user-selected files/directories are still excluded from gathering code coverage statistics (but this time, it's by design).
But at least now it's just these two places. The rest of my code just calls these two functions and no longer interacts with NSOpenPanel directly. I have 'abstracted' the OS's file-browsing interface away from my app...
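To make the call pattern concrete, a typical call site would look roughly like the sketch below; the importImages() name and the processing steps are only illustrative, not code from my project:
func importImages() {
    promptImportInput { urls in
        guard urls.isEmpty == false else { return }
        // Under test these are the bundled PNGs; otherwise, the user's selection.
        promptExportDestination { destination in
            guard let destination = destination else { return }
            // Process the images and write the results into the destination directory...
        }
    }
}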
My app reports and records location, altitude, rotation, and accelerometer data (DeviceMotion) while in the background. This works fine on iOS 10.3.3. On iOS 11, I no longer have access to motion data while the device is locked. Altitude and location data are still streaming to the console, though.
Has something changed in iOS 11 that prevents me from accessing motion data, or am I trying to access it in a way that Apple now blocks, like OperationQueue.main?
Here is how I'm starting motion updates. If the phone is unlocked, all works fine. If I lock the phone, there are no more updates:
let motionManager = self.motionManager
if motionManager.isDeviceMotionAvailable {
    motionUpdateInterval = 0.15
    motionManager.deviceMotionUpdateInterval = motionUpdateInterval
    motionManager.startDeviceMotionUpdates(using: .xArbitraryZVertical, to: OperationQueue.main) { deviceMotion, error in
        guard let deviceMotion = deviceMotion else { return }
        // ... record the motion data ...
    }
}
I can't find anything about the motion background modes changing, but it seems there must be a way; otherwise RunKeeper and Strava would break. Can someone help me get this working again before the iOS 11 launch?
Thanks!
I also came across this problem.
Our solution was to ensure we have another background mode enabled and running (in our case, location updates + audio) and to restart Core Motion updates when switching between background and foreground.
Code sample:
import UIKit
import CoreMotion

// NSObject subclass so the selector-based notification observers below can be
// resolved by the Objective-C runtime.
final class MotionDetector: NSObject {

    private let motionManager = CMMotionManager()

    private let opQueue: OperationQueue = {
        let o = OperationQueue()
        o.name = "core-motion-updates"
        return o
    }()

    private var shouldRestartMotionUpdates = false

    override init() {
        super.init()
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(appDidEnterBackground),
                                               name: .UIApplicationDidEnterBackground,
                                               object: nil)
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(appDidBecomeActive),
                                               name: .UIApplicationDidBecomeActive,
                                               object: nil)
    }

    deinit {
        NotificationCenter.default.removeObserver(self,
                                                  name: .UIApplicationDidEnterBackground,
                                                  object: nil)
        NotificationCenter.default.removeObserver(self,
                                                  name: .UIApplicationDidBecomeActive,
                                                  object: nil)
    }

    func start() {
        self.shouldRestartMotionUpdates = true
        self.restartMotionUpdates()
    }

    func stop() {
        self.shouldRestartMotionUpdates = false
        self.motionManager.stopDeviceMotionUpdates()
    }

    @objc private func appDidEnterBackground() {
        self.restartMotionUpdates()
    }

    @objc private func appDidBecomeActive() {
        self.restartMotionUpdates()
    }

    private func restartMotionUpdates() {
        guard self.shouldRestartMotionUpdates else { return }

        self.motionManager.stopDeviceMotionUpdates()
        self.motionManager.startDeviceMotionUpdates(using: .xArbitraryZVertical,
                                                    to: self.opQueue) { deviceMotion, error in
            guard let deviceMotion = deviceMotion else { return }
            print(deviceMotion)
        }
    }
}
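For completeness, a minimal usage sketch; when you call start()/stop() is up to your app, and this assumes the extra background modes mentioned above are already configured and running:
let motionDetector = MotionDetector()

// When the recording session begins:
motionDetector.start()

// ...and when it ends:
motionDetector.stop()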
The official 11.1 release fixed the issue and I've heard from iPhone 8 users that the original implementation is working for them.
The 11.2 beta has not broken anything.
I am trying to present a local notification on an Apple Watch simulator with a button. This is the code:
@IBAction func buttonOne() {
    print("button one pressed")

    let content = UNMutableNotificationContent()
    content.title = NSString.localizedUserNotificationString(forKey: "Notified!", arguments: nil)
    content.body = NSString.localizedUserNotificationString(forKey: "This is a notification appearing!", arguments: nil)
    content.sound = UNNotificationSound.default()

    // Deliver the notification in five seconds.
    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 5,
                                                    repeats: false)

    // Schedule the notification ("center" is a UNUserNotificationCenter defined elsewhere).
    let request = UNNotificationRequest(identifier: "Notify", content: content, trigger: trigger)
    center.add(request) { (error: Error?) in
        if let theError = error {
            print(theError.localizedDescription)
        } else {
            print("successful notification")
        }
    }
}
The console successfully prints "successful notification", but the notification never appears on the watch simulator. I have no idea why this is happening.
1) Before scheduling a notification, request permission with the requestAuthorization(options:completionHandler:) method, for example:
let center = UNUserNotificationCenter.current()
center.requestAuthorization(options: [.alert, .sound]) { (granted, error) in
    // Enable or disable features based on authorization
}
2) Notifications will not appear on the watch if the application is running in the foreground. With that in mind, you can press Cmd+H to send the app to the background.
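Alternatively, if you do want the notification to be presented while the app is in the foreground, you can implement UNUserNotificationCenterDelegate. The following is only a sketch; you would also need to assign this object to the center's delegate property before the notification fires:
import UserNotifications

final class NotificationDelegate: NSObject, UNUserNotificationCenterDelegate {
    // Ask the system to show the banner and play the sound even while
    // the app is in the foreground.
    func userNotificationCenter(_ center: UNUserNotificationCenter,
                                willPresent notification: UNNotification,
                                withCompletionHandler completionHandler: @escaping (UNNotificationPresentationOptions) -> Void) {
        completionHandler([.alert, .sound])
    }
}
Keep a strong reference to the delegate object, since the notification center does not retain it.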
I'm working on my first app for OS X 10.8 using Swift. I want the state of the battery to dictate the text in the Today pull-down menu. That part of the code works, but I am very frustrated because my class never gets called. It is in a separate supporting file called 'DeviceMonitor.swift'. Thanks for the help!
Code:
import Foundation
import UIKit

class BatteryState {

    var device: UIDevice

    init() {
        self.device = UIDevice.currentDevice()
        println("Device Initialized")
    }

    func isPluggedIn(value: Bool) {
        let sharedDefaults = NSUserDefaults(suiteName: "group.WidgetExtension")
        let batteryState = self.device.batteryState

        if (batteryState == UIDeviceBatteryState.Charging || batteryState == UIDeviceBatteryState.Full) {
            let isPluggedIn = true
            println("Plugged In")
            sharedDefaults?.setObject("Plugged In", forKey: "stringKey")
        } else {
            let isPluggedIn = false
            println("Not Plugged In")
            sharedDefaults?.setObject("Not Plugged In", forKey: "stringKey")
        }

        sharedDefaults?.synchronize()
    }
}
Unless you have done it somewhere else in your code, it looks like you haven't registered to receive UIDeviceBatteryLevelDidChangeNotification (or, given that your code checks batteryState, UIDeviceBatteryStateDidChangeNotification) events. You should also ensure that the current UIDevice has batteryMonitoringEnabled set to true.
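For illustration, the wiring could look like the sketch below, written in the same Swift 1.x style as your code; the BatteryObserver class, the selector name, and the choice of the battery-state notification are assumptions on my part, not code from your project:
import UIKit

class BatteryObserver: NSObject {

    let state = BatteryState()

    func startObserving() {
        // Battery notifications are only delivered while monitoring is enabled.
        UIDevice.currentDevice().batteryMonitoringEnabled = true

        NSNotificationCenter.defaultCenter().addObserver(self,
            selector: "batteryStateDidChange:",
            name: UIDeviceBatteryStateDidChangeNotification,
            object: nil)
    }

    func batteryStateDidChange(notification: NSNotification) {
        // The Bool argument is currently unused by isPluggedIn().
        state.isPluggedIn(true)
    }
}
An instance of this observer has to be created and kept alive (for example by the widget's view controller), with startObserving() called, before the battery state changes.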