I am trying out a Timeline mission. I created a timeline mission with a basic flow like this:
DJITakeOffAction()
DJIGoToAction()
DJIGimbalAttitudeAction()
DJIShootPhotoAction()
DJIHotpointAction()
DJIGoHomeAction()
With the DJI Mavic Pro, the mission executes successfully, but the same sequence does not execute at all on the Phantom 4. Moreover, I am not getting any error with the Phantom 4.
The Phantom 4 hovers in one place after executing DJIGoToAction() and never executes DJIGimbalAttitudeAction().
Here is my sample code:
func createTimeline() {
    let goToAction = DJIGoToAction(coordinate: coordinate, altitude: kTargetAltitude)
    // DJIShootPhotoAction works if there is an SD card in the drone;
    // the captured photo can be read from the SD card
    let shootPhotoAction = DJIShootPhotoAction()
    let continuousShootPhotoAction = DJIShootPhotoAction(photoCount: 24, timeInterval: 3.0, waitUntilFinish: false)
    let hotPointMission1 = self.defaultHotPointAction(location: coordinate, altitude: kTargetAltitude, radius: Float(20.0))!
    let gimbalAction1 = DJIGimbalAttitudeAction(attitude: DJIGimbalAttitude(pitch: -90.0, roll: 0.0, yaw: 0.0))
    gimbalAction1?.completionTime = 2
    let returnHomeAction = DJIGoHomeAction()
    timelineActions = [goToAction, gimbalAction1, shootPhotoAction, continuousShootPhotoAction, hotPointMission1, returnHomeAction] as! [DJIMissionControlTimelineElement]
}
func startTimeline() {
    self.addListenerForTimelineUpdates()
    self.startTakeoff(completion: { (error) in
        guard error == nil else { return }
        self.logText += "\n start timeline"
        // scheduleElements(_:) returns an error when scheduling fails
        if let scheduleError = DJISDKManager.missionControl()?.scheduleElements(self.timelineActions) {
            print("Failed to schedule timeline: \(scheduleError.localizedDescription)")
            return
        }
        DJISDKManager.missionControl()?.startTimeline()
    })
}
func addListenerForTimelineUpdates() {
    DJISDKManager.missionControl()?.addListener(self, toTimelineProgressWith: { (event, element, error, info) in
        // Log errors instead of silently returning, otherwise element failures stay invisible
        if let error = error {
            print("Timeline error: \(error.localizedDescription)")
            return
        }
        print("### \n\n Event: \(event.rawValue) \n Element: \(String(describing: element)) \n Info: \(String(describing: info)) \n\n ")
        // Once the whole timeline is finished (element == nil), stop it and remove listeners
        if event.rawValue == DJIMissionControlTimelineEvent.finished.rawValue && element == nil {
            self.logText += "DJIMissionControlTimelineEvent.finished, stop timeline and remove listeners"
            DJISDKManager.missionControl()?.removeAllListeners()
            DJISDKManager.missionControl()?.stopTimeline()
        }
    })
}
I have tried my best but am not able to fix this. I am working with DJI iOS SDK 4.10/4.11 and Xcode 10.2.
Any help would be much appreciated.
Thanks in advance.
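One way to narrow this down (a diagnostic sketch based on my own assumptions, not a confirmed fix; scheduleGimbalActionAlone is a name I made up): schedule the gimbal action as a one-element timeline on the Phantom 4 and inspect the error that scheduleElements(_:) returns instead of discarding it:

func scheduleGimbalActionAlone() {
    guard let missionControl = DJISDKManager.missionControl() else { return }
    // Clear anything left over from earlier runs
    missionControl.unscheduleEverything()

    let attitude = DJIGimbalAttitude(pitch: -90.0, roll: 0.0, yaw: 0.0)
    guard let gimbalAction = DJIGimbalAttitudeAction(attitude: attitude) else {
        print("Could not create the gimbal action")
        return
    }
    gimbalAction.completionTime = 2

    // scheduleElements(_:) returns a non-nil error when an element is
    // invalid for the connected product
    if let error = missionControl.scheduleElements([gimbalAction]) {
        print("Phantom 4 rejected the element: \(error.localizedDescription)")
        return
    }
    missionControl.startTimeline()
}

If the Phantom 4 rejects the element outright, the returned error should say so; if scheduling succeeds but the action still stalls, the timeline listener is the place to watch.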
I have a project where I have to write automation for an Apple Watch & iPhone app. I'm forced to use XCUITest with Swift because Appium does not yet support watchOS as a testing target.
In my tests, at some points, I have to communicate with an external device via BLE. I wanted to use the CoreBluetooth framework, but I've had no luck with that. I have working CoreBluetooth code that I've used to create a simple BLE chat app, but for some reason the same code does not work when called from the UI test.
The main problem is that I'm not able to get CBCentralManager into the powered-on state:
[CoreBluetooth] API MISUSE: <CBCentralManager: 0x28175ce00> can only accept this command while in the powered on state
Later I checked the BT authorization state inside the test with cbCentralManager.authorization == .notDetermined, and it confirmed that authorization has not been determined. That might be why CBCentralManager is not powering up, but I'm not sure how to resolve it. The app I'm testing can use BT without any issues (it has all permissions).
Is it even possible to use CoreBluetooth from UI tests?
This is the test class code:
import CoreBluetooth
import XCTest
let peripheralName = "MY-BLE-DEVICE-NAME"
let service_First = CBUUID(string: "0000ffe0-0000-1000-8000-00805f9b34fb")
let readCharacteristic = CBUUID(string: "0000ffe1-0000-1000-8000-00805f9b34fb")
let writeCharacteristic = CBUUID(string: "0000ffe1-0000-1000-8000-00805f9b34fb")
var ble_characteristic: CBCharacteristic?
var ble_perip: CBPeripheral?
class myBLERemoteUITests: XCTestCase {
    var cbCentralManager: CBCentralManager!

    func wait(timeout: TimeInterval) {
        let exp = expectation(description: "Wait for X seconds")
        let result = XCTWaiter.wait(for: [exp], timeout: timeout)
        print("LOG - Timer started for " + String(timeout) + " seconds...")
        if result == XCTWaiter.Result.timedOut {
            print("LOG - Timer stopped after " + String(timeout) + " seconds...")
        } else {
            XCTFail("Delay interrupted")
        }
    }

    func test_someBLEtest() throws {
        let app = XCUIApplication()
        app.launch()

        // MARK: Some UI automation steps
        app.buttons["Get Started"].tap()
        let tablesQuery2 = app.tables
        let shoeImage = tablesQuery2.cells["ItemA"].images["item"]
        shoeImage.tap()
        let shoe2 = tablesQuery2.cells["ItemB"].images["item"]
        shoe2.tap()
        wait(timeout: 10)
        app.buttons["Finish"].tap()

        // MARK: Part where I need to communicate with the BLE device
        // Initialize CBCentralManager
        cbCentralManager = CBCentralManager(delegate: self, queue: nil) // --> getting API MISUSE mentioned above
        wait(timeout: 10)
        cbCentralManager = CBCentralManager(delegate: self, queue: nil) // --> getting API MISUSE mentioned above
        wait(timeout: 10)
        cbCentralManager = CBCentralManager() // --> getting [CoreBluetooth] XPC connection invalid

        // MARK: Check the CBCentralManager authorization state
        if cbCentralManager.authorization == .allowedAlways {
            print("allowed")
        } else if cbCentralManager.authorization == .notDetermined {
            print("not determined") // --> I get that authorization is not determined
        }
    }
}
// MARK: Over here I have all needed CoreBluetooth callback functions:
extension myBLERemoteUITests: CBPeripheralDelegate {
    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        if let services = peripheral.services {
            // Discover the characteristics of each service
            for service in services {
                peripheral.discoverCharacteristics(nil, for: service)
            }
        }
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverCharacteristicsFor service: CBService, error: Error?) {
        if let characteristics = service.characteristics {
            for characteristic in characteristics {
                peripheral.setNotifyValue(true, for: characteristic)
                if characteristic.uuid == writeCharacteristic {
                    ble_characteristic = characteristic
                }
                peripheral.readValue(for: characteristic)
            }
        }
    }

    func peripheral(_ peripheral: CBPeripheral, didUpdateValueFor characteristic: CBCharacteristic, error: Error?) {
        if characteristic.uuid == readCharacteristic {
            print("Read Value : \(characteristic)")
        }
    }

    func peripheral(_ peripheral: CBPeripheral, didWriteValueFor characteristic: CBCharacteristic, error: Error?) {
        if characteristic.uuid == writeCharacteristic {
            print("WRITE VALUE : \(characteristic)")
        }
    }
}
extension myBLERemoteUITests: CBCentralManagerDelegate {
    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        if central.state == .poweredOn {
            central.scanForPeripherals(withServices: nil, options: nil)
            print("Scanning...")
        }
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral, advertisementData: [String: Any], rssi RSSI: NSNumber) {
        guard peripheral.name == peripheralName else { return }
        print("BLE Device Found!")
        // Stop scanning and connect
        cbCentralManager.stopScan()
        cbCentralManager.connect(peripheral, options: nil)
        ble_perip = peripheral
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        print("Connected : \(peripheral.name ?? "No Name")")
        // Set the delegate before discovering services so no callback is missed
        peripheral.delegate = self
        peripheral.discoverServices([service_First])
        // To discover all services instead:
        // peripheral.discoverServices(nil)
    }

    func centralManager(_ central: CBCentralManager, didDisconnectPeripheral peripheral: CBPeripheral, error: Error?) {
        print("Disconnected : \(peripheral.name ?? "No Name")")
        cbCentralManager.scanForPeripherals(withServices: nil, options: nil)
    }
}
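One thing that often bites here, and is worth ruling out (an assumption on my part, not a confirmed diagnosis): the UI test runner is its own app (a separate "…UITests-Runner" process), so it needs its own Bluetooth authorization, independent of the app under test. The runner's Info.plist has to carry NSBluetoothAlwaysUsageDescription, and the permission alert it triggers can be acknowledged with an interruption monitor. A minimal sketch under those assumptions:

import CoreBluetooth
import XCTest

class BluetoothPermissionUITests: XCTestCase {
    // Sketch, not a verified solution. Assumption: the UI test target's own
    // Info.plist (the ...UITests-Runner app, not the app under test) contains
    // NSBluetoothAlwaysUsageDescription.
    func test_bluetoothPowersOnInRunner() {
        // Acknowledge the runner's Bluetooth permission alert when it appears
        addUIInterruptionMonitor(withDescription: "Bluetooth permission") { alert in
            let allow = alert.buttons["OK"].exists ? alert.buttons["OK"] : alert.buttons["Allow"]
            guard allow.exists else { return false }
            allow.tap()
            return true
        }

        let central = CBCentralManager(delegate: nil, queue: nil)

        // Interruption monitors only fire when the test interacts with the UI,
        // so nudge the app under test after creating the manager
        let app = XCUIApplication()
        app.launch()
        app.tap()

        // Spin the run loop so the central can transition to .poweredOn
        _ = XCTWaiter.wait(for: [expectation(description: "settle")], timeout: 5)
        print("Central state: \(central.state.rawValue)")
    }
}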
I am trying to set the camera mode when the Mavic 2 Enterprise Dual is attached but I am getting an error. This product should support setDisplayMode since it has Visual, Infrared, and MSX capability:
camera!.setDisplayMode(DJICameraDisplayMode.MSX, withCompletion: nil)
Error: Current product does not support this feature.(code:-1013)
SDK Version: 4.10
Swift Version: 5.0
Discovered the solution:
The Mavic 2 Enterprise Dual has two camera types: Visual and Thermal. In order to change the camera display type you have to use the thermal camera.
Code to get the thermal camera:
var cameraMode: String = ""
viewDidLoad() {
checkForMavic2Enterprise()
}
func checkForMavic2Enterprise() {
guard let product = DJISDKManager.product() else {
print("No product")
switchCameraButton.isHidden = true
return
}
if product.model == "Mavic 2 Enterprise Dual" {
let camera = fetchCamera()
camera!.setDisplayMode(DJICameraDisplayMode.visualOnly, withCompletion: nil)
cameraMode = "visual"
} else {
switchCameraButton.isHidden = true
}
}
// Get the drone camera
func fetchCamera() -> DJICamera? {
    if let product = DJISDKManager.product() {
        if let aircraft = product as? DJIAircraft {
            if aircraft.cameras!.count > 1 {
                return aircraft.cameras![1] // thermal
            } else {
                return aircraft.camera
            }
        } else if let handheld = product as? DJIHandheld {
            print("HANDHELD CAMERA: \(handheld.camera.debugDescription)")
            return handheld.camera
        }
    }
    return nil
}
@IBAction func switchCamera(_ sender: Any) {
    guard let camera = fetchCamera() else { return }
    switch cameraMode {
    case "visual":
        print("SWITCHING TO MSX")
        camera.setDisplayMode(DJICameraDisplayMode.MSX, withCompletion: nil)
        cameraMode = "MSX"
    case "MSX":
        print("SWITCHING TO THERMAL")
        camera.setDisplayMode(DJICameraDisplayMode.thermalOnly, withCompletion: nil)
        cameraMode = "thermal"
    default:
        print("SWITCHING TO VISUAL")
        camera.setDisplayMode(DJICameraDisplayMode.visualOnly, withCompletion: nil)
        cameraMode = "visual"
    }
}
I'm running some regular edgePan code to initialise edgePan detection:
func setupGestures() {
    let edgePan = UIScreenEdgePanGestureRecognizer(target: self, action: #selector(screenEdgeSwiped))
    edgePan.edges = .left
    edgePan.cancelsTouchesInView = false
    self.view.addGestureRecognizer(edgePan)
}
and the selector:
@objc func screenEdgeSwiped(recognizer: UIScreenEdgePanGestureRecognizer) {
    if recognizer.state == .recognized {
        if slideOutMenuView != nil {
            slideOutMenuView.show()
        } else {
            print("THE SLIDE OUT MENU VIEW IS NIL")
        }
    }
}
This all works fine, but when I give it a test run on the iPhone X, it doesn't even seem to register the gesture.
Is there a different screen gesture that Apple introduced, or have they completely overridden the functionality?
The App is in landscape mode.
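One possibility worth ruling out (my assumption; nothing in the post confirms this is the cause): on the iPhone X the system reserves the screen edges for its own gestures, so the view controller may need to ask UIKit to defer them before the edge-pan recognizer gets the touch. A minimal sketch, where MenuViewController stands in for whatever controller owns the recognizer:

import UIKit

class MenuViewController: UIViewController {
    // Give our left-edge recognizer priority over the system's edge
    // gestures (available since iOS 11)
    override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
        return .left
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Tell UIKit to re-evaluate the preference above
        setNeedsUpdateOfScreenEdgesDeferringSystemGestures()
    }
}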
I am trying to present a local notification on an Apple Watch simulator with a button. This is the code:
@IBAction func buttonOne() {
    print("button one pressed")

    let content = UNMutableNotificationContent()
    content.title = NSString.localizedUserNotificationString(forKey: "Notified!", arguments: nil)
    content.body = NSString.localizedUserNotificationString(forKey: "This is a notification appearing!", arguments: nil)
    content.sound = UNNotificationSound.default()

    // Deliver the notification in five seconds.
    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 5, repeats: false)

    // Schedule the notification.
    let request = UNNotificationRequest(identifier: "Notify", content: content, trigger: trigger)
    center.add(request) { (error: Error?) in
        if let theError = error {
            print(theError.localizedDescription)
        } else {
            print("successful notification")
        }
    }
}
The console is successfully printing "successful notification," but the notification never appears on the watch simulator. I have no idea why this is happening.
1) Before scheduling a notification, request permission. Use the requestAuthorization method, for example:
let center = UNUserNotificationCenter.current()
center.requestAuthorization(options: [.alert, .sound]) { (granted, error) in
    // Enable or disable features based on authorization
}
2) Notifications will not appear if the application is running in the foreground. Given this, you can use Cmd + H to hide the app, or opt in to foreground presentation, as sketched below.
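A minimal sketch of the foreground-presentation alternative (the NotificationDelegate name is mine, not from the original code; the delegate must be assigned before notifications fire):

import UserNotifications

// Opt in to showing notifications while the app is in the foreground by
// implementing the UNUserNotificationCenterDelegate callback
class NotificationDelegate: NSObject, UNUserNotificationCenterDelegate {
    func userNotificationCenter(_ center: UNUserNotificationCenter,
                                willPresent notification: UNNotification,
                                withCompletionHandler completionHandler: @escaping (UNNotificationPresentationOptions) -> Void) {
        // Ask the system to show the alert and play the sound even in the foreground
        completionHandler([.alert, .sound])
    }
}

// Usage: assign once, e.g. at launch, and keep a strong reference:
// let notificationDelegate = NotificationDelegate()
// UNUserNotificationCenter.current().delegate = notificationDelegate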
I've been implementing a test of the new Vision framework that Apple introduced at WWDC 2017. I am specifically looking at barcode detection: after scanning an image from the camera/gallery, I can tell whether or not it is a barcode image. However, I can't see the actual barcode value or payload data when looking at the barcodeDescriptor. There appears to be nothing exposed on the https://developer.apple.com/documentation/coreimage/cibarcodedescriptor page to identify any of the properties.
I am getting these errors:
Cannot connect to remote service: Error Domain=NSCocoaErrorDomain Code=4097 "connection to service named
com.apple.BarcodeSupport.BarcodeNotificationService"
libMobileGestalt MobileGestalt.c:555: no access to InverseDeviceID (see <rdar://problem/11744455>)
connection to service named com.apple.BarcodeSupport.BarcodeNotificationService Error
Domain=NSCocoaErrorDomain Code=4097
Is there any way to access the barcode value from the VNBarcodeObservation?
Any help would be greatly appreciated. Thank you!
Here is the code I am using:
@IBAction func chooseImage(_ sender: Any) {
    imagePicker.allowsEditing = true
    imagePicker.sourceType = .photoLibrary
    present(imagePicker, animated: true, completion: nil)
}

@IBAction func takePicture(_ sender: Any) {
    if UIImagePickerController.isSourceTypeAvailable(UIImagePickerControllerSourceType.camera) {
        imagePicker.sourceType = UIImagePickerControllerSourceType.camera
        self.present(imagePicker, animated: true, completion: nil)
    } else {
        let alert = UIAlertController(title: "Warning", message: "Camera not available", preferredStyle: UIAlertControllerStyle.alert)
        alert.addAction(UIAlertAction(title: "Dismiss", style: UIAlertActionStyle.default, handler: nil))
        self.present(alert, animated: true, completion: nil)
    }
}
// PickerView delegate methods
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String: Any]) {
    imagePicker.dismiss(animated: true, completion: nil)
    classificationLabel.text = "Analyzing Image…"

    guard let pickedImage = info[UIImagePickerControllerOriginalImage] as? UIImage
        else { fatalError("no image from image picker") }
    guard let ciImage = CIImage(image: pickedImage)
        else { fatalError("can't create CIImage from UIImage") }

    imageView.image = pickedImage
    inputImage = ciImage

    // Run the barcode detector on a background queue.
    let handler = VNImageRequestHandler(ciImage: ciImage, options: [.properties: ""])
    DispatchQueue.global(qos: .userInteractive).async {
        do {
            try handler.perform([self.barcodeRequest])
        } catch {
            print(error)
        }
    }
}

func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
    picker.dismiss(animated: true, completion: nil)
    print("picker cancel.")
}
lazy var barcodeRequest: VNDetectBarcodesRequest = {
    return VNDetectBarcodesRequest(completionHandler: self.handleBarcodes)
}()

func handleBarcodes(request: VNRequest, error: Error?) {
    guard let observations = request.results as? [VNBarcodeObservation]
        else { fatalError("unexpected result type from VNDetectBarcodesRequest") }
    guard !observations.isEmpty else {
        DispatchQueue.main.async {
            self.classificationLabel.text = "No Barcode detected."
        }
        return
    }

    // Loop through the found barcode observations
    for barcode in observations {
        // Print the barcode values
        print("Symbology: \(barcode.symbology.rawValue)")
        if let desc = barcode.barcodeDescriptor as? CIQRCodeDescriptor {
            let content = String(data: desc.errorCorrectedPayload, encoding: .utf8)
            // FIXME: This currently returns nil. I did not find any docs on how to decode the data properly so far.
            print("Payload: \(String(describing: content))\n")
            print("Error-Correction-Level: \(desc.errorCorrectionLevel.rawValue)\n")
            print("Symbol-Version: \(desc.symbolVersion)\n")
        }
    }
}
Apparently, in iOS 11 beta 5, Apple introduced the new payloadStringValue property on VNBarcodeObservation. Now you can read info from a QR code with no problems:
if let payload = barcodeObservation.payloadStringValue {
    print("payload is \(payload)")
}
If Apple is not going to provide a library for this, something like the following will work:
extension CIQRCodeDescriptor {
    var bytes: Data? {
        return errorCorrectedPayload.withUnsafeBytes { (pointer: UnsafePointer<UInt8>) -> Data? in
            var cursor = pointer

            // The high nibble of the first byte is the QR mode indicator
            let representation = (cursor.pointee >> 4) & 0x0f
            guard representation == 4 /* byte encoding */ else { return nil }

            // The 8-bit character count straddles a nibble boundary
            var count = (cursor.pointee << 4) & 0xf0
            cursor = cursor.successor()
            count |= (cursor.pointee >> 4) & 0x0f

            var out = Data(count: Int(count))
            guard count > 0 else { return out }

            // Re-align the nibble-shifted payload into whole bytes
            var prev = (cursor.pointee << 4) & 0xf0
            for i in 2...errorCorrectedPayload.count {
                if (i - 2) == count { break }
                let cursor = pointer.advanced(by: Int(i))
                let byte = cursor.pointee
                let current = prev | ((byte >> 4) & 0x0f)
                out[i - 2] = current
                prev = (cursor.pointee << 4) & 0xf0
            }
            return out
        }
    }
}
And then
String(data: descriptor.bytes!, encoding: .utf8 /* or whatever */)
If you want to get the raw Data from the VNBarcodeObservation directly, without requiring it to conform to some string encoding, you can strip off the first 2½ bytes like this and get the actual data without the QR code header:
guard let barcode = barcodeObservation.barcodeDescriptor as? CIQRCodeDescriptor else { return }
let errorCorrectedPayload = barcode.errorCorrectedPayload
// Pair the payload with itself offset by one byte and merge the nibbles,
// which shifts everything left by half a byte and drops the 4-bit mode header
let payloadData = Data(zip(errorCorrectedPayload.advanced(by: 2),
                           errorCorrectedPayload.advanced(by: 3)).map { (byte1, byte2) in
    (byte1 << 4) | (byte2 >> 4)
})