Hi, I want to fix a bug that occurs on iOS 16 with screen rotation.
I referred to this article, but it's still not working.
By the way, we're using Objective-C instead of Swift.
Reference:
UIWindowScene *windowScene = (UIWindowScene *)[[[UIApplication sharedApplication] connectedScenes] allObjects].firstObject;
UIWindowSceneGeometryPreferencesIOS *preference = [[UIWindowSceneGeometryPreferencesIOS alloc] init];
preference.interfaceOrientations = 1 << deviceOrientation; // note: bit shift (<<), not less-than
[windowScene requestGeometryUpdateWithPreferences:preference errorHandler:^(NSError * _Nonnull error) {
    NSLog(@"error--%@", error);
}];
Calling
[self setNeedsUpdateOfSupportedInterfaceOrientations];
(the Objective-C equivalent of Swift's self.setNeedsUpdateOfSupportedInterfaceOrientations()) before requesting the geometry update fixes it for me.
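Putting the two together, a minimal sketch of the whole fix inside a view controller might look like the following, assuming iOS 16+. Note that forceLandscape is a hypothetical method name of my own, and the landscape mask is just an example:
// Hedged sketch, assuming iOS 16+; forceLandscape is a made-up name.
- (void)forceLandscape {
    // Tell UIKit that the supported orientations may have changed.
    [self setNeedsUpdateOfSupportedInterfaceOrientations];
    UIWindowScene *windowScene = (UIWindowScene *)[UIApplication sharedApplication].connectedScenes.allObjects.firstObject;
    UIWindowSceneGeometryPreferencesIOS *preferences =
        [[UIWindowSceneGeometryPreferencesIOS alloc] initWithInterfaceOrientations:UIInterfaceOrientationMaskLandscapeRight];
    [windowScene requestGeometryUpdateWithPreferences:preferences errorHandler:^(NSError * _Nonnull error) {
        NSLog(@"rotation request failed: %@", error);
    }];
}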
Related
Does Gluon Mobile have any guidance on implementing a share button? My goal is to share a string containing a link with different apps on the phone. At the moment, I need this only for the iOS implementation. I was able to find this link that provides a simple way to do it in Objective-C:
- (IBAction)shareButton:(UIBarButtonItem *)sender
{
    NSString *textToShare = @"Look at this awesome website for aspiring iOS Developers!";
    NSURL *myWebsite = [NSURL URLWithString:@"http://www.codingexplorer.com/"];
    NSArray *objectsToShare = @[textToShare, myWebsite];
    UIActivityViewController *activityVC = [[UIActivityViewController alloc] initWithActivityItems:objectsToShare applicationActivities:nil];
    NSArray *excludeActivities = @[UIActivityTypeAirDrop,
                                   UIActivityTypePrint,
                                   UIActivityTypeAssignToContact,
                                   UIActivityTypeSaveToCameraRoll,
                                   UIActivityTypeAddToReadingList,
                                   UIActivityTypePostToFlickr,
                                   UIActivityTypePostToVimeo];
    activityVC.excludedActivityTypes = excludeActivities;
    [self presentViewController:activityVC animated:YES completion:nil];
}
Looking at the GoNative application example on the Gluon website, it seems like I can use the above code snippet where needed as the native iOS code. Do I have to update the iOS build Gradle file to account for the UIActivity class mentioned in the first link above?
Update:
I have been able to get this to work based on help in this question here.
However, when trying to compile the native library, I get this error, which is understandable since self is unknown in the scope of the code. How would I be able to do this? Should I instantiate a popover or dialog and present the activityVC from it?
/Users/ashishsharma/NetBeansProjects/konfamdbranch/src/ios/native/Share.m:25:6: error: use of undeclared identifier 'self'
    [self presentViewController:activityVC animated:YES completion:nil];
So I was able to solve this using examples found online (shown above) along with going through the existing code for the Barcode Scan service. The issue I was experiencing with the above code was that there was no view controller to present from. However, looking at the Bitbucket source for Barcode Scan, I was able to get the root view controller with the following code:
if (![[UIApplication sharedApplication] keyWindow])
{
    NSLog(@"key window was nil");
    return;
}
// get the root view controller
UIViewController *rootViewController = [[[UIApplication sharedApplication] keyWindow] rootViewController];
if (!rootViewController)
{
    NSLog(@"rootViewController was nil");
    return;
}
Then in the code snippet I placed in the question, replace self with rootViewController:
[rootViewController presentViewController:activityVC animated:YES completion:nil];
This leads to the modified code snippet:
#import <UIKit/UIKit.h>
#include "/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX.sdk/System/Library/Frameworks/JavaVM.framework/Versions/A/Headers/jni.h"

JNIEXPORT void JNICALL Java_com_gluonhq_charm_down_plugins_ios_IOSShareService_shareMessage
(JNIEnv *env, jclass jClass, jstring jMessage) {
    if (![[UIApplication sharedApplication] keyWindow])
    {
        NSLog(@"key window was nil");
        return;
    }
    // get the root view controller
    UIViewController *rootViewController = [[[UIApplication sharedApplication] keyWindow] rootViewController];
    if (!rootViewController)
    {
        NSLog(@"rootViewController was nil");
        return;
    }
    // Note: jMessage is ignored here; the shared text is hardcoded for now.
    NSString *textToShare = @"Check out this site!";
    NSURL *myWebsite = [NSURL URLWithString:@"http://www.google.com/"];
    NSArray *objectsToShare = @[textToShare, myWebsite];
    UIActivityViewController *activityVC = [[UIActivityViewController alloc] initWithActivityItems:objectsToShare applicationActivities:nil];
    NSArray *excludeActivities = @[UIActivityTypeAirDrop,
                                   UIActivityTypePrint,
                                   UIActivityTypeAssignToContact,
                                   UIActivityTypeSaveToCameraRoll,
                                   UIActivityTypeAddToReadingList,
                                   UIActivityTypePostToFlickr,
                                   UIActivityTypePostToVimeo];
    activityVC.excludedActivityTypes = excludeActivities;
    [rootViewController presentViewController:activityVC animated:YES completion:nil];
}
Note: I followed the GoNative application example to generate my Objective-C/iOS files correctly.
This gives only minimal share functionality, mainly because I don't have Facebook installed on the iPhone simulator.
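One refinement worth sketching: rather than hardcoding the text, the jMessage parameter could be converted to an NSString with standard JNI calls. This is my own suggestion, not part of the original answer:
// Hedged sketch: build textToShare from the incoming jstring instead of a literal.
const char *messageChars = (*env)->GetStringUTFChars(env, jMessage, NULL);
NSString *textToShare = [NSString stringWithUTF8String:messageChars];
(*env)->ReleaseStringUTFChars(env, jMessage, messageChars);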
Hi all you awesome coders! Over the last couple of weeks I've put this together from various helpful sources (including a lot of Stack Overflow posts), trying to build something that takes a webcam feed and detects smiles when they occur (and I might as well draw boxes around the faces and the smiles too; that doesn't seem hard once they're detected). Please give me some leeway if the code is messy, because I'm still very much learning.
Currently I'm stuck trying to pass each frame into a CIImage so it can be analysed for faces (I plan to deal with smiles after the face hurdle is overcome). The build succeeds if I comment out the block after (5): it brings up a simple AVCaptureVideoPreviewLayer in a window. I think this is what I've called "rootLayer", so it's the first layer of the displayed output, and after I detect faces in the video frames I'll show a rectangle following the "bounds" of any detected face in a new layer overlaid on top of this one, which I've called "previewLayer"... correct?
But with the block after (5) in place, the build fails with three linker errors:
Undefined symbols for architecture x86_64:
"_CMCopyDictionaryOfAttachments", referenced from:
-[AVRecorderDocument captureOutput:didOutputSampleBuffer:fromConnection:] in AVRecorderDocument.o
"_CMSampleBufferGetImageBuffer", referenced from:
-[AVRecorderDocument captureOutput:didOutputSampleBuffer:fromConnection:] in AVRecorderDocument.o
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
Can anyone tell me where I'm going wrong and what my next steps are?
Thanks for any help; I've been stuck at this point for a couple of days and can't figure it out, and all the examples I can find are for iOS and don't work on OS X.
- (id)init
{
    self = [super init];
    if (self) {
        // Create a capture session
        session = [[AVCaptureSession alloc] init];
        // Set a session preset (resolution)
        self.session.sessionPreset = AVCaptureSessionPreset640x480;
        // Add the video data output (moved after session creation so self.session exists when it's used)
        [self addVideoDataOutput];
        // Select devices if any exist
        AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        if (videoDevice) {
            [self setSelectedVideoDevice:videoDevice];
        } else {
            [self setSelectedVideoDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeMuxed]];
        }
        NSError *error = nil;
        // Add an input
        videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        [self.session addInput:self.videoDeviceInput];
        // Start the session (the app opens more slowly with this here, but I think it's needed to start delivering frames for processing)
        [[self session] startRunning];
        // Initial refresh of device list
        [self refreshDevices];
    }
    return self;
}
- (void)addVideoDataOutput {
    // (1) Instantiate a new video data output object
    // (the pixel format is configured once, in step (3) below)
    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    // discard if the data output queue is blocked (while CI processes the still image)
    captureOutput.alwaysDiscardsLateVideoFrames = YES;
    // (2) The sample buffer delegate requires a serial dispatch queue
    dispatch_queue_t captureOutputQueue = dispatch_queue_create("CaptureOutputQueue", DISPATCH_QUEUE_SERIAL);
    [captureOutput setSampleBufferDelegate:self queue:captureOutputQueue];
    // Under manual retain/release, this releases our reference to the queue;
    // the output has already retained it, so it belongs here.
    dispatch_release(captureOutputQueue);
    // (3) Define the pixel format for the video data output
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *settings = @{ key : value };
    [captureOutput setVideoSettings:settings];
    // (4) Configure the output port on the captureSession property
    if ([self.session canAddOutput:captureOutput])
        [session addOutput:captureOutput];
}
// Implement the Sample Buffer Delegate Method
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // We now have a video frame as a sample buffer; convert it into a CIImage before processing:
    // (5) Convert the CMSampleBufferRef to a CVImageBufferRef, then to a CIImage (per weichsel's answer in July '13)
    CVImageBufferRef cvFrameImage = CMSampleBufferGetImageBuffer(sampleBuffer); // This is where it fails: the linker won't resolve CMSampleBufferGetImageBuffer.
    CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    self.ciFrameImage = [[CIImage alloc] initWithCVImageBuffer:cvFrameImage options:(__bridge NSDictionary *)attachments];
    if (attachments) CFRelease(attachments); // CMCopyDictionaryOfAttachments follows the Copy rule, so release it
    // OK, so now it's a CIImage. Next: send it to a separate function to find the faces, then smiles,
    // then display the results on top of the AVCaptureVideoPreviewLayer.
    // TBW
}
- (NSString *)windowNibName
{
    return @"AVRecorderDocument";
}
- (void)windowControllerDidLoadNib:(NSWindowController *)aController
{
    [super windowControllerDidLoadNib:aController];
    // Attach the preview to the session
    CALayer *rootLayer = self.previewView.layer;
    [rootLayer setMasksToBounds:YES];
    self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
    [self.previewLayer setBackgroundColor:CGColorGetConstantColor(kCGColorBlack)];
    [self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
    [self.previewLayer setFrame:[rootLayer bounds]];
    //[self.previewLayer setAutoresizingMask:kCALayerWidthSizable | kCALayerHeightSizable]; // probably unnecessary on OS X
    [rootLayer addSublayer:previewLayer];
}
(moved from the comments section)
Wow. I guess two days and one Stack Overflow post is what it takes to figure out that I hadn't added CoreMedia.framework to my project.
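For the face-detection step sketched at (5), a minimal example using CIDetector might look like the following. This is my own hedged sketch, not from the original thread; findFacesInImage: is a made-up name, and the smile option assumes OS X 10.9 or later:
// Hedged sketch: detect faces (and smiles) in the CIImage built in the delegate method.
- (void)findFacesInImage:(CIImage *)frameImage {
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:@{ CIDetectorAccuracy : CIDetectorAccuracyLow }];
    NSArray *features = [detector featuresInImage:frameImage
                                          options:@{ CIDetectorSmile : @YES }];
    for (CIFaceFeature *face in features) {
        // face.bounds is in image coordinates; convert before drawing an overlay rect.
        NSLog(@"face at x=%g y=%g, smiling: %d", face.bounds.origin.x, face.bounds.origin.y, face.hasSmile);
    }
}
In practice the detector should be created once and reused across frames, since building it per frame is expensive.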
Until the iOS 7 update I was using...
UIImage *image = [moviePlayer thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
...with great success, so that my app could show a still of the video that the user had just taken.
I understand this method is deprecated as of iOS 7, and I need an alternative. I see there's this method:
- (void)requestThumbnailImagesAtTimes:(NSArray *)playbackTimes timeOption:(MPMovieTimeOption)option
though how do I return the image from it so I can place it within the videoReview button image?
Thanks in advance, Jim.
Edited question, after trying the notification centre method:
I used the following code:
[moviePlayer requestThumbnailImagesAtTimes:times timeOption:MPMovieTimeOptionNearestKeyFrame];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(MPMoviePlayerThumbnailImageRequestDidFinishNotification::) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:moviePlayer];
I made the NSArray times from two NSNumber objects, 1 and 2.
I then tried to capture the notification in the following method
-(void)MPMoviePlayerThumbnailImageRequestDidFinishNotification: (NSDictionary*)info{
UIImage *image = [info objectForKey:MPMoviePlayerThumbnailImageKey];
Then I proceeded to use this thumbnail image as the button image for the preview... but it didn't work.
If you can see from my code where I've gone wrong, your help would be appreciated again. Cheers.
Managed to find a great way using AVAssetImageGenerator; please see the code below...
AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:partOneUrl options:nil];
AVAssetImageGenerator *generate1 = [[AVAssetImageGenerator alloc] initWithAsset:asset1];
generate1.appliesPreferredTrackTransform = YES;
NSError *err = NULL;
CMTime time = CMTimeMake(1, 2);
CGImageRef oneRef = [generate1 copyCGImageAtTime:time actualTime:NULL error:&err];
UIImage *one = [[UIImage alloc] initWithCGImage:oneRef];
[_firstImage setImage:one];
_firstImage.contentMode = UIViewContentModeScaleAspectFit;
Within the header file, please import:
#import <AVFoundation/AVFoundation.h>
It works perfectly, and I've been able to call it from viewDidLoad, which is quicker than calling the deprecated thumbnailImageAtTime: from viewDidAppear.
Hope this helps anyone else who had the same problem.
Update for Swift 5.1:
Useful function...
func createThumbnailOfVideoUrl(url: URL) -> UIImage? {
    let asset = AVAsset(url: url)
    let assetImgGenerate = AVAssetImageGenerator(asset: asset)
    assetImgGenerate.appliesPreferredTrackTransform = true
    let time = CMTimeMakeWithSeconds(1.0, preferredTimescale: 600)
    do {
        let img = try assetImgGenerate.copyCGImage(at: time, actualTime: nil)
        let thumbnail = UIImage(cgImage: img)
        return thumbnail
    } catch {
        print(error.localizedDescription)
        return nil
    }
}
The requestThumbnailImagesAtTimes:timeOption: method will post a MPMoviePlayerThumbnailImageRequestDidFinishNotification notification when an image request completes. Your code that needs the thumbnail image should subscribe to this notification using NSNotificationCenter, and use the image when it receives the notification.
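A compact hedged sketch of that flow (requestThumbnail, thumbnailRequestDidFinish: and videoReviewButton are my own names, not from the question):
// Hedged sketch: subscribe, request, then read the image from the notification's userInfo.
- (void)requestThumbnail {
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(thumbnailRequestDidFinish:)
                                                 name:MPMoviePlayerThumbnailImageRequestDidFinishNotification
                                               object:self.moviePlayer];
    [self.moviePlayer requestThumbnailImagesAtTimes:@[@1.f]
                                         timeOption:MPMovieTimeOptionNearestKeyFrame];
}
- (void)thumbnailRequestDidFinish:(NSNotification *)note {
    UIImage *thumbnail = note.userInfo[MPMoviePlayerThumbnailImageKey];
    // videoReviewButton is a hypothetical outlet; use the thumbnail however you need.
    [self.videoReviewButton setImage:thumbnail forState:UIControlStateNormal];
}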
The problem is that you have to specify float values in requestThumbnailImagesAtTimes.
For example, this will work:
[self.moviePlayer requestThumbnailImagesAtTimes:@[@14.f] timeOption:MPMovieTimeOptionNearestKeyFrame];
but this won't work:
[self.moviePlayer requestThumbnailImagesAtTimes:@[@14] timeOption:MPMovieTimeOptionNearestKeyFrame];
The way to do it, at least in iOS 7, is to use floats for your times (note the times go in an NSArray):
NSNumber *timeStamp = @1.f;
[moviePlayer requestThumbnailImagesAtTimes:@[timeStamp] timeOption:MPMovieTimeOptionNearestKeyFrame];
Hope this helps
Jeely provides a good workaround, but it requires an additional library that isn't necessary when MPMoviePlayer already provides functions for this task. I noticed a syntax error in the original poster's code: the thumbnail notification handler expects an object of type NSNotification, not a dictionary. Here's a corrected example:
- (void)MPMoviePlayerThumbnailImageRequestDidFinishNotification:(NSNotification *)note
{
    NSDictionary *userInfo = [note userInfo];
    UIImage *image = (UIImage *)[userInfo objectForKey:MPMoviePlayerThumbnailImageKey];
    if (image != nil)
        [thumbView setImage:image];
}
I've just looked for a solution to this problem myself and got good help from your question.
I got your code above to work with one small change: I removed a colon...
Change
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(MPMoviePlayerThumbnailImageRequestDidFinishNotification::) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:moviePlayer];
to
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(MPMoviePlayerThumbnailImageRequestDidFinishNotification:) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:moviePlayer];
Got this to work nicely. Also, I've found that you can't call a method that relies on the notification centre if you're already inside a notification selector. Something I tried at first: calling requestThumbnailImagesAtTimes: inside the notification selector for MPMoviePlayerPlaybackDidFinishNotification, which won't work, I think because the notification won't fire.
The code in Swift 2.1 would look like this:
do {
    let asset1 = AVURLAsset(URL: url)
    let generate1: AVAssetImageGenerator = AVAssetImageGenerator(asset: asset1)
    generate1.appliesPreferredTrackTransform = true
    let time: CMTime = CMTimeMake(3, 1) // grab the third second of the video
    let oneRef: CGImageRef = try generate1.copyCGImageAtTime(time, actualTime: nil)
    let resultImage = UIImage(CGImage: oneRef)
}
catch let error as NSError {
    print(error)
}
I'm trying to get a screenshot of an MKMapView, and I'm using the following code:
UIGraphicsBeginImageContext(myMapView.frame.size);
[myMapView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *screenShot=UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return screenShot;
And I'm getting an almost blank image, with just the current-location icon and a Google logo in it.
What could be causing that?
I should tell you that myMapView is actually on another viewController's view, but since I'm getting the blue dot showing the location and the Google logo, I assume the reference I have is the correct one.
Thank you.
iOS 7 introduced a new way to generate screenshots of an MKMapView. It is now possible to use the new MKMapSnapshotter API as follows:
MKMapView *mapView = [..your mapview..]
MKMapSnapshotOptions *options = [[MKMapSnapshotOptions alloc] init];
options.region = mapView.region;
options.mapType = MKMapTypeStandard;
options.showsBuildings = NO;
options.showsPointsOfInterest = NO;
options.size = CGSizeMake(1000, 500);
MKMapSnapshotter *snapshotter = [[MKMapSnapshotter alloc] initWithOptions:options];
[snapshotter startWithQueue:dispatch_get_main_queue() completionHandler:^(MKMapSnapshot *snapshot, NSError *error) {
    if (error) {
        NSLog(@"An error occurred: %@", error);
    } else {
        [UIImagePNGRepresentation(snapshot.image) writeToFile:@"/Users/<yourAccountName>/map.png" atomically:YES];
    }
}];
Currently, overlays and annotations are not rendered into the snapshot, so you have to draw them onto the resulting image yourself afterwards. The provided MKMapSnapshot object has a handy helper method to map coordinates to points in the image:
CGPoint point = [snapshot pointForCoordinate:locationCoordinate2D];
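To illustrate that post-processing step, here is a hedged sketch of my own that composites a marker onto the snapshot; pinImage and coordinate are assumptions to be replaced with your own annotation data:
// Hedged sketch: draw a marker onto the snapshot image at a map coordinate.
UIGraphicsBeginImageContextWithOptions(snapshot.image.size, YES, snapshot.image.scale);
[snapshot.image drawAtPoint:CGPointZero];
CGPoint markerPoint = [snapshot pointForCoordinate:coordinate];
[pinImage drawAtPoint:CGPointMake(markerPoint.x - pinImage.size.width / 2.0,
                                  markerPoint.y - pinImage.size.height)];
UIImage *composited = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();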
As mentioned here, you can try this
- (UIImage *)renderToImage
{
    UIGraphicsBeginImageContextWithOptions(self.frame.size, NO, 0.0);
    [self.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
I'd like to adjust the NSApplicationIcon image that gets shown automatically in all alerts to be something different than what is in the app bundle.
I know that it's possible to set the dock icon with [NSApplication setApplicationIconImage:] -- but this only affects the dock, and nothing else.
I'm able to work around this issue some of the time: when I have an NSAlert *, I can call setIcon: to display my alternate image.
Unfortunately, I have a lot of nibs with NSImageViews set to NSApplicationIcon that I would like to affect, and it would be a hassle to create outlets and add code to change each icon. And for any alerts I bring up with the BeginAlert... style calls (which don't give me an NSAlert object to muck with), I'm completely out of luck.
Can anybody think of a reasonable way to globally (for the life of a running application) override the NSApplicationIcon that is used by AppKit, with my own image, so that I can get 100% of the alerts replaced (and make my code simpler)?
Swizzle the [NSImage imageNamed:] method? This works at least on Snow Leopard; YMMV.
In an NSImage category:
@implementation NSImage (Magic)

+ (void)load {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    // have to call imageNamed: once prior to swizzling to avoid an infinite loop
    [[NSApplication sharedApplication] applicationIconImage];
    // swizzle!
    NSError *error = nil;
    if (![NSImage jr_swizzleClassMethod:@selector(imageNamed:) withClassMethod:@selector(_sensible_imageNamed:) error:&error])
        NSLog(@"couldn't swizzle imageNamed:, application icons will not update: %@", error);
    [pool release];
}

+ (id)_sensible_imageNamed:(NSString *)name {
    if ([name isEqualToString:@"NSApplicationIcon"])
        return [[NSApplication sharedApplication] applicationIconImage];
    return [self _sensible_imageNamed:name];
}

@end
With this hacked-up (untested, just written) jr_swizzleClassMethod:... implementation:
+ (BOOL)jr_swizzleClassMethod:(SEL)origSel_ withClassMethod:(SEL)altSel_ error:(NSError **)error_ {
#if OBJC_API_VERSION >= 2
    Method origMethod = class_getClassMethod(self, origSel_);
    if (!origMethod) {
        SetNSError(error_, @"original method %@ not found for class %@", NSStringFromSelector(origSel_), [self className]);
        return NO;
    }
    Method altMethod = class_getClassMethod(self, altSel_);
    if (!altMethod) {
        SetNSError(error_, @"alternate method %@ not found for class %@", NSStringFromSelector(altSel_), [self className]);
        return NO;
    }
    id metaClass = objc_getMetaClass(class_getName(self));
    class_addMethod(metaClass,
                    origSel_,
                    class_getMethodImplementation(metaClass, origSel_),
                    method_getTypeEncoding(origMethod));
    class_addMethod(metaClass,
                    altSel_,
                    class_getMethodImplementation(metaClass, altSel_),
                    method_getTypeEncoding(altMethod));
    method_exchangeImplementations(class_getClassMethod(self, origSel_), class_getClassMethod(self, altSel_));
    return YES;
#else
    assert(0);
    return NO;
#endif
}
Then, this method to illustrate the point:
- (void)doMagic:(id)sender {
    static int i = 0;
    i = (i + 1) % 2;
    if (i)
        [[NSApplication sharedApplication] setApplicationIconImage:[NSImage imageNamed:NSImageNameBonjour]];
    else
        [[NSApplication sharedApplication] setApplicationIconImage:[NSImage imageNamed:NSImageNameDotMac]];

    // any pre-populated image views have to be set to nil first, otherwise their icon won't change
    // [imageView setImage:nil];
    // [imageView setImage:[NSImage imageNamed:NSImageNameApplicationIcon]];

    NSAlert *alert = [[[NSAlert alloc] init] autorelease];
    [alert setMessageText:@"Shazam!"];
    [alert runModal];
}
A couple of caveats:
Any image view already created must have setImage: called twice, as seen above, to register the image change. I don't know why.
There may be a better way to force the initial imageNamed: call with @"NSApplicationIcon" than how I've done it.
Try [myImage setName:@"NSApplicationIcon"] (after setting it as the application icon image on NSApp).
Note: On 10.6 and later, you can and should use NSImageNameApplicationIcon instead of the string literal @"NSApplicationIcon".
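A minimal hedged sketch of that approach, where the image path is a placeholder:
// Hedged sketch: replace the application icon globally by renaming a custom image.
NSImage *myImage = [[NSImage alloc] initWithContentsOfFile:@"/path/to/alternate-icon.png"];
[NSApp setApplicationIconImage:myImage];  // updates the Dock icon
[myImage setName:@"NSApplicationIcon"];   // registers it under the name nibs and alerts look up
// On 10.6+, prefer: [myImage setName:NSImageNameApplicationIcon];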