Creating an MTLTexture takes a lot of time. How can I improve it?

I tried to create an MTLTexture (size 1920x1080), and calling [replaceRegion:mipmapLevel:withBytes:bytesPerRow:] takes a long time, about 15ms on my iPhone X. Is there any way to improve the performance?
Here's my test code. I found that if I make the texture in [viewDidAppear] it only takes about 4ms. What's the difference?
#import "ViewController.h"
#import <Metal/Metal.h>
#define I_WIDTH 1920
#define I_HEIHG 1080
#interface ViewController ()
#property(strong, nonatomic) id<MTLDevice> device;
#property(strong, nonatomic) NSTimer* timer;
#end
#implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
self.device = MTLCreateSystemDefaultDevice();
}
- (void)viewDidAppear:(BOOL)animated {
[super viewDidAppear:animated];
//Method 1. This would run really slow, aboult 15ms per loop
self.timer = [NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:#selector(mkTexture) userInfo:nil repeats:true];
//Method 2. This would run really fast, aboult 3ms per loop
// for (int i = 0; i < 3000; i++) {
// [self mkTexture];
// }
}
- (void)mkTexture {
double start = CFAbsoluteTimeGetCurrent();
MTLTextureDescriptor* desc = [[MTLTextureDescriptor alloc] init];
desc.width = I_WIDTH;
desc.height = I_HEIHG;
desc.pixelFormat = MTLPixelFormatBGRA8Unorm;
desc.usage = MTLTextureUsageShaderRead;
id<MTLTexture> texture = [self.device newTextureWithDescriptor:desc];
char* bytes = (char *)malloc(I_WIDTH * I_HEIHG * 4);
[texture replaceRegion:MTLRegionMake3D(0, 0, 0, I_WIDTH, I_HEIHG, 1) mipmapLevel:0 withBytes:bytes bytesPerRow:I_WIDTH * 4];
double end = CFAbsoluteTimeGetCurrent();
NSLog(#"%.2fms", (end - start) * 1000);
free(bytes);
}
#end
With Method 1, mkTexture takes about 15ms; with Method 2, it only takes about 4ms. It's really strange.
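No accepted answer appears in this dump, but one commonly suggested mitigation (a sketch under the assumption that the frame size and pixel format never change, not a confirmed fix) is to allocate the texture once and reuse it, so each call only pays the replaceRegion upload cost. The reusableTexture property and the textureForFrameBytes: method are hypothetical additions, not part of the question's code:

// Assumed addition to the class extension:
// @property (strong, nonatomic) id<MTLTexture> reusableTexture;

- (id<MTLTexture>)textureForFrameBytes:(const void *)frameBytes
{
    if (!self.reusableTexture) {
        // Pay the allocation cost once, up front.
        MTLTextureDescriptor *desc =
            [MTLTextureDescriptor texture2DDescriptorWithPixelFormat:MTLPixelFormatBGRA8Unorm
                                                               width:I_WIDTH
                                                              height:I_HEIGHT
                                                           mipmapped:NO];
        desc.usage = MTLTextureUsageShaderRead;
        self.reusableTexture = [self.device newTextureWithDescriptor:desc];
    }
    // Only the upload remains on the per-frame path.
    [self.reusableTexture replaceRegion:MTLRegionMake3D(0, 0, 0, I_WIDTH, I_HEIGHT, 1)
                            mipmapLevel:0
                              withBytes:frameBytes
                            bytesPerRow:I_WIDTH * 4];
    return self.reusableTexture;
}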

Related

PHImageManager requestImageForAsset memory issue

I want to use PHImageManager to get all photos on the device.
If I set the targetSize too high, the app crashes because of memory warnings. So I tested the request without using the returned images at all, setting each image to nil, but the app still crashes. I don't know what I'm doing wrong. Can someone help, please?
requestOptions = [[PHImageRequestOptions alloc] init];
requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
requestOptions.synchronous = false;

assetsOfPhotos = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
PHImageManager *manager = [PHImageManager defaultManager];

@autoreleasepool {
    for (int i = 0; i <= totalImages - 1; i++) {
        PHAsset *asset = assetsOfPhotos[i];
        [manager requestImageForAsset:asset
                           targetSize:CGSizeMake(640, 480)
                          contentMode:PHImageContentModeDefault
                              options:requestOptions
                        resultHandler:^void(UIImage *image, NSDictionary *info) {
                            image = nil;
                        }];
    }
}
With a target size of 640x480 it crashes after about 200 images, with 320x240 after about 800. Since a 640x480 image needs four times the memory of a 320x240 one, the app seems to crash once roughly the same total amount of memory has been allocated. For me this means I cannot show more than about 200 images at 640x480 on the test device, because I cannot free the allocated memory.
In order to make your @autoreleasepool work you need to set requestOptions.synchronous to YES, and use your own async queue if you want to run the request operation asynchronously.
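A minimal sketch of that suggestion, reusing the question's requestOptions, manager, assetsOfPhotos, and totalImages variables (the choice of queue is an assumption):

// Synchronous requests drained by a per-iteration autorelease pool,
// run on a background queue so the main thread stays responsive.
requestOptions.synchronous = YES;

dispatch_async(dispatch_get_global_queue(QOS_CLASS_UTILITY, 0), ^{
    for (int i = 0; i < totalImages; i++) {
        @autoreleasepool {
            PHAsset *asset = assetsOfPhotos[i];
            [manager requestImageForAsset:asset
                               targetSize:CGSizeMake(640, 480)
                              contentMode:PHImageContentModeDefault
                                  options:requestOptions
                            resultHandler:^(UIImage *image, NSDictionary *info) {
                                // Use the image here; it is released when the pool drains.
                            }];
        }
    }
});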
Please use @autoreleasepool inside the for loop.
for (int i = 0; i <= totalImages - 1; i++) {
    @autoreleasepool {
        // Your code
    }
}
If you want to load all the photos you have in Photos.app without going through iCloud, you can do the following. This example works with a collection view.
@interface GalleryViewModel ()
@property (strong, nonatomic) NSMutableArray<PHAsset *> *assets;
@property (strong, nonatomic) PHImageManager *imageManager;
@property (strong, nonatomic) PHImageRequestOptions *requestOptions;
@property (strong, nonatomic) NSMutableArray<UIImage *> *imagesList;
@end

@implementation GalleryViewModel

- (instancetype)initWithContext:(ITXAppContext *)context {
    self = [super initWithContext:context];
    if (self) {
        _assets = [[NSMutableArray alloc] init];
        _imageManager = [PHImageManager defaultManager];
        _requestOptions = [[PHImageRequestOptions alloc] init];
        _imagesList = [[NSMutableArray alloc] init];
    }
    return self;
}

#pragma mark - Public methods
// ==================================================================================
// Public methods

- (void)viewModelDidLoad {
    [self obtainAllPhotos];
}

#pragma mark - Private methods
// ==================================================================================
// Private methods

- (void)obtainAllPhotos {
    self.requestOptions.resizeMode = PHImageRequestOptionsResizeModeExact;
    self.requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
    self.requestOptions.synchronous = YES;
    self.requestOptions.networkAccessAllowed = NO;

    PHFetchOptions *fetchOptions = [[PHFetchOptions alloc] init];
    fetchOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
    PHFetchResult<PHAsset *> *result = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:fetchOptions];

    __weak GalleryViewModel *weakSelf = self;
    [result enumerateObjectsUsingBlock:^(PHAsset * _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {
        [weakSelf.assets addObject:obj];
        if (idx >= ([result count] - 1)) {
            [weakSelf.viewDelegate setupView];
        }
    }];
}

#pragma mark - Get data from object
// ==================================================================================
// Get data from object

- (NSInteger)sizeGallery {
    if (self.assets) {
        return [self.assets count];
    }
    return 0;
}

- (UIImage *)imagesFromList:(NSInteger)index {
    __block UIImage *imageBlock;
    // With synchronous = YES the result handler runs before this method returns.
    [self.imageManager requestImageForAsset:[self.assets objectAtIndex:index]
                                 targetSize:CGSizeMake(200, 200)
                                contentMode:PHImageContentModeAspectFit
                                    options:self.requestOptions
                              resultHandler:^(UIImage * _Nullable result, NSDictionary * _Nullable info) {
                                  if (result) {
                                      imageBlock = result;
                                  }
                              }];
    return imageBlock;
}

@end
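Since the answer says it works with a collection view, here is a hypothetical data source hooked up to that view model. GalleryCell, its imageView outlet, and the viewModel property are assumptions for illustration, not part of the original answer:

// Hypothetical UICollectionViewDataSource methods driven by GalleryViewModel.
- (NSInteger)collectionView:(UICollectionView *)collectionView numberOfItemsInSection:(NSInteger)section {
    return [self.viewModel sizeGallery];
}

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                 cellForItemAtIndexPath:(NSIndexPath *)indexPath {
    GalleryCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"GalleryCell"
                                                                  forIndexPath:indexPath];
    // The request is synchronous, so the image is available immediately.
    cell.imageView.image = [self.viewModel imagesFromList:indexPath.item];
    return cell;
}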

What would be the condition in a while loop using the FaceRecognizer

How do I make a while loop with FaceRecognizer so that while a face is recognized, a command will happen? I am not sure what the condition would be. What variable do I use, and what do I equate it to?
Here is my code:
#import "ViewController.h"
NSString* const faceCascadeFilename = #"haarcascade_frontalface_alt2";
const int HaarOptions = CV_HAAR_FIND_BIGGEST_OBJECT | CV_HAAR_DO_ROUGH_SEARCH;
#interface ViewController ()
#end
#implementation ViewController
#synthesize videoCamera;
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
self.videoCamera = [[CvVideoCamera alloc] initWithParentView:imageView];
self.videoCamera.defaultAVCaptureDevicePosition = AVCaptureDevicePositionFront;
self.videoCamera.defaultAVCaptureSessionPreset = AVCaptureSessionPreset352x288;
self.videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationPortrait;
self.videoCamera.defaultFPS = 30;
self.videoCamera.grayscaleMode = NO;
self.videoCamera.delegate = self;
NSString* faceCascadePath = [[NSBundle mainBundle] pathForResource:faceCascadeFilename ofType:#"xml"];
faceCascade.load([faceCascadePath UTF8String]);
Label1.hidden=YES;
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
#pragma mark - Protocol CvVideoCameraDelegate
#ifdef __cplusplus
- (void)processImage:(Mat&)image;
{
Mat grayscaleFrame;
cvtColor(image, grayscaleFrame, CV_BGR2GRAY);
equalizeHist(grayscaleFrame, grayscaleFrame);
std::vector<cv::Rect> faces;
faceCascade.detectMultiScale(grayscaleFrame, faces, 1.1, 2, HaarOptions, cv::Size(60, 60));
for (int i = 0; i < faces.size(); i++)
{
cv::Point pt1(faces[i].x + faces[i].width, faces[i].y + faces[i].height);
cv::Point pt2(faces[i].x, faces[i].y);
cv::rectangle(image, pt1, pt2, cvScalar(0, 255, 0, 0), 1, 8 ,0);
}
}
//#endif
#pragma mark - UI Actions
- (IBAction)startCamera:(id)sender
{
[self.videoCamera start];
imageView.hidden=YES;
while (FaceRecognizer...) {
Label1.hidden=NO;
}
}
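Neither this question nor the near-duplicate if-statement variant below carries an answer in this dump, so here is one possible approach (a sketch, not an accepted answer, and it applies to both): the faces vector inside processImage: already tells you whether a face is visible in the current frame, so the per-frame delegate callback can take the place of a blocking while loop in the IBAction:

// Sketch: processImage: runs for every camera frame, so checking the
// detection result there behaves like "while a face is recognized".
- (void)processImage:(cv::Mat &)image
{
    cv::Mat grayscaleFrame;
    cvtColor(image, grayscaleFrame, CV_BGR2GRAY);
    equalizeHist(grayscaleFrame, grayscaleFrame);

    std::vector<cv::Rect> faces;
    faceCascade.detectMultiScale(grayscaleFrame, faces, 1.1, 2, HaarOptions, cv::Size(60, 60));

    BOOL faceVisible = !faces.empty();   // <-- the condition the question asks about
    // The delegate is called on a video queue; hop to the main thread for UI work.
    dispatch_async(dispatch_get_main_queue(), ^{
        Label1.hidden = !faceVisible;
    });
}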

How to make an if statement with FaceRecognizer

How do I make an if statement with FaceRecognizer so that if a face is recognized, a command will happen?
I am not sure what the condition would be. What variable do I use, and what do I equate it to?
Here is my code:
#import "ViewController.h"
NSString* const faceCascadeFilename = #"haarcascade_frontalface_alt2";
const int HaarOptions = CV_HAAR_FIND_BIGGEST_OBJECT | CV_HAAR_DO_ROUGH_SEARCH;
#interface ViewController ()
#end
#implementation ViewController
#synthesize videoCamera;
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
self.videoCamera = [[CvVideoCamera alloc] initWithParentView:imageView];
self.videoCamera.defaultAVCaptureDevicePosition = AVCaptureDevicePositionFront;
self.videoCamera.defaultAVCaptureSessionPreset = AVCaptureSessionPreset352x288;
self.videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationPortrait;
self.videoCamera.defaultFPS = 30;
self.videoCamera.grayscaleMode = NO;
self.videoCamera.delegate = self;
NSString* faceCascadePath = [[NSBundle mainBundle] pathForResource:faceCascadeFilename ofType:#"xml"];
faceCascade.load([faceCascadePath UTF8String]);
Label1.hidden=YES;
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
#pragma mark - Protocol CvVideoCameraDelegate
#ifdef __cplusplus
- (void)processImage:(Mat&)image;
{
Mat grayscaleFrame;
cvtColor(image, grayscaleFrame, CV_BGR2GRAY);
equalizeHist(grayscaleFrame, grayscaleFrame);
std::vector<cv::Rect> faces;
faceCascade.detectMultiScale(grayscaleFrame, faces, 1.1, 2, HaarOptions, cv::Size(60, 60));
for (int i = 0; i < faces.size(); i++)
{
cv::Point pt1(faces[i].x + faces[i].width, faces[i].y + faces[i].height);
cv::Point pt2(faces[i].x, faces[i].y);
cv::rectangle(image, pt1, pt2, cvScalar(0, 255, 0, 0), 1, 8 ,0);
}
}
//#endif
#pragma mark - UI Actions
- (IBAction)startCamera:(id)sender
{
[self.videoCamera start];
imageView.hidden=YES;
while (FaceRecognizer...) {
Label1.hidden=NO;
}
}

Why does only one of my NSTimers work?

I don't understand why only the animateDarkAst animation works. The first two timers (the one that drives processKeys and the one that drives animateDarkAst) work fine, but the other timers don't. It doesn't matter in what order I create the timers; only those two methods fire. For the other three animations, nothing appears on screen because the code in their methods (animateLightAst, animateSmallAst, animateComet) is never executed.
// Prepare asteroids.
_asteroiddark = [[NSImageView alloc] init];
[_asteroiddark setImage:[NSImage imageNamed:@"asteroiddark"]];
[_asteroiddark setFrame:theModel.darkAstRect];
_asteroidlight = [[NSImageView alloc] init];
[_asteroidlight setImage:[NSImage imageNamed:@"asteroidwhite"]];
[_asteroidlight setFrame:theModel.lightAstRect];
_asteroidsmall = [[NSImageView alloc] init];
[_asteroidsmall setImage:[NSImage imageNamed:@"asteroidsmall"]];
[_asteroidsmall setFrame:theModel.smallAstRect];
// ... and comets
_comet = [[NSImageView alloc] init];
[_comet setImage:[NSImage imageNamed:@"comet"]];
[_comet setFrame:theModel.cometRect];

// Set up key processing timer for fluid spaceship movement.
timer1 = [NSTimer scheduledTimerWithTimeInterval:0.03 target:self selector:@selector(processKeys) userInfo:nil repeats:YES];

// Make a random value for the first animation timer.
double randomInterval1 = arc4random() % (4 - 1) + 1;
// Set up timers for the animations.
timer2 = [NSTimer scheduledTimerWithTimeInterval:randomInterval1 target:self selector:@selector(animateDarkAst) userInfo:nil repeats:YES];
double randomInterval2 = ((double)arc4random() / 3) * (3 - 1) + 1;
timer3 = [NSTimer scheduledTimerWithTimeInterval:randomInterval2 target:self selector:@selector(animateSmallAst) userInfo:nil repeats:YES];
double randomInterval3 = ((double)arc4random() / 5) * (5 - 1) + 1;
timer4 = [NSTimer scheduledTimerWithTimeInterval:randomInterval3 target:self selector:@selector(animateLightAst) userInfo:nil repeats:YES];
double randomInterval4 = ((double)arc4random() / 6) * (6 - 1) + 1;
timer5 = [NSTimer scheduledTimerWithTimeInterval:randomInterval4 target:self selector:@selector(animateComet) userInfo:nil repeats:YES];
    }
    return self;
}

- (void)animateDarkAst
{
    int randX = arc4random_uniform(self.bounds.size.width);
    int randSize = 40 + arc4random() % (120 - 40 + 1);
    CGPoint startPoint = CGPointMake(randX, self.bounds.size.height);
    [_asteroiddark setFrame:NSMakeRect(startPoint.x, startPoint.y, randSize, randSize)];
    [self addSubview:_asteroiddark];
    // Create animation (down y-axis)
    [NSAnimationContext runAnimationGroup:^(NSAnimationContext *context) {
        [context setDuration:1.5];
        [context setTimingFunction:[CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionLinear]];
        _asteroiddark.animator.frame = CGRectOffset(_asteroiddark.frame, 0, -self.bounds.size.height - 180);
    } completionHandler:nil];
}

- (void)animateSmallAst
{
    int randX = arc4random_uniform(self.bounds.size.width);
    int randSize = 40 + arc4random() % (120 - 40 + 1);
    CGPoint startPoint = CGPointMake(randX, self.bounds.size.height);
    [_asteroidsmall setFrame:NSMakeRect(startPoint.x, startPoint.y, randSize, randSize)];
    [self addSubview:_asteroidsmall];
    // Create animation (down y-axis)
    [NSAnimationContext runAnimationGroup:^(NSAnimationContext *context) {
        [context setDuration:1.5];
        [context setTimingFunction:[CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionLinear]];
        _asteroidsmall.animator.frame = CGRectOffset(_asteroidsmall.frame, 0, -self.bounds.size.height - 180);
    } completionHandler:nil];
}

- (void)animateLightAst
{
    int randX = arc4random_uniform(self.bounds.size.width);
    int randSize = 40 + arc4random() % (120 - 40 + 1);
    CGPoint startPoint = CGPointMake(randX, self.bounds.size.height);
    [_asteroidlight setFrame:NSMakeRect(startPoint.x, startPoint.y, randSize, randSize)];
    [self addSubview:_asteroidlight];
    // Create animation (down y-axis)
    [NSAnimationContext runAnimationGroup:^(NSAnimationContext *context) {
        [context setDuration:1.5];
        [context setTimingFunction:[CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionLinear]];
        _asteroidlight.animator.frame = CGRectOffset(_asteroidlight.frame, 0, -self.bounds.size.height - 180);
    } completionHandler:nil];
}

- (void)animateComet
{
    int randX = arc4random_uniform(self.bounds.size.width);
    int randSize = 40 + arc4random() % (120 - 40 + 1);
    CGPoint startPoint = CGPointMake(randX, self.bounds.size.height);
    [_comet setFrame:NSMakeRect(startPoint.x, startPoint.y, randSize, randSize)];
    [self addSubview:_comet];
    // Create animation (down y-axis)
    [NSAnimationContext runAnimationGroup:^(NSAnimationContext *context) {
        [context setDuration:1.5];
        [context setTimingFunction:[CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionLinear]];
        _comet.animator.frame = CGRectOffset(_comet.frame, 0, -self.bounds.size.height - 180);
    } completionHandler:nil];
}
How should I be handling this? Can I not have 5 timers at once?
The correct question would be: "Why does at least one timer work?"
As you can see in the documentation, the signature of the method has to take one parameter (the NSTimer instance that sends the message to the target):
aSelector: The message to send to target when the timer fires. The selector should have the following signature: timerFireMethod: (including a colon to indicate that the method takes an argument). The timer passes itself as the argument, thus the method would adopt the following pattern.
Change the signature of the methods and of the selector when adding the timer.
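A minimal sketch of that change, applied to one of the question's timers (the surrounding context is elided):

// The selector now includes the colon, and the method accepts the NSTimer argument.
timer2 = [NSTimer scheduledTimerWithTimeInterval:randomInterval1
                                          target:self
                                        selector:@selector(animateDarkAst:)
                                        userInfo:nil
                                         repeats:YES];

// ...

- (void)animateDarkAst:(NSTimer *)timer
{
    // same animation body as before
}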

How to make [UIView animateWithDuration:...] work in an application ported by Apportable?

In an application ported with Apportable, I need to animate (move/scale/change alpha) a UIView *object via this call:
[UIView animateWithDuration:1.f
                      delay:0.5f
                    options:UIViewAnimationOptionAllowUserInteraction
                 animations:^(void)
                 {
                     myview.center = moveTo;
                     myview.transform = transformTo;
                     myview.alpha = alphaTo;
                 }
                 completion:^(BOOL finished)
                 {
                     [self animationFinished];
                 }];
At the moment it only performs the delay, then executes the animation code and the completion code immediately.
Thank you for the answer. But I need the animation "today", so I made the class below.
It doesn't work perfectly, but it's much better than nothing.
Maybe it will be helpful for someone.
AOTAnimate.h
//
//  AOTAnimate.h
//
//  Created by Andrei Bakulin on 18/11/2013.
//

#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>

@interface AOTAnimate : NSObject
{
    UIView *view;
    NSInteger animationTicksLeft;

    CGFloat scaleX;
    CGFloat scaleY;

    CGPoint moveDelta;
    CGSize scaleCurrent;
    CGSize scaleDelta;
    CGFloat alphaDelta;

    void (^completeAction)();
}

@property (nonatomic, assign) CGFloat duration;
@property (nonatomic, assign) CGFloat delay;
@property (nonatomic, assign) CGFloat frequency;
@property (nonatomic, assign) UIViewAnimationOptions options;

@property (nonatomic, assign) CGPoint moveFrom;
@property (nonatomic, assign) CGAffineTransform transformFrom;
@property (nonatomic, assign) CGFloat alphaFrom;

@property (nonatomic, assign) CGPoint moveTo;
@property (nonatomic, assign) CGAffineTransform transformTo;
@property (nonatomic, assign) CGFloat alphaTo;

+ (AOTAnimate *)makeAnimationOnView:(UIView *)view_ duration:(CGFloat)duration_;
+ (AOTAnimate *)makeAnimationOnView:(UIView *)view_ duration:(CGFloat)duration_ delay:(CGFloat)delay_;

- (id)initWithView:(UIView *)view_ duration:(CGFloat)duration_ delay:(CGFloat)delay_;

- (void)run;
- (void)runWithCompleteAction:(void (^)(void))complete_;

@end
AOTAnimate.m
//
//  AOTAnimate.m
//
//  Created by Andrei Bakulin on 18/11/2013.
//

#import "AOTAnimate.h"

@implementation AOTAnimate

@synthesize duration, delay, frequency, options;
@synthesize moveFrom, transformFrom, alphaFrom;
@synthesize moveTo, transformTo, alphaTo;

+ (AOTAnimate *)makeAnimationOnView:(UIView *)view_ duration:(CGFloat)duration_
{
    return [self makeAnimationOnView:view_ duration:duration_ delay:0.f];
}

+ (AOTAnimate *)makeAnimationOnView:(UIView *)view_ duration:(CGFloat)duration_ delay:(CGFloat)delay_
{
    return [[[AOTAnimate alloc] initWithView:view_ duration:duration_ delay:delay_] autorelease];
}

//----------------------------------

- (void)dealloc
{
    [view release];
    if (completeAction)
        Block_release(completeAction);
    [super dealloc];
}

- (id)initWithView:(UIView *)view_ duration:(CGFloat)duration_ delay:(CGFloat)delay_
{
    self = [super init];
    if (self)
    {
        view = [view_ retain];
        duration = duration_;
        delay = delay_;
        frequency = 0.025f;
        options = UIViewAnimationOptionAllowUserInteraction;

        moveFrom = view.center;
        transformFrom = view.transform;
        alphaFrom = view.alpha;

        moveTo = view.center;
        transformTo = view.transform;
        alphaTo = view.alpha;
    }
    return self;
}

//----------------------------------
#pragma mark - Run animation

- (void)run
{
    [self runWithCompleteAction:nil];
}

- (void)runWithCompleteAction:(void (^)(void))complete_
{
    view.center = moveFrom;
    view.transform = transformFrom;
    view.alpha = alphaFrom;

#ifndef ANDROID
    [UIView animateWithDuration:duration
                          delay:delay
                        options:options
                     animations:^(void)
                     {
                         view.center = moveTo;
                         view.transform = transformTo;
                         view.alpha = alphaTo;
                     }
                     completion:^(BOOL finished)
                     {
                         if (complete_)
                             complete_();
                     }];
#else
    if (duration <= 0.f)
    {
        [self doAnimationComplete];
        return;
    }

    animationTicksLeft = ceil(duration / frequency);
    if (animationTicksLeft == 0)
    {
        [self doAnimationComplete];
        return;
    }

    moveDelta = CGPointMake((moveTo.x - moveFrom.x) / animationTicksLeft,
                            (moveTo.y - moveFrom.y) / animationTicksLeft);
    alphaDelta = (alphaTo - alphaFrom) / animationTicksLeft;

    CGSize scaleFrom = CGSizeMake([self scaleX:transformFrom], [self scaleY:transformFrom]);
    CGSize scaleTo   = CGSizeMake([self scaleX:transformTo],   [self scaleY:transformTo]);
    scaleDelta = CGSizeMake((scaleTo.width - scaleFrom.width) / animationTicksLeft,
                            (scaleTo.height - scaleFrom.height) / animationTicksLeft);
    scaleCurrent = scaleFrom;

    if (complete_)
    {
        completeAction = Block_copy(complete_);
    }

    [self performSelector:@selector(doAnimationTick) withObject:nil afterDelay:delay];
#endif
}

//----------------------------------
#pragma mark - Manual animation
#ifdef ANDROID

- (void)doAnimationTick
{
    if (CGPointEqualToPoint(moveDelta, CGPointZero) == NO)
    {
        view.center = CGPointMake(view.center.x + moveDelta.x, view.center.y + moveDelta.y);
    }
    if (CGSizeEqualToSize(scaleDelta, CGSizeZero) == NO)
    {
        view.transform = CGAffineTransformMakeScale(scaleCurrent.width, scaleCurrent.height);
        scaleCurrent.width  += scaleDelta.width;
        scaleCurrent.height += scaleDelta.height;
    }
    if (alphaDelta != 0.f)
    {
        view.alpha = view.alpha + alphaDelta;
    }
    // - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    animationTicksLeft--;
    if (animationTicksLeft > 0)
    {
        [self performSelector:@selector(doAnimationTick) withObject:nil afterDelay:frequency];
    }
    else
    {
        [self doAnimationComplete];
    }
}

- (void)doAnimationComplete
{
    view.center = moveTo;
    view.transform = transformTo;
    view.alpha = alphaTo;

    if (completeAction)
        completeAction();
}

//----------------------------------
#pragma mark - Helpers

- (CGFloat)scaleX:(CGAffineTransform)t
{
    return sqrt(t.a * t.a + t.c * t.c);
}

- (CGFloat)scaleY:(CGAffineTransform)t
{
    return sqrt(t.b * t.b + t.d * t.d);
}

#endif

@end
Use like this:
UIView *someview;

AOTAnimate *animate = [AOTAnimate makeAnimationOnView:someview duration:1.f delay:0.5f];

// You may assign the animate.moveFrom / .transformFrom / .alphaFrom properties,
// but by default they are copied from the UIView* object.
animate.moveTo = CGPointMake(100, 200);   // new point to move to
animate.transformTo = CGAffineTransformScale(CGAffineTransformIdentity, 1.5f, 1.5f);
animate.alphaTo = 0.5f;

[animate runWithCompleteAction:^{
    NSLog(@"Animation done...");
}];
If this runs on an iOS device, it uses the normal [UIView animateWithDuration:...] method.
PS: This class only "moves" the view from one center point to another. The transform is used only to scale the object (not to move it). Alpha wasn't supported on my two test devices, but maybe it works somewhere.
Animations do not work on the current version of Apportable's UIKit. We have fully functioning animations coming in the next version of UIKit, though. We will be releasing that once we are satisfied with the quality and coverage.
