Merging video is not working in iOS app

I have been working on merge-video functionality.
In my application I have one video with multiple tracks (10 tracks in total: 9 video tracks, plus one audio track that is shared).
Now I want to produce 3 videos from these multiple tracks:
the first video combines tracks 1, 4 and 7,
the second video combines tracks 2, 5 and 8,
the third video combines tracks 3, 6 and 9,
and the audio track is shared by all three videos.
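The grouping described above is a simple index-modulo partition; a quick sketch (Python, purely illustrative):

```python
# The 9 video tracks in file order; the shared audio track is not listed here.
video_tracks = [1, 2, 3, 4, 5, 6, 7, 8, 9]

# Output video i takes every track whose zero-based index satisfies index % 3 == i,
# which yields the groups 1,4,7 / 2,5,8 / 3,6,9 described above.
groups = [[t for idx, t in enumerate(video_tracks) if idx % 3 == i] for i in range(3)]
print(groups)  # [[1, 4, 7], [2, 5, 8], [3, 6, 9]]
```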
I am trying to do this with the following code:
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:[NSURL fileURLWithPath:urlSubItem.path]];
NSLog(@"%@", playerItem);
NSArray *arrTracks = [playerItem.asset tracksWithMediaType:AVMediaTypeVideo];
NSLog(@"%@", arrTracks);
NSArray *arrTracksText = [playerItem.asset tracksWithMediaType:AVMediaTypeText];
NSArray *arrTracksAudio = [playerItem.asset tracksWithMediaType:AVMediaTypeAudio];
NSLog(@"%@:%@", arrTracks, arrTracksAudio);
for (int i = 0; i < 3; i++) {
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    AVMutableCompositionTrack *compositionCommentaryTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, playerItem.asset.duration) ofTrack:[[playerItem.asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];
    CMTime currentTime = kCMTimeZero;
    for (int k = 0; k < [arrTracks count]; k++) {
        NSLog(@"%d", k % 3);
        if (k % 3 == i) {
            AVAssetTrack *trackCombineVideo = [arrTracks objectAtIndex:k];
            AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
            NSError *error = nil;
            // Insert the source track's own time range at the current insertion point.
            [compositionVideoTrack insertTimeRange:trackCombineVideo.timeRange ofTrack:trackCombineVideo atTime:currentTime error:&error];
            if (error) {
                NSLog(@"insert error: %@", error);
            }
            // Advance by the clip's duration so the next clip is appended after it.
            currentTime = CMTimeAdd(currentTime, trackCombineVideo.timeRange.duration);
        }
    }
    AVAssetExportSession *assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetPassthrough];
    assetExport.outputFileType = AVFileTypeQuickTimeMovie; // @"com.apple.quicktime-movie"
    NSLog(@"file type %@", assetExport.outputFileType);
    assetExport.outputURL = exportUrl; // a URL in the Documents directory
    assetExport.shouldOptimizeForNetworkUse = YES;
    [assetExport exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"export finished with status %ld", (long)assetExport.status);
    }];
}
Using this code, 3 separate videos are created, but each video contains 3 separate video tracks.
Now, my question is: how do I create each video with only one video track?

CMTime currentTime = kCMTimeZero;
NSError *error = nil;
// Create the composition video track once, outside the loop, so that every
// clip is appended to the same track instead of each getting its own track.
AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
for (int k = 0; k < [arrTracks count]; k++) {
    if (k % 3 == i) {
        AVAssetTrack *trackCombineVideo = [arrTracks objectAtIndex:k];
        [compositionVideoTrack insertTimeRange:trackCombineVideo.timeRange ofTrack:trackCombineVideo atTime:currentTime error:&error];
        if (error) {
            NSLog(@"insert error: %@", error);
        }
        // Advance the insertion point past the clip that was just added.
        currentTime = CMTimeAdd(currentTime, trackCombineVideo.timeRange.duration);
    }
}
AVAssetExportSession *assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPreset1280x720];
assetExport.outputFileType = AVFileTypeQuickTimeMovie;
NSLog(@"file type %@", assetExport.outputFileType);
assetExport.outputURL = exportUrl; // a URL in the Documents directory
assetExport.shouldOptimizeForNetworkUse = YES;
assetExport.timeRange = CMTimeRangeMake(kCMTimeZero, playerItem.asset.duration);
[assetExport exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"export finished with status %ld", (long)assetExport.status);
}];
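The bookkeeping both snippets rely on is that each clip is inserted at the running total of the durations already placed; a minimal sketch (Python, illustrative only):

```python
def insertion_times(durations):
    """Start time for each clip when clips are appended back to back."""
    times, current = [], 0.0
    for d in durations:
        times.append(current)
        current += d  # the CMTimeAdd(currentTime, duration) step in the Objective-C code
    return times

# Hypothetical clip durations in seconds:
print(insertion_times([2.0, 3.5, 1.5]))  # [0.0, 2.0, 5.5]
```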

Related

Get samples of audio clip in AVAudioPlayer?

Is there a property on the AVAudioPlayer class that I can use to get the samples? If not, is there another class I can use to get this information?
Here's what I have:
var openDialog = NSOpenPanel.OpenPanel;
openDialog.CanChooseFiles = true;
openDialog.CanChooseDirectories = false;
openDialog.AllowedFileTypes = new string[] { "wav" };
if (openDialog.RunModal() == 1)
{
    var url = openDialog.Urls[0];
    if (url != null)
    {
        var path = url.Path;
        var audioplayer = AVFoundation.AVAudioPlayer.FromUrl(url);
        var samples = audioplayer.SAMPLES?; // <- this is the property I am looking for
    }
}
Visual Studio Mac (C# / Xamarin)
AVAudioPlayer does not give you access to the sample data, but if you switch playback to AVPlayer you can use an MTAudioProcessingTap to "tap" the samples as they are played.
If you simply want to examine the samples in your file you can use AVAudioFile.
// get the total number of samples in the file
var audioFile = new AVAudioFile(url, out NSError outError);
var samples = audioFile.Length;
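AVAudioFile.Length is a frame count; the same bookkeeping can be sketched with Python's standard wave module (illustrative only, unrelated to the Xamarin APIs):

```python
import os
import struct
import tempfile
import wave

# Write a short mono 16-bit WAV file so the example is self-contained.
path = os.path.join(tempfile.mkdtemp(), "tone.wav")
with wave.open(path, "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)       # 16-bit samples
    w.setframerate(44100)
    w.writeframes(struct.pack("<100h", *([0] * 100)))  # 100 silent samples

# Reading it back: the frame count is the total number of samples per channel.
with wave.open(path, "rb") as w:
    total_samples = w.getnframes()

print(total_samples)  # 100
```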

ExtAudioFileRead crash with code -40

My app plays music in the background. I have the Audio key enabled in Background Modes, and my audio session is configured like this:
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
NSError *err = nil;
[audioSession setCategory:AVAudioSessionCategoryPlayback error:&err];
if (err) {
    NSLog(@"There was an error setting the audio session category");
}
[audioSession setMode:AVAudioSessionModeDefault error:&err];
if (err) {
    NSLog(@"There was an error setting the audio session mode");
}
[[AVAudioSession sharedInstance] setActive:YES error:&err];
if (err) {
    NSLog(@"There was an error activating the audio session");
}
I'm playing via an AUGraph configured with 2 nodes, Remote I/O and a Mixer:
AudioComponentDescription outputcd;
outputcd.componentFlags = 0;
outputcd.componentFlagsMask = 0;
outputcd.componentManufacturer = kAudioUnitManufacturer_Apple;
outputcd.componentSubType = kAudioUnitSubType_RemoteIO;
outputcd.componentType = kAudioUnitType_Output;
// Multichannel mixer unit
AudioComponentDescription MixerUnitDescription;
MixerUnitDescription.componentType = kAudioUnitType_Mixer;
MixerUnitDescription.componentSubType = kAudioUnitSubType_AU3DMixerEmbedded;
MixerUnitDescription.componentManufacturer = kAudioUnitManufacturer_Apple;
MixerUnitDescription.componentFlags = 0;
MixerUnitDescription.componentFlagsMask = 0;
Also, following one of Apple's Technical Q&As, I added:
UInt32 maxFPS = 4096;
AudioUnitSetProperty(_mixerUnit, kAudioUnitProperty_MaximumFramesPerSlice, kAudioUnitScope_Global, 0, &maxFPS, sizeof(maxFPS));
But still no luck: my app keeps crashing in ExtAudioFileRead in the render callback function approximately 10 seconds after I lock the iPhone. Any suggestions?
It is important to mention that this bug is not reproducible on iOS 7.
The issue was Data Protection being enabled in the app's capabilities. When the device was locked, the files were encrypted and could no longer be read in the background, hence the crash.
Changing the file-protection attributes of the audio files fixes the issue.

How to retrieve photo extension (jpg/png) in iOS 8.0 using Photos API?

I am trying to get the file extension of photos using the new Photos API in iOS 8, but I haven't found a way to do so. Before iOS 8.0 I would use ALAssetRepresentation to get the file extension like this:
// Get the asset representation from the asset
ALAssetRepresentation *assetRepresentation = [asset defaultRepresentation];
// Extract the file extension from the original file name
NSString *fileExt = [[assetRepresentation filename] pathExtension];
Is there any way to get the file extension of photos now?
PS: I am using the new Photos API because I need access to all the photos in the Photos app; ALAssetsLibrary gives access to "Recently Added" photos only.
I've got it:
[[PHImageManager defaultManager] requestImageDataForAsset:asset options:imageRequestOptions resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    NSLog(@"info - %@", info);
}];
PHImageFileDataKey = <PLXPCShMemData: 0x179ab6d0> bufferLength=1384448 dataLength=1384448;
PHImageFileOrientationKey = 1;
PHImageFileSandboxExtensionTokenKey = "c05e608acaf0bb212086ed2d512ccc97ea720ac3;00000000;00000000;0000001a;com.apple.app-sandbox.read;00000001;01000003;0000000000030b8c;/private/var/mobile/Media/DCIM/102APPLE/IMG_2607.JPG";
PHImageFileURLKey = "file:///var/mobile/Media/DCIM/102APPLE/IMG_2607.JPG";
PHImageFileUTIKey = "public.jpeg";
PHImageResultDeliveredImageFormatKey = 9999;
PHImageResultIsDegradedKey = 0;
PHImageResultIsInCloudKey = 0;
PHImageResultIsPlaceholderKey = 0;
PHImageResultWantedImageFormatKey = 9999;
The PHImageFileURLKey and PHImageFileUTIKey entries contain the original file URL and the UTI, from which the extension can be derived.
Here is one way to do it:
PHImageRequestOptions *imageRequestOptions = [[PHImageRequestOptions alloc] init];
imageRequestOptions.synchronous = YES;
[[PHImageManager defaultManager] requestImageForAsset:asset
                                           targetSize:CGSizeMake(2048, 2048)
                                          contentMode:PHImageContentModeAspectFit
                                              options:imageRequestOptions
                                        resultHandler:^(UIImage *result, NSDictionary *info) {
    // The file URL (and hence the extension) is available in info.
}];
The file name is contained in info. For example:
PHImageFileURLKey = "file:///var/mobile/Media/DCIM/100APPLE/IMG_0066.JPG";
imageManager.requestImageDataForAsset(images[indexPath.row] as PHAsset, options: PHImageRequestOptions(), resultHandler: { imagedata, dataUTI, orientation, info in
    let str = ((info as Dictionary)["PHImageFileURLKey"]! as NSURL).lastPathComponent as String
    cell.imageName.text = str
})
There is also one simple way to get the original file name of a photo selected from Photos:
PHAsset *asset; // Init this object with your PHAsset object
// and then use the line below
NSLog(@"asset ext : %@", [[asset valueForKey:@"filename"] pathExtension]);
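Since all of the answers above end up with a file URL or file name, extracting the extension is just string handling; a quick sketch (Python, illustrative only):

```python
from pathlib import PurePosixPath
from urllib.parse import urlparse

def extension_from_file_url(url: str) -> str:
    """Extract the lowercase file extension from a file:// URL."""
    return PurePosixPath(urlparse(url).path).suffix.lstrip(".").lower()

# Using the URL format seen in the info dictionary above:
print(extension_from_file_url("file:///var/mobile/Media/DCIM/100APPLE/IMG_0066.JPG"))  # jpg
```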

Is there a recommended way of making NSDatePicker use MY step sizes?

I'm wondering if some of you could offer advice on this.
I'm trying to change NSDatePicker's step size to a value other than 1.
On top of that, I found that stepping the minutes doesn't change the hours, nor the day.
I am using the delegate method datePickerCell:validateProposedDateValue:timeInterval:.
Now, although it does work as expected, the whole thing looks so blown up that I started wondering whether there is an easier way to accomplish this.
Any advice or pointers to documentation are appreciated. Thank you.
Here's my code:
- (void)datePickerCell:(NSDatePickerCell *)aDatePickerCell validateProposedDateValue:(NSDate **)proposedDateValue
timeInterval:(NSTimeInterval *)proposedTimeInterval {
DLog(@"date picker for: %@", [aDatePickerCell identifier]);
NSDate *newProposedDateValue = nil;
// just in case that we don't need a correction
NSDate *correctedProposedDateValue = *proposedDateValue;
// the interval that the step generates
// > 0 means: the old date is later than the new proposed date
// < 0 means: the old date is earlier than the new proposed date
int interval = (int)[[self dateValue] timeIntervalSinceDate:*proposedDateValue];
// define expected interval values for our scenarios
// we don't care about a minute step that does not cross the hour here
// nor do we care about an hour step that does not cross the day
// minutes are stepped: minute is stepped but hour remains (01:59 <-> 01:00), so the difference is 59 minutes
int const minuteSteppedUpAcrossHour = -59 * 60;
int const minuteSteppedDownAcrossHour = -minuteSteppedUpAcrossHour;
// nor do we care about an hour step that does not cross the day
// hours are stepped: hour is stepped but day remains (10.03.13 00:30 <-> 10.03.13 23:30), so the difference is 23 hours
int const hourSteppedUpAcrossDay = -23 * 60 * 60;
int const hourSteppedDownAcrossDay = -hourSteppedUpAcrossDay;
// define correction values for our scenarios
int const anHour = 60 * 60;
int const aDay = anHour * 24;
switch (interval) {
case hourSteppedUpAcrossDay:
correctedProposedDateValue = [*proposedDateValue dateByAddingTimeInterval:(-aDay)];
break;
case minuteSteppedDownAcrossHour:
correctedProposedDateValue = [*proposedDateValue dateByAddingTimeInterval:(+anHour)];
break;
case hourSteppedDownAcrossDay:
correctedProposedDateValue = [*proposedDateValue dateByAddingTimeInterval:(+aDay)];
break;
case minuteSteppedUpAcrossHour:
correctedProposedDateValue = [*proposedDateValue dateByAddingTimeInterval:(-anHour)];
break;
default:
break;
}
// compare: is required here; comparing the NSDate pointers with < would compare addresses
if ([[self dateValue] compare:correctedProposedDateValue] == NSOrderedAscending) {
newProposedDateValue = [self roundDateUpForMinuteIntervalConstraint:correctedProposedDateValue];
} else {
newProposedDateValue = [self roundDateDownForMinuteIntervalConstraint:correctedProposedDateValue];
}
*proposedDateValue = newProposedDateValue;
}
- (NSDate *)roundDateUpForMinuteIntervalConstraint:(NSDate *)date {
return [self date:date roundedUpToMinutes:MINUTE_INTERVAL_CONSTRAINT_FOR_SESSIONS_START];
}
- (NSDate *)roundDateDownForMinuteIntervalConstraint:(NSDate *)date {
return [self date:date roundedDownToMinutes:MINUTE_INTERVAL_CONSTRAINT_FOR_SESSIONS_START];
}
- (NSDate *)date:(NSDate *)date roundedUpToMinutes:(int)minutes {
    // Strip fractional seconds by converting to int
    int referenceTimeInterval = (int)[date timeIntervalSinceReferenceDate];
    int remainingSeconds = referenceTimeInterval % (60 * minutes);
    int timeRoundedUpToMinutes = 0;
    if (remainingSeconds == 0) {
        timeRoundedUpToMinutes = referenceTimeInterval;
    } else {
        timeRoundedUpToMinutes = referenceTimeInterval - remainingSeconds + (60 * minutes);
    }
    return [NSDate dateWithTimeIntervalSinceReferenceDate:(NSTimeInterval)timeRoundedUpToMinutes];
}
- (NSDate *)date:(NSDate *)date roundedDownToMinutes:(int)minutes {
    // Strip fractional seconds by converting to int
    int referenceTimeInterval = (int)[date timeIntervalSinceReferenceDate];
    int remainingSeconds = referenceTimeInterval % (60 * minutes);
    int timeRoundedDownToMinutes = referenceTimeInterval - remainingSeconds;
    return [NSDate dateWithTimeIntervalSinceReferenceDate:(NSTimeInterval)timeRoundedDownToMinutes];
}
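The two rounding helpers reduce to integer arithmetic on seconds; a compact sketch mirroring the Objective-C logic above (Python, illustrative only):

```python
def round_up_to_minutes(seconds: int, minutes: int) -> int:
    """Round a timestamp (in seconds) up to the next multiple of `minutes`."""
    remainder = seconds % (60 * minutes)
    return seconds if remainder == 0 else seconds - remainder + 60 * minutes

def round_down_to_minutes(seconds: int, minutes: int) -> int:
    """Round a timestamp (in seconds) down to the previous multiple of `minutes`."""
    return seconds - seconds % (60 * minutes)

# 4050 s past the reference date (01:07:30) on a 15-minute grid:
print(round_up_to_minutes(4050, 15))    # 4500 -> 01:15:00
print(round_down_to_minutes(4050, 15))  # 3600 -> 01:00:00
```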

calculating directory size in cocoa

I want to calculate a directory (folder) size, and I have to list all files and folders (subfolders) in a volume (drive) with their corresponding sizes. I am using the code below to calculate the size. The problem with this code is performance. I am using NSBrowser to display the results.
NSArray *filesArray = [[NSFileManager defaultManager] subpathsOfDirectoryAtPath:folderPath error:nil];
NSEnumerator *filesEnumerator = [filesArray objectEnumerator];
NSString *fileName;
unsigned long long fileSize = 0;
while (fileName = [filesEnumerator nextObject]) {
    // The attributes must be fetched per file, not for the folder itself.
    NSDictionary *fileDictionary = [[NSFileManager defaultManager] attributesOfItemAtPath:[folderPath stringByAppendingPathComponent:fileName] error:nil];
    fileSize += [fileDictionary fileSize];
}
return fileSize;
Questions:
Is there any built in function available?
If not what is the best way to calculate the size?
Is it good to use cache to store already calculated file size?
Thanks...
You can use stat:
- (unsigned long long)getFolderSize:(NSString *)folderPath
{
    char *dir = (char *)[folderPath fileSystemRepresentation];
    DIR *cd;
    struct dirent *dirinfo;
    int lastchar;
    struct stat linfo;
    unsigned long long totalSize = 0;
    cd = opendir(dir);
    if (!cd) {
        return 0;
    }
    while ((dirinfo = readdir(cd)) != NULL) {
        if (strcmp(dirinfo->d_name, ".") && strcmp(dirinfo->d_name, "..")) {
            char *d_name = (char *)malloc(strlen(dir) + strlen(dirinfo->d_name) + 2);
            if (!d_name) {
                // out of memory
                closedir(cd);
                return totalSize;
            }
            strcpy(d_name, dir);
            lastchar = strlen(dir) - 1;
            if (lastchar >= 0 && dir[lastchar] != '/')
                strcat(d_name, "/");
            strcat(d_name, dirinfo->d_name);
            // lstat so that symbolic links are not followed
            if (lstat(d_name, &linfo) == -1) {
                free(d_name);
                continue;
            }
            if (S_ISDIR(linfo.st_mode)) {
                // recurse into subdirectories and accumulate their sizes
                totalSize += [self getFolderSize:[NSString stringWithCString:d_name encoding:NSUTF8StringEncoding]];
            } else if (S_ISREG(linfo.st_mode)) {
                totalSize += linfo.st_size;
            }
            free(d_name);
        }
    }
    closedir(cd);
    return totalSize;
}
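For comparison, the same traversal (skip symlinks, recurse into subdirectories, sum regular-file sizes) can be sketched in a few lines of Python:

```python
import os
import tempfile

def folder_size(path: str) -> int:
    """Sum the sizes of all regular files under path, skipping symlinks."""
    total = 0
    for root, dirs, files in os.walk(path):
        for name in files:
            full = os.path.join(root, name)
            if not os.path.islink(full):
                total += os.path.getsize(full)
    return total

# Usage: build a tiny directory tree and measure it.
d = tempfile.mkdtemp()
os.makedirs(os.path.join(d, "sub"))
with open(os.path.join(d, "a.bin"), "wb") as f:
    f.write(b"x" * 10)
with open(os.path.join(d, "sub", "b.bin"), "wb") as f:
    f.write(b"y" * 5)
print(folder_size(d))  # 15
```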
Take a look at Mac OS X not reporting directory sizes correctly?
1. Is there any built-in function available?
fileSize (on the attributes dictionary) is the built-in way to get a file's size; there is no built-in call that returns a whole directory's size.
2. If not, what is the best way to calculate the size?
The method above is good enough to calculate the size of a folder/directory.
3. Is it good to use a cache to store already calculated file sizes?
Yes, you can store them in a cache, as long as you invalidate entries when the files change.
