I'm just getting started with tvOS and was wondering if anyone has addressed looping in tvOS / TVJS / TVML. I'm using this tutorial to create a simple interface for playing videos. My wrinkle is that I want these videos to play in a continuous, seamless loop - basically a moving-screensaver type effect - see the example videos at art.chrisbaily.com.
Is there a simple way to do this, or do I need to build some kind of event listener to handle the looping manually? I'd like the videos to be fairly high-res, and each video would be somewhere between 1 and 3 minutes long.
I was looking for an answer to this question as well, and found that the only way to do this is to create an event listener and add a duplicate media item to the playlist. Honestly, it is not that hard, provided you followed the tutorial you listed in your post.
So the code would be something like
player.playlist = playlist;
player.playlist.push(mediaItem);
// Push the same media item again, so the playlist now holds two copies of it.
player.playlist.push(mediaItem);
player.present();
This will make sure that once your first video ends, the second one starts playing, which is essentially a loop. For the third playthrough and onward, you implement an event listener for the "mediaItemWillChange" event. This makes sure that each time a video is about to end, a new copy of the same video is pushed onto the playlist.
player.addEventListener("mediaItemWillChange", function(e) {
    player.playlist.push(mediaItem);
});
Just register the event listener before you start the player with
player.present();
Note that a question along these lines was already asked and answered on Apple's discussion board. I merely took the idea, implemented it in my own project, and now that I know it works I am posting the solution. I found that if I pop the first video off the playlist and then push a new one onto it, as suggested in the linked thread below, my videos did not loop. Below is the link to the original post.
How to repeat video with TVJS Player?
You could also set repeatMode on the playlist object:
player.playlist.repeatMode = 1;
0 = no repeat
1 = repeat all items in playlist
2 = repeat current item
There's really no need to push a second media item onto the playlist. Simply listen for the media to reach its end, then set the seek time back to zero. Here's the code.
Play the media.
private func playVideo(name: String) {
    guard name != "" else {
        return
    }
    let bundle = NSBundle(forClass: object_getClass(self))
    let path = bundle.pathForResource(name, ofType: "mp4")
    let url = NSURL.fileURLWithPath(path!)
    self.avPlayer = AVPlayer(URL: url)
    if self.avPlayerLayer == nil {
        self.avPlayerLayer = AVPlayerLayer(player: self.avPlayer)
        self.avPlayerLayer.frame = previewViews[1].frame
        previewViews[1].layer.addSublayer(self.avPlayerLayer)
        avPlayer.actionAtItemEnd = .None
        // Listen for AVPlayerItemDidPlayToEndTimeNotification
        NSNotificationCenter.defaultCenter().addObserver(self, selector: "playerItemDidReachEnd:", name: AVPlayerItemDidPlayToEndTimeNotification, object: self.avPlayer.currentItem)
    }
    avPlayer.play()
}
Play Again
func playerItemDidReachEnd(notification: NSNotification) {
    let item = notification.object as? AVPlayerItem
    item?.seekToTime(kCMTimeZero)
}
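One more thing worth noting: if you register for the notification in playVideo, remember to remove the observer when this object goes away, otherwise the notification center keeps a stale reference. A minimal sketch, assuming the observer lives in the same class (the deinit placement is my assumption):
deinit {
    // Stop listening for end-of-playback notifications when this object is deallocated.
    NSNotificationCenter.defaultCenter().removeObserver(self)
}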
Related
I'm making an application like TikTok, using IGListKit and AVPlayer to play the videos. I'm caching the videos, then displaying them and starting playback once a cell occupies the complete screen. There is a 500 millisecond delay when doing so, but it needs to work as smoothly as TikTok.
I'm using the code below in scrollViewDidEndDecelerating so that, when scrolling ends, it checks for the visible cell that occupies the complete screen of the collection view:
let collectionViewVisibleRect = getCollectionViewVisibleRect()
for visibleIndexPath in collectionView.indexPathsForVisibleItems {
    if let cell = collectionView.cellForItem(at: visibleIndexPath) as? VideoFeedCollectionViewCell {
        if collectionViewVisibleRect.contains(cell.frame) {
            cell.queuePlayer?.play()
            cell.imgPlay.isHidden = true
        } else {
            cell.queuePlayer?.pause()
        }
    }
}
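For reference, getCollectionViewVisibleRect() isn't shown above; a plausible sketch of that helper (an assumption on my part, matching how it's compared against cell frames, which are in content coordinates):
func getCollectionViewVisibleRect() -> CGRect {
    // The visible slice of the content: origin at the current scroll offset, size of the viewport.
    return CGRect(origin: collectionView.contentOffset, size: collectionView.bounds.size)
}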
When playing directly from the cell the playback is smooth, but then multiple videos end up playing at the same time.
We've been working with Audio Units in Core Audio. It is simultaneously a very powerful audio framework and one of the worst documented, which makes it both a joy and a frustration to work with.
We want to accomplish something we know iPads have been able to do since iOS 6.0: multiple audio inputs.
So far - going by the 2012 developer talk - it appears you have to set the audio session category to MultiRoute. We've done this. If I plug in a soundcard from a keyboard, I can see that there are two inputs. Great. We're then told that we need to set a channel map on a Remote I/O unit.
To what? Well... here's where it gets vague. We need to set all the channels we don't want to -1 and the channels we do want to 0 and 1 (for stereo input, or for mono?).
We attempt this and... nothing. Sound still comes through on a 'last in wins' principle: the microphone if everything is unplugged, the soundcard if that's what's plugged in. But we can't switch between them.
This setup code is always run before the other function listed
func setupAudioSession() {
    self.audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(AVAudioSessionCategoryMultiRoute, with: [.mixWithOthers])
        try audioSession.setActive(true)
        audioSessionWasSetup = true
    } catch let error {
        //TODO: Implement something here
        print(error)
        audioSessionWasSetup = false
    }
}
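To sanity-check that MultiRoute actually sees both devices, we can enumerate the session's inputs after activating it. A quick sketch (logging only; nothing here changes the routing):
// List every input port the session currently knows about (built-in mic, USB soundcard, ...).
if let inputs = audioSession.availableInputs {
    for port in inputs {
        print("Input: \(port.portName) (\(port.portType)), channels: \(port.channels?.count ?? 0)")
    }
}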
We then have a Remote I/O unit with an associated audio graph set up. This has been tested and works beautifully. But we need to be able to set where it pulls sound from.
I've attempted to do it with the code below, but it doesn't have any effect - nothing happens at all.
Am I missing something?
private func setChannelMap(onAudioUnit audioUnit: AudioUnit?, toChannel channelIndex: Int = 0) {
    guard let audioUnit = audioUnit else {
        return
    }
    let numberOfInputChannels: UInt32 = 4 // Two stereo inputs? - I'm just guessing here
    let mapSize = numberOfInputChannels * UInt32(MemoryLayout<Int32>.size)
    // Start with every channel disabled (-1)...
    var channelMap = [Int32](repeating: -1, count: Int(numberOfInputChannels))
    // ...then enable the stereo pair we want as channels 0 and 1.
    channelMap[2 * channelIndex] = 0
    channelMap[2 * channelIndex + 1] = 1
    let status = AudioUnitSetProperty(audioUnit,
                                      kAudioOutputUnitProperty_ChannelMap,
                                      kAudioUnitScope_Input,
                                      0,
                                      &channelMap,
                                      mapSize)
    self.checkError(status, "Failed to set Channel Map on input unit")
}
There isn't any documentation on this at all, as far as I've been able to find, nor any code examples.
I hope you can help us.
I'm trying to use the SystemMediaTransportControls in a background audio app. I am using the MediaPlayer class to play the audio. Setting the music properties and the thumbnail all seems to work fine, but enabling the control buttons (i.e. the "Next" button) is not working at all. My use case is somewhat unique in that I can't get a complete playlist at once; the next track is only available through an internal method call.
Here is what I am doing:
This part is working fine, the volume control shows all the audio information and thumbnail correctly:
var playbackItem = new MediaPlaybackItem(source);
var displayProperties = playbackItem.GetDisplayProperties();
displayProperties.Type = Windows.Media.MediaPlaybackType.Music;
displayProperties.Thumbnail = RandomAccessStreamReference.CreateFromUri(new Uri(_currentTrack.AlbumArtUrl));
displayProperties.MusicProperties.AlbumArtist = displayProperties.MusicProperties.Artist = _currentTrack.Artist;
displayProperties.MusicProperties.Title = _currentTrack.SongTitle;
displayProperties.MusicProperties.AlbumTitle = _currentTrack.Album;
playbackItem.CanSkip = true;
playbackItem.ApplyDisplayProperties(displayProperties);
_player.Source = playbackItem;
This part is not working: the "Next" button is still disabled, and the "Record" button is not showing.
var smtc = _player.SystemMediaTransportControls;
smtc.ButtonPressed += OnSMTCButtonPressed;
smtc.IsEnabled = true;
smtc.IsNextEnabled = true;
smtc.IsRecordEnabled = true;
I've been looking for answers online but was unable to find anything useful. Any answer is appreciated.
In UWP, apart from the SMTC itself, there is also the CommandManager; to control the SMTC manually, you have to disable it. Just put the line:
mediaPlayer.CommandManager.IsEnabled = false;
once you initialize the player, and it should work. You will find more information at MSDN:
If you are using MediaPlayer to play media, you can get an instance of the SystemMediaTransportControls class by accessing the MediaPlayer.SystemMediaTransportControls property. If you are going to manually control the SMTC, you should disable the automatic integration provided by MediaPlayer by setting the CommandManager.IsEnabled property to false.
I am very new to both programming and Swift (please be kind!). I have a working example of how to generate a random image (by assigning a number and using Int(arc4random_uniform(x))). However, I do not know how to say "if this image is shown, then assign this sound to the button," so that every time an image is displayed it has a matching sound to go with it. With my limited experience I can only think of writing long-winded if statements?
Answer
Pass the name of the image to an NSMutableString, without the extension (.png, or just remove the extension). Then (hopefully the audio file has the same name as the image) you can build the audio file name from the mutable string.
Explanation
For example: if you have an image named Dog.png, name the audio file that goes with the dog image Dog.mp3 (.mp3 is just an example). When the user selects the dog image, pass the name to a mutable string. Let's call the mutable string selectedImage.
Right now selectedImage holds the full file name, Dog.png (or whatever extension you have). Remove the last 3 letters of selectedImage like this:
func deleteCharactersInRange(_ aRange: NSRange)
Next, you want to build the audio file name from selectedImage by appending the audio extension like this:
func appendString(_ aString: String)
So selectedImage's value should now be Dog.mp3, and you can now load the audio file using selectedImage.
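Putting the string handling together, a small sketch using the running example (the names here are just illustrative):
let selectedImage = NSMutableString(string: "Dog.png")
// Delete the trailing "png" (the last 3 characters)...
selectedImage.deleteCharactersInRange(NSRange(location: selectedImage.length - 3, length: 3))
// ...then append the audio extension, leaving "Dog.mp3".
selectedImage.appendString("mp3")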
Hope this helps!!
Edit
Here is code to play audio files:
var player: AVAudioPlayer! = nil // will be Optional, must supply initializer
let path = NSBundle.mainBundle().pathForResource("audioFile", ofType: "m4a")
let fileURL = NSURL(fileURLWithPath: path!) // path is optional - this crashes if the resource is missing
player = AVAudioPlayer(contentsOfURL: fileURL, error: nil)
player.prepareToPlay()
player.delegate = self // requires this class to conform to AVAudioPlayerDelegate
player.play()
I'm currently writing a small application which takes a folder containing many short video files (~1 min each) and plays them as if they were ONE long video file.
I've been using AVQueuePlayer to play them one after another, but I was wondering whether there is an alternative, because I'm running into some problems:
there is a small but noticeable gap when the player switches to the next file
I can't go back to the previous video file without having to remove all the items in the queue and put them back
I'd like to be able to go to any point in the video, just as if it were a single video file. Is AVPlayer the best approach for this?
I realize that it's been about 6 years since this was asked, but I found a solution to this shortly after seeing this question and maybe it will be helpful to someone else.
Instead of using an AVQueuePlayer, I combined the clips into an AVMutableComposition (a subclass of AVAsset), which I could then play in a normal AVPlayer.
let assets: [AVAsset] = urlsOfVideos.map(AVAsset.init)
let composition = AVMutableComposition()
let compositionVideoTrack = composition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
let compositionAudioTrack = composition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)
var insertTime = CMTime.zero
for asset in assets {
    let range = CMTimeRange(start: .zero, duration: asset.duration)
    guard let videoTrack = asset.tracks(withMediaType: .video).first,
          let audioTrack = asset.tracks(withMediaType: .audio).first else {
        continue
    }
    // Carry the source track's transform over so rotated clips keep their orientation.
    compositionVideoTrack?.preferredTransform = videoTrack.preferredTransform
    try? compositionVideoTrack?.insertTimeRange(range, of: videoTrack, at: insertTime)
    try? compositionAudioTrack?.insertTimeRange(range, of: audioTrack, at: insertTime)
    insertTime = CMTimeAdd(insertTime, asset.duration)
}
Then you create the player like this
let player = AVPlayer(playerItem: AVPlayerItem(asset: composition))
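Because the composition behaves like one long asset, seeking works across clip boundaries as if it were a single file; for example (the 90-second target is arbitrary):
// Jump 90 seconds into the combined timeline, wherever that falls among the clips.
player.seek(to: CMTime(seconds: 90, preferredTimescale: 600))
player.play()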