I have Mac OS X 10.6 (Snow Leopard), and according to Apple's guide on HTTP Live Streaming, the mediafilesegmenter tool should be in the /usr/bin/ directory, but it isn't. Only mediastreamsegmenter is there. I need the mediafilesegmenter tool to segment MPEG-2 transport streams.
How can I get/install the mediafilesegmenter tool?
Thanks and regards,
Farish
The mediastreamsegmenter tool will also segment a file - for example:
mediastreamsegmenter -b http://some_playback_host -B stream -f /Library/WebServer/Documents/my_stream -p < input_file.ts
(make sure you have created the output directory "my_stream" first).
The above should segment the file into the default 10-second chunks prefixed with the name 'stream', and also create a .m3u8 file you can point clients at.
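For reference, the generated index is just a plain-text playlist along these lines (the segment names and durations here are assumptions; the exact defaults depend on the tool version and the options you pass):

#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10,
http://some_playback_host/stream0.ts
#EXTINF:10,
http://some_playback_host/stream1.ts
#EXTINF:10,
http://some_playback_host/stream2.ts
#EXT-X-ENDLIST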
man mediastreamsegmenter for more info.
HTH.
You can download the current version of the HTTP Live Streaming Tools from the Apple Developer website; they are available to members of either the iPhone or Mac Developer Program. One way to navigate to the tools is to log in to connect.apple.com, then click either iPhone or QuickTime under the Downloads heading.
With my webcam plugged in, I'm able to manipulate the video/audio stream in my Mac OS app.
Now I'd like to output it as a new virtual video/audio device that I can select as a camera input in apps like "Skype" or QuickTime.
I've looked into i/o kit framework and the reference webpage says this:
"To add digital video capabilities to your software, use the QuickTime
APIs."
I believe this needs to be updated because QuickTime APIs have been replaced by CoreMedia IO.
So I looked into CoreMediaIO and found sample code on the Apple dev website, but it is also obsolete and won't run with Xcode 7.x on OS X Yosemite or later.
I've also looked into AVFoundation, but it seems like a dead end.
I'm lost at this point. I know it's doable since CamTwist software is doing it.
Does anyone have an idea how to approach this?
CoreMediaIO is definitely the way to go, as that's what Apple currently uses in its hardware. On my system (2015 rMBP), /Library/CoreMediaIO/Plug-Ins/DAL/ contains AppleCamera.plugin and iOSScreenCapture.plugin, for the webcam and capturing from iDevice.
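You can see what's installed on your own machine with a quick directory listing; a third-party virtual camera would drop its plug-in in the same place:

ls /Library/CoreMediaIO/Plug-Ins/DAL/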
I assume the example you're referring to is this one?
It doesn't quite compile out of the box, but I got it to build with the OS X 10.11 SDK eventually. You need Apple's Core Audio Utility Classes: point the 'Sources/Extras/CoreAudio/PublicUtility' group in the Xcode project at those, then fix a variable initialisation (remove the = NULL where it complains about a private constructor) and comment out a few lines in SamplePrefix.h. I haven't run it, but I see no reason why it wouldn't work. If you don't have a kext signing certificate, you may need to take steps to load unsigned kexts to run the sample.
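Once it builds, the rough install steps look like this (the bundle names below are hypothetical placeholders for whatever the sample actually produces; on Yosemite unsigned kexts need the kext-dev-mode boot-arg, while on El Capitan kext signing is governed by SIP, configured via csrutil from the Recovery OS):

# hypothetical names, substitute the sample's real build products
sudo cp -R Sample.plugin /Library/CoreMediaIO/Plug-Ins/DAL/
sudo nvram boot-args="kext-dev-mode=1"   # Yosemite only, then reboot
sudo chown -R root:wheel Sample.kext
sudo kextload Sample.kext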
Is your webcam using an old video digitizer (vdig) driver (the driver's .component file would be in /Library/QuickTime)? I was able to see my UVC camera and DAL camera in QuickTime Player. My understanding is that apps written with AVFoundation will not recognize an old vdig driver. By contrast, apps written using Sequence Grabber (very old) or QTCapture (old) will recognize your device.
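A quick way to check for an old vdig driver is to look for a .component bundle in the QuickTime folder:

ls /Library/QuickTime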
Hope this helps.
I was under the impression that AV Foundation doesn't support third-party codecs. If I try to open an Avid DNxHD QuickTime movie in my application it doesn't work, as expected.
However, if I open a DNxHD movie in QuickTime Player X (which also uses AVF) it opens and plays back fine and even says it's a DNxHD in the Info window.
Does anyone know how Apple is achieving this?
I believe that QuickTime Player X is built on QT X, not on top of AVF. In fact, I think it uses (something like) QTKit, which will launch a 32-bit proxy process if necessary to open files that are only supported by 32-bit codecs.
You can access QT X via the QTKit framework, if you specify the 'playback only' attribute when you open the file, but then you can only play it back (as the flag suggests!). Even enumerating the number of tracks will fail. If you don't specify playback-only, then you are limited to QT7 components, accessed via a 32-bit proxy if your app is 64-bit.
I need to be able to play a RealAudio (.RA) file from Xcode. If I cannot play the file directly from the URL, it is OK if I can download it and then play it. The primary help I am looking for is how to play a RealNetworks RealAudio (.RA) file in Xcode 4.3 under iOS 5.0.
I haven't come across an iOS package for playing RealAudio files; however, iOS does support some of the RealAudio codecs. Specifically, it should be able to play the audio streams from .RA files encoded with the raac and racp codecs. iOS won't know what to do with the RealAudio container, so you will have to work out how to extract the audio stream from the file. This website contains details about the container.
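As a sketch of that extraction step (assuming you do it off-device, with an FFmpeg build that has RealMedia support): probe the file first, and if the audio stream is reported as AAC (the raac case) you can remux it without re-encoding into a container iOS understands; otherwise you would have to re-encode.

ffmpeg -i input.ra                       # probe: what codec is the audio stream?
ffmpeg -i input.ra -c:a copy output.m4a  # remux if the stream is already AAC
ffmpeg -i input.ra -c:a aac output.m4a   # otherwise re-encode (older builds may need -strict experimental)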
I want to be able to run HTTP Live Streaming from a server, so that I can play back the files on my iPhone via HTTP. I know it's possible to play media files through Safari without live streaming, but I'd like to give it a go.
As far as I can tell, the only tools available for converting media files into the formats required for the live stream are for Mac OS X. I don't have a Mac, and I'm wondering if there are any equivalent tools for Windows?
FFmpeg offers builds for Mac, Windows, and Linux: http://ffmpeg.org/download.html
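For example, builds of FFmpeg that include the hls muxer can transcode and segment in one step; something along these lines (the codec settings are assumptions, adjust them to your source material):

ffmpeg -i input.mp4 -c:v libx264 -c:a aac -f hls -hls_time 10 -hls_list_size 0 stream.m3u8

That writes numbered .ts segments next to stream.m3u8, which is the playlist you point the iPhone at.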
I want to create cursor rsrc files on the Mac from PNG files. The application that uses the cursors requires them to be in .rsrc format, and I cannot change that. Does anybody know of any way I can create the cursor .rsrc files from PNG images?
You can use Rezilla to edit resource files on Mac OS X; it has a CURS (and crsr) editor, among others. It's a PowerPC binary, but it runs well under Rosetta on Intel.
Also, you don't create a CURS resource file; you create a resource file and add as many CURS resources to it as you need. Resource forks are generic and can contain any number/kind of resources.
It's been a long time since I've thought about Mac OS resource forks. Are you using the classic Mac OS (i.e. before Mac OS X)?
As I recall, ResEdit was the application most often used to manipulate the resource fork of a classic Mac application. I know it can edit cursor resources, but I don't recall if it can read PNG files. You may need to convert the files to GIF.
ResEdit is a classic Mac OS application. Mac OS X prior to 10.5 could run Classic apps in emulation, but in 10.5 that support was removed. You'd need to find a system either running the classic Mac OS directly, or running 10.4 with Classic installed.
According to this link http://www.macfixit.com/article.php?story=20060621071707921 I need a PowerPC Mac to run Classic. Is this right? I have an Intel Mac running Mac OS X 10.4.11. Are there any other tools that run on an Intel Mac and could help me create CURS rsrc files? I tried using ResKnife, but it didn't seem to have an option to create CURS resources.
If by .rsrc file you mean a standard Mac resource file, you can use the Resource Manager to save the image in a file of the appropriate format.
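For reference, a 'CURS' resource is only 68 bytes: 32 bytes of 1-bit image data (16x16), 32 bytes of mask, and a 4-byte hotspot Point. In Rez/DeRez source form (the command-line resource compiler that ships with the developer tools) it looks roughly like this; the hex payload below is a placeholder you would fill in from your PNG:

data 'CURS' (128, "MyCursor") {
    /* 16x16 1-bit image, 32 bytes */
    $"FFFF FFFF FFFF FFFF FFFF FFFF FFFF FFFF"
    $"FFFF FFFF FFFF FFFF FFFF FFFF FFFF FFFF"
    /* 16x16 1-bit mask, 32 bytes */
    $"FFFF FFFF FFFF FFFF FFFF FFFF FFFF FFFF"
    $"FFFF FFFF FFFF FFFF FFFF FFFF FFFF FFFF"
    /* hotspot (v, h) */
    $"0000 0000"
};

Compile it with Rez, adding -useDF if the consuming application expects the resources in the data fork rather than the resource fork:

/Developer/Tools/Rez cursor.r -o MyCursors.rsrc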