I am using Video Toolbox in iOS 8 to play an H.264 stream along with a PCM audio stream. Video is displayed fine as long as I don't start the audio stream. As soon as I call AudioQueueStart, all enqueueSampleBuffer calls stop displaying video and print the error "Ignoring enqueueSampleBuffer: because status is "failed"".
There are no errors returned from the CMBlockBuffer calls. enqueueSampleBuffer doesn't return anything, so I can't write code to recreate the video layer on this error.
This only happens on an iOS device, not on the Simulator; audio and video play perfectly fine there. I run audio and video on separate threads so that one doesn't block the other. Has anyone faced the same issue?
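(For anyone hitting the same message: the failed state can at least be detected by checking the display layer's status before enqueueing. A rough sketch, assuming an AVSampleBufferDisplayLayer property named videoLayer; this is an illustration, not the original code:)

#import <AVFoundation/AVFoundation.h>

- (void)enqueueBuffer:(CMSampleBufferRef)sampleBuffer
{
    // If the layer has entered the failed state, log its error and reset it.
    if (self.videoLayer.status == AVQueuedSampleBufferRenderingStatusFailed) {
        NSLog(@"Display layer failed: %@", self.videoLayer.error);
        [self.videoLayer flush]; // or tear the layer down and recreate it
    }
    if (self.videoLayer.isReadyForMoreMediaData) {
        [self.videoLayer enqueueSampleBuffer:sampleBuffer];
    }
}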
The problem was happening because I had the mute (silent) switch on the iPhone turned on.
I used AVAudioSessionCategoryPlayback for my audio session to overcome this.
This problem is resolved.
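For reference, this is roughly how the session is configured (a minimal sketch using the AVAudioSession API; error handling kept to a bare minimum):

#import <AVFoundation/AVFoundation.h>

// The Playback category keeps audio (and the video pipeline that depends on it)
// running even when the ring/silent switch is set to silent.
NSError *sessionError = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayback error:&sessionError];
[session setActive:YES error:&sessionError];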
I'm working on a project that produces sound alerts whenever a condition is met. The problem is that if the video player is running, the alert sound gets mixed with the video's audio. Is there any way to mute the audio of the running application only, from the terminal, so that the warning can be heard?
My operating system is Ubuntu.
I have a video device that exposes an MJPEG stream via a URL. For Windows there are utility apps that can "create" a system webcam device, usable by Skype or any other application, based on the URL this video device exposes.
Example: a smartphone broadcasts an MJPEG URL. A Windows computer can run a utility app to "create" a system webcam from the MJPEG stream, and that webcam can then be used in Skype. The video shown is what the phone is broadcasting.
I'm trying to do the same on my Mac, but I can't find any utility that creates a system webcam from an MJPEG stream. Googling isn't helping either; I'm not finding a solid solution, or anything I recognize as one.
Thanks!
I've had success with OBS Studio and its VLC and Virtual Camera plugins.
You can add a "VLC Video Source" and then click "Start Virtual Camera".
If you want to use it as a webcam, I recommend reducing the "Network Caching (ms)" setting in the VLC Video Source's settings as much as possible.
The hard-coded minimum value is 100 ms; you can lower it further by changing this line: https://github.com/obsproject/obs-studio/blob/7217671eb0812681a9f83858bb02065b671673e7/plugins/vlc-video/vlc-video-source.c#L1079
There is still significant delay with this method regardless, but it's better than not having it work at all.
Every time I'm working on a Core Audio application, specifically one that uses real-time audio via RemoteIO audio unit render callbacks, the moment I start the audio unit, whatever music I'm playing via YouTube or iTunes is muted. I have to reload the YouTube page or reset my System Preferences audio settings to get the sound back. Is there a solution to this?
Set the mix-with-others property on the audio session before starting the RemoteIO audio unit (this applies when running in the Simulator as well), and enable an appropriate AVAudioSession category.
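A rough sketch of the AVAudioSession route (the equivalent in the older C AudioSession API is the kAudioSessionProperty_OverrideCategoryMixWithOthers property); the play-and-record category here is just an assumption for a RemoteIO unit that both renders and records:

#import <AVFoundation/AVFoundation.h>

// Allow this app's RemoteIO output to mix with other audio instead of
// interrupting it. Run this before AudioOutputUnitStart().
NSError *error = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord
                                 withOptions:AVAudioSessionCategoryOptionMixWithOthers
                                       error:&error];
[[AVAudioSession sharedInstance] setActive:YES error:&error];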
It seems to me that EM::FileStreamer should be usable out of the box, but I tried it with the <video> tag and with an embedded QuickTime plugin, and neither one would actually show the video.
I've connected to my EM server with telnet and found that it does indeed stream my video file. I'm at a loss as to why it's not buffering and playing in my browser. Anyone have any hints for me?
That should work; however, the video must be encoded properly to stream in that fashion. I am assuming you're using an MP4? If so, have you run qt-faststart on the file?
Here's an article with a bit of info.
http://www.stoimen.com/blog/2010/11/12/how-to-make-mp4-progressive-with-qt-faststart/
I also have my sample video-encoding app on GitHub, which does this automatically for you when you upload videos.
https://github.com/zquestz/asset-manager
Just make sure qt-faststart is in your PATH. Once the index information is at the beginning of the file, things should work as expected.
Videos encoded with libtheora should work out of the box for supported browsers.
I am trying to get MPMoviePlayerController to work. I load a video, everything goes well, and I even see the first frame, but then it automatically pauses; if I press play it pauses again. In the Simulator it works perfectly, but on the iPad device I get this problem. I can even seek through the video and I see the frame I seeked to, but nothing plays. This is some output from the console:
2010-06-08 22:16:13.145 app[3089:207] Using two-stage rotation animation. To use the smoother single-stage animation, this application must remove two-stage method implementations.
[Switching to thread 12803]
warning: Unable to read symbols for "/Developer/Platforms/iPhoneOS.platform/DeviceSupport/3.2 (7B367)/Symbols/System/Library/VideoDecoders/VCH263.videodecoder" (file not found).
warning: Unable to read symbols for "/Developer/Platforms/iPhoneOS.platform/DeviceSupport/3.2 (7B367)/Symbols/System/Library/VideoDecoders/H264H2.videodecoder" (file not found).
warning: Unable to read symbols for "/Developer/Platforms/iPhoneOS.platform/DeviceSupport/3.2 (7B367)/Symbols/System/Library/VideoDecoders/MP4VH2.videodecoder" (file not found).
warning: Unable to read symbols for "/Developer/Platforms/iPhoneOS.platform/DeviceSupport/3.2 (7B367)/Symbols/System/Library/VideoDecoders/JPEGH1.videodecoder" (file not found).
2010-06-08 22:16:15.145 app[3089:207] setting file:///private/var/mobile/Applications/46CE5456-6338-4BBF-A560-DCEFF700ACE0/tmp/MediaCache/
I don't get those warnings when using the Simulator, by the way.
Does anyone know how to fix this?
Set the useApplicationAudioSession property of the MPMoviePlayerController to NO to resolve the problem.
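Something along these lines (a sketch; videoURL stands in for wherever the movie comes from):

// Give the movie player its own (system-supplied) audio session instead of
// sharing the application's audio session.
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
player.useApplicationAudioSession = NO;
[player play];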
Found the solution: just restart the iPad and it works again. Weird, but that's it :)
Had the same problem. The video was playing OK in the Simulator but not on the device. The problem was either in my HTML5 code embedded in the UIView or in the MP4 video compression; I don't know which fixed it, but I suggest you try both. I am still getting the error when I test the video on the device, but the video plays just fine!
I had an issue on the device where the video would show up but not play, although I could scrub. The fix for me was that I was using an AVAudioRecorder and releasing it before playing the video without stopping the recorder. My solution was to add the stop call to the recorder before starting the video:
[recorder stop];    // stop the AVAudioRecorder before starting video playback
[recorder release]; // then release it (pre-ARC manual memory management)
AVAudioPlayer was causing my problem. Apparently, on the iPad, AVAudioPlayer and MPMoviePlayerController cannot both play at the same time.
If an AVAudioPlayer object is open, the MPMoviePlayerController will only display a frame and immediately stop playing.
As far as I can tell this only happens on an iPad device with 3.2.1 and SDK 4.0.1; the simulators and the iPhone work fine.
I switched back to AudioServices since I need the audio player and the movie player to play at the same time.
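For reference, a sketch of the System Sound Services approach for short alert sounds (soundFileURL is an assumed NSURL pointing to a short .caf or .wav file):

#import <AudioToolbox/AudioToolbox.h>

// Short alert sounds played through System Sound Services can run alongside
// MPMoviePlayerController without fighting over the audio session.
SystemSoundID alertSound;
AudioServicesCreateSystemSoundID((CFURLRef)soundFileURL, &alertSound);
AudioServicesPlaySystemSound(alertSound);
// ... and when it is no longer needed:
// AudioServicesDisposeSystemSoundID(alertSound);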