With the Google Cast iOS SDK (Chromecast), is it possible to play a video with a separate audio stream?

I have an MP4 URL that contains only video, plus a separate audio track for it. I can play either one by changing the "main" stream URL and the corresponding content type, but obviously I want both, not one or the other.
The core URL (a silly video) is https://v.redd.it/3hyw7hwoajn21/DASHPlaylist.mpd
The video-only MP4 is at https://v.redd.it/3hyw7hwoajn21/DASH_720 and its corresponding audio track is at https://v.redd.it/3hyw7hwoajn21/audio
If I play the MP4 with the iOS SDK, the video plays fine, but there is no audio:
let url = URL(string: "https://v.redd.it/3hyw7hwoajn21/DASH_720")!
let mediaInfoBuilder = GCKMediaInformationBuilder(contentURL: url)
mediaInfoBuilder.contentID = url.absoluteString
mediaInfoBuilder.streamType = .buffered
mediaInfoBuilder.streamDuration = TimeInterval(75)
mediaInfoBuilder.contentType = "video/mp4"
mediaInfoBuilder.metadata = metadata
let mediaInfo = mediaInfoBuilder.build()
So I try to add the audio track before calling build(), attempting to follow the documentation:
mediaInfoBuilder.mediaTracks = [GCKMediaTrack(identifier: 98911, contentIdentifier: nil, contentType: "audio/mp4", type: GCKMediaTrackType.audio, textSubtype: GCKMediaTextTrackSubtype.unknown, name: "Fun time fun", languageCode: "en", customData: nil)]
But the result is the same: no audio.
Am I doing this wrong?

The audio and video streams have to be in the same manifest for the SDK to support them; if they are not, this is not supported by the SDK. In general, Chromecast hardware is limited to a single media element. Some apps have managed to add sound effects while reading a book, which might use WebAudio, but that is done entirely in the app.
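Given that constraint, one possible workaround is to cast the DASH manifest from the question instead of the bare MP4 rendition, since the manifest already ties the video and audio streams together. A minimal sketch, assuming the receiver app can handle DASH (the default media receiver generally can):
let url = URL(string: "https://v.redd.it/3hyw7hwoajn21/DASHPlaylist.mpd")!
let mediaInfoBuilder = GCKMediaInformationBuilder(contentURL: url)
mediaInfoBuilder.streamType = .buffered
// Cast the manifest, which references both the video and the audio representations.
mediaInfoBuilder.contentType = "application/dash+xml"
mediaInfoBuilder.metadata = metadata
let mediaInfo = mediaInfoBuilder.build()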

Related

How to best compress videos using PHP-FFMpeg?

I'm using the PHP-FFMpeg v1.1.0 library to compress videos sent from a variety of mobile devices into 480p (640x480) MP4 videos.
Everything is working great, but I'm wondering how to reach an optimal compression level. It looks like I'm already using the recommended video bitrate for 480p, as well as a fairly low audio bitrate (64 kbps).
Am I missing any best practices/options here?
My code:
$videoFile = '/tmp/source.avi';
$destVideoFile = '/tmp/'.uniqid().'.mp4';
$ffmpeg = FFMpeg\FFMpeg::create();
$video = $ffmpeg->open($videoFile);
$video->filters()->resize(new FFMpeg\Coordinate\Dimension(640, 480), 'width')->synchronize();
$codec = new FFMpeg\Format\Video\X264();
$codec->setKiloBitrate(960)->setAudioKiloBitrate(64);
$video->save($codec, $destVideoFile);
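One refinement worth trying is to let x264 target a quality level via CRF instead of forcing a fixed 960 kbps. This is a sketch only, and it assumes your PHP-FFMpeg release exposes setAdditionalParameters() on the format object (older versions may not):
$codec = new FFMpeg\Format\Video\X264();
$codec->setAudioKiloBitrate(64);
// Constant-rate-factor encoding: quality-targeted instead of a fixed bitrate.
$codec->setAdditionalParameters(array('-crf', '23', '-preset', 'slow'));
$video->save($codec, $destVideoFile);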

In what way does this HEVC video not comply with Apple's requirements document?

My goal is to work out why a given video file does not play on macOS/Safari/QuickTime.
The background to this question is that it is possible to play HEVC videos with a transparent background/alpha channel on Safari/MacOS. To be playable, a video must meet the specific requirements set out by Apple in this document:
https://developer.apple.com/av-foundation/HEVC-Video-with-Alpha-Interoperability-Profile.pdf
The video that does not play on macOS/Safari/QuickTime is an HEVC video with an alpha transparency channel. Note that VLC for macOS DOES play this file. Here it is:
https://drive.google.com/file/d/1ZnXjcDbk-_YxTgRuH_D7RSR9SXdY_XTv/view?usp=share_link
I have two example HEVC video files with a transparent background/alpha channel, and they both play fine in either QuickTime Player or Safari:
Working video #1:
https://drive.google.com/file/d/1PJAyg_sVKVvb-Py8PAu42c1qm8l2qCbh/view?usp=share_link
Working video #2:
https://drive.google.com/file/d/1kk8ssUyT7qAaK15afp8VPR6mIWPFX8vQ/view?usp=sharing
The first step is to work out in what way my non-working video ( https://drive.google.com/file/d/1ZnXjcDbk-_YxTgRuH_D7RSR9SXdY_XTv/view?usp=share_link ) does not comply with the specification.
Once it is clear which requirements are not met by the non-working video then I can move onto the next phase, which is to try to formulate an ffmpeg command that will output a video meeting the requirements.
I have read Apple's requirements document and I am out of my depth in trying to analyse the non-working video against the requirements; I don't know how to do it.
Can anyone suggest a way to identify what is wrong with the video?
Additional context: I am trying to find a way to create Apple/macOS-compatible alpha-channel/transparent videos using ffmpeg with hevc_nvenc on an Intel machine. I am aware that Apple hardware can create such videos, but for a wide variety of reasons it is not practical for me to use Apple hardware for this job. I have spent many hours trying all sorts of ffmpeg and ffprobe commands to work out what is wrong and to modify the video to fix it, but to be honest most of my attempts are guesswork.
The Apple specification for an alpha layer in HEVC requires that the encoder process and store the alpha channel in a certain manner, and it also requires that the stream configuration syntax be formed in a specific manner. At the time of writing, I'm aware of only the VideoToolbox HEVC encoder being capable of emitting such a stream.
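As a concrete starting point for the analysis, one approach (a sketch; working.mov and notworking.mov stand in for the Google Drive files above, and the exact fields shown depend on the ffprobe build) is to dump the stream-level metadata of a working file and the non-working file and diff the two:
ffprobe -v error -show_format -show_streams working.mov > working.txt
ffprobe -v error -show_format -show_streams notworking.mov > notworking.txt
diff working.txt notworking.txt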

How to use h264 live stream with websocket?

Most websocket examples I have seen use either mp4 or wbem container data. Here is some sample javascript client code:
var ms = new MediaSource();
...
var buf = ms.addSourceBuffer('video/mp4; codecs="avc1.64001E"');
In my case, my server sends raw H.264 data (video only, no audio). As there is no MP4/AVC container for my data, I am wondering what the proper way is to define the parameter for addSourceBuffer(). Do I simply omit the video/mp4 tag, as follows?
var buf = ms.addSourceBuffer('codecs="avc1.64001E"');
I worked on an H.264 player based on MediaSource several months ago. I didn't expect to get upvotes so long after the original answer, so I think I should edit this post to be more helpful. By the way, I'm not a pro; this post is just based on my experience of using the MediaSource API. Comments to correct me are welcome. Thanks!
var buf = ms.addSourceBuffer('video/mp4; codecs="avc1.64001E"');
After buf is created, I think it expects a fragmented MP4 (fMP4) data chunk each time SourceBuffer.appendBuffer is called.
However, you passed raw H.264 data to it, which I think should cause the browser to throw an exception.
In my case, I used ffmpeg to read from an RTSP stream, convert the data to the fMP4 format (without re-encoding), and send the output to stdout, and then let another application send the data to the browser (over WebSocket, in fact).
Here are the parameters:
ffmpeg -i rtsp://example.com/ -an -c:v copy -f mp4 \
-movflags +frag_keyframe+empty_moov+default_base_moof pipe:1
There's one more thing I want to share. I'm not sure how ffmpeg works internally, but it doesn't output a complete fragment each time I read from stdout, so in my backend program I cached the data first. Here's pseudocode in Java:
byte[] oldbuf;
byte[] buffer = ReadDataFromFfmpegStdout();
if (buffer[4] == 'm' && buffer[5] == 'o' && buffer[6] == 'o' && buffer[7] == 'f') {
    send(oldbuf); // the old buffer is a completed fragment now
    oldbuf = buffer;
} else {
    append(oldbuf, buffer); // append data to the old buffer
}
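On the browser side, a minimal sketch of consuming those fragments (the video element lookup, the queueing strategy, and the ws://localhost:8080/stream endpoint are all assumptions for illustration):
var video = document.querySelector('video');
var ms = new MediaSource();
video.src = URL.createObjectURL(ms);
ms.addEventListener('sourceopen', function () {
  var buf = ms.addSourceBuffer('video/mp4; codecs="avc1.64001E"');
  var queue = [];
  buf.addEventListener('updateend', function () {
    if (queue.length > 0 && !buf.updating) buf.appendBuffer(queue.shift());
  });
  var ws = new WebSocket('ws://localhost:8080/stream'); // hypothetical endpoint
  ws.binaryType = 'arraybuffer';
  ws.onmessage = function (event) {
    // Each WebSocket message is assumed to be one complete fMP4 fragment.
    if (buf.updating || queue.length > 0) {
      queue.push(event.data);
    } else {
      buf.appendBuffer(event.data);
    }
  };
});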
[ORIGINAL ANSWER]
You may check out the project 131/h264-live-player on GitHub, which is based on mbebenita/Broadway, a JavaScript H.264 decoder.
The node server-static.js example streams a raw H.264 video over WebSocket, and the client code renders it in a canvas. Clone that repo, follow the installation instructions, put your H.264 file in the samples folder, change video_path to your video file in server-static.js#L28, run node server-static.js, and you will see the video play in your browser.
Please be aware that Broadway only works with the H.264 baseline profile.

Video playback using MediaStreamSource, MediaElement on WP7

We have been working on a streaming application for raw H.264 and AAC content. We are using MediaStreamSource to feed samples to MediaElement and observe no issues when we use PC Silverlight (on IE9) for audio/video playback. An audio-only stream also works fine on WP7. However, we face the following problems with video playback on WP7:
• When the video stream attributes for MediaStreamSource are initialized without CodecPrivateData, the MediaElement "Failed" event handler is called with error code 3100. The video stream attributes are initialized as:
Dictionary<MediaStreamAttributeKeys, string> videoStreamAttributes = new Dictionary<MediaStreamAttributeKeys, string>();
videoStreamAttributes[MediaStreamAttributeKeys.VideoFourCC] = "H264";
this.videoStreamDescription = new MediaStreamDescription(MediaStreamType.Video, videoStreamAttributes);
• When the video stream attributes for MediaStreamSource are initialized with CodecPrivateData ([start code][SPS][start code][PPS]), the video plays but seems to play at a much faster rate, 2 to 3 times the specified FPS. The video stream attributes are initialized as:
Dictionary<MediaStreamAttributeKeys, string> videoStreamAttributes = new Dictionary<MediaStreamAttributeKeys, string>();
videoStreamAttributes[MediaStreamAttributeKeys.VideoFourCC] = "H264";
videoStreamAttributes[MediaStreamAttributeKeys.CodecPrivateData] = "000000012742000D96540A0FD8080F162EA00000000128CE060C88";
this.videoStreamDescription = new MediaStreamDescription(MediaStreamType.Video, videoStreamAttributes);
Note that the same streams play fine on PC Silverlight, with and without CodecPrivateData, for audio as well as video.
Is there something wrong with the way the video stream attributes are initialized? What could be causing this problem, and how can we resolve it?
Regards,
NKS.
The problem here was the clock being used for the timestamps. Our application calculated timestamps against a 90 kHz clock, while the expected timestamps were in terms of a 1 MHz clock. As a result, every frame appeared to be already past its presentation time, so the player rendered frames as fast as it could (I saw around 120 fps at times). After fixing the timestamp clock, it works fine.
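For illustration only, a hypothetical helper that rescales the 90 kHz timestamps to the 1 MHz clock described above would look roughly like this:
// Hypothetical helper: rescale a 90 kHz timestamp to a 1 MHz clock.
private const long SourceClockHz = 90000;
private const long TargetClockHz = 1000000;

private static long RescaleTimestamp(long timestamp90kHz)
{
    return timestamp90kHz * TargetClockHz / SourceClockHz;
}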

Audio Queue Services recording to an .mp4 file that cannot be played, and a magic cookie issue

I have been a Mac programmer for just 3 months. I have an Audio Queue Services problem and hope someone can help me.
Using the Audio Queue Services API, I created a recording program that outputs AAC data. It seemed good; everything worked fine.
The problem occurs when I use the MP4v2 library (an open-source library) to output an .mp4 file.
Problem 1:
I use the magic cookie as an AAC header and pass it to the MP4v2 library function MP4WriteSample(). The .mp4 file contains data, but players (e.g. QuickTime) can't recognize the file and can't play the audio data.
Problem 2:
I set my audio queue's basic description format as follows:
aqData.mDataFormat.mSampleRate = 44100.0;
aqData.mDataFormat.mFormatID = kAudioFormatMPEG4AAC; // AAC codec.
aqData.mDataFormat.mChannelsPerFrame = 2;
aqData.mDataFormat.mFramesPerPacket = 1024;
Then I use AudioQueueGetProperty() to get the magic cookie.
When I print out the magic cookie's contents, it looks like this:
<03808080 22000000 04808080 14401500 18000001 f4000001 f4000580 80800212 10068080 800102>
It is 39 bytes in total.
What exactly does it mean?
What does each of the 39 bytes represent?
Can it be converted to an AAC header?
References:
Set a Magic Cookie for an Audio File
Set a Magic Cookie for a Playback Audio Queue
CoreAudio - how to determine the end of the playing aac file
Thanks a lot.
Ryan
Set the file type to kAudioFileM4AType when creating the audio file:
AudioFileCreateWithURL(
    audioFileURL,
    kAudioFileM4AType,
    &audioFormat,
    kAudioFileFlags_EraseFile,
    &audioFileID
);
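After the file is created, the AAC magic cookie from the recording queue typically also needs to be attached to it so that players can decode the audio. A sketch, assuming aqData.mQueue and audioFileID come from the snippets above (error handling omitted):
UInt32 cookieSize = 0;
if (AudioQueueGetPropertySize(aqData.mQueue,
                              kAudioQueueProperty_MagicCookie,
                              &cookieSize) == noErr && cookieSize > 0) {
    // Fetch the cookie from the queue and write it to the .m4a file.
    void *cookie = malloc(cookieSize);
    AudioQueueGetProperty(aqData.mQueue,
                          kAudioQueueProperty_MagicCookie,
                          cookie, &cookieSize);
    AudioFileSetProperty(audioFileID,
                         kAudioFilePropertyMagicCookieData,
                         cookieSize, cookie);
    free(cookie);
}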
