Change H.264 quality when using SinkWriter to encode video (Windows 7)

I am using Microsoft Media Foundation to encode an H.264 video file.
I am using the SinkWriter to create the video file. The input is a buffer (MFVideoFormat_RGB32) into which I draw the frames, and the output is MFVideoFormat_H264.
The encoding works and it creates a video file with my frames in it. But I want to set the quality for that video file. More specifically, I want to set the CODECAPI_AVEncCommonQuality property on the H.264 encoder.
In order to get a handle to the H.264 encoder, I call GetServiceForStream on the SinkWriter. Then I set the CODECAPI_AVEncCommonQuality property.
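For reference, that call sequence looks roughly like this (a hedged sketch; pSinkWriter and sink_stream stand in for your own writer and video stream index):
CComPtr<ICodecAPI> pCodecApi;
HRESULT hr = pSinkWriter->GetServiceForStream(sink_stream, GUID_NULL,
                                              IID_PPV_ARGS(&pCodecApi));
if (SUCCEEDED(hr))
{
    VARIANT var;
    VariantInit(&var);
    var.vt = VT_UI4;
    var.ulVal = 70;   // quality, 0-100
    // On Windows 7 this call is ignored, because the SinkWriter has already
    // called SetOutputType on the encoder by the time we get here.
    hr = pCodecApi->SetValue(&CODECAPI_AVEncCommonQuality, &var);
}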
The problem is that my property change is ignored. As stated in the documentation:
To set this parameter in Windows 7, set the property before calling IMFTransform::SetOutputType. The encoder ignores changes after the output type is set.
The problem is that I don't create the H.264 encoder manually. I set the input and the output type on the SinkWriter, and the SinkWriter creates the H.264 encoder automatically. As soon as it creates the encoder, it calls the IMFTransform::SetOutputType method, and I can't change the CODECAPI_AVEncCommonQuality property anymore. The documentation also says that the property change isn't ignored in Windows 8, but I need this to run on Windows 7.
Do you know how I can change the quality for the encoded file while using SinkWriter on Windows 7?
PS: Someone asked the same question on the MSDN forums and didn't seem to get an answer.

As the documentation says, you just can't change the CODECAPI_AVEncCommonQuality property after the output type is set, and the SinkWriter sets the output type before you can get your hands on the encoder.
To get around this, I managed to create a class factory and register it in Media Foundation so that the SinkWriter uses it to create a new encoder. In my class factory, I create a new H.264 encoder and set whatever properties I want before passing it on to the SinkWriter.
I have written up the steps I took to create this class factory in more detail on the MSDN forums, here: http://social.msdn.microsoft.com/Forums/en-US/mediafoundationdevelopment/thread/6da521e9-7bb3-4b79-a2b6-b31509224638
That was the only way I could get around the problem on Windows 7.
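For reference, a rough outline of that class factory (a sketch only, untested here; it assumes the stock encoder's CLSID_CMSH264EncoderMFT from wmcodecdsp.h and registers the factory for the current process with MFTRegisterLocal; see the linked thread for the full details):
// A local class factory that creates the H.264 encoder MFT itself and sets
// the quality properties before handing it to the SinkWriter.
#include <mfapi.h>
#include <mftransform.h>
#include <strmif.h>       // ICodecAPI
#include <codecapi.h>     // CODECAPI_* property GUIDs
#include <wmcodecdsp.h>   // CLSID_CMSH264EncoderMFT (link wmcodecdspuuid.lib)

class PreconfiguredH264Factory : public IClassFactory
{
    LONG m_ref = 1;
public:
    // IUnknown
    STDMETHODIMP QueryInterface(REFIID riid, void **ppv)
    {
        if (riid == IID_IUnknown || riid == IID_IClassFactory) {
            *ppv = static_cast<IClassFactory*>(this);
            AddRef();
            return S_OK;
        }
        *ppv = nullptr;
        return E_NOINTERFACE;
    }
    STDMETHODIMP_(ULONG) AddRef()  { return InterlockedIncrement(&m_ref); }
    STDMETHODIMP_(ULONG) Release()
    {
        ULONG ref = InterlockedDecrement(&m_ref);
        if (ref == 0) delete this;
        return ref;
    }

    // IClassFactory: create the stock encoder and configure it before the
    // SinkWriter ever calls SetOutputType on it.
    STDMETHODIMP CreateInstance(IUnknown *pOuter, REFIID riid, void **ppv)
    {
        if (pOuter) return CLASS_E_NOAGGREGATION;

        IMFTransform *pEncoder = nullptr;
        HRESULT hr = CoCreateInstance(CLSID_CMSH264EncoderMFT, nullptr,
                                      CLSCTX_INPROC_SERVER, IID_PPV_ARGS(&pEncoder));
        if (FAILED(hr)) return hr;

        ICodecAPI *pCodecApi = nullptr;
        if (SUCCEEDED(pEncoder->QueryInterface(IID_PPV_ARGS(&pCodecApi)))) {
            VARIANT var;
            VariantInit(&var);
            var.vt = VT_UI4;
            var.ulVal = eAVEncCommonRateControlMode_Quality;
            pCodecApi->SetValue(&CODECAPI_AVEncCommonRateControlMode, &var);
            var.ulVal = 70;   // quality, 0-100
            pCodecApi->SetValue(&CODECAPI_AVEncCommonQuality, &var);
            pCodecApi->Release();
        }

        hr = pEncoder->QueryInterface(riid, ppv);
        pEncoder->Release();
        return hr;
    }
    STDMETHODIMP LockServer(BOOL) { return S_OK; }
};

// Register the factory so the SinkWriter picks it up when it resolves an
// H.264 encoder for the output stream.
HRESULT RegisterPreconfiguredEncoder()
{
    MFT_REGISTER_TYPE_INFO in  = { MFMediaType_Video, MFVideoFormat_NV12 };
    MFT_REGISTER_TYPE_INFO out = { MFMediaType_Video, MFVideoFormat_H264 };
    return MFTRegisterLocal(new PreconfiguredH264Factory(),
                            MFT_CATEGORY_VIDEO_ENCODER,
                            L"H.264 encoder with preset quality",
                            0, 1, &in, 1, &out);
}
Because the properties are applied inside CreateInstance, they are in place before the SinkWriter calls SetOutputType; call the registration function (whatever you name it) before creating the SinkWriter.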

CODECAPI_AVEncCommonRateControlMode and CODECAPI_AVEncCommonQuality can be passed to the H.264 encoder through the third parameter of IMFSinkWriter::SetInputMediaType(DWORD streamIndex, IMFMediaType *inputMediaType, IMFAttributes *pEncodingParameters). I suspect other CODECAPI_ values would work as well.
CComPtr<IMFAttributes> pEncAttrs;
ATLENSURE_SUCCEEDED(MFCreateAttributes(&pEncAttrs, 1));
ATLENSURE_SUCCEEDED(pEncAttrs->SetUINT32(CODECAPI_AVEncCommonRateControlMode, eAVEncCommonRateControlMode_Quality));
ATLENSURE_SUCCEEDED(pEncAttrs->SetUINT32(CODECAPI_AVEncCommonQuality, 40));
ATLENSURE_SUCCEEDED(writer->SetInputMediaType(sink_stream, mtSource, pEncAttrs));
// ^^^^^^^^^ the third parameter carries the encoder properties

Related

Extracting metadata using Windows Media Format 11 SDK

I've never used the Windows Media Format 11 SDK before and I'm just trying to make sense of the documentation. I don't understand the line where it says, "In order to use the function, you must pass it a pointer to the IWMHeaderInfo3 interface of a metadata editor object, reader object, synchronous reader object, or writer object". Where do these objects come from? And how do I get them from a video file path?
I wish there were some function like printAllAttributes(std::string videoFilePath) or something.
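For what it's worth, the "synchronous reader object" mentioned there is created with WMCreateSyncReader, and its IWMHeaderInfo3 interface can enumerate every attribute in the file. A minimal sketch, with signatures recalled from the Format SDK headers (treat it as a starting point rather than tested code):
// Enumerate all file-level metadata attributes of an ASF/WMV file.
#include <windows.h>
#include <wmsdk.h>
#include <vector>
#include <cstdio>
#pragma comment(lib, "wmvcore.lib")

HRESULT PrintAllAttributes(const wchar_t *videoFilePath)
{
    IWMSyncReader *pReader = nullptr;                 // the "synchronous reader object"
    HRESULT hr = WMCreateSyncReader(nullptr, 0, &pReader);
    if (FAILED(hr)) return hr;

    hr = pReader->Open(videoFilePath);
    if (SUCCEEDED(hr)) {
        IWMHeaderInfo3 *pHeader = nullptr;            // the pointer the docs ask for
        hr = pReader->QueryInterface(IID_IWMHeaderInfo3, (void**)&pHeader);
        if (SUCCEEDED(hr)) {
            WORD cAttrs = 0;
            hr = pHeader->GetAttributeCountEx(0, &cAttrs);   // stream 0 = file level
            for (WORD i = 0; SUCCEEDED(hr) && i < cAttrs; ++i) {
                WORD cchName = 0; DWORD cbValue = 0;
                WMT_ATTR_DATATYPE type; WORD langIndex = 0;
                // First call with null buffers just reports the required sizes.
                pHeader->GetAttributeByIndexEx(0, i, nullptr, &cchName,
                                               &type, &langIndex, nullptr, &cbValue);
                std::vector<WCHAR> name(cchName);
                std::vector<BYTE> value(cbValue);
                if (SUCCEEDED(pHeader->GetAttributeByIndexEx(0, i, name.data(), &cchName,
                                                             &type, &langIndex,
                                                             value.data(), &cbValue)))
                    wprintf(L"%s (type %d, %lu bytes)\n", name.data(), (int)type, cbValue);
            }
            pHeader->Release();
        }
        pReader->Close();
    }
    pReader->Release();
    return hr;
}
The metadata editor object and writer object come from WMCreateEditor and WMCreateWriter in the same way.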

Replacing deprecated AVStream codec parameter in libav

A long time ago, I implemented a C++ class to create MP4 video files from an array of images. The code works pretty well; nevertheless, I discovered a deprecation warning that I want to get rid of. The "codec" parameter of the AVStream structure has been deprecated and I want to replace it.
Here is my current working code:
AVOutputFormat *outputFormat = av_guess_format("ffh264", movieFile.toLocal8Bit().data(), nullptr);
if (!outputFormat)
    return false;
enum AVCodecID videoCodecID = outputFormat->video_codec;
AVCodec *videoCodec = avcodec_find_encoder(videoCodecID);
if (!videoCodec)
    return false;
AVStream *stream = avformat_new_stream(formatContext, videoCodec);
if (!stream)
    return false;
AVCodecContext *videoCodecContext = stream->codec; // <- codec is a deprecated parameter
videoCodecContext->width = videoW;
videoCodecContext->height = videoH;
Now, to replace the "codec" parameter, the libav developers recommend using the "codecpar" parameter (AVCodecParameters) that was added to the AVStream structure. The example they share is this:
if (avcodec_parameters_to_context(videoCodecContext, stream->codecpar) < 0)
    return nullptr;
Note: codecpar (AVCodecParameters) is a data structure itself.
Unfortunately, when I try to use that code I run into this problem: usually, all the information stored in codecpar comes from a video file that was opened previously. In other words, the information already exists. My situation is different, because I am creating an MP4 file from scratch, so there is no previous codecpar record to use; I have to create a new AVCodecParameters structure myself and set every field manually.
So far I have been able to set all the fields of the codecpar structure except for two:
uint8_t * extradata
int extradata_size
Note: currently I can create an MP4 file "successfully" without setting those fields, but the file is incomplete, and when I try to play it using "mplayer" I get this error message:
[extract_extradata # 0x55b5bb7e45c0] No start code is found.
I was researching these two fields, and it seems they store some kind of information related to the codec, which in my case is H264.
So, my specific question is: if I am filling in a codecpar structure (AVCodecParameters) from scratch, how do I set the extradata and extradata_size fields correctly for the H264 codec?
Solution:
This is a basic list of steps I followed to replace the deprecated stream->codec data structure successfully:
Initialize AVFormatContext, AVOutputFormat variables (using av_guess_format and avformat_alloc_output_context2)
Open video codec (using avcodec_find_encoder)
Add/Initialize AVStream variable (using avformat_new_stream)
Initialize AVCodecContext variable (using avcodec_alloc_context3)
Customize the AVCodecContext parameters, but only if you need to (for example: width, height, bit_rate, etc.)
Add this piece of code:
if (formatContext->oformat->flags & AVFMT_GLOBALHEADER)
    videoCodecContext->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;
Open AVCodecContext variable (using avcodec_open2)
Copy AVCodecContext codecpar structure into AVStream codecpar (using avcodec_parameters_from_context)
From this point, you will be able to create and add frames to your output file.
PS: The example I used as a reference for this implementation is available at doc/examples/muxing.c in the library's source tree.
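Putting those steps together, a hedged sketch of the whole sequence might look like this (FFmpeg 3.x/4.x-era API; the function name, pixel format and time base are placeholders, not part of the original code):
// Replaces the deprecated stream->codec with a separately allocated AVCodecContext.
extern "C" {
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
}

bool openOutput(const char *filename, int videoW, int videoH,
                AVFormatContext **outFmt, AVCodecContext **outEnc, AVStream **outStream)
{
    AVFormatContext *formatContext = nullptr;
    if (avformat_alloc_output_context2(&formatContext, nullptr, nullptr, filename) < 0)
        return false;

    AVCodec *videoCodec = avcodec_find_encoder(formatContext->oformat->video_codec);
    if (!videoCodec)
        return false;

    AVStream *stream = avformat_new_stream(formatContext, videoCodec);
    AVCodecContext *videoCodecContext = avcodec_alloc_context3(videoCodec);
    if (!stream || !videoCodecContext)
        return false;

    // Customize the encoder context instead of the removed stream->codec.
    videoCodecContext->width = videoW;
    videoCodecContext->height = videoH;
    videoCodecContext->pix_fmt = AV_PIX_FMT_YUV420P;
    videoCodecContext->time_base = AVRational{1, 25};
    stream->time_base = videoCodecContext->time_base;

    // Ask the encoder to put SPS/PPS into the context's extradata.
    if (formatContext->oformat->flags & AVFMT_GLOBALHEADER)
        videoCodecContext->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;

    if (avcodec_open2(videoCodecContext, videoCodec, nullptr) < 0)
        return false;

    // Copies everything, including extradata and extradata_size, into the stream.
    if (avcodec_parameters_from_context(stream->codecpar, videoCodecContext) < 0)
        return false;

    *outFmt = formatContext;
    *outEnc = videoCodecContext;
    *outStream = stream;
    return true;
}
The AV_CODEC_FLAG_GLOBAL_HEADER flag is what makes avcodec_open2 generate the H.264 extradata, so after avcodec_parameters_from_context the stream's codecpar carries extradata and extradata_size without you filling them in by hand.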

Xamarin Forms Video Player Sample - Get Video Bytes for Uploading

I'm implementing a video player in a Xamarin Forms app just like the video player sample provided by Xamarin
https://learn.microsoft.com/en-us/xamarin/xamarin-forms/app-fundamentals/custom-renderer/video-player/
I'm able to select a video from the phone gallery, set the video player source to the selected video, and play the video. How do I get the actual stream or bytes of the selected video so that I can upload it to Blob Storage?
I've tried
using (FileStream fs = new FileStream(fileName, FileMode.Open, FileAccess.Read)) ..........
where fileName is the path and file name of the selected video, as set on the video player source. It doesn't work because the Android file name string is not found (when invoking this from Xamarin.Forms). I realize the file name will be different on iOS as well. How do I reach down into the platform-specific implementations and get the file bytes or stream of the selected file?
thanks
I would look into the libVLCSharp library, which provides cross-platform .NET/Mono bindings for libVLC. It has good support for Xamarin.Forms and the features you most likely need to implement the stream handling. What you're trying to achieve won't be simple, but it should be perfectly doable.
First, you should check out the documentation for Stream output:
Stream output is the name of the feature of VLC that allows to output any stream read by VLC to a file or as a network stream instead of displaying it.
Related tutorial: Stream to memory (smem) tutorial.
That should get you started but there will for sure be many caveats along the way. For example, if you try to play the video while capturing the bytes to be uploaded somewhere, you'll have to respect VERY tight timeframes. In case you take too long to process the stream, it will slow down the playback and the user experience suffers.
Edit: Another option you could look into is to interact directly with the MediaPlayer class of libVLC, as explained in this answer. The sample code is in C++ but the method names are very similar in the .NET bindings.
For example, the following piece of code:
libvlc_video_set_callbacks(mplayer,
                           lock_frame, unlock_frame,
                           0, user_data);
can be implemented with libVLCSharp by calling the SetVideoCallbacks(LibVLCVideoLockCb lockCb, LibVLCVideoUnlockCb unlockCb, LibVLCVideoDisplayCb displayCb) method in the binding library, as defined here.
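For context, a hedged C++ sketch of what the full native-side registration looks like; libVLCSharp's SetVideoCallbacks mirrors this shape, and FrameSink/attach_sink are made-up names for illustration:
#include <vlc/vlc.h>
#include <vector>

struct FrameSink {
    std::vector<unsigned char> pixels;   // buffer for one RV32 frame
};

// libVLC asks where to decode the next frame.
static void *lock_frame(void *opaque, void **planes)
{
    auto *sink = static_cast<FrameSink*>(opaque);
    planes[0] = sink->pixels.data();
    return nullptr;                      // picture identifier, unused here
}

// Called once the frame is fully decoded: copy or forward the bytes here,
// e.g. push them onto an upload queue.
static void unlock_frame(void *opaque, void *picture, void *const *planes)
{
    (void)opaque; (void)picture; (void)planes;
}

void attach_sink(libvlc_media_player_t *mplayer, FrameSink *sink,
                 unsigned width, unsigned height)
{
    unsigned pitch = width * 4;          // RV32 = 4 bytes per pixel
    sink->pixels.resize(pitch * height);
    libvlc_video_set_format(mplayer, "RV32", width, height, pitch);
    libvlc_video_set_callbacks(mplayer, lock_frame, unlock_frame,
                               nullptr /* display */, sink);
}
Keep in mind the timing caveat above: whatever you do in unlock_frame has to keep up with the playback rate.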
You can do this pretty simply by using a DependencyService. You will need to adjust the code below to cater for the folder location you're working with, but do this.
Change all of the "Test" namespaces to your own project.
Add an interface into your shared project called IFileSystem that looks like this ...
using System;

namespace Test.Interfaces
{
    public interface IFileSystem
    {
        byte[] GetFileInBytes(string fileName);
    }
}
Create a dependency service in each platform project. For this, I'm only supplying iOS and Android but, as you'll see, the logic for both is essentially the same; only the namespace differs ...
iOS
using System;
using System.IO;
using Test.Interfaces;
using Test.iOS.DependencyServices;
using Xamarin.Forms;

[assembly: Dependency(typeof(FileSystem))]
namespace Test.iOS.DependencyServices
{
    public class FileSystem : IFileSystem
    {
        public byte[] GetFileInBytes(string fileName)
        {
            var folder = Environment.GetFolderPath(Environment.SpecialFolder.MyVideos);
            fileName = Path.Combine(folder, fileName);
            return File.Exists(fileName) ? File.ReadAllBytes(fileName) : null;
        }
    }
}
Android
using System;
using System.IO;
using Test.Interfaces;
using Test.Droid.DependencyServices;
using Xamarin.Forms;

[assembly: Dependency(typeof(FileSystem))]
namespace Test.Droid.DependencyServices
{
    public class FileSystem : IFileSystem
    {
        public byte[] GetFileInBytes(string fileName)
        {
            var folder = Environment.GetFolderPath(Environment.SpecialFolder.MyVideos);
            fileName = Path.Combine(folder, fileName);
            return File.Exists(fileName) ? File.ReadAllBytes(fileName) : null;
        }
    }
}
... now call that from anywhere in your shared project.
var bytes = DependencyService.Get<IFileSystem>().GetFileInBytes("Test.mp4");
That should work for you. Again though, you need to adjust the folder path to the appropriate location for each platform project. Essentially, this line is the one that may need to change ...
var folder = Environment.GetFolderPath(Environment.SpecialFolder.MyVideos);
Alternatively, change that code to suit your requirements. If the file path you've been given contains the fully qualified location, then remove the logic to add the folder altogether.
Here's hoping that works for you.

Laravel 5 Mime validation

Ok, I'm trying to upload a video, and validate the file type.
According to the documentation:
mimes:foo,bar,...
The file under validation must have a MIME type corresponding to one of the listed extensions.
Basic Usage Of MIME Rule
'photo' => 'mimes:jpeg,bmp,png'
I'm uploading a wmv video, and my rules are so:
return [
    'file' => ['required', 'mimes:video/x-ms-wmv']
];
I've done a print_r() on Request::file('file') and I get the following data:
Symfony\Component\HttpFoundation\File\UploadedFile Object
(
    [test:Symfony\Component\HttpFoundation\File\UploadedFile:private] =>
    [originalName:Symfony\Component\HttpFoundation\File\UploadedFile:private] => SampleVideo.wmv
    [mimeType:Symfony\Component\HttpFoundation\File\UploadedFile:private] => video/x-ms-wmv
    [size:Symfony\Component\HttpFoundation\File\UploadedFile:private] => 70982901
    [error:Symfony\Component\HttpFoundation\File\UploadedFile:private] => 0
    [pathName:SplFileInfo:private] => C:\wamp\tmp\php6428.tmp
    [fileName:SplFileInfo:private] => php6428.tmp
)
However I'm getting the error:
{"file":["The file must be a file of type: video\/x-ms-wmv."]}
I've tried changing the "mime type" to video/*, wmv (as per the docs) and also video/x-ms-wmv, yet none of them validate the file correctly.
As you can see from the print_r() the mime type Symfony is getting is video/x-ms-wmv.
Am I doing something wrong? Or can Laravel/Symfony just not validate files well?
I appreciate the help
Edit
Ok, I opened validator.php and added echo $value->guessExtension(); to the ValidateMimes() method, and it outputs asf.
Symfony reports video/x-ms-wmv, the file extension is wmv, and I'm validating both of them, so why is Laravel guessing asf?!
It's too unreliable for video validation for me.
This is expected behaviour.
Laravel is calling guessExtension on Symfony's UploadedFile object, which will return the expected extension of the file, not the mimetype.
This is why the documentation states that for an uploaded image you should use:
'photo' => 'mimes:jpeg,bmp,png'
Symfony's guessExtension calls getMimeType, which uses PHP's Fileinfo Functions to go and guess the mimetype of a given file.
Once getMimeType guesses the mimetype for the file, Symfony's MimeTypeExtensionGuesser kicks in to get the extension from the mime type retrieved from a file.
// ... cut from MimeTypeExtensionGuesser
'video/x-ms-asf' => 'asf',
'video/x-ms-wmv' => 'wmv',
'video/x-ms-wmx' => 'wmx',
'video/x-ms-wvx' => 'wvx',
'video/x-msvideo' => 'avi',
Therefore, your rules should be:
return [
    'file' => ['required', 'mimes:wmv,asf']
];
The reason that asf should be included is mainly historical. To quote Wikipedia:
The most common media contained within an ASF file are Windows Media Audio (WMA) and Windows Media Video (WMV). The most common file extensions for ASF files are extension .WMA (audio-only files using Windows Media Audio, with MIME-type 'audio/x-ms-wma') and .WMV (files containing video, using the Windows Media Audio and Video codecs, with MIME-type 'video/x-ms-asf'). These files are identical to the old .ASF files but for their extension and MIME-type.
Microsoft's documentation about the difference between ASF and WMV/WMA files states:
The only difference between ASF files and WMV or WMA files are the file extensions and the MIME types [...] The basic internal structure of the files is identical.
Because the internal structure of the file is identical (including the magic numbers for the file format), wmv, wma and asf are one and the same. The only difference between the three extensions is the icon that is shown inside Explorer.
It's not just Windows Media files that will have this issue; Wikipedia lists many different video container formats with the same problem. If you want to find the video codec that is being used inside a container, you are going to need to look at more than just the "magic patterns" used by the fileinfo functions.
That being said, expected behaviour != correct behaviour.
I submitted a pull request to add a new validator, called mimetypes. This does as you would expect and uses the guessed mimetype to validate an uploaded file, instead of the extension that is guessed from the mimetype.

Burning CD/DVD using IMAPI2.dll

I am trying to add the ability to burn CDs/DVDs to my app using IMAPI2.dll. I am using Microsoft Visual FoxPro 9 SP2 for development. When I invoke the Write() method, which is a member of the IMAPI2.MsftDiscFormat2Data class (last line of the sample code), Visual FoxPro gives the following error message: "OLE error code 0x80004002: No such interface supported."
OS : Windows 7
Please Help.
**--Creating MsftDiscMaster2 object to connect to optical drives.
loDiscMaster = CREATEOBJECT("IMAPI2.MsftDiscMaster2")
**--Creating MsftDiscRecorder2 object for the specified burning device.
loRecorder = CREATEOBJECT("IMAPI2.MsftDiscRecorder2")
lcUniqueId = loDiscMaster.ITEM(0)
loRecorder.InitializeDiscRecorder(lcUniqueId)
**--Create an image stream for the specified directory.
loFileSystem = CREATEOBJECT("IMAPI2FS.MsftFileSystemImage")
loRootDir = loFileSystem.Root
**--Create the new disc format and set the recorder.
loDataWriter = CREATEOBJECT("IMAPI2.MsftDiscFormat2Data")
loDataWriter.Recorder = loRecorder
loDataWriter.ClientName = "IMAPIv2 TEST"
loFileSystem.ChooseImageDefaults(loRecorder)
**--Add the directory and its contents to the file system.
loRootDir.AddTree("F:\VSS",.F.)
**--Create an image from the file system
loResultImage = loFileSystem.CreateResultImage()
loStream = loResultImage.ImageStream
**--Write stream to disc using the specified recorder.
loDataWriter.Write(loStream)
I'm afraid you are out of luck there. FoxPro interacts with COM objects at a fairly high level. In fact, it works in much the same way that VBScript interacts with COM. Normally, if your code works in VBScript, it will also work in FoxPro.
This is actually a common problem with some ActiveX/COM libraries. While the objects implemented in imapi2.dll and imapi2fs.dll all use IDispatch - the highest level and most interoperable form of COM interface - some of the method parameters, method returns, and properties of those objects are not IDispatch.
Specifically, the ImageStream property returns something called an IStream which inherits from IUnknown instead of IDispatch. Because of this, the ImageStream property returns something that FoxPro doesn't know how to deal with. FoxPro knows that it is a COM interface, but it doesn't know how to find or call the methods on that object.
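For comparison, here is a hedged C++ sketch of the same sequence, where the IUnknown-based IStream is consumed directly through its vtable and the limitation does not arise (class and method names recalled from the imapi2.h/imapi2fs.h headers, untested here):
#include <windows.h>
#include <imapi2.h>
#include <imapi2fs.h>
#include <atlbase.h>   // CComPtr, CComBSTR

HRESULT BurnDirectory(const wchar_t *sourceDir)
{
    // Connect to the first optical drive.
    CComPtr<IDiscMaster2> discMaster;
    HRESULT hr = discMaster.CoCreateInstance(__uuidof(MsftDiscMaster2));
    if (FAILED(hr)) return hr;
    CComBSTR uniqueId;
    hr = discMaster->get_Item(0, &uniqueId);
    if (FAILED(hr)) return hr;

    CComPtr<IDiscRecorder2> recorder;
    hr = recorder.CoCreateInstance(__uuidof(MsftDiscRecorder2));
    if (FAILED(hr)) return hr;
    hr = recorder->InitializeDiscRecorder(uniqueId);
    if (FAILED(hr)) return hr;

    // Build the file system image from a directory.
    CComPtr<IFileSystemImage> fileSystem;
    hr = fileSystem.CoCreateInstance(__uuidof(MsftFileSystemImage));
    if (FAILED(hr)) return hr;
    fileSystem->ChooseImageDefaults(recorder);
    CComPtr<IFsiDirectoryItem> root;
    hr = fileSystem->get_Root(&root);
    if (FAILED(hr)) return hr;
    hr = root->AddTree(CComBSTR(sourceDir), VARIANT_FALSE);
    if (FAILED(hr)) return hr;

    CComPtr<IFileSystemImageResult> result;
    hr = fileSystem->CreateResultImage(&result);
    if (FAILED(hr)) return hr;
    CComPtr<IStream> stream;             // IUnknown-based, no problem in C++
    hr = result->get_ImageStream(&stream);
    if (FAILED(hr)) return hr;

    // Write the stream to disc.
    CComPtr<IDiscFormat2Data> dataWriter;
    hr = dataWriter.CoCreateInstance(__uuidof(MsftDiscFormat2Data));
    if (FAILED(hr)) return hr;
    dataWriter->put_Recorder(recorder);
    dataWriter->put_ClientName(CComBSTR(L"IMAPIv2 TEST"));
    return dataWriter->Write(stream);
}
The FoxPro code above is logically the same; it fails only because the value returned by ImageStream cannot be used through IDispatch.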
