How to improve video quality? - Windows

I am using the following code snippet to record the screen, and in most situations the recorded WMV file is clear enough, but some parts of the video are not very clear (grey color in places). What I record is a PowerPoint presentation in full-screen mode. I am using Windows Media Encoder 9.
Here is my code snippet:
IWMEncSourceGroup SrcGrp;
IWMEncSourceGroupCollection SrcGrpColl;
SrcGrpColl = encoder.SourceGroupCollection;
SrcGrp = (IWMEncSourceGroup)SrcGrpColl.Add("SG_1");
IWMEncVideoSource2 SrcVid;
IWMEncSource SrcAud;
SrcVid = (IWMEncVideoSource2)SrcGrp.AddSource(WMENC_SOURCE_TYPE.WMENC_VIDEO);
SrcAud = SrcGrp.AddSource(WMENC_SOURCE_TYPE.WMENC_AUDIO);
SrcVid.SetInput("ScreenCap://ScreenCapture1", "", "");
SrcAud.SetInput("Device://Default_Audio_Device", "", "");
// Specify a file object in which to save encoded content.
IWMEncFile File = encoder.File;
string CurrentFileName = Guid.NewGuid().ToString();
File.LocalFileName = CurrentFileName;
CurrentFileName = File.LocalFileName;
// Choose a profile from the collection.
IWMEncProfileCollection ProColl = encoder.ProfileCollection;
IWMEncProfile Pro;
for (int i = 0; i < ProColl.Count; i++)
{
    Pro = ProColl.Item(i);
    if (Pro.Name == "Screen Video/Audio High (CBR)")
    {
        SrcGrp.set_Profile(Pro);
        break;
    }
}
encoder.Start();
thanks in advance,
George

I would guess that it's a problem with your encoder profile or settings, and not a problem with the code. If you're using the default "Screen Video/Audio High (CBR)" profile in WME9, it's using a video bitrate of 250Kbps, which is pretty low. I'd suggest creating a custom profile in the Windows Media Encoder Profile Editor Utility. Something like this:
awesomesc.prx
Name: Awesome Screen Profile
Audio: WMA 9.2 CBR (32kbps, 44kHz, mono CBR)
Video: WMV 9 Screen Quality VBR (Video size Same as video input, Frame rate 10fps, Key frame interval 3sec, Video quality 90)
Then just change the code to match the custom profile's name.
if (Pro.Name == "Awesome Screen Profile")
The encoder settings would take a much longer post to go through, but if you have not changed them from the defaults, you should be OK.
The Quality-based VBR algorithm can be pretty amazing, and will likely produce a surprisingly low average bitrate, but if VBR won't work for your needs, you can use the Windows Media Encoder Profile Editor utility to import the schia.prx profile that you're using and tweak the settings to find a higher CBR bitrate that produces acceptable quality.
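For what it's worth, here is a rough sketch of how the selection code might change to pick up the custom profile. The ProfileDirectory/Refresh calls and the C:\MyProfiles folder are assumptions about where awesomesc.prx is saved, so double-check them against the WME9 SDK documentation:
// Hedged sketch: point the collection at the folder holding awesomesc.prx,
// then select the custom profile by name, falling back with a warning if it
// is not found.
IWMEncProfileCollection ProColl = encoder.ProfileCollection;
ProColl.ProfileDirectory = @"C:\MyProfiles";   // assumed location of awesomesc.prx
ProColl.Refresh();

IWMEncProfile Pro = null;
for (int i = 0; i < ProColl.Count; i++)
{
    if (ProColl.Item(i).Name == "Awesome Screen Profile")
    {
        Pro = ProColl.Item(i);
        break;
    }
}

if (Pro != null)
{
    SrcGrp.set_Profile(Pro);
}
else
{
    Console.WriteLine("Custom profile not found; falling back to defaults.");
}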

"Screen Video/Audio Medium (CBR)"
it solved my problem

Related

How to get the sample rate from a mediaDevices.getUserMedia stream

Firefox is limited in its audio resampling ability for audio media streams. If the input media stream's sample rate is not the same as the AudioContext's, it complains:
DOMException: AudioContext.createMediaStreamSource: Connecting AudioNodes from AudioContexts with different sample-rate is currently not supported.
For example, if we get an audio stream like so:
navigator.mediaDevices.getUserMedia(constraints).then(stream => {
  let context = new (window.AudioContext || window.webkitAudioContext)({sampleRate: 48000});
  let audioInput = context.createMediaStreamSource(stream);
});
Firefox will complain about mismatched sample rates if the audio context's rate differs from the hardware device's rate in the audio subsystem.
I can't find a way to get the sample rate from the audio track in the stream. I've tried:
let tracks = stream.getAudioTracks();
let settings = tracks[0].getSettings();
let constraints = tracks[0].getConstraints();
But none of these objects contain the stream's sampleRate.
Is there another way to query an audio track's or stream's sample rate?
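One possible workaround sketch, not from the thread, so treat it as an assumption: if the goal is simply to match the device's rate rather than force 48000, create the AudioContext without a sampleRate option, since it typically defaults to the hardware's preferred rate, and read it back from context.sampleRate:
navigator.mediaDevices.getUserMedia(constraints).then(stream => {
  // No sampleRate option: the context typically adopts the device's native rate.
  let context = new (window.AudioContext || window.webkitAudioContext)();
  console.log("Context (and likely device) sample rate:", context.sampleRate);
  let audioInput = context.createMediaStreamSource(stream);
});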

Does Cobalt support WebM progressive playback?

It seems that MediaSource and progressive playback use different demuxers: ChunkDemuxer is used for MediaSource, and ShellDemuxer is used for progressive playback.
In the ShellParser.cpp implementation:
PipelineStatus ShellParser::Construct(
    scoped_refptr<ShellDataSourceReader> reader,
    scoped_refptr<ShellParser>* parser,
    const scoped_refptr<MediaLog>& media_log) {
  DCHECK(parser);
  DCHECK(media_log);
  *parser = NULL;

  // download first 16 bytes of stream to determine file type and extract basic
  // container-specific stream configuration information
  uint8 header[kInitialHeaderSize];
  int bytes_read = reader->BlockingRead(0, kInitialHeaderSize, header);
  if (bytes_read != kInitialHeaderSize) {
    return DEMUXER_ERROR_COULD_NOT_PARSE;
  }

  // attempt to construct mp4 parser from this header
  return ShellMP4Parser::Construct(reader, header, parser, media_log);
}
It seems that Cobalt can only demux the MP4 container (only ShellMP4Parser) for progressive playback.
Is this a known status for Cobalt? How can we support WebM progressive playback on the device?
Cobalt will not support WebM/VP9 progressive playback. We changed the Progressive Conformance test to use H.264 instead of VP9. This will be pushed soon.
https://github.com/youtube/js_mse_eme/commit/d7767e13be7ed8b8bdb2efda39337a4a2fb121ba

AVPlayer pathForResource not getting called

I have a video background for the welcome screen of the app, but when I run it, only a blank screen appears.
The string in the pathForResource call is correct, so a typo isn't the problem. When I put a breakpoint on the "if path" block, though, it isn't getting hit. This is in viewDidLoad, and AVKit and AVFoundation are both imported; there are no errors.
Any insights? Thanks!
let moviePath = NSBundle.mainBundle().pathForResource("Ilmatic_Wearables", ofType: "mp4")
if let path = moviePath {
    let url = NSURL.fileURLWithPath(path)
    let player = AVPlayer(URL: url)
    let playerViewController = AVPlayerViewController()
    playerViewController.player = player
    playerViewController.view.frame = self.view.bounds
    self.view.addSubview(playerViewController.view)
    self.addChildViewController(playerViewController)
    player.play()
}
Weird, your code should work if moviePath isn't nil, as you say. Check it this way:
if moviePath != nil {
    ...
}
Update
Check if your video file's Target Membership is set
EDIT (11/13/2015) - Try this:
Right-click the video file in your project navigator, select Delete, and choose Move to Trash.
Then drag the video from Finder back into your project navigator.
When choosing options for adding these files:
check "Copy items if needed"
check "Add to targets"
Rebuild your project, see if it works now!
I have included sample code that works for me:
import UIKit
import AVFoundation
import AVKit

class ExampleViewController: UIViewController {
    func loadAndPlayFile() {
        guard let path = NSBundle.mainBundle().pathForResource("filename", ofType: "mp4") else {
            print("Error: Could not locate video resource")
            return
        }

        let url = NSURL(fileURLWithPath: path)
        let moviePlayer = AVPlayer(URL: url)
        let moviePlayerController = AVPlayerViewController()
        moviePlayerController.player = moviePlayer

        presentViewController(moviePlayerController, animated: true) {
            moviePlayerController.player?.play()
        }
    }

    override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)
        loadAndPlayFile()
    }
}
Check to see if the video file plays in iTunes. If it plays in iTunes, it should also play on the iOS simulator and on a device.
If it doesn't play in iTunes but does work with VLC Media Player, QuickTime, etc., it needs to be re-encoded.
iOS supports many industry-standard video formats and compression standards, including the following:
H.264 video, up to 1.5 Mbps, 640 by 480 pixels, 30 frames per second, Low-Complexity version of the H.264 Baseline Profile with AAC-LC audio up to 160 Kbps, 48 kHz, stereo audio in .m4v, .mp4, and .mov file formats
H.264 video, up to 768 Kbps, 320 by 240 pixels, 30 frames per second, Baseline Profile up to Level 1.3 with AAC-LC audio up to 160 Kbps, 48 kHz, stereo audio in .m4v, .mp4, and .mov file formats
MPEG-4 video, up to 2.5 Mbps, 640 by 480 pixels, 30 frames per second, Simple Profile with AAC-LC audio up to 160 Kbps, 48 kHz, stereo audio in .m4v, .mp4, and .mov file formats
Numerous audio formats, including the ones listed in Audio Technologies

Get QuickTime metadata: codecs, bitrates, dimensions

I'm finding it difficult to determine how to extract the following information from a QuickTime movie, either using QTKit or the older QuickTime APIs in OS X, targeting 10.5+:
Video and audio codecs used (e.g. "H.264")
Video and audio bitrates (e.g. 64 kbps)
Dimensions
The specific problems I've encountered are:
1) The only means to the video and audio codec names that I've found involve the use of ImageDescriptionHandle and SoundDescriptionHandle, both of which appear to require the Carbon-only methods NewHandleClear and DisposeHandle, as well as requiring the 32-bit only Media object. Is there a more modern method that doesn't require the Carbon framework and is 64-bit compatible?
2) For the bitrate, I'm calling GetMediaDataSizeTime64 and dividing by the track duration in seconds. However, in the case of one audio track, that method returns a value of 128 kbps, but calling QTSoundDescriptionGetProperty with the audio track media and the kQTAudioPropertyID_FormatString param returns a string of "64 kbps". Why would those two values be different? Is there a better way to calculate a track's bitrate?
3) Dimensions returned by [QTMovie movieAttributes] objectForKey:QTMovieNaturalSizeAttribute] or by [QTTrack attributeForKey:QTTrackDimensionsAttribute] are incorrect for one particular movie. The size returned is 720 x 480, but the actual view size in QuickTime Player is 640 x 480. Player's info window shows a size string of "720 x 480 (640 x 480)". Is there a better way to determine the actual movie dimensions?
Thanks in advance!
This metadata can be obtained from the [movie tracks] QTTrack* objects.
1) Enumerating through the tracks you can find the video and audio tracks.
QTMedia* media = [track media];

if ([media hasCharacteristic:QTMediaCharacteristicVisual])
{
    // video track
}

if ([media hasCharacteristic:QTMediaCharacteristicAudio])
{
    // audio track
}
The information about codecs:
NSString* summary = [track attributeForKey:QTTrackFormatSummaryAttribute];
2) To calculate the movie's bitrate, you need to calculate the total data size of all tracks and divide it by the movie duration.
Enumerating through the tracks, get the data size of each track:
QTMedia* media = [track media];
Track quicktimeTrack = [track quickTimeTrack];
TimeValue startTime = 0;
TimeValue duration = GetTrackDuration(quicktimeTrack);
long trackDataSize = GetTrackDataSize(quicktimeTrack, startTime, duration);
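To turn that byte count into a bitrate, a rough sketch (assuming movie is the QTMovie you loaded, and noting that GetTrackDuration reports the duration in the movie's time scale, so it has to be converted to seconds first):
// Sketch: convert the track duration to seconds, then compute kilobits per second.
Movie qtMovie = [movie quickTimeMovie];
double seconds = (double)duration / (double)GetMovieTimeScale(qtMovie);
double kbps = (trackDataSize * 8.0) / seconds / 1000.0;
NSLog(@"Track bitrate: %.0f kbps", kbps);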
3) To get the movie's dimensions
NSSize movieSize = [(NSValue*)[[movie movieAttributes] objectForKey:QTMovieNaturalSizeAttribute] sizeValue];
However, the actual dimensions of the video track may be different:
Fixed width = 0;
Fixed height = 0;
GetTrackDimensions(videoTrack, &width, &height);

Is it possible to change the playback pitch of an AudioQueue?

This is supposed to be possible on Mac OS X by overwriting the sample rate in the AudioStreamBasicDescription and then creating a new output queue.
I've been able to retrieve the default sample rate and write a new one (i.e. replace 44100 with 48000), but this does not result in any pitch change in the output signal.
err = AudioFileGetProperty(mAudioFile, kAudioFilePropertyDataFormat, &size, &mDataFormat);
if (err != noErr)
    NSLog(@"Couldn't determine the audio file format");

Float64 mySampleRate = mDataFormat.mSampleRate; // the initial rate

if (inRate != 1) {
    // write a new value
    mDataFormat.mSampleRate = inRate;
    // then
    err = AudioQueueNewOutput etc.
}
Any suggestions would be greatly appreciated.
Changing the sample rate doesn't change the pitch of the audio. You may perceive that something playing back faster has a higher pitch, but that's perception rather than reality.
To change pitch, you'll need to process the audio data through a Digital Signal Processing (DSP) library. Alternatively, take a look at running it through an AudioUnit:
Audio Unit Programming Guide
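For reference, a minimal sketch of the AudioUnit route using the higher-level AVAudioEngine wrapper (AVAudioUnitTimePitch, which requires a newer OS than the plain C AudioQueue API above); the file path here is just a placeholder:
#import <AVFoundation/AVFoundation.h>

// Sketch: route playback through an AVAudioUnitTimePitch node so the pitch
// can be shifted independently of the playback rate.
AVAudioEngine *engine = [[AVAudioEngine alloc] init];
AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
AVAudioUnitTimePitch *timePitch = [[AVAudioUnitTimePitch alloc] init];
timePitch.pitch = 500.0;  // shift in cents (+500 = up five semitones)
timePitch.rate = 1.0;     // leave the playback speed unchanged

[engine attachNode:player];
[engine attachNode:timePitch];
[engine connect:player to:timePitch format:nil];
[engine connect:timePitch to:engine.mainMixerNode format:nil];

NSError *error = nil;
// "/path/to/audio.m4a" is a placeholder for whatever file the queue was playing.
NSURL *url = [NSURL fileURLWithPath:@"/path/to/audio.m4a"];
AVAudioFile *file = [[AVAudioFile alloc] initForReading:url error:&error];
if (file && [engine startAndReturnError:&error]) {
    [player scheduleFile:file atTime:nil completionHandler:nil];
    [player play];
}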
