Is it possible to play UHD (3840x2160) video on cobalt? - cobalt

The VP9 codec plays 1920x1080 video fine, but playback stops midway when I play 3840x2160 video.
I want UHD (3840x2160) video to play.
Is it a memory issue?
If it's not a memory issue, have I not ported Cobalt correctly?
(Cobalt version: RC_9)

For the record, this issue was fixed by setting cobalt_media_buffer_initial_capacity to 80 * 1024 * 1024.
https://issuetracker.google.com/issues/38486998
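For anyone else hitting this: cobalt_media_buffer_initial_capacity is a build-time variable. As a rough sketch only (the exact file and layout vary per port; the path below is hypothetical), the 80 MB value from the fix above would be set in the platform's GYP configuration roughly like this:

# <your-port>/gyp_configuration.gypi (hypothetical path; adjust for your port)
{
  'variables': {
    # 80 * 1024 * 1024 = 83886080 bytes, so 3840x2160 VP9 streams
    # have enough media buffer headroom.
    'cobalt_media_buffer_initial_capacity': 83886080,
  },
}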

Related

ffmpeg scaled output resolution is off by 1px for non-standard inputs [duplicate]

I want to convert a video recorded with QuickTime (resolution 750x1334) to 1080x1920.
Apple only accepts a resolution of 1080x1920 for iPhone 6 Plus App Review videos.
My Final Cut Pro X license key has expired, and
ffmpeg -i AppPreviewIphone6.mp4 -vf scale=1080:1920 AppPreviewIphone6PLUS.mp4 gives me a resolution of 1079x1920 instead of 1080x1920.
Do you have an idea what software I can use to change the video resolution?
Try
ffmpeg -i AppPreviewIphone6.mp4 -vf scale=1080:1920,setsar=1 AppPreviewIphone6PLUS.mp4
scale adjusts the sample aspect ratio flag to preserve the original display aspect ratio, and QuickTime/Apple reports the aspect-corrected dimensions. 750x1334 is not exactly 9:16 the way 1080x1920 is, hence the 1079 being reported. Adding setsar=1 forces square pixels, so the stored and displayed dimensions are both 1080x1920.
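To see where the 1079 comes from: the source's display aspect ratio is 750/1334 ≈ 0.5622, so at a display height of 1920 the aspect-corrected width is 1920 × 750/1334 ≈ 1079.5, which gets reported as 1079. To check what a file actually stores after scaling, an ffprobe call like this works (filename taken from the question):

ffprobe -v error -select_streams v:0 -show_entries stream=width,height,sample_aspect_ratio,display_aspect_ratio -of default=noprint_wrappers=1 AppPreviewIphone6PLUS.mp4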

Why does a video larger than 8176 x 4088 created using AVFoundation come out with a uniform dark green color on my Mac?

When I use AVFoundation to create an 8K (7680 x 4320) MP4 with frames directly drawn onto pixel buffers obtained from the pixel buffer pool, it works with kCVPixelFormatType_32ARGB.
However, if I use kCVPixelFormatType_32BGRA, the entire video has a uniform dark green color instead of the actual contents. This problem occurs at resolutions above 8176 x 4088.
What could be causing this problem?
AVAssetWriter.h in SDK 10.15 and in SDK 11.3 says:
The H.264 encoder natively supports ... If you need to work in the RGB domain then kCVPixelFormatType_32BGRA is recommended on iOS and kCVPixelFormatType_32ARGB is recommended on OSX.
AVAssetWriter.h in SDK 12.3 however says:
The H.264 and HEVC encoders natively support ... If you need to work in the RGB domain then kCVPixelFormatType_32BGRA is recommended on iOS and macOS.
AVAssetWriter.h on all three SDKs however also says:
If you are working with high bit depth sources the following yuv pixel formats are recommended when encoding to ProRes: kCVPixelFormatType_4444AYpCbCr16, kCVPixelFormatType_422YpCbCr16, and kCVPixelFormatType_422YpCbCr10. When working in the RGB domain kCVPixelFormatType_64ARGB is recommended.
Whatever the recommendations, the prelude below states that all of them are only for optimal performance, not for error-free encoding:
For optimal performance the format of the pixel buffer should match one of the native formats supported by the selected video encoder. Below are some recommendations:
Now, Keynote movie export with H.264 compression also exhibits the same problem, with the same size limits, on my Mid-2012 15-inch Retina MacBook Pro running Catalina (which supports up to Keynote 11.1). The problem does not occur on a later Mac running Monterey, where the latest version of Keynote (12.2) is supported.
I have not included code because Keynote movie export is a simple means to reproduce and understand the problem.
My motivation for asking this question is to obtain clarity on:
What is the right pixel format to use for MP4 creation?
What are safe size limits under which MP4 creation will be problem-free?
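Not an answer, but for concreteness, here is a minimal sketch of the writer setup being discussed, using the kCVPixelFormatType_32ARGB format that the question reports as working on macOS (Swift; the output path and dimensions are placeholders):

import AVFoundation

// Minimal sketch: H.264 MP4 writer fed from a pixel buffer pool.
// Output path, dimensions and pixel format are illustrative only.
let url = URL(fileURLWithPath: "/tmp/out.mp4")
let writer = try! AVAssetWriter(outputURL: url, fileType: .mp4)

let videoSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 7680,
    AVVideoHeightKey: 4320
]
let input = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)

let bufferAttributes: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32ARGB,
    kCVPixelBufferWidthKey as String: 7680,
    kCVPixelBufferHeightKey as String: 4320
]
let adaptor = AVAssetWriterInputPixelBufferAdaptor(
    assetWriterInput: input,
    sourcePixelBufferAttributes: bufferAttributes
)
writer.add(input)
// Frames are then drawn into buffers vended by adaptor.pixelBufferPool
// and appended with adaptor.append(_:withPresentationTime:).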

FFMpeg getting too high fps

The actual fps of the video is 25, but I keep getting 90 fps from ffmpeg.
I'm not sure why it is detecting the wrong framerate, and if I set 'fps=40', the fps gets even higher.
This is my ffmpeg code below:
var ffmpeg = require('fluent-ffmpeg'); // Node.js wrapper around the ffmpeg CLI

var cmd = ffmpeg()
  .input(data.img_path)          // input path supplied elsewhere
  .format('mp4')
  .videoCodec('libx264')
  .audioBitrate('48k')
  .audioChannels(2)
  .outputOptions('-movflags', 'frag_keyframe')
  .videoBitrate('1024k')
Is there any way to fix this?
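For reference, one way to check the frame rate ffmpeg actually reports for a file (output.mp4 here is just a placeholder for whatever the command above writes) is:

ffprobe -v error -select_streams v:0 -show_entries stream=r_frame_rate,avg_frame_rate -of default=noprint_wrappers=1 output.mp4

r_frame_rate is the stream's nominal base frame rate as ffmpeg sees it, while avg_frame_rate is total frames divided by duration, so comparing the two usually shows where an unexpected 90 fps is coming from.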

VLC transcoding RTMP with 12 second delay

I am using VLC to receive an RTMP stream from Unreal Media Server and transcode it to Ogg/Theora for an HTML5 video element. Here is the command line I use:
vlc.exe -I dummy rtmp://XX.XX.XX.XX:5119/live/LRFront :network-caching=0 :sout=#transcode{vcodec=theo,vb=200,scale=1,deinterlace=0,acodec=none}:http{mux=ogg,dst=:8181/stream.ogg}
But this produces a 10-12 second delay when viewing, and it even freezes quite often.
When played directly in a Flash player (RTMP stream) there is a 50 ms delay at most, so it has to be something in the transcoding.
Could anyone point me in the right direction?
My PC specs:
CPU: Core i7
RAM: 16 GB
GPU: Nvidia GeForce GTX 660
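Not a full answer, but one setting worth experimenting with is VLC's stream-output muxer cache, which is separate from :network-caching; a variant of the command above that also lowers that cache (the 200 ms figure is just a starting point to try, not a verified fix) would look like:

vlc.exe -I dummy --sout-mux-caching=200 rtmp://XX.XX.XX.XX:5119/live/LRFront :network-caching=0 :sout=#transcode{vcodec=theo,vb=200,scale=1,deinterlace=0,acodec=none}:http{mux=ogg,dst=:8181/stream.ogg}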

How to set video preview size of AVCaptureDevice?

Working on a camera application with AVFoundation on Mac OS X 10.7.
I used [session setSessionPreset:] to set the video resolution, but it only affects the output movie file. My application needs to switch the preview resolution between low and high; some cameras I am using output a high resolution by default, like 2048x1536, which is not suitable for preview.
How can I achieve this?
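For context, this is roughly how the preset switch referred to above looks (shown in Swift for brevity; session is assumed to be an already-configured AVCaptureSession, and this only illustrates the call the question mentions, not a complete solution):

import AVFoundation

// Switch an existing capture session between a lower preset for preview
// and a high preset for recording, applying the change atomically.
func setPreviewQuality(of session: AVCaptureSession, low: Bool) {
    let preset: AVCaptureSession.Preset = low ? .medium : .high
    guard session.canSetSessionPreset(preset) else { return }
    session.beginConfiguration()
    session.sessionPreset = preset
    session.commitConfiguration()
}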
