Configure ffmpeg/ffserver in OpenShift - ffmpeg

I've managed to set up ffserver and stream videos on my laptop.
How can I do the same using OpenShift?
I want to install ffserver and ffmpeg in OpenShift and stream videos.
I've Googled a few links but nothing seems to be useful.

Since you can't install RPMs inside your gear, you'll need to compile everything from source. It looks like a solution was presented in this thread: https://www.openshift.com/forums/openshift/ffmpeg
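For reference, a rough sketch of what a compile-from-source build might look like on an OpenShift v2 gear is below. The version number, download URL and configure flags are illustrative only, and $OPENSHIFT_DATA_DIR is the gear's writable data directory; older FFmpeg releases (ffserver was dropped in FFmpeg 4.0) build ffserver from the same source tree, so one compile should cover both.
# download and unpack the FFmpeg sources into the gear's writable area (version is an example)
cd $OPENSHIFT_DATA_DIR
curl -O https://ffmpeg.org/releases/ffmpeg-2.8.tar.gz
tar xzf ffmpeg-2.8.tar.gz
cd ffmpeg-2.8
# install under the data dir, since system locations are not writable inside a gear;
# --disable-yasm avoids needing an assembler on the gear
./configure --prefix=$OPENSHIFT_DATA_DIR/ffmpeg --disable-yasm
make && make install
# make the freshly built binaries visible to the shell
export PATH=$OPENSHIFT_DATA_DIR/ffmpeg/bin:$PATH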

Related

No such element or plugin 'cameraundistort'

How do I install a missing plugin in GStreamer on Windows 10? I want to undistort incoming images from a stream using GStreamer.
However, when I try
gst-inspect-1.0 opencv cameraundistort
it gives info about it.
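For what it's worth, gst-inspect-1.0 is usually given a single plugin or element name per invocation, and cameraundistort only exists when the opencv plugin from gst-plugins-bad was built with OpenCV support. A minimal sketch of the usual checks follows; the test pipeline is illustrative only, since real use needs calibration settings (e.g. produced by cameracalibrate):
# inspect the plugin and the element separately, one name per invocation
gst-inspect-1.0 opencv
gst-inspect-1.0 cameraundistort
# illustrative test pipeline, assuming the element is present
gst-launch-1.0 videotestsrc ! videoconvert ! cameraundistort ! videoconvert ! autovideosink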

JavaFX MediaPlayer stutters on Mac with mp4 files

To reproduce the problem you can use the JavaFX sample project AdvancedMedia. Replace the FLV video with an MP4 one whose encoding is supported by JavaFX.
Edit: I thought maybe people were having trouble editing the sample. Just place a video in the project root folder and set MEDIA_URL to new File("video.mp4").toURI().toString().
It works great on Windows platforms. However, on Mac the video stutters at the beginning and before it ends. When running the project from a terminal with java -jar AdvancedMedia.jar it shows the following messages:
AVF info: checkDecoderUsage, vaCodecString: <Gen6>
AVF info: pp_hw_name: RenderingEngine, id: 3ea, m_pp_nowait: 1
AVF info: RingBufferPool wr:0, rd:108, reset:93, warning:0
I've seen this error occur on MacBook Pro and Air models, with i5/i7 processors, etc. Is there any way to solve the stuttering/error messages?
I tested running with -Dprism.verbose=true and I saw a major difference between Windows and Mac platforms: on Windows the Prism platform init order is d3d sw while on Mac it is es2 sw; the Prism pipeline name is com.sun.prism.d3d.D3DPipeline and com.sun.prism.es2.ES2Pipeline respectively.
Maybe a possible solution would be to use the same D3D pipeline on Mac? But running with -Dprism.order=d3d,sw gives java.lang.ClassNotFoundException: com.sun.prism.d3d.D3DPipeline.
I also tried increasing the JVM memory with the -Xms option, but it had no effect. So it really seems to be a codec issue, even though the videos play smoothly on other platforms.
I was testing a Windows-generated jar on Mac, so I also tried building the AdvancedMedia example on Mac itself, thinking it might make a difference, but nothing changed.
It seems that maybe this is more like a JavaFX bug report than a question.
A little off-topic, but I have to say that before this I tried C++ and VLCj approaches for my application and I changed it only because of Mac functionality issues...
I found out that this bug happens on Mac with any H.264-encoded video, regardless of the file extension. So, as described in the docs, the only other alternative JavaFX offers is VP6 encoding, which is not easy to get - see my other question - but at least the videos will run smoothly and without any rendering error messages.
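As a side note, one quick way to confirm whether a given file really is H.264-encoded, assuming an FFmpeg install with ffprobe is available, is:
# prints container and stream details, including the video codec (e.g. h264)
ffprobe -hide_banner video.mp4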

HX711 module and NodeMCU devkit 0.9

Good morning.
I have a NodeMCU devkit 0.9 and I am trying to interface it with a load cell amplifier (HX711), which I've seen has a module for NodeMCU.
I tried 2 options:
A custom build from nodemcu-build.com for the master and dev branches. After flashing the bin file and opening the ESPlorer /dev/ttyUSBx port, I tried to upload the basic web server code from here. But when I go to the browser and type the IP, nothing is displayed.
Building my own firmware, uncommenting the HX711 module, limiting my flash size to 4M, and using esp-open-sdk for the xtensa-lx106-elf toolchain. After building without errors I flashed the generated bin file and the same thing happened: the web server doesn't seem to be running.
I am a bit confused because the only bin that works with my devkit 0.9 board is the last one available in the releases, 0.9.6-dev_20150704, but that one doesn't have the modules I need to proceed with my HX711 project.
When I build from the 1.4/1.5/2.0 NodeMCU firmware I'm not able to get the same things working (at least the basic web server code). I tried several combinations of user configs (baud rate, enabling the devkit 0.9 define, develop mode, auto flash vs 4M flash).
What procedures should I follow to have the HX711 working with the NodeMCU devkit 0.9?
Best regards.
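Not a full answer, but for reference a typical flashing step for a self-built image with esptool.py looks roughly like the sketch below. The serial port, offsets and file names are hypothetical and depend on the build: older all-in-one images are usually written as a single file at 0x00000, while builds against newer SDKs typically produce separate 0x00000.bin and 0x10000.bin segments.
# hypothetical port and file names; adjust to your actual build output
esptool.py --port /dev/ttyUSB0 erase_flash
esptool.py --port /dev/ttyUSB0 write_flash 0x00000 bin/0x00000.bin 0x10000 bin/0x10000.bin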

"No accelerated colorspace conversion found from yuv420p to bgr24" when using OpenCV with FFMPEG on mac

When trying to acquire a frame from a video file (I've tried several video formats) I am getting the error message "No accelerated colorspace conversion found from yuv420p to bgr24".
The exact same code ran perfectly fine on a Windows machine, and I couldn't get it to run on a Mac even after recompiling and installing FFmpeg and OpenCV. I am using Lion as my OS.
Any ideas?
OpenCV uses a specific video encoding/decoding back-end for each OS. On Windows it uses FFmpeg, which in turn can use some of the codecs installed on the machine. On Mac it uses QuickTime, and I think it can also be compiled with FFmpeg.
Make sure you have QuickTime up to date, and maybe install some codecs (is there such a notion on Mac?).
Hope it helps!
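If the goal is to rebuild OpenCV against FFmpeg on the Mac, the relevant CMake switch is WITH_FFMPEG; a rough sketch, assuming a source checkout and an FFmpeg install the build can find (option names as in OpenCV 2.x):
cd opencv && mkdir build && cd build
# prefer the FFmpeg video I/O back-end over the QuickTime one
cmake -D WITH_FFMPEG=ON -D WITH_QUICKTIME=OFF ..
make -j4 && sudo make install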

Converting media files for HTTP Live Streaming on Windows?

I want to be able to run HTTP Live Streaming from a server so that I can play back the files on my iPhone over HTTP. I know it's possible to play media files through Safari without live streaming, but I'd like to give it a go.
As far as I can tell, the only tools available for converting media files into the formats required for the live stream are for Mac OSX. I don't have a Mac, and I'm wondering if there are any equivalent tools for Windows?
FFmpeg offers builds for Mac, Windows, and Linux: http://ffmpeg.org/download.html
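For example, reasonably recent FFmpeg builds include an HLS muxer, so a rough starting point on Windows could look like the sketch below; the options are illustrative and assume the source is already H.264/AAC, which is what iOS expects.
# copy the existing H.264/AAC streams and cut them into ~10-second HLS segments plus a playlist
ffmpeg -i input.mp4 -c copy -f hls -hls_time 10 -hls_list_size 0 stream.m3u8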
