A few years back, I wrote a small utility library around DShow/DSound to let me play MP3s in a Windows C++ application. Is that still the normal way to do it in a C++/MFC app, or has that area of DirectX been subsumed into the general Windows APIs?
The motivation is simple: we use the standard Windows PlaySound function for WAVs, and would like to be able to play MP3s through a similarly simple API, either one provided by Windows or something we write ourselves to wrap more complex functionality.
EDIT: this is for a large, commercial, closed-source project, and we only want to play things simply; paying a lot for a library won't fly.
PlaySound() natively supports MP3 as long as it is embedded in a WAV file.
People don't realize that WAV is a container format.
Download the ffmpeg utilities and rewrap the file in a WAV container while preserving the MP3 codec:
ffmpeg -i input.mp3 -c copy -f wav embedded_mp3.wav
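As a minimal sketch of the playback side, assuming the embedded_mp3.wav produced above is in the working directory (PlaySound requires linking against winmm.lib):
#include <windows.h>
#include <mmsystem.h>
#pragma comment(lib, "winmm.lib")

int main()
{
    // SND_FILENAME: the first argument is a file path.
    // SND_SYNC: block until playback has finished.
    PlaySound(TEXT("embedded_mp3.wav"), NULL, SND_FILENAME | SND_SYNC);
    return 0;
}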
You can either use DirectShow (though it is no longer part of DirectX) or rely on a third-party library such as BASS, FMOD, mpg123, or even libwmp3.
If you don't want to use DirectShow anymore (but why change if your existing code keeps working?), you can use MCI:
mciSendString("open la_chenille.mp3 type mpegvideo alias song1", NULL, 0, 0);
mciSendString("play song1", NULL, 0, 0);
mciSendString("close song1", NULL, 0, 0);
IGraphBuilder::RenderFile is an easy way to play any audio file.
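A minimal sketch of that approach (error handling omitted, the file path is hypothetical, and you link against strmiids.lib):
#include <dshow.h>
#pragma comment(lib, "strmiids.lib")

int main()
{
    CoInitialize(NULL);

    IGraphBuilder* graph = NULL;
    IMediaControl* control = NULL;
    IMediaEvent*   events  = NULL;

    CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER,
                     IID_IGraphBuilder, (void**)&graph);

    // Build a filter graph that can render the file, audio included.
    graph->RenderFile(L"C:\\music\\song.mp3", NULL);

    graph->QueryInterface(IID_IMediaControl, (void**)&control);
    graph->QueryInterface(IID_IMediaEvent, (void**)&events);

    control->Run();                                // start playback

    long evCode = 0;
    events->WaitForCompletion(INFINITE, &evCode);  // block until the file finishes

    events->Release();
    control->Release();
    graph->Release();
    CoUninitialize();
    return 0;
}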
You could use the MCI Windows functions:
https://msdn.microsoft.com/en-us/library/ms709626
MCI can play many audio file formats, including MP3, WAV, and MIDI.
If I recall correctly it does not require DirectX.
The PlaySound function might also work for you.
If you don't want to pay for any license and want to do it in-house, parse your MP3 file yourself and pass the decoded audio to XAudio2.
It's something you can do once (2-3 hours at most) and use forever. :P
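A rough sketch of the XAudio2 side, assuming your parser has already produced 16-bit stereo PCM at 44.1 kHz in pcmData/pcmBytes (both hypothetical names; the decoding itself is the part you would write):
#include <windows.h>
#include <xaudio2.h>
#pragma comment(lib, "xaudio2.lib")

void PlayDecodedMp3(const BYTE* pcmData, UINT32 pcmBytes)
{
    IXAudio2* xaudio = NULL;
    IXAudio2MasteringVoice* master = NULL;
    IXAudio2SourceVoice* voice = NULL;

    CoInitializeEx(NULL, COINIT_MULTITHREADED);
    XAudio2Create(&xaudio, 0, XAUDIO2_DEFAULT_PROCESSOR);
    xaudio->CreateMasteringVoice(&master);

    // Format of the decoded samples; adjust to whatever your decoder outputs.
    WAVEFORMATEX fmt = {0};
    fmt.wFormatTag      = WAVE_FORMAT_PCM;
    fmt.nChannels       = 2;
    fmt.nSamplesPerSec  = 44100;
    fmt.wBitsPerSample  = 16;
    fmt.nBlockAlign     = fmt.nChannels * fmt.wBitsPerSample / 8;
    fmt.nAvgBytesPerSec = fmt.nSamplesPerSec * fmt.nBlockAlign;

    xaudio->CreateSourceVoice(&voice, &fmt);

    XAUDIO2_BUFFER buf = {0};
    buf.pAudioData = pcmData;
    buf.AudioBytes = pcmBytes;
    buf.Flags      = XAUDIO2_END_OF_STREAM;

    voice->SubmitSourceBuffer(&buf);
    voice->Start(0);
    // Keep the voices alive until playback finishes, then release them and call CoUninitialize().
}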
You could have a look at BASS. It's a simple-to-use audio library, free for noncommercial use.
Related
I wanted to know if there is a way to convert a regular MP4 to a fragmented MP4 via JavaScript (like mp4box does). Is it efficient enough (it's not supposed to be a complicated task)? Has anyone written something like this?
To make it harder, can it be done on the fly? Meaning I would not download the whole MP4 from the server, but download it in parts and convert them into fragments compatible with fragmented MP4 and MPEG-DASH. I'm trying to avoid having to use two different file types to play a video, or having to run mp4box on my entire library in advance.
Regardless, is it possible to convert H.264-compatible files in different containers (MOV, FLV, etc.) to fragmented MP4 without a server, i.e. do it in the browser with JavaScript somehow?
appreciate the help,
Yug
I am working on something similar (which led me here), but no clue so far. However, here is what I have found:
Broadway:
https://github.com/mbebenita/Broadway
The idea is that you write C/C++ code using the FFmpeg source library, then use Emscripten to compile your C/C++ code into JavaScript. I have yet to start working with this method, so I'm not sure whether it will work or not. If you do try it, let me know.
I'm working on basic sound notification scripts and I'm using mpg123 and mp3 files because I want a very, very lightweight way to play sound files. I would much prefer to use wav, but mpg123 is extremely small.
Is there a simple way to play wav files in Ruby?
Having pitch control etc. would be nice, but right now I just want efficiency as running an external app is pretty clunky.
Thanks!
You want to use an external program for playing files. I've used afplay successfully in the past.
This question arises out of a combination of this being my first time working with video and unfamiliarity with Macs. Basically I'm finding it difficult to figure out how to play a video (within a QWidget, or otherwise) using any standard format, e.g. avi, mpeg, mov, etc. In particular,
QMovie::supportedFormats() gives me only .gif and .mng, but I need to use standard formats. Is there a way to increase the number of supported formats?
Phonon requires the presence of a 'backend' which the user has to implement himself. I looked to see if I could somehow do this with Quicktime, but I couldn't get the application to launch--and anyway I didn't really see how to do that. Also, Phonon looks pretty heavyweight, I'd like to avoid it if I could.
While there are plenty of avi (et al.) players floating around on the web, I think it's probably unlikely I'd be able to use them--I need to start, stop, and change the playback speed of videos programmatically i.e. through my C++ program.
I'm not sure why this should be so hard--working with images in Qt is a snap by comparison. So: What's a good way to play videos from within a C++/Qt program?
Stop what you are doing right now: Phonon is the past, Qt Mobility is the future.
After you download, compile and install Qt Mobility, check the examples: videowidget and videographicsitem, located at: qt-mobility-opensource-src-1.2.0/examples/
They pretty much answer all your questions.
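For reference, a minimal sketch in the spirit of the videowidget example, assuming QMediaPlayer/QVideoWidget from QtMultimediaKit (Qt Mobility 1.2) or the equivalent classes in later Qt releases; the file path is hypothetical:
#include <QApplication>
#include <QMediaPlayer>
#include <QVideoWidget>
#include <QUrl>

int main(int argc, char* argv[])
{
    QApplication app(argc, argv);

    QMediaPlayer player;
    QVideoWidget videoWidget;
    player.setVideoOutput(&videoWidget);   // render the video into the widget
    videoWidget.show();

    player.setMedia(QUrl::fromLocalFile("/path/to/video.avi"));  // hypothetical path
    player.setPlaybackRate(1.5);  // change playback speed programmatically
    player.play();                // pause() and stop() are also available

    return app.exec();
}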
I use Qt & OpenCV to record video and QAudioInput to record audio into wav format. I want to combine them into one video file. How can I accomplish this? I have researched so much but I can't seem to find a command to accomplish this.
I use both Windows and Mac.
FYI, this operation seems to be accomplished through the cmd-line in this thread. This approach may turn into an easy hack since you can call this command using system().
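Something along these lines, with hypothetical file names and assuming ffmpeg is on the PATH (the video stream is copied as-is and the WAV audio is encoded to AAC):
#include <cstdlib>

int main()
{
    // Mux the recorded video and the recorded WAV audio into one output file.
    return std::system("ffmpeg -i video.avi -i audio.wav -c:v copy -c:a aac output.mkv");
}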
But if you still want to do it programmatically, I suggest you take a look at Dranger's FFmpeg tutorials. They are 8 interesting tutorials that show how to do simple things, from taking snapshots of a video to more complex things like writing a simple video player with audio/video sync.
These tutorials teach how to work independently with audio and video streams, which is what you need to do: read the audio stream from the WAV file and then insert it as the audio stream of a video file.
Maybe not directly related to what you are aiming for, but this answer demonstrates how to use FFmpeg to retrieve the audio stream of one file and play it with SDL, while simultaneously using OpenCV to retrieve video frames and display them in an SDL window.
Anyone know of a vorbis decoder library that can be used on Windows Phone 7?
The lack of native code interop makes reusing any of the native code implementations difficult (impossible?), but if there are tricks to do that, I'm open to them as well.
There is a managed implementation for Mono called csvorbis; it includes a sample which outputs a WAV file, and this didn't need many changes to work with XNA's SoundEffect class. I decoded a whole track at once, which took a few seconds in the emulator, so you may need to stream it using DynamicSoundEffectInstance for better results. The mooncodecs folder has a codec for the desktop version based on csvorbis which may be worth a look as well.
Ogg Vorbis is not a supported codec on Windows Phone 7 and the platform supports no way of adding support for custom codecs.
The options available are to write your own decoder/converter in managed code or to convert the original source files.
I suspect the second option will be easier.