What I actually want is to store the frames of a video in a char array, using FFmpeg.
The constraint is to use only MSVC; I'm not allowed to use the usual Windows build tweak because of maintainability issues.
So I considered using the shared build to accomplish the task. It consists only of DLLs, with no .lib files, so I tried loading one of the DLLs with HINSTANCE hInstLibrary = LoadLibrary("avcodec-54.dll"); and it works. But I couldn't find the interface of this DLL published anywhere. Could anyone help with this? How do I know which functions of the DLL I can call, and what parameters I can pass, so that I can get the video frames?
Use the public interface of ffmpeg from
ffmpegdir/include/libavcodec/
ffmpegdir/include/libavformat/
etc.
For example, to open a file for reading, use avformat_open_input from ffmpegdir/include/libavformat/avformat.h:
AVFormatContext * ctx = NULL;
int err = avformat_open_input(&ctx, file_name, NULL, NULL);
You can get the latest builds of ffmpeg from http://ffmpeg.zeranoe.com/builds/
Public header files can be found in dev builds (http://ffmpeg.zeranoe.com/builds/win32/dev/).
Update:
Here is a working example (no additional static linking):
#include "stdafx.h"
#include <windows.h>
#include <libavformat/avformat.h>
typedef int (__cdecl *__avformat_open_input)(AVFormatContext **, const char *, AVInputFormat *, AVDictionary **);
typedef void (__cdecl *__av_register_all)(void);
int _tmain(int argc, _TCHAR* argv[])
{
const char * ffp = "f:\\Projects\\Temp\\testFFMPEG\\Debug\\avformat-54.dll";
HINSTANCE hinstLib = LoadLibraryA(ffp);
__avformat_open_input __avformat_open_input_proc = (__avformat_open_input)GetProcAddress(hinstLib, "avformat_open_input");
__av_register_all __av_register_all_proc = (__av_register_all)GetProcAddress(hinstLib, "av_register_all");
__av_register_all_proc();
::AVFormatContext * ctx = NULL;
int err = __avformat_open_input_proc(&ctx, "g:\\Media\\The Sneezing Baby Panda.flv", NULL, NULL);
return 0;
}
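To get at the frame data itself you can resolve further exports the same way. What follows is only a rough, untested sketch of how the example could continue (just before its return 0;): it assumes the libavformat/libavcodec 54 era signatures for av_read_frame and av_free_packet, assumes av_free_packet is exported by avcodec-54.dll rather than the format DLL, and it copies the still-compressed packet bytes; decoding to raw pixels would additionally need the avcodec decoding functions.
// Hypothetical continuation of the example above (uses hinstLib and ctx from there).
// Needs <stdlib.h> and <string.h> for malloc/memcpy.
typedef int  (__cdecl *av_read_frame_t)(AVFormatContext *, AVPacket *);
typedef void (__cdecl *av_free_packet_t)(AVPacket *);

av_read_frame_t av_read_frame_proc = (av_read_frame_t)GetProcAddress(hinstLib, "av_read_frame");

// av_free_packet is exported by the codec DLL, not the format DLL
HINSTANCE hAvcodec = LoadLibraryA("avcodec-54.dll");
av_free_packet_t av_free_packet_proc = (av_free_packet_t)GetProcAddress(hAvcodec, "av_free_packet");

AVPacket pkt;
while (av_read_frame_proc(ctx, &pkt) >= 0)
{
    // pkt.data / pkt.size is one compressed frame; copy it into your own char buffer
    char * frameBytes = (char *)malloc(pkt.size);
    memcpy(frameBytes, pkt.data, pkt.size);
    /* ... use frameBytes ... */
    free(frameBytes);
    av_free_packet_proc(&pkt);
}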
I've been working on a credential provider and I've been debugging it through logging. I recently learned about the CredUIPromptForWindowsCredentials() API, which lets me invoke my provider from somewhere other than the login screen or a remote desktop connection. The only way I can currently get my credential to display is to set the last parameter to CREDUIWIN_SECURE_PROMPT; I've tried various combinations of the flags with no luck. My CP works, that's not the problem. The problem is easier debugging. Only once have I had to go to rescue mode after making my laptop unbootable. ;) The trouble with the CREDUIWIN_SECURE_PROMPT flag is that I then don't have access to the debugger, because the login screen takes over and I can't get back to my debugger. I suppose the only workaround would be to remote debug on another machine with this API, but I'd prefer not to hassle with that.
My CP is registered at HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Authentication\Credential Providers\{55157584-ff0f-48ce-9178-a4e290901663} and the default value is "MyCredProvider" (for this example; the GUID and provider name have been changed to protect the guilty). Also ignore the LsaString struct, where bad things would happen on a copy, which I'm not doing.
Is there any way to get my custom CP to show without using the secure prompt?
#include <windows.h>
#include <tchar.h>
#include <iostream>
#include <string>
#include <memory>
#include <EvoApi.h>
#include <decrypt.h>
#include <atlbase.h>
#include <Lmwksta.h>
#include <StrSafe.h>
#include <LMAPIbuf.h>
#include <LMJoin.h>
#include <wincred.h>
#include <NTSecAPI.h>
#pragma warning(disable : 4996)
#pragma comment(lib, "netapi32.lib")
#pragma comment(lib, "credui.lib")
#pragma comment(lib, "secur32")
using namespace std;
template <size_t SIZE = 256>
struct LsaString : public LSA_STRING
{
    LsaString()
    {
        MaximumLength = SIZE;
        Length = 0;
        Buffer = pBuf.get();
    }
    LsaString(LPCSTR pWhat)
    {
        MaximumLength = SIZE;
        Length = 0;
        Buffer = pBuf.get();
        Init(pWhat);
    }
    void Init(LPCSTR pWhat)
    {
        size_t len = strlen(pWhat);
        if (len >= SIZE)
            throw;
        strcpy(Buffer, pWhat);
        Length = (USHORT) len;
    }
    unique_ptr<char[]> pBuf = make_unique< char[] >(SIZE);
};
int _tmain(int argc, wchar_t* argv[])
{
#if 1
    wstring me(_T("MYLOGING"));
    wstring url(_T("Header"));
    wstring message(_T("Enter credentials for ..."));

    CREDUI_INFOW credInfo;
    credInfo.pszCaptionText = url.c_str();
    credInfo.hbmBanner = nullptr;
    credInfo.hwndParent = NULL;
    credInfo.pszMessageText = message.c_str();
    credInfo.cbSize = sizeof(CREDUI_INFOW);

    ULONG authPackage = 0;
    HANDLE lsaHandle = NULL;
    LsaConnectUntrusted(&lsaHandle);

    LsaString<> lsaString("MyCredProvider");
    //LsaString<> lsaString(MICROSOFT_KERBEROS_NAME_A); // works ... as far as finding in LsaLookupAuth...
    //LsaString<> lsaString(NEGOSSP_NAME_A); // works ... as far as finding in LsaLookupAuth...
    ULONG ulPackage = 0;
    LsaLookupAuthenticationPackage(lsaHandle, &lsaString, &ulPackage);

    void* pBlob = nullptr;
    ULONG blobSize = 0;
    DWORD dwFlags = CREDUIWIN_GENERIC; //CREDUIWIN_SECURE_PROMPT
    CredUIPromptForWindowsCredentials(&credInfo, 0, &ulPackage, NULL, 0, &pBlob, &blobSize, FALSE, dwFlags);
    if (pBlob) CoTaskMemFree(pBlob);
#endif
    return 0;
}
I have a very simple Command Line Tool in Xcode:
#include <stdio.h>
#include <stdlib.h>
#include <assert.h>

int main(int argc, const char * argv[])
{
    void *p = calloc(32, 1);
    assert(p);
    free(p);
    return 0;
}
When I run Instruments->Allocations it shows one living block. The free seems to be ignored.
In the olden days, I remember that you could actually still use the last freed block. So I tried this:
#include <stdio.h>
#include <stdlib.h>
#include <assert.h>

int main(int argc, const char * argv[])
{
    void *p = calloc(32, 1);
    assert(p);
    free(p);
    void *q = calloc(32, 1);
    assert(q);
    free(q);
    return 0;
}
Now, Instruments->Allocations shows no living blocks. This seems correct.
Can anyone explain or reproduce the problem I am seeing in the first program?
I'm using Xcode 4.1.1
Thanks.
Let me rephrase the comments above.
Apple LLVM in Xcode 5 resolved the alloc/free behavior, so no living blocks are reported now and free() runs as expected.
I'm trying to intercept the openat() system call on Linux using a custom shared library that I can load via LD_PRELOAD. An example intercept-openat.c has this content:
#define _GNU_SOURCE
#include <sys/types.h>
#include <sys/stat.h>
#include <unistd.h>
#include <stdio.h>
#include <dlfcn.h>
int (*_original_openat)(int dirfd, const char *pathname, int flags, mode_t mode);

void init(void) __attribute__((constructor));
int openat(int dirfd, const char *pathname, int flags, mode_t mode);

void init(void)
{
    _original_openat = (int (*)(int, const char *, int, mode_t))
        dlsym(RTLD_NEXT, "openat");
}

int openat(int dirfd, const char *pathname, int flags, mode_t mode)
{
    fprintf(stderr, "intercepting openat()...\n");
    return _original_openat(dirfd, pathname, flags, mode);
}
I compile it via gcc -fPIC -Wall -shared -o intercept-openat.so intercept-openat.c -ldl. Then, when I run this small example program:
#include <fcntl.h>
#include <unistd.h>

int main(int argc, char *argv[])
{
    int fd;

    fd = openat(AT_FDCWD, "/home/feh/.vimrc", O_RDONLY);
    if(fd == -1)
        return -1;

    close(fd);
    return 0;
}
The openat() call is intercepted by the library:
$ LD_PRELOAD=./intercept-openat.so ./openat
intercepting openat()...
However, the same does not happen with GNU tar, even though it uses the same system call:
$ strace -e openat tar cf /tmp/t.tgz .vimrc
openat(AT_FDCWD, ".vimrc", O_RDONLY|O_NOCTTY|O_NONBLOCK|O_NOFOLLOW|O_CLOEXEC) = 4
$ LD_PRELOAD=./intercept-openat.so tar cf /tmp/t.tgz .vimrc
So the custom openat() from intercept-openat.so is not being called. Why is that?
It uses the same system call, but apparently it does not call it via the same C function. Alternatively, it could be that it does, but that it's statically linked.
Either way, I think you've proved that it never dynamically links a function named "openat". If you still want to pursue this option, you might like to see if it links against a specific version of that function, but that's a long shot.
You can still intercept the system call by writing your program to use ptrace. This is the same interface used by strace and gdb. It will have a higher performance penalty though.
http://linux.die.net/man/2/ptrace
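For illustration, a minimal tracer along those lines might look like this. It's only a sketch for x86_64 Linux: the syscall number 257 and the orig_rax register are platform assumptions, and all error handling is omitted.
#include <stdio.h>
#include <sys/ptrace.h>
#include <sys/types.h>
#include <sys/user.h>
#include <sys/wait.h>
#include <unistd.h>

int main(int argc, char *argv[])
{
    if (argc < 2)
        return 1;

    pid_t child = fork();
    if (child == 0) {
        /* Child: ask to be traced, then exec the real command. */
        ptrace(PTRACE_TRACEME, 0, NULL, NULL);
        execvp(argv[1], &argv[1]);
        _exit(127);
    }

    int status;
    waitpid(child, &status, 0);                     /* initial stop after execvp */
    while (1) {
        ptrace(PTRACE_SYSCALL, child, NULL, NULL);  /* run to next syscall boundary */
        waitpid(child, &status, 0);
        if (WIFEXITED(status))
            break;

        struct user_regs_struct regs;
        ptrace(PTRACE_GETREGS, child, NULL, &regs);
        /* 257 is __NR_openat on x86_64; this fires on both syscall entry and exit */
        if (regs.orig_rax == 257)
            fprintf(stderr, "child is in openat()\n");
    }
    return 0;
}
Run it as, say, ./tracer tar cf /tmp/t.tgz .vimrc. Unlike the LD_PRELOAD approach this also catches statically linked callers or direct syscall invocations, at the price of stopping the child twice per system call.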
I would like to make my debug handler (installed with qInstallMsgHandler) handle UTF-8; however, it seems it can only be defined as void myMessageOutput(QtMsgType type, const char *msg), and const char* doesn't handle UTF-8 (once displayed, it's just random characters).
Is there some way to define this function as void myMessageOutput(QtMsgType type, QString msg), or maybe some other way to make it work?
This is my current code:
void myMessageOutput(QtMsgType type, const char *msg) {
    QString message = "";
    QString test = QString::fromUtf8(msg);
    // If I break into the debugger here, both "test" and "msg" contain a question mark.
    switch (type) {
    case QtDebugMsg:
        message = QString("[Debug] %1").arg(msg);
        break;
    case QtWarningMsg:
        message = QString("[Warning] %1").arg(msg);
        break;
    case QtCriticalMsg:
        message = QString("[Critical] %1").arg(msg);
        break;
    case QtFatalMsg:
        message = QString("[Fatal] %1").arg(msg);
        abort();
    }
    Application::instance()->debugDialog()->displayMessage(message);
}
Application::Application(int argc, char *argv[]) : QApplication(argc, argv) {
    debugDialog_ = new DebugDialog();
    debugDialog_->show();
    qInstallMsgHandler(myMessageOutput);
    qDebug() << QString::fromUtf8("我");
}
If you step through the code in the debugger you will find out that QDebug and qt_message first construct a QString from the const char* and then use toLocal8Bit on this string.
The only way I can think of to circumvent this is to use your own encoding (something like "[E68891]"), or some other encoding such as uuencode or base64 that uses only ASCII characters, and decode the string in your message handler.
You should also consider using the qDebug("%s", "string") form to avoid quotes and additional whitespace (see this question).
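A rough sketch of that base64 idea (my own illustration; encodeForDebug is a made-up helper name, Application::instance()->debugDialog() is borrowed from the question, and the Qt 4 QByteArray base64 API is assumed):
// Hypothetical helper: wrap the UTF-8 bytes in base64 so that only ASCII
// travels through QDebug and the const char* handler interface.
QString encodeForDebug(const QString &s)
{
    return QString::fromLatin1(s.toUtf8().toBase64().constData());
}

void myMessageOutput(QtMsgType type, const char *msg)
{
    // Decode the base64 payload back into the original UTF-8 string.
    QString message = QString::fromUtf8(QByteArray::fromBase64(msg).constData());
    Application::instance()->debugDialog()->displayMessage(message);
}

// Usage: qDebug() << encodeForDebug(QString::fromUtf8("我"));
// (the quotes and trailing space that qDebug adds contain no base64
// characters, so fromBase64 skips them)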
Edit: the toLocal8Bit happens in the destructor of QDebug, which is called at the end of a qDebug statement (qdebug.h line 85). At least on the Windows platform this calls toLatin1, thus misinterpreting the string. You can prevent this by running the following lines at the start of your program:
QTextCodec *codec = QTextCodec::codecForName("UTF-8");
QTextCodec::setCodecForLocale(codec);
On some platforms UTF-8 seems to be the default text codec.
Try passing the data in UTF-8 and extracting it in your function with something like QString::fromUtf8, which takes a const char* as input.
The problem is that the operator<<(const char *) method expects a Latin1-encoded string, so you should pass a proper UTF-8 QString to QDebug like this:
qDebug() << QString::fromUtf8("我");
... and from inside the message handler expect a UTF-8 string:
QString message = QString::fromUtf8(msg);
And that should work like a charm. ;)
For more information please read the QDebug reference manual.
You could also do the wrong thing: keep passing UTF-8 encoded strings via << and convert the strings with the horrible QString::fromUtf8(QString::fromUtf8(msg).toAscii().constData()) call.
Edit: This is the final example that works:
#include <QString>
#include <QDebug>
#include <QMessageBox>
#include <QApplication>
void
myMessageOutput(QtMsgType type, const char *msg)
{
    QMessageBox::information(NULL, NULL, QString::fromUtf8(msg), QMessageBox::Ok);
}

int
main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    qInstallMsgHandler(myMessageOutput);
    qDebug() << QString::fromUtf8("我");
    return 0;
}
Please note that QDebug doesn't do any charset conversion if you don't instantiate QApplication. This way you wouldn't need to do anything special to msg inside the message handler, but I STRONGLY recommend that you instantiate it.
One thing you must be sure of is that your source code file is encoded in UTF-8. To check that, you might use a proper tool (the file command if you use Linux, for example) or just call QMessageBox::information(NULL, NULL, QString::fromUtf8("我"), QMessageBox::Ok) and see if the proper message appears.
#include <QtCore/QCoreApplication>
#include <stdio.h>
#include <QDebug>
void myMessageOutput(QtMsgType type, const char *msg)
{
    fprintf(stderr, "Msg: %s\n", msg);
}

int main(int argc, char *argv[])
{
    qInstallMsgHandler(myMessageOutput);
    QCoreApplication a(argc, argv);
    qDebug() << QString::fromUtf8("我");
}
The code above works here perfectly, but I must stress that my console does support UTF-8, because if it did not, it would show a different character at that location.
So in the Win32 API, I have my main function defined thus:
wmain(int argc, WCHAR* argv[])
I'm passing some arguments to it, and I'd like to execute a switch based on the value of an argument, something like this:
wmain(int argc, WCHAR* argv[])
{
    char* temp = argv[];
    switch (temp) {
    case "one": blah blah;
    ...
    }
}
Of course, the temp = argv[] doesn't work; I'm looking for a suggestion on how to convert it. Right now I have an if-else-if chain going on, and it's VERY inefficient!
The reason I need to convert it is that I cannot execute a switch on a WCHAR*.
Thanks for looking.
You can't execute a switch on a char* either. (But when you actually need to convert a WCHAR* to a char*, use WideCharToMultiByte.)
You need to use if/else if with lstrcmpi, CompareString or some other string compare function.
Alternatively, use one of the parameter-parsing libraries like argtable or getopt.
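For example, an if/else chain with lstrcmpiW works directly on the WCHAR* arguments, with no conversion at all (a quick sketch, assuming the usual wmain signature):
#include <windows.h>

int wmain(int argc, WCHAR* argv[])
{
    if (argc < 2)
        return 1;

    // lstrcmpiW compares wide strings case-insensitively and returns 0 on a match
    if (lstrcmpiW(argv[1], L"one") == 0)
    {
        // handle "one"
    }
    else if (lstrcmpiW(argv[1], L"two") == 0)
    {
        // handle "two"
    }
    return 0;
}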
I am not sure this is a good idea. A WCHAR* could contain Unicode characters which cannot be mapped to a char* in a meaningful way. In case you want to ignore this, there is a forum post at http://www.codeguru.com/forum/showthread.php?t=336106 which has some suggestions for converting from WCHAR* to char*.
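If you do decide to convert, a WideCharToMultiByte call might look roughly like this (a sketch; the CP_UTF8 code page and the fixed-size buffer are my own choices, not from that forum post):
#include <windows.h>

// Convert a NUL-terminated wide string to UTF-8.
// Returns the number of bytes written (0 on failure).
int ToUtf8(const WCHAR *wide, char *out, int outSize)
{
    return WideCharToMultiByte(CP_UTF8, 0, wide, -1, out, outSize, NULL, NULL);
}

// Usage:
//   char buf[256];
//   if (ToUtf8(argv[1], buf, sizeof(buf))) { /* buf holds a NUL-terminated UTF-8 string */ }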
Try converting it from std::wstring to std::string; it's easy, though maybe there is a shorter way.
Convert the WCHAR* to a std::wstring using the std::wstring constructor, and then convert that std::wstring to a std::string.
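The shortest version of that conversion I know of is a character-by-character narrowing, which, as noted above, only round-trips correctly for plain-ASCII arguments (a sketch):
#include <string>

// Naive narrowing: each wchar_t is truncated to a char, so this is
// only safe when the argument contains ASCII characters.
std::string Narrow(const wchar_t *arg)
{
    std::wstring ws(arg);
    return std::string(ws.begin(), ws.end());
}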
Here's a quick example I wrote some time ago.
Create a new Win32 console application and select ATL support. Add this and compile/run:
#include "stdafx.h"
#include <iostream>
using namespace std;
int _tmain(int argc, _TCHAR* argv[])
{
// A _TCHAR is a typedef'd, depending on whether you've got a unicode or MBCS build
// ATL Conversion macros are documented here
// http://msdn.microsoft.com/en-us/library/87zae4a3(VS.80).aspx
// Declare USES_CONVERSION in your function before using the ATL conversion macros
// e.g. T2A(), A2T()
USES_CONVERSION;
TCHAR* pwHelloWorld = _T("hello world!");
wcout << pwHelloWorld << endl;
// convert to char
char* pcHelloWorld = T2A(pwHelloWorld);
cout << pcHelloWorld << endl;
cin.get();
return 0;
}
Of course, you can't switch on a string, but this should give you the info you need in order to read a WCHAR string into a char string. From there, you can convert to an int easily enough.
Hope this helps ;)