I want to use SymGetSourceFile to retrieve a source file from a source server, using info from a dump file. But the first parameter is a handle to a process, and during postmortem debugging we don't have a process. Is it meant to be used only by live-debugging tools? How can I use it from a postmortem debugging tool?
BOOL IMAGEAPI SymGetSourceFile(
  HANDLE  hProcess,
  ULONG64 Base,
  PCSTR   Params,
  PCSTR   FileSpec,
  PSTR    FilePath,
  DWORD   Size
);
https://learn.microsoft.com/en-us/windows/win32/api/dbghelp/nf-dbghelp-symgetsourcefile
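(Side note on the hProcess parameter: per the SymInitialize documentation, it need not be a real process handle when fInvadeProcess is FALSE; any unique value works as a session id, which is how postmortem tools drive dbghelp. Below is a minimal sketch along those lines; the image name, base address, and the assumption that the PDB is source-indexed are hypothetical, not taken from this dump.)

#include <windows.h>
#include <dbghelp.h>
#include <stdio.h>

int main(void)
{
    // any unique value; dbghelp only uses it to identify the session
    HANDLE hSession = (HANDLE)0x1234;
    if (!SymInitialize(hSession, NULL, FALSE))
        return 1;

    // hypothetical module; in practice take name/base from the dump's module list
    DWORD64 base = SymLoadModuleEx(hSession, NULL, "Application.exe", NULL,
                                   0x000000dd6f5f0000ULL, 0, NULL, 0);

    char path[MAX_PATH] = { 0 };
    // Params can be NULL; dbghelp should then fall back to the srcsrv data in the PDB
    if (base != 0 && SymGetSourceFile(hSession, base, NULL, "Application.cs",
                                      path, sizeof(path)))
        printf("source extracted to %s\n", path);

    SymCleanup(hSession);
    return 0;
}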
Update:
I have tried using the IDebugAdvanced3 interface for the same purpose, but get HR = 0x80004002 (E_NOINTERFACE) from the GetSourceFileInformation call.
char buf[1000] = { 0 };
HRESULT hr = g_ExtAdvanced->GetSourceFileInformation(DEBUG_SRCFILE_SYMBOL_TOKEN,
    "Application.cs",
    0x000000dd6f5f1000, 0, buf, 1000, 0);
if (SUCCEEDED(hr))
{
    dprintf("GetSourceFileInformation = %s", buf);
    char buftok[5000] = { 0 };
    hr = g_ExtAdvanced->FindSourceFileAndToken(0, 0x000000dd6f5f1000,
        "Application.cs", DEBUG_FIND_SOURCE_TOKEN_LOOKUP,
        buf, 1000, 0, buftok, 5000, 0);
    if (SUCCEEDED(hr))
    {
        dprintf("FindSourceFileAndToken = %s", buftok); // print the found file, not the token buffer
    }
    else
        dprintf("FindSourceFileAndToken HR = %x", hr);
}
else
    dprintf("GetSourceFileInformation HR = %x", hr);
I have a dump that has this module and its PDB loaded, and I pass an address within the module, 0x000000dd6f5f1000, to GetSourceFileInformation.
This started as a comment but grew too long, so I'm adding it as an answer.
GetSourceFileInformation, iirc, checks the source servers (source paths that start with srv* or %srcsrv%).
It returns a token for use with FindSourceFileAndToken.
If you have a known offset (0x1070 == main() in the case below),
use GetLineByOffset; this has the added advantage of reloading all the modules.
Hope you have your private pdb for the dump file you open.
This is EngExtCpp syntax:
Hr = m_Client->OpenDumpFile("criloc.dmp");
Hr = m_Control->WaitForEvent(0, INFINITE);
unsigned char Buff[BUFFERSIZE] = { 0 };
ULONG Buffused = 0;
// read the raw ModuleListStream out of the minidump
DEBUG_READ_USER_MINIDUMP_STREAM MiniStream = { ModuleListStream, 0, 0, Buff, BUFFERSIZE, Buffused };
Hr = m_Advanced2->Request(DEBUG_REQUEST_READ_USER_MINIDUMP_STREAM, &MiniStream,
    sizeof(DEBUG_READ_USER_MINIDUMP_STREAM), NULL, NULL, NULL);
MINIDUMP_MODULE_LIST *modlist = (MINIDUMP_MODULE_LIST *)&Buff;
ULONG Line = 0;
char FileBuffer[0x300] = { 0 };
ULONG Filesize = 0;
ULONG64 Displacement = 0;
Hr = m_Symbols->GetLineByOffset(modlist->Modules[0].BaseOfImage + 0x1070, &Line,
    FileBuffer, 0x300, &Filesize, &Displacement);
Out("getlinebyoff returned %x\nsourcefile is at %s line number is %d\n", Hr, FileBuffer, Line);
This is partial source; adapt it to your needs.
The output of the extension command is pasted below:
0:000> .load .\mydt.dll
0:000> !mydt
Loading Dump File [C:\Users\xxxx\Desktop\srcfile\criloc.dmp]
User Mini Dump File with Full Memory: Only application data is available
OpenDumpFile Returned 0
WaitForEvent Returned 0
Request Returned 0
Ministream Buffer Used 28c
06 00 00 00 00 00 8d 00 00 00 00 00 00 e0 04 00
f0 9a 05 00 2d 2e a8 5f ba 14 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
43 00 00 00 4a 38 00 00 00 00 00 00 00 00 00 00
40 81 00 00 00 00 00 00 00 00 00 00 00 00 00 00
No of Modules =6
Module[0]
Base = 8d0000
Size = 4e000
getlinebyoff returned 0
sourcefile is at c:\users\xxx\desktop\misc\criloc\criloc.cpp line number is 21 <<<<<<<<<
||1:1:010> lm
start end module name
008d0000 0091e000 CRILOC (private pdb symbols) C:\Users\xxxx\Desktop\misc\CRILOC\CRILOC.pdb
||1:1:010>
and the actual source file contents on that path:
:\>grep -i -n main CRILOC.CPP
20:int main(void) << the curly brace is on line 21
UPDATE:
Yes, if the source file is not source-indexed (CVS, Perforce, ...), GetSourceFileInformation() will not return a token.
It checks for a token using the Which parameter,
and the returned info can be used in FindSourceFileAndToken().
If your source is not source-indexed and you only have a source path,
use FindSourceFileAndToken() with the DEBUG_FIND_SOURCE_FULL_PATH flag.
Be aware that you need to either call SetSourcePath(), issue the .srcpath command, set the _NT_SOURCE_PATH environment variable, or use the -srcpath command-line switch prior to invoking FindSourceFileAndToken().
See below for a walkthrough.
Source file and contents:
:\>ls *.cpp
mydt.cpp
:\>cat mydt.cpp
#include <engextcpp.cpp>
#define BSIZE 0x1000
class EXT_CLASS : public ExtExtension {
public:
    EXT_COMMAND_METHOD(mydt);
};
EXT_DECLARE_GLOBALS();
EXT_COMMAND(mydt, "mydt", "{;e,o,d=0;!mydt;}")
{
    HRESULT Hr = m_Client->OpenDumpFile("criloc.dmp");
    Hr = m_Control->WaitForEvent(0, INFINITE);
    char Buff[BSIZE] = { 0 };
    ULONG Buffused = 0;
    DEBUG_READ_USER_MINIDUMP_STREAM MiniStream = { ModuleListStream, 0, 0,
        Buff, BSIZE, Buffused };
    Hr = m_Advanced2->Request(DEBUG_REQUEST_READ_USER_MINIDUMP_STREAM, &MiniStream,
        sizeof(DEBUG_READ_USER_MINIDUMP_STREAM), NULL, NULL, NULL);
    MINIDUMP_MODULE_LIST *modlist = (MINIDUMP_MODULE_LIST *)&Buff;
    //m_Symbols->SetSourcePath("C:\\Users\\xxx\\Desktop\\misc\\CRILOC");
    char srcfilename[BSIZE] = { 0 };
    ULONG foundsize = 0;
    Hr = m_Advanced3->FindSourceFileAndToken(0, modlist->Modules[0].BaseOfImage, "criloc.cpp",
        DEBUG_FIND_SOURCE_FULL_PATH, NULL, 0, NULL, srcfilename, 0x300, &foundsize);
    Out("gsfi returned %x\n", Hr);
    Out("srcfilename is %s\n", srcfilename);
}
compiled and linked with
:\>cat bld.bat
#echo off
set "INCLUDE= %INCLUDE%;E:\windjs\windbg_18362\inc"
set "LIB=%LIB%;E:\windjs\windbg_18362\lib\x86"
set "LINKLIBS=user32.lib kernel32.lib dbgeng.lib dbghelp.lib"
cl /LD /nologo /W4 /Od /Zi /EHsc mydt.cpp /link /nologo /EXPORT:DebugExtensionInitialize /Export:mydt /Export:help /RELEASE %linklibs%
:\>bld.bat
mydt.cpp
E:\windjs\windbg_18362\inc\engextcpp.cpp(1849): warning C4245: 'argument': conversion from 'int' to 'ULONG64', signed/unsigned mismatch
Creating library mydt.lib and object mydt.exp
:\>file mydt.dll
mydt.dll; PE32 executable for MS Windows (DLL) (GUI) Intel 80386 32-bit
executing
:\>cdb cdb
Microsoft (R) Windows Debugger Version 10.0.18362.1 X86
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
ntdll!LdrpDoDebuggerBreak+0x2c:
77d805a6 cc int 3
0:000> .load .\mydt.dll
0:000> .chain
Extension DLL chain:
.\mydt.dll: API 1.0.0, built Thu Mar 18 20:40:04 2021
[path: C:\Users\xxxx\Desktop\srcfile\New folder\mydt.dll]
0:000> !mydt
Loading Dump File [C:\Users\xxxx\Desktop\srcfile\New folder\criloc.dmp]
User Mini Dump File with Full Memory: Only application data is available
gsfi returned 80004002
srcfilename is
||1:1:010> .srcpath "c:\\users\\xxxx\\desktop\\misc\\criloc\\"
Source search path is: c:\\users\\xxxx\\desktop\\misc\\criloc\\
************* Path validation summary **************
Response Time (ms) Location
OK c:\\users\\xxxx\\desktop\\misc\\criloc\\
||1:1:010> !mydt
Loading Dump File [C:\Users\xxxx\Desktop\srcfile\New folder\criloc.dmp]
gsfi returned 0
srcfilename is c:\\users\\xxxx\\desktop\\misc\\criloc\\criloc.cpp
||2:2:021>
I'm trying to mprotect a subset of the functions bundled in a shared library, for the purposes of a larger feature.
Given the page-alignment requirements of mprotect, I've set up a separate section in my linker script that ensures that alignment:
SECTIONS
{
.protectedsection ALIGN(4096) : {
*(.protectedsection)
}
}
INSERT AFTER .rodata;
And in the declaration of the function I want protection on, I add the relevant GCC attribute:
// Setup with protection
void bar(int a) __attribute__ ((section ("protectedsection")));
// Setup without protection
void foo(int a);
I compile the resulting C code with GCC, passing the linker script via the -T option:
gcc -fpic -shared -T linkerscript.ld libfuncs.c -o libfuncs.so
Objdump'ing the result reveals that while the function is in the right section, the alignment isn't right:
Disassembly of section protectedsection:
00000000000010da <_Z3bari>:
10da: 55 push %rbp
10db: 48 89 e5 mov %rsp,%rbp
10de: 48 83 ec 10 sub $0x10,%rsp
10e2: 89 7d fc mov %edi,-0x4(%rbp)
10e5: 48 8d 3d e1 00 00 00 lea 0xe1(%rip),%rdi # 11cd <_fini+0x9>
10ec: e8 df f4 ff ff callq 5d0 <puts#plt>
10f1: 90 nop
10f2: c9 leaveq
10f3: c3 retq
Is what I'm trying to do here possible, and if so, how?
Found the problem: changing 'INSERT AFTER .rodata' to 'INSERT AFTER .text' fixed it. With that, I was able to set up a 'buffered' region for the code I want mprotect'ed, and it all works like a charm!
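For anyone who wants a concrete picture, here is a hedged sketch of how the aligned region could then be protected at runtime. It assumes the section keeps the C-identifier name protectedsection (so that GNU ld auto-generates the __start_protectedsection/__stop_protectedsection symbols); the protect_section helper is illustrative, not part of the original post.

#include <stdint.h>
#include <stdio.h>
#include <sys/mman.h>
#include <unistd.h>

// ld emits these for sections whose names are valid C identifiers
extern "C" char __start_protectedsection[];
extern "C" char __stop_protectedsection[];

// apply the given protection to the whole section, rounded out to page bounds
static int protect_section(int prot)
{
    long page = sysconf(_SC_PAGESIZE);
    uintptr_t begin = (uintptr_t)__start_protectedsection & ~(uintptr_t)(page - 1);
    uintptr_t end = ((uintptr_t)__stop_protectedsection + page - 1) & ~(uintptr_t)(page - 1);
    return mprotect((void *)begin, end - begin, prot);
}

int main()
{
    if (protect_section(PROT_READ) != 0)              // e.g. make bar() non-executable
        perror("mprotect");
    if (protect_section(PROT_READ | PROT_EXEC) != 0)  // and executable again
        perror("mprotect");
    return 0;
}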
I got a blue screen of death (BSOD) on my laptop some time ago. I read online that analyzing the minidump file in c:\windows\minidump will help me understand the cause behind the BSOD (and probably point to the culprit driver causing the error).
I used this online tool to analyze the error: http://www.osronline.com/page.cfm?name=analyze
It created a report, but I do not understand it. If you can understand it, please let me know.
Link to online crash analysis report: https://pastebin.com/raw/3Hhq7arw
Dump file location: https://drive.google.com/file/d/0BzNdoGke8tyRZk5YcHBKQV8ycFE/view?usp=sharing
Laptop config: Windows 7, 32-bit
You got a WHEA_UNCORRECTABLE_ERROR (124) bugcheck, which means there is a fatal hardware error:
The WHEA_UNCORRECTABLE_ERROR bug check has a value of 0x00000124. This
bug check indicates that a fatal hardware error has occurred.
Using the !errrec command in WinDbg with the value from parameter 2, I see you have an internal timer error on your Intel i3-3217U CPU:
===============================================================================
Common Platform Error Record # 8a0a401c
-------------------------------------------------------------------------------
Record Id : 01d1ff7437bf4c24
Severity : Fatal (1)
Length : 928
Creator : Microsoft
Notify Type : Machine Check Exception
Timestamp : 8/26/2016 8:32:29 (UTC)
Flags : 0x00000000
===============================================================================
Section 0 : Processor Generic
-------------------------------------------------------------------------------
Descriptor # 8a0a409c
Section # 8a0a4174
Offset : 344
Length : 192
Flags : 0x00000001 Primary
Severity : Fatal
Proc. Type : x86/x64
Instr. Set : x86
Error Type : Micro-Architectural Error
Flags : 0x00
CPU Version : 0x00000000000306a9
Processor ID : 0x0000000000000003
===============================================================================
Section 1 : x86/x64 Processor Specific
-------------------------------------------------------------------------------
Descriptor # 8a0a40e4
Section # 8a0a4234
Offset : 536
Length : 128
Flags : 0x00000000
Severity : Fatal
Local APIC Id : 0x0000000000000003
CPU Id : a9 06 03 00 00 08 10 03 - bf e3 ba 3d ff fb eb bf
00 00 00 00 00 00 00 00 - 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 - 00 00 00 00 00 00 00 00
Proc. Info 0 # 8a0a4234
===============================================================================
Section 2 : x86/x64 MCA
-------------------------------------------------------------------------------
Descriptor # 8a0a412c
Section # 8a0a42b4
Offset : 664
Length : 264
Flags : 0x00000000
Severity : Fatal
Error : Internal timer (Proc 3 Bank 3)
Status : 0xbe00000000800400
Address : 0x0000000085286f3c
Misc. : 0x0000000000000000
I see that you use the ASUS X550CA Laptop:
BiosVersion = X550CA.217
BiosReleaseDate = 01/23/2014
SystemManufacturer = ASUSTeK COMPUTER INC.
SystemProductName = X550CA
I see you use the older BIOS version .217, so update the BIOS to version .300; maybe that fixes the issue. If it doesn't, do a stress test of the CPU with Prime95 and the Intel Processor Diagnostic Tool.
See the simple code below:
#include <stdio.h>
int foo(int a)
{
    return a;
}
int main()
{
    printf("%x\n", foo);
    printf("%x\n", &foo);
    printf("%x\n", *foo);
    foo(1);
    return 0;
}
They all displayed the same value:
0x20453840
0x20453840
0x20453840
I used gdb to check; foo()'s entry point is:
(gdb) p foo
$1 = {int (int)} 0x100003d8 <foo>
The value 0x20453840 is actually a pointer to a pointer to foo():
(gdb) p /x *0x20453850
$3 = 0x100003d8
(gdb) si
0x10000468 76 foo(1);
0x10000464 <main+76>: 38 60 00 01 li r3,1
=> 0x10000468 <main+80>: 4b ff ff 71 bl 0x100003d8 <foo>
(gdb)
foo (a=541407312) at insertcode.c:57
57 {
=> 0x100003d8 <foo+0>: 93 e1 ff fc stw r31,-4(r1)
0x100003dc <foo+4>: 94 21 ff e0 stwu r1,-32(r1)
0x100003e0 <foo+8>: 7c 3f 0b 78 mr r31,r1
0x100003e4 <foo+12>: 90 7f 00 38 stw r3,56(r31)
(gdb)
So I think 0x100003d8 is the entry point.
I used gcc 4.6.2 to compile.
I have two questions:
Why is the function address defined differently on AIX? Is it related to gcc?
I have to use gcc, not xlC.
How do I get the real function address in C on AIX?
Thanks in advance!
Why is the function address defined differently on AIX?
nm -Pg ./f_addr | grep foo
Try this command, and you will see that you have two symbols: foo and .foo. One of them lives in the code segment (or text segment); the other lives in the data segment.
The purpose is, indeed, to create an indirection in function calls; this is important when creating/using shared libraries.
Is it related to gcc? I have to use gcc, not xlC.
No.
How do I get the real function address in C on AIX?
Please clarify your question: what do you want to do with the 'real address'?
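If what you are after is the entry-point address the debugger shows, here is a hedged sketch. It assumes the AIX function descriptor stores the code address in its first pointer-sized slot (followed by the TOC pointer), which matches the gdb output above; peeking through the descriptor like this is inherently non-portable.

#include <stdio.h>
#include <string.h>

int foo(int a) { return a; }

int main(void)
{
    /* on AIX, foo decays to the address of its descriptor in the data segment */
    void *desc = (void *)foo;
    void *code = NULL;
    memcpy(&code, desc, sizeof code);  /* first slot of the descriptor = entry point */
    printf("descriptor %p, code %p\n", desc, code);
    return 0;
}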
Well, the following should work; I can't find a reason why it shouldn't:
std::fstream f;
std::string myoutput;
f.imbue(std::locale(f.getloc(), new std::codecvt_utf16<wchar_t, std::little_endian | std::consume_header>));
f.open("c:\\test.txt", std::ios::in);
std::getline(f, myoutput);
The code is executed on the following file (in hex; it should spell "hello world"):
FF FE 68 00 65 00 6C 00 6C 00 6F 00 20 00 77 00 6F 00 72 00 6C 00 64 00
The ultimate goal is to abstract the encoding away: always treat a file as UTF-8 unless its first bytes are a BOM. The above code would be executed after reading a BOM and noticing the file is UTF-16; it should hence read the UTF-16 file and convert it to a UTF-8 string.
However, std::getline does not ignore the BOM (easily fixed), and furthermore it does not respect the fact that UTF-16 uses 2 bytes per code unit (it stops after reading the first 3 bytes upon seeing the 0 byte).
Now of course I could use std::wfstream. But as I wish to "hide" the Unicode type from the user, all the "filestreams" are stored inside a container for referencing, so the signature of all those filestreams has to be the same, based on char and std::string.
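(For reference, the BOM detection step amounts to something like the sketch below; the Enc enum and sniff() helper are made up for illustration, and anything without a recognized UTF-16 BOM falls back to UTF-8.)

#include <fstream>
#include <iostream>

enum class Enc { Utf8, Utf16LE, Utf16BE };

// read up to two bytes and classify; rewind when there is no UTF-16 BOM
Enc sniff(std::istream& f)
{
    unsigned char b[2] = { 0, 0 };
    f.read(reinterpret_cast<char*>(b), 2);
    if (f.gcount() == 2 && b[0] == 0xFF && b[1] == 0xFE) return Enc::Utf16LE;
    if (f.gcount() == 2 && b[0] == 0xFE && b[1] == 0xFF) return Enc::Utf16BE;
    f.clear();   // may have hit EOF on a very short file
    f.seekg(0);  // no BOM: default to UTF-8
    return Enc::Utf8;
}

int main()
{
    std::ifstream f("c:\\test.txt", std::ios::binary);
    std::cout << (sniff(f) == Enc::Utf16LE ? "UTF-16 LE\n" : "not UTF-16 LE\n");
}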
If you opened your file as basic_fstream<char>, you've already set both the external and the internal character width to 1 byte, and the locale facet you're applying will never be used.
Either read into a string and apply wstring_convert twice, or apply wbuffer_convert to make the internal character width bigger, and then use wstring_convert:
#include <codecvt>
#include <fstream>
#include <iostream>
#include <locale>
#include <string>

std::fstream f;
f.open("test.txt", std::ios::in | std::ios::binary);
std::wbuffer_convert<std::codecvt_utf16<wchar_t,
    0x10ffff,           // note: your 2nd parameter was wrong
    std::little_endian  // or consume_header, not both
    >> cvt1(f.rdbuf());
std::wistream wide_f(&cvt1);
std::wstring wstr;
std::getline(wide_f, wstr);
std::wstring_convert<std::codecvt_utf8<wchar_t>> cvt2;
std::string u8str = cvt2.to_bytes(wstr);
std::cout << u8str << '\n';
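To also illustrate the first option (a sketch I'm adding, not from the original answer): slurp the raw bytes, then convert in two wstring_convert steps, letting consume_header swallow the BOM:

#include <codecvt>
#include <fstream>
#include <iostream>
#include <iterator>
#include <locale>
#include <string>

int main()
{
    std::ifstream f("test.txt", std::ios::binary);
    std::string raw((std::istreambuf_iterator<char>(f)),
                    std::istreambuf_iterator<char>());

    // bytes -> wide: consume_header eats the BOM and picks the endianness
    std::wstring_convert<std::codecvt_utf16<wchar_t, 0x10ffff,
                                            std::consume_header>> cvt16;
    std::wstring wstr = cvt16.from_bytes(raw);

    // wide -> UTF-8
    std::wstring_convert<std::codecvt_utf8<wchar_t>> cvt8;
    std::cout << cvt8.to_bytes(wstr) << '\n';
}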