The libssh2.org example:
http://www.libssh2.org/examples/ssh2_exec.html
has the line:
char buffer[0x4000];
What is this line doing?
This line declares a 16 KiB buffer. 0x4000 is hexadecimal notation for 16384, so the buffer holds 16384 bytes, i.e. 16 KiB.
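As a quick sanity check, here is a tiny standalone C++ sketch (mine, not part of the libssh2 example) that reuses only the declaration and lets the compiler confirm the size:
#include <cstdio>

int main() {
    char buffer[0x4000];   // same declaration as in the libssh2 example
    (void)buffer;          // silence unused-variable warnings

    // 0x4000 hexadecimal == 16384 decimal == 16 * 1024 bytes == 16 KiB
    static_assert(sizeof(buffer) == 16384, "0x4000 bytes is 16 KiB");

    std::printf("buffer is %zu bytes (%zu KiB)\n",
                sizeof(buffer), sizeof(buffer) / 1024);
    return 0;
}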
For example, by convention I null-terminate a buffer (set the whole buffer to zero) the following way, example 1:
char buffer[1024] = {0};
And with the windows.h header we can call ZeroMemory, example 2:
char buffer[1024];
ZeroMemory(buffer, sizeof(buffer));
According to the documentation provided by Microsoft, ZeroMemory "fills a block of memory with zeros". I want to be accurate in my Windows application, so I thought: what better place to ask than Stack Overflow?
Are these two examples equivalent in logic?
Yes, the two examples are equivalent: the entire array is filled with zeros in both cases.
In the case of char buffer[1024] = {0};, you are explicitly setting only the first char element to 0, and then the compiler implicitly value-initializes the remaining 1023 char elements to 0 for you.
In C++11 and later, you can omit that first element value:
char buffer[1024] = {};
char buffer[1024]{};
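If you want to see it for yourself, here is a small sketch (mine, not from your code) that applies both approaches and checks every element; ZeroMemory comes from windows.h, so on another platform you could substitute memset(buffer, 0, sizeof(buffer)):
#include <windows.h>   // ZeroMemory
#include <algorithm>   // std::all_of
#include <iterator>    // std::begin, std::end
#include <cstdio>

int main() {
    char a[1024] = {0};        // example 1: first element explicit, the rest value-initialized
    char b[1024];
    ZeroMemory(b, sizeof(b));  // example 2: fill the whole block with zeros

    bool aAllZero = std::all_of(std::begin(a), std::end(a), [](char c) { return c == 0; });
    bool bAllZero = std::all_of(std::begin(b), std::end(b), [](char c) { return c == 0; });

    std::printf("example 1 all zero: %d\nexample 2 all zero: %d\n", aAllZero, bAllZero);
    return 0;
}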
I'm a little confused about the __LINKEDIT section.
Let me set the background:
What I understand about __LINKEDIT
In theory (http://www.newosxbook.com/articles/DYLD.html) the first "section" will be LC_DYLD_INFO.
If I check mach-o/loader.h I get:
#define LC_DYLD_INFO 0x22
...
struct dyld_info_command {
uint32_t cmd; /* LC_DYLD_INFO or LC_DYLD_INFO_ONLY */
uint32_t cmdsize; /* sizeof(struct dyld_info_command) */
...
If I check a Mach-O file with otool I get:
$ otool -l MyBinary | grep -B3 -A8 LINKEDIT
Load command 3
cmd LC_SEGMENT_64
cmdsize 72
segname __LINKEDIT
vmaddr 0x0000000100038000
vmsize 0x0000000000040000
fileoff 229376
filesize 254720
maxprot 0x00000001
initprot 0x00000001
nsects 0
flags 0x0
If I check the hex using xxd:
$ xxd -s 229376 -l 4 MyBinary
00038000: 1122 9002 ."..
I know that the endianness of my binary is little:
$ rabin2 -I MyBinary
arch arm
baddr 0x100000000
binsz 484096
bintype mach0
bits 64
canary false
class MACH064
crypto false
endian little
havecode true
intrp /usr/lib/dyld
laddr 0x0
lang swift
linenum false
lsyms false
machine all
maxopsz 16
minopsz 1
nx false
os darwin
pcalign 0
pic true
relocs false
sanitiz false
static false
stripped true
subsys darwin
va true
I can corroborate that the first section in __LINKEDIT is LC_DYLD_INFO by getting its offset from otool:
$ otool -l MyBinary | grep -B1 -A11 LC_DYLD_INFO
Load command 4
cmd LC_DYLD_INFO_ONLY
cmdsize 48
rebase_off 229376
rebase_size 976
bind_off 230352
bind_size 3616
weak_bind_off 0
weak_bind_size 0
lazy_bind_off 233968
lazy_bind_size 6568
export_off 240536
export_size 9744
If we check the file offset of __LINKEDIT (fileoff) and the one reported by LC_DYLD_INFO (rebase_off), we get the same value: 229376.
Everything is fine so far; it kind of makes sense.
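For what it's worth, the same cross-check can be done programmatically. A minimal sketch (not from the article; it assumes a thin, 64-bit, host-endian binary) that walks the load commands with the structs from mach-o/loader.h and prints both offsets:
#include <mach-o/loader.h>
#include <cstdint>
#include <cstdio>
#include <cstring>
#include <vector>

int main(int argc, char** argv) {
    if (argc < 2) return 1;
    FILE* f = std::fopen(argv[1], "rb");
    if (!f) return 1;
    std::fseek(f, 0, SEEK_END);
    long size = std::ftell(f);
    std::fseek(f, 0, SEEK_SET);
    std::vector<unsigned char> buf(size);
    std::fread(buf.data(), 1, buf.size(), f);
    std::fclose(f);

    const mach_header_64* mh = reinterpret_cast<const mach_header_64*>(buf.data());
    if (mh->magic != MH_MAGIC_64) return 1;  // thin, native-endian 64-bit binaries only

    const unsigned char* p = buf.data() + sizeof(mach_header_64);
    for (uint32_t i = 0; i < mh->ncmds; ++i) {
        const load_command* lc = reinterpret_cast<const load_command*>(p);
        if (lc->cmd == LC_SEGMENT_64) {
            const segment_command_64* seg = reinterpret_cast<const segment_command_64*>(p);
            if (std::strncmp(seg->segname, "__LINKEDIT", 16) == 0)
                std::printf("__LINKEDIT fileoff:      %llu\n",
                            (unsigned long long)seg->fileoff);
        } else if (lc->cmd == LC_DYLD_INFO || lc->cmd == LC_DYLD_INFO_ONLY) {
            const dyld_info_command* di = reinterpret_cast<const dyld_info_command*>(p);
            std::printf("LC_DYLD_INFO rebase_off: %u\n", di->rebase_off);
        }
        p += lc->cmdsize;  // advance to the next load command
    }
    return 0;
}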
My confusion
Now I'm in lldb and want to make sense of the memory.
I can read the memory at the offset:
(lldb) image dump sections MyBinary
...
0x00000400 container [0x0000000100ca0000-0x0000000100ce0000) r-- 0x00038000 0x0003e300 0x00000000 MyBinary.__LINKEDIT
Ok, let's read that memory:
(lldb) x/x 0x00000100ca0000
0x100ca0000: 0x02902211
So this is my problem:
0x02902211
Let's assume I don't know whether it's little- or big-endian. I should find 0x22 either at the beginning or at the end of the bytes, but it's in the middle? (This confuses me.)
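Just to spell out the little-endian read, here is a tiny sketch using the exact four bytes xxd printed; composing them least-significant-byte-first reproduces the 0x02902211 that lldb shows, with the 0x22 as the second-lowest byte:
#include <cstdint>
#include <cstdio>

int main() {
    // The four bytes at file offset 229376, exactly as xxd printed them: 11 22 90 02
    unsigned char bytes[4] = {0x11, 0x22, 0x90, 0x02};

    // Little-endian: the first byte in memory is the least significant one.
    uint32_t value = (uint32_t)bytes[0]
                   | ((uint32_t)bytes[1] << 8)
                   | ((uint32_t)bytes[2] << 16)
                   | ((uint32_t)bytes[3] << 24);

    std::printf("0x%08x\n", value);  // prints 0x02902211, matching the lldb output
    return 0;
}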
The 0x11, I guess, is the size, 17 in decimal, which might correspond to what I can see from the structure in loader.h (12 bytes + 5 bytes of padding?):
struct dyld_info_command {
uint32_t cmd; /* LC_DYLD_INFO or LC_DYLD_INFO_ONLY */
uint32_t cmdsize; /* sizeof(struct dyld_info_command) */
uint32_t rebase_off; /* file offset to rebase info */
uint32_t rebase_size; /* size of rebase info */
uint32_t bind_off; /* file offset to binding info */
uint32_t bind_size; /* size of binding info */
uint32_t weak_bind_off; /* file offset to weak binding info */
uint32_t weak_bind_size; /* size of weak binding info */
uint32_t lazy_bind_off; /* file offset to lazy binding info */
uint32_t lazy_bind_size; /* size of lazy binding infs */
uint32_t export_off; /* file offset to lazy binding info */
uint32_t export_size; /* size of lazy binding infs */
};
My questions
1.) Why is the 0x22 not at the end (or the beginning)? Or am I reading the offset incorrectly?
2.) otool says that the command size is 48 (that's 0x30 in hex), but I can't find it in the bytes next to 0x22. Where do I get the size from?
Thanks for taking the time to read all the way here, and thanks for any help.
Our team develops a POS solution for NFC cards on Ingenico devices.
What we use to read the card:
/* Open the MIFARE driver */
int ClessMifare_OpenDriver (void);
Return value: OK
/*Wait until a MIFARE contactless card is detected*/
int ClessMifare_DetectCardsEx (unsigned char nKindOfCard, unsigned int *pNumOfCards, unsigned int nTimeout);
Return value: OK
/*Retrieve the type of the MIFARE card and its UID */
int ClessMifare_GetUid (unsigned char nCardIndex, unsigned char *pKindOfCard, unsigned char *pUidLength, unsigned char *pUid);
Return Value:
Parameter 2: pKindOfCard (type of card)
Card1: CL_B_UNDEFINED
Card2: CL_B_UNDEFINED
Card3: CL_B_UNDEFINED
Card4: CL_MF_CLASSIC
Parameter 4: pUid (UID of the card)
Card1: "\004Br\302\3278\200"
Card2: "\004\333\354y\342\002\200"
Card3: "\004s\247B\344?\201"
Card4: "\016\310d\301"
But in real life we expect:
Card1 044272c2d73880
Card2 0ec864c1
Card3 0473a742e43f81
Card4 04dbec79e20280
From Android NFC readers we get the correct numbers, but the output from the Ingenico POS is quite different. What do we need to do to get these numbers in hex?
Thanks!
You are actually seeing the right UIDs here; there is just a representation issue you are not expecting. The return values you are quoting are C strings with octal escaping for non-printable characters: \nnn is the octal representation of a byte.
In the value "\004s\247B\344?\201", you have \004, a byte of value 0x04, followed by the printable character s (value 0x73), followed by \247 (value 0xa7), and so on.
You can convert to hex for debugging with Python, for example:
$ python2
>>> import binascii
>>> binascii.b2a_hex("\004Br\302\3278\200")
'044272c2d73880'
>>> binascii.b2a_hex("\004\333\354y\342\002\200")
'04dbec79e20280'
>>> binascii.b2a_hex("\004s\247B\344?\201")
'0473a742e43f81'
>>> binascii.b2a_hex("\016\310d\301")
'0ec864c1'
But overall, the data is all there.
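If you want the same conversion directly in your application, here is a minimal C++ sketch (the function name printUidHex is mine, and the byte values are Card1's UID; in your code you would pass the pUid buffer and pUidLength filled in by ClessMifare_GetUid):
#include <cstdio>

// Print a UID as lowercase hex, one byte at a time.
static void printUidHex(const unsigned char* uid, unsigned int len) {
    for (unsigned int i = 0; i < len; ++i)
        std::printf("%02x", uid[i]);
    std::printf("\n");
}

int main() {
    // Card1's UID: the raw byte values behind the escaped string "\004Br\302\3278\200".
    const unsigned char card1[] = {0x04, 0x42, 0x72, 0xc2, 0xd7, 0x38, 0x80};
    printUidHex(card1, sizeof(card1));  // prints 044272c2d73880
    return 0;
}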
I want to manually set the address of a Pointer to a value stored in a string variable. I have:
addr : String;
ptr : Pointer;
then:
addr:='005F5770';
How do I assign it to ptr?
Like this:
ptr := Pointer($005F5770);
You don't need a string variable since the address is a literal that is known at compile time.
In fact you can make this a constant since the value is known at compile time:
const
ptr = Pointer($005F5770);
Of course, if the value isn't a literal and really does start life as a string with hexadecimal representation then you first need to convert to an integer:
ptr := Pointer(StrToUInt64('$' + S));
Convert it to a UInt64 so that your code is immune to 32-bit pointer truncation when compiled for 64 bit.
Prepend the hexadecimal string with $ or 0x and use the standard StrToInt():
ptr := Pointer(StrToInt('$'+addr));
If your pointer values are large and you are targeting a 64-bit compiler, consider using StrToInt64().
Note that a typecast from integer to a pointer is needed.
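For comparison only (since this question is about Delphi), the same idea expressed as a C++ sketch: parse the hex digits into an integer, then cast that integer to a pointer. C++ has no '$' prefix; std::stoull takes the base explicitly:
#include <cstdint>
#include <cstdio>
#include <string>

int main() {
    std::string addr = "005F5770";  // hex string, mirroring addr := '005F5770' above

    // Parse the hex digits into an integer, then cast the integer to a pointer.
    std::uintptr_t value = std::stoull(addr, nullptr, 16);
    void* ptr = reinterpret_cast<void*>(value);

    std::printf("%p\n", ptr);
    return 0;
}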
I am trying to read UTF-8 content into a char*. My file does not have a BOM, so the code is straightforward (the file contains Unicode punctuation):
const char* fileData = "\u2010\u2020";
I cannot see how a single unsigned char (range 0 to 255) can contain a character with a value from 0 to 65535, so I must be missing something.
...
std::ifstream fs8("../test_utf8.txt");
if (fs8.is_open())
{
unsigned line_count = 1;
std::string line;
while ( getline(fs8, line))
{
std::cout << ++line_count << '\t' << line << '\n';
}
}
...
So how can I read a UTF-8 file into a char* (or even a std::string)?
Well, you ARE reading the file correctly into a std::string, and std::string does support UTF-8; it's probably your console* that cannot show non-ASCII characters.
Basically, when a character's code point is bigger than what a single char can represent, the character is simply encoded as several chars (bytes).
How, and with how many bytes? That is exactly what an encoding defines.
UTF-32, for example, represents every character, ASCII and non-ASCII alike, as 4 bytes; hence the "32" (each byte is 8 bits, 4*8 = 32).
Without any additional information on what OS you are using, we can't give advice on how your program can show the file's lines.
*Or, more exactly, the standard output, which will probably be implemented as console text.
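To make the "several chars per character" point concrete, here is a small sketch (mine, not from your code): it puts the UTF-8 bytes of U+2010 and U+2020, the two punctuation characters in your literal, into a std::string and dumps them. The bytes are spelled out explicitly so the example does not depend on the compiler's execution character set:
#include <cstdio>
#include <string>

int main() {
    // UTF-8 encoding of U+2010 (hyphen) and U+2020 (dagger): three bytes each.
    std::string s = "\xE2\x80\x90\xE2\x80\xA0";

    std::printf("length in bytes: %zu\n", s.size());  // 6, not 2
    for (unsigned char c : s)
        std::printf("%02x ", c);                      // e2 80 90 e2 80 a0
    std::printf("\n");
    return 0;
}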