I am using Windows XP Pro SP3 with a standard English keyboard. I live in the USA and have never touched the keyboard settings; it's a stock install. When I check the scan codes my program is returning, they are as follows:
A = 30
S = 31
D = 32
F = 33
G = 34
When I check the Microsoft specification (page 11 of the document: http://download.microsoft.com/download/1/6/1/161ba512-40e2-4cc9-843a-923143f3456c/scancode.doc ), it says:
A = 31
S = 32
D = 33
F = 34
G = 35
They are off by 1! Any ideas why?
The Microsoft Keyboard Scan Code Specification you quoted has six columns. For the A key:
key location: 31
keyboard: A
scan 1 make: 1E
scan 1 break: 9E
scan 2 make: 1C
scan 2 break: F0 1C
It looks like the "scan 2" set is an alternate hardware scan code set that's different from the original IBM PC scan codes ("scan 1"). Note that the "key location" is 31 while the "scan 1 make" is 1E hex, which is 30 in decimal. This might help explain what you're seeing with the values you originally posted. Perhaps you could try looking at keys such as Esc and ` that are quite different in each set (and not just off by one, which I think is misleading).
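For illustration, here is a minimal Python sketch. The make code for A (1E) is taken from the table above; the make codes for S, D, F and G (1F, 20, 21, 22) are the standard set 1 values from the same spec.

# "Scan 1 make" codes, in hex, for the keys in question.
scan1_make = {"A": 0x1E, "S": 0x1F, "D": 0x20, "F": 0x21, "G": 0x22}

for key, code in scan1_make.items():
    # Printed in decimal these are 30, 31, 32, 33 and 34 -- exactly the
    # values the program reported, one less than the "key location" column.
    print(key, code)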
You didn't say which API or Windows message you were using to get the scan code values you reported, but if you look in the detailed documentation for whatever you're using you might find more information.
Can you ascertain whether the keyboard is faulty at the hardware level or whether there is a problem with the keyboard driver?
I've got a .pfx file I can't share (work-related), but when I load it up on Windows my public key starts with 30 82 01 0a ... (truncated, see screenshot)
Windows screenshot
And when I load it on macOS it begins with BF:CF:10...
macOS screenshot
Even though it is the same file, the serial number of the loaded certificate and the public key seem to differ, which is causing problems for my C# Unity project when I try to connect to it. They have the same Authority Key Identifier, Subject Key Identifier, and Thumbprint, but differ in their Serial Number (4274337a15ab78c4 for Windows and 4788598903244265668 for macOS) as well as in their public key.
Is there a reason why Windows and macOS would differ in the details when loading the same certificate file?
It looks like they are importing it the same way, just displaying it differently. The difference in serial numbers is due to Windows showing it in hex and macOS in decimal.
The public key is a bit more complicated, but look closely at the top line of the Windows display. It starts with "30 82 01 0a 02 82 01 01 00", but the bytes following that, "bf cf 10 be e7 b1 1d af ...", match the first bytes of what macOS lists as the public key.
I don't have a Windows system to look at, but I think what's happening is that the "Public Key" it shows is actually an ASN.1 data structure containing the modulus and exponent that make up the public key. macOS, on the other hand, lists the modulus (labeled "Public Key") and exponent separately, and doesn't bother with the surrounding ASN.1 headers (the "30 82 ..." stuff).
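To make that concrete, here is a minimal Python sketch. The serial and the leading bytes are the ones quoted above; the hard-coded 9-byte offset is a simplification of real DER parsing, not a general decoder.

# Serial number: Windows shows hex, macOS shows decimal.
windows_serial_hex = "4274337a15ab78c4"
print(int(windows_serial_hex, 16))           # the same value expressed in decimal

# Public key: the Windows view is the DER-encoded RSAPublicKey structure.
#   30 82 01 0a     -> SEQUENCE, length 0x010a
#   02 82 01 01 00  -> INTEGER, length 0x0101, with a leading 0x00 pad byte
# What follows is the modulus, which is what macOS labels "Public Key".
windows_view = bytes.fromhex("3082010a0282010100bfcf10be")  # truncated sample
print(windows_view[9:].hex(" "))             # -> "bf cf 10 be", as on macOS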
Which APDU command gets the PIN from the smart card, and which one writes the changed PIN onto the card?
For writing the PIN onto the card I have found 80 D4 00 00 08 01 02 03 04 05 06 07 08 to set the PIN to 1 2 3 4 5 6 7 8, but we got 6D 00 in response, i.e. instruction code not supported or invalid.
Or are there any WIN APIs that can be used?
Thanks in advance.
Severe misunderstanding: nothing gets the stored PIN from the card. Using the VERIFY command you can only supply a comparison value and find out whether it is correct; if it is not, the retry counter will decrease and the PIN may block. There is the standard command CHANGE REFERENCE DATA, see ISO 7816-4, but standard commands have CLA=00, while you are currently trying CLA=80 (the first byte of the command).
6D00 can also be found there, and since it means "wrong INS code", the whole command may be wrong. (A PIN consisting of non-printable bytes is also somewhat untypical.)
Without knowing which card you have and which specification it complies with, you will not make significant progress.
While WinSCard may be your friend for getting the command transported, it will not help with finding the correct bytes.
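For the transport part, here is a minimal Python sketch using the pyscard package (which wraps WinSCard/PC/SC). The CLA/INS/P1/P2 values follow plain ISO 7816-4 CHANGE REFERENCE DATA with the old and new PIN concatenated in the data field, and the ASCII PINs are made up; your card's specification may require something entirely different.

from smartcard.System import readers

conn = readers()[0].createConnection()
conn.connect()

old_pin = [0x31, 0x32, 0x33, 0x34]   # "1234", purely illustrative
new_pin = [0x35, 0x36, 0x37, 0x38]   # "5678", purely illustrative

# CLA=00, INS=24 (CHANGE REFERENCE DATA), P1=00, P2=PIN reference (here 00)
apdu = [0x00, 0x24, 0x00, 0x00, len(old_pin) + len(new_pin)] + old_pin + new_pin
data, sw1, sw2 = conn.transmit(apdu)
print("SW: %02X %02X" % (sw1, sw2))   # 90 00 means success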
(This is my first question, so excuse me for any mistakes.)
I was messing around with debug.exe and tried to alter the BIOS date stored in address range FFFF:0005 to FFFF:000C.
-d FFFF:5 L 8
FFFF:0000 30 31 2F-30 31 2F 39 32 01/01/92
I finally figured out that to move to the address I want to modify, I had to point the DS register to it, not the CS register as erroneously stated on some sites (e.g. here).
-r DS
DS=073F
:FFFF
I also figured out that I can use the whole address to modify the exact memory address I want.
-e FFFF:000b
FFFF:000B 39.31 32.31
but then the output of the dump command remained unchanged!
-d FFFF:5 L 8
FFFF:0000 30 31 2F-30 31 2F 39 32 01/01/92
I suspect that there may be some "protected" areas in memory I cannot modify, but I couldn't find any documentation about that, which is why I am asking. Can anyone explain why and how this is happening?
Thank you
P.S. Note that I am using DosBox to emulate this so as not to brick my computer! (Maybe this is the problem?)
As the comments suggest, you are writing to ROM, so the values there can't be changed by your code. On modern machines you would get some sort of error as feedback for doing this, but on old hardware it's very common for writes to ROM to be silently ignored. In other words, the CPU will perform the requested operation anyway, but that operation will have no effect on the memory.
I use a SIMCOM GSM module to receive incoming messages. When I send an SMS from my mobile phone I see my normal number:
+CMT: "+38012345678", ...
But when an SMS comes from my cell operator, or from some named SMS service such as Google, I see something garbled, like this from Google:
+CMT: "16p6p6w237562767963656", ...
one more:
+CMT: "w49511#495946535451425", ...
and more:
+CMT: "#497966737471627", ...
According to the module documentation this parameter is named <oa> and means "GSM 03.40 TP-Originating-Address Address-Value string field".
Is it possible to decode it in any programming language, e.g. Python? What could it be? If I switch to UCS2 and decode from that, the result is exactly the same.
According to SIM800 Series AT Command Manual v1.10, page 114:
GSM 03.40 TP-Destination-Address Address-Value field in string
format; BCD numbers (or GSM default alphabet characters) are converted
to characters of the currently selected TE character set (refer
Command +CSCS in 3GPP TS 27.007); type of address given by
If the phone number in a CMT message does not start with a "+" sign, it is encoded with BCD numbers.
I tried to compare those numbers with the ASCII table. This is not exactly BCD encoding, but it looks very similar.
To decode "16p6p6w237562767963656" split it into pairs: 16 p6 p6 w2 37 56 27 67 96 36 56
then reverse each pair: 61 6p 6p 2w 73 65 72 76 69 63 65
Now compare them to the hex codes in the ASCII table and get the result: all services. You may wonder how to read 6p 6p 2w. So did I!
After searching other examples of encoded numbers, I made the assumption that the hex digits 0 and A-F are shown as different characters:
0 - w
A - (not seen)
B - #
C - p
D - (not seen)
E - +
F - #
I have no idea why the hex digits were replaced by these letters.
"w49511#495946535451425" stands for "#Y?KYIVSTAR". The code "11" is unprintable and has been replaced by "?".
"#497966737471627" stands for "Kyivstar".
Are you sure your module is set to text format (AT+CMGF=1) when receiving those SMS? If you switched your module off and on again, it is probably set to "PDU" mode, which is more suited for computers than for humans.
See the SIMCOM AT Command manual for details; it's very extensive (a 380-page PDF).
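If you drive the module from Python anyway, a minimal sketch along these lines (using pyserial; the port name and baud rate are assumptions for illustration) makes sure text mode is selected before waiting for +CMT lines:

import serial

ser = serial.Serial("/dev/ttyUSB0", 115200, timeout=2)  # adjust for your setup
ser.write(b"AT+CMGF=1\r")                     # 1 = text mode, 0 = PDU mode
print(ser.read(64).decode(errors="replace"))  # expect an "OK" response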
I want to be able to find out where the code appearing at the entry point comes from by looking at the PE header.
For example, this piece of code is the starting code of my program (401000h):
00401000 >/$ 58 POP EAX ; kernel32.76E93677
00401001 |. 2D 77360100 SUB EAX,13677
00401006 |. BB 4A184000 MOV EBX,<JMP.&kernel32.VirtualProtect>
I want to know where this code comes from. How can I find it without manually scanning my file? (To complete the example, here's a hex dump from the same file; the code now resides at 200h.)
Offset 0 1 2 3 4 5 6 7 8 9 A B C D E F
00000200 58 2D 77 36 01 00 BB 4A 18 40 00
How can I get from my virtual entry point (401000h) to the raw entry point (200h)?
I tried solving it myself of course. But I'm missing something. At first I thought:
.text[ Entrypoint (1000h) - VirtualOffset (1000d) ] = raw entrypoint
since the file alignment = 200, and the raw entry point was at the very start of my .text section, I thought I could use this for all the executables.
Solved, I made stupid mistakes when calculating the raw entry point
.text[ Entry point - Virtual offset ] + File Alignment = Raw entry point (relative to .text section)
To locate the offset in the file yourself, you need to have a look at the _IMAGE_NT_HEADERS structure. From this you can get the IMAGE_OPTIONAL_HEADER, which contains the member you are interested in, ImageBase. You can change its value with EditBin /REBASE, so there is little need to roll your own tool.
For reference, here is how you can determine the entry point via dumpbin.
You can use dumpbin /headers:
dumpbin /headers \Windows\bfsvc
Dump of file \Windows\bfsvc.exe
PE signature found
File Type: EXECUTABLE IMAGE
FILE HEADER VALUES
14C machine (x86)
4 number of sections
4A5BBFB3 time date stamp Tue Jul 14 01:13:55 2009
0 file pointer to symbol table
0 number of symbols
E0 size of optional header
102 characteristics
Executable
32 bit word machine
OPTIONAL HEADER VALUES
10B magic # (PE32)
9.00 linker version
DE00 size of code
2000 size of initialized data
0 size of uninitialized data
4149 entry point (01004149)
1000 base of code
F000 base of data
1000000 image base (01000000 to 01011FFF)
1000 section alignment
200 file alignment
For the entry point the image base value is relevant. But this is only true for images that are not ASLR-enabled; for ASLR-enabled images a random base address (one of 128 different ones) is chosen.
The flag that indicates whether an image is ASLR-enabled is the value 0x40, which is set in the DLL characteristics field:
8140 DLL characteristics
For svchost.exe, for example, it is set; for older programs the DLL characteristics value is generally 0.
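A quick way to check that flag programmatically is a small Python sketch with the pefile package (the path is just an example; 0x40 is IMAGE_DLLCHARACTERISTICS_DYNAMIC_BASE):

import pefile

pe = pefile.PE(r"C:\Windows\System32\svchost.exe")  # example path
print("ASLR enabled:", bool(pe.OPTIONAL_HEADER.DllCharacteristics & 0x40))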
Yours,
Alois Kraus
Have a look at this thread, including an answer with a detailed explanation: Calculating the file offset of an entry point in a PE file
AddressOfRawEntryPoint (in EXE file) = AddressOfEntryPoint + .text[PointerToRawData] - .text[VirtualAddress]
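In Python, a minimal sketch of that formula using the pefile package ("my_program.exe" is a placeholder path); it looks up whichever section contains the entry point RVA rather than assuming it is .text:

import pefile

pe = pefile.PE("my_program.exe")
entry_rva = pe.OPTIONAL_HEADER.AddressOfEntryPoint      # e.g. 0x1000

for section in pe.sections:
    start = section.VirtualAddress
    end = start + max(section.Misc_VirtualSize, section.SizeOfRawData)
    if start <= entry_rva < end:
        raw = entry_rva - section.VirtualAddress + section.PointerToRawData
        print("raw entry point file offset: %#x" % raw)  # e.g. 0x200
        break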