APDU Read Record ACR122 - nfc

I am trying to use a USB NFC card reader, ACR122U.
I managed to get the ID of the card by sending 0xFF, 0xCA, 0x00, 0x00, 0x00, and to tell the unit not to beep by sending the control command 0xFF, 0x00, 0x52, 0x00, 0x00. Then, using an Android app, I wrote a URL to the card's first record (record 0).
Following the specs here, I should send 0xFF, 0xB2, 0x00, 0x08, 0x00 to read the first record; however, I only get "c" as a response.
Does anyone have the actual command to send to get the first record?
Thanks!

You have to use the .Transmit() method instead of the .Control() method.
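To illustrate the distinction, here is a minimal Python sketch (assuming a PC/SC stack such as pyscard, where `connection` would be a `CardConnection`): card-level APDUs like the Read Record command from the question go through the transmit path, while reader-specific escape commands such as the buzzer control go through the control path. The helper names are illustrative, not from any library.

```python
# Pseudo-APDUs from the question
READ_RECORD_APDU = [0xFF, 0xB2, 0x00, 0x08, 0x00]  # read first record (transmit path)
GET_UID_APDU = [0xFF, 0xCA, 0x00, 0x00, 0x00]      # get card UID (transmit path)
BUZZER_OFF_CMD = [0xFF, 0x00, 0x52, 0x00, 0x00]    # reader escape command (control path)

def is_success(sw1, sw2):
    """PC/SC readers report success as SW1=0x90, SW2=0x00."""
    return sw1 == 0x90 and sw2 == 0x00

def read_record(connection):
    # connection.transmit() is the card data path; using the control/escape
    # path for this APDU is what produced the garbage response.
    data, sw1, sw2 = connection.transmit(READ_RECORD_APDU)
    if not is_success(sw1, sw2):
        raise IOError(f"read failed: SW1={sw1:02X} SW2={sw2:02X}")
    return data
```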

Related

What is the layout of the initial data block of an NFC Forum Tag 3 card?

Can anyone explain the format of the first data block of an NFC Forum Type 3 tag? I'm trying to emulate tags and therefore need to understand the format. I've found resources on the NDEF data blocks, but I can't find anything on the initial block. When I write to the PN532, I get the following output for the first block:
```0x10, 0x0c, 0x08, 0x00
0x3f, 0x00, 0x00, 0x00
0x00, 0x0f, 0x01, 0x00
0x00, 0x00, 0x00, 0x73```
I'm not quite sure how to decipher this. Can anyone explain, or direct me to a resource on how this first block is encoded?
Thanks
The NFC Forum Type 3 Tag specification has some detail on this initial block (the Attribute Information Block): http://apps4android.org/nfc-specifications/NFCForum-TS-Type-3-Tag_1.1.pdf
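The 16 bytes in the question decode cleanly as that block. A minimal Python sketch of the layout (field names per the Type 3 Tag spec; the last two bytes are a 16-bit checksum over bytes 0-13, which for the data above works out to 0x0073, matching the dump):

```python
def decode_attribute_block(block):
    """Decode the 16-byte Type 3 Tag Attribute Information Block."""
    assert len(block) == 16
    checksum = (block[14] << 8) | block[15]
    # Checksum is the 16-bit sum of bytes 0..13
    assert checksum == sum(block[:14]) & 0xFFFF, "bad checksum"
    return {
        "version": f"{block[0] >> 4}.{block[0] & 0x0F}",  # 0x10 -> "1.0"
        "nbr": block[1],                      # max blocks per Check (read) command
        "nbw": block[2],                      # max blocks per Update (write) command
        "nmaxb": (block[3] << 8) | block[4],  # number of usable blocks
        "writef": block[9],                   # 0x0F = write in progress, 0x00 = finished
        "rw_flag": block[10],                 # 0x01 = read/write, 0x00 = read-only
        "ndef_len": (block[11] << 16) | (block[12] << 8) | block[13],
    }

# The first block from the question
info = decode_attribute_block([
    0x10, 0x0C, 0x08, 0x00, 0x3F, 0x00, 0x00, 0x00,
    0x00, 0x0F, 0x01, 0x00, 0x00, 0x00, 0x00, 0x73,
])
```

So this tag reports format version 1.0, 63 usable blocks, read/write access, and an empty NDEF area.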

Capcom driver vulnerability: Bluescreens on the physical PC, but not in the virtual machine

I am investigating anti-cheat mechanisms in computer games using the popular, vulnerable Capcom driver.
You can pass user mode functions to the Capcom driver via DeviceIoControl() calls, which are then executed in kernel context.
Now I'm faced with a strange problem:
I issue the same DeviceIoControl() calls that many others have executed successfully.
In my virtual machine the DeviceIoControl() calls also work without problems.
However, when I execute the code on my physical PC, I get a blue screen with the message "SYSTEM_SERVICE_EXCEPTION".
Here is the code that works correctly in the VM, but not on my physical PC:
void __stdcall EmptyTestFunction(MmGetSystemRoutineAddress_t pMmGetSystemRoutineAddress, PVOID userData) {
}
void DriverLoadingTest() {
HANDLE device = OpenDevice("Htsysm72FB");
CapcomCodePayload* codePayload = (CapcomCodePayload*)VirtualAlloc(nullptr, sizeof(CapcomCodePayload), MEM_COMMIT, PAGE_EXECUTE_READWRITE);
BYTE codePayloadBuf[] = {
0xE8, 0x08, 0x00, 0x00, 0x00, // CALL $+8 ; skips the next 8 bytes and pushes their address (the UserFunction pointer slot) onto the stack
0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, // UserFunction address will be here
0x48, 0xBA, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, // MOV RDX, userData
0x58, // POP RAX ; RAX = address of the UserFunction pointer slot
0xFF, 0x20 // JMP [RAX] ; jump to UserFunction
};
*(ULONGLONG*)(codePayloadBuf + 5) = (ULONGLONG)EmptyTestFunction;
*(ULONGLONG*)(codePayloadBuf + 15) = (ULONGLONG)0;
codePayload->pointerToPayload = codePayload->payload;
ZeroMemory(codePayload->payload, PAYLOAD_BUFFER_SIZE);
CopyMemory(codePayload->payload, codePayloadBuf, sizeof(codePayloadBuf));
DWORD status = 0x0;
DWORD bytesReturned = 0x0;
DeviceIoControl(device, IOCTL_RunPayload64, &codePayload->pointerToPayload, sizeof(ULONG_PTR), &status, sizeof(status), &bytesReturned, 0);
printf("DeviceIoControl returned %08x\n", status);
}
I'm making only limited progress with the crash dump because I lack experience.
The crash happens every time the following instruction is executed:
mov cr4, rax
With rax=0000000000070678
The exception code is: c0000096
I've attached the WinDbg "!analyze -v" crash dump output at the end of my post.
My main concern is figuring out how to tackle the problem from here, because the situation that the exact same code works in the VM but not on my physical PC is completely new to me.
link to crash dump
CR4 is one of the x86 control registers and you are clearly glomming something in there which is causing a CPU exception.
That register consists of a set of flag bits as documented here, so let's look at which ones you are setting:
0x70678 = 1110000011001111000b, so we have the following:
0 VME Virtual 8086 Mode Extensions - OFF (sounds OK)
1 PVI Protected-mode Virtual Interrupts - OFF
2 TSD Time Stamp Disable - OFF
3 DE Debugging Extensions - ON
4 PSE Page Size Extension - ON
5 PAE Physical Address Extension - ON
6 MCE Machine Check Exception- ON
7 PGE Page Global Enabled - OFF
8 PCE Performance-Monitoring Counter enable - OFF
9 OSFXSR Operating system support for FXSAVE and FXRSTOR instructions - ON
10 OSXMMEXCPT Operating System Support for Unmasked SIMD Floating-Point Exceptions - ON
11 UMIP User-Mode Instruction Prevention - OFF
12 LA57 57-bit Linear Addresses (5-level paging) - OFF
13 VMXE Virtual Machine Extensions Enable - OFF
14 SMXE Safer Mode Extensions Enable - OFF
16 FSGSBASE Enables the instructions RDFSBASE, RDGSBASE, WRFSBASE, and WRGSBASE - ON
17 PCIDE PCID Enable - ON
18 OSXSAVE XSAVE and Processor Extended States Enable - ON
20 SMEP Supervisor Mode Execution Protection Enable - OFF
21 SMAP Supervisor Mode Access Prevention Enable - OFF
22 PKE Protection Key Enable - OFF
So one of these is upsetting the apple cart, and my money would be on bits 4 and / or 5.
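The bit-by-bit decoding above can be reproduced with a short script (bit positions taken from the CR4 flag documentation; reserved bits 15 and 19 are omitted):

```python
# CR4 flag bits (x86 control register 4)
CR4_BITS = {
    0: "VME", 1: "PVI", 2: "TSD", 3: "DE", 4: "PSE", 5: "PAE",
    6: "MCE", 7: "PGE", 8: "PCE", 9: "OSFXSR", 10: "OSXMMEXCPT",
    11: "UMIP", 12: "LA57", 13: "VMXE", 14: "SMXE", 16: "FSGSBASE",
    17: "PCIDE", 18: "OSXSAVE", 20: "SMEP", 21: "SMAP", 22: "PKE",
}

def decode_cr4(value):
    """Return the names of the CR4 flags set in value, lowest bit first."""
    return [name for bit, name in sorted(CR4_BITS.items()) if value >> bit & 1]

# The value from the crash: rax = 0x70678
flags = decode_cr4(0x70678)
```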
HOWEVER. Why is the code trying to set CR4 at all? I can't think of a single reason why you would want to do that in kernel mode, unless you are part of the OS. Which you are not.
Anyway, I hope that gives you something to go on. I dislike the question though because there is nowhere near enough context and have therefore voted to close (although I didn't vote it down because it does hold some interest for me).
It turned out that the Hyper-V hypervisor prohibits writing to the CR4 register, which led to the bluescreens.
I don't know whether this happens specifically because I'm entering the kernel through the Capcom driver. Other kernel modules presumably also access the CR4 register; if so, the problem is specific to the Capcom driver. Either way, anyone who hits the same problem should check whether the Hyper-V hypervisor service is enabled.

How to change polling sequence on ACR122U / PN532

I'm looking to develop a Go application that reads EMV and NFC cards using two separate custom wake-up frames.
My polling loop should check WUPA, then WUPB, then WUPCUST1, then WUPCUST2. Both custom frames follow the same protocol as Type-A.
At the end of the loop, it should return which (if any) of the four it has found (in normal cases it would find one or two of the four, e.g. Type-A and Custom 1).
I have been trying to configure this with pseudo-APDU commands to the PN532 chip, but I'm struggling to understand the documentation, which is very technical and contains many unreferenced terms and acronyms.
I can successfully call InJumpForDEP, InJumpForPSL, and InListPassiveTarget, but I get a 0x27 error when I try to call InATR with []byte{0xFF, 0x00, 0x00, 0x00, 0x04, 0xD4, 0x50, 0x00, 0x00} or []byte{0xFF, 0x00, 0x00, 0x00, 0x04, 0xD4, 0x50, 0x01, 0x00}.
Unfortunately, my knowledge of NFC hardware isn't deep enough to figure out what calls I need to make to the PN532 to configure this. Or indeed, if what I am trying to achieve is even possible.
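For reference, the frames in the question follow the ACR122U pseudo-APDU convention: CLA=0xFF, INS=0x00, P1=P2=0x00, an Lc byte, then the raw PN532 command starting with the host-to-chip direction byte 0xD4. A hedged Python sketch of that wrapping (the helper name is illustrative):

```python
def wrap_pn532(command_code, params=b""):
    """Wrap a raw PN532 command in an ACR122U pseudo-APDU.

    0xD4 is the host->PN532 direction byte; Lc covers the direction byte,
    the command code, and the parameter bytes.
    """
    body = bytes([0xD4, command_code]) + bytes(params)
    return bytes([0xFF, 0x00, 0x00, 0x00, len(body)]) + body

# The InATR frame from the question (PN532 command code 0x50, target 0x01)
in_atr = wrap_pn532(0x50, [0x01, 0x00])
```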

Windows USB HID Report Length

I am developing a USB HID device using an STMicro microcontroller. I started with STMicro's HID example which works fine. I am using C++ on Windows 7 64-bit for the PC side. I have an application that works with my device. There is one thing I can't figure out, however.
The example firmware only allowed sending and receiving 2 bytes at a time, which is determined by HIDP_CAPS.OutputReportByteLength and InputReportByteLength. I would like to send more data than this at once, but I can't figure out how to increase the report lengths. I successfully changed the endpoint wMaxPacketSize, the VID and PID, and a few other things, but I can't figure out how Windows is calculating the in and out report lengths. There don't seem to be any fields in my report or device descriptors that indicate this length, but I can't imagine where else it might be coming from.
Can anyone tell me how Windows determines the HIDP_CAPS.OutputReportByteLength and HIDP_CAPS.InputReportByteLength?
How can I increase these lengths?
I figured it out. I thought I would post here in case anyone else needs to know. I'm not entirely sure I really understand it all, so if I made a mistake, someone please correct me.
I had to change the report descriptor in my firmware. I had several usages. Windows reads the report descriptor, figures out which usage requires the longest length, and uses that length. On one of my input reports I made the following changes (the input report is just an array of bytes in firmware):
0x27, 0xFF, 0xFF, 0xFF, 0xFF, // Logical Maximum is 4 bytes long, with a value of 0xFFFFFFFF
0x95, 0x01, // Report Count: 1 field
0x75, 0x20, // Report Size: 32 bits per field
I did something similar for the output report, but without the Report Count item (0x95).
Windows now tells me I can send and receive 5 bytes, which I believe is the report ID byte plus Report Count × Report Size: 1 + (1 × 32 bits) / 8 = 5 bytes.
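That arithmetic can be sketched in a few lines of Python (my own reconstruction of the rule described above, not Windows' actual implementation):

```python
def hid_report_byte_length(report_size_bits, report_count, has_report_id=True):
    """Approximate HIDP_CAPS.*ReportByteLength: the report's data bits
    rounded up to whole bytes, plus one byte for the report ID."""
    data_bytes = (report_size_bits * report_count + 7) // 8
    return data_bytes + (1 if has_report_id else 0)

# The descriptor above: Report Size 32 bits, Report Count 1 -> 5 bytes
length = hid_report_byte_length(report_size_bits=32, report_count=1)
```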

pcsc-sharp Mifare Authentication

I'm using the pcsc-sharp library to communicate with an ACR122U reader and read/write information on MIFARE Classic 1K cards.
After getting familiar with the library and the APDU concept, I'm able to use the card's UID as an identifier in my applications.
Now I need to write my own IDs to the card. To that end, I read some manuals regarding NXP's MIFARE (like MF1S70YYX_V1) and also got some information about ISO 7816-4.
I'm aware of the need to do authentication before accessing the cards memory to perform read/write operations and I know the standard Key value.
I downloaded the pcsc-sharp examples from GitHub and ran the Mifare1kTest example. It works, but card.LoadKey in line 36 fails. The response to the APDU command in LoadKey is SW1=99, SW2=0, which I cannot find in any documentation. Commenting out the "throw new Exception" section makes the example work.
My question now is: which values are the correct ones to pass to Card.LoadKey, and which values should the APDU command parameters use? What is meant by "key number" (a sector number, or a sector/block combination)? Is the LoadKey call even necessary, given that the example works without it?
Your question is broad, but the following should work for you. The code is explained with comments:
var loadKeySuccessful = card.LoadKey(
KeyStructure.VolatileMemory,
0x00, // first key slot
new byte[] {0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF} // key
);
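Underneath pcsc-sharp's LoadKey/Authenticate helpers are standard PC/SC reader pseudo-APDUs. A Python sketch of the byte layouts for the usual load-key / authenticate / read sequence (slot numbers and key-type values follow the common ACR122U conventions; verify them against your reader's manual):

```python
DEFAULT_KEY = [0xFF] * 6  # factory default MIFARE Classic key

def load_key_apdu(key, key_slot=0x00):
    # Load Keys: FF 82 00 <key slot> 06 <6-byte key>
    # P1=0x00 selects volatile reader memory for the key
    return [0xFF, 0x82, 0x00, key_slot, 0x06] + list(key)

def authenticate_apdu(block, key_type=0x60, key_slot=0x00):
    # General Authenticate: key_type 0x60 = Key A, 0x61 = Key B;
    # key_slot refers to the slot filled by Load Keys above
    return [0xFF, 0x86, 0x00, 0x00, 0x05, 0x01, 0x00, block, key_type, key_slot]

def read_binary_apdu(block, length=16):
    # Read Binary: FF B0 00 <block number> <expected length>
    return [0xFF, 0xB0, 0x00, block, length]
```

So the "key number" in LoadKey is a reader-side key slot, not a sector or block on the card; the sector comes into play only at authentication time, via the block number you authenticate against.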
