How to card emulate with ACR122U-A9 - nfc

I have an ACR122U-A9 and I want to use it for card emulation. Has anyone done this? Can you give me some suggestions?
I also have another problem: when I put a white smartcard on this NFC reader, the LED blinks only once, but when I put my phone on it, the LED keeps blinking until I take the phone away.

The ACR122U contains a PN532 NFC controller chip. The PN532 supports card emulation through its TgInitAsTarget command (see the PN532 user manual). In order to pass commands to the PN532, you connect to the ACR122U just as if it were a normal smartcard reader (e.g. using PC/SC). You then wrap PN532 commands into reader APDUs of the form
> FF000000 <Lc> <Command>
and get responses in the form
< <Response> 9000
So to get the ACR122U into card emulation mode, you would do roughly the following:
ReadRegister:
> FF000000 08 D406 6305 630D 6338
< D507 xx yy zz 9000
Update register values:
xx = xx | 0x04; // CIU_TxAuto |= InitialRFOn
yy = yy & 0xEF; // CIU_ManualRCV &= ~ParityDisable
zz = zz & 0xF7; // CIU_Status2 &= ~MFCrypto1On
WriteRegister:
> FF000000 11 D408 6302 80 6303 80 6305 xx 630D yy 6338 zz
< D509 9000
SetParameters:
> FF000000 03 D412 30
< D513 9000
TgInitAsTarget:
> FF000000 27 D48C 05 0400 123456 20 000000000000000000000000000000000000 00000000000000000000 00 00
< D58D xx ... 9000
Where xx should be equal to 0x08.
Communicate using a sequence of TgGetData and TgSetData commands:
> FF000000 02 D486
< D587 xx <C-APDU> 9000
Where xx is the status code (should be 0x00 for success) and C-APDU is the command sent from the reader.
> FF000000 yy D48E <R-APDU>
< D58F xx 9000
Where yy is 2 + the length of the R-APDU (response) and xx is the status code (should be 0x00 for success).
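For reference, here is a minimal sketch of wrapping a PN532 command in that FF 00 00 00 pseudo-APDU using pyscard. The helper name is made up, and on some driver/platform combinations the plain connect() may fail while no tag is in the field, so treat this as an illustration rather than a drop-in script.

from smartcard.System import readers
from smartcard.util import toHexString

def pn532(connection, command):
    # Wrap a raw PN532 command frame in the FF 00 00 00 Lc pseudo-APDU.
    apdu = [0xFF, 0x00, 0x00, 0x00, len(command)] + list(command)
    response, sw1, sw2 = connection.transmit(apdu)
    if (sw1, sw2) != (0x90, 0x00):
        raise IOError("pseudo-APDU failed: SW=%02X%02X" % (sw1, sw2))
    return response  # the PN532 response frame, e.g. D5 xx ...

reader = readers()[0]             # assume the first PC/SC reader is the ACR122U
conn = reader.createConnection()
conn.connect()

# SetParameters (D4 12 30), as in the sequence above; expect D5 13 back.
print(toHexString(pn532(conn, [0xD4, 0x12, 0x30])))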

Basically, the ACR122U is not made to emulate a card, and there is very little information from the manufacturer suggesting it can be used to emulate an NFC card. Even if it is possible, it would not be straightforward. I suggest you try Android Host Card Emulation instead (HCE, available since Android 4.4).
For part 2: I tried with my phone (Xperia Z); with NFC turned on and the phone placed over the card reader, nothing happened on either side. Maybe you are using the card emulation feature of the phone.


getting extra bytes 82 00 in pc/sc response

I am trying to read data from a Sony FeliCa card using a PC/SC transparent session and the transceive data object.
The response I get for a read-without-encryption command is
c0 03 00 90 00 92 01 00 96 02 00 00 97 82 00 + Data
But according to the protocol, the response should be
c0 03 00 90 00 92 01 00 96 02 00 00 97 + Data
I am unable to figure out the last 82 00 appended in the response from the card.
Now when I try to authenticate with the card I get
c0 03 01 6F 01 90 00
which is an error in PC/SC. I want to resolve these extra bytes 82 00, which I believe will solve the issue with all the commands that require authentication and encryption.
The response data is BER-TLV encoded (see PC/SC 2.02, Part 3).
In BER-TLV encoding there are several possibilities to encode tag 0x97 with two octets of data 0xD0D1, e.g.:
97|02|D0D1 -- short form
97|8102|D0D1 -- long form with one length octet
97|820002|D0D1 -- long form with two length octets
97|83000002|D0D1 -- long form with three length octets
...
Your reader is using two octets for sending the length of ICC Response data object (which is perfectly valid).
You should parse the response properly...Good luck!
PS: The above means that the Data part of your truncated responses still starts with one extra length octet (i.e. Len|Data).
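To illustrate, here is a minimal BER-TLV sketch (single-byte tags only, which covers the PC/SC Part 3 data objects shown above) that treats the short form and the long forms identically; the example response uses the hypothetical D0D1 data from above after tag 97:

def parse_tlv(data):
    objects = {}
    i = 0
    while i < len(data):
        tag = data[i]; i += 1
        first = data[i]; i += 1
        if first < 0x80:                       # short form
            length = first
        else:                                  # long form: low 7 bits = number of length octets
            n = first & 0x7F
            length = int.from_bytes(data[i:i + n], "big")
            i += n
        objects[tag] = data[i:i + length]
        i += length
    return objects

resp = bytes.fromhex("c0030090009201009602000097820002d0d1")
print({hex(t): v.hex() for t, v in parse_tlv(resp).items()})
# {'0xc0': '009000', '0x92': '00', '0x96': '0000', '0x97': 'd0d1'}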

Different Code 128 barcode symbols representing the same data

I'm currently using software called LineView. It generates downtime reason codes for our factory lines. An operator scans the barcodes with an RS232 scanner and it goes into our XL board system.
The software itself generates the barcodes within an internet browser, but I am trying to make it so our own labeling machine can also print out the barcodes. However, the barcodes that are produced by the labeler (and the many online barcode generators I've tried) look longer and do not work.
The data for the example 128 barcode that I am trying to replicate is [SOH]1[STX]65;1067[ETX].
According to the manual:
[SOH] - The Start of Header character (ASCII 0x01) starts the XL Command packet.
1 - The Serial Address of the XL device (the default is 1).
[STX] - The Start of Transmission character (ASCII 0x02) marks the start of the actual command.
65; - The ID of the Production State > Set Reason Code command.
1067 - The Reason Code ID (which can range from 1 to 999 for system reasons or 1000 to 1999 for user-defined reasons). In my case it is 1067.
[ETX] - The End of Transmission character (ASCII 0x03) ends the XL Command packet.
I have attached pictures of what LineView produces (which is what I want it to look like) and of what our labeller currently prints.
When I scan them they both come up with the [SOH]1[STX]65;1067[ETX] code despite them looking different.
Any help with this would be very much appreciated.
Your intended barcode is constructed internally using the following series of Code 128 codewords which correctly represent the ASCII control characters:
103 Start-in-Mode-A (Upper-case and control characters)
65 [SOH] (ASCII 1)
17 1
66 [STX] (ASCII 2)
22 6
21 5
27 ;
99 Switch-to-Mode-C (Double-density numeric)
10 10
67 67
101 Switch-to-Mode-A
67 [ETX] (ASCII 3)
67 Check-digit
106 Stop
Your label printer is printing a barcode representing the literal string [SOH]1[STX]65;1067[ETX] with no ASCII control characters (i.e. left-bracket, S, O, H, right-bracket, ...) using the following internal codewords:
104 Start-in-Mode-B (Mixed-case)
59 [
51 S
47 O
40 H
61 ]
17 1
59 [
51 S
52 T
56 X
61 ]
22 6
21 5
27 ;
99 Switch-to-Mode-C (Double-density numeric)
10 10
67 67
100 Switch-to-Mode-B
59 [
37 E
52 T
56 X
61 ]
57 Check-digit
106 Stop
So you need to work out how to correctly specify ASCII control characters in the input to your labelling machine.
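If you want to sanity-check the two symbols, here is a small sketch of the standard Code 128 modulo-103 check-digit calculation applied to the codeword lists above (it reproduces the 67 and 57 check digits):

def code128_check_digit(start_code, data_codewords):
    # The start code contributes its value once; each data codeword is
    # weighted by its 1-based position; the sum is reduced modulo 103.
    total = start_code + sum(value * pos for pos, value in enumerate(data_codewords, start=1))
    return total % 103

control_chars = [65, 17, 66, 22, 21, 27, 99, 10, 67, 101, 67]        # after Start-in-Mode-A (103)
literal_text  = [59, 51, 47, 40, 61, 17, 59, 51, 52, 56, 61,
                 22, 21, 27, 99, 10, 67, 100, 59, 37, 52, 56, 61]    # after Start-in-Mode-B (104)

print(code128_check_digit(103, control_chars))   # 67, matching the first list
print(code128_check_digit(104, literal_text))    # 57, matching the second list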

How to determine the major compiler version from .obj files compiled with /GL?

I'm trying to determine Visual Studio version (2002/2003, 2005, 2008, 2010, 2012, 2013, 2015) from the .obj file generated with the link time code generation option.
The file I have, generated with MSVC 2012, has the following COFF header contents:
File Header
+0 00 00 Machine - Unknown Machine
+2 FF FF NumberOfSections
+4 01 00 4C 01 TimeDateStamp
+8 70 94 F9 55 PointerToSymbolTable
+12 38 FE B3 0C NumberOfSymbols
+16 A5 D9 SizeOfOptionalHeader
+18 AB 4D Characteristics
Optional Header
+20 AC 9B Magic
+22 D6 B6 Linker Version Major/Minor
It seems that the initial 4 bytes being 00,00,FF,FF mark it as a LTCG object, and what follows is proprietary. None of the usual file header members make "sense" (maybe the timestamp is OK, I didn't check).
Does anyone know offhand if any part of this header is compiler-specific? All I need to determine is the MSVC major version used to compile the object...
It appears that there is a version, coded as <MAJOR:16:LE> 0x80 <MINOR:16:LE>, stored shortly after the header. E.g.:
17.00.61030 -> 0x11.0xEE66 -> 11 00 80 66 EE
19.00.23026 -> 0x13.0x59F2 -> 13 00 80 F2 59
What's needed is to figure out how to get to it reliably by offsets from preceding data.
This is a related question, with no resolution...
TL;DR:
You can't get the compiler version from this file format, I guess...
Complete answer:
It looks like some variation of the "anonymous file format", described in winnt.h by the various ANON_OBJECT_HEADER_XXX structures (replace XXX with V2 or BIGOBJ).
Here is a copy of ANON_OBJECT_HEADER_BIGOBJ from winnt.h:
typedef struct ANON_OBJECT_HEADER_BIGOBJ {
    /* same as ANON_OBJECT_HEADER_V2 */
    WORD    Sig1;             // Must be IMAGE_FILE_MACHINE_UNKNOWN
    WORD    Sig2;             // Must be 0xffff
    WORD    Version;          // >= 2 (implies the Flags field is present)
    WORD    Machine;          // Actual machine - IMAGE_FILE_MACHINE_xxx
    DWORD   TimeDateStamp;
    CLSID   ClassID;          // CLSID is a 16-byte struct (not an original comment)
    DWORD   SizeOfData;       // Size of data that follows the header
    DWORD   Flags;            // 0x1 -> contains metadata
    DWORD   MetaDataSize;     // Size of CLR metadata
    DWORD   MetaDataOffset;   // Offset of CLR metadata
    /* bigobj specifics */
    DWORD   NumberOfSections; // extended from WORD
    DWORD   PointerToSymbolTable;
    DWORD   NumberOfSymbols;
} ANON_OBJECT_HEADER_BIGOBJ;
The description matches:
Sig1: 00 00
Sig2: FF FF
Version: >= 2
Machine: 0x14C
The other header structures (i.e. ANON_OBJECT_HEADER and ANON_OBJECT_HEADER_V2) are basically the same, but with fewer fields.
For the Version field, I found some information here:
http://www.geoffchappell.com/studies/msvc/link/dump/infiles/obj.htm
It looks like the Version field is 1 for anonymous files, and anonymous files and the so-called "import files" share the same characteristics, except that Version = 0 for the import file format (admittedly, I do not really know what that is).
But by just looking at the header, it seems we have no information about which compiler version was used. Moreover, .obj files generated with the /GL switch do not exactly follow this format, and I didn't find much information about them. I'd be glad if someone proved me wrong.
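For what it's worth, here is a rough sketch of how one might poke at such a file: the header unpacking follows the winnt.h struct above, while the version scan is purely the unconfirmed <MAJOR:16:LE> 0x80 <MINOR:16:LE> heuristic from the question, not a documented format.

import struct
import sys

def inspect_anon_obj(path):
    with open(path, "rb") as f:
        blob = f.read()

    # Sig1, Sig2, Version, Machine, TimeDateStamp (start of the anonymous header)
    sig1, sig2, version, machine, timestamp = struct.unpack_from("<HHHHI", blob, 0)
    if (sig1, sig2) != (0x0000, 0xFFFF):
        raise ValueError("not an anonymous-format object file")
    print("header version: %d, machine: 0x%X, timestamp: 0x%08X" % (version, machine, timestamp))

    # Heuristic: look for MM MM 80 mm mm shortly after the header, where MMMM/mmmm
    # are the little-endian toolset major/minor numbers (e.g. 11 00 80 66 EE for 17.00.61030).
    for offset in range(0x20, min(len(blob) - 5, 0x200)):
        major, marker, minor = struct.unpack_from("<HBH", blob, offset)
        if marker == 0x80 and 6 <= major <= 50:
            print("candidate toolset version %d.%d at offset 0x%X" % (major, minor, offset))

if __name__ == "__main__":
    inspect_anon_obj(sys.argv[1])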

snmptrap unsigned type not working as expected

I am using an SNMPv3 adapter and passing v2 traps to it using commands like the ones below. It looks like the range for type u (i.e. unsigned) is up to 2^31 - 1 (i.e. 2147483647). I was expecting it to be 2^32 - 1 (i.e. 4294967295).
snmptrap -c public -v 2c clm-pun-009642 '' 1.3.6.1.4.1.20006.1.0.5 1.3.6.1.4.1.12345.1 u 2147483647
Above command generates following log:
trace: ..\..\snmplib\snmp_api.c, 5293:
dumph_recv: Value
dumpx_recv: 42 04 7F FF FF FF
dumpv_recv: UInteger: 2147483647 (0x7FFFFFFF)
Whereas for:
snmptrap -c public -v 2c clm-pun-009642 '' 1.3.6.1.4.1.20006.1.0.5 1.3.6.1.4.1.12345.1 u 2147483648
Above command generates following log:
trace: ..\..\snmplib\snmp_api.c, 5293:
dumph_recv: Value
dumpx_recv: 42 05 00 80 00 00 00
dumpv_recv: UInteger: -2147483648 (0x80000000)
Refer to:
http://www.net-snmp.org/docs/man/snmptrap.html
I am using net-snmp v5.5.
Is this the correct behavior or am I missing something?
I have discovered various problems with net-snmp over the years. This is apparently one more. The standards are quite clear. RFC 2578 defines Unsigned32 as follows:
-- an unsigned 32-bit quantity
-- indistinguishable from Gauge32
Unsigned32 ::=
[APPLICATION 2]
IMPLICIT INTEGER (0..4294967295)
As noted, this is identical to Gauge32, which is identical to Gauge in SNMPv1 (RFC 1155):
Gauge ::=
[APPLICATION 2]
IMPLICIT INTEGER (0..4294967295)
The encoding is correct; all integers within SNMP are encoded as signed, meaning a value above 2^31-1 must be encoded in 5 bytes. Thus the proper translation of the encoding is:
42 Type: Gauge32 or Unsigned32
05 Length: 5 bytes
00 80 00 00 00 Value: 2^31
net-snmp is incorrectly decoding the value.
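A tiny sketch of the correct (unsigned) interpretation of the content octets from the traces above:

def decode_unsigned_ber(tlv):
    tag, length = tlv[0], tlv[1]
    content = tlv[2:2 + length]
    if tag != 0x42:                       # [APPLICATION 2]: Gauge32 / Unsigned32
        raise ValueError("not a Gauge32/Unsigned32 value")
    # The content octets of an SNMP application type carry an unsigned quantity,
    # so a leading 0x00 octet simply keeps the high bit from looking like a sign.
    return int.from_bytes(content, "big", signed=False)

print(decode_unsigned_ber(bytes.fromhex("42047fffffff")))    # 2147483647
print(decode_unsigned_ber(bytes.fromhex("42050080000000")))  # 2147483648, i.e. 2^31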

VB6 RS232 commands not working

I have the following code:
MSCommProj.CommPort = 6
MSCommProj.RThreshold = 1
MSCommProj.Settings = "19200,N,8,1"
MSCommProj.InputLen = 0
MSCommProj.PortOpen = True
And it opens just fine and connects, but when I try sending the command:
MSCommProj.Output = "21 8901 5057 31 0A" & Chr$(13)
and
MSCommProj.Output = "21 89 01 50 57 31 0A" & Chr$(13)
and
MSCommProj.Output = "3F 89 01 50 57 0A" & Chr$(13)
as instructed by the user manual, the device does not come on.
Here are the pages in the manual that show this. Maybe I am just doing it wrong?
Are you sure that you're meant to be sending character data to the RS232 interface for that? Those look like binary sequences to me.
Rather than:
MSCommProj.Output = "3F 89 01 50 57 0A" & Chr$(13)
I'd be looking at transmitting the binary data thus:
MSCommProj.Output = chr$(63) & chr$(137) & chr$(1) & chr$(80) & chr$(87) & chr$(10)
You'll note that there's no chr$(13) at the end, the spec doesn't call for that.
If you want to know what the conversions are for those hex values, start up the Windows calculator, change the view to scientific, switch to hex mode, enter the value, then switch to decimal mode.
Or you can download an ASCII table for this purpose. Or view one of my voluminous essays on the subject here.
You are required to send the bytes given.
You are instead sending a string representation of them.
Send the actual bytes:
chr$(&h21) & chr$(&h89) & chr$(&h01) & chr$(&h50) etc.
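For comparison, here is the same idea (send the raw bytes, not their hex spelling) sketched in Python with pyserial; this is only an illustration, since the question itself uses VB6/MSComm, and the port name is assumed.

import serial

frame = bytes.fromhex("3F 89 01 50 57 0A")   # 0x3F 0x89 0x01 0x50 0x57 0x0A

# Same settings as the MSComm code above: 19200 baud, no parity, 8 data bits, 1 stop bit.
with serial.Serial("COM6", 19200, bytesize=8, parity="N", stopbits=1, timeout=1) as port:
    port.write(frame)                        # six bytes on the wire, no trailing CR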
It was because I did not use a crossover cable... All the RS232 codes were correct. Blah.
