Indy 10 TIdTCPServer encoding characters - delphi-xe2

I am using Indy 10 sockets with the TIdTCPServer component. A client program connects to a server program using sockets. Everything works and there is no trouble communicating, but I cannot send a string with special characters from the server to the client. When sending a string like this:
AContext.Connection.IOHandler.WriteLn('Usuário não existe');
the client receives the string:
usu?rio n?o existe
Can anyone who has worked with this component tell me how to set the encoding correctly so I can send this string with special characters to the client?

By default, Indy uses 7-bit ASCII as its character encoding (for compatibility with various Internet protocols). To use a different character encoding, you need to either:
set the IOHandler.DefStringEncoding property before doing any reading/writing. The string-based I/O methods will then use this encoding by default.
// note: use TIdTextEncoding.UTF8 if not using Indy 10.6 or later...
AContext.Connection.IOHandler.DefStringEncoding := IndyTextEncoding_UTF8;
or use the optional AByteEncoding parameter of the various string-based I/O methods, including WriteLn().
// note: use TIdTextEncoding.UTF8 if not using Indy 10.6 or later...
AContext.Connection.IOHandler.WriteLn('Usuário não existe', IndyTextEncoding_UTF8);
Needless to say, the client will also have to use an equivalent encoding when reading the server's data so it can decode the transmitted bytes back to Unicode. For example:
// note: use TIdTextEncoding.UTF8 if not using Indy 10.6 or later...
Client.IOHandler.DefStringEncoding := IndyTextEncoding_UTF8;
Or:
// note: use TIdTextEncoding.UTF8 if not using Indy 10.6 or later...
S := Client.IOHandler.ReadLn(IndyTextEncoding_UTF8);

Related

SNMPv3 protocol and packets composition

A friend and I are currently working on a library to create and read SNMPv3 packets.
The idea is "only" to create the content of the packet; it will be sent independently.
I know that many libraries exist for this, but not in the language that we need. Our major problem right now is specifying the content of the different packets. Which parts are mandatory? Which parts appear in which type of request?
The examples available on Wireshark's website and the RFCs give us the beginnings of an idea, but since this is a protocol, we need to be very clear and sure about what is required in each type of request (get-request, set-request, get-bulk, trap, etc.).
Is there a way to know exactly how each type of packet is created, or are the RFCs the only sources of information?
First, I want to offer some clarification about the terminology. A UDP packet encodes an SNMP "message". The format of the message varies with the SNMP version, but in all cases, it contains a single PDU. I think when you say "packet", you really mean "PDU".
As for your question, there's no better source than the RFCs, and they are actually easier to read than you think, as long as you know which parts to read (that's the tricky part).
RFC 3416 specifies everything to do with PDUs, including the format (p. 8), a comprehensive list of PDU types (pp. 7-8), and an explanation of how each PDU is used (under section 4.2, starting on p. 10).
The format of all PDUs is the same (though the BulkPDU replaces error-status and error-index with two integer fields of different meanings):
PDU ::= SEQUENCE {
        request-id INTEGER (-214783648..214783647),

        error-status                -- sometimes ignored
            INTEGER {
                noError(0),
                tooBig(1),
                noSuchName(2),      -- for proxy compatibility
                badValue(3),        -- for proxy compatibility
                readOnly(4),        -- for proxy compatibility
                genErr(5),
                noAccess(6),
                wrongType(7),
                wrongLength(8),
                wrongEncoding(9),
                wrongValue(10),
                noCreation(11),
                inconsistentValue(12),
                resourceUnavailable(13),
                commitFailed(14),
                undoFailed(15),
                authorizationError(16),
                notWritable(17),
                inconsistentName(18)
            },

        error-index                 -- sometimes ignored
            INTEGER (0..max-bindings),

        variable-bindings           -- values are sometimes ignored
            VarBindList
    }
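To make the PDU layout concrete, here is a minimal sketch in C that hand-assembles the BER encoding of a get-request for sysDescr.0 (1.3.6.1.2.1.1.1.0). SNMPv1 community-based framing is assumed here for brevity; SNMPv3 wraps the very same PDU structure in additional message and security headers:

#include <stdio.h>
#include <stddef.h>

/* A hand-assembled SNMPv1 get-request message (BER/TLV encoding).
   The PDU fields follow the SEQUENCE shown above: request-id,
   error-status, error-index, variable-bindings. */
static const unsigned char get_request[] = {
    0x30, 0x29,                              /* SEQUENCE, 41 bytes (whole message)   */
    0x02, 0x01, 0x00,                        /*   version: INTEGER 0 (SNMPv1)        */
    0x04, 0x06, 'p','u','b','l','i','c',     /*   community: OCTET STRING "public"   */
    0xA0, 0x1C,                              /*   GetRequest-PDU, 28 bytes           */
    0x02, 0x04, 0x1A, 0x2B, 0x3C, 0x4D,      /*     request-id: an arbitrary INTEGER */
    0x02, 0x01, 0x00,                        /*     error-status: noError(0)         */
    0x02, 0x01, 0x00,                        /*     error-index: 0                   */
    0x30, 0x0E,                              /*     variable-bindings (VarBindList)  */
    0x30, 0x0C,                              /*       one VarBind                    */
    0x06, 0x08, 0x2B, 0x06, 0x01, 0x02,      /*         OID 1.3.6.1.2.1.1.1.0        */
                0x01, 0x01, 0x01, 0x00,
    0x05, 0x00                               /*         value: NULL (unset in a get) */
};

int main(void) {
    /* Dump the message bytes; in a real client this buffer would be the
       payload of a single UDP datagram sent to port 161. */
    for (size_t i = 0; i < sizeof(get_request); i++)
        printf("%02X ", get_request[i]);
    printf("\n");
    return 0;
}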

Sending a UNICODE string to A16 COMS Mainframe via TCP/IP

I need to send a Unicode string message to A16 COMS (a mainframe) via TCP/IP. What algorithm, or what transformation of the string, do I need? The string can contain one or more Unicode characters.
When sending an ASCII-only string, I convert (map) it to EBCDIC and send it over the TCP/IP connection. I know that EBCDIC doesn't handle Unicode characters. Besides, I can only send a byte array over TCP/IP, where in the case of an ASCII string one character maps to one array cell; a Unicode character can occupy from 1 to 4 array cells.
The question is how to send the Unicode-containing string to the A16 mainframe.
Further clarification:
When I run the code, the TCP client does not receive any response; it hits the timeout and gives an error, and increasing the timeout does not help. C# can convert a Unicode string to UTF-8, either using System.Text.Encoding or almost manually with an algorithm; those are not the problem. The problem is that A16 COMS expects "one character = one byte" (mapped to EBCDIC), while with UTF-8 one character may occupy 2, 3 or 4 cells of the array. EBCDIC mapping by itself does not help, because EBCDIC is designed to work with non-Unicode (ASCII-based) strings.
I hope that someone who has done this at some point in their career might read my post, because not much can be achieved by guesswork alone. Can it be done with TcpClient and its NetworkStream? The Send method only takes an array of bytes in its signature, but with UTF-8 the array of bytes can be much longer than the limit.
This is a question asking for shared experience rather than book knowledge.
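To make the "one character = one byte" constraint concrete, here is a minimal sketch in C of the kind of ASCII-to-EBCDIC mapping described above (EBCDIC code page 037 values; only letters, digits and space are covered, and everything else degrades to the EBCDIC question mark 0x6F, which is exactly why multi-byte Unicode characters cannot survive this transformation):

#include <stdio.h>

/* Map a single ASCII character to EBCDIC (code page 037).
   Only letters, digits and space are covered here; anything else,
   including any character that needed more than one UTF-8 byte,
   degrades to the EBCDIC question mark. */
static unsigned char ascii_to_ebcdic(char c) {
    if (c >= 'A' && c <= 'I') return 0xC1 + (c - 'A');
    if (c >= 'J' && c <= 'R') return 0xD1 + (c - 'J');
    if (c >= 'S' && c <= 'Z') return 0xE2 + (c - 'S');
    if (c >= 'a' && c <= 'i') return 0x81 + (c - 'a');
    if (c >= 'j' && c <= 'r') return 0x91 + (c - 'j');
    if (c >= 's' && c <= 'z') return 0xA2 + (c - 's');
    if (c >= '0' && c <= '9') return 0xF0 + (c - '0');
    if (c == ' ')             return 0x40;
    return 0x6F;              /* EBCDIC '?' for everything else */
}

int main(void) {
    const char *msg = "HELLO A16";
    unsigned char out[32];
    for (int i = 0; msg[i] != '\0'; i++) {
        out[i] = ascii_to_ebcdic(msg[i]);
        printf("%c -> 0x%02X\n", msg[i], out[i]);
    }
    /* out[] now holds one EBCDIC byte per character and could be
       written to the socket as-is. */
    return 0;
}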

Transmit commands in ASCII format STM32

I'm using an STM32 Discovery board to communicate with a device that takes ASCII commands.
I use HAL_UART_Transmit_IT to send data, and that works fine.
I want to send the ASCII command XM3 to the device. When I use a virtual port program such as RealTerm, I just select ASCII, set the baud rate, data bits, etc., and when I type in XM3 and click on +CR it sends the command and it works fine; if the CR is not included it doesn't work.
When I try to do that from my MCU with the code below, it does not work. Any ideas how to send ASCII commands in C via the serial port?
char txD[3]="XM3";
__HAL_UART_ENABLE_IT(&huart1, UART_IT_TC);
HAL_UART_Transmit_IT(&huart1, (uint8_t *)txD, 3);
When I send this to RealTerm it shows XM3, but when I send it to the device nothing happens.
I need to know how to send XM3 followed by a CR to the device.
If you send the command via RealTerm with the +CR option checked, RealTerm does append a carriage return, i.e. ASCII code 13.
In order to reproduce this behavior in your code, you should define the command as follows:
char txD[4]="XM3\r";
Correspondingly, if the receiver also expects a newline, i.e. ASCII code 10, you should define it as follows:
char txD[5]="XM3\r\n";
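Putting this together, a sketch of the complete transmit call (assuming the same huart1 handle as in the question) could look like this; the array deliberately has no room for a NUL terminator, which is fine because the length is passed explicitly:

char txD[4] = "XM3\r";   /* 'X','M','3',CR - no NUL terminator stored */
HAL_UART_Transmit_IT(&huart1, (uint8_t *)txD, sizeof(txD));   /* sends all 4 bytes */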

Windows DHCP client hostname encoding

Recently I have been trying to save a list of hostnames from captured DHCP packets. I have found out that every DHCP hostname (option 12) should have the form defined in RFC 1035. So if I understand it correctly, the hostname should be encoded in 7-bit ASCII, with further restrictions such as:
- the name should not start with a digit and should omit certain forbidden characters.
Almost every device I have encountered in the packets fulfills these constraints, except Windows devices (vendor ID MSFT 5.0). Apparently the Windows DHCP client takes the computer (or mobile) name and fills it into the hostname option.
The problem occurs when the computer name is set, for example, to "Lukáš-PC". Wireshark displays this hostname as Luk\240\347-PC (240 and 347 are octal numbers). To see for myself, I printed the values in the packets with printf("%hhu", c) (in C):
á = 160
š = 231
I think this is a simple char variable overflow. I tried to deduce the original values from the overflowed values, but I haven't found any relation between the characters and known encodings. So my questions are:
Is there any way to convert these values back to the original characters?
If yes, what was the original character encoding when the overflow happened?
Thanks.
A plain char is usually signed, and it is promoted to int when passed to a variadic function. To ensure that it is printed as unsigned, use printf("%hhu", c) or printf("%d", (unsigned char)c);.
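A minimal demonstration of this promotion, using the byte value from the question:

#include <stdio.h>

int main(void) {
    char c = (char)0xA0;               /* 160, the byte observed for 'á'         */
    printf("%d\n", c);                 /* may print -96 where char is signed     */
    printf("%hhu\n", c);               /* prints 160: converted to unsigned char */
    printf("%d\n", (unsigned char)c);  /* prints 160 as well                     */
    return 0;
}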
The correct encoding is impossible to know because it depends on each system's settings.
Note that any compliant system MUST encode names according to RFC 3490, but Windows seems to enjoy violating standards.
The characters á and š that you are seeing are encoded using code page 852 (Latin-2, for Central European languages).
Unfortunately, there is no simple way to figure out the encoding used just by looking at the DHCP requests. In principle, the DHCP client can use any code page it wants. If you are working in a private/controlled network, then it is probably safe to assume that all clients are using the same code page, and to explicitly decode the strings using that particular code page.
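As an illustration, here is a tiny sketch in C that decodes the two bytes from the question using their code page 852 meanings; only these two entries are mapped here (a real converter would carry the full high-half table for the code page):

#include <stdio.h>

/* Map a code page 852 byte to a UTF-8 string, for the two bytes
   seen in the question; plain ASCII bytes pass through unchanged. */
static const char *cp852_to_utf8(unsigned char b) {
    switch (b) {
        case 0xA0: return "\xC3\xA1";   /* 'á' (CP852 160) */
        case 0xE7: return "\xC5\xA1";   /* 'š' (CP852 231) */
        default:   return NULL;         /* extend the table as needed */
    }
}

int main(void) {
    const unsigned char hostname[] = { 'L', 'u', 'k', 0xA0, 0xE7, '-', 'P', 'C', 0 };
    for (int i = 0; hostname[i] != 0; i++) {
        const char *mapped = cp852_to_utf8(hostname[i]);
        if (mapped)
            fputs(mapped, stdout);
        else
            putchar(hostname[i]);       /* ASCII bytes pass through */
    }
    putchar('\n');                      /* prints: Lukáš-PC */
    return 0;
}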

Formatting Modbus requests in ruby

I have a Modbus device I am trying to communicate with through an Ethernet-to-RS485 adapter. I'm not sure whether the device uses Modbus ASCII or RTU.
I am trying to format a request to a device with address 1. The command code is 11h. I'm not sure I'm formatting the request properly.
Here is the string I am using for ASCII - ":010B000000000C\x0D\x0A"
Here is the hex I'm using for RTU: "\x01\x0B\x00\x00\x00\x00\x0B\xA4"
When I send this command it is echoed back, but I'm not getting any responses. I've been through the Modbus documentation and I think I have the correct byte structure. I'm wondering if I'm encoding it correctly for Ruby?
It turned out my Ethernet-to-RS485 device wasn't capable of the correct timing for Modbus. Once I purchased a new unit, the ASCII strings worked.
Are you sure the checksum should be written in pure bytes, not in ASCII? I mean, try to send :010B000000000C0D0A instead of :010B000000000C\x0D\x0A.
Also, you wrote that the command is 11h - to my understanding that is 0x11 (hex), but you are sending 0x0B. Or is the command 11 (dec)?
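For the RTU frame, it may also help to recompute the checksum with the standard Modbus CRC-16 (reflected polynomial 0xA001, initial value 0xFFFF). A sketch in C follows; note that Modbus RTU transmits the low CRC byte first, and if I've run the numbers right, the CRC of 01 0B 00 00 00 00 comes out to 0x0BA4, which would be sent as \xA4\x0B - so the two trailer bytes in the question look swapped:

#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

/* Standard Modbus RTU CRC-16: reflected polynomial 0xA001, init 0xFFFF. */
static uint16_t modbus_crc16(const uint8_t *buf, size_t len) {
    uint16_t crc = 0xFFFF;
    for (size_t i = 0; i < len; i++) {
        crc ^= buf[i];
        for (int bit = 0; bit < 8; bit++) {
            if (crc & 1)
                crc = (crc >> 1) ^ 0xA001;
            else
                crc >>= 1;
        }
    }
    return crc;
}

int main(void) {
    /* The RTU frame from the question, without its CRC trailer. */
    const uint8_t frame[] = { 0x01, 0x0B, 0x00, 0x00, 0x00, 0x00 };
    uint16_t crc = modbus_crc16(frame, sizeof(frame));
    /* Modbus RTU appends the low byte first, then the high byte. */
    printf("CRC = 0x%04X -> send 0x%02X 0x%02X\n",
           crc, crc & 0xFF, crc >> 8);
    return 0;
}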
