converting SNMP octet string to float (readable string) - snmp

I'm running a number of SNMP queries to a Hytera dmr repeater. However, the SNMP object definition looks like this:
rptVswr OBJECT-TYPE
    SYNTAX      OCTET STRING(SIZE(4))
    MAX-ACCESS  read-only
    STATUS      mandatory
    DESCRIPTION
        "The VSWR.
         It should be changed to float format."
    -- 1.3.6.1.4.1.40297.1.2.1.2.4
    ::= { rptDataInfo 4 }
After running the query, I got a result like this:
Name/OID: rptVswr.0;
Value (OctetString): 0x76 D5 8B 3F
Does anyone have an idea how to convert that string into a readable format?
It should be something like this: 1.15 or 2.15
Many thanks for your help,
BR - Nils

Here is a pretty simple C++ app that decodes the hex data and converts it to a float. The sample value only makes sense if the four octets are the raw bytes of an IEEE-754 single-precision float, least significant byte first, so on a little-endian machine (e.g. x86) you can copy them into a float as-is; memcpy is used instead of a pointer cast to avoid strict-aliasing problems:
#include <cstring>
#include <iostream>
using namespace std;

int main()
{
    unsigned char ptr[] = {0x76, 0xD5, 0x8B, 0x3F};
    float f;
    memcpy(&f, ptr, sizeof f);  // reinterpret the 4 bytes as an IEEE-754 float
    cout << f << endl;
    return 0;
}
The result is 1.09246, which is in the range you would expect for a VSWR reading.
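If the code may also run on a big-endian host, or if some agent delivers the octets in the opposite order, a slightly more defensive variant is to assemble the 32-bit pattern explicitly before converting it. This is only a sketch, assuming (as the sample value suggests) that the octet string carries an IEEE-754 single stored least-significant byte first:
#include <cstdint>
#include <cstring>
#include <iostream>

// Decode 4 octets (least significant byte first) into a float,
// independent of the host's own byte order.
float decodeVswr(const unsigned char* octets)
{
    std::uint32_t bits = static_cast<std::uint32_t>(octets[0])
                       | static_cast<std::uint32_t>(octets[1]) << 8
                       | static_cast<std::uint32_t>(octets[2]) << 16
                       | static_cast<std::uint32_t>(octets[3]) << 24;
    float value;
    std::memcpy(&value, &bits, sizeof value);  // same trick as above, no aliasing issues
    return value;
}

int main()
{
    const unsigned char reply[] = {0x76, 0xD5, 0x8B, 0x3F};
    std::cout << decodeVswr(reply) << '\n';  // prints roughly 1.09246
    return 0;
}
If a device ever delivers the octets in network (big-endian) order instead, you would simply index them in reverse when assembling the bit pattern.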

My experience with RF devices is that SNMP replies are either in decimal or hex format and represent power in mW. If you take your GET response 0x76 D5 8B 3F and convert the hex to decimal, you get 1,993,706,303 mW, which is roughly 1.99 MW. As a VSWR-related reading that would only be plausible if your forward power were in the multi-megawatt range, so a plain integer interpretation of the octets is clearly not what the repeater intends.

Related

Non-ASCII character cast to int

I would like to cast a non-ASCII character (for example 'ą') to int to get its number in UTF-8. When I do something like this:
#include <iostream>
using namespace std;

int main()
{
    cout << static_cast<int>('ą') << endl;
    return 0;
}
I get -71, which is not its proper number in UTF-8. I heard that it might be because 'ą' is stored in 2 bytes and one of them is cut off when the variable is initialised. Any solution for this?
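A quick way to see what is actually stored (a sketch, assuming the source file is saved as UTF-8) is to put the character in a string literal and print each byte as an unsigned value. In UTF-8, 'ą' is the two bytes 0xC4 0x85 (196 133), while the -71 from the question is the single byte 0xB9, i.e. 'ą' in a one-byte code page such as Windows-1250, so the source/execution charset is the first thing to check:
#include <iostream>

int main()
{
    // Assumes the source file (and execution charset) is UTF-8,
    // so "ą" holds the two bytes 0xC4 0x85 plus the terminating 0.
    const char* s = "ą";
    for (int i = 0; s[i] != 0; ++i)
        std::cout << static_cast<int>(static_cast<unsigned char>(s[i])) << ' ';
    std::cout << std::endl;  // prints: 196 133
    return 0;
}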

How can I safely convert an ASCII integer back to its associated ASCII character using curses in C++?

I have not been able to find a reliable solution for my problem. What I'm simply trying to do is create a function which:
takes a row and column position in the terminal,
calls mvwinch(window_object, rows, cols), which returns an unsigned int corresponding to the character in the terminal at that position, and
returns the ASCII character associated with that unsigned int, effectively casting it back to a char.
Here is an example of my code in C++11:
char Kmenu::getChrfromW(size_t const y, size_t const x,
                        bool const save_cursor) const {
  size_t curr_y, curr_x;
  getyx(_win, curr_y, curr_x);
  char ich = mvwinch(_win, y, x);
  char ch = ich;
  if (save_cursor)
    wmove(_win, curr_y, curr_x);
  return ch;
}
If, for example, the character in the terminal at position 2,3 is the letter 'a', I want this function to return the letter 'a'.
I tried the solution described here:
Convert ASCII number to ASCII Character in C
which effectively casts an integer to a char.
Unfortunately, what I get back is still the integer: testing with a screen filled with 'w's, I get back the integer 119.
The man page for the curses function mvwinch() describes the function as returning chtype, which the compiler recognises as unsigned int.
Is there a built-in curses function which gives the char back directly without casting to unsigned int, or some other way I can achieve this?
Edit: ch to ich, as in the actual code
A chtype contains a character along with other data. The curses.h header has several symbols which are useful for extracting those bits. If you mask it with A_CHARTEXT and cast that to a char, you will get a character:
char c = (char)((A_CHARTEXT) & n);
Your example should not compile, since it declares ch twice. You may have meant this:
char Kmenu::getChrfromW(size_t const y, size_t const x,
                        bool const save_cursor) const {
  int curr_y, curr_x; // size_t is inappropriate...
  getyx(_win, curr_y, curr_x);
  char ch = (char)((A_CHARTEXT) & mvwinch(_win, y, x));
  // char ch = ich;
  if (save_cursor)
    wmove(_win, curr_y, curr_x);
  return ch;
}
The manual page for mvwinch mentions the A_CHARTEXT mask in the Attributes section, assuming the reader is familiar with things like that:
The following bit-masks may be AND-ed with characters returned by
winch.
A_CHARTEXT Bit-mask to extract character
A_ATTRIBUTES Bit-mask to extract attributes
A_COLOR Bit-mask to extract color-pair field information
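For illustration, here is a minimal, self-contained sketch (not part of the original answer) that writes a character with an attribute and then uses those masks to split the chtype read back by mvinch into its character and attribute parts; build it with -lncurses:
#include <ncurses.h>

int main()
{
    initscr();
    mvaddch(2, 3, 'a' | A_BOLD);                     // put a bold 'a' at row 2, column 3
    chtype raw = mvinch(2, 3);                       // read the cell back as a chtype
    char ch = static_cast<char>(raw & A_CHARTEXT);   // the character: 'a'
    bool bold = (raw & A_ATTRIBUTES & A_BOLD) != 0;  // the attribute: true
    mvprintw(4, 0, "char=%c bold=%d", ch, bold ? 1 : 0);
    refresh();
    getch();
    endwin();
    return 0;
}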

Error when trying to use vehicle tracking in Veins via TraCI Interface to SUMO

I am using Veins3.0 with SUMO-0.21.0 and omnetpp4.4.
I tried to use the vehicle tracking command in TraCI/SUMO. It is described here: http://sumo.dlr.de/wiki/TraCI/Change_GUI_State.
There you can read that for this command the Variable is 0xa6, the View ID is "View #0", the Type of the value is string, and the New Value is <vehicle id>.
So I wrote a new function in TraCICommandInterface.cc to track a vehicle.
void TraCICommandInterface::setVehicleTracking(std::string nodeId) {
    uint8_t variableId = VAR_TRACK_VEHICLE;
    uint8_t variableType = TYPE_STRING;
    TraCIBuffer buf = connection.query(CMD_SET_GUI_VARIABLE,
            TraCIBuffer() << variableId << "View #0" << variableType << nodeId);
    ASSERT(buf.eof());
}
I used some constants from TraCIConstants.h
// track vehicle
#define VAR_TRACK_VEHICLE 0xa6
// command: set GUI variable
#define CMD_SET_GUI_VARIABLE 0xcc
// 8 bit ASCII string
#define TYPE_STRING 0x0C
The function is called from TraCIMobility.h, which fills the node id via getExternalId().
void commandTrackVehicle() {
    getCommandInterface()->setVehicleTracking(getExternalId());
}
The error occurs when I call commandTrackVehicle() from the vehicle module with mobility->commandTrackVehicle();.
The error text in SUMO is:
error: tcpip::Storage::readIsSafe: want to read 1717063210 byte from Storage, but only 12 remaining
Does anybody have an idea how to solve this problem or how to get more information about the error? Thanks.
Your code should work if you change TraCIBuffer() << "View #0" to TraCIBuffer() << std::string("View #0").
The reason is a bit complex:
SUMO's TraCI API defines its data type string as
32 bit string length, followed by text coded as 8 bit ASCII
Veins 3 has an overload for sending std::string as a TraCI-compatible string, but it does not have one for a pointer-to-byte(s) data type (char*). That is, if you insert a char* (which is what "View #0" is to the compiler) into a Veins 3 TraCIBuffer, it won't know to apply this special formatting; SUMO then gets confused because it tries to interpret the first bytes as a 32 bit length, tries to read as many bytes as that "length" indicates, and fails.
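Applied to the function from the question, the change is just the explicit std::string wrapper (a sketch of the corrected code, otherwise unchanged):
void TraCICommandInterface::setVehicleTracking(std::string nodeId) {
    uint8_t variableId = VAR_TRACK_VEHICLE;
    uint8_t variableType = TYPE_STRING;
    // std::string makes TraCIBuffer serialize the view id as a TraCI string
    // (32 bit length + 8 bit ASCII text) instead of raw char* bytes.
    TraCIBuffer buf = connection.query(CMD_SET_GUI_VARIABLE,
            TraCIBuffer() << variableId << std::string("View #0") << variableType << nodeId);
    ASSERT(buf.eof());
}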

How to interface LM75A temperature IC with AVR

I am trying to get the temperature data from an LM75A which is connected to an ATmega8 microcontroller using I2C, and display the data in Docklight using serial communication. I have written the code and the output I am getting is
FF 7F 0F
According to the datasheet, if I ignore FF then 7F 0F should correspond to a temperature of +125 °C. But I don't know whether that is right or wrong (or why I should ignore FF), so I am confused about how to decode the output I am getting. I think the code is correct, but if it is wrong please correct it.
CODE:
#ifndef F_CPU
#define F_CPU 8000000UL
#endif

#include <avr/io.h>
#include <util/delay.h>
#include <stdio.h>

// Serial transmit
void serial_avr(char *str)
{
    UCSRB = (1 << TXEN);
    UCSRC = (1 << UCSZ1) | (1 << UCSZ0) | (1 << URSEL);
    UBRRL = 51;
    for (unsigned int i = 0; str[i] != 0; i++)
    {
        UDR = str[i];
        while (!(UCSRA & (1 << UDRE)));
    }
    _delay_ms(500);
}

void i2c_init(void)
{
    TWSR = 0x00;
    TWBR = 0x47;
    TWCR = 0x04;
}

void i2c_start(void)
{
    TWCR = (1 << TWINT) | (1 << TWSTA) | (1 << TWEN);
    while ((TWCR & (1 << TWINT)) == 0);
}

void i2c_write(unsigned char data)
{
    TWDR = data;
    TWCR = (1 << TWINT) | (1 << TWEN);
    while ((TWCR & (1 << TWINT)) == 0);
}

unsigned char i2c_read(unsigned char ackVal)
{
    TWCR = (1 << TWINT) | (1 << TWEN) | (ackVal << TWEA);
    while (!(TWCR & (1 << TWINT)));
    return TWDR;
}

void i2c_stop()
{
    TWCR = (1 << TWINT) | (1 << TWEN) | (1 << TWSTO);
}

void main(void)
{
    int i = 23;
    unsigned char temp[20];
    i2c_init();
    i2c_start();
    i2c_write(0b10010001); // slave address for LM75A
    i2c_stop();
    i2c_init();
    i2c_start();
    i2c_write(0b00000000); // pointer register address of LM75A
    i2c_stop();
    i2c_init();
    i2c_start();
    temp[20] = i2c_read(1);
    i2c_stop();
    while (1)
    {
        serial_avr(temp);
        _delay_ms(2000);
    }
}
I am reading the temperature into an array and getting the output FF 7F 0F; when I store it in a normal char variable instead, I get the output C4. I am confused and don't know where I am missing the point. If there is any error in the code then please tell me, and also how to decode the output.
Please help, thanks!
The first obvious error is how you treat the array temp[20].
You only read one byte from the sensor, but then write the value off the end of the array. (The only valid spots in the array are temp[0] to temp[19]. temp[20] is past the memory allocated.) You should be reading 3 bytes from the sensor and storing them at temp[0] to temp[2].
The next error with temp is how you write it out over the serial. You should be writing all the bytes of the array, not all the bytes until a 0. You don't know the last byte of the array is 0, because it never had a string in it. A convenient thing to do would be to #define a value for the length of the array so you could refer to it in the declaration of the array and in the write function.
Until you fix these problems, it is hard to tell if the rest is working. I don't see how you can even know that the values from the sensor are FF 7F 0F.
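Putting those two fixes together, a sketch of the relevant parts might look like this (untested, reusing the I2C helper functions from the question; the exact read transaction the LM75A expects, including the read address and ACK/NACK handling, should still be checked against the datasheet):
#define TEMP_LEN 3  /* number of bytes read from the sensor */

/* Send exactly len bytes instead of stopping at the first 0 byte */
void serial_send(unsigned char *buf, unsigned int len)
{
    UCSRB = (1 << TXEN);                                 /* same UART setup as serial_avr */
    UCSRC = (1 << UCSZ1) | (1 << UCSZ0) | (1 << URSEL);
    UBRRL = 51;
    for (unsigned int i = 0; i < len; i++)
    {
        UDR = buf[i];
        while (!(UCSRA & (1 << UDRE)));
    }
}

int main(void)
{
    unsigned char temp[TEMP_LEN];

    i2c_init();
    i2c_start();
    i2c_write(0b10010001);   /* slave address with read bit set */
    temp[0] = i2c_read(1);   /* store into the valid indices 0..TEMP_LEN-1 */
    temp[1] = i2c_read(1);
    temp[2] = i2c_read(0);   /* last byte answered with NACK */
    i2c_stop();

    while (1)
    {
        serial_send(temp, TEMP_LEN);
        _delay_ms(2000);
    }
}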

Lua Alien - Defined variables in the Win32 API WaitForSingleObject function

I am using Alien for Lua to reference the WaitForSingleObject function in the Windows Kernel32.dll.
I am pretty new to Windows programming, so the question I have is about the following #defined variables referenced by the WaitForSingleObject documentation:
If dwMilliseconds is INFINITE, the function will return only when the object is signaled.
What is the INFINITE value? I would naturally assume it to be -1, but I cannot find this to be documented anywhere.
Also, with the following table, it mentions the return values in hexadecimal, but I am confused as to why they have an L character after the last digit. Could this be something as simple as casting it to a Long?
The reason I ask is that Lua uses a Number data type, so I am not sure whether I should be checking for this return value via hex digits (0-F) or decimal digits (0-9).
The thought crossed my mind to just write a small C++ application and print out these values, so I did just that:
#include <windows.h>
#include <process.h>
#include <cstdlib>
#include <iostream>

int main()
{
    // one value per line, so the numbers don't run together in the output
    std::cout << INFINITE << std::endl;
    std::cout << WAIT_OBJECT_0 << std::endl;
    std::cout << WAIT_ABANDONED << std::endl;
    std::cout << WAIT_TIMEOUT << std::endl;
    std::cout << WAIT_FAILED << std::endl;
    std::system("pause");
    return 0;
}
The final Lua results based on my findings are:
local INFINITE = 4294967295
local WAIT_OBJECT_0 = 0
local WAIT_ABANDONED = 128
local WAIT_TIMEOUT = 258
local WAIT_FAILED = 4294967295
I tried to Google for the same information. Eventually, I found this Q&A.
I found two sources with: #define INFINITE 0xFFFFFFFF
https://github.com/tpn/winsdk-10/blob/master/Include/10.0.10240.0/um/WinBase.h#L704
https://github.com/Alexpux/mingw-w64/blob/master/mingw-w64-tools/widl/include/winbase.h#L365
For function WaitForSingleObject, parameter dwMilliseconds has type DWORD.
From here: https://learn.microsoft.com/en-us/windows/win32/winprog/windows-data-types
I can see: DWORD A 32-bit unsigned integer.
Thus, #RemyLebeau's comment above looks reasonable & valid:
`4294967295` is the same as `-1` when interpreted as a signed integer type instead.
In short: ((DWORD) -1) == INFINITE
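As a quick sanity check (a small sketch, not from the original answer), the equivalence can be verified directly against the Windows headers:
#include <windows.h>
#include <iostream>

int main()
{
    // INFINITE is defined as 0xFFFFFFFF; DWORD is a 32-bit unsigned integer,
    // so converting -1 to DWORD wraps around to the same bit pattern.
    std::cout << std::boolalpha
              << (static_cast<DWORD>(-1) == INFINITE) << std::endl;  // prints: true
    return 0;
}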
Last comment: ironically, this "infinite" feels similar to the Boeing 787 problem where the plane needed to be rebooted once every 51 days. Eerily similar!
