Boost Asio synchronous HTTPS call - JSON response has unintended characters

We are migrating from HTTP to HTTPS using synchronous Boost Asio calls. I am using the code below to make a synchronous HTTPS call with SSL certificate validation, and the response arrives split across multiple lines. For now I strip the line-feed characters (\r\n) (the upstream system says it sends the response in a single line without any extra characters, as described below) and then parse the response, but sometimes we get extra characters inside the key/value pairs, as shown below:
try{
fast_ostringstream oss;
boost::asio::streambuf request_;
boost::asio::streambuf response_;
boost::system::error_code ec;
boost::asio::ssl::context ctx(boost::asio::ssl::context::sslv23);
ctx.set_verify_mode(boost::asio::ssl::verify_peer);
ctx.set_default_verify_paths(ec);
if (ec)
{
fast_ostringstream oss;
oss << "Issue in setting the default path: " << ec.message();
PBLOG_INFO(oss.str());
}
oss << ec.message();
ctx.add_verify_path("/home/test/pemcert/");
ctx.set_options(boost::asio::ssl::context::default_workarounds |
boost::asio::ssl::context::no_sslv2 |
boost::asio::ssl::context::no_sslv3);
boost::asio::ssl::stream<boost::asio::ip::tcp::socket> socket(io_service,ctx);
std::ostream request_stream(&request_);
request_stream << "POST " << server_endpoint << " HTTP/1.1\r\n";
request_stream << "Host: " << hostname << "\r\n";
request_stream << "Accept: */*\r\n";
request_stream << authorization_token << "\r\n";
request_stream << client_name << "\r\n";
request_stream << "Content-Length: " << req_str.length() << "\r\n";
request_stream << "Content-Type: application/x-www-form-urlencoded\r\n";
request_stream << "Connection: close\r\n\r\n";
request_stream << req_str; // body must be exactly Content-Length bytes; no trailing \r\n
tcp::resolver resolver(io_service);
tcp::resolver::query query(hostname, port_no);
tcp::resolver::iterator endpoint_iterator = resolver.resolve(query);
tcp::resolver::iterator end;
boost::system::error_code error = boost::asio::error::host_not_found;
boost::asio::connect(socket.lowest_layer(), endpoint_iterator, error);
boost::system::error_code echs;
socket.handshake(boost::asio::ssl::stream_base::client, echs);
boost::asio::write(socket, request_);
PBLOG_INFO("Trac Request successfully sent");
// Read the response status line.
boost::asio::read_until(socket, response_, "\r\n");
string res=make_string(response_);
// Check that response is OK.
std::istream response_stream(&response_);
std::string http_version;
response_stream >> http_version;
unsigned int status_code;
response_stream >> status_code;
std::string status_message;
std::getline(response_stream, status_message);
if (!response_stream || http_version.substr(0, 5) != "HTTP/")
{
PBLOG_WARN("Invalid response\n");
}
if (status_code != 200)
{
fast_ostringstream oss;
oss << "Response returned with status code: " << status_code << "\n";
PBLOG_WARN(oss.str());
}
boost::asio::read(socket, response_, boost::asio::transfer_all(), error);
// treat only the SSL "short read" (truncated stream on close) as benign
if (error && !(error.value() == 335544539 && strcmp(error.category().name(), "asio.ssl") == 0))
{
fast_ostringstream oss;
oss << "Error : " << error.message() << "Value:" << error.value() << "Category Name:" << error.category().name();
PBLOG_WARN(oss.str());
return false;
}
else
{
string message = make_string(response_);
size_t pos = message.find( "header" );
if( pos != std::string::npos)
{
pos = pos - 2;
string msg = message.substr(pos, message.length());
msg.erase(std::remove(msg.begin(),msg.end(),'\n'),msg.end());
msg.erase(std::remove(msg.begin(),msg.end(),'\r'),msg.end());
msg.erase(msg.size()-1); //to ignore the short read error
response = msg;
}
else
{
fast_ostringstream oss;
oss << "Invalid Response: " << message;
PBLOG_WARN(oss.str());
return false;
}
socket.lowest_layer().shutdown(tcp::socket::shutdown_both);
}
}
JSON response:
I couldn't paste the full response for security reasons, but a small part where the extra characters (here "21f0") get appended to the response is shown below:
"SGEType":{"decisionKey"21f0:"SGMtype","decisionValue":null,"decisionGroup":"partyTranslations","ruleName":"Party Details Translations"}
Please let me know whether the way I am reading from the socket is accurate or needs modification.

I couldn't paste the full response for security reasons, but a small part where the extra characters (here "21f0") get appended to the response is shown below:
We have no way of knowing. How big is the response? My wild stab is that the server is using chunked transfer encoding, which you are mishandling because you manually "non-parse" the HTTP response; "21f0" looks exactly like a hexadecimal chunk-size line (0x21f0 = 8688 bytes).
In that case, my earlier answer might be kind of prophetic:
Luckily, you can also refer to that live example to see how to use Boost Beast to read the response correctly.
Live On Coliru
I'll also repeat the summary because the lesson is an important one:
Side note: Just reading until EOF would probably have worked for HTTP/1.0. But the server might rightfully reject that version or choose to respond with HTTP/1.1 anyways.
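The stray "21f0" is consistent with that guess: under Transfer-Encoding: chunked, the server interleaves hexadecimal chunk-size lines with the payload, so naively concatenating everything after the headers splices those size lines into your JSON. A minimal sketch of how a chunked body decodes (illustration only; a real client should let Beast's parser handle this, including chunk extensions and trailers):

```cpp
#include <cstddef>
#include <string>

// Minimal decoder for a "Transfer-Encoding: chunked" body.
// Each chunk is "<hex size>\r\n<data>\r\n"; a "0\r\n\r\n" chunk ends the body.
std::string decode_chunked(const std::string& in) {
    std::string out;
    std::size_t pos = 0;
    for (;;) {
        std::size_t eol = in.find("\r\n", pos);
        if (eol == std::string::npos) break;
        // the line before each chunk is its size in hex -- e.g. "21f0"
        std::size_t len = std::stoul(in.substr(pos, eol - pos), nullptr, 16);
        if (len == 0) break;                 // terminating zero-length chunk
        out.append(in, eol + 2, len);        // copy the chunk payload
        pos = eol + 2 + len + 2;             // skip payload and trailing CRLF
    }
    return out;
}
```

If your accumulated body contains short hex lines like 21f0 between JSON fragments, this is almost certainly what is happening.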

Related

Problem with multiple controllers in OMNeT++ SDN

I'm trying to build multiple controllers. The original code, which connects all switches to one controller, is as follows:
void OFA_switch::connect()
{
socket.renewSocket();
int connectPort = par("connectPort");
/*
const char *connectAddress= par("connectAddress");
EV << "connectAddress = " << connectAddress << " connectPort =" << connectPort << endl;
if (getParentModule()->getParentModule()->getSubmodule("controller") != NULL)
{
// multiple controllers; full path is needed for connect address
connectAddress = (getParentModule()->getParentModule())->getSubmodule("controller")->getFullPath().c_str();
cModule *ctl = getSystemModule()->getSubmodule("controller");
if(ctl != NULL) {
EV << "ctl->getFullPath() = " << ctl->getFullPath().c_str() << endl;
connectAddress = ctl->getFullPath().c_str();
}
EV << "After: connectAddress = " << connectAddress << endl;
}
*/
L3Address ctlIPAddr;
EV << "connect L3Address = " << L3AddressResolver().tryResolve("controller", ctlIPAddr) << endl;
// EV << "result: connectAddress = " << ctlIPAddr << endl;
// socket.connect(L3AddressResolver().resolve(connectAddress), connectPort);
socket.connect(ctlIPAddr, connectPort);
}
I'm trying to make some switches connect to controller1 while the other switches connect to controller2, so I adapted the code as follows:
void My_OFA_switch::connect() {
socket.renewSocket();
int connectPort = par("connectPort");
const char *connectAddress = par("connectAddress");
EV << "connectAddress = " << connectAddress << " connectPort =" << connectPort << endl;
const char *connectAddr;
cModule *ctl;
if (strcmp (connectAddress, "controller1")==0)
{ connectAddr = (getParentModule())->getSubmodule("controller1")->getFullPath().c_str();
ctl = getSystemModule()->getSubmodule("controller1");
}
else if (strcmp (connectAddress, "controller2")==0)
{ connectAddr = (getParentModule())->getSubmodule("controller2")->getFullPath().c_str();
ctl = getSystemModule()->getSubmodule("controller2");
}
if(ctl != NULL) {
EV << "ctl->getFullPath() = " << ctl->getFullPath().c_str() << endl;
connectAddr = ctl->getFullPath().c_str();
}
L3Address ctlIPAddr;
EV << "connect L3Address = " << L3AddressResolver().tryResolve(connectAddr, ctlIPAddr) << endl;
socket.connect(ctlIPAddr, connectPort);
}
Also, there is a file Switch.cc which implements the controller behavior (selected in the ini file via *.controller.behavior = "Switch") and contains:
void Switch::initialize() {
cModule *ITModule =
getParentModule()->getSubmodule("ofa_controller");
controller = check_and_cast<OFA_controller *>(ITModule);
getParentModule()->subscribe("PacketIn",this); }
Should I change something here?
But when I run it, the following runtime error appears and the simulation closes immediately:
Simulation run has encountered a problem. Finished with error.
And in console it appeared:
Simulation terminated with exit code: -1073741819
Working directory: D:/omnet/OpenFlowOmnet/omnetpp-5.6.2-src-windows/omnetpp-5.6.2/myws/openflow/scenarios
Command line: ../openflow.exe -m -n ..;../../inet/src;../../inet/examples;../../inet/tutorials;../../inet/showcases --image-path=../images;../../inet/images -l ../../inet/src/INET My_2Domain_Ctrl.ini
OMNETPP_ROOT=D:/omnet/OpenFlowOmnet/omnetpp-5.6.2-src-windows/omnetpp-5.6.2/
OMNETPP_IMAGE_PATH=D:\omnet\OpenFlowOmnet\omnetpp-5.6.2-src-windows\omnetpp-5.6.2\images
I really appreciate any guidance and help, because I still have a lot of work to do and I'm running out of time.
One has to remember that strcmp() returns 0 if the contents of both strings are equal. And in C++ zero means false.
So if you want to do something when connectAddress is equal to "controller1", you should write:
if (strcmp (connectAddress, "controller1") == 0) {
// ...
}
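A minimal illustration of that pitfall (isController1 is a hypothetical helper, not part of the OMNeT++ code above):

```cpp
#include <cstring>

// strcmp() returns 0 when the strings are equal, and in C++ the int 0
// converts to false. A bare `if (strcmp(a, b))` therefore fires on
// *unequal* strings; always compare the result against 0 explicitly.
bool isController1(const char* connectAddress) {
    return std::strcmp(connectAddress, "controller1") == 0;
}
```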
Generally, in order to deal with a runtime exception in OMNeT++:
1. Set debug-on-errors = true in your omnetpp.ini.
2. Build the project in debug mode.
3. Start the simulation in debug mode (i.e. Run | Debug).
The simulation will stop at the line that causes the error, and the stack trace will show the calls involved.
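As a sketch, the first step is a one-line setting in omnetpp.ini (assuming the usual [General] section):

```ini
[General]
debug-on-errors = true
```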
Reference: Learn OMNeT++ with TicToc - Runtime errors

Serial Communication data problem between Windows and embedded System (STM32) (C/C++)

I am currently trying to set up communication between a Windows program and a µC.
I'll show you the code to initialize the port:
int serialCommunication::serialInit(void){
//non overlapped communication
hComm = CreateFile( gszPort.c_str(),
GENERIC_READ | GENERIC_WRITE,
0,
0,
OPEN_EXISTING,
0,
0);
if (hComm == INVALID_HANDLE_VALUE){
cout << "Error opening port." << endl;
return 0;
}
else{
cout << "Opened Port successfully." << endl;
}
if (SetCommMask(hComm, EV_RXCHAR) == FALSE){
cout << "Error setting communications mask." << endl;
return 0;
}
else{
cout << "Communications mask set successfully." << endl;
}
if (GetCommState(hComm, &dcbSerialParams) == FALSE){
cout << "Error getting CommState." << endl;
return 0;
}
else{
cout << "CommState retrieved successfully" << endl;
}
dcbSerialParams.BaudRate = CBR_115200; // Setting BaudRate = 115200
dcbSerialParams.ByteSize = 8; // Setting ByteSize = 8
dcbSerialParams.StopBits = ONESTOPBIT; // Setting StopBits = 1
dcbSerialParams.Parity = NOPARITY; // Setting Parity = None
if (SetCommState(hComm, &dcbSerialParams) == FALSE){
cout << "Error setting CommState" << endl;
return 0;
}
else{
cout << "CommState set successfully" << endl << endl;
cout << "+---CommState Parameters---+" << endl;
cout << "Baudrate = " << dcbSerialParams.BaudRate << endl;
cout << "ByteSize = " << static_cast<int>(dcbSerialParams.ByteSize) << endl; // cast so an int is printed instead of a char
cout << "StopBits = " << static_cast<int>(dcbSerialParams.StopBits) << endl; // cast so an int is printed instead of a char
cout << "Parity = " << static_cast<int>(dcbSerialParams.Parity) << endl; // cast so an int is printed instead of a char
cout << "+--------------------------+" << endl;
}
/*------------------------------------ Setting Timeouts --------------------------------------------------*/
timeouts.ReadIntervalTimeout = 50;
timeouts.ReadTotalTimeoutConstant = 50;
timeouts.ReadTotalTimeoutMultiplier = 10;
timeouts.WriteTotalTimeoutConstant = 50;
timeouts.WriteTotalTimeoutMultiplier = 10;
if (SetCommTimeouts(hComm, &timeouts) == FALSE){
cout << "Error setting timeouts" << endl;
return 0;
}
else{
cout << "Timeouts set successfully." << endl;
cout << "+--------------------------+" << endl;
return 1;
}
}
My Read function looks like this:
void serialCommunication::serialRead(void){
bool readStatus;
bool purgeStatus = 0;
bool correctData = 0;
cout << "Waiting for Data..." << endl; // Programm waits and blocks Port (like Polling)
readStatus = WaitCommEvent(hComm, &dwEventMask, 0);
if (readStatus == FALSE){
cout << "Error in setting WaitCommEvent." << endl;
}
else{
cout << "Data received." << endl;
do{
readStatus = ReadFile(hComm, &TempChar, sizeof(TempChar), &NoBytesRead, 0);
SerialBuffer += TempChar; // add tempchar to the string
}while (NoBytesRead > 0);
SerialBuffer.pop_back(); // Delete last sign in buffer, otherwise one "0" too much shows up, for example "23900" instead of "2390"
cout << endl << SerialBuffer << endl;
SerialBuffer = ""; // Reset string
}
}
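Note that the do-while above appends TempChar one extra time: on the final pass the read returns zero bytes, TempChar keeps its previous value, and it is still appended before the loop condition is checked, which is exactly why "23900" shows up instead of "2390". A minimal sketch with a fake byte source (FakeReader is a stand-in for ReadFile, not Windows API code):

```cpp
#include <cstddef>
#include <string>

// Simulates a ReadFile-style call: delivers one byte at a time, and on
// exhaustion reports zero bytes read while leaving `out` untouched.
struct FakeReader {
    std::string data;
    std::size_t pos = 0;
    void read(char& out, unsigned long& bytesRead) {
        if (pos < data.size()) { out = data[pos++]; bytesRead = 1; }
        else { bytesRead = 0; }              // out keeps its stale value
    }
};

std::string readAllBuggy(FakeReader& r) {    // mirrors the original do-while
    std::string buf; char tempChar = 0; unsigned long n = 0;
    do { r.read(tempChar, n); buf += tempChar; } while (n > 0);
    return buf;                              // last byte duplicated
}

std::string readAllFixed(FakeReader& r) {    // append only bytes actually read
    std::string buf; char tempChar = 0; unsigned long n = 0;
    do { r.read(tempChar, n); if (n > 0) buf += tempChar; } while (n > 0);
    return buf;
}
```

With the fixed loop, the pop_back() workaround is no longer needed.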
So at some point, my µC sends the string "Init complete...!\r\n" after initializing some things. This works well (screenshot: Init complete).
Now, after that, the communication produces errors: I am receiving data I should not receive. The µC can only send data if a specific string is sent to it by the PC. While debugging I found that the µC never receives this string and therefore never sends data. The following picture shows the gibberish I am receiving.
(screenshot: receiving gibberish)
EDIT: I am constantly receiving the same gibberish.
The funny thing is, I receive that data even when the µC is completely switched off (the serial cables are still connected). So there must be some data at the port that just is not cleared. I tried restarting the PC as well, but it didn't help either.
I will also show you my while loop on PC:
while (testAbbruch != 1){
pointer = acMessung(anzahlMessungen, average); // measurement with external multimeter
cout << endl;
cout << "Average: " << average << endl << endl;
if (average >= 30){
testAbbruch = 1; // there won't be a next while iteration
befehl = "stopCalibration\r\n";
serialTest.serialWrite(befehl);
serialTest.serialRead();
}
else{
cout << "Aktion: ";
std::getline (cin, befehl);
befehl = "increment"; //for debugging
if (befehl == "increment"){
befehl.append("\r\n"); // adding it, so the µC can detect the string correctly
serialTest.serialWrite(befehl);
serialTest.serialRead(); // µC has to answer
}
else if(befehl == "decrement"){
befehl.append("\r\n"); // adding it, so the µC can detect the string correctly
serialTest.serialWrite(befehl);
serialTest.serialRead(); // µC has to answer
}
befehl = ""; // clear the string for the next call
}
}
I know my program is far from perfect, but if I understood serial communication on Windows correctly, the buffer is cleared while reading.
Is there any clue you could give me?
EDIT: I just wrote a program that expects one of two inputs, "increment" or "decrement", which are sent to the µC via the serial communication port. Every time I send "increment" and read from the port immediately afterwards, I receive the weird data from the first picture; every time I send "decrement" and read immediately afterwards, I receive the weird data from the second picture.
So my guess is that the data is somehow changed and then looped back to the PC? But why, and how?

Boost Beast HTTP

I am working on an HTTP parser, and Boost.Beast looks like a good fit. However, I still have some questions:
Assume the HTTP POST request data has already been received via a Boost.Asio socket and is stored in a std::string buffer.
Is there a good sample showing how to extract the HTTP header fields and their values (one after another)? I assume it is an iterator-based approach, but I have tried several ways and it still won't work.
How do I extract the HTTP body?
Thank you very much.
Starting from a simple example: https://www.boost.org/doc/libs/develop/libs/beast/example/http/client/sync/http_client_sync.cpp
// Declare a container to hold the response
http::response<http::dynamic_body> res;
// Receive the HTTP response
http::read(socket, buffer, res);
Extract The Headers
The response object already contains all the goods:
for(auto const& field : res)
std::cout << field.name() << " = " << field.value() << "\n";
std::cout << "Server: " << res[http::field::server] << "\n";
You can also just stream the entire response object:
std::cout << res << std::endl;
Extract The Body
std::cout << "Body size is " << res.body().size() << "\n";
To actually use the "dynamic_body", use standard Asio buffer manipulation:
#include <boost/asio/buffers_iterator.hpp>
std::string body { boost::asio::buffers_begin(res.body().data()),
boost::asio::buffers_end(res.body().data()) };
std::cout << "Body: " << std::quoted(body) << "\n";
Alternatively, see beast::buffers_to_string
Obviously, things become more straightforward when using a string_body:
std::cout << "Body: " << std::quoted(res.body()) << "\n";

Boost Interprocess Send giving error: boost::interprocess_exception::library_error

I am using a Boost message queue to communicate among different processes. I am transmitting objects of type Packet; to do this, I use serialization and deserialization in the send and receive functions.
However, when I try to send the data, I am getting this error:
boost::interprocess_exception::library_error
No other information is given.
This is how I create message queues.
for(i = 0; i< PROC_MAX_E ; i++){
std::string mqName = std::string("mq") + std::to_string(i);
std::cout << " Size of Packet is " << sizeof(Packet) << std::endl;
message_queue mq(open_or_create, mqName.c_str(), MAX_QUEUE_SIZE_E, 100*sizeof(Packet)); // size of packet later
}
This is my Packet :
class Packet{
public :
Packet();
Packet(uint32_t aType, uint32_t aProcId);
~Packet();
uint32_t getType();
union{
uint32_t mFuncId;
//uint8_t mResult8;
uint32_t mResult32;
//uint64_t mResult64;
//bool mResult;
//uint8_t* mAddr8;
//uint32_t* mAddr32;
//uint64_t* mAddr64;
//char mData[MAX_PACKET_SIZE]; // This will be used to store serialized data
};
friend class boost::serialization::access;
template <class Archive>
void serialize(Archive & ar, const unsigned int version){
ar & _mType;
ar & _mProcId;
//ar & mData;
ar & mFuncId;
//ar & mResult32;
}
private :
uint32_t _mType;
uint32_t _mProcId;
}; // end class
} // end namespace
This is my serialize and deserialize functions:
std::string IPC::_serialize(Packet aPacket){
std::stringstream oss;
boost::archive::text_oarchive oa(oss);
oa << aPacket;
std::string serialized_string (oss.str());
return serialized_string;
}
Packet IPC::_deserialize(std::string aData){
Packet p;
std::stringstream iss;
iss << aData;
boost::archive::text_iarchive ia(iss);
ia >> p;
return p;
}
And this is my send and receive functions:
bool IPC::send(uint32_t aProcId, Packet aPacket){
try{
_mLogFile << "<-- Sending Data to Process : " << aProcId << std::endl;
//uint32_t data = aPacket;
std::string mqName = std::string("mq") + std::to_string(aProcId);
message_queue mq(open_only, mqName.c_str());
//serialize Packet
std::cout << "Serializing \n";
std::string data = _serialize(aPacket);
std::cout << " Serialized data =" << data.data() << "Size = " << data.size()<< std::endl;
mq.send(data.data(), data.size(), 0);
//mq.send(&data, sizeof(uint32_t), 0);
}catch(interprocess_exception &ex){
_mLogFile << "***ERROR*** in IPC Send to process : " << aProcId << " " << ex.what() << std::endl;
std::cout << "***ERROR*** in IPC Send to process : " << aProcId << " " << ex.what() << std::endl;
_ipc_exit();
}
}
I get the exception during mq.send. When I transmit only integers it works fine; only with serialization and deserialization do I get this error.
Any help is greatly appreciated; I am a little stuck, as the exception message is not clear either.
I am using Boost 1.57.0.
Try closing (destroying) the archive before taking the string: a Boost text archive only finishes writing when its destructor runs, so the serialized string was incomplete. Scope the archive:
std::string IPC::_serialize(Packet aPacket){
std::stringstream oss;
{
boost::archive::text_oarchive oa(oss);
oa << aPacket;
}
return oss.str();
}
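The same idea as a self-contained sketch (the toy Writer below is a stand-in for text_oarchive: it emits its closing byte only in its destructor, just as the archive writes its trailing data on destruction):

```cpp
#include <sstream>
#include <string>

// A buffered writer that only completes its output when destroyed.
struct Writer {
    std::ostringstream& os;
    explicit Writer(std::ostringstream& s) : os(s) { os << "["; }
    ~Writer() { os << "]"; }                 // emitted only on destruction
    void put(const std::string& v) { os << v; }
};

std::string serializeGood(const std::string& payload) {
    std::ostringstream oss;
    {
        Writer w(oss);                       // scoped: destructor runs first
        w.put(payload);
    }
    return oss.str();                        // complete output
}

std::string serializeBad(const std::string& payload) {
    std::ostringstream oss;
    Writer w(oss);
    w.put(payload);
    return oss.str();                        // taken before ~Writer: truncated
}
```

With the truncated string, the receiver's deserializer (or here, the message queue) has every right to reject the data.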

Winsock IRC client connects but does not send data

I'm using the code posted on http://social.msdn.microsoft.com/Forums/en/vcgeneral/thread/126639f1-487d-4755-af1b-cfb8bb64bdf8, but it doesn't send data, just as described in the first post. How do I use WSAGetLastError(), as the solution suggests, to find out what's wrong?
I tried the following:
void IRC::SendBuf(char* sendbuf)
{
int senderror = send(m_socket, sendbuf, sizeof(sendbuf), MSG_OOB);
if(senderror == ERROR_SUCCESS) {
printf("Client: The test string sent: \"%s\"\n", sendbuf);
}
else {
cout << "error is: " << senderror << ", WSAGetLastError: " << WSAGetLastError() << endl;
printf("Client: The test string sent: \"%s\"\n", sendbuf);
}
}
And the output is: error is: 4, WSAGetLastError: 0
You're evaluating the address of WSAGetLastError instead of calling it. You need to add parentheses in order to actually call the function:
void IRC::SendBuf(char* sendbuf)
{
int senderror = send(m_socket, sendbuf, strlen(sendbuf), 0);
if (senderror != SOCKET_ERROR) {
printf("Client: The test string sent: \"%s\"\n", sendbuf);
} else {
cout << "Error is: " << WSAGetLastError() << endl;
}
}
EDIT: The send() function returns the number of bytes written, not an error code. You need to test the return value against SOCKET_ERROR, as in the updated code above. In your case, send() reports that it successfully sent 4 bytes.
As you noted below, it only sends 4 bytes because that's the size of the sendbuf variable (it's a pointer, not a buffer). If the string in sendbuf is null-terminated, you can use strlen() instead. If it isn't, you probably should add a string length parameter to IRC::SendBuf() itself.
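The byte-count issue can be shown without any sockets (bytesToSend is a hypothetical helper illustrating what the send() call should pass):

```cpp
#include <cstddef>
#include <cstring>

// sizeof applied to a pointer parameter yields the size of the pointer
// (4 bytes on a 32-bit build, 8 on 64-bit), not the length of the string
// it points to -- which is why the original code sent exactly 4 bytes.
// For NUL-terminated strings, use strlen() instead.
std::size_t bytesToSend(const char* sendbuf) {
    return std::strlen(sendbuf);        // not sizeof(sendbuf)
}
```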
