Qt network reply is empty - qnetworkrequest

I know that this question has been asked many times, but in my case the suggested code and solutions aren't cutting it. The network reply is still empty in my case and the error code is 0.
Here's my function:
QString NWork::send(QVector<QString> &data) const{
    //QNetworkAccessManager qnam = new QNetworkAccessManager();
    QNetworkAccessManager qnam;
    try{
        QString json = NWork::to_JSON(data);
        QByteArray json_data(json.toUtf8());
        QNetworkRequest request;
        request.setUrl(QUrl(NWork::connection));
        request.setRawHeader("Content-Type", "application/json");
        request.setRawHeader("Content-Length", json_data);
        reply = qnam.post(request, json_data);
        //reply = qnam.get(request);
        int status = reply->attribute(QNetworkRequest::HttpStatusCodeAttribute).toInt();
        QString s(reply->readAll());
        qDebug()<<"code "<<status<<"Content "<<s;
        //return QString::fromUtf8(response.data(),response.size());
    }catch(std::exception x){
        std::cout<<x.what()<<std::endl;
    }
    return "";
}
Making connections of the form suggested by many, like
connect(qnam,SIGNAL(destroyed(QNetworkReply*)),this,SLOT(read(QNetworkReply*)));
have no effect at all. The request is reaching the PHP script; I know this because it writes the request data to a file, and it does so for every request. Echoing anything back, even with a text/html header, is not working.
Yes, I have tried my PHP script with an HTML/AJAX test page and it works: it writes to the file and returns a response to the browser. Same code in both cases.
Here's my PHP code:
header("Access-Control-Allow-Origin: *");
$k = file_get_contents("php://input");
$file = "/file/path/log.k";
//echo $file;
$handle = fopen($file, "a+");
if($handle){
echo $k;
fwrite($handle, $k."\n");
fclose($handle);
}
header("Content-Type: text/html");
echo "line 22 ".$que;
exit(0);
I've checked my apache2 error logs and nothing shows up there. Why is it not working in my case?

I know this is an almost year-old question, but I just started teaching myself Qt, recently ran into this issue as well, and was brought to this page. So for those who are also stuck on this, here is how I solved it.
First change the connect from:
connect(qnam,SIGNAL(destroyed(QNetworkReply*)),this,SLOT(read(QNetworkReply*)));
to:
connect(reply, SIGNAL(finished()), this, SLOT(onReply()));
Then add it to your code after the post call like so:
reply = qnam.post(request, json_data);
connect(reply, SIGNAL(finished()), this, SLOT(onReply()));
finished() is one of QNetworkReply's signals and is emitted when the reply has finished. The method inside SLOT() is a Q_SLOT that you have to declare in your header. Then move your code into your onReply() slot; it would look similar to this:
QNetworkReply* reply = qobject_cast<QNetworkReply*>(sender());
if (reply)
{
    QString response;
    if (reply->error() == QNetworkReply::NoError)
    {
        const int available = reply->bytesAvailable();
        if (available > 0)
        {
            const QByteArray buffer = reply->readAll();
            response = QString::fromUtf8(buffer);
            // success = true;
        }
    }
    else
    {
        response = tr("Error: %1 status: %2").arg(reply->errorString(), reply->attribute(QNetworkRequest::HttpStatusCodeAttribute).toString());
    }
    qDebug()<<"code: "<<reply->attribute(QNetworkRequest::HttpStatusCodeAttribute).toString()<<" response: "<<response;
    reply->deleteLater();
}
sources: QNetworkReply, BlackBerry Sample App Maven source code
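One more note, as a minimal sketch rather than the asker's exact code: the slot has to be declared in the header, and the QNetworkAccessManager should be a class member (with a stack-local manager, as in the original send(), the manager and its reply are destroyed when the function returns, so finished() may never be delivered). Class, member, and URL names below are assumptions:
#include <QObject>
#include <QNetworkAccessManager>
#include <QNetworkReply>
#include <QNetworkRequest>
#include <QUrl>

// Header (e.g. nwork.h)
class NWork : public QObject
{
    Q_OBJECT
public:
    explicit NWork(QObject *parent = nullptr) : QObject(parent) {}
    void send(const QByteArray &jsonData);

private slots:
    void onReply();                      // the slot shown above

private:
    QNetworkAccessManager qnam;          // member, so it outlives each request
    QNetworkReply *reply = nullptr;
};

// Source (e.g. nwork.cpp)
void NWork::send(const QByteArray &jsonData)
{
    QNetworkRequest request(QUrl(QStringLiteral("http://example.com/endpoint.php"))); // placeholder URL
    request.setHeader(QNetworkRequest::ContentTypeHeader, QStringLiteral("application/json"));
    reply = qnam.post(request, jsonData);
    connect(reply, SIGNAL(finished()), this, SLOT(onReply()));
}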

Related

Google reCAPTCHA working on Local but not on my server

I have a form which requires a Google reCAPTCHA to be ticked. It works perfectly locally but does not work when I put it on the development server. I have replaced the registered keys with the ones assigned to me by Google.
It keeps outputting the error message.
The method in my form is post.
I do not understand why it doesn't work. Can someone please help me?
Here is my code:
$secretKey = "#######";
$captcha = $_POST['g-recaptcha-response'];
$ip = $_SERVER['REMOTE_ADDR'];
$response=file_get_contents("https://www.google.com/recaptcha/api/siteverify?secret=".$secretKey."&response=".$captcha."&remoteip=".$ip);
$responseKeys = json_decode($response,true);
if(intval($responseKeys["success"]) !== 1) {
    $throwErrorMessage = "You are a robot! ";
    $throwError = 1;
    $isvalid = False;
}
What version of reCAPTCHA are you using? The docs on Google's website here are quite different from your code. In particular, you use the function file_get_contents while the documentation uses recaptcha_check_answer, as in this example:
<?php
require_once('recaptchalib.php');
$privatekey = "your_private_key";
$resp = recaptcha_check_answer ($privatekey,
$_SERVER["REMOTE_ADDR"],
$_POST["recaptcha_challenge_field"],
$_POST["recaptcha_response_field"]);
if (!$resp->is_valid) {
// What happens when the CAPTCHA was entered incorrectly
die ("The reCAPTCHA wasn't entered correctly. Go back and try it again."."(reCAPTCHA said: " . $resp->error . ")");
} else {
// Your code here to handle a successful verification
}
?>
Can you post the error message?

No definition found for Table yahoo.finance.xchange

I have a service which uses the Yahoo! Finance table yahoo.finance.xchange. This morning I noticed it has stopped working, because Yahoo! suddenly started to return an error saying:
{
"error": {
"lang": "en-US",
"description": "No definition found for Table yahoo.finance.xchange"
}
}
This is the request URL. Interesting fact: if I try to refresh the query multiple times, sometimes I get back a correct response, but this happens very rarely (about 10% of the time). Everything was fine in the days before.
Does this mean the Yahoo API is down, or am I missing something because the API was changed? I would appreciate any help.
Since I have the same problem, it started today as well, others came to post here at exactly the same time, and it still works most of the time, the only explanation I can find is that they have some random database errors on their end, and we can hope that this will be solved soon. I also get about a 20% failure rate when refreshing the query page.
My guess is that they use many servers to handle the requests (let's say 8) and that one of them is empty or doesn't have that table for some reason, so whenever the query is directed to that server, the error is returned.
Temporary solution: just modify your script to retry 3-4 times. That did it for me, because out of 5 attempts at least one succeeds.
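For illustration, a rough PHP sketch of that retry idea (the function name and pause are made up; $url is whatever YQL query you are already sending):
function fetch_with_retry($url, $attempts = 4) {
    for ($i = 0; $i < $attempts; $i++) {
        $body = @file_get_contents($url);
        // Treat both transport failures and the "No definition found" body as a miss.
        if ($body !== false && strpos($body, 'No definition found') === false) {
            return $body;
        }
        usleep(250000); // small pause before trying again
    }
    return false;       // every attempt failed
}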
I solved this issue by using quote.yahoo.com instead of the query.yahooapis.com service. Here's my code:
function devise($currency_from, $currency_to, $amount_from){
    $url = "http://quote.yahoo.com/d/quotes.csv?s=" . $currency_from . $currency_to . "=X" . "&f=l1&e=.csv";
    $handle = fopen($url, "r");
    $exchange_rate = fread($handle, 2000);
    fclose($handle);
    $amount_to = $amount_from * $exchange_rate;
    return round($amount_to, 2);
}
EDIT: the above no longer works. At this point, let's just forget about Yahoo, lol. Use this instead:
function convertCurrency($from, $to, $amount)
{
    $url = file_get_contents('https://free.currencyconverterapi.com/api/v5/convert?q=' . $from . '_' . $to . '&compact=ultra');
    $json = json_decode($url, true);
    $rate = implode(" ", $json); // compact=ultra returns a single key/value pair, so this yields just the rate
    $total = $rate * $amount;
    return $total;
}
Same error; I migrated to http://finance.yahoo.com.
Here is a C# example:
private static readonly ILog Log = LogManager.GetCurrentClassLogger();
private int YahooTimeOut = 4000;
private int Try { get; set; }
public decimal GetRate(string from, string to)
{
var url =
string.Format(
"http://finance.yahoo.com/d/quotes.csv?e=.csv&f=sl1d1t1&s={0}{1}=X", from, to);
var request = (HttpWebRequest)WebRequest.Create(url);
request.UseDefaultCredentials = true;
request.ContentType = "text/csv";
request.Timeout = YahooTimeOut;
try
{
using (var response = (HttpWebResponse)request.GetResponse())
{
var resStream = response.GetResponseStream();
using (var reader = new StreamReader(resStream))
{
var html = reader.ReadToEnd();
var values = Regex.Split(html, ",");
var rate = Convert.ToDecimal(values[1], new CultureInfo("en-US"));
if (rate == 0)
{
Thread.Sleep(550);
++Try;
return Try < 5 ? GetRate(from, to) : 0;
}
return rate;
}
}
}
catch (Exception ex)
{
Log.Warning("Get currency rate from Yahoo fail " + ex);
Thread.Sleep(550);
++Try;
return Try < 5 ? GetRate(from, to) : 0;
}
}
I've got the same issue.
I need exchange rates in my app, so I decided to use the currencylayer.com API instead; they provide 168 currencies, including precious metals and Bitcoin.
I've also written a microservice using webtask.io to cache rates from currencylayer and do cross-rate calculations.
And I've written a blog post about it 🤓
Check it out if you want to run your own microservice, it's pretty easy 😉
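In case the cross-rate part isn't obvious, here is a rough sketch of the arithmetic (the quotes and helper name below are made up, not real data; currencylayer's free plan quotes everything against USD):
function cross_rate(array $usdQuotes, $from, $to) {
    // e.g. $usdQuotes = ['USDEUR' => 0.92, 'USDGBP' => 0.79]
    return $usdQuotes['USD' . $to] / $usdQuotes['USD' . $from];
}

echo cross_rate(['USDEUR' => 0.92, 'USDGBP' => 0.79], 'EUR', 'GBP'); // ~0.8587 GBP per EUR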
I found a solution: in my case, just changing http to https made everything work fine.

Firefox file downloading process structure figure

For my project, I need to study something like a "Firefox/Gecko file downloading structure overview" (if any), or a "file downloading process flow chart of Firefox/Gecko". I couldn't find anything like that on the Internet so far. Is there any info about it? Thanks a lot.
PS: It should cover the paths of all file downloads through the Firefox browser, i.e. which network connection APIs and file handling APIs are involved, something like an "httpOpenRequest" or "DoFileDownload" API (if any).
What would the Firefox downloading process API paths be? Is there any figure or chart?
Please help me...
You are probably going to need to look at the code to get the information you desire. You will need to build the flowchart yourself.
There are a couple of different ways downloading is done in the code.
If you are talking about a Firefox add-on performing a download, then it is probably being done using Downloads.jsm (although there is an older method for doing so). The source code for that JavaScript module is at resource://gre/modules/Downloads.jsm (this URL is only valid from within Firefox). There appear to be several files, all located in the jsloader\resource\gre\modules directory within the zip-format file called omni.ja in the root of the Firefox distribution. You can just copy that file, change the name to omni.zip, and access it as a normal .zip file.
If you want to know how Firefox saves a page when the user requests it: it is defined in the context menu, with the oncommand value being gContextMenu.saveLink();. saveLink() is defined in chrome://browser/content/nsContextMenu.js. It does some housekeeping and then calls saveHelper(), which is in the same file.
The saveHelper() code is the following:
// Helper function to wait for appropriate MIME-type headers and
// then prompt the user with a file picker
saveHelper: function(linkURL, linkText, dialogTitle, bypassCache, doc) {
// canonical def in nsURILoader.h
const NS_ERROR_SAVE_LINK_AS_TIMEOUT = 0x805d0020;
// an object to proxy the data through to
// nsIExternalHelperAppService.doContent, which will wait for the
// appropriate MIME-type headers and then prompt the user with a
// file picker
function saveAsListener() {}
saveAsListener.prototype = {
extListener: null,
onStartRequest: function saveLinkAs_onStartRequest(aRequest, aContext) {
// if the timer fired, the error status will have been caused by that,
// and we'll be restarting in onStopRequest, so no reason to notify
// the user
if (aRequest.status == NS_ERROR_SAVE_LINK_AS_TIMEOUT)
return;
timer.cancel();
// some other error occurred; notify the user...
if (!Components.isSuccessCode(aRequest.status)) {
try {
const sbs = Cc["@mozilla.org/intl/stringbundle;1"].
getService(Ci.nsIStringBundleService);
const bundle = sbs.createBundle(
"chrome://mozapps/locale/downloads/downloads.properties");
const title = bundle.GetStringFromName("downloadErrorAlertTitle");
const msg = bundle.GetStringFromName("downloadErrorGeneric");
const promptSvc = Cc["@mozilla.org/embedcomp/prompt-service;1"].
getService(Ci.nsIPromptService);
promptSvc.alert(doc.defaultView, title, msg);
} catch (ex) {}
return;
}
var extHelperAppSvc =
Cc["#mozilla.org/uriloader/external-helper-app-service;1"].
getService(Ci.nsIExternalHelperAppService);
var channel = aRequest.QueryInterface(Ci.nsIChannel);
this.extListener =
extHelperAppSvc.doContent(channel.contentType, aRequest,
doc.defaultView, true);
this.extListener.onStartRequest(aRequest, aContext);
},
onStopRequest: function saveLinkAs_onStopRequest(aRequest, aContext,
aStatusCode) {
if (aStatusCode == NS_ERROR_SAVE_LINK_AS_TIMEOUT) {
// do it the old fashioned way, which will pick the best filename
// it can without waiting.
saveURL(linkURL, linkText, dialogTitle, bypassCache, false,
doc.documentURIObject, doc);
}
if (this.extListener)
this.extListener.onStopRequest(aRequest, aContext, aStatusCode);
},
onDataAvailable: function saveLinkAs_onDataAvailable(aRequest, aContext,
aInputStream,
aOffset, aCount) {
this.extListener.onDataAvailable(aRequest, aContext, aInputStream,
aOffset, aCount);
}
}
function callbacks() {}
callbacks.prototype = {
getInterface: function sLA_callbacks_getInterface(aIID) {
if (aIID.equals(Ci.nsIAuthPrompt) || aIID.equals(Ci.nsIAuthPrompt2)) {
// If the channel demands authentication prompt, we must cancel it
// because the save-as-timer would expire and cancel the channel
// before we get credentials from user. Both authentication dialog
// and save as dialog would appear on the screen as we fall back to
// the old fashioned way after the timeout.
timer.cancel();
channel.cancel(NS_ERROR_SAVE_LINK_AS_TIMEOUT);
}
throw Cr.NS_ERROR_NO_INTERFACE;
}
}
// if we don't have the headers after a short time, the user
// won't have received any feedback from their click. that's bad. so
// we give up waiting for the filename.
function timerCallback() {}
timerCallback.prototype = {
notify: function sLA_timer_notify(aTimer) {
channel.cancel(NS_ERROR_SAVE_LINK_AS_TIMEOUT);
return;
}
}
// set up a channel to do the saving
var ioService = Cc["@mozilla.org/network/io-service;1"].
getService(Ci.nsIIOService);
var channel = ioService.newChannelFromURI(makeURI(linkURL));
if (channel instanceof Ci.nsIPrivateBrowsingChannel) {
let docIsPrivate = PrivateBrowsingUtils.isWindowPrivate(doc.defaultView);
channel.setPrivate(docIsPrivate);
}
channel.notificationCallbacks = new callbacks();
let flags = Ci.nsIChannel.LOAD_CALL_CONTENT_SNIFFERS;
if (bypassCache)
flags |= Ci.nsIRequest.LOAD_BYPASS_CACHE;
if (channel instanceof Ci.nsICachingChannel)
flags |= Ci.nsICachingChannel.LOAD_BYPASS_LOCAL_CACHE_IF_BUSY;
channel.loadFlags |= flags;
if (channel instanceof Ci.nsIHttpChannel) {
channel.referrer = doc.documentURIObject;
if (channel instanceof Ci.nsIHttpChannelInternal)
channel.forceAllowThirdPartyCookie = true;
}
// fallback to the old way if we don't see the headers quickly
var timeToWait =
gPrefService.getIntPref("browser.download.saveLinkAsFilenameTimeout");
var timer = Cc["@mozilla.org/timer;1"].createInstance(Ci.nsITimer);
timer.initWithCallback(new timerCallback(), timeToWait,
timer.TYPE_ONE_SHOT);
// kick off the channel with our proxy object as the listener
channel.asyncOpen(new saveAsListener(), null);
}

How to prevent additional page requests after response sent

I have configured a listener on kernel.request which sets a new response with a redirect when the session time has reached a certain value. The listener works fine and redirects to a certain page on the next request after the session has ended. But my problem is that the page has many links, and if I press the same link multiple times, the initial request with the redirect is cancelled/stopped and a new request is made for the last link pressed, so it gets past my redirect even though the session has ended and been destroyed. So, my question is: how can I prevent additional requests/link presses after the first request is made?
Here is my code:
public function onKernelRequestSession(GetResponseEvent $event)
{
    $request = $event->getRequest();
    $route = $request->get('_route');
    $session = $request->getSession();
    if ((false === strpos($route, '_wdt')) && ($route != null)) {
        $session->start();
        $time = time() - $session->getMetadataBag()->getCreated();
        if ($route != 'main_route_for_idle_page') {
            if (!$session->get("active") && $route == 'main_route_for_site_pages') {
                $session->invalidate();
                $session->set("active", "1");
            } else {
                if ($time >= $this->sessionTime) {
                    $session->clear();
                    $session->invalidate();
                    $event->setResponse(new RedirectResponse($this->router->generate('main_route_for_idle_page')));
                }
            }
        } else {
            if ($session->get("activ")) {
                $session->clear();
                $session->invalidate();
            }
        }
    }
}
Thank you.
Idea #1: Simple incremental counter
Each request sends a sequence number as a param, which is verified against the expected value on the server.
The server increments the number and sends it back via the response.
The new number is used in future requests.
Basically, if the server expects the SEQUENCE number to be 2 and the client sends 1, the request is rejected.
Idea #2: Unique hash each time
Similar to the idea above, but uses unique hashes to eliminate the predictable nature of an incremental sequence.
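A rough sketch of Idea #2 in plain PHP (the session key, status code, and markup are my own choices, not part of the original answer; in Symfony the same check would live in the kernel.request listener and use the Session object):
session_start();

$expected = isset($_SESSION['page_token']) ? $_SESSION['page_token'] : null;
$sent     = isset($_GET['token']) ? $_GET['token'] : null;

if ($expected !== null && $sent !== $expected) {
    // A token was issued but this request does not carry it:
    // treat it as a duplicate/out-of-order click and refuse to process it.
    http_response_code(409);
    exit('This request has already been handled.');
}

// Accept the request, then rotate the token so the same link cannot be used twice.
$_SESSION['page_token'] = bin2hex(random_bytes(16)); // random_bytes() needs PHP 7; use uniqid() on older versions

// Every link rendered from here on carries the fresh token.
echo '<a href="/some-page?token=' . $_SESSION['page_token'] . '">Next page</a>';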
I resolved the issue using jQuery: when a link is pressed, the other ones are disabled, so only one request is made from the page:
var isClicked = false;
$(".menu-link").click(function(e) {
    if(!isClicked) {
        isClicked = true;
    } else {
        e.preventDefault();
    }
});
Thanks.

FTP status code response doesn't work

Welcome!
I have a little problem with my own application. The app can connect (with a socket) to an FTP server, and that works fine. But my problem is that if the user enters a bad username or password, the program doesn't receive the response status code. What's wrong?
I would like to use this status code in some checks (username and/or password, etc.).
Code:
public static void ReadResponse()
{
    result = ParseHostResponse();
    statusCode = int.Parse(result.Substring(0, 3));
    statusMessage = "";
}
The ParseHostResponse() method contains the following:
Code:
public static string ParseHostResponse()
{
    SocketAsyncEventArgs socketEventArg = new SocketAsyncEventArgs();
    socketEventArg.RemoteEndPoint = socket.RemoteEndPoint;
    socketEventArg.SetBuffer(buffer, BUFFER_SIZE, 0);
    socketEventArg.Completed += new EventHandler<SocketAsyncEventArgs>(delegate(object s, SocketAsyncEventArgs e)
    {
        if (e.SocketError == SocketError.Success)
        {
            statusMessage = Encoding.UTF8.GetString(e.Buffer, e.Offset, e.BytesTransferred);
            statusMessage = statusMessage.Trim('\0');
        }
        else
        {
            statusMessage = e.SocketError.ToString();
        }
    });
    socket.ReceiveAsync(socketEventArg);
    string[] msg = statusMessage.Split('\n');
    if (statusMessage.Length > 2)
    {
        statusMessage = msg[msg.Length - 2];
    }
    else
    {
        statusMessage = msg[0];
    }
    if (!statusMessage.Substring(3, 1).Equals(" "))
    {
        return ParseHostResponse();
    }
    return statusMessage;
}
If I invoke the ReadResponse() method, Visual Studio reports this exception: NullReferenceException
in this code:
Code:
.
.
string[] msg = statusMessage.Split('\n');
.
What is wrong? This code follows http://msdn.microsoft.com/en-us/library/hh202858%28v=vs.92%29.aspx#BKMK_RECEIVING
Thank you for your help!
I can't help but start with these side remarks:
statusMessage.Trim('\0') does not work (try it)
statusMessage.Split('\n') is inefficient as it involves extra allocations (guess why)
Now to the actual subject: I have never used sockets on WP7, but from what I know about async operations, it looks to me like you start the async op (by calling ReceiveAsync) and use the result (statusMessage) before the answer has arrived.
Think a bit about the design of your ParseHostResponse() method:
Bad name: it suggests parsing of a response, while it actually performs communication.
Bad functionality: the method presents a sync pattern but internally uses an async pattern. I don't know what to suggest here, as every solution seems wrong. For example, waiting for the response would make the UI unresponsive.
My main recommendation is that you learn more about async programming and then reprogram your app accordingly.
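To illustrate the point (a sketch, not a drop-in fix; the class, method, and callback names are made up): do everything that needs the server's bytes inside the Completed handler, so it only runs after the data has actually arrived, and hand the result back through a callback instead of a shared field:
using System;
using System.Net.Sockets;
using System.Text;

static class FtpReplyReader
{
    // Receives one FTP reply and hands the parsed status code and text to the
    // caller via a callback, once the bytes have actually arrived.
    public static void BeginReadReply(Socket socket, Action<int, string> onReply)
    {
        var buffer = new byte[1024];
        var args = new SocketAsyncEventArgs();
        args.SetBuffer(buffer, 0, buffer.Length);   // offset 0, count = buffer length

        EventHandler<SocketAsyncEventArgs> handler = null;
        handler = (s, e) =>
        {
            if (e.SocketError != SocketError.Success || e.BytesTransferred == 0)
            {
                onReply(-1, e.SocketError.ToString());
                return;
            }
            var text = Encoding.UTF8.GetString(e.Buffer, e.Offset, e.BytesTransferred);
            // The last line of an FTP reply looks like "530 Login incorrect."
            var lines = text.TrimEnd('\r', '\n').Split('\n');
            var last = lines[lines.Length - 1].TrimEnd('\r');
            int code;
            int.TryParse(last.Substring(0, Math.Min(3, last.Length)), out code);
            onReply(code, last);
        };
        args.Completed += handler;

        // ReceiveAsync returns false when it completed synchronously; in that
        // case Completed is not raised, so invoke the handler ourselves.
        if (!socket.ReceiveAsync(args))
            handler(socket, args);
    }
}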
