WaitUntil not waiting / Get HTML on WaitForSelectorAsync - puppeteer-sharp

I'm having two problems that I would appreciate some advice on. I have used Puppeteer in Node in the past, but for some reason I'm running into a problem with the Sharp version.
Basically, I'm crawling a webpage with WaitUntil set to WaitUntilNavigation.Networkidle0, the longest wait period. In my Node code this runs and loads my website correctly, but in the C# version I get the page without Angular loaded. As best I can tell, it is not waiting and is returning the initial Load state. Below is my code.
if (BROWSER == null)
{
    await new BrowserFetcher().DownloadAsync(BrowserFetcher.DefaultRevision);
    BROWSER = await Puppeteer.LaunchAsync(new LaunchOptions
    {
        Headless = true,
        Args = new string[] { "--no-sandbox", "--disable-accelerated-2d-canvas", "--disable-gpu", "--proxy-server='direct://'", "--proxy-bypass-list=*" }
    });
}
if (page == null)
{
    page = await BROWSER.NewPageAsync();
    await page.SetUserAgentAsync("PScraper-SiteCrawler");
    await page.SetViewportAsync(new ViewPortOptions() { Width = 1024, Height = 842 });
    var response = await page.GoToAsync(url, new NavigationOptions() { Referer = "PScraper-SiteCrawler", Timeout = timeoutMilliseconds, WaitUntil = new[] { WaitUntilNavigation.Networkidle0 } });
}
Timeout is set to 30 seconds, or 30,000 milliseconds. I then get the HTML of the page with
await response.TextAsync()
My second question is unrelated, but likely simpler to solve. One route I was considering was using the page.WaitForSelectorAsync() method. This appears to wait until the content I'm looking for is loaded, but I haven't been able to figure out how to get the entire HTML of the page from the ElementHandle it returns once it completes.
I would appreciate some help here; I've tried a couple of routes and haven't been able to figure out what's causing the difference between the Node and C# code.

Solved my problem. The issue was how I was getting the HTML of the page.
I was using...
await response.TextAsync()
Apparently, this gets only the initial response. When I changed the way I get the HTML to the following line of code, everything worked as expected.
await page.GetContentAsync()
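As for the second question: WaitForSelectorAsync() returns an ElementHandle for the matched element only, so after awaiting it you can still read the whole document from the page itself with GetContentAsync(). A minimal sketch, assuming a hypothetical app-root selector for the Angular content:
// Sketch only: "app-root" is a placeholder for whatever element signals that the page has rendered.
await page.WaitForSelectorAsync("app-root", new WaitForSelectorOptions { Timeout = timeoutMilliseconds });
var html = await page.GetContentAsync();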

Related

Using ZXingScannerPage with XF, my content page has weird behavior

I am making an app in Xamarin.Forms that has a login similar to WhatsApp Web: an on-screen QR code that is scanned by the phone. In the emulator with Visual Studio 2017 I have no problems, but when I export the app to an APK and install it on a mobile device, the app reads the QR code and returns to the previous login screen without any reaction, when it should go to the next screen, where I have a dashboard.
What could it be? I enclose the code I used.
btnScanQRCode.IsEnabled = false;
var scan = new ZXingScannerPage();
scan.OnScanResult += (result) =>
{
    scan.IsScanning = false;
    Device.BeginInvokeOnMainThread(async () =>
    {
        await Application.Current.MainPage.Navigation.PopAsync();
        var resultado = JsonConvert.DeserializeObject<QrCode>(result.Text);
        JObject qrObject = JObject.Parse(JsonConvert.SerializeObject(resultado));
        JsonSchema schema = JsonSchema.Parse(SettingHelper.SchemaJson);
        bool valid = qrObject.IsValid(schema);
        if (valid == true)
        {
            App.Database.InsertQrCode(resultado);
            QrCode qr = App.Database.GetQrCode();
            await _viewModel.Login();
            await Navigation.PushAsync(new Organization());
        }
        else
        {
            await DisplayAlert("False", JsonConvert.SerializeObject(resultado), "ok");
        }
    });
};
await Application.Current.MainPage.Navigation.PushAsync(scan);
btnScanQRCode.IsEnabled = true;
This was originally a comment, but while writing it I realized it is the answer.
You need to debug your code. Attach a device, deploy the app in the Debug configuration, and step through your code to see where it fails.
It sounds like it's crashing silently, probably on the line where you deserialize result.Text into a QrCode. result.Text is just a string and will never deserialize into an object. You probably need a constructor that takes a string, like QrCode(result.Text).
First scan, then use the result to do other things in your app.
var scanner = new ZXing.Mobile.MobileBarcodeScanner();
var result = await scanner.Scan();
Check for proper camera permissions. I bet your problem is there.
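If it does turn out to be a permission problem, a minimal pre-scan check, assuming the project references Xamarin.Essentials (an assumption on my part), could look something like this before pushing the scanner page:
// Sketch: make sure the camera permission is granted before showing the scanner.
// Requires "using Xamarin.Essentials;".
var status = await Permissions.CheckStatusAsync<Permissions.Camera>();
if (status != PermissionStatus.Granted)
{
    status = await Permissions.RequestAsync<Permissions.Camera>();
}
if (status != PermissionStatus.Granted)
{
    await DisplayAlert("Camera", "Camera permission is required to scan the QR code.", "OK");
    return;
}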

Telegram Bot Random Images (How to send random images with Telegram-Bot)

const TeleBot = require('telebot');
const bot = new TeleBot({
    token: 'i9NhrhCQGq7rxaA' // Telegram Bot API token.
});
bot.on(/^([Hh]ey|[Hh]oi|[Hh]a*i)$/, function (msg) {
    return bot.sendMessage(msg.from.id, "Hello Commander");
});
var Historiepics = ['Schoolfotos/grr.jpg', 'Schoolfotos/boe.jpg', 'Schoolfotos/tobinsexy.jpg'];
console.log('Historiepics');
console.log(Math.floor(Math.random() * Historiepics.length));
var foto = Historiepics[Math.floor(Math.random() * Historiepics.length)];
bot.on(/aap/, (msg) => {
    return bot.sendPhoto(msg.from.id, foto);
});
bot.start();
The result I'm getting from this is just one picture every time; if I ask for another random picture, it keeps showing me the same one without change.
I recently figured this out, so I'll drop an answer for anyone who runs into this issue.
The problem is with Telegram's cache. They cache images server-side so that they don't have to make multiple requests to the same URL. This protects them from potentially getting blacklisted for too many requests, and makes things snappier.
Unfortunately, if you're using an API like The Cat API, this means you will be sending the same image over and over again. The simplest solution is just to make the link a little different every time, which is most easily accomplished by including the current epoch time as part of the URL.
For your example in JavaScript, this can be accomplished with the following modification:
bot.on(/aap/, (msg) => {
    let epoch = (new Date).getTime();
    return bot.sendPhoto(msg.from.id, foto + "?time=" + epoch);
});
Or something similar. The main point is that as long as the URL is different, you won't receive a cached result. The other option is to download the file and then send it locally, which is what Telebot does if you pass the serverDownload option into sendPhoto.

AJAX Newbie, POSTing data on server

I started learning AJAX recently and am trying a very simple project which involves capturing some form data and sending it to two servers.
The first server is the one that hosts the website and the server-side PHP handling. This works fine.
The second server is a basic Python HTTP server that handles only the POST requests sent from AJAX. This functionality works, but is a bit weird.
Let me explain.
Here is my AJAX code, which is absolutely straightforward.
function xml_http_post(url, data) {
    var req = false;
    try {
        // Firefox, Opera 8.0+, Safari
        req = new XMLHttpRequest();
    }
    catch (e) {
        // Internet Explorer
        try {
            req = new ActiveXObject("Msxml2.XMLHTTP");
        }
        catch (e) {
            try {
                req = new ActiveXObject("Microsoft.XMLHTTP");
            }
            catch (e) {
                alert("Your browser does not support AJAX!");
                return false;
            }
        }
    }
    req.onreadystatechange = function() {
        if (req.readyState == 4) {
            // callback(req);
        }
    };
    req.open("POST", url, true);
    req.setRequestHeader("Content-type", "text/plain");
    req.send(data);
}
Since I do not intend to send back any response, my onreadystatechange callback is empty.
But when I execute this code (triggered by onclick on a button), the POST doesn't work and the server doesn't seem to receive anything.
The most surprising thing is that if I set a breakpoint at req.open() and then step through manually, it always works. Which means, I guess, that there is some timing issue that needs to be resolved.
It works fine without breakpoints if the third parameter ("async") is set to false, but that is undesirable anyway, so I want to make it work with async = true.
Any help would be greatly appreciated.
Thanks
Shyam
As I figured out, the form page was getting unloaded by a PHP script that was invoked as the form's action by the first server. This resulted in the JavaScript code being only partially executed, or not executed at all.
So I figured out that synchronous XHR is the only way for me.

can't seem to get progress events from node-formidable to send to the correct client over socket.io

So I'm building a multipart form uploader over AJAX on node.js, and sending progress events back to the client over socket.io to show the status of their upload. Everything works just fine until I have multiple clients trying to upload at the same time. Originally, while one upload was in progress and a second one started, the second client would begin receiving progress events from both of the forms being parsed. The original form was not affected and only received progress updates for itself. I tried creating a new formidable form object and storing it in an array along with the socket's session id to fix this, but now the first form stops receiving events while the second form gets processed. Here is my server code:
var http = require('http'),
    formidable = require('formidable'),
    fs = require('fs'),
    io = require('socket.io'),
    mime = require('mime'),
    forms = {};
var server = http.createServer(function (req, res) {
    if (req.url.split("?")[0] == "/upload") {
        console.log("hit upload");
        if (req.method.toLowerCase() === 'post') {
            socket_id = req.url.split("sid=")[1];
            forms[socket_id] = new formidable.IncomingForm();
            form = forms[socket_id];
            form.addListener('progress', function (bytesReceived, bytesExpected) {
                progress = (bytesReceived / bytesExpected * 100).toFixed(0);
                socket.sockets.socket(socket_id).send(progress);
            });
            form.parse(req, function (err, fields, files) {
                file_name = escape(files.upload.name);
                fs.writeFile(file_name, files.upload, 'utf8', function (err) {
                    if (err) throw err;
                    console.log(file_name);
                });
            });
        }
    }
});
var socket = io.listen(server);
server.listen(8000);
If anyone could be of any help on this, I would greatly appreciate it. I've been banging my head against my desk for a few days trying to figure this one out, and would really just like to get it solved so that I can move on. Thank you so much in advance!
Can you try putting console.log(socket_id);
after form = forms[socket_id]; and
after progress = (bytesReceived / bytesExpected * 100).toFixed(0);, please?
I get the feeling that you might have to wrap that socket_id in a closure, like this:
form.addListener(
    'progress',
    (function (socket_id) {
        return function (bytesReceived, bytesExpected) {
            progress = (bytesReceived / bytesExpected * 100).toFixed(0);
            socket.sockets.socket(socket_id).send(progress);
        };
    })(socket_id)
);
The problem is that you aren't declaring socket_id and form with var, so they're actually global.socket_id and global.form rather than local variables of your request handler. Consequently, separate requests step over each other since the callbacks are referring to the globals rather than being proper closures.
rdrey's solution works because it bypasses that problem (though only for socket_id; if you were to change the code in such a way that one of the callbacks referenced form you'd get in trouble). Normally you only need to use his technique if the variable in question is something that changes in the course of executing the outer function (e.g. if you're creating closures within a loop).

XDomainRequest object caching/asynchronous call issue

I have an ASPX page on which I am using the XDomainRequest object to populate two divs with HTML returned from an AJAX response.
I have used jQuery to get the divs and call each() on the retrieved list:
var divs = $("div");
divs.each(function (index) {
    if (window.XDomainRequest) {
        xdr = new XDomainRequest();
        if (xdr) {
            xdr.onload = function () {
                alert("XDR Response - " + xdr.responseText);
                var currentDivID = divs[index].attributes["id"].value;
                var selectedDiv = $("div[id='" + currentDivID + "']");
                if (xdr.responseText == '') selectedDiv.attr("style", "display:none;");
                else selectedDiv.append(xdr.responseText);
            };
            xdr.open("GET", xdrUrl);
            try {
                xdr.send();
            } catch (e) {
                alert(e);
            }
        } else {
            alert('Create XDR failed.');
        }
    } else {
        alert('XDR not found on window object.');
    }
});
Now, what's happening is: I have two divs on the page with different IDs, and when this code runs in $.ready(function(){}), both requests are sent asynchronously to the server and processed.
The result is:
1. Sometimes the onload handler gets the response for the second div in both divs' results.
2. IE sends only one request to the server (I am using Fiddler to see which requests are sent).
Can anybody tell me what's wrong with the code? As far as I know, XDR does not support synchronous calls, and asynchronous calls are giving me wrong results. Any workaround or tip for this problem?
I solved the issue myself when I spotted a mistake in my code.
xdr = new XDomainRequest();
should be
var xdr = new XDomainRequest();
For point 2, I added a "Cache-Control: no-cache" header to my response and that solved the matter.
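For reference, if the endpoint behind xdrUrl is also an ASP.NET page or handler (an assumption here, since the question only mentions the ASPX page making the requests), the header can be added like this:
// Sketch: tell IE not to serve a cached response to subsequent XDomainRequest calls.
protected void Page_Load(object sender, EventArgs e)
{
    Response.AppendHeader("Cache-Control", "no-cache");
    // ... write out the HTML fragment for the requesting div ...
}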
