Wget download after POST, make it wait? - bash

I am working on a bash script in which I use Wget to supply POST data. Wget is supposed to make a POST request to a specific page, and that page is supposed to return a file for download. The problem is that, after the request is made, the page returns the file only after a few seconds, not immediately, so Wget downloads just the HTML page and doesn't wait for the file to be returned. Is there any option to make this work - make the POST request and wait a few seconds for the file to be returned from the remote server?

If your only problem is that you need more time, you can use the sleep command.
You can get more information about it here: http://www.linuxtopia.org/online_books/advanced_bash_scripting_guide/timedate.html
Hope that helped!
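A minimal sketch of that approach - the URLs, form fields, and file name below are placeholders, not anything from your server:

# Send the POST request; the immediate response is usually just an HTML page.
wget --post-data='user=me&report=monthly' -O response.html 'http://example.com/generate'
# Give the server a few seconds to produce the file.
sleep 10
# Fetch the generated file once it should be ready.
wget -O report.pdf 'http://example.com/files/report.pdf'

If the file's final URL isn't known in advance, you would have to parse it out of response.html before the second wget call.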

Related

Creating a script to automate submitting something on a webpage

I want to create a script that accesses a website behind a login (with 2FA) and presses the submit button every x seconds.
Unfortunately, I am a total shell noob, but I have already automated the process with the Chrome extension "Kantu Browser Automation"; however, the extension has limits on the looping and a looping timeout.
Use the curl command for this and put it in crontab.
curl:
https://curl.haxx.se/
You have to use the POST method.
crontab:
https://crontab.guru/
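A hedged sketch of what that could look like - the endpoint, form fields, and cookie value are all assumptions you would replace after inspecting the real form in the browser's dev tools; a site behind 2FA will also need a valid session cookie exported from the browser:

# submit.sh - replay the form submission the way the browser sends it
curl -X POST 'https://example.com/submit' \
     -b 'session=PASTE_YOUR_SESSION_COOKIE_HERE' \
     --data 'field1=value1&field2=value2'

# crontab entry (crontab -e): run the script every 5 minutes
*/5 * * * * /home/user/submit.sh

Note that cron's finest granularity is one minute; for sub-minute intervals you would wrap the curl call in a while loop with sleep instead.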

Pull info from website

I'm looking to pull the timer from this site: http://invasiontimer.com/
But it looks like the timer isn't in the HTML, so the normal curl or wget isn't getting it for me.
Is there any way to get this in a bash script and print it to a text file?
Thanks.
I think what you want is content that is loaded by JavaScript. Check out this answer for more details: How to get webcontent that is loaded by JavaScript using cURL?
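If the page builds the timer from data it fetches in the background, one hedged approach is to find that request in the browser's Network tab and replay it with curl. The endpoint path and field name below are purely hypothetical placeholders:

# Replay the data request the page makes (find the real one in dev tools):
curl -s 'http://invasiontimer.com/some/data-endpoint' -o timer.json
# Extract the value you need and write it to a text file (requires jq):
jq -r '.next_invasion' timer.json > timer.txt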

ajax request and robots.txt

A website has a URL http://example.com/wp-admin/admin-ajax.php?action=FUNCTIOn_NAME. When I click the URL, it executes the ajax function.
When I put the URL in the address bar, it gives a redirect error because the URL doesn't actually take you anywhere, but it definitely still executes the ajax function.
When I use the command-line bash call firefox -new-window http://example.com/wp-admin/admin-ajax.php?action=FUNCTIOn_NAME, it opens an empty page except for the line "Bad user...". After some digging I found that the robots.txt file has "Disallow: /wp-admin/". I am assuming this is why it isn't working from the command line. I have used wget -e robots=off URL before, but there isn't anything to download, so that doesn't apply here.
What type of URL is this? (I believe it's dynamic or formula, but not sure)
I want to get the same results with the command line as when I plug the URL into the address bar. Ideas?
It's nothing special; the server just returns that HTML no matter what. HTTP servers don't have to map URLs to files on disk. The code behind that endpoint could be written in C++, Java, Python or Node.js (probably not).
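A hedged sketch of calling that endpoint from a script - the action name is taken from the question, everything else is an assumption. Quoting the URL keeps the shell from mangling the query string, and curl never reads robots.txt at all, so no equivalent of -e robots=off is needed:

# GET, exactly like pasting the URL into the address bar:
curl 'http://example.com/wp-admin/admin-ajax.php?action=FUNCTIOn_NAME'
# Many WordPress ajax handlers only respond properly to POST:
curl -X POST 'http://example.com/wp-admin/admin-ajax.php' --data 'action=FUNCTIOn_NAME'

If the handler still answers with "Bad user...", the check is being done by the handler itself (login session, nonce, etc.), not by robots.txt.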

how curl retrieves a url with # and ! symbols in it?

I was considering using curl to retrieve a page from a URL (http://bbs.byr.cn/#!board/JobInfo?p=3) but ended up getting a notice from bash like
$ curl bbs.byr.cn/#!article/JobInfo/102321
bash: !article/JobInfo/102321: event not found
This URL is accessible in my browser window; how can I write a curl command line that works on it?
In general this is not possible: the part after the hash (#) is handled by JavaScript on the client side, and curl cannot execute JavaScript. You can put the URL in quotes to fetch the static part of the page, but that is surely not what you want.
If you observe the traffic of that page in Firebug, you will see that the URL http://bbs.byr.cn/board/JobInfo?p=3 is downloaded. That is the file you can fetch to get your results.
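To make that concrete, two things matter here: quoting the URL so bash does not try history expansion on the ! (that is what "event not found" means), and requesting the real resource that the fragment maps to, since everything after # never reaches the server. A short sketch:

# Single quotes stop bash from expanding "!article" as a history event,
# but the fragment is still dropped before the request is sent:
curl 'http://bbs.byr.cn/#!article/JobInfo/102321'
# So fetch the underlying page that the JavaScript would load instead:
curl 'http://bbs.byr.cn/board/JobInfo?p=3'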

Ajax Post Request blocks website loading

I have a strange problem with an ajax POST request. I use the request to run an ImageMagick process directly on the command line via the PHP function exec(). The process takes about a minute and then responds with some variables. This is working fine, except for one problem: during the execution time I cannot access other parts of the website installed on the same webserver (as if the server is unreachable). When the process finishes, everything works fine again.
I first thought this was due to an overloaded server. However, when you access the website via another browser, there are no problems, even during the execution time of the process in the other browser. So it looks like the problem has something to do with the browser blocking other requests during the POST request.
Could anyone help me out here? What could be the root problem?
Found the solution! Thanks for the help from kukipei. By adding session_write_close(); to the file that handles the ajax request (after it has read the userid and token), the session file is no longer locked, and all pages are accessible again. The problem was that the session was locked during the whole execution time of the process, which was not necessary, since I only needed the session to read the userid and token. So before calling the ImageMagick operation, I now add session_write_close().
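You can reproduce the lock from the command line with a hedged sketch like the one below (the cookie name PHPSESSID and the URLs are assumptions): two requests sharing the same session cookie are serialized by PHP's file-based session lock, so the second one blocks until the first calls session_write_close() or finishes.

# First request holds the session lock for the whole ImageMagick run:
curl -b 'PHPSESSID=your_session_id' 'http://example.com/run-imagemagick.php' &
# Second request with the same session cookie stalls until the lock is released:
curl -b 'PHPSESSID=your_session_id' 'http://example.com/some-other-page.php'
wait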
