Is there a way to perform HTTP commands (GET/PUT/POST, whatever) via a command line in Ubuntu or Windows XP? Preferably without installing 3rd party products. Since HTTP is text based, I thought it would be a lot easier to run from the cmd line.
I've been able to get what I want out of GET in ubuntu in bash via
$ wget google.com
$ cat index.html
This is kinda clunky. It would be nice to pipe the output or something, but even that isn't straightforward. C programs are fine too. I'm trying to do something like what we get with Fiddler, but more basic.
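For what it's worth, both tools can write the page straight to stdout, which makes the piping part easy:
$ wget -qO- google.com | head
$ curl -s google.com | head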
telnet google.com 80
GET / HTTP/1.0
Host: google.com
You have to hit return twice after the Host line. It doesn't get any more basic.
If you are familiar with HTTP, use telnet.
If you are looking for a browser, take a look at Links.
Although it requires a 3rd party tool, these days I use curl. The -X option lets me specify the HTTP verb. Windows has a few bash environments that let you run curl, including Cygwin.
Sample Execution
$ curl -H "Content-Type: application/json" -X POST -d '{"value": "600"}' http://localhost:8888/my/endpoint
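Since the goal above was something Fiddler-like, it is worth noting that curl can also show the response headers with -i, or dump the whole request/response exchange with -v, e.g. against the same endpoint:
$ curl -i http://localhost:8888/my/endpoint
$ curl -v http://localhost:8888/my/endpoint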
I have a script written in bash, tested and working on Linux (CentOS 7) and on MacOS. The script uses cURL to interact with a REST-API-compliant data platform (XNAT).
I was hoping that Windows users could use the same script within the git-bash that comes packaged with Git for Windows. Unfortunately, there seems to be an issue when using cURL in git-bash.
The first use I make of cURL is to retrieve a JSESSION cookie:
COOKIE=`curl -k -u $USERNAME https://theaddress/JSESSION`
On Linux, this asks the user for a password and stores the cookie in COOKIE.
In git-bash, issuing the command hangs until I interrupt it with Ctrl+C. Strangely, at that point the password prompt is displayed, but too late: the script has already terminated.
I have a suspicion that this may have to do with CR or LF issues, but cannot find any information on this that I can make sense of.
Any pointers would be welcome!
Thank you
EDIT:
It appears the above command works fine if I pass the password in the command like this:
COOKIE=`curl -k -u $USERNAME:$PASSWORD https://theaddress/JSESSION`
However, as pointed out here:
Using cURL with a username and password?
I would rather avoid having the user type their password as a command argument.
So the question is now: why is cURL not prompting for a password when I use the first command in git-bash on Windows, while that same command behaves as expected on Linux or MacOS?
COOKIE=`curl -k -u $USERNAME https://theaddress/JSESSION`
I ended up replying to my own question; hope this may be useful to someone else.
It appears this issue is a known problem when running cURL from within git-bash, according to this thread:
https://github.com/curl/curl/issues/573
In particular, see the answer by dscho on 30 Dec 2015:
The problem is the terminal emulator we use with Git Bash since Git for Windows 2.5, MinTTY.
This terminal emulator is not associated with a Win32 Console, therefore the user does not see anything when cURL wants to interact with the user via said Console.
This issue has a workaround, which is documented here:
https://github.com/git-for-windows/build-extra/blob/master/ReleaseNotes.md#known-issues
The workaround is to run curl via winpty as follows:
winpty curl [arguments]
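Applied to the first command from the question, that is:
winpty curl -k -u $USERNAME https://theaddress/JSESSION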
Not an issue with CR or LF after all.
Soooo, git-bash may not be the magic bullet (tm) for running my bash scripts on Windows with zero effort. Sigh...
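As an aside, if you also want to keep the password off the command line (the concern raised above), one option is to prompt in the shell itself, since read is a bash builtin and is not affected by the MinTTY console issue, and then feed the credentials to curl as a config file on stdin via -K -. A sketch:
read -s -p "Password: " PASSWORD; echo
COOKIE=$(curl -k -K - https://theaddress/JSESSION <<EOF
user = "$USERNAME:$PASSWORD"
EOF
)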
I am trying to write a script that gives me access to the advanced scan option of Nessus on localhost. I want to run an advanced scan through a shell script, without the GUI, so that all operations (login, advanced scan, export report) are performed from the shell script with no GUI access.
Why do you want to do it with a bash script?
You can do this much more easily with the Nessus API.
Have a look at the link below:
https://github.com/jfalken/nessus_enterprise_rest_client
The simplest way of automating Nessus is to use the Nessus API.
It is located at https://NessusServerIP:8834/ - if you visit it, you will be greeted by the API documentation.
There are various API implementations available - if you google 'Nessus API client' you'll get a glimpse.
If you, as you said, want to run bash scripts, then the simplest way is probably using cURL for the API requests.
A typical workflow will look like this:
authorize yourself to the Nessus API (either via token or API key)
launch or configure a scan (and wait until it finishes)
export a report (and wait until it finishes)
download the exported report
CURL #1 (authorize using token):
curl -X POST --data '{"username":"NessusUser","password":"YourPassword"}' \
  --header "Content-Type: application/json" \
  -k "https://NessusServerIp:8834/session" | python -m json.tool
...which will answer with the following JSON, containing a token that you need for the other API calls:
{"token": "e411e443521adee4496d79823a510cc68c5bf05aeda6e6eb"}
CURL #2 (launch a scan):
curl -X POST -H 'X-Cookie: token=e411e443521adee4496d79823a510cc68c5bf05aeda6e6eb' \
  -H 'Content-Type: application/json' \
  --data '{"scan_id":"21", "alt_targets":["127.0.0.1"]}' \
  -k "https://NessusServerIp:8834/scans/21/launch" | python -m json.tool
...which will be answered with a JSON like this, containing the ID of the just-started scan:
{"scan_uuid":"c1c30d8f-5f79-2e4b-2d03-05b8b3c595f1e768e03195abdfa2"}
CURL #3 (exporting a scan):
curl -X POST -H 'X-Cookie: token=766ef7a2302780c189ba563b89c5eb3706140c0ef1e4de8b' \
  -H 'Content-Type: application/json' \
  --data '{"scan_id":"33", "format":"html"}' \
  -k "https://NessusServerIP:8834/scans/33/export" | python -m json.tool
...which will yield this JSON response, containing a token for the exported file and the file_id:
{"token":"3e13ab381c480caa1e377411c0b561970c46e5d78894c5a0cb2be0e7f00fefe0","file":1434780027}
...so now we are ready to download the report. In this case, since I specified "format":"html" in the last call, it is an .html file that you will need to save the output to.
Curl #4 (download exported report):
curl -X GET -H 'X-Cookie: token=7d155aef4359d02addea29d8d56bca4a5045ca61efeb38ee' \
  -H 'Content-Type: application/json' \
  --data '{"scan_id":"21", "alt_targets":["127.0.0.1"]}' \
  -k "https://NessusServerIP:8834/scans/17/export/945237343/download" > report.html
...which should leave you with a report.html in the folder you started your script.
Now... how do you automate this? Well, write a bash script, put in these calls, parse the answers to extract the information you need - and then enjoy! :) A minimal sketch follows.
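Something like this, assuming the same server, credentials and scan id as above; the sed extractions and the /export/.../status polling endpoint are illustrative, so check them against your Nessus version:
#!/bin/bash
NESSUS="https://NessusServerIP:8834"

# 1. authorize and pull the session token out of the JSON answer
TOKEN=$(curl -s -k -X POST -H 'Content-Type: application/json' \
  --data '{"username":"NessusUser","password":"YourPassword"}' \
  "$NESSUS/session" | sed -n 's/.*"token" *: *"\([^"]*\)".*/\1/p')

# 2. launch scan 21 (a real script would also poll /scans/21 until the
#    scan status is "completed" before exporting)
curl -s -k -X POST -H "X-Cookie: token=$TOKEN" -H 'Content-Type: application/json' \
  --data '{"alt_targets":["127.0.0.1"]}' \
  "$NESSUS/scans/21/launch" > /dev/null

# 3. request an HTML export and keep the returned file id
FILE=$(curl -s -k -X POST -H "X-Cookie: token=$TOKEN" -H 'Content-Type: application/json' \
  --data '{"format":"html"}' \
  "$NESSUS/scans/21/export" | sed -n 's/.*"file" *: *\([0-9]*\).*/\1/p')

# 4. wait until the export is ready, then download it
until curl -s -k -H "X-Cookie: token=$TOKEN" \
    "$NESSUS/scans/21/export/$FILE/status" | grep -q '"ready"'; do
  sleep 5
done
curl -s -k -H "X-Cookie: token=$TOKEN" \
  "$NESSUS/scans/21/export/$FILE/download" > report.html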
PS: I use python -m json.tool to beautify the otherwise not very readable output of cURL.
Hope I have helped,
Gewure
I've been trying to use socat to respond to each connection on a socket it is listening on with a fake HTTP reply. I cannot get it working. It might be because I'm using the Cygwin version of socat? I don't know.
Part of the problem is that I do not want the second argument <some_file_response> to be written to. In other words, because the address is bidirectional, socat will read what is in response.txt and then write back to that same file, and I don't want that. Even if I use open:response.txt,rdonly it doesn't work repeatedly. system: doesn't seem to do anything. exec seems like it works; for example I can do exec:'cat response.txt', but that never gets sent to the client connecting on port 1234.
socat -vv tcp-listen:1234,reuseaddr,fork <some_file_response>
I want it to serve a file to each client that connects and then close the connection, and do that over and over again (that's why I used fork).
I am putting a bounty on this question. Please only give me solutions that work with the cygwin version from the windows command prompt.
Tested with cygwin:
socat -vv TCP-LISTEN:1234,crlf,reuseaddr,fork SYSTEM:"echo HTTP/1.0 200; echo Content-Type\: text/plain; echo; cat <some_file_response>"
If you do not want a complete HTTP response, leave out the echos:
socat -vv TCP-LISTEN:1234,crlf,reuseaddr,fork SYSTEM:"cat <some_file_response>"
Taken from socat examples
socat -vv TCP-LISTEN:8000,crlf,reuseaddr,fork SYSTEM:"echo HTTP/1.0 200; echo Content-Type\: text/plain; echo; cat"
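To check any of these from another shell, something like this should print the file back (assuming curl is available; use the port of the variant you started):
$ curl -v http://localhost:1234/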
This one works:
socat -v -v -d -d TCP-LISTEN:8080,reuseaddr,fork exec:"cat http.response",pipes
Two things need to be aware,
should you add crlf, as in other answers. I recommend not.
crlf caused problem sending image
just use \r\n explicitly in http response headers.
without pipes, seems no data sent to client. browser complains:
127.0.0.1 didn’t send any data.
ERR_EMPTY_RESPONSE
tested in cygwin.
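For reference, a minimal http.response can be created with printf so that the \r\n line endings are literal bytes in the file (the "hello" body and its Content-Length are just an example):
printf 'HTTP/1.0 200 OK\r\nContent-Type: text/plain\r\nContent-Length: 6\r\n\r\nhello\n' > http.response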
== EDIT ==
If you want to use it inside cmd.exe, make sure PATH is set correctly so that socat and cat can be found.
Say both socat.exe and cat.exe are located under E:\cygwin64\bin:
set PATH=%PATH%;E:\cygwin64\bin
This works in cmd.exe, with socat & cat from cygwin.
I am trying to use cURL in the Command Prompt, but I don't understand where the problem is. I have been told that I need to configure a proxy for the Command Prompt so that it can access the sites I am calling.
This is what I want to run: curl -g "api.fda.gov/drug/event.json?search=receivedate:[20040101+TO+20150101]&limit=1"
I have cURL installed, but always face errors because it is not connecting. Is there a simple way to set up a proxy for/through the Command Prompt in Windows 7?
I also do not have admin rights, so I cannot change the system settings.
You can set your proxy using a set command in windows:
set http_proxy=http://<yourproxyaddress>:<port>
Then your curl requests can reach external sites.
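With that set (the proxy address below is a placeholder), the request from the question should go through; note the -g (globoff) flag so curl passes the [brackets] through literally instead of treating them as a glob range:
set http_proxy=http://proxy.example.com:8080
curl -g "api.fda.gov/drug/event.json?search=receivedate:[20040101+TO+20150101]&limit=1"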
Some proxies require specific authentication headers to be set, so be aware of those as well. In my case, it's --proxy-ntlm in the example below:
curl -x webproxy.net:8080 -U username:password http://google.com --proxy-ntlm
But there are other options:
--proxy-digest and --proxy-negotiate
Lastly, cURL has a super friendly doc page, so be sure to check it out.
I have a php page served at a local URI by a local nginx server. It is called like this:
http://example.dev/index.php?v=var
Is it possible to call this php page from inside a bash script, so that it runs just as it does when I type the URI into Firefox?
I tried to access the script directly from the CLI:
php /home/public_html/example.dev/index.php
but it didn't work (it looks like PHP running under FastCGI and PHP-CLI behave somewhat differently).
Any ideas?
Try GNU Wget
wget http://example.dev/index.php?v=var
or cURL
curl http://example.dev/index.php?v=var
to run it like a browser would.
Note: But this is not CLI in any way.
php -f <path-to-file>
php can output whatever you tell it to. It doesn't have to be HTML.
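One caveat: when run via the CLI, $_GET is not populated, so the ?v=var part is lost. A hypothetical workaround is to parse the query string into $_GET yourself before including the page:
php -r 'parse_str($argv[1], $_GET); include "/home/public_html/example.dev/index.php";' 'v=var'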
You can use a bash script to call a URI by fetching it with a program like curl:
curl -s 'http://example.dev/index.php?v=var' > /dev/null
…or you can be a little more hands on and use nc:
echo 'GET /index.php?v=var' | nc example.dev 80
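If the server does name-based virtual hosting, a full HTTP/1.0 request with a Host header is safer; printf supplies the CRLFs the protocol expects:
printf 'GET /index.php?v=var HTTP/1.0\r\nHost: example.dev\r\n\r\n' | nc example.dev 80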