I've been trying to use socat to respond to each connection on a socket it's listening on with a fake HTTP reply. I cannot get it working. It might be because I'm using the Cygwin version of socat? I don't know.
Part of the problem is that I don't want the second argument, <some_file_response>, to be written to. Because the address is bidirectional, socat will read what's in response.txt and then write back to that same file, and I don't want that. Even if I use open:response.txt,rdonly it doesn't work repeatedly. system: doesn't seem to do anything. exec almost seems to work: for example, I can do exec:'cat response.txt', but the output never gets sent to the client connecting to port 1234.
socat -vv tcp-listen:1234,reuseaddr,fork <some_file_response>
I want it to send a file to whatever client connects and then close the connection, and do that over and over again (that's why I used fork).
I am putting a bounty on this question. Please only give me solutions that work with the Cygwin version from the Windows command prompt.
Tested with Cygwin:
socat -vv TCP-LISTEN:1234,crlf,reuseaddr,fork SYSTEM:"echo HTTP/1.0 200; echo Content-Type\: text/plain; echo; cat <some_file_response>"
If you do not want a complete HTTP response, leave out the echoes:
socat -vv TCP-LISTEN:1234,crlf,reuseaddr,fork SYSTEM:"cat <some_file_response>"
Taken from the socat examples:
socat -vv TCP-LISTEN:8000,crlf,reuseaddr,fork SYSTEM:"echo HTTP/1.0 200; echo Content-Type\: text/plain; echo; cat"
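To check any of these listeners from a second terminal, a plain curl request should do (assuming curl is installed; adjust the port to match the listener you started):
curl -v http://localhost:1234/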
This one works:
socat -v -v -d -d TCP-LISTEN:8080,reuseaddr,fork exec:"cat http.response",pipes
Two things to be aware of:
1. Should you add crlf, as in the other answers? I recommend not: crlf caused problems sending images. Just use \r\n explicitly in the HTTP response headers (see the sketch below).
2. Without pipes, no data seems to be sent to the client; the browser complains:
127.0.0.1 didn't send any data.
ERR_EMPTY_RESPONSE
Tested in Cygwin.
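As a sketch of what such a response file might look like (the name http.response matches the command above; the body and Content-Length are placeholders; note the explicit \r\n line endings and the blank line that separates headers from body):
printf 'HTTP/1.0 200 OK\r\nContent-Type: text/plain\r\nContent-Length: 6\r\n\r\nhello\n' > http.response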
== EDIT ==
If you want to use it inside cmd.exe, make sure PATH is set correctly so that socat and cat can be found.
Say both socat.exe and cat.exe are located under E:\cygwin64\bin:
set PATH=%PATH%;E:\cygwin64\bin
This works in cmd.exe with socat and cat from Cygwin.
Related
I use the curl tool to fetch something over HTTP, and I intended to use the -i option to display the HTTP headers. But the output in the terminal has no HTTP headers, only the HTTP body from the server.
Following this question, you can try the --verbose option instead of -i.
As cfeduke's comment on the question mentioned, it depends on the server's response as well.
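For example (example.com is just a placeholder host), this prints the request and response headers on stderr alongside the body:
curl --verbose http://example.com/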
I have a bash script that downloads some files from an FTP server. The problem is that curl sometimes returns error 6 (couldn't resolve host) at random! I can open the FTP site in a web browser without any problem. I also noticed that most of the errors occur on the first downloads. Any idea?
I also wanted to know how I can make curl retry the download when these errors occur.
Code I used:
curl -m 60 --retry 10 --retry-delay 10 --ftp-method multicwd -C - ftp://some_address/some_file --output ./some_file
Note: I also tried the command without --ftp-method multicwd.
OS: CentOS 6.5 64bit
while [ "$ret" != "0" ]; do curl [your options]; ret=$?; sleep 5; done
Assuming those are transitional problems with the server and/or DNS, looping might be of some help. This is a particularly good case for the rarely used (?) until loop:
until curl [your options]; do sleep 5; done
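If you'd rather cap the retries than loop forever, a minimal sketch along those lines (MAX_TRIES is an arbitrary choice; the URL and options are the placeholders from the question):
#!/bin/bash
# Retry the download up to MAX_TRIES times, sleeping 5 seconds between failures.
MAX_TRIES=10
tries=0
until curl -m 60 --ftp-method multicwd -C - ftp://some_address/some_file --output ./some_file; do
    tries=$((tries + 1))
    if [ "$tries" -ge "$MAX_TRIES" ]; then
        echo "giving up after $tries attempts" >&2
        exit 1
    fi
    sleep 5
done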
In addition, if using curl is not mandatory, wget might be better suited for "unreliable" network connections. From the man page:
GNU Wget is a free utility for non-interactive download of files from the Web. It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies.
[...]
Wget has been designed for robustness over slow or unstable network connections; if a download fails due to a network problem, it will keep retrying until the whole file has been retrieved. If the server supports regetting, it will instruct the server to continue the download from where it left off.
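For instance, something along these lines (the URL is the placeholder from the question; --tries sets the retry count, --waitretry the delay between retries, and -c resumes partial downloads):
wget --tries=10 --waitretry=5 -c -O ./some_file ftp://some_address/some_file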
All,
I'm attempting to create a bash shell script that uses openssl to do an https query for me (/dev/tcp and wget are unavailable) along the lines of:
openssl s_client -connect xxx.xxx.xxx.xxx:port <<EOF
GET / HTTP/1.1
Connection: close
...more http here...
EOF
If I do the command line by hand, typing in the request, it works as expected and I see the correct HTML. However, if I run it from inside a shell script, I am not getting an HTTP document back from the server. Any thoughts?
I wonder whether -ign_eof helps. The original problem is described in http://www.mail-archive.com/openssl-users#openssl.org/msg02926.html (note this is very old) and this switch seems to fit.
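Applied to the script above, that would look something like this (untested sketch; the Host header is my addition, and the blank line before EOF terminates the HTTP headers):
openssl s_client -connect xxx.xxx.xxx.xxx:port -ign_eof <<EOF
GET / HTTP/1.1
Host: xxx.xxx.xxx.xxx
Connection: close

EOF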
I want to set up a simple SSH tunnel from a local machine to a machine on the internet.
I'm using
ssh -D 8080 -f -C -q -N -p 12122 <username>@<hostname>
The setup works fine (I think) because ssh returns asking for the credentials, which I provide.
Then I do
export http_proxy=http://localhost:8080
and
wget http://www.google.com
Wget returns that the request has been sent to the proxy, but no data is received back.
What I need is a way to look at how ssh is processing the request.
To get more information out of your SSH connection for debugging, leave out the -q and -f options, and include -vvv:
ssh -D 8080 -vvv -N -p 12122 <username>@<hostname>
To address your actual problem: by using ssh -D you're essentially setting up a SOCKS proxy, which I believe is not supported by default in wget.
You might have better luck with curl, which provides SOCKS support via its --socks5 (and --socks4) options.
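For example, pointed at the tunnel from the question:
curl --socks5 localhost:8080 http://www.google.com
(--socks5-hostname does the same but lets the remote end resolve DNS, which can help if local resolution is the problem.)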
If you really, really need to use wget, you'll have to recompile your own version to include SOCKS support. There should be an option for ./configure somewhere along the lines of --with-socks.
Alternatively, look into tsocks, which can intercept outgoing network connections and redirect them through a SOCKS server.
Is there a way to perform HTTP commands (GET/PUT/POST, whatever) via the command line in Ubuntu or Windows XP? Preferably without installing 3rd-party products. Since HTTP is text based, I thought it would be a lot easier to run from the command line.
I've been able to get what I want out of GET in Ubuntu in bash via:
$ wget google.com
$ cat index.html
This is kinda clunky. It would be nice to pipe the output or something, but even that isn't straightforward. C programs are fine too. I'm trying to do something like what we get with Fiddler, but more basic.
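(As an aside, wget can write the page straight to stdout, which makes piping possible, e.g.:)
wget -qO- google.com | head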
telnet google.com 80
GET / HTTP/1.0
Host: google.com
You have to hit return twice after the Host line. It doesn't get any more basic.
If you are familiar with HTTP, use telnet.
If you are looking for a browser, take a look at Links.
Although it requires a 3rd-party tool, these days I use curl. The -X option allows me to specify the HTTP verb. Windows has a few bash clients that let you run curl, including Cygwin.
Sample Execution
$ curl -H "Content-Type: application/json" -X POST -d '{"value": "600"}' http://localhost:8888/my/endpoint