Howto: the minimal server to serve zero length answers - performance

I face a funny problem: I have a FreeBSD 8.2 server, and I need to set up a web server that will answer any request with a zero (0) length response. Just '200 OK' and an empty body.
OK, I can set up nginx (and already did that) and point the 404 error document at /dev/null, but I think maybe there is a more optimal and elegant solution? I know there is an nginx module that outputs a 1x1 GIF; might there be anything like that for a zero-length file?

It's possible to return a status code from Nginx:
location /empty {
    return 200;
}
NOTE Generally, HTTP status code 204 No Content is meant to say "I've completed the request, but there is no body to return". You may return it in the same fashion:
location /empty {
    return 204;
}
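If you want to verify the empty response, a quick check from the command line might look like this (a sketch, assuming the location above is configured on localhost):
# -i prints the status line and headers; with return 204 there is no body at all
curl -i http://localhost/empty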

You could use netcat like in the example here http://howtoforge.com/useful-uses-of-netcat
E.g.
while true; do printf 'HTTP/1.1 200 OK\r\nContent-Length: 0\r\nConnection: close\r\n\r\n' | nc -l 80; done

Related

Bash script to test status of site

I have a script for testing the status of a site, to then run a command if it is offline. However, I've since realised that because the site is proxied through Cloudflare, it always shows a 200 status, even if the site is offline. So I need to come up with another approach. I tried testing the site using curl and HEAD. Both get the wrong response (from Cloudflare).
What I have found is that the HTTPie command gets the response I need, although only when I use the -h option (I have no idea why that makes a difference, since visually the output looks identical to when I don't use -h).
Assuming this is an okay way to go about reaching my aim ... I'd like to know how I can test if a certain string appears more than 0 times.
The string is location: https:/// (with three forward slashes).
The command I use to get the header info from the actual site (and not simply from what Cloudflare is dishing up) is, http -h https://example.com/.
I am able to test for the string using, http -h https://example.com | grep -c 'location: https:///'. This will output 1 when the string exists.
What I now want to do is run a command if the output is 1. But this is where I need help. My bash skills are minimal, and I am going about it the wrong way. What I came up with (which doesn't work) is:
#!/bin/bash
STR=$(http -h https://example.com/)
if (( $(grep -c 'location: https:///' $STR) != 1 )); then
    echo "Site is UP"
    exit
else
    echo "Site is DOWN"
    sudo wo clean --all && sudo wo stack reload --all
fi
Please explain to me why it's not working, and how to do this correctly.
Thank you.
ADDITIONS:
What the script is testing for is an odd situation in which the site suddenly starts redirecting to, literally, https:///. This obviously causes the site to be down. Safari, for instance, takes this as a redirection to localhost. Chrome simply spits the dummy with a redirect error, ERR_INVALID_REDIRECT.
When this is occurring, the headers from the site are:
HTTP/2 301
server: nginx
date: Thu, 12 May 2022 10:19:58 GMT
content-type: text/html; charset=UTF-8
content-length: 0
location: https:///
x-redirect-by: WordPress
x-powered-by: WordOps
x-frame-options: SAMEORIGIN
x-xss-protection: 1; mode=block
x-content-type-options: nosniff
referrer-policy: no-referrer, strict-origin-when-cross-origin
x-download-options: noopen
x-srcache-fetch-status: HIT
x-srcache-store-status: BYPASS
I chose to test for the string location: https:/// since that's the most specific (and unique) indicator of this issue. I could also test for HTTP/2 301.
The intention of the script is to remedy the problem when it occurs, as a temporary solution whilst I figure out what's causing WordPress to generate such an odd redirect, and also in case it happens whilst I am not at work, or sleeping. :-) I will have a cron job running the script every 5 minutes, so at least the site is never down for longer than that.
grep reads a file, not a string; in your script, the contents of $STR get passed to grep as a list of file names. Also, you need to quote strings, especially if they might contain whitespace or shell metacharacters.
More tangentially, grep -q is the usual way to check whether a string occurs at least once. Perhaps see also Why is testing “$?” to see if a command succeeded or not, an anti-pattern?
I can see no reason to save the output in a variable which you only examine once; though if you want to (for debugging reasons, etc.), probably avoid upper-case variable names. See also Correct Bash and shell script variable capitalization
Note that a grep match means the broken redirect is present, i.e. the site is down, so that is the branch where the remedy belongs; with a plain if/else you don't need the early exit either.
Nothing here is Bash-specific, so I changed the shebang to use sh instead, which is more portable and sometimes faster. Perhaps see also Difference between sh and bash
#!/bin/sh
if http -h https://example.com/ | grep -q 'location: https:///'
then
    echo "Site is DOWN"
    sudo wo clean --all && sudo wo stack reload --all
else
    echo "Site is UP"
fi
For basic diagnostics, probably try http://shellcheck.net/ before asking for human assistance.
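Since the plan is to run this from cron every 5 minutes, the crontab entry might look something like the following sketch (the script path and log file are placeholders; if it runs from root's crontab, the sudo prefixes inside the script become unnecessary):
# m h dom mon dow  command
*/5 * * * * /usr/local/bin/check-site.sh >> /var/log/check-site.log 2>&1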

curl does not terminate after successful POST

I have created a curl command to send a POST to my server, where I am listening on that port for input to trigger an additional action. The command is the following (I just masked the URL):
curl -v -H "Content-Type: application/json" -X POST -d "{\"Location\":\"Some Name\",\"Value\":\"40%\"}" http://example.com:8885/
I get the following output from curl:
About to connect() to example.com port 8885 (#0)
Trying 5.147.XXX.XXX...
Connected to example.com (5.147.XXX.XXX) port 8885 (#0)
POST / HTTP/1.1
User-Agent: curl/7.29.0
Host: example.com:8885
Accept: */*
Content-Type: application/json
Content-Length: 40
upload completely sent off: 40 out of 40 bytes
However, after that curl does not close the connection. Am I doing something wrong? Also, on the server I only receive the POST once I hit Ctrl+C.
curl sits there waiting for a proper HTTP response, and only after that has been received will it exit cleanly.
A minimal HTTP/1.1 response could look something like:
HTTP/1.1 200 OK
Content-Length: 0
... and it needs an extra CRLF after the last header to signal the end of headers.
I'm a bit rusty on this, but according to section 6.1 of RFC7230, you might need to add a Connection: close header as well. Quoting part of the paragraph:
The "close" connection option is defined for a sender to signal
that this connection will be closed after completion of the
response. For example,
Connection: close
in either the request or the response header fields indicates that
the sender is going to close the connection after the current
request/response is complete (Section 6.6).
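Putting those two pieces together, if your listener on port 8885 is something like netcat, a reply along these lines should let curl print the response and exit cleanly (a sketch; 8885 is the port from the question, and depending on your netcat flavour you may need nc -l -p 8885):
# Send an empty 200 response, announce closure, then close the connection
printf 'HTTP/1.1 200 OK\r\nContent-Length: 0\r\nConnection: close\r\n\r\n' | nc -l 8885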
Let me know if it solves your issue :-)
Is there a question mark in the link?
I found that my link had a question mark, like http... .com/something/something?properties=1, and I tried the header Connection: close, but it was still active, so I then tried removing ?properties etc. and it worked...

Why would a web server reply with 301 and the exact location that was requested?

I'm trying to retrieve pages from a web server via https, using Lua with luasec. For most pages my script works as intended, but if the resource contains special characters (like ,'é), I'm being sent into a loop of 301 responses.
Let this code snippet illustrate my dilemma (actual server details redacted to protect the innocent):
local https = require "ssl.https"
local prefix = "https://www.example.com"
local suffix = "/S%C3%A9ance"
local body,code,headers,status = https.request(prefix .. suffix)
print(status .. " - GET was for \"" .. prefix .. suffix .. "\"")
print("headers are " .. myTostring(headers))
print("body is " .. myTostring(body))
if suffix == headers.location then
    print("equal")
else
    print("not equal")
end
local body,code,headers,status = https.request(prefix .. headers.location)
print(status .. " - GET was for \"" .. prefix .. suffix .. "\"")
which results in the paradoxical
HTTP/1.1 301 Moved Permanently - GET was for "https://www.example.com/S%C3%A9ance"
headers are { ["content-type"]="text/html; charset=UTF-8";["set-cookie"]="PHPSESSID=e80oo5dkouh8gh0ruit7mj28t6; path=/";["content-length"]="0";["connection"]="close";["date"]="Wed, 15 Mar 2017 19:31:24 GMT";["location"]="S%C3%A9ance";}
body is ""
equal
HTTP/1.1 301 Moved Permanently - GET was for "https://www.example.com/S%C3%A9ance"
How might one be able to retrieve the elusive pages, using lua and as little additional dependencies as possible?
Obvious as it may seem, perhaps the requested url does differ from the actual location.
If you have a similar problem, do check deep within your external libraries to make sure they do what you think they do.
In this case, luasocket URL-decoded and then re-encoded the URL, and thus the final request was not what it seemed to be.
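A quick way to see this from outside Lua is to ask the server for the headers of both forms of the path and compare the results (a sketch using the redacted example.com host from the question):
# Status line and Location header for the percent-encoded path
curl -sI 'https://www.example.com/S%C3%A9ance' | grep -i -e '^HTTP' -e '^location:'
# Same request with the decoded path, which may be what the library actually sent
curl -sI 'https://www.example.com/Séance' | grep -i -e '^HTTP' -e '^location:'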

What is the fastest way to perform a HTTP request and check for 404?

Recently I needed to check, for a huge list of filenames, whether they exist on a server. I did this by running a for loop which tried to wget each of those files. That was efficient enough, but in this case it took about 30 minutes. I wonder if there is a faster way to check whether a file exists or not (since wget is for downloading files, not for performing thousands of requests).
I don't know if that information is relevant, but it's an Apache server.
curl would be the best option in a for loop, and here is a straightforward way; run this in your for loop:
curl -I --silent http://www.yoururl/linktodetect | grep -m 1 -c 404
What this does is check the HTTP response headers for a 404 returned for the link. If the file/link is missing and throws a 404, the command-line output will display 1; otherwise, if the file/link is valid and does not return a 404, the output will display 0.
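A variant that avoids grepping the headers is to have curl print the status code itself; here is a sketch assuming a urls.txt file with one URL per line (the filename is a placeholder):
# -I sends a HEAD request, -s silences progress, -o /dev/null discards the
# headers, and -w prints only the numeric status code (200, 404, ...)
while IFS= read -r url; do
    code=$(curl -I -s -o /dev/null -w '%{http_code}' "$url")
    printf '%s %s\n' "$code" "$url"
done < urls.txt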

POST receiver (server)

I find myself in need of a way to test the POST requests made from my application.
I'm sending my request to http://macbook-pro.local/, and I'm trying to figure out what service I can run to read and display the request.
I looked into RestKit without any luck.
Using SignalR could work, but the macOS port isn't working as expected.
What you want is basically a very simple web server, but if all you want is to print out what comes in an HTTP POST request, you can get away with using the built-in 'nc' command. You can use Terminal to print out the contents of incoming requests on local port 10000 by running 'nc' in a loop like this:
while true; do nc -l 10000 < /dev/null ; printf '\n\n\n'; done
You can then go to http://localhost:10000 in your browser and see the HTTP request appear in your Terminal window. The web browser will give an error message since 'nc' isn't smart enough to reply.
To test an HTTP POST request you can use 'curl':
curl --data "this-is-POST-data" http://localhost:10000
Again, curl will give an error message because 'nc' simply closes the connection without giving a proper HTTP reply. You can have 'nc' reply with a static HTTP response to all requests like this:
while true; do printf 'HTTP/1.0 200 OK\r\nContent-type: text/plain\r\n\r\nHello, world!' | nc -l 10000 ; printf '\n\n\n'; done
If you need to use port 80 you'll need to run 'nc' as root (e.g. using 'sudo').
If you need to have any kind of real HTTP traffic going on, however, you will need to get a proper web server. OS X comes with the Apache web server which can be started with the command "apachectl start" ("apachectl stop" to stop it). CGI is enabled so you can put executables into /Library/WebServer/CGI-Executables and access them using http://localhost/cgi-bin/filename. For example, if you create the following CGI script:
#!/bin/sh
printf 'Content-type: text/plain\r\n\r\n'
cat > /tmp/post-data
echo OK
call it, say, "test.sh" and place it in the CGI-Executables folder, and run:
chmod +x /Library/WebServer/CGI-Executables/test.sh
Then, whenever you send a POST request to http://localhost/cgi-bin/test.sh it will save the contents of the POST data to the file /tmp/post-data on your computer.
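To exercise that CGI script, you can POST to it with curl and then inspect the saved data (reusing the test.sh name from above):
# Send a small body to the CGI script; it should reply with OK
curl --data 'this-is-POST-data' http://localhost/cgi-bin/test.sh
# The request body should now be in /tmp/post-data
cat /tmp/post-data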
Note: in all examples, "localhost" can be replaced with "macbook-pro.local" for accesses over the network if that is your computer hostname.
Also note that your OS X firewall permissions may block 'nc' and other software from listening to TCP ports. Usually you should get a permission dialog, but if you simply get "permission denied", tweak your firewall settings in System Preferences -> Firewall -> Firewall Options.
Look at the SBJson framework.
These are sample lines you can write to parse the GET data.
SBJsonParser *parser = [[SBJsonParser alloc] init];
NSDictionary *dict = [parser objectWithData:urlData];
[dictionary setDictionary:dict];
[parser release];
These are sample lines you can write to build the POST data.
SBJsonWriter *writer = [[SBJsonWriter alloc] init];
jsonStr = [writer stringWithObject:dictionary];
[writer release];
There are many more methods in the framework to do other useful things.
