How can I capture request headers using Bash?

I need to make a script that can get an access token located in the request headers of a website. Can anyone help me with it?

You can achieve this by piping the curl output through the grep and cut commands. Here I have captured the value of the Content-Length header:
curl -s -I example.com | grep "Content-Length" | cut -d ':' -f 2
Below is a sample script.
#!/bin/bash
DOMAIN="example.com"
HEADER="Content-Length"
HEADER_VALUE=$(curl -s -I "$DOMAIN" | grep "$HEADER" | cut -d ':' -f 2)
echo "$HEADER_VALUE"
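A slightly more robust variant of the same idea, as a sketch: HTTP header names are case-insensitive and header lines end in \r\n, so it helps to match with grep -i and to strip the carriage return and leading space. The helper name parse_header is hypothetical, and the demo runs on captured headers instead of a live request:

```shell
#!/bin/bash
# parse_header reads raw response headers on stdin and prints the value
# of the named header, matched case-insensitively, with the trailing
# carriage return and leading padding stripped.
parse_header() {
    local name="$1"
    grep -i "^${name}:" | cut -d ':' -f 2- | tr -d '\r' | sed 's/^ *//'
}

# With a live request:  curl -s -I "$URL" | parse_header "content-length"
# Demo on captured headers:
printf 'HTTP/1.1 200 OK\r\nContent-Length: 143\r\nContent-Type: text/html\r\n' \
    | parse_header "content-length"
```

The cut -d ':' -f 2- (note the trailing dash) keeps the whole value even if it itself contains colons, as a Location or Date header would.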

Try using curl with the -I option:
Example:
$ curl -I stackoverflow.com
HTTP/1.1 301 Moved Permanently
Content-Length: 143
Content-Type: text/html; charset=utf-8
Location: https://stackoverflow.com/
X-Request-Guid: 2396f2a8-3398-4264-9b26-ad79f282cb71
Content-Security-Policy: upgrade-insecure-requests
Accept-Ranges: bytes
Date: Mon, 08 Apr 2019 06:49:12 GMT
Via: 1.1 varnish
Connection: keep-alive
X-Served-By: cache-hhn1522-HHN
X-Cache: MISS
X-Cache-Hits: 0
X-Timer: S1554706153.814312,VS0,VE79
Vary: Fastly-SSL
X-DNS-Prefetch-Control: off
Set-Cookie: prov=e68f14b2-d35a-6ca6-8e8d-0b3f936049b4; domain=.stackoverflow.com;
expires=Fri, 01-Jan-2055 00:00:00 GMT; path=/; HttpOnly

Related

BASH command-line method to obtain OUI vendor info from MAC address

I'm trying to reproduce a method outlined in an old Unix StackExchange post that uses a curl command to look up the vendor name for a MAC address obtained locally. The command is:
curl -sS "http://standards-oui.ieee.org/oui.txt" | grep -i "$OUI" | cut -d')' -f2 | tr -d '\t'
However, it produces nothing when I run it. I've verified that $OUI contains my MAC address prefix to search on. Example:
echo $OUI
EC-58-EA
This is because the HTTP server returns a 301 Moved Permanently response:
➜ ~ curl http://standards-oui.ieee.org/oui.txt -i
HTTP/1.1 301 Moved Permanently
Server: nginx/1.12.0
Date: Sun, 07 Mar 2021 05:41:37 GMT
Content-Type: text/html
Content-Length: 185
Location: http://standards-oui.ieee.org/oui/oui.txt
Connection: keep-alive
<html>
<head><title>301 Moved Permanently</title></head>
<body bgcolor="white">
<center><h1>301 Moved Permanently</h1></center>
<hr><center>nginx/1.12.0</center>
</body>
</html>
The Location header indicates the new location: Location: http://standards-oui.ieee.org/oui/oui.txt
You can curl the new location directly, or tell curl to follow the 301 redirect with -L: curl -L http://standards-oui.ieee.org/oui.txt
Testing:
➜ ~ curl -LsS "http://standards-oui.ieee.org/oui.txt" | grep -i "EC-58-EA" | cut -d')' -f2 | tr -d '\t'
Ruckus Wireless
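The parsing half of that pipeline can be separated out and checked offline, which also makes the format assumption explicit: each "(hex)" line in oui.txt looks like "EC-58-EA   (hex)\t\tRuckus Wireless", so cutting at ')' and deleting tabs leaves the vendor name. The helper name parse_oui_line is illustrative:

```shell
# parse_oui_line extracts the vendor name from an oui.txt "(hex)" line on
# stdin: cut at the ')', then strip the tab and space padding.
parse_oui_line() {
    cut -d')' -f2 | tr -d '\t' | sed 's/^ *//'
}

# Against the live registry (network):
#   curl -LsS "http://standards-oui.ieee.org/oui.txt" | grep -i "$OUI" | parse_oui_line
# Demo on a captured registry line:
printf 'EC-58-EA   (hex)\t\tRuckus Wireless\n' | parse_oui_line
```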

HTTP Request Include Equals checkbox can't be unchecked

When defining HTTP Request, there's a checkbox for each parameter: Include Equals
This checkbox can't be unchecked even when choosing different method or parameter.
I don't see any reference in HTTP Request for using it.
Why is this checkbox shown? Is there any usage for it?
Also, it seems that the Content-Type value per parameter is ignored; in GET it isn't sent:
GET http://www.google.com/?token=0Bfdsa
GET data:
In POST it send the regular www-form-urlencoded:
Content-Type: application/x-www-form-urlencoded; charset=UTF-8
I've also wondered what this checkbox means, and I think I've found it. It gives you the option to include or omit the = (equals) sign for parameters with no value: foo= vs. foo. If the parameter has a value, you cannot uncheck "Include Equals?":
| Name: | Value | Include Equals? |
|-------|-------|:---------------:|
| foo | | [x] |
| bar | | [ ] |
| baz | qux | [x] |
The above configuration generates the following url-encoded form:
foo=&bar&baz=qux
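If you want to reproduce such a body outside JMeter, e.g. to replay the request with curl, joining the pairs with & is enough; a bare name with no "=" mimics an unchecked "Include Equals?" box. A minimal sketch, with an illustrative build_body helper and parameter names:

```shell
# build_body joins its arguments with "&" to form a url-encoded body.
# A bare name (no "=") corresponds to "Include Equals?" being unchecked.
build_body() {
    local IFS='&'
    printf '%s' "$*"
}

body="$(build_body 'foo=' 'bar' 'baz=qux')"
echo "$body"
# The body could then be sent with: curl --data "$body" "$URL"
```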
The "Content-Type" column is used with the "Use multipart/form-data" option checked: every parameter is sent as a separate part with its own Content-Type:
[x] Use multipart/form-data
| Name: | Value | Content-Type |
|-------|-------|--------------|
| foo | | text/x-foo |
| bar | | text/x-bar |
| baz | qux | text/x-baz |
The generated request looks like:
Content-Type: multipart/form-data; boundary=zIVpNBG_m1irxcTtk7ByTwBgDHbsjB1UjTdRTS
--zIVpNBG_m1irxcTtk7ByTwBgDHbsjB1UjTdRTS
Content-Disposition: form-data; name="foo"
Content-Type: text/x-foo; charset=US-ASCII
Content-Transfer-Encoding: 8bit
--zIVpNBG_m1irxcTtk7ByTwBgDHbsjB1UjTdRTS
Content-Disposition: form-data; name="bar"
Content-Type: text/x-bar; charset=US-ASCII
Content-Transfer-Encoding: 8bit
--zIVpNBG_m1irxcTtk7ByTwBgDHbsjB1UjTdRTS
Content-Disposition: form-data; name="baz"
Content-Type: text/x-baz; charset=US-ASCII
Content-Transfer-Encoding: 8bit
qux
--zIVpNBG_m1irxcTtk7ByTwBgDHbsjB1UjTdRTS--
Here is what worked for me:
I unchecked 'Use multipart/form-data' and passed 'Content-Type: application/x-www-form-urlencoded' as a header.

BASH: How to get X & Y lines from a file (1-liner)

I have some output and I would like to get lines 1 and 7. Treating it as a stream, we could select them with two different moduli. But, I digress.
I could easily do this with a for loop but I wonder if there is a more functional / 1-line approach to this:
Here is the data I am working with (I want the URL and the content type):
--2019-02-01 01:02:19-- https://artifactory/artifactory/BIF-Releases/com/foo/bif/eventlog/maven-metadata.xml.md5
Reusing existing connection to :443.
HTTP request sent, awaiting response...
HTTP/1.1 200 OK
Server: nginx/1.12.2
Date: Fri, 01 Feb 2019 09:02:33 GMT
Content-Type: application/x-checksum
Content-Length: 32
Connection: keep-alive
X-Artifactory-Id: d111c347124a8603:2a97a6e1:1681a62df25:-8000
Last-Modified: Fri, 01 Feb 2019 09:02:33 GMT
--
--2019-02-01 01:02:19-- https://artifactory/artifactory/BIF-Releases/com/foo/bif/eventlog/maven-metadata.xml.sha1
Reusing existing connection to artifactory:443.
HTTP request sent, awaiting response...
HTTP/1.1 200 OK
Server: nginx/1.12.2
Date: Fri, 01 Feb 2019 09:02:33 GMT
Content-Type: application/x-checksum
Content-Length: 40
Connection: keep-alive
X-Artifactory-Id: d111c347124a8603:2a97a6e1:1681a62df25:-8000
Last-Modified: Fri, 01 Feb 2019 09:02:33 GMT
The output I would want is just:
--2019-02-01 01:02:19-- https://artifactory/artifactory/BIF-Releases/com/foo/bif/eventlog/maven-metadata.xml.md5
Content-Type: application/x-checksum
--2019-02-01 01:02:19-- https://artifactory/artifactory/BIF-Releases/com/foo/bif/eventlog/maven-metadata.xml.sha1
Content-Type: application/x-checksum
You could also try to select some content:
grep -E "https://|Content-Type:"
# Or when you want to remove the date
grep -Eo "(https://|Content-Type:).*"
To get lines 1 and 7:
sed -n -e 1p -e 7p
You probably also want to terminate early:
sed -n -e 1p -e '7{p; q;}'
or
sed -n -e 1p -e 7p -e 7q
Lines 1 and 7:
awk 'NR == 1 || NR == 7'
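Since the line positions can shift between records, matching by content may be more robust than fixed line numbers. A sketch, assuming the wget-style trace shown above (the sample URL below is illustrative):

```shell
# Print only the request line and the Content-Type header from a
# wget-style trace: the URL lines start with "--" followed by a date.
select_url_and_type() {
    awk '/^--[0-9]/ || /^Content-Type:/'
}

printf -- '--2019-02-01 01:02:19-- https://artifactory/foo.md5\nHTTP/1.1 200 OK\nContent-Type: application/x-checksum\nContent-Length: 32\n' \
    | select_url_and_type
```

The /^--[0-9]/ pattern deliberately skips the bare "--" separator lines between records.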

Extracting HTTP content in bash doesn't work with nc output

Let's say I have this HTTP response:
POST / HTTP/1.1
Content-Type: text/plain;charset=UTF-8
Content-Length: 5
Connection: Keep-Alive
Accept-Encoding: gzip
Accept-Language: en,*
User-Agent: Mozilla/5.0
Host: 127.0.0.1:55764
Hello
And I'm interested only in the content ("Hello"). I found this command to work if the text is fed from a file:
cat data.txt | tr '\n' '#' | sed "s/.*##//" | tr '#' '\n'
Hello
where data.txt contains the text above.
But if I try to feed it with the output of nc:
#!/bin/bash
while true
do
echo -e "HTTP/1.1 200 OK\n\n" | ./busybox-armv7l nc -l -p 55764 | tr '\n' '#' | sed "s/.*##//" | tr '#' '\n'
done
it doesn't work, i.e. it just prints out everything:
POST / HTTP/1.1
Content-Type: text/plain;charset=UTF-8
Content-Length: 5
Connection: Keep-Alive
Accept-Encoding: gzip
Accept-Language: en,*
User-Agent: Mozilla/5.0
Host: 127.0.0.1:55764
HelloPOST / HTTP/1.1
Content-Type: text/plain;charset=UTF-8
Content-Length: 5
Connection: Keep-Alive
Accept-Encoding: gzip
Accept-Language: en,*
User-Agent: Mozilla/5.0
Host: 127.0.0.1:55764
Hello
Why does the piping work with cat but not with nc?
The output of nc goes to stderr. Just add & after the second | (i.e. use |&) to make the pipe capture stderr as well:
echo -e "HTTP/1.1 200 OK\n\n" | ./busybox-armv7l nc -l -p 55764 |& tr '\n' '#' | sed "s/.*##//" | tr '#' '\n'
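Another possible culprit worth checking: real HTTP traffic delimits lines with \r\n, so after tr '\n' '#' the blank line between headers and body becomes "\r#\r#" rather than "##", and the sed pattern never matches. A sketch that sidesteps this by stripping the carriage returns first and then deleting everything up to the first empty line:

```shell
# extract_body drops the HTTP header block from a raw request/response
# on stdin: remove \r, then delete from line 1 through the first blank
# line (the header/body separator), leaving only the body.
extract_body() {
    tr -d '\r' | sed '1,/^$/d'
}

printf 'POST / HTTP/1.1\r\nContent-Length: 5\r\n\r\nHello\r\n' | extract_body
```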

Read the output of a command(multiple lines)

I want to read the 'Content-Length' of a file on the internet. To do that I use cURL to retrieve the headers:
OUTPUT=`curl -I $URL`
HTTP/1.1 200 OK
Date: Sun, 12 Jan 2014 00:41:11 GMT
Server: Apache/2.2.15 (Red Hat)
Last-Modified: Sun, 05 Jan 2014 09:41:44 GMT
Accept-Ranges: bytes
Content-Length: 553648128
Content-Type: application/octet-stream
But when I try to print $OUTPUT, I get only the last line.
OUTPUT=$(curl -I $URL | grep 'Content-Length')
curl -I $url | while read -r response
do
case "$response" in
*Content-Length* )
echo "==> $response"
;;
esac
done
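The loop above can also be condensed into a single awk filter. A sketch, with a hypothetical content_length helper; it lowercases the header name (header names are case-insensitive) and strips the trailing \r that curl leaves on each header line:

```shell
# content_length reads raw response headers on stdin and prints the
# Content-Length value, matched case-insensitively and without the \r.
content_length() {
    awk 'tolower($1) == "content-length:" { gsub(/\r/, "", $2); print $2 }'
}

# With a live request (network):  curl -sI "$URL" | content_length
# Demo on captured headers:
printf 'HTTP/1.1 200 OK\r\nContent-Length: 553648128\r\nContent-Type: application/octet-stream\r\n' \
    | content_length
```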
