How to get a custom header in bash

I'm adding a custom header in an ASP.NET app:
context.Response.Headers.Add("X-Date", DateTime.Now.ToString());
context.Response.Redirect(redirectUrl, false);
When I use Fiddler I can see the "X-Date" header in the response.
I need to receive it by using bash.
I tried curl -i https://my.site.com and also wget -O - -o /dev/null --save-headers https://my.site.com with no success.
In both cases I see just the regular headers like: Content-Type, Server, Date, etc...
How can I receive the "X-Date" header?
Thanks,
Lev
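
One way to pull a single header out in bash is sketched below; it follows the redirect with -L so the header is visible whichever response carries it. The URL is the question's placeholder.
# -s silent, -i include response headers in the output, -L follow redirects;
# grep picks the custom header out of each response's header block.
curl -s -i -L https://my.site.com | grep -i '^X-Date:'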

Protocol headers are different from file headers (just as HTTP headers and TCP headers are different). When you create a protocol header, you need a server to resolve it and expose the associated environment variables. Example:
#!/bin/bash
# Apache - CGI
# A CGI response must begin with a complete header line, then a blank line.
echo "Content-Type: text/plain"
echo ""
echo "$CONTENT_TYPE"
echo "$HTTP_ACCEPT"
echo "$SERVER_PROTOCOL"
When calling this script via the web, the response in my browser was:
text/html
text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
HTTP/1.1
What you're looking for are environment variables such as $HTTP_ACCEPT, $CONTENT_TYPE, and maybe $SERVER_PROTOCOL too.
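Custom request headers surface the same way: CGI servers conventionally prefix them with HTTP_ and upcase them, so an incoming X-Date header would typically appear as $HTTP_X_DATE (an assumption based on the common CGI convention; the exact name depends on the server). A minimal sketch:
#!/bin/bash
# Apache - CGI sketch: echo back a custom request header, assuming
# the server maps "X-Date: ..." to the HTTP_X_DATE environment variable.
echo "Content-Type: text/plain"
echo ""
echo "$HTTP_X_DATE"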

Related

Make a (curl-like) HTTP request without the "HTTP version" for testing?

I'm testing malformed HTTP requests on OS X, but I can't work out how to make a request with a missing/malformed HTTP version.
curl seems to only allow valid presets (--http1.0, --http1.1, --http2).
What's the easiest way to construct a request without an "HTTP version"?
Example:
Given the following commands create the following request lines:
Ex1.
command: curl -i http://localhost:8080/cat.jpg?v=1
request: GET /cat.jpg?v=1 HTTP/1.1
Ex2.
command: curl -i http://localhost:8080/cat.jpg?v=1 --http1.0
request: GET /cat.jpg?v=1 HTTP/1.0
Wanted
How could I create the following?
command: ???
request: GET /cat.jpg?v=1 (missing HTTP version)
EDIT: ANSWER
curl only deals with valid requests. netcat is an alternative that has more control.
See this answer
Thanks @DanFromGermany
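For example, a minimal netcat sketch matching the examples above (sending only the request line, with no version, amounts to an HTTP/0.9-style request):
# nc (netcat) writes the raw bytes as-is, so the request line can omit the version.
printf 'GET /cat.jpg?v=1\r\n\r\n' | nc localhost 8080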

JMeter Not Sending File with HTTP Request

I'm new to JMeter and trying to PUT a file to our API using an HTTP Request. When I PUT the file via curl using the -F flag, it works, no problem.
Here's my curl request:
curl -X PUT -u uname:pword https://fakehostname.com/psr-1/controllers/vertx/upload/file/big/ADJTIME3 -F "upload1=@ADJTIME" -vis
and here's the relevant part of curl's verbose output (the request headers it sent):
> User-Agent: curl/7.37.1
> Host: myfakehost.com
> Accept: */*
> Content-Length: 4190
> Expect: 100-continue
> Content-Type: multipart/form-data; boundary=------------------------d76566a6ebb651d3
When I do the same PUT via JMeter, the Content-Length is 0, which makes me think that JMeter isn't reading the file for some reason. I know the path is correct because I browsed to the file from JMeter. Little help?
In the File Upload section, make your file path relative to the .jmx file, or place the file next to it and specify the file name only.
Thanks to everyone who offered solutions and suggestions. It turns out that the API I was trying to load test was the issue. I can PUT a file via curl with no problem, but there's something about the JMeter PUT that the API does not like. I finally tried doing a PUT to an unrelated API and was successful.

How to use the curl POST method when login info is required?

For example, say I want to issue a POST request to the server, but the website requires a username and password to log in first. How should I do these two operations?
If it requires a username and password submitted through the web page, you'd need to send what it expects for a user logging in, then capture the cookies you get, and then send those cookies back with your POST. This can get involved if the login process spans multiple pages that redirect to one another. curl can do this, but be prepared to spend some time on it.
To get the cookie being returned by the server, use curl -i to include headers. You can also add -L to automatically follow redirects (which you otherwise would have to do manually by retrieving the URI in the Location: field of an HTTP 301 or 302 response). Example:
curl -i -L stackoverflow.com > /tmp/so.html
grep -i 'Set-Cookie:' /tmp/so.html
Yields:
Set-Cookie: prov=31c24327-c0bf-474d-b504-fc97dc69ab61; domain=.stackoverflow.com; expires=Fri, 01-Jan-2055 00:00:00 GMT; path=/; HttpOnly
(Until you work out the login flow and how you need to submit the requests, you'll need to inspect the rest of the headers to accommodate redirects, see whether there are multiple cookies, etc.)
To submit a cookie, use curl -b:
curl -b "prov=31c24327-c0bf-474d-b504-fc97dc69ab61" [rest of curl command]
Be patient and good luck, and be sure to check the curl man page.
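A hypothetical end-to-end sketch using a cookie jar (-c saves cookies, -b sends them back); the URLs and form fields are placeholders, not from the question:
# Log in and save the session cookies to a file.
curl -c cookies.txt -d "username=alice&password=secret" https://example.com/login
# Send the POST, presenting the saved cookies.
curl -b cookies.txt -d "name1=value1&name2=value2" https://example.com/api/action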
If the site accepts HTTP Basic authentication instead, a single command may suffice:
curl -u username:password -X POST --data "name1=value1&name2=value2" http://yourwebpage.com/

Using curl to download a file and view headers and status code

I'm writing a Bash script to download image files from Snapito's web page snapshot API. The API can return a variety of responses indicated by different HTTP response codes and/or some custom headers. My script is intended to be run as an automated Cron job that pulls URLs from a MySQL database and saves the screenshots to local disk.
I am using curl. I'd like to do these three things using a single curl command:
Extract the HTTP response code
Extract the headers
Save the file locally (if the request was successful)
I could do this using multiple curl requests, but I want to minimize the number of times I hit Snapito's servers. Any curl experts out there?
Or if someone has a Bash script that can respond to the full documented set of Snapito API responses, that'd be awesome. Here's their API documentation.
Thanks!
Use the dump-header option:
curl -D /tmp/headers.txt http://server.com
Use curl -i (include HTTP header) - which will yield the headers, followed by a blank line, followed by the content.
You can then split out the headers / content (or use -D to save directly to file, as suggested above).
There are three options: -i, -I, and -D:
> curl --help | egrep '^ +\-[iID]'
-D, --dump-header FILE Write the headers to FILE
-I, --head Show document info only
-i, --include Include protocol headers in the output (H/F)
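Putting these together, a sketch of a single call that captures the status code, the headers, and the body; the file names and $url are placeholders, not from the Snapito docs:
# -s: silent; -D: dump response headers to a file; -o: save the body;
# -w '%{http_code}': print the status code after the transfer finishes.
status=$(curl -s -D /tmp/headers.txt -o /tmp/snapshot.png -w '%{http_code}' "$url")
echo "HTTP status: $status"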

How do I execute an HTTP PUT in bash?

I'm sending requests to a third-party API. It says I must send an HTTP PUT to http://example.com/project?id=projectId
I tried doing this with PHP curl, but I'm not getting a response from the server. Maybe something is wrong with my code because I've never used PUT before. Is there a way for me to execute an HTTP PUT from the bash command line? If so, what is the command?
With curl it would be something like
curl --request PUT --header "Content-Length: 0" http://website.com/project?id=1
but as Mattias said, you'd probably want some data in the body as well, so you'd want to set the Content-Type and the data too (and the Content-Length would then be larger).
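For instance, a sketch of a PUT with a JSON body (the payload is a placeholder):
# curl derives Content-Length from --data automatically.
curl --request PUT --header "Content-Type: application/json" \
  --data '{"name": "example"}' "http://website.com/project?id=1"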
If you really want to use only bash, it actually has some networking support.
echo -e "PUT /project?id=123 HTTP/1.1\r\nHost: website.com\r\n\r\n" > \
/dev/tcp/website.com/80
But I guess you also want to send some data in the body?
As Mattias suggested, bash can do the job without further tools. If you want to send data, you have to set at least "Content-Length". With the variables "host", "port", "resource", and "data" defined, you can do an HTTP PUT with:
echo -e "PUT /$resource HTTP/1.1\r\nHost: $host:$port\r\nContent-Length: ${#data}\r\n\r\n$data\r\n" > /dev/tcp/$host/$port
I tested this with a REST API and it works fine.
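To also read the response back, you can open /dev/tcp on a bidirectional file descriptor; a sketch, assuming the same host, port, resource, and data variables:
# Open fd 3 for both reading and writing on the TCP connection.
exec 3<>"/dev/tcp/$host/$port"
# Connection: close makes the server hang up when done, so cat terminates.
printf 'PUT /%s HTTP/1.1\r\nHost: %s:%s\r\nContent-Length: %s\r\nConnection: close\r\n\r\n%s' \
  "$resource" "$host" "$port" "${#data}" "$data" >&3
cat <&3    # print the server's response
exec 3>&-  # close the connection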
