I want to fetch a random Wikipedia page and save its URL to a txt file.
curl -I https://en.wikipedia.org/wiki/Special:Random | grep -E "Location:" | cut -d ' ' -f2 > "result.txt"
But when I read the URL back from the txt file, I get an error:
cat result.txt | xargs -I % curl %
How about just following redirects with curl by adding the -L switch? No need to parse the Location header:
curl -L https://en.wikipedia.org/wiki/Special:Random
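If you do want to keep the intermediate result.txt, note that curl prints header lines with their trailing carriage return (headers end in \r\n), so the URL saved by cut carries a stray \r that typically breaks the second curl call. A minimal sketch that strips it (file names are just examples):
curl -sI https://en.wikipedia.org/wiki/Special:Random | grep -i '^location:' | cut -d ' ' -f2 | tr -d '\r' > result.txt
xargs -n1 curl -s < result.txt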
I have tried this command
curl -Ik https://dev.mydomain.com/
and it prints all the headers. Now what I want is to print out content-security-policy only.
Do I need to use jq or is there any other helpful tool that I can use?
curl -sIk https://stackoverflow.com/ | grep content-security-policy | cut -d ' ' -f 2-
This will fetch the URL with curl, grep only the line containing content-security-policy, split on spaces, and keep the fields from 2 onwards.
Example:
➜ ~ curl -sIk https://stackoverflow.com/ | grep content-secur | cut -d ' ' -f 2-
upgrade-insecure-requests; frame-ancestors 'self' https://stackexchange.com
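Header names are case-insensitive, and an HTTP/1.1 server may send Content-Security-Policy rather than the lowercase form HTTP/2 uses, so a grep -i variant is slightly more robust:
curl -sIk https://stackoverflow.com/ | grep -i '^content-security-policy' | cut -d ' ' -f 2-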
If you use cURL >= 7.84.0, you can use the syntax %header{name}:
curl -Iks https://stackoverflow.com -o /dev/null -w "%header{content-security-policy}"
If you want to try it without installing a new version, you can run the Docker image:
docker run --rm curlimages/curl:7.85.0 -Iks https://stackoverflow.com -o /dev/null -w "%header{content-security-policy}"
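To check first whether your installed curl is new enough for the %header{} syntax:
curl --version | head -n1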
I need to write a status checker as a bash script.
I have a file with lines like this:
domain.com; 111.111.111.111,222.222.222.222; /link/to/somefile.js,/link/to/somefile2.js
domain2.com; 122.122.111.111,211.211.222.222; /link/to/somefile2.js,/link/to/somefile3.js
In total, I need to execute these commands:
curl -s -I -H 'Host: domain.com' http://111.111.111.111/link/to/somefile.js
curl -s -I -H 'Host: domain.com' http://222.222.222.222/link/to/somefile.js
curl -s -I -H 'Host: domain.com' http://111.111.111.111/link/to/somefile2.js
curl -s -I -H 'Host: domain.com' http://222.222.222.222/link/to/somefile2.js
curl -s -I -H 'Host: domain2.com' http://122.122.111.111/link/to/somefile2.js
curl -s -I -H 'Host: domain2.com' http://211.211.222.222/link/to/somefile2.js
curl -s -I -H 'Host: domain2.com' http://122.122.111.111/link/to/somefile3.js
curl -s -I -H 'Host: domain2.com' http://211.211.222.222/link/to/somefile3.js
The question is:
what tool do I need to use to get this result?
Maybe xargs with some arguments/flags can do that, or GNU parallel?
Can you please show examples?
I can split the lines and assign the results to different variables; that isn't a problem at all:
domain=$(cut -d';' -f1 file | xargs -I N -d "," echo curl -H \'N\')
ip=$(cut -d';' -f2 file | xargs -I N -d "," echo curl -H \'N\')
and so on.
But the question is about something else :) :
after delimiting and splitting the lines into variables, how can I execute curl with the different variables, given that the number of values in each variable will differ?
The answer Barmar gave doesn't cover the problem, because there are more than two lists. The problem is not ignorance of bash, but finding a way to resolve the issue.
#!/usr/bin/env bash
# ^^^^- IMPORTANT: not /bin/sh
# print each command instead of running it, so people can test their answers without real URLs
log_command() { printf '%q ' "$@"; printf '\n'; }

while IFS='; ' read -r domain addrs_str files_str; do
  IFS=, read -r -a addrs <<<"$addrs_str"
  IFS=, read -r -a files <<<"$files_str"
  for file in "${files[@]}"; do
    for addr in "${addrs[@]}"; do
      log_command curl -s -I -H "Host: $domain" "http://$addr/$file"
    done
  done
done
...emits the following output (the list of commands it would run if the log_command prefix were removed):
curl -s -I -H Host:\ domain.com http://111.111.111.111//link/to/somefile.js
curl -s -I -H Host:\ domain.com http://222.222.222.222//link/to/somefile.js
curl -s -I -H Host:\ domain.com http://111.111.111.111//link/to/somefile2.js
curl -s -I -H Host:\ domain.com http://222.222.222.222//link/to/somefile2.js
curl -s -I -H Host:\ domain2.com http://122.122.111.111//link/to/somefile2.js
curl -s -I -H Host:\ domain2.com http://211.211.222.222//link/to/somefile2.js
curl -s -I -H Host:\ domain2.com http://122.122.111.111//link/to/somefile3.js
curl -s -I -H Host:\ domain2.com http://211.211.222.222//link/to/somefile3.js
...as you can see at https://ideone.com/dTC8q8
Now how does this work?
Step 1: Read each line into domain, addrs_str and files_str, splitting on semicolons and spaces.
That's what's done by the line IFS='; ' read -r domain addrs_str files_str, which operates as described in BashFAQ #1, and in How to read variables from file, with multiple variables per line?
Step 2: For addrs_str and files_str, split them on commas into separate arrays. This is described in How do I split a string on a delimiter in Bash?
Step 3: Iterate over those arrays, and call curl for each combination. If you wanted to call the first IP with only the first file, and the second IP with the second file, you could use Iterate over two arrays simultaneously in bash; otherwise, it's a plain nested loop.
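For example, the comma split from step 2 can be tried on its own at an interactive bash prompt (the addresses are made up):
$ IFS=, read -r -a addrs <<<"111.111.111.111,222.222.222.222"
$ printf '%s\n' "${addrs[@]}"
111.111.111.111
222.222.222.222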
With GNU Parallel it would look like this:
doit() {
  domain="$1"
  ips="$2"
  paths="$3"
  parallel --dry-run -d ',' -q curl -s -I -H Host:\ "$domain" http://{1}/{2} ::: "$ips" ::: "$paths"
}
export -f doit
parallel --colsep ';' doit :::: input.file
Remove --dry-run when you are convinced it works.
I am trying to pass each line of a log file I am tailing to curl as a variable.
Currently I have this:
tail -f log.txt |
buffer |
while read -r LINE; do
  curl -s -X POST https://api.com -d id=-123456 -d text="'$LINE'"
done
It's not really working, and my goal is to send each line as a variable inside the text field: text=$LINE
How can I accomplish this? Is it an issue with the variable?
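A likely culprit is the quoting: -d text="'$LINE'" sends literal single quotes around the line, and the line itself is not URL-encoded. A minimal sketch using curl's --data-urlencode, which encodes everything after the = (api.com and the id are the placeholders from the question; the optional buffer stage is omitted):
tail -f log.txt |
while read -r line; do
  curl -s -X POST https://api.com -d id=-123456 --data-urlencode "text=$line"
done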
I am using following command to download a single webpage with all its images and js using wget in Windows 7:
wget -E -H -k -K -p -e robots=off -P /Downloads/ http://www.vodafone.de/privat/tarife/red-smartphone-tarife.html
It downloads the HTML as required, but when I tried to pass it a text file with a list of 3 URLs to download, it didn't give any output. Below is the command I am using:
wget -E -H -k -K -p -e robots=off -P /Downloads/ -i ./list.txt -B 'http://'
I tried this also:
wget -E -H -k -K -p -e robots=off -P /Downloads/ -i ./list.txt
This text file had the URLs with http:// prepended.
list.txt contains the list of 3 URLs which I need to download using a single command. Please help me resolve this issue.
From man wget:
2 Invoking
By default, Wget is very simple to invoke. The basic syntax is:
wget [option]... [URL]...
So, just use multiple URLs:
wget URL1 URL2
Or using the links from comments:
$ cat list.txt
http://www.vodafone.de/privat/tarife/red-smartphone-tarife.html
http://www.verizonwireless.com/smartphones-2.shtml
http://www.att.com/shop/wireless/devices/smartphones.html
and your command line:
wget -E -H -k -K -p -e robots=off -P /Downloads/ -i ./list.txt
works as expected.
First create a text file with the URLs that you need to download.
e.g. download.txt
download.txt will look like this:
http://www.google.com
http://www.yahoo.com
then use the command wget -i download.txt to download the files. You can add many URLs to the text file.
If you have a list of URLs separated on multiple lines like this:
http://example.com/a
http://example.com/b
http://example.com/c
but you don't want to create a file and point wget to it, you can do this:
wget -i - <<< 'http://example.com/a
http://example.com/b
http://example.com/c'
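Equivalently, if your shell has no here-strings, you can pipe the list in:
printf '%s\n' http://example.com/a http://example.com/b http://example.com/c | wget -i -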
pedantic version:
for x in {'url1','url2'}; do wget "$x"; done
the advantage is that you can treat it as a single wget command for multiple URLs
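When the URLs share a common prefix, bash brace expansion can also generate them inline for a single invocation (example.com is a placeholder):
wget http://example.com/{a,b,c}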
When I use a bash script to upload files to Dropbox, it works fine, but when I run the same command manually on the command line it does not work.
I'm thinking it might be the & in the URL; I'm not sure.
Bash code:
CURL_BIN="/usr/bin/curl"
#Note: This option explicitly allows curl to perform "insecure" SSL connections and transfers.
#CURL_ACCEPT_CERTIFICATES="-k"
CURL_PARAMETERS="--progress-bar"
APPKEY="zrwv8z3bycfk3m8"
OAUTH_ACCESS_TOKEN="aaaaaaaa"
APPSECRET="aaaaaaaaaa"
OAUTH_ACCESS_TOKEN_SECRET="aaaaaaaaa"
ACCESS_LEVEL="dropbox"
API_UPLOAD_URL="https://api-content.dropbox.com/1/files_put"
RESPONSE_FILE="temp2.txt"
FILE_SRC="temp.txt"
$CURL_BIN $CURL_ACCEPT_CERTIFICATES $CURL_PARAMETERS -v -i -o "$RESPONSE_FILE" --upload-file "$FILE_SRC" "$API_UPLOAD_URL/$ACCESS_LEVEL/$FILE_DST?oauth_consumer_key=$APPKEY&oauth_token=$OAUTH_ACCESS_TOKEN&oauth_signature_method=PLAINTEXT&oauth_signature=$APPSECRET%26$OAUTH_ACCESS_TOKEN_SECRET"
Manual code:
curl --insecure --progress-bar -v -i -o temp2.txt --upload-file temp.txt https://api-content.dropbox.com/1/files_put/dropbox/attachments/temp.txt?oauth_consumer_key=aaaaaaaaaa&oauth_token=aaaaaaaaa&oauth_signature_method=PLAINTEXT&oauth_signature=aaaaaaaaa%26aaaaaaaaaa
curl --insecure --progress-bar -v -i -o temp2.txt --upload-file temp.txt "https://api-content.dropbox.com/1/files_put/dropbox/attachments/temp.txt?oauth_consumer_key=aaaaaaaaaa&oauth_token=aaaaaaaaa&oauth_signature_method=PLAINTEXT&oauth_signature=aaaaaaaaa%26aaaaaaaaaa"
The solution is to add the quotation marks: without them, the shell treats each unescaped & as a control operator and splits the command.
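A minimal illustration of what the unquoted version does (example.com as a stand-in):
curl http://example.com/?a=1&b=2&c=3
is parsed by the shell as three commands: curl http://example.com/?a=1 run in the background, the backgrounded assignment b=2, and the assignment c=3. With the quotes, the whole URL reaches curl as a single argument:
curl "http://example.com/?a=1&b=2&c=3"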