I'm stuck with my self-coded application calling the ws command (WebSocket), and I'm trying to export the output. I also want to exit wscat when it's finished, after some time of input from the API of the JSON backend.
#!/bin/bash
while getopts a:c: flag
do
case "${flag}" in
a) accesskey=${OPTARG};;
c) clientnodeid=${OPTARG};;
esac
done
master="wscat -c ws://localhost:8091/ws/callback -H accessKey:$accesskey -H clientNodeId:$clientnodeid"
sleep 15
eval $master
final=$(eval echo "$master")
echo $final >>logfile.log
ps -ef | grep wscat | grep -v grep | awk '{print $2}' | xargs kill
#curl -X POST --data "$final" -k "https://localhost:7460/activate" -H "accept: application/json" -H "accessKey:$accesskey" -H "clientNodeId:$clientnodeid" -H "Content-Type: application/json" -H "callbackRequested:true"
exit
I want to take the output from wscat and send it over curl.
When I run the script manually it succeeds, but when I call it from another application (Java) it runs without generating the log.
In other words: I want to export $final to a text file, and then feed that text file to the --data option of the curl call.
Fixed based on @Barmar's comment:
You're overcomplicating this with all those variables. Just do
eval "$master" >> logfile.log
I have 100 Jetpacks that I have to sign in to and configure. I am trying to do it in a bash script, but I am having no luck. I can connect to the WiFi no problem, but my POST requests are not achieving anything. Any advice? Here is a link to my GitHub; I have copies of what I captured in Burp Suite: https://github.com/Jdelgado89/post_Script
TYIA
#!/bin/bash
nmcli device wifi rescan
nmcli device wifi list
echo "What's they last four?"
read last4
echo "What's the Key?"
read key
nmcli device wifi connect Ellipsis\ \Jetpack\ $last4 password $key
echo "{"Command":"SignIn","Password":"$key"}" > sign_on.json
echo "{"CurrentPassword":"$key","NewPassword":"G1l4River4dm1n","SecurityQuestion":"NameOfStreet","SecurityAnswer":"Allison"}" > change_admin.json
echo "{"SSID":"GRTI Jetpack","WiFiPassword":"G1l4River3r","WiFiMode":0,"WiFiAuthentication":6,"WiFiEncription":4,"WiFiChannel":0,"MaxConnectedDevice":8,"PrivacySeparator":false,"WMM":true,"Command":"SetWifiSetting"}" > wifi.json
cat sign_on.json
cat change_admin.json
cat wifi.json
sleep 5
curl -X POST -H "Cookie: jetpack=6af5e293139d989bdcfd66257b4f5327" -H "Content-Type: application/json" -d #sign_on.json http://192.168.1.1/cgi-bin/sign_in.cgi
sleep 5
curl -X POST -H "Cookie: jetpack=6af5e293139d989bdcfd66257b4f5327" -H "Content-Type: application/json" -d #change_admin.json http://192.168.1.1/cgi-bin/settings_admin_password.cgi
sleep 5
curl -X POST -H "Cookie: jetpack=6af5e293139d989bdcfd66257b4f5327" -H "Content-Type: application/json" -d #wifi.json http://192.168.1.1/cgi-bin/settings_admin_password.cgi
This is not correct:
echo "{"Command":"SignIn","Password":"$key"}" > sign_on.json
The double quotes are not being put literally into the file, they're just terminating the shell string beginning with the previous double quote. So this is writing
{Command:SignIn,Password:keyvalue}
into the file, with no double quotes. You need to escape the nested double quotes.
echo "{\"Command\":\"SignIn\",\"Password\":\"$key\"}" > sign_on.json
However, it would be best if you used the jq utility instead of formatting JSON by hand. See Create JSON file using jq.
jq -nc --arg key "$key" '{"Command":"SignIn","Password":$key}' >sign_on.json
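The same approach works for the other two files; a sketch, reusing the field values from the original script:
jq -nc --arg key "$key" \
    '{"CurrentPassword":$key,"NewPassword":"G1l4River4dm1n","SecurityQuestion":"NameOfStreet","SecurityAnswer":"Allison"}' \
    > change_admin.json
jq -nc '{"SSID":"GRTI Jetpack","WiFiPassword":"G1l4River3r","WiFiMode":0,"WiFiAuthentication":6,"WiFiEncription":4,"WiFiChannel":0,"MaxConnectedDevice":8,"PrivacySeparator":false,"WMM":true,"Command":"SetWifiSetting"}' \
    > wifi.json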
I have a simple bash file as below
#!/bin/bash
net=$(curl -s -H "Content-Type: application/json" -H "X-Auth-Token: $token" -d '{"network": {"name": "net1"}}' http://10.1.10.146:18090/network/v2.0/networks 2>&1 | awk '/id/{print $1}' | jq -r .network.id)
echo $net
Running this file gives me an error as below
parse error: Invalid numeric literal at line 2, column 0
I tried making the changes suggested in these links:
https://unix.stackexchange.com/questions/354943/setting-jq-output-to-a-bash-variable
Working with Bash and cURL
but nothing helped me, and I am unable to figure out where I am going wrong. Let me know the reason for the error and possible changes.
The curl output for the command
curl -s -H "Content-Type: application/json" -H "X-Auth-Token: $token" -d '{"network": {"name": "net1"}}' http://10.1.10.146:18090/network/v2.0/networks
output:
{"network":{"status":"ACTIVE","router:external":false,"availability_zone_hints":[],"availability_zones":[],"description":"","subnets":[],"shared":false,"tenant_id":"d0e75710820c401db3291ac6278f326f","created_at":"2018-05-15T07:37:42Z","tags":[],"ipv6_address_scope":null,"mtu":1450,"updated_at":"2018-05-15T07:37:42Z","admin_state_up":true,"revision_number":2,"ipv4_address_scope":null,"is_default":false,"port_security_enabled":true,"project_id":"d0e75710820c401db3291ac6278f326f","id":"1548df56-a35b-4232-9550-54a3c2266d60","name":"net1"}}
The idea is to get only the id from the output and store it in a bash variable. To get the id I used the command below:
curl -s -H "Content-Type: application/json" -H "X-Auth-Token: $token" -d '{"network": {"name": "net1"}}' http://10.1.10.146:18090/network/v2.0/networks 2>&1 | awk '/id/{print $1}' | jq -r .network.id
output:
be831582-90c1-499c-875f-9c0b0d1969a6
I have also tried removing the awk and parsing the curl JSON response directly; the same error shows up.
Thanks in advance.
The "parse error" message appears because the 2>&1 redirects any STDERR message into jq, which cannot parse it. Compare the output from these commands:
> curl http://no.such.host/network/v2.0/networks 2>&1
curl: (6) Could not resolve host: no.such.host
> curl http://no.such.host/network/v2.0/networks 2>&1 | jq '.'
parse error: Invalid numeric literal at line 1, column 4
Here are some ideas:
Separate all of the piped commands into separate commands.
Try adding set -x near the top of the script to "debug" all of the commands.
Remove the 2>&1 because it is NOT helping you!
Add error handling to the separate commands.
We could help more if you provided the curl output. (We do not have HTTP access to the 10.1.10.146 host.)
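Putting those ideas together, a minimal sketch of the fix: drop the 2>&1 and the awk stage, and let jq read the response directly (same endpoint and token as in the question):
response=$(curl -s -H "Content-Type: application/json" -H "X-Auth-Token: $token" \
    -d '{"network": {"name": "net1"}}' http://10.1.10.146:18090/network/v2.0/networks)
# Parse only if the response is valid JSON; report the raw response otherwise.
net=$(jq -r '.network.id' <<< "$response") || echo "could not parse: $response" >&2
echo "$net"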
I have the following shell script. The issue is that I want to run the transactions in parallel/concurrently, without waiting for one request to finish before going to the next request. For example, if I make 20 requests, I want them all to be executed at the same time.
for ((request=1;request<=20;request++))
do
for ((x=1;x<=20;x++))
do
time curl -X POST "http://localhost:5000/example"
done
done
Any guide?
You can use xargs with the -P option to run any command in parallel:
seq 1 200 | xargs -n1 -P10 curl "http://localhost:5000/example"
This will run the curl command 200 times, with at most 10 jobs in parallel.
Using the xargs -P option, you can run any command in parallel:
xargs -I % -P 8 curl -X POST "http://localhost:5000/example" \
< <(printf '%s\n' {1..400})
This will run the given curl command 400 times, with at most 8 jobs in parallel.
Update 2020:
Curl can now fetch several websites in parallel:
curl --parallel --parallel-immediate --parallel-max 3 --config websites.txt
websites.txt file:
url = "website1.com"
url = "website2.com"
url = "website3.com"
This is an addition to @saeed's answer.
I faced an issue where it made unnecessary requests to the following hosts
0.0.0.1, 0.0.0.2 .... 0.0.0.N
The reason was that xargs was passing its input as arguments to the curl command. To prevent that, we can use the -I flag to specify which placeholder character to replace with the argument.
So we will use it as,
... xargs -I '$' command ...
Now xargs will substitute the argument wherever the literal $ is found, and if it is not found, the argument is not passed at all. So the final command becomes:
seq 1 200 | xargs -I $ -n1 -P10 curl "http://localhost:5000/example"
Note: if you are using $ in your command, try replacing it with some other character that is not being used.
Adding to @saeed's answer, I created a generic function that uses its arguments to fire a command N times with at most M jobs in parallel:
function conc(){
cmd=("${#:3}")
seq 1 "$1" | xargs -n1 -P"$2" "${cmd[#]}"
}
$ conc N M cmd
$ conc 10 2 curl --location --request GET 'http://google.com/'
This will fire 10 curl commands with a maximum parallelism of two.
Adding this function to your .bash_profile makes it easier to use.
Add “wait” at the end, and background them.
for ((request=1;request<=20;request++))
do
for ((x=1;x<=20;x++))
do
time curl -X POST "http://localhost:5000/example" &
done
done
wait
They will all output to the same stdout, but you can redirect the result of the time (and stdout and stderr) to a named file:
{ time curl -X POST "http://localhost:5000/example" ; } > "output.${x}.${request}.out" 2>&1 &
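Combined, a sketch of the full loop with per-request output files (the braces make the time report land in the file as well; the file naming is illustrative):
for ((request=1;request<=20;request++))
do
for ((x=1;x<=20;x++))
do
{ time curl -X POST "http://localhost:5000/example" ; } > "output.${x}.${request}.out" 2>&1 &
done
done
wait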
I wanted to share my example of how I utilised parallel xargs with curl.
The advantage of using xargs is that you can specify how many threads will be used to parallelise curl, rather than using curl with "&", which would schedule all of, say, 10000 curls simultaneously.
Hope it will be helpful to somebody:
#!/bin/sh
url=/any-url
currentDate=$(date +%Y-%m-%d)
payload='{"field1":"value1", "field2":{},"timestamp":"'$currentDate'"}'
threadCount=10
cat "$1" | \
xargs -P "$threadCount" -I {} curl -sw 'url= %{url_effective}, http_status_code = %{http_code},time_total = %{time_total} seconds \n' -H "Content-Type: application/json" -H "Accept: application/json" -X POST "$url" --max-time 60 -d "$payload"
The .csv file passed as $1 has one value per row that will be inserted in the JSON payload.
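For reference, a hypothetical invocation, assuming the script above is saved as parallel_post.sh and ids.csv holds one value per line:
./parallel_post.sh ids.csv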
Based on the solution provided by @isopropylcyanide and the comment by @Dario Seidl, I find this to be the best response as it handles both curl and httpie.
# conc N M cmd - fire (N) commands at a max parallelism of (M) each
function conc(){
cmd=("${#:3}")
seq 1 "$1" | xargs -I'$XARGI' -P"$2" "${cmd[#]}"
}
For example:
conc 10 3 curl -L -X POST https://httpbin.org/post -H 'Authorization: Basic dXNlcjpwYXNz' -H 'Content-Type: application/json' -d '{"url":"http://google.com/","foo":"bar"}'
conc 10 3 http --ignore-stdin -F -a user:pass httpbin.org/post url=http://google.com/ foo=bar
I have a script that runs as a daemon, listening to a file:
#!/bin/bash
echo '1'
while inotifywait -e close_write /home/homeassistant/.homeassistant/automations.yaml
do
echo 'automations'
curl -X POST -H "x-ha-access: pass" -H "Content-Type: application/json" http://hassbian.local:8123/api/services/automation/reload
done;
I wanted to listen to several files, and tried adding two more loops:
while inotifywait -e close_write /home/homeassistant/.homeassistant/groups.yaml
do
echo 'groups'
curl -X POST -H "x-ha-access: pass" -H "Content-Type: application/json" http://hassbian.local:8123/api/services/group/reload
done;
while inotifywait -e close_write /home/homeassistant/.homeassistant/core.yaml
do
echo 'core'
curl -X POST -H "x-ha-access: pass" -H "Content-Type: application/json" http://hassbian.local:8123/api/services/homeassistant/reload_core_config
done;
I realized that the first loop never exits, so the other ones never get started, but I'm not sure how I should solve this.
You need to run the first loop in a background process so that it doesn't block your script. You may want to run each loop in the background for symmetry, then wait on them at the end of the script.
while inotifywait -e close_write /home/homeassistant/.homeassistant/groups.yaml
do
echo 'groups'
curl -X POST -H "x-ha-access: pass" -H "Content-Type: application/json" http://hassbian.local:8123/api/services/group/reload
done &
while inotifywait -e close_write /home/homeassistant/.homeassistant/core.yaml
do
echo 'core'
curl -X POST -H "x-ha-access: pass" -H "Content-Type: application/json" http://hassbian.local:8123/api/services/homeassistant/reload_core_config
done &
wait
However, you can run inotifywait in monitor mode and monitor multiple files, piping its output into a single loop. (Caveat: like any line-oriented output format, this cannot cope with filenames containing newlines. See the --format and --csv options for dealing with filenames containing whitespace.)
files=(
/home/homeassistant/.homeassistant/groups.yaml
/home/homeassistant/.homeassistant/core.yaml
)
take_action () {
echo "$1"
curl -X POST "x-ha-access: pass" -H "Content-Type: application/json" \
http://hassbian.local:8123/api/services/"$2"
}
inotifywait -m -e close_write "${files[@]}" |
while read -r fname _; do
case $fname in
*/groups.yaml) take_action "groups" "group/reload" ;;
*/core.yaml) take_action "core" "homeassistant/reload_core_config" ;;
esac
done
I have a curl request like this :
curl -s -u $user:$password -X GET -H "Content-Type: application/json" $url
Which returns a json as response. So I will parse the response using jq to get some specific data. Like this :
curl -s -u $user:$password -X GET -H "Content-Type: application/json" $url | jq '<expression>'
Now if the curl request fails, the parsing operation obviously throws an ugly error, which I want to avoid. How can I store the response first and then parse it later, only if the request was successful? I don't want to display the whole JSON response. Also, if I add -w "%{http_code}" to my request, it appends the status code to the JSON response, which messes up the parsing. How do I solve this? I basically want to first check whether the curl request succeeded, then get the JSON response and parse it. I also want the status code, so that if the request fails I can display it; but right now the status code gets mixed in with the JSON response.
You can combine the --write-out and --fail options:
# separating the (verbose) curl options into an array for readability
curl_args=(
--write "%{http_code}\n"
--fail
--silent
--user "$user:$password"
--request GET
--header "Content-Type: application/json"
)
if ! output=$(curl "${curl_args[@]}" "$url"); then
echo "Failure: code=$output"
else
# remove the "http_code" line from the end of the output, and parse it
sed '$d' <<<"$output" | jq '...'
fi
Also note: quote your variables!
I found glenn jackman's answer good, but a bit confusingly written, so I rewrote it, and altered it so I can use it as a safer alternative to curl | jq.
#!/bin/bash
# call this with normal curl arguments, especially url argument, e.g.
# safecurl.sh "http://example.com:8080/something/"
# separating the (verbose) curl options into an array for readability
curl_args=(
-H 'Accept:application/json'
-H 'Content-Type:application/json'
--write-out '\n%{http_code}\n'
--fail
--silent
)
echo "${curl_args[#]}"
# prepend some arguments, but pass on whatever arguments this script was called with
output=$(curl "${curl_args[@]}" "$@")
return_code=$?
if [ 0 -eq $return_code ]; then
# remove the "http_code" line from the end of the output, and parse it
echo "$output" | sed '$d' | jq .
else
# echo to stderr so further piping to jq will process empty output
>&2 echo "Failure: code=$output"
fi
Note: this code does not test for services that ignore the requested content type and respond with HTML. You'd need to test for that, for example with grep -i '</html>'.
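A minimal sketch of such a check, to run before handing the body to jq (the pattern match is a heuristic, not a guarantee):
body=$(echo "$output" | sed '$d')
# Heuristic: if the body looks like HTML, don't hand it to jq.
if grep -qi '</html>' <<< "$body"; then
    >&2 echo "Failure: service responded with HTML, not JSON"
else
    jq . <<< "$body"
fi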