Commit list from Git is always processed completely per loop instead of per element - bash

I have a problem with my bash script. For each JSON file pushed to Git, I run a procedure against an API via curl. In principle the procedure works, but in every loop iteration the complete commit list is run against the API.
So if I push two JSON files to Git, each JSON file is executed twice against the API instead of once.
Example:
Git Push:
file1.json
file2.json
file3.json
file4.json
file5.json
Execution:
file1.json file1.json file1.json file1.json file1.json
file2.json file2.json file2.json file2.json file2.json
file3.json file3.json file3.json file3.json file3.json
file4.json file4.json file4.json file4.json file4.json
file5.json file5.json file5.json file5.json file5.json
The expectation would be that each file is processed only once.
I tried to solve the issue using arrays, but apparently it doesn't work as expected.
Here is the actual function from the code:
#!/bin/bash
# Create an empty array to store processed files
processed_files=()
# Login
endpoint=xxx
username=xxx
password=xxx
# Get list of files in the commit
files=$(git diff --name-only HEAD~1 HEAD)
# Test each file that is a JSON
for file in $files
do
    if [[ $file == *.json ]]
    then
        # Check if the file has already been processed
        if [[ ! " ${processed_files[@]} " =~ " ${file} " ]]
        then
            # Add the file to the array
            processed_files+=("$file")
            echo "Jobexecution"
            curl -k -s -H "Authorization: xxx" -X POST -F "definitionsFile=@../$file" "$endpoint/deploy"
            submit=$(curl -k -s -H "Authorization: xxx" -X POST -F "jobDefinitionsFile=@../$file" "$endpoint/run")
            runid=$(echo ${submit##*runId\" : \"} | cut -d '"' -f 1)
            # Check job status
            jobstatus=$(curl -k -s -H "Authorization: xxx" "$endpoint/run/status/$runid")
            status=$(echo ${jobstatus##*status\" : \"} | cut -d '"' -f 1)
            # Wait till jobs ended
            echo "Wait till jobs ended"
            until [[ $status == Ended* ]]; do
                sleep 10
                tmp=$(curl -k -s -H "Authorization: xxx" "$endpoint/run/status/$runid")
                echo $tmp | grep 'Not OK' >/dev/null && exit 2
                tmp2=$(echo ${tmp##*$'\"type\" : \"Folder\",\\n'})
                status=$(echo ${tmp2##*\"status\" : \"} | cut -d '"' -f 1)
            done
        else
            echo "Job was already executed. I'll skip this one."
        fi
    fi
done
# Logout
curl -k -s -H "Authorization: xxx" -X POST "$endpoint/session/logout"
# Exit
if [[ $status == *Not* ]]; then
    echo 'Job failed!'
    exit 1
else
    echo 'Success!'
    exit 0
fi
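Incidentally, the array-membership pattern itself can be checked offline; note the expansion must be ${processed_files[@]} (all elements), never ${processed_files[#]}. A self-contained sketch (the helper function name is my own):

```shell
# Offline check of the "already processed?" test used in the script.
processed_files=("file1.json" "file2.json")

# Return success if the first argument is already in processed_files.
contains() {
    local needle="$1"
    # Join all elements with spaces and look for the needle as a whole word.
    [[ " ${processed_files[*]} " == *" ${needle} "* ]]
}
```

This avoids a live Git repository entirely, so the deduplication logic can be verified on its own.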

I solved the problem. The issue was that the Jenkins pipeline always passed the whole commit list for each JSON file.
To solve it, I now execute the bash script in the Jenkins pipeline with an argument, which is the current JSON file of the loop.
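A minimal sketch of that approach, assuming Jenkins calls the script once per changed file (the function name and calling convention are my own, not from the original pipeline):

```shell
#!/bin/bash
# Sketch: process exactly one JSON file, passed as $1 by the Jenkins
# pipeline, instead of re-deriving the commit list inside the script.
process_json() {
    local file="$1"
    if [[ "$file" != *.json ]]; then
        echo "skipping $file (not a .json file)" >&2
        return 1
    fi
    echo "Jobexecution for $file"
    # ... the curl deploy/run/status calls from the script above go here ...
    return 0
}

# Entry point: one file per invocation.
if [[ -n "${1:-}" ]]; then
    process_json "$1"
fi
```

On the Jenkins side, the loop then becomes something like `for f in $CHANGED_FILES; do ./deploy.sh "$f"; done`, where CHANGED_FILES and deploy.sh are placeholder names.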

Related

How do I check the HTTP status code and also parse the payload

Imagine I have the following code in a bash script:
curl -s https://cat-fact.herokuapp.com/facts/random?animal=cat | jq .
Notice that I wish to display the payload of the response by passing it to jq.
Now suppose those curls sometimes return a 404; in such cases my script currently still succeeds, so what I need to do is check the status code and exit 1 as appropriate (e.g. for a 404 or 503). I've googled around and found https://superuser.com/a/442395/722402 which suggests --write-out "%{http_code}" might be useful; however, that simply prints the http_code after printing the payload:
$ curl -s --write-out "%{http_code}" https://cat-fact.herokuapp.com/facts/random?animal=cat | jq .
{
"_id": "591f98783b90f7150a19c1ab",
"__v": 0,
"text": "Cats and kittens should be acquired in pairs whenever possible as cat families interact best in pairs.",
"updatedAt": "2018-12-05T05:56:30.384Z",
"createdAt": "2018-01-04T01:10:54.673Z",
"deleted": false,
"type": "cat",
"source": "api",
"used": false
}
200
What I actually want is to still output the payload, but also be able to check the HTTP status code and fail accordingly. I'm a bash noob, so I'm having trouble figuring this out. Help please?
I'm using a Mac by the way, not sure if that matters or not (I'm vaguely aware that some commands work differently on Mac)
Update: I've pieced this together, which sort of works, I think. It's not very elegant though; I'm looking for something better.
func() {
echo "${@:1:$#-1}";
}
response=$(curl -s --write-out "%{http_code}" https://cat-fact.herokuapp.com/facts/random?animal=cat | jq .)
http_code=$(echo $response | awk '{print $NF}')
func $response | jq .
if [ $http_code == "503" ]; then
echo "Exiting with error due to 503"
exit 1
elif [ $http_code == "404" ]; then
echo "Exiting with error due to 404"
exit 1
fi
What about this? It uses a temporary file. It seems a bit complicated to me, but it separates the content.
# copy/paste doesn't work with the following
curl -s --write-out \
"%{http_code}" https://cat-fact.herokuapp.com/facts/random?animal=cat | \
tee test.txt | \ # split output to file and stdout
sed -e 's-.*\}--' | \ # remove everything before last '}'
grep 200 && \ # try to find string 200, only in success next step is done
echo && \ # a new-line to juice-up the output
cat test.txt | \ #
sed 's-}.*$-}-' | \ # removes the last line with status
jq # format json
Here a copy/paste version
curl -s --write-out "%{http_code}" https://cat-fact.herokuapp.com/facts/random?animal=cat | tee test.txt | sed -e 's-.*\}--' | grep 200 && echo && cat test.txt | sed 's-}.*$-}-' | jq
This is my attempt. Hope it works for you too.
#!/bin/bash
result=$( curl -i -s 'https://cat-fact.herokuapp.com/facts/random?animal=cat' )
status=$( echo "$result" | grep -E '^HTTP/[0-9.]+ [0-9]{3}' | grep -oE ' [0-9]{3}' | tr -d ' ' )
payload=$( echo "$result" | sed -n '/^\s*$/,//{/^\s*$/ !p}' )
echo "STATUS : $status"
echo "PAYLOAD : $payload"
Output
STATUS : 200
PAYLOAD : {"_id":"591f98803b90f7150a19c23f","__v":0,"text":"Cats can't taste sweets.","updatedAt":"2018-12-05T05:56:30.384Z","createdAt":"2018-01-04T01:10:54.673Z","deleted":false,"type":"cat","source":"api","used":false}
AWK version
payload=$( echo "$result" | awk '{ if( $0 ~ /^\s*$/ ){ c_p = 1 ; next; } if (c_p) { print $0} }' )
Regards!
EDIT : I have simplified this even more by using the -i flag
EDIT II : Removed empty line from payload
EDIT III : Included an awk method to extract the payload in case sed is problematic
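The header/body split described above can be sanity-checked without a live request by feeding a canned curl -i style response through the same idea (the sample response is made up; real responses use CRLF line endings, which is why the carriage returns are stripped first):

```shell
# Canned response, as "curl -i" would return it (CRLF line endings).
result=$'HTTP/1.1 200 OK\r\nContent-Type: application/json\r\n\r\n{"ok":true}'

# Status code: second field of the first line.
status=$( printf '%s\n' "$result" | head -n 1 | awk '{print $2}' )

# Payload: everything after the first blank line. CRs are removed first so
# the blank-line match works with both GNU and BSD tools.
payload=$( printf '%s\n' "$result" | tr -d '\r' | awk 'body { print } /^$/ { body = 1 }' )
```

The awk variant avoids the \s escape, which is a GNU extension and can be a problem on macOS.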
Borrowing from here you can do:
#!/bin/bash
result=$(curl -s --write-out "%{http_code}" https://cat-fact.herokuapp.com/facts/random?animal=cat)
http_code="${result: -3}"
response="${result:0:${#result}-3}"
echo "Response code: " $http_code
echo "Response: "
echo $response | jq
Where
${result: -3} takes the last 3 characters of the string (from index -3 to the end). ${result: -3:3} (index -3, length 3) would also work.
${#result} gives us the length of the string.
${result:0:${#result}-3} takes result from the beginning up to the end minus the 3 characters of the HTTP status code.
The site cat-fact.herokuapp.com isn't working any more, so I had to test it with another site.
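Since the site is down anyway, the slicing arithmetic can be verified offline with a canned "body plus status code" string in place of the live curl call:

```shell
# Pretend this came from: curl -s --write-out "%{http_code}" ...
# i.e. the JSON body immediately followed by the 3-digit status code.
result='{"ok":true}200'

http_code="${result: -3}"              # last 3 characters
response="${result:0:${#result}-3}"    # everything before them
```

Note the space in `${result: -3}` is required; without it bash would read `:-` as the use-default operator.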

Bash Find File check IF and execute

I have these files:
1.json2 - 2.json2 - 3.json2
1.json2.ml - 2.json2.ml
Example .ml files:
1.json2.ml
{"message":"Validation error","error":"validation_error",...
2.json2.ml
{"Ok":"OK":"OK"...}
I want to check, for each *.json2 file, whether the matching *.json2.ml exists. If it doesn't, execute a POST and save the response. If it does exist, check whether it contains an error and, in that case, execute the POST again.
Here is the code I use for this:
find . -type f -name '*.json2' | xargs bash -c 'for fname
do
    if [ ! -e ${fname}.ml ]
    then
        curl -X POST -H "Content-Type: application/json" -d @${fname} https://web/api/post > ${fname}.ml
    else
        sed '1d' ${fname}.ml | while read line
        do
            FS=',' read pid pname
            if [ "$var" -e ""error":"validation_error"" ]
            then
                curl -X POST -H "Content-Type: application/json" -d @${fname} https://web/api/post > ${fname}.ml
                echo que ha y $pname
            fi
        done
' bash
I get this result:
syntax error near unexpected token `fi'
The expected result is:
1 - POST 3.json2 (the .ml file doesn't exist)
2 - POST 1.json2 (the .ml file exists and contains an error)
3 - do nothing for 2.json2 (its .json2.ml is OK)
I found the solution with this code:
else
    VAR1=$(head -n 1 ${fname}.ml)
    IFS="," read -ra images <<< "$VAR1"
    echo que ha y $images
Thanks a lot to Barmar and William.
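For reference, the intended decision logic can be written as a small function and exercised against dummy files (the function name and layout are my own sketch; the error string is the one from the question):

```shell
# Decide whether a .json2 file needs to be POSTed:
#   no .ml result file yet                -> POST needed
#   .ml file contains validation_error    -> POST needed again
#   otherwise                             -> skip
needs_post() {
    fname="$1"
    [ -e "${fname}.ml" ] || return 0
    grep -q '"error":"validation_error"' "${fname}.ml" && return 0
    return 1
}

# Dummy files mirroring the question's layout.
workdir=$(mktemp -d)
cd "$workdir" || exit 1
printf '%s' '{"message":"Validation error","error":"validation_error"}' > 1.json2.ml
printf '%s' '{"Ok":"OK"}' > 2.json2.ml
```

Keeping the test in a function avoids the nested-quoting problem that caused the original syntax error inside the single-quoted bash -c string.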

How to send TCPDUMP output to remote script via CURL

So I have a script here that captures tcpdump output. We are trying to send two variables to a PHP script over the web ($SERVER). The file header is created and contains both $FILETOSEND (the filename) and the file data. The actual data for the filedata variable comes from a file called 1 (the data is formatted, as you can tell). I am having issues with the section labelled "# send common 10 secs dump".
I am trying to curl the file 1, which I do with curl --data "$(cat 1)" $SERVER
The script isn't sending the file 1 at all; it mostly just sends the filename and no file data. Is there a problem with the way I am sending the file? Is there a better way to format it?
while true; do
    sleep $DATASENDFREQ
    killall -9 tcpdump &> /dev/null
    if [ -e $DUMP ]; then
        mv $DUMP $DUMP_READY
    fi
    create_dump
    DATE=$(date +"%Y-%m-%d_%H-%M-%S")
    FILETOSEND=$MAC-$DATE-$VERSION
    # we write the file header to the file. 2 vars: filename, filedata.
    FILEHEADER="filename=$FILETOSEND&filedata="
    echo $FILEHEADER > 2
    # change all colons to underscores to avoid Windows filename issues
    sed -i 's/:/_/g' 2
    # delete all newlines \n in the file
    tr -d '\n' < 2 > 1
    # parsing $DUMP_READY to awk.txt (no header in awk.txt)
    awk '{ if (NF > 18 && $10 == "signal") {print "{\"mac\": \""$16"\",\"sig\": \""$9"\",\"ver\": \""$8"\",\"ts\": \""$1"\",\"ssid\": \""$19"\"}" }}' $DUMP_READY > awk.txt
    sed -i 's/SA://g' awk.txt
    sed -i 's/&/%26/g' awk.txt
    cat awk.txt >> 1
    sync
    # send $OFFLINE
    if [ -e $OFFLINE ]; then
        curl -d $OFFLINE $SERVER
        if [ $? -eq "0" ]; then
            echo "status:dump sent;msg:offline dump sent"
            rm $OFFLINE
        else
            echo "status:dump not sent;msg:offline dump not sent"
        fi
    fi
    # send common 10 secs dump
    curl --data "$(cat 1)" $SERVER
    if [ $? -eq "0" ]; then
        echo "status:dump sent"
    else
        cat 1 >> $OFFLINE
        echo "status:dump not sent"
    fi
    if [ -e $DUMP_READY ]; then
        rm -f $DUMP_READY 1 2 upload_file*
    fi
done
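A likely culprit: --data "$(cat 1)" sends the dump raw, so any unescaped special character inside the payload (the script only escapes & in awk.txt, not in the whole of file 1) can terminate the filedata field early. curl can do the encoding itself with --data-urlencode "filedata@1" combined with --data "filename=$FILETOSEND" (curl joins the fields with &). To illustrate what that encoding does, here is a tiny pure-bash percent-encoder; this is my own sketch, assumes ASCII input, and is not part of the original script:

```shell
# Percent-encode a string roughly the way curl's --data-urlencode would,
# so characters like '&', '=' and spaces cannot break the form field apart.
urlencode() {
    local s="$1" out="" c i
    for (( i = 0; i < ${#s}; i++ )); do
        c="${s:i:1}"
        case "$c" in
            # Unreserved characters pass through unchanged.
            [a-zA-Z0-9.~_-]) out+="$c" ;;
            # Everything else becomes %XX from the character's byte value.
            *) out+=$(printf '%%%02X' "'$c") ;;
        esac
    done
    printf '%s' "$out"
}
```

In the script itself the simpler change is to let curl do this: curl --data "filename=$FILETOSEND" --data-urlencode "filedata@1" $SERVER (untested against your PHP endpoint).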

Curl echo Server response Bash

I'm trying to create a bash script that checks the status code of each URL from a list and echoes the server name from the response header. I'm actually new to this.
#!/bin/bash
while read LINE; do
    curl -o /dev/null --silent --head --write-out '%{http_code}' "$LINE"
    echo " $LINE" &
    curl -I /dev/null --silent --head | grep -Fi Server "$SERVER"
    echo " $SERVER"
done < dominios-https
I get the following output
301 http://example.com
grep: : No such file or directory
1) while read LINE cannot read the last line if the text file doesn't end with a newline.
2) You don't set "$SERVER" anywhere, and grep complains about it.
3) Not all servers return "Server:" in their headers.
Try this:
scriptDir=$( dirname -- "$0" )
for siteUrl in $( < "$scriptDir/myUrl.txt" )
do
    if [[ -z "$siteUrl" ]]; then break; fi # stop if the line is empty
    httpCode=$( curl -I -o /dev/null --silent --head --write-out '%{http_code}' "$siteUrl" )
    echo "HTTP_CODE = $httpCode"
    headServer=$( curl -I --silent --head "$siteUrl" | grep "Server" | awk '{print $2}' )
    echo "Server header = $headServer"
done
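The header-parsing step above can be checked offline by feeding a canned response (made up here) through the same grep/awk pipeline; a tr -d '\r' is worth adding because real HTTP headers end in CRLF, which would otherwise leave a stray carriage return on the extracted value:

```shell
# Canned "curl -I" output with CRLF endings, as a real server would send it.
headers=$'HTTP/1.1 301 Moved Permanently\r\nServer: nginx\r\nLocation: https://example.com/\r\n'

# Server name: the Server: header's second field, carriage returns stripped.
headServer=$( printf '%s' "$headers" | tr -d '\r' | grep -i '^Server:' | awk '{print $2}' )

# Status code: second field of the status line.
httpCode=$( printf '%s' "$headers" | head -n 1 | awk '{print $2}' )
```

Anchoring the grep with ^Server: and matching case-insensitively also avoids false hits when "Server" appears in some other header's value.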

How to get the result of a command in a HERE_doc in Bash

I'm using the at command to schedule a job in the future.
DoCurlAt () {
    if [ -n "${AuthToken:-}" ] ; then
        $4 << 'EOF'
curl -s -H "${AuthHeader:-}" -H "$1" --data-urlencode "$2" "$3"
EOF
        Exitcode=$?
    fi
    WriteLog Output Info "AT Output: $AtOutput Exitcode: $Exitcode"
}
How can I capture the result of the at in a variable called $AtOutput?
I tried with
AtOutput=$(bash $4 << EOF
curl -s -H "${AuthHeader:-}" -H "$1" --data-urlencode "$2" "$3"
EOF
)
But that doesn't really give any result.
Also tried with:
AtOutput=$(curl -s -H "${AuthHeader:-}" -H "$1" --data-urlencode "$2" "$3" | at "$4")
But I would prefer to use the HERE-doc.
The function is called with
DoCurlAt "$AcceptJson" "argString=$ArgString" "$ApiUrl/$ApiVersion/job/$JobUid/run" "$OneTime"
$OneTime ($4) could be for example "at 15:19 today" The output is mostly something like this:
job 7 at 2016-08-16 15:30
at writes to standard error, not standard output. Use the 2>&1 redirection to copy standard error to standard output first.
$ at_output=$( echo "cmd" | at "$when" 2>&1 )
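The 2>&1 pattern can be checked without at installed by substituting a stand-in command that, like at, reads a job from stdin and writes its confirmation to stderr (the confirmation text below is the sample output from the question):

```shell
# Stand-in for:  echo "cmd" | at "$when" 2>&1
# The inner command consumes stdin and, like at, prints its confirmation
# to stderr; 2>&1 folds that into the captured output.
at_output=$( echo "cmd" | sh -c 'cat > /dev/null; echo "job 7 at 2016-08-16 15:30" >&2' 2>&1 )
```

Without the 2>&1, at_output would be empty, which is exactly the symptom described above.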
