I am using the curl command below to restart connectors/tasks. I have scheduled it in crontab. The purpose of the command is to restart any connector/task whose state is FAILED. I would like to incorporate an "if" check so that it attempts the restart only 3 times; after 3 attempts it has to stop restarting the connector/task and send an email.
curl -s "http://localhost:8083/connectors?expand=status" | \
jq -c -M 'map({name: .status.name } + {tasks: .status.tasks}) | .[] | {task: ((.tasks[]) + {name: .name})} | select(.task.state=="FAILED") | {name: .task.name, task_id: .task.id|tostring} | ("/connectors/"+ .name + "/tasks/" + .task_id + "/restart")' | \
xargs -I{connector_and_task} curl -v -X POST "http://localhost:8083"\{connector_and_task\}
Could you please suggest a solution?
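One approach (a sketch, not a drop-in solution) is to keep a per-task attempt counter in a state directory and, once the counter reaches 3, stop restarting and send an email instead. The state directory, the alert address, and the `mail` invocation are all assumptions about your environment:

```shell
#!/bin/sh
# Sketch: cap restart attempts at 3 per connector/task, then alert by email.
# STATE_DIR, ALERT_EMAIL, and the `mail` command are hypothetical -- adjust.
STATE_DIR="${STATE_DIR:-/tmp/connect-restart-state}"
MAX_ATTEMPTS=3
ALERT_EMAIL="ops@example.com"   # hypothetical address

mkdir -p "$STATE_DIR"

# Succeeds (and bumps the counter) while attempts remain; fails once exhausted.
should_restart() {
    f="$STATE_DIR/$(printf '%s' "$1" | tr '/' '_')"
    n=$(cat "$f" 2>/dev/null || echo 0)
    [ "$n" -ge "$MAX_ATTEMPTS" ] && return 1
    echo $((n + 1)) >"$f"
}

# Feed it the restart paths produced by the jq filter above, e.g.:
#   curl -s "http://localhost:8083/connectors?expand=status" | jq -r '...' |
#   while read -r path; do
#       if should_restart "$path"; then
#           curl -s -X POST "http://localhost:8083${path}"
#       else
#           echo "Giving up on $path after $MAX_ATTEMPTS attempts" |
#               mail -s "Connect restart exhausted: $path" "$ALERT_EMAIL"
#       fi
#   done
```

You would also want a step that clears a task's counter once it is healthy again; otherwise a task that fails a second time weeks later would never be retried.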
I have a file with a bunch of output from some performance tests. It looks similar to the following:
index | master | performance-fix | change %
--- | --- | --- | ---
load | 26212.8 | 28223.6 | 7.67%
type | 67.5 | 75.41 | 11.72%
minType | 56.91 | 59.6 | 4.73%
maxInserterSearch | 185.45 | 283.25 | 52.74%
minInserterHover | 25.97 | 27.55 | 6.08%
maxInserterHover | 44.47 | 44.7 | 0.52%
I am trying to submit a new comment on a GitHub issue using that table data. Standard text works fine, but when I try to pass the table along I get the error:
{
"message": "Problems parsing JSON",
"documentation_url": "https://docs.github.com/rest/reference/issues#update-an-issue-comment"
}
My cURL request is as follows:
NEW_COMMENT=$(curl -sS \
-X PATCH \
-u $GH_LOGIN:$GH_AUTH_TOKEN \
-H "Accept: application/vnd.github.v3+json" \
"https://api.github.com/repos/$CIRCLE_PROJECT_USERNAME/$CIRCLE_PROJECT_REPONAME/issues/comments/$COMMENT_ID" \
-d '{"body": "Results: <br />'"$TEST_RESULTS"'"}')
I have also tried creating the {"body": ...} using jq, and using the --data-urlencode flag. Both return the same "Problems parsing JSON" error.
It looks like $TEST_RESULTS contains characters that break the JSON, such as quotation marks and newlines.
Escaping the payload with jq should help. Note that jq -Rs '.' emits a complete JSON string, surrounding quotes included, so splice it into the payload without adding extra quotation marks:
escaped="$(printf 'Results: <br />%s' "$TEST_RESULTS" | jq -Rs '.')"
... \
-d '{"body": '"$escaped"'}')
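A variant that avoids splicing shell strings into JSON altogether is to let jq build the whole payload object (a sketch; the sample data below stands in for the real test-results file):

```shell
# Sample data standing in for the real $TEST_RESULTS contents.
TEST_RESULTS='load | 26212.8 | 28223.6 | 7.67%
type | 67.5 | 75.41 | 11.72%'

# jq escapes quotes, newlines, etc. while constructing the object itself.
payload=$(jq -n --arg body "Results: <br />$TEST_RESULTS" '{body: $body}')
echo "$payload"
# then pass it on with: curl ... -d "$payload"
```

Because jq performs all the escaping, the payload is valid JSON no matter what characters the results file contains.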
I'm getting some values with a jq command like this:
curl xxxxxx | jq -r '.[] | ["\(.job.Name), \(.atrib.data)"] | @tsv' | column -t -s ","
It gives me:
AAAA PENDING
ZZZ FAILED BAD
What I want is a first field with a sequential number (1, 2, ...), like this:
1 AAA PENDING
2 ZZZ FAILED BAD
......
Do you know if it's possible? Thanks!
One way would be to start your pipeline with:
range(0;length) as $i | .[$i]
You can then use $i in the remainder of the program.
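To illustrate the technique, here it is applied to an inline sample array shaped like the question's data (the values themselves are made up):

```shell
# Sample array shaped like the question's data (values are made up).
rows='[{"job":{"Name":"AAAA"},"atrib":{"data":"PENDING"}},
       {"job":{"Name":"ZZZ"},"atrib":{"data":"FAILED BAD"}}]'

# range(0; length) emits each index; $i carries it through the pipeline.
numbered=$(echo "$rows" |
  jq -r 'range(0; length) as $i | .[$i]
         | [($i + 1 | tostring), .job.Name, .atrib.data] | @tsv')
printf '%s\n' "$numbered"
```

This prints the rows tab-separated with a leading 1 and 2, ready for column -t as in the original pipeline.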
Can anyone help me understand how I can print countryCode followed by connectionName and load (with a percentage symbol), all on one line and nicely formatted, using only jq, not sed, column, or any other external Unix command? I cannot seem to print anything other than a single column.
curl --silent "https://api.surfshark.com/v3/server/clusters" | jq -r -c "map(select(.countryCode == "US" and .load <= "99")) | sort_by(.load) | limit(20;.[]) | [.countryCode, .connectionName, .load] | (.[1])
Is this what you wanted?
curl --silent "https://api.surfshark.com/v3/server/clusters" |
jq -r -c 'map(select(.countryCode == "US" and .load <= 99)) |
sort_by(.load) |
limit(20;.[]) |
"\(.countryCode) \(.connectionName) \(.load)%"'
I am getting a list of values as below using the curl command:
curl -s http://internal.registry.com/v2/_catalog | jq -r '.repositories[0:5] | to_entries | map( .value )[]'
Output:
centos
containersol/consul-server
containersol/mesos-agent
containersol/mesos-master
cybs/address-api
I want to make sure the output does not have the prefix cybs/ in it; for example, cybs/address-api should just be address-api.
Just use sub:
curl ... | jq -r '.repositories[0:5][] | sub("^cybs/"; "")'
Also note that to_entries | map( .value ) is a no-op and should be removed.
Output:
centos
containersol/consul-server
containersol/mesos-agent
containersol/mesos-master
address-api
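Because the pattern is anchored with ^, sub leaves names without the prefix untouched, as a standalone check with jq's raw-input mode shows (-R reads each input line as a string; the two sample names are made up):

```shell
# Two sample names: one with the prefix, one without.
stripped=$(printf '%s\n' centos cybs/address-api | jq -Rr 'sub("^cybs/"; "")')
printf '%s\n' "$stripped"
```

Only the cybs/-prefixed name is rewritten; centos passes through unchanged.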
If I use shell:
ps -eaf | grep groovy
I can get such output:
[root@test www]# ps -eaf | grep groovy
root 924 539 1 03:15 pts/0 00:00:05 /usr/java/jdk1.6.0_31/bin/java -classpath
/root/dev/groovy-1.8.8/lib/groovy-1.8.8.jar -Dscript.name=./groovysh -Dprogram.name=groovysh
-Dgroovy.starter.conf=/root/dev/groovy-1.8.8/conf/groovy-starter.conf
-Dgroovy.home=/root/dev/groovy-1.8.8 -Dtools.jar=/usr/java/jdk1.6.0_31/lib/tools.jar
org.codehaus.groovy.tools.GroovyStarter --main org.codehaus.groovy.tools.shell.Main
--conf /root/dev/groovy-1.8.8/conf/groovy-starter.conf --classpath .
root 1127 562 0 03:20 pts/1 00:00:00 grep groovy
[root@test www]#
But if I run this command in groovy:
proc = "ps -eaf | grep groovy".execute()
proc.waitFor() // => return 1
proc.in.text // => return ""
proc.err.text // => see following
proc.err.text contains the usage text of the ps command:
ERROR: Garbage option.
********* simple selection ********* ********* selection by list *********
-A all processes -C by command name
-N negate selection -G by real group ID (supports names)
-a all w/ tty except session leaders -U by real user ID (supports names)
-d all except session leaders -g by session OR by effective group name
-e all processes -p by process ID
T all processes on this terminal -s processes in the sessions given
But if I run plain ps -eaf it works correctly.
It seems | can't be used; is that true? How can I fix it?
Yeah, you can't use shell output piping and redirection like that.
One option is to do:
Process ps = 'ps -eaf'.execute()
Process gr = 'grep groovy'.execute()
Process all = ps | gr
println all.text
The other is to wrap it in a new shell using the List form of execute:
println( [ 'sh', '-c', 'ps -eaf | grep groovy' ].execute().text )