Using jq in GitLab CI YAML - shell

I have a JSON file, sample.json. Below is a snippet from sample.json:
{
  "AddOnModules": {
    "Description": "add on modules",
    "Type": "Array",
    "AllowedValues": [
      "a",
      "b",
      "c"
    ],
    "value": []
  }
}
I'm trying to provide a value for AddOnModules through a GitLab CI variable (parameter value) at runtime when running the pipeline. The following is a snippet of the pipeline:
stages:
  - deploy

# Job to deploy for development
dev-deploy:
  variables:
  before_script:
    - apk add jq
  image: python:3.7.4-alpine3.9
  script:
    - tmp=$(mktemp)
    - jq -r --arg add_on_modules "$add_on_modules" '.AddOnModules.value |= .+ [$add_on_modules] ' sample.json > "$tmp" && mv "$tmp" sample.json
    - cat sample.json
  stage: deploy
  tags:
    - docker
    - linux
  only:
    variables:
      - $stage =~ /^deploy$/ && $deployment_mode =~ /^dev$/
I'm setting the variable add_on_modules to "a","b" through GitLab CI when running the pipeline. After cat sample.json, the file looks like this:
{
  "AddOnModules": {
    "Description": "add on modules",
    "Type": "Array",
    "AllowedValues": [
      "a",
      "b",
      "c"
    ],
    "value": ["\"a\",\"b\""]
  }
}
Extra double quotes are being prepended and appended, while the existing ones are escaped.
I want output something like this:
{
  "AddOnModules": {
    "Description": "add on modules",
    "Type": "Array",
    "AllowedValues": [
      "a",
      "b",
      "c"
    ],
    "value": ["a","b"]
  }
}
It looks like I'm missing something with jq:
- jq -r --arg add_on_modules "$add_on_modules" '.AddOnModules.value |= .+ [$add_on_modules] ' sample.json > "$tmp" && mv "$tmp" sample.json
I tried the -r/--raw-output flag with jq, but had no success. Any suggestions on how to solve this?
This is how I'm running the pipeline:
(screenshot: Pipeline run)

If $add_on_modules is ["a","b"]
If you can set add_on_modules to the textual representation of a JSON array, then you would use --argjson, like so:
add_on_modules='["a","b"]'
jq -r --argjson add_on_modules "$add_on_modules" '
.AddOnModules.value += $add_on_modules
' sample.json
If $add_on_modules is the string "a","b"
add_on_modules='"a","b"'
jq -r --arg add_on_modules "$add_on_modules" '
.AddOnModules.value += ($add_on_modules|split(",")|map(fromjson))
' sample.json
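Wired into the pipeline from the question, the first variant would look like this; a minimal sketch, assuming add_on_modules is set to the JSON-array text ["a","b"] when the pipeline is run and keeping the question's mktemp/mv pattern:

script:
  - tmp=$(mktemp)
  # --argjson parses the variable as JSON, so the elements land in .value as real strings
  - jq --argjson add_on_modules "$add_on_modules" '.AddOnModules.value += $add_on_modules' sample.json > "$tmp" && mv "$tmp" sample.json
  - cat sample.json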

Related

How to include bash in YAML so we can pass the value of an output variable to a YAML file

The value of private_localaddress below needs to be passed to "range_start" in the YAML file. How can this be done?
Here is the script:
#!/bin/bash
MAC=`curl -s http://169.254.169.254/latest/meta-data/network/interfaces/macs/`
myarr=($MAC)
for val in "${myarr[@]}"; do
  interfaceindex=`curl -s http://169.254.169.254/latest/meta-data/network/interfaces/macs/$val/device-number`
  if [ "$interfaceindex" == 2 ]; then
    private_localaddress=`curl -s http://169.254.169.254/latest/meta-data/network/interfaces/macs/$val/local-ipv4s`
    echo "$private_localaddress"
  fi
done
Here is the YAML file:
---
apiVersion: "k8s.cni.cncf.io/v1"
kind: NetworkAttachmentDefinition
metadata:
  name: worker-private-eth2
spec:
  config: '{
    "cniVersion": "0.3.1",
    "ipam": {
      "type": "whereabouts",
      "range": "10.30.11.0/25",
      "range_start": $private_localaddress,
      "range_end": "10.30.11.127",
      "routes": [
        { "dst": "10.93.123.0/24", "gw": "10.30.11.1" }
      ],
Use a proper YAML parser. Here are some examples:
Using mikefarah/yq:
value="$private_localaddress" yq \
'.spec.config |= (fromjson | .ipam.range_start = strenv(value) | tojson)'
Using kislyuk/yq:
yq -y --arg value "$private_localaddress" \
'.spec.config |= (fromjson | .ipam.range_start = $value | tojson)'
Using itchyny/gojq:
gojq --arg value "$private_localaddress" --yaml-input --yaml-output \
'.spec.config |= (fromjson | .ipam.range_start = $value | tojson)'
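To apply any of these to the file itself, yq can edit in place; a minimal sketch, assuming mikefarah/yq v4 and a hypothetical file name nad.yaml:

# the value is read via strenv(value) inside the filter; -i rewrites nad.yaml in place
value="$private_localaddress" yq -i \
  '.spec.config |= (fromjson | .ipam.range_start = strenv(value) | tojson)' nad.yaml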
Perhaps the simplest would be to make private_localaddress an environment variable and use envsubst to apply the changes to your YAML.
However, to end up with correct YAML, you need to ensure that there is a quoted string to the right of the colon. The best case would be if you can already ensure that your YAML is written as
"range_start": "$private_localaddress",
If this is not feasible, you have to add the quotes when creating the variable, i.e.
export private_localaddress=\"$(curl -s http://169.254.169.254/latest/meta-data/network/interfaces/macs/$val/local-ipv4s)\"
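With the variable exported like that, the substitution itself is a one-liner; a minimal sketch, assuming the manifest is kept as a hypothetical template file nad.template.yaml:

# restrict envsubst to the one variable so any other $ text in the template is left untouched
envsubst '$private_localaddress' < nad.template.yaml > nad.yaml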

Can't set different json values with different values

Linux Mint 20.2
Here is report.empty.json:
{
  "total": 0,
  "project": "",
  "Severity": [],
  "issues": []
}
I want to set "total" to 500 (an int value) and "project" to "MY_PROJECT".
To do this I use the tool jq.
Here is my bash script file:
#!/bin/bash
readonly PROJECT_KEY=MY_PROJECT
readonly PAGE_SIZE=500
jq --arg totalArg "$PAGE_SIZE" '.total = $totalArg' report.empty.json > report.json
jq --arg projectKey "${PROJECT_KEY}" '.project = $projectKey' report.empty.json > report.json
echo "Done"
But it only sets the key project. The key total is not changed.
Content of the file report.json:
{
  "total": 0,
  "project": "MY_PROJECT",
  "Severity": [],
  "issues": []
}
But I need to update BOTH KEYS.
The result must be:
{
  "total": 500,
  "project": "MY_PROJECT",
  "Severity": [],
  "issues": []
}
The second command reads from report.empty.json instead of the already-modified report.json.
You could chain the jq invocations:
jq --arg totalArg "$PAGE_SIZE" '.total = $totalArg' report.empty.json |
jq --arg projectKey "${PROJECT_KEY}" '.project = $projectKey' >report.json
But a better solution is to use just one command.
jq --arg totalArg "$PAGE_SIZE" --arg projectKey "$PROJECT_KEY" '
.total = $totalArg | .project = $projectKey
' report.empty.json >report.json
My proposal for How to populate JSON values, using jq
Thinking about How to process arrays using jq, here is my modified version of your script. (Of course, you could keep empty.json out of the script.)
#!/bin/bash
declare -r projectKey=MY_PROJECT
declare -ir pageSize=500

declare -a issueList=()
declare -i issueCnt=0
declare issueStr='' jqCmd='.project = $projArg | .total = $totArg | .issues=[ '
declare promptMessage='Enter issue (or [return] if none): '

while read -rp "$promptMessage" issue && [ "$issue" ]; do
  promptMessage='Enter next issue (or [return] if no more): '
  issueCnt+=1
  issueList+=(--arg "is$issueCnt" "$issue")
  issueStr+="\$is$issueCnt, "
done

jqCmd+="${issueStr%, } ]"

jq --arg totArg "$pageSize" --arg projArg "$projectKey" \
   "${issueList[@]}" "( $jqCmd )" <<-EoEmptyJson
{
  "total": 0,
  "project": "",
  "Severity": [],
  "issues": []
}
EoEmptyJson
Sample run (I want to add two issues):
./reportJson
Enter issue (or [return] if none): Foo
Enter next issue (or [return] if no more): Bar Baz
Enter next issue (or [return] if no more):
{
  "total": "500",
  "project": "MY_PROJECT",
  "Severity": [],
  "issues": [
    "Foo",
    "Bar Baz"
  ]
}
No answer (so far) accounts for the requirement that total be of type int. This can be accomplished by using --argjson instead of --arg. Here's my two cents:
jq --argjson total 500 --arg project "MY_PROJECT" '. + {$total, $project}' report.json
{
  "total": 500,
  "project": "MY_PROJECT",
  "Severity": [],
  "issues": []
}

jq: create an array of objects in JSON and insert a new object each time the bash script executes [duplicate]

This question already has answers here:
Add new element to existing JSON array with jq
(3 answers)
Closed 3 years ago.
I want to create valid JSON using jq in bash.
Each time the bash script runs it should add a new element to the existing JSON array, and if the file is empty it should create a new file.
I am using the following jq command to create my JSON (which is incomplete; please help me complete it):
$ jq -n -s '{service: $ARGS.named}' \
    --arg transcationId $TRANSACTION_ID_METRIC '{"transcationId":"\($transcationId)"}' \
    --arg name $REALPBPODDEFNAME '{"name ":"\($name )"}' \
    --arg lintruntime $Cloudlintruntime '{"lintruntime":"\($lintruntime)"}' \
    --arg status $EXITCODE '{"status":"\($status)"}' \
    --arg buildtime $totaltime '{"buildtime":"\($buildtime)"}' >> Test.json
which produces output like this:
{
  "service": {
    "transcationId": "12345",
    "name": "sdsjkdjsk",
    "lintruntime": "09",
    "status": "0",
    "buildtime": "9876"
  }
}
{
  "service": {
    "transcationId": "123457",
    "servicename": "sdsjkdjsk",
    "lintruntime": "09",
    "status": "0",
    "buildtime": "9877"
  }
}
but I don't want the output in this format.
The JSON should be created the first time like this (what should the jq command be to create the JSON below?):
{
  "ServiceData": {
    "date": "30/1/2020",
    "ServiceInfo": [
      {
        "transcationId": "20200129T130718Z",
        "name": "MyService",
        "lintruntime": "178",
        "status": "0",
        "buildtime": "3298"
      }
    ]
  }
}
and the next time I execute the bash script, an element should be added to the array like this (what is the jq command to get JSON in this format?):
{
  "ServiceData": {
    "date": "30/1/2020",
    "ServiceInfo": [
      {
        "transcationId": "20200129T130718Z",
        "name": "MyService",
        "lintruntime": "16",
        "status": "0",
        "buildtime": "3256"
      },
      {
        "transcationId": "20200129T130717Z",
        "name": "MyService",
        "lintruntime": "16",
        "status": "0",
        "buildtime": "3256"
      }
    ]
  }
}
I also want the "date", "ServiceData" and "ServiceInfo" fields in my JSON, which are missing from my current one.
You don't give a separate filter to each --arg option; it just defines a variable that can be used in the single filter argument. You just want to add a new object to your input. jq doesn't do in-place file editing, so you'll have to write to a temporary file and replace your original after the fact.
jq --arg transactionId "$TRANSACTION_ID_METRIC" \
   --arg name "$REALPBPODDEFNAME" \
   --arg lintruntime "$Cloudlintruntime" \
   --arg status "$EXITCODE" \
   --arg buildtime "$totaltime" \
   '.ServiceData.ServiceInfo += [ {transactionID: $transactionId,
                                   name: $name,
                                   lintruntime: $lintruntime,
                                   status: $status,
                                   buildtime: $buildtime
                                  }]' \
   Test.json > tmp.json &&
  mv tmp.json Test.json
Here's the same command, but using an array to store all the --arg options and a variable to store the filter so the command line is a little simpler. (You also don't need explicit line continuations inside an array definition.)
args=(
  --arg transactionId "$TRANSACTION_ID_METRIC"
  --arg name "$REALPBPODDEFNAME"
  --arg lintruntime "$Cloudlintruntime"
  --arg status "$EXITCODE"
  --arg buildtime "$totaltime"
)
filter='.ServiceData.ServiceInfo += [
  {
    transactionID: $transactionId,
    name: $name,
    lintruntime: $lintruntime,
    status: $status,
    buildtime: $buildtime
  }
]'
jq "${args[@]}" "$filter" Test.json > tmp.json && mv tmp.json Test.json

JQ query on JSON file

I have the JSON below in a file.
{
  "comment": {
    "vm-updates": [],
    "site-ops-updates": [
      {
        "comment": {
          "message": "You can start maintenance on this resource"
        },
        "hw-name": "Machine has got missing disks. "
      }
    ]
  },
  "object_name": "4QXH862",
  "has_problems": "yes",
  "tags": ""
}
I want to extract "hw-name" from this JSON file using jq. I've tried the combinations below, but nothing worked.
cat jsonfile | jq -r '.comment[].hw-name'
cat json_file.json | jq -r '.comment[].site-ops-updates[].hw-name'
Appreciated help from StackOverflow!!!
It should be:
▶ cat jsonfile | jq -r '.comment."site-ops-updates"[]."hw-name"'
Machine has got missing disks.
Or better still:
▶ jq -r '.comment."site-ops-updates"[]."hw-name"' jsonfile
Machine has got missing disks.
From the docs:
If the key contains special characters, you need to surround it with double quotes like this: ."foo$", or else .["foo$"].
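For completeness, the bracket form from that quote gives the same result:
▶ jq -r '.comment["site-ops-updates"][]["hw-name"]' jsonfile
Machine has got missing disks.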

Bad indentation of a sequence entry bitbucket pipelines

I currently have a step in Bitbucket Pipelines which does some stuff. The last step is to start an AWS ECS task, like this:
- step:
    name: Migrate database
    script:
      - curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"
      - apt-get update
      - apt-get install -y unzip python
      - unzip awscli-bundle.zip
      - ./awscli-bundle/install -b ~/bin/aws
      - export PATH=~/bin:$PATH
      - aws ecs run-task --cluster test-cluster --task-definition test-task --overrides '{ "containerOverrides": [ { "name": "test-container", "command": [ "echo", "hello world" ], "environment": [ { "name": "APP_ENV", "value": "local" } ] } ] }' --network-configuration '{ "awsvpcConfiguration": { "subnets": ["subnet-xxxxxxx"], "securityGroups": ["sg-xxxxxxx"], "assignPublicIp": "ENABLED" }}' --launch-type FARGATE
This fails the validation with the error:
Bad indentation of a sequence entry bitbucket pipelines
Splitting the statement across multiple lines does not work either. What would be the correct approach here?
The issue is that you have a colon followed by a space, which causes the YAML parser to interpret the entry as a map and not a string.
The easiest solution would be to move
aws ecs run-task --cluster test-cluster --task-definition test-task --overrides '{ "containerOverrides": [ { "name": "test-container", "command": [ "echo", "hello world" ], "environment": [ { "name": "APP_ENV", "value": "local" } ] } ] }' --network-configuration '{ "awsvpcConfiguration": { "subnets": ["subnet-xxxxxxx"], "securityGroups": ["sg-xxxxxxx"], "assignPublicIp": "ENABLED" }}' --launch-type FARGATE
into a script file and call it from Pipelines.
You could also remove all the spaces after any ':' characters. But given the amount of JSON there, you'd likely encounter the same issue again when modifying it. So the script file is probably the easier option here.
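A minimal sketch of the script-file approach, assuming a hypothetical file name run-migration-task.sh committed next to the pipeline definition:

#!/bin/sh
# run-migration-task.sh: the long command lives in plain shell, so the YAML parser never sees the "key: value" pattern
aws ecs run-task --cluster test-cluster --task-definition test-task \
  --overrides '{ "containerOverrides": [ { "name": "test-container", "command": [ "echo", "hello world" ], "environment": [ { "name": "APP_ENV", "value": "local" } ] } ] }' \
  --network-configuration '{ "awsvpcConfiguration": { "subnets": ["subnet-xxxxxxx"], "securityGroups": ["sg-xxxxxxx"], "assignPublicIp": "ENABLED" }}' \
  --launch-type FARGATE

The last script line of the step in bitbucket-pipelines.yml then becomes something like:
- sh ./run-migration-task.sh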
