Why does jq not see environment variables when run in a script? - bash

I have the following JSON file:
{
  "1": {
    "media_content": "test3.xspf"
  },
  "2": {
    "media_content": "test3.xspf"
  }
}
In a terminal, using bash as the shell, I can execute the following commands:
export schedules="1"
echo $(jq '.[env.schedules]["media_content"]' json_file.json)
Which outputs:
test3.xspf
So it works as expected, but when I place that jq command in a script and run it, it just returns null.
I did echo the value of schedules inside the script to make sure it is non-null, and it is OK:
echo $schedules
But I could not find the reason why this command works when run directly in the shell and does not work when run from a script.
I run the script in the following ways:
bash script.sh
./script.sh
PS: yes, I did grant execute permission: chmod +x script.sh
HINT: env.schedules represents the environment variable 'schedules', and I made sure that it is assigned in the script before calling jq.
EDIT: I am now posting the whole script and the file tree.
There is one directory containing:
script.sh
json_file.json
static.json
script.sh:
export zone=$(cat static.json | jq '.["1"]');
echo "json block: "$zone
export schedules="$(echo $zone | jq '.schedules')"
echo "environment variable: "$schedules
export media_content=$(jq '.[env.schedules]["media_content"]' json_file.json)
echo "What I want to get: \"test3.xspf\""
echo "What I get: "$media_content
json_file.json:
{
  "1": {
    "media_content": "test3.xspf"
  },
  "2": {
    "media_content": "test3.xspf"
  }
}
static.json:
{
  "1": {
    "x": "0",
    "y": "0",
    "width": "960",
    "height": "540",
    "schedules": "1"
  }
}
If I run the script, it displays:
json block: { "x": "0", "y": "0", "width": "960", "height": "540", "schedules": "1" }
environment variable: "1"
What I want to get: "test3.xspf"
What I get: null
If I hardcode the variable:
export schedules="1"
the problem no longer occurs.

The problem is simple.
It's not jq's fault.
It's the improper way the schedules value is passed to the next command.
You have to remove the double quotes that surround the variable's value; add a second command that uses sed to do that:
export schedules="$(echo $zone | jq '.schedules')"
schedules=$( echo $schedules | sed s/\"//g )
Long answer
Let's see:
Here schedules is a string and echo shows its value as 1:
export schedules="1" ; echo $schedules
The same happens even without the double quotes:
export schedules=1 ; echo $schedules
But capturing jq's output keeps the JSON double quotes in the value:
export schedules=$(echo $zone | jq '.schedules')
If you print it now you will see the extra quotes:
echo $schedules # "1"
That is why the lookup fails: env.schedules holds the three-character string "1" (quotes included), which matches no key in json_file.json. So just remove the quotes from the value:
schedules=$( echo $schedules | sed s/\"//g )
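A cleaner alternative (a sketch, not part of the original answer): jq's -r/--raw-output flag prints strings without the JSON quotes, so no sed pass is needed:
# sketch of script.sh using jq -r (raw output); no quote-stripping required
export zone="$(jq '.["1"]' static.json)"
export schedules="$(echo "$zone" | jq -r '.schedules')"   # -r emits 1, not "1"
export media_content="$(jq -r '.[env.schedules]["media_content"]' json_file.json)"
echo "What I get now: $media_content"   # test3.xspf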

Related

Bash command in variable with herestring as input

Using bash, I need to echo a series of commands before I run them. Take this example, which works as expected. Note that all code examples have been run through ShellCheck, which reports no issues.
OUTPUTS_CMD=(aws cloudformation describe-stacks --stack-name "#{StackName}" --query 'Stacks[0].Outputs')
echo "${OUTPUTS_CMD[*]}"
OUTPUTS=$("${OUTPUTS_CMD[@]}")
However, other commands require input, and while I can successfully echo these commands, I can't actually get them to run.
GET_FUNCTION_NAME_CMD=(jq --raw-output "'map(select(.OutputKey == \"ProxyFunctionName\")) | .[].OutputValue'")
echo "${GET_FUNCTION_NAME_CMD[*]} <<< \"\$OUTPUTS\""
FUNCTION_NAME=$("${GET_FUNCTION_NAME_CMD[@]}" <<< "$OUTPUTS")
The above example outputs the following, which if copied and pasted returns the correct value.
jq --raw-output 'map(select(.OutputKey == "ProxyFunctionName")) | .[].OutputValue' <<< "$OUTPUTS"
However, the command "${GET_FUNCTION_NAME_CMD[@]}" <<< "$OUTPUTS" returns an error. The same error occurs if I echo "$OUTPUTS" and pipe that into the command saved in the variable instead of using a herestring, so I believe the error is in how the command is defined in the array.
$ FUNCTION_NAME=$("${GET_FUNCTION_NAME_CMD[@]}" <<< "$OUTPUTS")
jq: error: syntax error, unexpected INVALID_CHARACTER, expecting $end (Unix shell quoting issues?) at <top-level>, line 1:
'map(select(.OutputKey == "ProxyFunctionName")) | .[].OutputValue'
jq: 1 compile error
How can I get the command in the variable to run with a herestring?
Example value for $OUTPUTS
[
  {
    "OutputKey": "ProxyFunctionName",
    "OutputValue": "MyFunctionName",
    "Description": "Proxy Lambda Function ARN"
  },
  {
    "OutputKey": "ProxyFunctionUrl",
    "OutputValue": "https://my.function.url",
    "Description": "Proxy Lambda Function invocation URL"
  }
]
OUTPUTS=$(cat <<JSON
[
  {
    "OutputKey": "ProxyFunctionName",
    "OutputValue": "MyFunctionName",
    "Description": "Proxy Lambda Function ARN"
  },
  {
    "OutputKey": "ProxyFunctionUrl",
    "OutputValue": "https://my.function.url",
    "Description": "Proxy Lambda Function invocation URL"
  }
]
JSON
)
OutputKey=ProxyFunctionName
GET_FUNCTION_NAME_CMD="jq -r '.[] | objects | select(.OutputKey == \"$OutputKey\") | .OutputValue'"
echo "$GET_FUNCTION_NAME_CMD <<<\"\$OUTPUTS\""
# jq -r '.[] | objects | select(.OutputKey == "ProxyFunctionName") | .OutputValue' <<<"$OUTPUTS"
FUNCTION_NAME=$(eval $GET_FUNCTION_NAME_CMD <<<"$OUTPUTS")
echo $FUNCTION_NAME
# MyFunctionName
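An alternative sketch that avoids eval and keeps the array form from the question: store the filter as a single array element without the embedded single quotes, and the array expands directly into valid arguments:
# sketch: one array element per argument, no extra quoting around the filter
OutputKey=ProxyFunctionName
GET_FUNCTION_NAME_CMD=(jq -r ".[] | objects | select(.OutputKey == \"$OutputKey\") | .OutputValue")
FUNCTION_NAME=$("${GET_FUNCTION_NAME_CMD[@]}" <<<"$OUTPUTS")
echo "$FUNCTION_NAME"
# MyFunctionName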

Looping over a bash array to add new data with jq -- only seeing the last change

I have a Bash script that uses jq and a for loop to iterate through an array, take each directory that I need Amazon CloudWatch to monitor, and add it to the latter's JSON configuration file. However, for some reason, only the last item in the array is actually being written. I assume something in my logic is overwriting my changes in a particular place instead of appending them, but I can't quite figure out the fix.
Here is my code:
logPaths=("/shared/logs/application/application1"
          "/shared/logs/application/application2"
          "/shared/logs/application/application3")

# Loop through array to create stanzas and export them to the temp file
for i in ${logPaths[@]}; do
  jq "
    .logs.logs_collected.files.collect_list[-1] |= . + {
      \"file_path\": \"$i\",
      \"log_group_name\": \"/aws-account/aws/ec2/syslogs\",
      \"log_stream_name\": \"$definedElsewhere\",
      \"timestamp_format\": \"%b %d %H:%M:%S\"}" \
    /opt/aws/amazon-cloudwatch-agent/amazon-cloudwatch-agent.json \
    > /opt/aws/amazon-cloudwatch-agent/amazon-cloudwatch-agent.json.tmp \
    && cp /opt/aws/amazon-cloudwatch-agent/amazon-cloudwatch-agent.json.tmp /opt/aws/amazon-cloudwatch-agent/amazon-cloudwatch-agent.json
done
When this is executed, and I look at amazon-cloudwatch-agent.json, only a record for the 3rd entry in the array (/application3) appears in the configuration file.
I can't reproduce your bug -- but it's irrelevant, because if this were correctly written there wouldn't be any loop needed at all.
Using jq --args allows the logPaths array to be passed in as a set of positional arguments, and referred to from within the relevant jq code as $ARGS.positional. Thus:
#!/usr/bin/env bash

logPaths=("/shared/logs/application/application1"
          "/shared/logs/application/application2"
          "/shared/logs/application/application3")

# Make up some sample input, since the OP didn't provide any
cat >old.json <<'EOF'
{
  "logs": {
    "logs_collected": {
      "files": {
        "collect_list": [
          {"test": "make sure this old data is retained"}
        ]
      }
    }
  }
}
EOF

jq --arg definedElsewhere "Other Value" '
  ($ARGS.positional | [
    .[] | { "file_path": .,
            "log_group_name": "/aws-account/aws/ec2/syslogs",
            "log_stream_name": $definedElsewhere,
            "timestamp_format": "%b %d %H:%M:%S"
          }]) as $newLogSinks |
  .logs.logs_collected.files.collect_list += $newLogSinks
' --args "${logPaths[@]}" <old.json >new.json && mv new.json old.json
...which correctly emits as output:
{
  "logs": {
    "logs_collected": {
      "files": {
        "collect_list": [
          {
            "test": "make sure this old data is retained"
          },
          {
            "file_path": "/shared/logs/application/application1",
            "log_group_name": "/aws-account/aws/ec2/syslogs",
            "log_stream_name": "Other Value",
            "timestamp_format": "%b %d %H:%M:%S"
          },
          {
            "file_path": "/shared/logs/application/application2",
            "log_group_name": "/aws-account/aws/ec2/syslogs",
            "log_stream_name": "Other Value",
            "timestamp_format": "%b %d %H:%M:%S"
          },
          {
            "file_path": "/shared/logs/application/application3",
            "log_group_name": "/aws-account/aws/ec2/syslogs",
            "log_stream_name": "Other Value",
            "timestamp_format": "%b %d %H:%M:%S"
          }
        ]
      }
    }
  }
}

JQ query on JSON file

I have the following JSON file:
{
  "comment": {
    "vm-updates": [],
    "site-ops-updates": [
      {
        "comment": {
          "message": "You can start maintenance on this resource"
        },
        "hw-name": "Machine has got missing disks. "
      }
    ]
  },
  "object_name": "4QXH862",
  "has_problems": "yes",
  "tags": ""
}
I want to extract "hw-name" from this JSON file using jq. I've tried the combinations below, but nothing worked.
cat jsonfile | jq -r '.comment[].hw-name'
cat json_file.json | jq -r '.comment[].site-ops-updates[].hw-name'
Any help appreciated, StackOverflow!
It should be:
▶ cat jsonfile | jq -r '.comment."site-ops-updates"[]."hw-name"'
Machine has got missing disks.
Or better still:
▶ jq -r '.comment."site-ops-updates"[]."hw-name"' jsonfile
Machine has got missing disks.
From the docs:
If the key contains special characters, you need to surround it with double quotes like this: ."foo$", or else .["foo$"].
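Applied to this file, the bracket form from that quote works just as well (a quick sketch):
▶ jq -r '.comment["site-ops-updates"][]["hw-name"]' jsonfile
Machine has got missing disks.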

jenkins pipelines: shell script cannot get the updated environment variable

In Jenkins, I want to get user input and pass it to a shell script for further use.
I tried setting it as an environment variable, but the shell script fails to get the latest value and echoes the old one.
pipeline {
    agent none
    environment {
        myVar = 'something_default'
    }
    stages {
        stage('First stage') {
            agent none
            steps {
                echo "update myVar by user input"
                script {
                    test = input message: 'Hello',
                        ok: 'Proceed?',
                        parameters: [
                            string(name: 'input', defaultValue: 'update it!', description: '')
                        ]
                    myVar = "${test['input']}"
                }
                echo "${myVar}" // value is updated
            }
        }
        stage('stage 2') {
            agent any
            steps {
                echo "${myVar}" // checking whether myVar passes between stages; this outputs the expected value
                sh("./someShell.sh") // the script just contains an echo, e.g. echo "myVar is ${myVar}"
                // it echoes the old value, i.e. something_default
            }
        }
    }
}
The environment variables that we set in the pipeline script are accessible only within the script itself. So even if you declare your variable as global, the shell script will not see it.
The only option I can think of is to pass it as an argument to the shell script:
sh("./someShell.sh ${myVar}")
EDIT:
Updated answer based on the OP's follow-up query about a shell script for parsing the input:
LINE="[fristPara:100, secondPaa:abc]"
LINE=$(echo $LINE | sed 's/\[//g')   # strip the opening [
LINE=$(echo $LINE | sed 's/\]//g')   # strip the closing ]
# the trailing comma below gives read -d, a delimiter after the last pair
while read -d, -r pair; do
    IFS=':' read -r key val <<<"$pair"
    echo "$key = $val"
done <<<"$LINE,
"
You need to pass the variables between your stages as environment variables, e.g. like this:
stage("As user for input") {
steps {
env.DO_SOMETING = input (...)
env.MY_VAR = ...
}
}
stage("Do something") {
when { environment name: 'DO_SOMETING', value: 'yes' }
steps {
echo "DO_SOMETING has the value ${env.DO_SOMETHING}"
echo "MY_VAR has the value ${env.MY_VAR}"
}
}
You have to declare the variable at global scope so that both places refer to the same instance:
def myVar
pipeline { ... }

Jenkins Pipeline Syntax - Need to get Parameters for Job from wget

I'm new to Jenkins Pipeline, Groovy syntax, etc.
I have a Jenkins job that takes 5 parameters.
I want to schedule a pipeline that checks for listings via wget (the format is CSV; I can switch to JSON output as well). The CSV is one row with the 5 parameters listed:
a,b,c,d,e
I need to parse that list and pass the parameters to the job if there are rows; if not, skip and complete the pipeline.
I have searched and basically got as far as this for testing:
pipeline {
    environment {
        testVar = 'foo'
    }
    agent any
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
                script {
                    sh "RESULT=\$(wget -qO- https://www.url.com/getlist)"
                    sh "echo \$RESULT"
                    // variable define based on parse of CSV???
                }
            }
        }
        stage('Example Deploy') {
            when {
                expression { testVar == 'foo' }
            }
            steps {
                echo 'Deploying'
                build job: 'Testing', parameters: [
                    string(name: 's_e', value: 'u'),
                    string(name: 't_e', value: 't'),
                    string(name: 's_s', value: 'DS'),
                    string(name: 't_s', value: 'SH'),
                    string(name: 'efg', value: 'TEST')
                ]
            }
        }
    }
}
Obviously I have more work to do around parsing RESULT (and I am not sure how I can achieve this in Pipeline).
I then need to check whether RESULT is empty, then pass the variables to the build.
I opted for a different approach.
Instead, I now have a Jenkins job where I use the "Trigger/Call Builds on other Projects" step.
Before that is added as a build step, I have some code to get the wget CSV information:
RESULT=$(wget -qO- https://url.com/getlist)
if [ -z "$RESULT" ]
then
    echo "Nothing to do"
    # exit 1
else
    echo "$RESULT"
    s_env_upper=$(echo $RESULT | awk -F',' '{print $1}')
    t_env_upper=$(echo $RESULT | awk -F',' '{print $2}')
    s_env=$(echo $s_env_upper | tr '[:upper:]' '[:lower:]')
    t_env=$(echo $t_env_upper | tr '[:upper:]' '[:lower:]')
    echo "s_env=$s_env" > params.cfg
    echo "t_env=$t_env" >> params.cfg
fi
Hope this helps someone else... I was breaking my heart trying to get the pipeline to do the work, and the answer was simpler.
