I am working on a shell script, and I would like to do the following as part of it without declaring a function. Basically, I want to convert the code below so that it no longer uses a function.
#!/bin/bash
function json2keyvalue {
cat <<EOF | jq -r 'to_entries|map("\(.key)\t\(.value|tostring)")[]'
{
"hello1": "world1",
"testk": "testv"
}
EOF
}
while IFS=$'\t' read -r key value
do
export "$key"="$value"
done < <(json2keyvalue)
I tried the following:
values='{"hello1":"world1","hello1.world1.abc1":"hello2.world2.abc2","testk":"testv"}'
while IFS=$'\t' read -r key value
do
export "$key"="$value"
done < < (echo $values | jq -r 'to_entries|map("\(.key)\t\(.value|tostring)")[]')
But this doesn't seem to work. When I run the shell script (the file name is abc.sh), it gives the following error:
./abc.sh: line 6: syntax error near unexpected token `<'
./abc.sh: line 6: `done < < (jq -r 'to_entries|map("\(.key)\t\(.value|tostring)")[]' <<<"$values")'
The script takes JSON like the following and converts the key-value pairs into environment variables.
{
"hello1": "world1",
"testk": "testv"
}
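As a side note, the reported syntax error comes from the space in < < (: bash process substitution has to be written < <(...), with no space before the opening parenthesis. A minimal sketch of that form, reusing the simple JSON above and the jq filter from the question:
#!/bin/bash
values='{"hello1":"world1","testk":"testv"}'
while IFS=$'\t' read -r key value
do
    export "$key"="$value"
done < <(jq -r 'to_entries|map("\(.key)\t\(.value|tostring)")[]' <<<"$values")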
I suggest:
#!/bin/bash
shopt -s lastpipe
cat <<EOF | jq -r 'to_entries|map("\(.key)\t\(.value|tostring)")[]' | while IFS=$'\t' read -r key value; do export "$key"="$value"; done
{
"hello1": "world1",
"testk": "testv"
}
EOF
lastpipe: If set, and job control is not active, the shell runs the last command of a pipeline not executed in the background in the current shell environment.
Update:
If the variable $values contains valid JSON, this should work:
#!/bin/bash
shopt -s lastpipe
values='place valid json code here'
echo "$values" | jq -r 'to_entries|map("\(.key)\t\(.value|tostring)")[]' | while IFS=$'\t' read -r key value; do export "$key"="$value"; done
The goal here is: when the parameter $AutoBasher is set, read the file line by line and add each line to a curl command. I also tried to have it run one command when the parameter is set and a different one when it is not. I'd like the script to run the automated command when $AutoBasher is set, and otherwise run the command with $URL.
AutoBasher()
{
if [[ -z "$AutoBasher" ]]; then
echo ''
else
for FileLine in $AutoBasher; do
while IFS= `read -r Line`; do
echo ${Line}
done < "FileLine"
done < "$AutoBasher"
fi
}
curl $(AutoBasher)
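For reference, here is a rough sketch of what the description above seems to be aiming for. It assumes that $AutoBasher, when set, holds the path to a file whose lines should each be handed to curl, and that $URL is defined elsewhere; the names and structure are guesses, not a confirmed fix. Calling curl inside the function, rather than via curl $(AutoBasher), also avoids the word-splitting problems of rebuilding the command from captured output:
AutoBasher() {
    if [[ -z "$AutoBasher" ]]; then
        # No file given: run the plain command with $URL
        curl "$URL"
    else
        # Read the file line by line and run curl once per line
        while IFS= read -r Line; do
            curl "$Line"
        done < "$AutoBasher"
    fi
}

AutoBasher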
I have two scripts. Script A includes script B and calls a function in script B.
The setup looks like this:
Test file - ~/file.txt
one==1.0.0
two==2.0.0
three==3.0.0
four==4.0.0
Script A - ~/script_a.sh
#!/bin/bash
source script_b.sh
func_one
Script B - ~/script_b.sh
#!/bin/bash
# Note: don't forget to change the spaces to tabs else heredoc won't work
my_user=$USER
func_two() {
# Here, I need run everything in the heredoc as user $my_user
sudo su - $my_user -s /bin/bash <<- EOF
while read -r line || [[ -n "$line" ]];
do
# **This is the problem line**
# I can confirm that all the lines are being
# read but echo displays nothing
echo "$line"
# The line below will be printed 4 times as there are 4 lines in the file of interest
echo "Test"
done < "/home/$my_user/file.txt"
EOF
}
func_one() {
func_two
}
To run
cd ~
bash script_a.sh
Question: Why is the line echo "$line" not producing any output?
The problem is that bash substitutes $line with its value (nothing) before the here document is passed to su. Escaping the dollar sign should fix it, so $line should be changed to \$line in both places in script_b.sh.
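In other words, func_two would look something like this (a sketch: the escaped \$line is expanded by the bash started via su, while $my_user is still expanded locally before the here document is sent; a plain <<EOF is used here to sidestep the tab-indentation requirement of <<-):
my_user=$USER
func_two() {
    # \$line is expanded by the inner shell; $my_user is expanded here
    sudo su - "$my_user" -s /bin/bash <<EOF
while read -r line || [[ -n "\$line" ]]; do
    echo "\$line"
done < "/home/$my_user/file.txt"
EOF
}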
I have a curl command that looks like this:
curl -X PUT -H "myheader:coca-cola" -d '{ "name":"harrypotter" }' http://mygoogle.com/service/books/123
Running this command as is via terminal returns the expected results.
I am trying to incorporate this curl command in my bash script as follows:
#!/bin/bash
MYURL=http://mygoogle.com/service/books/123
# Generate body for curl request
generate_put_data()
{
cat <<EOF
{
"name":"harrypotter"
}
EOF
}
put_data=$(echo "$(generate_put_data)")
put_data_with_single_quotes="'$put_data'"
# Generate headers for curl request
header=myheader:coca-cola
header_with_double_quotes="\"$header\""
# The following function takes two inputs - a simple string variable (with no spaces or quotes) and the curl command string
function run_cmd() {
echo $1
echo $2
#Run the curl command
"$2"
#Check return code of the curl command
if [ "$?" -ne 0 ]; then
#do something with simple string variable
echo "$1"
echo "Job failed"
exit 1
else
#do something with simple string variable
echo "$1"
echo "Job Succeeded"
fi
}
# Run the bash function - run_cmd
run_cmd "mysimplestring" "curl -X PUT -H $header_with_double_quotes -d $put_data_with_single_quotes $MYURL"
However, when I try to run the above bash script, it fails at the point where I call run_cmd() function with the two inputs. I get the following error:
curl -X PUT -H "myheader:coca-cola" -d '{
"name":"harrypotter"
}' http://mygoogle.com/service/books/123: No such file or directory
Job failed
This error occurs on the line where "$2" is being executed in the run_cmd() function declaration.
Could someone help me understand where I am going wrong? Thanks!
"$2"
This will take the second argument and try to run it without doing any word splitting. It treats it as one string.
You're going to run into trouble passing in the curl command as one string. You'll do better if you pass it without quotes, just as if you typed it on the command line. You'll want to quote each of the variables but not quote the command as a whole.
run_cmd "mysimplestring" curl -X PUT -H "$header" -d "$put_data" "$MYURL"
Notice that you don't need the "with_quotes" variables any more. You don't have to do anything like that. The original plain values will work.
Now you can access the command using array syntax:
function run_cmd() {
local name=$1; shift
local cmd=("$@")
#Run the curl command
"${cmd[#]}"
}
By the way, this is a useless use of echo:
put_data=$(echo "$(generate_put_data)")
Make that:
put_data=$(generate_put_data)
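Putting the pieces together, a sketch only, reusing $MYURL, $header, and generate_put_data from the question and keeping the exit-status check from the original run_cmd:
put_data=$(generate_put_data)
header=myheader:coca-cola

function run_cmd() {
    local name=$1; shift
    local cmd=("$@")
    echo "$name"
    # Run the command held in the array and check its exit status
    if "${cmd[@]}"; then
        echo "Job Succeeded"
    else
        echo "Job failed"
        exit 1
    fi
}

run_cmd "mysimplestring" curl -X PUT -H "$header" -d "$put_data" "$MYURL"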
I want to read in a remote configuration file in a bash script; locally the following works fine:
while IFS="=" read -r name value; do
declare "$name=$value"
done < "$cfg"
I tried to do the same using ssh and cat:
ssh "$hostname" "cat $remote_cfg" |
while IFS="=" read -r name value; do
declare "$name=$value"
echo $name $value
done
But my variables are only declared within the scope of the while loop; how can I bring them into the outer scope?
Thanks in advance!
I have figured things out (source: http://mywiki.wooledge.org/BashFAQ/024).
Process substitution works:
while IFS="=" read -r name value; do
declare "$name=$value"
done < <(ssh "$hostname" "cat ~/.cuttleline.cfg")
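Because the loop now runs in the current shell, anything declared inside it is still visible afterwards. A quick illustrative check (the key name below is hypothetical):
# assuming the remote config contains a line such as "version=1.2.3" (hypothetical key)
echo "$version"    # prints 1.2.3 after the loop above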
I've tried several approaches, all of which failed, to do the following:
Basically, what I need is to execute several sequenced commands on a remote Unix shell, such as setting environment variables using variables that I have in the script, moving to a particular directory, running a script there, and so on.
I've tried using printf with the relevant portion of the script and piping it into the ssh command, but it didn't work very well. I've also read about the "ssh ... <<END" here-document marker, which is great, but since I'm using functions, it doesn't work well.
Do you have any thoughts?
Here's an excerpt of the code:
deployApp() {
inputLine=$1;
APP_SPECIFIC_DEPLOY_SCRIPT="$(echo $inputLine | cut -d ' ' -s -f1)";
BRANCH="$(echo $inputLine | cut -d ' ' -s -f2)";
JBOSS_HOME="$(echo $inputLine | cut -d ' ' -s -f3)";
BASE_PORT="$(echo $inputLine | cut -d ' ' -s -f4)";
JAVA_HOME_FOR_JBOSS="$(echo $inputLine | cut -d ' ' -s -f5)";
JAVA_HEAP="$(echo $inputLine | cut -d ' ' -s -f6)";
echo "DEPLOYING $APP_SPECIFIC_DEPLOY_SCRIPT"
echo "FROM BRANCH $BRANCH"
echo "IN JBOSS $JBOSS_HOME"
echo "WITH BASE PORT $BASE_PORT"
echo "USING $JAVA_HOME_FOR_JBOSS"
if [[ -n "$JAVA_HEAP" ]]; then
echo "WITH $JAVA_HEAP"
fi
echo
echo "Exporting jboss to $JBOSS_HOME"
ssh me@$SERVER <<END
cleanup() {
rm -f $JBOSS_SERVER/log/*.log
rm -Rf $JBOSS_SERVER/deploy/
rm -Rf $JBOSS_SERVER/tmp/
mkdir $JBOSS_SERVER/deploy
}
startJboss() {
cd $JBOSS_SERVER/bin
./jbossctl.sh start
return 0;
}
export JBOSS_HOME
export JBOSS_SERVER=$JBOSS_HOME/server/default
END
return 0;
}
With that "HERE" approach, I'm getting this error: "syntax error: unexpected end of file"
Thanks a lot in advance!
Just put the functions in your here document, too:
var="Hello World"
ssh user@host <<END
x() {
print "x function with args=$*"
}
x "$var"
END
[EDIT] Some comments:
You say "export JBOSS_HOME" but you never define a value for the variable in the here document. You should use export JBOSS_HOME="$JBOSS_HOME". BASH will take all text between the two END, replace all variables, and send the result to SSH for processing.
That also means the other side will see the rm -f line with $JBOSS_SERVER already expanded locally (in the excerpt above it is never set locally, so it expands to nothing); the assignment to JBOSS_SERVER in the last line of the here document has no effect, at least not on the code in cleanup().
If you want to pass $ unmodified to the remote server, you have to escape it with \: rm -f \$JBOSS_SERVER/log/*.log
You never call cleanup()
There is a } missing after return 0 to finish the definition of deployApp().
There may be other problems as well. Run the script with bash -x to see what the shell actually executes. You can also add echo commands in the here document to see what the values of the variables are, or add set -x before cleanup() to get the same output as with bash -x but from the remote side.
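Putting those points together, the here document from the question would look roughly like this. This is only a sketch: the exports are moved to the top and given values, the dollar signs that should be expanded remotely are escaped, and cleanup()/startJboss() are actually called:
ssh me@$SERVER <<END
export JBOSS_HOME="$JBOSS_HOME"
export JBOSS_SERVER="$JBOSS_HOME/server/default"
cleanup() {
    # \$JBOSS_SERVER is escaped, so it is expanded on the remote side
    rm -f \$JBOSS_SERVER/log/*.log
    rm -Rf \$JBOSS_SERVER/deploy/
    rm -Rf \$JBOSS_SERVER/tmp/
    mkdir \$JBOSS_SERVER/deploy
}
startJboss() {
    cd \$JBOSS_SERVER/bin
    ./jbossctl.sh start
}
cleanup
startJboss
END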
I don't understand why you're using cut to split the arguments to your function. Just do
APP_SPECIFIC_DEPLOY_SCRIPT=$1
BRANCH=$2
JBOSS_HOME=$3
# etc.
If you don't quote your here document delimiter, the contents are expanded before they're sent to the server. That may be what you want. If you don't and you want all expansion to be done on the server side, then quote it like this:
ssh me@$SERVER <<'END'
# etc.
END
If you want a mixture, don't quote the delimiter, but do escape those things that you want delayed expansion for:
ssh me@$SERVER <<END
echo $EXPAND_ME_NOW \$EXPAND_ME_LATER
END
What are the export statements supposed to do? I can't see that they would have any effect at all.