Execute bash within some text file [duplicate]

This question already has answers here:
Forcing bash to expand variables in a string loaded from a file
(13 answers)
Closed 1 year ago.
I would like to create a templating system that executes bash within a text file.
For example, let's consider we created a simple template.yaml file:
my_path: $(echo ${PATH})
my_ip: $(curl -s http://whatismyip.akamai.com/)
some_const: "foo bar"
some_val: $(echo -n $MY_VAR | base64)
The desire is to execute each one, such that the result may look like:
my_path: /Users/roman/foo
my_ip: 1.2.3.4
some_const: "foo bar"
some_val: ABC
How would I go about doing such a substitution?
Reasons for wanting this:
There are many values, and doing something like a sed or envsubst isn't practical
It would be common to apply a series of piped transformations
The configuration file would be populated from numerous sources, all of them essentially bash commands
I do need to create a yaml file of a specific format (ultimately used by another tool)
I could create aliases etc to increase readability
By having it execute in its own shell, none of the semi-sensitive values are stored in history or as files.
I'm not married to this approach, and would happily try a recommendation that satisfies the reasons above.

This might work; you can try it: create a script, script.sh, which takes a .yaml file as its single argument and expands the variables inside that file:
script.sh :
echo 'cat <<EOF' > temp.sh
cat "$1" >> temp.sh
echo 'EOF' >> temp.sh
bash temp.sh
rm temp.sh
You can then invoke the script from the command line: ./script.sh template.yaml
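For the template.yaml shown in the question, the generated temp.sh would look roughly like this (a sketch; the here-document is what lets the shell expand the $(...) and ${...} constructs when temp.sh is run):
cat <<EOF
my_path: $(echo ${PATH})
my_ip: $(curl -s http://whatismyip.akamai.com/)
some_const: "foo bar"
some_val: $(echo -n $MY_VAR | base64)
EOF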

Thank you to @joshmeranda for pointing me in the right direction; this solved my problem:
echo -e "$(eval "echo -e \"`<template.yaml`\"")"
While eval can be dangerous, in my case its usage is controlled.
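For example, to write the rendered result to a file (the file name is just illustrative), the same one-liner can simply be redirected:
echo -e "$(eval "echo -e \"`<template.yaml`\"")" > rendered.yaml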

Bash create file script with echo command [duplicate]

This question already has answers here:
Multi-line string with extra space (preserved indentation)
(12 answers)
Closed 1 year ago.
I need to generate the following file content with the bash script.
#!/bin/sh
script_dir=$(dirname "$(realpath $0)")
export LD_LIBRARY_PATH=$script_dir/lib
exec $script_dir/{EXEC_FILE_WITH_EXT}
i.e. something like
my_script_generator.sh
#!/bin/sh
echo "<filecontent above>" > new_script.sh
The problem is escaping everything correctly so that echo does not evaluate the content and writes it exactly as is.
For a larger script this becomes a real problem.
Is there an online service that simplifies that work?
Use cat with a here-document, and put a backslash before the $ signs you don't want to be evaluated.
#!/bin/sh
cat > new_script.sh <<EOF
#!/bin/sh
script_dir=\$(dirname "\$(realpath \$0)")
export LD_LIBRARY_PATH=\$script_dir/lib
exec \$script_dir/{EXEC_FILE_WITH_EXT}
EOF
A script like this (not bulletproof) could help automate the task:
#!/bin/sh
# make-script.sh
# input : a sequence of commands
# output : sequence of commands which generates a copy of the source script file
echo "cat <<EOF"
echo "#!/bin/sh"
echo
sed "s/\\\$/\\\\\\\$/g"
echo "EOF"

fgrep with string containing spaces inside ksh script

I am trying to write an fgrep statement removing records with a full record match from a file. I can do this on the command line, but not inside a ksh script. The code I am using boils down to these 4 lines of code:
Header='abc def|ghi jkl' #I use the head command to populate this variable
workfile=abc.txt
command="fgrep -Fxv \'$Header\' $workfile" >$outfile
$command
When I echo $command to STDOUT, the command is exactly what I would type on the command line (with the single quotes), and that works on the command line. When I execute it within the ksh script (file), the single quotes seem not to be recognized, because the errors show it is parsing on spaces.
I have tried back ticks, exec, eval, double quotes instead of single quotes, and not using the $command variable. The problem remains.
I can do this on the command line, but not inside a ksh script
Here's a simple, portable, reliable solution using a heredoc.
#!/usr/bin/env ksh
workfile=abc.txt
outfile=out.txt
IFS= read -r Header <<'EOF'
abc def|ghi jkl
EOF
IFS= read -r command <<'EOF'
grep -Fxv "$Header" "$workfile" > "$outfile"
EOF
eval "$command"
Explanation :
(Comments can't be added to the script above because they would affect the lines in the heredoc)
IFS= read -r Header <<'EOF' # Line separated literal strings
abc def|ghi jkl # Set into the $Header variable
EOF # As if it were a text file
IFS= read -r command <<'EOF' # Command to execute
grep -Fxv "$Header" "$workfile" > "$outfile" # As if it were typed into
EOF # the shell command line
eval "$command" # Execute the command
The above example is the same as having a text file called header.txt, which contains the contents: abc def|ghi jkl and typing the following command:
grep -Fxvf header.txt abc.txt
The heredoc addresses the problem of the script operating differently than the command line as a result of quoting/expansions/escaping issues.
A Word of caution regarding eval:
The use of eval in this example is specific. Please see Eval command and security issues for information on how eval can be misused and cause potentially very damaging results.
More Detail / Alternate Example:
For the sake of completeness, clarity, and ability to apply this concept to other situations, some notes about the heredoc and an alternative demonstration:
This implementation of the heredoc in this example is specifically designed with the following criteria:
Literal string assignment of contents, to the variables (using 'EOF')
Use of the eval command to evaluate and execute the referenced variables within the heredoc itself.
File or heredoc ?
One strength of using a heredoc combined with grep -F (fgrep) is the ability to treat a section of the script as if it were a file.
Case for file:
You want to frequently paste "pattern" lines into the file, and remove them as necessary, without having to modify the script file.
Case for heredoc:
You apply the script in an environment where specific files already exist, and you want to match specific exact literal patterns against it.
Example:
Scenario: I have 5 VPS Servers, and I want a script to produce a new fstab file but to ensure it doesn't contain the exact line:
/dev/xvda1 / ext3 errors=remount-ro,noatime,barrier=0 0 1
This scenario fits the type of situation addressed in this question. I could use the boilerplate from the above code in this answer and modify it as follows:
#!/usr/bin/env ksh
workfile=/etc/fstab
IFS= read -r Header <<'EOF'
/dev/xvda1 / ext3 errors=remount-ro,noatime,barrier=0 0 1
EOF
IFS= read -r command <<'EOF'
grep -Fxv "$Header" "$workfile"
EOF
eval "$command"
This would give me a new fstab file, without the line contained in the heredoc.
Bash FAQ #50: I'm trying to put a command in a variable, but the complex cases always fail! provides comprehensive guidance - while it is written for Bash, most of it applies to Ksh as well.[1]
If you want to stick with storing your command in a variable (defining a function is the better choice), use an array, which bypasses the quoting issues:
#!/usr/bin/env ksh
Header='abc def|ghi jkl'
workfile=abc.txt
# Store command and arguments as elements of an array
command=( 'fgrep' '-Fxv' "$Header" "$workfile" )
# Invoke the array as a command.
"${command[#]}" > "$outfile"
Note: only a simple command can be stored in an array, and redirections can't be part of it.
[1] The function examples use local to create local variables, which ksh doesn't support. Omit local to make do with shell-global variables instead, or use function <name> {...} syntax with typeset instead of local to declare local variables in ksh.
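As a sketch of the "define a function" alternative mentioned above (names are illustrative), the quoting problem disappears entirely because the function body is parsed only once and nothing is re-expanded:
#!/usr/bin/env ksh
Header='abc def|ghi jkl'
workfile=abc.txt
outfile=out.txt
# The quotes around $Header and $workfile survive intact inside the function.
function filter_header {
    grep -Fxv "$Header" "$workfile" > "$outfile"
}
filter_header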

Is it possible to perform shell injection through a read and/or to break out of quotes?

Consider this example of (attempted) shell injection:
test1.sh:
#!/bin/sh
read FOO
echo ${FOO}
z.dat:
foo && sleep 1 && echo 'exploited'
Then run:
cat z.dat | ./test1.sh
On my machine (Ubuntu w/bash) the payload is always (correctly) treated as a single string and never executes the malicious sleep and echo commands.
Question 1: Is it possible to modify z.dat so that test.sh is vulnerable to injection? In particular are there specific shells that might be vulnerable?
Question 2: If so, is changing the test script to quote the variable (shown below) an absolute defense?
test2.sh:
#!/bin/sh
read FOO
echo "${FOO}"
Thanks!
Not according to: https://developer.apple.com/library/mac/documentation/OpenSource/Conceptual/ShellScripting/ShellScriptSecurity/ShellScriptSecurity.html
Search for 'Backwards Compatibility Example'
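As a small illustration of the difference the quotes make (a sketch, not taken from the linked article): read never executes its input, but an unquoted expansion is still subject to word splitting and pathname globbing, which is what the quoted form protects against.
#!/bin/sh
# If the input line is a lone *, the unquoted echo expands it to the
# file names in the current directory; the quoted echo prints a literal *.
read FOO
echo ${FOO}
echo "${FOO}"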

How to expand shell variables in a text file?

Consider an ASCII text file (let's say it contains code of a non-shell scripting language):
Text_File.msh:
spool on to '$LOG_FILE_PATH/logfile.log';
login 'username' 'password';
....
Now if this were a shell script I could run it as $ sh Text_File.msh and the shell would automatically expand the variables.
What I want to do is have shell expand these variables and then create a new file as Text_File_expanded.msh as follows:
Text_File_expanded.msh:
spool on to '/expanded/path/of/the/log/file/../logfile.log';
login 'username' 'password';
....
Consider:
$ a=123
$ echo "$a"
123
So technically this should do the trick:
$ echo "`cat Text_File.msh`" > Text_File_expanded.msh
...but it doesn't work as expected, and the output file is identical to the source.
So I am unsure how to achieve this. My goal is to make it easier to maintain the directory paths embedded within my non-shell scripts. These scripts cannot contain any UNIX code, as they are not interpreted by the UNIX shell.
This question has been asked in another thread, and this is the best answer IMO:
export LOG_FILE_PATH=/expanded/path/of/the/log/file/../logfile.log
cat Text_File.msh | envsubst > Text_File_expanded.msh
if on Mac, install gettext first: brew install gettext
see:
Forcing bash to expand variables in a string loaded from a file
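If only certain variables should be expanded, GNU envsubst also accepts a shell-format argument listing them, which leaves every other $... in the file untouched (a sketch using the variable from the question):
export LOG_FILE_PATH=/expanded/path/of/the/log/file/../logfile.log
envsubst '$LOG_FILE_PATH' < Text_File.msh > Text_File_expanded.msh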
This solution is not elegant, but it works. Create a script called shell_expansion.sh:
echo 'cat <<END_OF_TEXT' > temp.sh
cat "$1" >> temp.sh
echo 'END_OF_TEXT' >> temp.sh
bash temp.sh >> "$2"
rm temp.sh
You can then invoke this script as follows:
bash shell_expansion.sh Text_File.msh Text_File_expanded.msh
If you want it in one line (I'm not a bash expert so there may be caveats to this but it works everywhere I've tried it):
when test.txt contains
line1 says ${line1}
line2 says ${line2}
then:
>line1=fark
>line2=fork
>value=$(eval "echo \"$(cat test.txt)\"")
>echo "$value"
line1 says fark
line2 says fork
Obviously if you just want to print it you can take out the extra value=$() and echo "$value".
If a Perl solution is ok for you:
Sample file:
$ cat file.sh
spool on to '$HOME/logfile.log';
login 'username' 'password';
Solution:
$ perl -pe 's/\$(\w+)/$ENV{$1}/g' file.sh
spool on to '/home/user/logfile.log';
login 'username' 'password';
One limitation of the above answers is that they both require the variables to be exported to the environment. Here's what I came up with that allows the variables to be local to the current shell script:
#!/bin/sh
FOO=bar;
FILE=`mktemp`; # Let the shell create a temporary file
trap 'rm -f $FILE' 0 1 2 3 15; # Clean up the temporary file
(
echo 'cat <<END_OF_TEXT'
cat "$#"
echo 'END_OF_TEXT'
) > $FILE
. $FILE
The above example allows the variable $FOO to be substituted in the files named on the command line. I'm sure it can be improved, but this works for me so far.
Thanks to both previous answers for their ideas!
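Assuming the snippet above is saved as expand.sh (an illustrative name), the expanded text is written to standard output and can be redirected:
sh expand.sh Text_File.msh > Text_File_expanded.msh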
If the variables you want to translate are known and limited in number, you can always do the translation yourself:
sed "s/\$LOG_FILE_PATH/$LOG_FILE_PATH/g" input > output
And also assuming the variable itself is already known
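With a handful of known variables this extends naturally to several -e expressions (SOME_OTHER_VAR is just an illustrative name):
sed -e "s|\$LOG_FILE_PATH|$LOG_FILE_PATH|g" \
    -e "s|\$SOME_OTHER_VAR|$SOME_OTHER_VAR|g" input > output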
This solution allows you to keep the same formatting in the output file.
Copy and paste the following lines into your script:
cat $1 | while read line
do
eval $line
echo $line
eval echo $line
done | uniq | grep -v '\$'
This will read the file passed as an argument line by line, and then try to print each line twice:
- once without substitution
- once with the variables substituted.
It then removes the duplicate lines,
and finally removes the lines still containing visible variables ($).
Yes, eval should be used carefully, but it gave me this simple one-liner for my problem. Below is an example using your filename:
eval "echo \"$(<Text_File.msh)\""
I use printf instead of echo for my own purposes, but that should do the trick. Thank you abyss.7 for providing the link that solved my problem. Hope it helps.
Create an ASCII file test.txt with the following content:
Try to replace this ${myTestVariable1}
bla bla
....
Now create a file "sub.sed" containing the variable names, e.g.
's,${myTestVariable1},'"${myTestVariable1}"',g;
s,${myTestVariable2},'"${myTestVariable2}"',g;
s,${myTestVariable3},'"${myTestVariable3}"',g;
s,${myTestVariable4},'"${myTestVariable4}"',g'
Open a terminal and move to the folder containing test.txt and sub.sed.
Define the value of the variable to be replaced:
myTestVariable1=SomeNewText
Now call sed to replace that variable
sed "$(eval echo $(cat sub.sed))" test.txt > test2.txt
The output will be
$cat test2.txt
Try to replace this SomeNewText
bla bla
....
#logfiles.list:
$EAMSROOT/var/log/LinuxOSAgent.log
$EAMSROOT/var/log/PanacesServer.log
$EAMSROOT/var/log/PanacesStrutsGUI.log
#My Program:
cat logfiles.list | while read line
do
eval Eline=$line
echo $Eline
done
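For example, with EAMSROOT=/opt/eams exported (an illustrative value), the loop would print:
/opt/eams/var/log/LinuxOSAgent.log
/opt/eams/var/log/PanacesServer.log
/opt/eams/var/log/PanacesStrutsGUI.log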

How can one store a variable in a file using bash?

I can redirect the output and then cat the file and grep/awk the variable, but I would like to use this file for multiple variables.
So if it were one variable, say STATUS, then I could do something like
echo "STATUS $STATUS" >> variable.file
# later, perhaps in a remote shell where variable.file was copied
NEW_VAR=`cat variable.file | awk '{print $2}'`
I guess some inline editing with sed would help. The smaller the code the better.
One common way of storing variables in a file is to just store NAME=value lines in the file, and then just source that in to the shell you want to pick up the variables.
echo 'STATUS="'"$STATUS"'"' >> variable.file
# later
. variable.file
In Bash, you can also use source instead of ., though this may not be portable to other shells. Note carefully the exact sequence of quotes necessary to get the correct double quotes printed out in the file.
If you want to put multiple variables at once into the file, you could do the following. Apologies for the quoting contortions that this takes to do properly and portably; if you restrict yourself to Bash, you can use $"" to make the quoting a little simpler:
for var in STATUS FOO BAR
do
echo "$var="'"'"$(eval echo '$'"$var")"'"'
done >> variable.file
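For instance, with STATUS=ok, FOO='hello world' and BAR=42 (illustrative values), variable.file would end up containing lines that can be sourced straight back in:
STATUS="ok"
FOO="hello world"
BAR="42"
# later, in another shell:
. variable.file
echo "$FOO"    # hello world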
The declare builtin is useful here
for var in STATUS FOO BAR; do
declare -p $var | cut -d ' ' -f 3- >> filename
done
As Brian says, later you can source filename
declare is great because it handles quoting for you:
$ FOO='"I'"'"'m here," she said.'
$ declare -p FOO
declare -- FOO="\"I'm here,\" she said."
$ declare -p FOO | cut -d " " -f 3-
FOO="\"I'm here,\" she said."
