Generating a bash script with echo, problem with shebang line - bash

I want to show some friends how to add multi-key support to their Linux systems at boot, and the first step is having them create a bash script. I want to give them a single command to copy and paste, and I'm testing out this command I made, but it keeps throwing an error. Only when I add the shebang line, which is, well, rather important.
$ sudo echo -e "#!/bin/bash \nxmodmap \"keysym Alt_R = Multi_key\"" > /etc/init.d/multikey.sh
Any easy way to echo a shebang line?

Use the other quotes: inside double quotes, an interactive bash still performs history expansion on the !, which is what throws the error. Single quotes keep it literal.
sudo echo -e '#!/bin/bash\nxmodmap "keysym Alt_R = Multi_key"'
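Note that with sudo the > redirection is still performed by your own (non-root) shell, so writing into /etc/init.d can fail with "Permission denied" even once the quoting is fixed. A minimal sketch that keeps the single quotes and lets tee do the privileged write (same target path as in the question):
echo -e '#!/bin/bash\nxmodmap "keysym Alt_R = Multi_key"' | sudo tee /etc/init.d/multikey.sh > /dev/null
sudo chmod +x /etc/init.d/multikey.sh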

If you want to impress your friends, use here documents instead of echo strings :-)
~$ cat << EOF > /etc/init.d/multikey.sh
> #!/bin/bash
> xmodmap "keysym Alt_R = Multi_key"
> EOF

Related

I need help echoing a variable using bash

I'm trying to append the line
log-bin="/mysql-log/bin-log"
to the file
/etc/mysql/test
and the following command does the job
echo 'echo "log-bin=""\"/mysql-log/bin-log\"" >> /etc/mysql/test' | sudo -s
However, I'm having a hard time making a more flexible script where the line to append is stored in a variable, i.e. by doing something like this:
#!/bin/bash
BIN_LOG_DIR="\"/mysql-log/bin-log\""
str="log-bin="$BIN_LOG_DIR
echo 'echo ${str} >> /etc/mysql/test' | sudo -s
This script adds an empty line rather than the intended value contained in the variable $str. How can I solve this problem?
Thanks in advance!!
gorf
When you use single quotes, the variables are not interpolated; you need double quotes. That's the problem here.
So:
#!/bin/bash
BIN_LOG_DIR="/mysql-log/bin-log"
str="log-bin$BIN_LOG_DIR"
echo "echo ${str} >> /etc/mysql/test" | sudo -s
See http://mywiki.wooledge.org/BashGuide/Practices#Quoting for explanations about the use of the different quotes.
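A quick illustration of the difference (the variable name here is just an example):
$ name=world
$ echo 'Hello $name'
Hello $name
$ echo "Hello $name"
Hello world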

escaping spaces and other special characters in cygwin shell script

I am pulling my hair out trying to get a script to work on cygwin. Here's the latest version of the script I am trying to run:
$ cat start_vm_2.sh
#!/bin/sh
VMRUN='/cygdrive/c/\"Program Files (x86)\"/VMware/VMware\ VIX/vmrun"'
echo "VMRUN is [$VMRUN]"
ARGS='-T ws start \"C:\\Users\\red\\Documents\\Virtual Machines\\myvm-dev-006 \(2\)\\myvm-dev-006 \(2\).vmx\"'
echo "ARGS is [$ARGS]"
And this is the error message I get:
$ ./start_vm_2.sh
VMRUN is [/cygdrive/c/\"Program Files (x86)\"/VMware/VMware\ VIX/vmrun"]
ARGS is [-T ws start \"C:\\Users\\red\\Documents\\Virtual Machines\\myvm-dev-006 \(2\)\\myvm-dev-006 \(2\).vmx\"]
./start_vm_2.sh: line 8: /cygdrive/c/\"Program: No such file or directory
You should run it with bash instead and store your arguments in arrays. Also, do not embed literal quote characters and backslash escapes in the strings:
#!/bin/bash
VMRUN="/cygdrive/c/Program Files (x86)/VMware/VMware VIX/vmrun"
echo "VMRUN is [$VMRUN]"
ARGS=(-T ws start 'C:\Users\red\Documents\Virtual Machines\myvm-dev-006 (2)\myvm-dev-006 (2).vmx')
echo "ARGS is [${ARGS[*]}]"
"$VMRUN" "${ARGS[#]}"
Run bash script.sh.
This might be your problem... I was having the same issue until I figured out my script had Windows line endings (check with cat -e script.ksh)... so I ran dos2unix on the file and it started to work the way I wanted.
Hope this is useful
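For reference, a short sketch of that check and fix, using the script name from the question (dos2unix has to be installed):
$ cat -e start_vm_2.sh     # Windows (CRLF) line endings show up as ^M$ at the end of each line
$ dos2unix start_vm_2.sh   # converts the file to Unix (LF) line endings in place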

gnome terminal tabs open multiple ssh connections

I have a file with a list of servers:
SERVERS.TXT:
192.168.0.100
192.168.0.101
192.168.0.102
From a gnome terminal script, I want open a new terminal, with a tab for each server.
Here is what I tried:
gnome-terminal --profile=TabProfile `while read SERVER ; do echo "--tab -e 'ssh usr@$SERVER'"; done < SERVERS.TXT`
Here is the error:
Failed to parse arguments: Argument to "--command/-e" is not a valid command: Text ended before matching quote was found for '. (The text was ''ssh')
Tried removing the space after the -e
gnome-terminal --profile=TabProfile `while read SERVER ; do echo "--tab -e'ssh usr@$SERVER'"; done < SERVERS.TXT`
And I get a similar error:
Failed to parse arguments: Argument to "--command/-e" is not a valid command: Text ended before matching quote was found for '. (The text was 'usr@192.168.0.100'')
Obviously there is a parsing error, since the shell is trying to be helpful by using the spaces to predict and place delimiters. The server file is changed without notice and many different sets of servers need to be looked at.
I found this question while searching for an answer to the issue the OP had, but my issue was a little different: I knew the list of servers, and they were not in a file.
Anyway, the other solutions posted did not work for me, but the following script does work, and it is what I use to get around the '--command/-e is not a valid command' error.
The script should be very easy to change to suit any need:
#!/bin/sh
# Open a terminal to each of the servers
#
# The list of servers
LIST="server1.info server2.info server3.info server4.info"
cmdssh=`which ssh`
for s in $LIST
do
title=`echo -n "${s}" | sed 's/^\(.\)/\U\1/'`
args="${args} --tab --title=\"$title\" --command=\"${cmdssh} ${s}.com\""
done
tmpfile=`mktemp`
echo "gnome-terminal${args}" > $tmpfile
chmod 744 $tmpfile
. $tmpfile
rm $tmpfile
Now the big question is why this works when run from a file, but not from within a script. Sure, the issue is about the escaping of the --command part, but everything I tried failed unless exported to a temp file.
I would try something like:
$ while read SERVER;do echo -n "--tab -e 'ssh usr@$SERVER' "; \
done < SERVERS.txt | xargs gnome-terminal --profile=TabProfile
This is to avoid any interpretation the shell could do of the parameters (anything starting with a dash).
Because it concatenates the strings (using -n), it is necessary to add a space between them.
Is this a problem of parsing command-line options? Sometimes if you have one command sending arguments to another command, the first can get confused. The convention is to use a -- like so:
echo -- "--tab -e 'ssh usr@$SERVER'";
Try putting
eval
before the gnome-terminal command.
It should be something like this:
eval /usr/bin/gnome-terminal $xargs
It worked for me!
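Applied to the original question, the idea would be roughly this (an untested sketch; the loop and the args variable are illustrative):
#!/bin/bash
args=""
while read SERVER; do
  args="$args --tab -e 'ssh usr@$SERVER'"
done < SERVERS.TXT
eval gnome-terminal --profile=TabProfile $args
Because eval re-parses the assembled string, the single quotes around each ssh command act as argument boundaries instead of being passed literally.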

How to expand shell variables in a text file?

Consider an ASCII text file (let's say it contains code of a non-shell scripting language):
Text_File.msh:
spool on to '$LOG_FILE_PATH/logfile.log';
login 'username' 'password';
....
Now if this were a shell script I could run it as $ sh Text_File.msh and the shell would automatically expand the variables.
What I want to do is have shell expand these variables and then create a new file as Text_File_expanded.msh as follows:
Text_File_expanded.msh:
spool on to '/expanded/path/of/the/log/file/../logfile.log';
login 'username' 'password';
....
Consider:
$ a=123
$ echo "$a"
123
So technically this should do the trick:
$ echo "`cat Text_File.msh`" > Text_File_expanded.msh
...but it doesn't work as expected: the output file is identical to the source.
So I am unsure how to achieve this. My goal is to make it easier to maintain the directory paths embedded within my non-shell scripts. These scripts cannot contain any UNIX code, as they are not interpreted by the UNIX shell.
This question has been asked in another thread, and this is the best answer IMO:
export LOG_FILE_PATH=/expanded/path/of/the/log/file/../logfile.log
cat Text_File.msh | envsubst > Text_File_expanded.msh
if on Mac, install gettext first: brew install gettext
see:
Forcing bash to expand variables in a string loaded from a file
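If you only want specific variables expanded and everything else in the file left untouched, envsubst also accepts a list of variable names; a small sketch with the same variable:
export LOG_FILE_PATH=/expanded/path/of/the/log/file/../logfile.log
envsubst '$LOG_FILE_PATH' < Text_File.msh > Text_File_expanded.msh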
This solution is not elegant, but it works. Create a script called shell_expansion.sh:
echo 'cat <<END_OF_TEXT' > temp.sh
cat "$1" >> temp.sh
echo 'END_OF_TEXT' >> temp.sh
bash temp.sh >> "$2"
rm temp.sh
You can then invoke this script as follows:
bash shell_expansion.sh Text_File.msh Text_File_expanded.msh
If you want it in one line (I'm not a bash expert so there may be caveats to this but it works everywhere I've tried it):
when test.txt contains
line1 says ${line1}
line2 says ${line2}
then:
>line1=fark
>line2=fork
>value=$(eval "echo \"$(cat test.txt)\"")
>echo "$value"
line1 says fark
line2 says fork
Obviously, if you just want to print the result, you can drop the value=$(...) assignment and echo "$value", and echo it directly.
If a Perl solution is ok for you:
Sample file:
$ cat file.sh
spool on to '$HOME/logfile.log';
login 'username' 'password';
Solution:
$ perl -pe 's/\$(\w+)/$ENV{$1}/g' file.sh
spool on to '/home/user/logfile.log';
login 'username' 'password';
One limitation of the above answers is that they require the variables to be exported to the environment. Here's what I came up with to allow the variables to be local to the current shell script:
#!/bin/sh
FOO=bar;
FILE=`mktemp`; # Let the shell create a temporary file
trap 'rm -f $FILE' 0 1 2 3 15; # Clean up the temporary file
(
echo 'cat <<END_OF_TEXT'
cat "$#"
echo 'END_OF_TEXT'
) > $FILE
. $FILE
The above example allows the variable $FOO to be substituted in the files named on the command line. I'm sure it can be improved, but this works for me so far.
Thanks to both previous answers for their ideas!
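A hedged usage sketch, assuming the script above is saved as expand_local.sh (the name is made up here) and $FOO appears somewhere in the input file; the expanded text goes to stdout, so it can simply be redirected:
sh expand_local.sh Text_File.msh > Text_File_expanded.msh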
If the variables you want to translate are known and limited in number, you can always do the translation yourself:
sed "s/\$LOG_FILE_PATH/$LOG_FILE_PATH/g" input > output
And also assuming the variable itself is already known
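If there are a handful of known variables, the same idea extends to a chain of expressions; a rough sketch (the second variable is only an example, and | is used as the sed delimiter so the slashes inside the expanded paths do not terminate the expression early):
sed -e "s|\$LOG_FILE_PATH|$LOG_FILE_PATH|g" \
    -e "s|\$HOME|$HOME|g" input > output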
This solution allows you to keep the same formatting in the output file.
Copy and paste the following lines into your script:
cat $1 | while read line
do
eval $line
echo $line
eval echo $line
done | uniq | grep -v '\$'
This will read the file passed as an argument line by line, and then try to print each line twice:
- once without substitution
- once with the variables substituted.
It then removes the duplicate lines, and finally removes the lines that still contain visible variables ($).
Yes, eval should be used carefully, but it provided me with this simple one-liner for my problem. Below is an example using your filename:
eval "echo \"$(<Text_File.msh)\""
I use printf instead of echo for my own purposes, but that should do the trick. Thank you abyss.7 for providing the link that solved my problem. Hope it helps.
Create an ASCII file test.txt with the following content:
Try to replace this ${myTestVariable1}
bla bla
....
Now create a file "sub.sed" containing the variable names, e.g.
's,${myTestVariable1},'"${myTestVariable1}"',g;
s,${myTestVariable2},'"${myTestVariable2}"',g;
s,${myTestVariable3},'"${myTestVariable3}"',g;
s,${myTestVariable4},'"${myTestVariable4}"',g'
Open a terminal and move to the folder containing test.txt and sub.sed.
Define the value of the variable to be replaced:
myTestVariable1=SomeNewText
Now call sed to replace that variable
sed "$(eval echo $(cat sub.sed))" test.txt > test2.txt
The output will be:
$ cat test2.txt
Try to replace this SomeNewText
bla bla
....
#logfiles.list:
$EAMSROOT/var/log/LinuxOSAgent.log
$EAMSROOT/var/log/PanacesServer.log
$EAMSROOT/var/log/PanacesStrutsGUI.log
#My Program:
cat logfiles.list | while read line
do
    eval Eline=$line    # expand the $EAMSROOT variable embedded in the line
    echo $Eline
done

BASH script to pass variables without substitution into new script

As part of a system build script I have a script that creates various files and configurations.
However, one part of the build script creates a new script that contains variables that I don't want resolved when the build script runs. Code snippet example:
cat - > /etc/profile.d/mymotd.sh <<EOF
hostname=`uname -n`
echo -e "Hostname is $hostname"
EOF
I have tried all sorts of combinations of ' and " and ( and [, but I cannot stop the values from being substituted, so the new script ends up with the substituted values rather than the original text.
Ideas?
The easiest method, assuming you don't want anything to be substituted in the here doc, is to put the EOF marker in quotes, like this:
cat - > /etc/profile.d/mymotd.sh <<'EOF'
hostname=`uname -n`
echo -e "Hostname is $hostname"
EOF
The easiest is to escape the $:
echo -e "Hostname is \$hostname"
