Remove trailing spaces from a command-line parameter in bash

I run a command:
script.sh ___bubu__
The content of the script.sh is:
echo $1
When executed I get
___bubu__
How can I remove the trailing spaces from arguments passed on the command line?
I copy some params from a file, and when pasting them into the command line I pick up extra spaces that I do not want to remove manually. I am planning to use $1 as a parameter in the script; for example, I want to create a folder named after $1.

Would something like this work?
$ more strip.sh
#!/bin/bash
arg=$1
arg=${arg// /}
echo "unstripped: --$1--"
echo "stripped: --$arg--"
$ ./strip.sh " test "
unstripped: -- test --
stripped: --test--
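Note that ${arg// /} removes every space in the argument, not just the surrounding ones. If you only want to trim leading and trailing whitespace, here is a minimal sketch using bash parameter expansion (the mkdir line is only there to illustrate the folder-creation use case from the question):
#!/bin/bash
arg=$1
arg="${arg#"${arg%%[![:space:]]*}"}"   # strip leading whitespace
arg="${arg%"${arg##*[![:space:]]}"}"   # strip trailing whitespace
echo "trimmed: --$arg--"
mkdir -- "$arg"                        # e.g. create a folder named after the cleaned argument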

Related

Bash preserve whitespaces and newlines from file content to variable

I have this code
TOKEN=$(cat ./config/token)
echo "$TOKEN"
cat > variables.env <<EOF
TOKEN=`echo "$TOKEN"`
EOF
I am trying to get the content of a file and output it in a new file prefixed by some text. The first echo in the console echoes the output I want, keeping the whitespaces and newlines.
However, in the new file the output is just the first line of the original string, while I'd like the same output I can see in the console with the first echo.
Use printf %q (in ksh or bash) to escape content in such a way that it will always evaluate back to its literal value:
printf 'TOKEN=%q\n' "$(<./config/token)" >variables.env
$(<file) is a ksh and bash extension which acts as a more efficient replacement for $(cat file) (as the regular command substitution needs to fork off a subprocess, set up a FIFO, and spawn an external copy of /bin/cat, whereas the $(<file) form simply tells the shell to read the file directly).
This way a token containing an otherwise-hostile string such as $(rm -rf ~) or content that could simply be expanded as a variable ($$) will be emitted as literal content.
Providing an explicit example of how this behaves:
printf '%s\n' "first line" "second line" >token # write two lines to the file "token"
printf 'TOKEN=%q\n' "$(<token)" >variables.env # write a shell command which assigns those
# two lines to a variable, into variables.env
source variables.env # execute variables.env in the current shell
echo "$TOKEN" # emit the value of TOKEN, as given in the current shell
...when run with bash, will emit the exact output:
first line
second line
...after writing the following (with bash 3.2.48; may vary with other releases) to variables.env:
TOKEN=$'first line\nsecond line'
Useless use of echo
This is what you could write:
cat > variables.env <<EOF
TOKEN=${TOKEN}
EOF
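With the two-line token from the example above, this heredoc writes the following to variables.env; the newlines are preserved, but, unlike the printf %q form, the resulting file cannot simply be sourced back:
TOKEN=first line
second line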
You are doing it in a very convoluted way; there are easier methods:
sed '1s/./TOKEN=&/' file > newfile
will insert TOKEN= at the start of the first line. This has the additional benefit of not modifying empty files (at least one character must exist in the original file). If that's not intended, you can use an unconditional insert.
You can do:
echo "TOKEN=" > newfile && cat ./config/token >> newfile
>> appends to a file.
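Note that echo ends its output with a newline, so the token content starts on the line after TOKEN=. If you want everything on one line, a sketch using printf (which does not add a trailing newline) instead:
printf 'TOKEN=' > newfile && cat ./config/token >> newfile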

How to capture chown output in a ksh script

I have written a script to change file ownerships based on an input list read in. My script works fine on directories without spaces in their names. However, it fails to change files in directories with spaces in their names. I would also like to capture the output from the chown command to a file. Could anyone help?
here is my script in ksh:
#!/usr/bin/ksh
newowner=eg27395
dirname=/home/sas/sastest/
logfile=chowner.log
date > $dir$logfile
command="chown $newowner:$newowner"
for fname in list
do
    in="$dirname/$fname"
    if [[ -e $in ]]
    then
        while read line
        do
            tmp=$(print "$line"|awk '{if (substr($2,1,1) == "/" ) print $2; if (substr($0,1,1) == "/" ) print '})
            if [[ -e $tmp ]]
            then
                eval $command \"$tmp\"
            fi
        done < $in
    else
        echo "input file $fname is not present. Check file location in the script."
    fi
done
A couple of other errors:
date > $dir$logfile -- no $dir variable is defined
to safely read from a file: while IFS= read -r line
But to answer your main concern, don't try to build up the command so dynamically: don't bother with the $command variable, don't use eval, and quote the variable:
chown "$newowner:$newowner" "$tmp"
The eval is stripping the quotes on this line
command="chown $newowner:$newowner"
In order to get the line to work with spaces you will need to provide backslashed quotes
command="chown \"$newowner:$newowner\""
This way the command that eval actually runs is
chown "$newowner:$newowner"
Also, you probably need quotes around this variable setting, although you'll need to tweak the syntax
tmp="$(print "$line"|awk '{if (substr($2,1,1) == "/" ) print $2; if (substr($0,1,1) == "/" ) print '})"
To capture the output you can add > file.out 2>&1, where file.out is the name of the log file. To get this working with eval as you are using it, you will need to backslash any special characters, much in the same way you need to backslash the double quotes.
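For example, without eval, the command inside the loop could capture its output (and errors) like this sketch, reusing the dirname and logfile variables from the question and appending so each chown writes to the same log:
chown "$newowner:$newowner" "$tmp" >> "${dirname}${logfile}" 2>&1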
Your example code suggests that list is a "meta" file: a list of files, each of which contains a list of files to be changed. When you only have one file you can remove the while loop.
When list is a variable with filenames you need echo "${list}" | while ....
It is not completely clear why you sometimes want to start with the third field. It seems that sometimes you have two words before the filename and want them to be ignored. Cutting the string on spaces becomes a problem when your filenames have spaces as well. The solution is to look for a space followed by a slash: that space is not part of a filename, and everything up to that space can be deleted.
newowner=eg27395
# The slash on the end is not really part of the dir name, doesn't matter for most commands
dirname=/home/sas/sastest
logfile=chowner.log
# Add braces, quotes and change dir into dirname
date > "${dirname}/${logfile}"
# Line with command not needed
# Is list an inputfile? It is streamed using "< list" at the end of while []; do .. done
while IFS= read -r fname; do
    in="${dirname}/${fname}"
    # Quotes are important
    if [[ -e "$in" ]]; then
        # Get the filenames with a sed construction, and hand them to chown with xargs.
        # The sed construction is made for the situation that, in a line with a space followed by a slash,
        # the filename starts with the slash.
        # sed uses # as the delimiter to avoid escaping the slashes.
        # Do not redirect the output here but after the loop.
        sed 's#.* /#/#' "${in}" | xargs chown "${newowner}:${newowner}"
    else
        echo "input file ${fname} is not present. Check file location in the script."
    fi
done < list >> "${dirname}/${logfile}"

Appending output from a command to a variable in Bash

I'm trying to append an output of a command to a variable in Bash. My code is
#!/bin/bash
for file in *
do
lineInfo=`wc -l $file`
echo "$lineInfo"
done
I understand how to "capture" the output of a command to a variable as I have done in this line by the use of backquotes.
lineInfo=`wc -l $file`
Is there a clean-cut way I can place the output of that entire for loop into a variable in Bash? Or, in each iteration of the for loop, append the output of the wc command to linesInfo? (Without redirecting anything to files.) Thanks.
This stores all the line infos (separated by commas) into one variable and prints that variable:
#!/bin/bash
total=""
for file in *
do
lineInfo=`wc -l $file`
total="$total$lineInfo, " # or total+="$lineInfo, "
done
echo $total
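Alternatively, if you want the whole loop's output in a single variable rather than building it up per iteration, you can wrap the loop in a command substitution, a minimal sketch:
total=$(for file in *; do wc -l "$file"; done)
echo "$total"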

Remove a line from a .txt file on grep match in Shell

So I'm fairly new to Shell scripting and trying to build a function that deletes a line from a .txt file.
To be clear I want to be able to run the following command
$ ./script.sh searchTerm delete
Which should find the line containing 'searchTerm' and remove it.
I am passing the $1 (to capture the searchTerm) into the deletePassword function but can't seem to get it to work.
Would love some advice :)
#Delete a password
if [[ $2 == "delete" ]]; then
    deletePassword $1
fi
function deletePassword () {
    line=grep -Hrn $1 pwstore.txt
    sed -n $line pwstore.txt
    echo "Deleted that for you.."
}
When running the previous command I get the following error:
sed: 1: "pwstore.txt": extra characters at the end of p command
Your line variable isn't being set as you expect; you need to use command substitution to capture the result of a command like that, e.g.:
line=$(grep -Hrn $1 pwstore.txt)
I would suggest just using sed instead:
sed -i.bak "/$1/d" pwstore.txt
This will delete any lines which match the string stored in $1 from pwstore.txt (and create a backup of the original file at pwstore.txt.bak).
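Putting that together, the script from the question could look like this sketch (keeping the pwstore.txt name, and defining the function before it is called):
#!/bin/bash
function deletePassword () {
    sed -i.bak "/$1/d" pwstore.txt
    echo "Deleted that for you.."
}
#Delete a password
if [[ $2 == "delete" ]]; then
    deletePassword "$1"
fi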

bash backup script error

So I'm writing a simple backup script that, when called, will either back up a specific file or all the files in my current directory into my backup directory.
This is my script
#!/bin/bash
#verify if $1 is empty, if so, copy all content to backup directory
if [ -z "$1" ]
then
    $files=ls
    #Looping through files
    for file in $files
    do
        cp $file ../backup/
    done
#else copy files specified
else
    $files=ls $1
    #Looping through files
    for file in $files
    do
        cp $file ../backup/
    done
fi
and the only error I'm getting is:
./backup: line 7: =ls: command not found
I'm not sure why the script won't recognize ls as a command. Any ideas?
To assign a variable, you don't need the dollar sign:
files=foo
To save the output of a command to a variable, you need to do:
files=$(ls)
or
files=$(ls /path/to/some/dir)
I see two mistakes:
When you initialize the variable "files", you should not prepend it with the "$" symbol.
The command ls should be placed in backquotes (`):
This is a short example:
#!/bin/bash
files=`ls`
for file in $files
do
echo "file: $file"
done
You should try putting the ls inside backquotes (`), like this:
files=`ls`
Here's a little background. Check this page for more information about quotation marks in the shell environment:
The back quote is not used for quoting characters. That character is
used for command substitution, where the characters between them are
executed by the shell and the result is inserted on that line.
Example:
echo the date is `date`
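Putting the corrections together, a working version of the backup script might look like this sketch; instead of capturing ls it iterates over the glob directly, which is an alternative to the answers above and also copes with filenames containing spaces:
#!/bin/bash
#verify if $1 is empty; if so, copy all content to the backup directory
if [ -z "$1" ]; then
    for file in *; do
        cp -- "$file" ../backup/
    done
#else copy the file specified
else
    cp -- "$1" ../backup/
fi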
