Unsure how to use ' in a bash script - bash

I am trying to SSH into a remote server and remove a number of backups, depending on how many we set to keep in the script.
#!/bin/bash
BKUSER=4582
BKSERVER=int-backup2.domain.com
DELETEMORETHAN=$(ssh "$BKUSER"@"$BKSERVER" 'find ~/backup/ -maxdepth 1 -type d | wc -l')
if [ "$DELETEMORETHAN" -gt 4 ] ; then
COUNT=$(echo "$DELETEMORETHAN - 4" | bc -l)
ssh "$BKUSER"#"$BKSERVER" 'echo rm -rvf "$(ssh "$BKUSER"#"$BKSERVER" 'find ~/backup/ -maxdepth 1 -type d | head -"${COUNT}"')'
fi
In this example, I am trying to keep 4 of the latest backups. I am failing at
ssh "$BKUSER"#"$BKSERVER" 'echo rm -rvf "$(ssh "$BKUSER"#"$BKSERVER" 'find ~/backup/ -maxdepth 1 -type d | head -"${COUNT}"')'
I was trying to use https://github.com/koalaman/shellcheck/wiki/SC2026, but it doesn't help, since the ' are not being grouped properly. I am stuck!
dennis-b:
[root@ngx /]# ./test
bash: -c: line 0: unexpected EOF while looking for matching `''
bash: -c: line 3: syntax error: unexpected end of file

There's really no point repeatedly ssh'ing into your server in order to construct a command line which will be executed on that server. Just ssh once, and give it the script that should be run.
To simplify a bit, there is no real need to use find. find ~/backup/ -maxdepth 1 -type d doesn't even produce the listing in order, so a simple glob ~/backup/*/ is probably better.
Assuming you have bash on the server,
ssh "$bkuser#$bkserver" \
'dirs=(~/backup/*/);
if ((${#dirs[#]} > 4)); then
echo rm -rvf "${dirs[#]:4}";
fi'
will probably do what you want. (Split into lines for readability; it can be typed all on one line, leaving out the line continuation character.)
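If the nested quoting still feels fragile, one variation (a sketch only, reusing the uppercase variable names from the question) is to feed the script to the remote shell on stdin through a quoted here-document, so nothing gets expanded locally:
ssh "$BKUSER@$BKSERVER" bash -s <<'REMOTE'
dirs=(~/backup/*/)
if ((${#dirs[@]} > 4)); then
    echo rm -rvf "${dirs[@]:4}"
fi
REMOTE
Because the here-document delimiter is quoted ('REMOTE'), the local shell passes the script through verbatim and every expansion happens on the server.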

The main difference between ' and " is that variables inside double quotes are replaced, while inside single quotes no variables are replaced and $() is not executed.
So to build the command that is to be executed on the server, you need to put it into double quotes as you need the $() to work:
ssh "$BKUSER"#"$BKSERVER" "echo rm -rvf '$(ssh "$BKUSER"#"$BKSERVER" "find ~/backup/ -maxdepth 1 -type d | head -'${COUNT}'")"
You basically have to switch your single and double quotes: the outermost quotes must be double quotes so that $() works and variables are replaced. The quotes that are required for the command to work on the server can be single quotes, as no variable replacement happens on the server; everything is done before the command is sent.
You may wonder why '$(ssh ...)' works in spite of being in single quotes. Actually, it is in double quotes! The single quotes are not quotes here, they are just plain text inside of double quotes. They only get interpreted as single quotes on the server.
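A quick local demonstration of that difference (the value of COUNT here is made up):
COUNT=3
echo 'head -"${COUNT}"'     # single quotes: prints head -"${COUNT}" literally, nothing expanded
echo "head -'${COUNT}'"     # double quotes: prints head -'3', the inner single quotes are just text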

Related

shell script to batch replace specific string in .csv file

I want to replace some strings in my raw CSV file for further use. I searched the internet and came up with the script below, but it doesn't seem to work. Hope anyone can help me.
The csv file is like this and I want to delete "^M" and "# Columns: " so that I can read my file.
# Task: bending1^M
# Frequency (Hz): 20^M
# Clock (millisecond): 250^M
# Duration (seconds): 120^M
# Columns: time,avg_rss12,var_rss12,avg_rss13,var_rss13,avg_rss23,var_rss23^M
#!/usr/bin/env bash
function scandir(){
    cd `dirname $0`
    echo `pwd`
    local cur_dir parent_dir workir
    workdir=$1
    cd ${workdir}
    if [ ${workdir}="/" ]
    then
        cur_dir=""
    else
        cur_dir=$(pwd)
    fi
    for dirlist in $(ls ${cur_dir})
    do
        if test -d ${dirlist}
        then
            cd ${dirlist}
            scandir ${cur_dir}/${dirlist}
            cd ..
        else
            vi ${cur_dir}/${dirlist} << EOF
            :%s/\r//g
            :%s/\#\ Columns:\ //g
            :wq
            EOF
        fi
    done
}
Your whole script looks like just:
find "$workdir" -type f | xargs -n1 sed -i -e 's/\r//g; s/^# Columns://'
Notes to your script:
Check your scripts for validity on https://www.shellcheck.net/
The use of the << EOF here document is invalid. The closing word EOF has to start at the beginning of the line inside the script:
            vi ${cur_dir}/${dirlist} << EOF
            :%s/\r//g
            :%s/\#\ Columns:\ //g
            :wq
EOF
#^^ no spaces in front of EOF, also no spaces/tabs after EOF
# the whole line needs to be exactly 'EOF'
There cannot be any spaces or tabs in front of it. Also, I don't think vi is the best tool to run substitutions on a file, and I don't know how it acts with tabs or spaces in front of :. You may want to try to run it without whitespace characters in front of the : commands:
vi ${cur_dir}/${dirlist} << EOF
:%s/\r//g
:%s/\#\ Columns:\ //g
:wq
EOF
Backticks ` are deprecated, less readable and don't allow for easy nesting. Use $( ... ) command substitution instead.
echo `pwd` is just a pointless use of echo; just use pwd.
for dirlist in $(ls ...): parsing ls output is bad. Use the find command instead or, if you have to, shell globbing, i.e. for dirlist in * (see the sketch after these notes).
if [ ${workdir}="/" ] is invalid. This tests if the string "${workdir}=/ is not null. Bash is space aware, it needs a space between = and operands. It should be if [ "${workdir}" = "/" ].
Always quote your variables. Don't do cd ${dirlist}; do cd "${dirlist}", and so on.
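A minimal sketch of the globbing-plus-quoting style from these notes (the directory name is made up):
for entry in /some/workdir/*; do
    if [ -d "$entry" ]; then
        echo "directory: $entry"
    else
        echo "file: $entry"
    fi
done
This iterates without parsing ls output and keeps filenames with spaces intact, because every expansion is quoted.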
The posted answers are correct, but I would recommend this syntax:
find "$1" -type f -name '*.csv' -exec sed -e 's/\r$//;s/^# Columns: //' -i~ {} +
Using + instead of \; at the end of the find command lets sed work on many files at once, reducing forks and making the whole job quicker.
The ~ after the -i option keeps the existing files as backups, with a tilde appended to their names, instead of losing them.
Using -type f ensures we work on regular files only (no symlinks, dirs, sockets, FIFOs, devices...).
You can reduce the entire script to one command, and you do not have to use Vim to process the files:
find ${workdir} -name '*.csv' -exec sed -i -e 's/\r$//; /^#/d' '{}' \;
Explanation:
find <dir> -name <pattern> -exec <command> \; will search <dir> for files matching <pattern> and execute <command> on each file. You want to search for CSV files and do something with them (run a command on them).
The command run on every (CSV) file that was found will be sed -i -e 's/\r$//; /^#/d'. This means to edit the files in-place (-i) and run two transformations on them. s/\r$// will remove the ^M from each line, and /^#/d will remove all lines that start with a #.
'{}' is replaced with the files found by find and \; marks the end of the command run by find (see the find manual page for this).
Most of your script emulates part of the find command. That is not a good idea.
Also, for simple text processing, it is easier and faster to use sed instead of invoking an editor such as Vim.
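To see what the sed expression does, you can pipe a couple of sample lines through it (sample data only; GNU sed is assumed for the \r escape):
printf '# Columns: time,avg_rss12\r\n42.0,3.1\r\n' | sed -e 's/\r$//; /^#/d'
# prints only: 42.0,3.1   (comment line deleted, carriage return stripped)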

How to get value of {} in shell?

find . -name "*.network" -o -name "*.index" | xargs -n 1 -I {} -P 5 sh -c "ncols=3; filename={}; echo $filename"
I want to get the filename stored in a variable. By setting filename={} and echoing $filename, I get no output in the console.
Since I want to use multiple parallel processes, xargs is necessary in my script.
I used single quotes as suggested by aicastell. But I want to use awk inside the quotes. What should I do with single quotes inside quotes? \s did not work.
Can anybody help me with this ?
Since $filename is within double quotes, the shell substitutes it before it even runs your pipe. In other words: the $filename in the echo statement refers to the variable in the calling shell, not in the subshell which you open with sh -c.
Hence, use single quotes instead!
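Applied to the command from the question, that means something like this (a sketch keeping the original pipeline; the expansion now happens in the sh started by xargs, though substituting {} straight into the command string is still fragile for unusual filenames):
find . -name "*.network" -o -name "*.index" | xargs -n 1 -I {} -P 5 sh -c 'ncols=3; filename={}; echo "$filename"'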
You can do something like this:
for FILENAME in $(find . -name "*.network" -o -name "*.index")
do
# process $FILENAME
done

Edit a find -exec echo command to include a grep for a string

So I have the following command which looks for a series of files and appends three lines to the end of everything found. Works as expected.
find /directory/ -name "file.php" -type f -exec sh -c "echo -e 'string1\string2\nstring3\n' >> {}" \;
What I need to do is also look for any instance of string1, string2, or string3 in each file.php that find turns up, prior to echoing/appending the lines, so I don't append to a file unnecessarily. (This is being run in a crontab.)
Using | grep -v "string" after the find breaks the -exec command.
How would I go about accomplishing my goal?
Thanks in advance!
That -exec command isn't safe for strings with spaces.
You want something like this instead (assuming finding any of the strings is reason not to add any of the strings).
find /directory/ -name "file.php" -type f -exec sh -c "grep -q 'string1|string2|string3' \"\$1\" || echo -e 'string1\nstring2\nstring3\n' >> \"\$1\"" - {} \;
To explain the safety issue.
find places {} in the command it runs as a single argument but when you splat that into a double-quoted string you lose that benefit.
So instead of doing that you pass the file as an argument to the shell and then use the positional arguments in the shell command with quotes.
The command above simply chains the echo to a failure from grep to accomplish the goal.
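The positional-argument part in isolation looks like this (a toy example with a made-up filename):
sh -c 'echo "checking: $1"' - "some file.php"
# prints: checking: some file.php
The lone - fills $0, so the filename arrives untouched as $1 no matter what characters it contains.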

Syntax error: "(" unexpected assigning an array in bash

Within a bash script, I'm trying to pull all files with an extension '.jstd' into an array, loop over that array and carry out some action.
My script is failing to copy the path of each script into the array.
I have the following script.
#!/bin/bash
IFS=$'\n'
file_list=($(find '/var/www' -type f -name "*.jstd"))
for i in "${file_list[#]}"; do
echo "$i"
done
echo $file_list
unset IFS
The line file_list=($(find '/var/www' -type f -name "*.jstd")) works fine in the terminal, but fails in the script with:
Syntax error: "(" unexpected
I've googled, but failed. All ideas gratefully received.
edit: In case it helps in reproduction or clues, I'm running Ubuntu 12.04, with GNU bash, version 4.2.25(1)-release (i686-pc-linux-gnu)
This is precisely the error you would get if your shell were /bin/sh on Ubuntu, not bash:
$ dash -c 'foo=( bar )'
dash: 1: Syntax error: "(" unexpected
If you're running your script with sh yourscript -- don't. You must invoke bash scripts with bash.
That being given, though -- the better way to read a file list from find would be:
file_list=( )
while IFS= read -r -d '' filename; do
file_list+=( "$filename" )
done < <(find '/var/www' -type f -name "*.jstd" -print0)
...the above approach working correctly with filenames containing spaces, newlines, glob characters, and other corner cases.
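Once the loop has run, the array can be used like any other, for example:
printf 'found %d files\n' "${#file_list[@]}"
for f in "${file_list[@]}"; do
    echo "$f"
done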

why isn't this variable in a find iteration changing its value

I was trying to rename all files using find but after i ran this...
find . -name '*tablet*' -exec sh -c "new=$(echo {} | sed 's/tablet/mobile/') && mv {} $new" \;
I found that my files were gone. I changed it to echo the value of $new and found that it always kept the name of the first file, so it basically renamed all files to the same name.
$ find . -name '*tablet*' -exec sh -c "new=$(echo {} | sed 's/tablet/mobile/') && echo $new" \;
_prev_page.tablet.erb
_prev_page.tablet.erb
_prev_page.tablet.erb
_prev_page.tablet.erb
_prev_page.tablet.erb
_prev_page.tablet.erb
_prev_page.tablet.erb
also tried to change to export new=..., same result
Why doesn't the value of new change?
The problem, I believe, is that the command substitution is expanded by bash once, and then find uses the result in each invocation. I could be wrong about the reason.
When I have similar stuff to do, I write out a shell script, e.g.
#! /bin/bash
old="$1"
new="${1/tablet/mobile}"
if [[ "${old}" != "${new}" ]]; then
mv "${old}" "${new}"
fi
that takes care of renaming the file then I can call that script from the find command
find . -name "*tablet*" -exec /path/to/script '{}' \;
makes things much simpler to sort out.
EDIT:
HAHA, after some messing around with the quoting, you can sort this out by changing the double quotes encapsulating the command to single quotes. As is, the $() is expanded by the calling shell; if done as below, the command substitution is done by the shell invoked by -exec.
find . -name "*tablet*" -exec sh -c 'new=$( echo {} | sed "s/tablet/mobile/" ) && mv {} $new' \;
So the issue has to do with when the command substitution is expanded; by putting it in single quotes we force the expansion in each invocation of sh.
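A related variant (a sketch, not what was run above) avoids substituting {} into the command string altogether by passing the filename as a positional parameter; bash -c is used so the ${1/tablet/mobile} expansion is available:
find . -name '*tablet*' -exec bash -c 'mv -- "$1" "${1/tablet/mobile}"' _ {} \;
Here _ fills $0 and each found file becomes $1 inside the invoked shell, so the substitution also runs once per file.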
