ssh find command and store - bash

I'm trying to capture the output of find in a bash script.
It works fine until I add another word to the search term.
Example: alien (works)
alien 1 (not working)
found201=$(ssh root@192.168.1.201 find "${folder201[@]}" ! -path "*/.wdmc/*" -type f -iname "*$ffind*" | sort)
If I run this in a terminal:
found201=$(ssh root@192.168.1.201 'find /shares/Public/ /shares/Videos/ -type f -iname "*alien 1*"' | sort)
with 'find ...' in single quotes it works, but then it's not using my string/array variables.
When I add '' to the script I get "bad substitution" (assuming it now treats the command as a string, not a command).
I need to use find because later I need to delete files etc. in a set manner.
How can I add ' ' around find and still use strings/arrays?
Running it normally I get this trace:
++ ssh root@192.168.1.201 find /shares/Public/ /shares/Videos/ '' '!' -path '*/.wdmc/*' -type f -iname '*alien 1*'
++ sort
find: unrecognized: 1*

Try it like this, enclosing the entire find command within double quotes, and using single quotes inside:
found201=$(ssh root@192.168.1.201 "find ${folder201[@]} ! -path '*/.wdmc/*' -type f -iname '*$ffind*'" | sort)
However, there is a caveat:
this won't work with folders in ${folder201[@]} that contain spaces.
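If the folder names themselves may contain spaces, one workaround (a sketch, assuming bash on both ends, since printf %q emits bash-style quoting) is to quote each array element before building the remote command line:
# Hypothetical variant: %q escapes spaces and other specials for the remote shell.
quoted=$(printf '%q ' "${folder201[@]}")
found201=$(ssh root@192.168.1.201 "find $quoted ! -path '*/.wdmc/*' -type f -iname '*$ffind*'" | sort)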

Think I have sorted it:
found201=$(ssh root@192.168.1.201 "find ${folder201[@]} ! -path '*/.wdmc/*' -type f -iname '*$ffind*'" | sort)
I changed '${folder201[@]}' to ${folder201[@]} (removed the single quotes).

Related

How to use wildcard * in alias [duplicate]

# first example
> alias gostyle="goimports -w $(find . -type f -name '*.go' -not -path './vendor/*')"
> alias gostyle
gostyle=$'goimports -w / gofiles /'
# second example
> alias gostyle="goimports -w $(find . -type f -name 'main.go' -not -path './vendor/*')"
> alias gostyle
gostyle='goimports -w ./main.go'
Why in the first example do I have $ in front of the command?
How can I use the wildcard * right in the alias?
Why do I get "permission denied" when I use the first alias?
Because you are using double quotes instead of single, the $(find ...) is executed once, at the time you define your alias. You end up with an alias with a hard-coded list of files.
The trivial fix is to use single quotes instead of double (where obviously then you need to change the embedded single quotes to double quotes instead, or come up with a different refactoring); but a much better solution is to use a function instead of an alias. There is basically no good reason to use an alias other than for backwards compatibility with dot files from the paleolithic age of Unix.
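For the record, the single-quote version would look something like this (a sketch; the inner single quotes become double quotes, and the $(find ...) is now deferred until the alias is used):
# Single quotes keep $(find ...) unexpanded until the alias runs.
alias gostyle='goimports -w $(find . -type f -name "*.go" -not -path "./vendor/*")'
But as noted, a function is the better solution: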
gostyle () {
    goimports -w $(find . -type f -name '*.go' -not -path './vendor/*') "$@"
}
(Unfortunately, I am not familiar with goimports; perhaps it needs its argument to be double quoted, or perhaps you should add one -w for each result that find produces. Or perhaps you actually want
gostyle () {
    find . -type f -name '*.go' -not -path './vendor/*' -print0 |
        xargs -0 goimports -w "$@"
}
where you might or might not want to include "$@".)

Script to find recursively the number of files with a certain extension

We have a highly nested directory structure, where we have a directory, let's call it 'my Dir', appearing many times in our hierarchy. I am interested in counting the number of "*.csv" files in all directories named 'my Dir' (yes, there is a whitespace in the name). How can I go about it?
I tried something like this, but it does not work:
find . -type d -name "my Dir" -exec ls "{}/*.csv" \; | wc -l
If you want to count the number of files matching the pattern '*.csv' under "my Dir", then:
don't ask for -type d; ask for -type f
don't ask for -name "my Dir" if you really want -name '*.csv'
don't try to ls *.csv on each match, because if there are N csv files in a directory, you could end up counting each one N times
also beware of embedding {} in -exec code!
For counting files from find, I like to use a trick I learned from Stéphane Chazelas on U&L; for example, from: Counting files in Linux:
find "my Dir" -type f -name '*.csv' -printf . | wc -c
This requires GNU find, as -printf is a GNU extension to the POSIX standard.
It works by looking within "my Dir" (from the current working directory) for files that match the pattern; for each matching file, it prints a single dot (period); that's all piped to wc, which counts the number of characters (periods) that find produced -- the number of matching files.
You could exclude all paths that are not under my Dir:
find . -type f -not '(' -not -path '*/my Dir/*' -prune ')' -name '*.csv'
Another solution is to use the -path predicate to select your files.
find . -path '*/my Dir/*.csv'
Counting the number of occurrences could be a simple matter of piping to wc -l, though this will obviously produce the wrong result if some of the files contain newlines in their names. (This is slightly pathological, but definitely something you want to cover in production code.) A common arrangement is to just print a newline for every found file, instead of its name.
find . -path '*/my Dir/*.csv' -printf '.\n' | wc -l
(The -printf predicate is not in POSIX but it's not hard to replace with an -exec or similar.)
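For example, a portable variant (a sketch) emits the counting character with -exec instead of -printf:
# POSIX-only: printf runs once per matching file, so it is slower but portable.
find . -path '*/my Dir/*.csv' -exec printf '.\n' \; | wc -l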

Bash basename only gives back the last name in folder list

FOLDERS=$( basename "$(find "${LOG_DIR}" ! -path "${LOG_DIR}" -type d )")
/storage/archive/fakeagent/2018-07-12
/storage/archive/fakeagent/2018-06-22
With the find command I get this list of folders, and I would like to get the last folder name of each path (the dates). When I use basename, I only get back one folder name, the last one: 2018-08-16.
How should I get all of the foldernames?
2018-07-12
2018-07-14
...
2018-08-16
You need to use option -a in the basename command to allow multiple arguments:
basename -a $(find "${LOG_DIR}" ! -path "${LOG_DIR}" -type d )
basename --help shows:
-a, --multiple support multiple arguments and treat each as a NAME
If some of your folders have spaces (or control characters) in their names, you'd better use the -exec option of the find command:
find "$LOG_DIR" -type d -exec basename "{}" \;
You could use awk to print whatever is after the last slash of each line:
find "${LOG_DIR}" ! -path "${LOG_DIR}" -type d | awk -F'/' '{print $NF}'
Or you can tell find to print just the basename directly:
find "${LOG_DIR}" ! -path "${LOG_DIR}" -type d -printf '%f\n'
As a side note, uppercase variable names are discouraged as they're more likely to clash with environment and shell variables, see the POSIX spec here, fourth paragraph.
You should not use the command like this, in my opinion. You are mixing folders from different locations into one list, which is largely pointless; in other words, retain the full path in your results if it is going to be of any value. Another thing: you may want to put the values in an array, so I will provide an alternative to the answers above:
array=()
while IFS= read -r -d $'\0'; do
    array+=("$(echo "$REPLY" | sed 's:.*/::')")
done < <(find . -type d -print0)
echo "${array[@]}"
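A shorter variant of the same idea (a sketch, assuming bash >= 4.4 for mapfile -d '' and GNU find for -printf) reads the basenames straight into the array:
# %f\0 prints each directory's basename NUL-terminated; mapfile splits on NUL.
mapfile -t -d '' array < <(find "$LOG_DIR" ! -path "$LOG_DIR" -type d -printf '%f\0')
printf '%s\n' "${array[@]}"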

find with nested command reading blacklist

I have a script that recursively searches all directories for specific files or specific file endings.
For these files I want to save the paths in a description file.
It looks for example like this:
./org/apache/commons/.../file1.pom
./org/apache/commons/.../file1.jar
./org/apache/commons/.../file1.zip
and so on.
In a blacklist, I describe which paths and file endings I want to ignore:
! -path "./.cache/*" ! -path "./org/*" ! -name "*.sha1" ! -name "*.lastUpdated"
and so on.
Now I want to read this blacklist file during the search so that the described files are ignored:
find . -type f $(cat blacklist) > artifact.descriptor
Unfortunately, the blacklist is not applied during the search.
When:
echo "find . -type f $(cat blacklist) > artifact.descriptor"
Result is as expected:
find . -type f ! -path "./.cache/*" ! -path "./org/*" ! -name "*.sha1" ! -name "*.lastUpdated" > artifact.descriptor
But it does not work; the described files are not excluded.
I tried the following command and it works, but I want to know why it does not work with find alone:
find . -type f | grep -vf $blacklist > artifact.descriptor
Hopefully someone can explain it to me :)
Thanks a lot.
As tripleee suggests, it is generally considered bad practice to store a command in a variable because it does not catch all the corner cases.
However, you can use eval as a workaround:
/tmp/test$ ls
blacklist test.a test.b test.c
/tmp/test$ cat blacklist
-not -name *.c -not -name *.b
/tmp/test$ eval "find . -type f "`cat blacklist`
./test.a
./blacklist
In your case I think it fails because the quotes in your blacklist file are treated as literal characters rather than as quoting that encloses the patterns: quote processing happens before $(cat blacklist) is expanded, so the quotes coming out of the substitution are passed to find as part of the patterns themselves. I think it works if you remove them, but it is still probably not safe for other reasons.
! -path ./.cache/* ! -path ./org/* ! -name *.sha1 ! -name *.lastUpdated
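A safer alternative to eval (a sketch, assuming bash >= 4.4 and that the blacklist follows xargs' default quote parsing) is to let xargs parse the quotes and collect the words into a real argument array:
# xargs honours the quotes in the file; printf re-emits each word NUL-terminated.
mapfile -t -d '' filters < <(xargs printf '%s\0' < blacklist)
find . -type f "${filters[@]}" > artifact.descriptor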

find and replace a string in a set of xml files using shell commands

I have 1000 files in a directory. I'm on a Solaris machine.
I want to replace the string "/example/test/temp" with "/testing/in/progress/" in all the xml files in my directory.
Any pointers will be great.
Thanks,
novice
How about (all on one line):
find . \( -type d ! -name . -prune \) -o \( -type f -name "*.xml" -print \) |
xargs perl -i.old -p -e 's-/example/test/temp-/testing/in/progress/-g'
Use sed(1).
In general it is safer to use an XSLT processor to massage XML files, but in this particular example, the chances of some "funky" XML representation causing problems are pretty remote ...
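A sketch of the sed approach (assuming GNU sed, e.g. gsed on Solaris, since the stock Solaris sed has no -i option):
# -i.old edits in place and keeps a backup, like the perl version above.
find . -type f -name '*.xml' -exec sed -i.old 's|/example/test/temp|/testing/in/progress/|g' {} +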
