Unix shell script not executing from another script - shell

I have written the below command in a shell script:
/usr/bin/find ${FilePath[$i]} -name ${FileName[$i]}* -type f -mtime +${DaysNo[$i]} | grep ${FilePath[$i]}$tempfile > tempFilesList
It works fine when I execute this script directly, but it gives me the below error when I try to execute it from another shell script.
ERROR : /usr/bin/find: bad option resultmgr.log_2019-11-07
/usr/bin/find: [-H | -L] path-list predicate-list

It's likely that ${FileName[$i]}* is being expanded to multiple file names which would give you something like -name file1 file2 in your command.
That could happen if, for example, files matching that mask existed in your current working directory for the case where you run it from another script, but not when you're running it from the command line. Some shells will expand if possible but leave alone if not, as per the following transcript:
~> echo testprog*
testprog testprog.c
~> echo nosuchfile*
nosuchfile*
~> _
That file2 would then be considered a control argument to find and therefore invalid.
You can check this by simply echoing out the command before running it:
echo Will run: /usr/bin/find ${FilePath[$i]} -name ${FileName[$i]}* -type f -mtime +${DaysNo[$i]} ...
and seeing what it outputs.
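The usual fix is to quote the expansion so that find, not the shell, interprets the pattern. A minimal, self-contained sketch (the directory layout, file names, and array values here are hypothetical stand-ins for the question's variables):

```shell
# Hypothetical demo: quoting the -name pattern keeps the shell from expanding it
tmp=$(mktemp -d)
touch "$tmp/resultmgr.log_2019-11-07" "$tmp/resultmgr.log_2019-11-08"
FilePath=("$tmp"); FileName=(resultmgr.log); i=0
# Unquoted, ${FileName[$i]}* could expand to several file names in the current
# directory and break find; quoted, find receives the literal pattern and does
# the matching itself.
find "${FilePath[$i]}" -name "${FileName[$i]}*" -type f | sort
rm -rf "$tmp"
```

The same quoting applies to the question's full command (including the -mtime test), whichever path find is invoked by.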

What is good way to move a directory and then run a command to the file inside it using a bash shell one-liner

I would like to find txt files with the find command, move to the directory of each found file, and then apply a command to the file, all in a bash one-liner.
For example, this command works, but acmd is executed in the current directory.
$ find . -name "*.txt" | xargs acmd
I would like to run acmd in the txt file's directory.
Does anyone have a good idea?
From the find man page:
-execdir command ;
-execdir command {} +
Like -exec, but the specified command is run from the subdirectory containing the matched file, which is not normally the directory in which you started find. This is a much more secure method for invoking commands, as it avoids race conditions during resolution of the paths to the matched files. As with the -exec action, the `+' form of -execdir will build a command line to process more than one matched file, but any given invocation of command will only list files that exist in the same subdirectory. If you use this option, you must ensure that your $PATH environment variable does not reference `.'; otherwise, an attacker can run any commands they like by leaving an appropriately-named file in a directory in which you will run -execdir. The same applies to having entries in $PATH which are empty or which are not absolute directory names. If find encounters an error, this can sometimes cause an immediate exit, so some pending commands may not be run at all. The result of the action depends on whether the + or the ; variant is being used; -execdir command {} + always returns true, while -execdir command {} ; returns true only if command returns 0.
Just for completeness, the other option would be to do:
$ find . -name \*.txt | xargs -i sh -c 'echo "for file $(basename {}), the directory is $(dirname '{}')"'
for file schedutil.txt, the directory is ./Documentation/scheduler
for file devices.txt, the directory is ./Documentation/admin-guide
for file kernel-parameters.txt, the directory is ./Documentation/admin-guide
for file gdbmacros.txt, the directory is ./Documentation/admin-guide/kdump
...
i.e. have xargs "defer to a shell". In use cases where -execdir suffices, go for it.
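Applied to the original goal, a minimal -execdir sketch might look like the following; pwd stands in for the hypothetical acmd so you can see which directory the command runs from, and the layout is made up for the demo:

```shell
# Throwaway demo layout (hypothetical names)
tmp=$(mktemp -d)
mkdir -p "$tmp/docs"
touch "$tmp/docs/notes.txt"
# -execdir runs the command from the directory containing each match,
# so pwd here prints the docs directory rather than where find started
find "$tmp" -name '*.txt' -execdir pwd \;
rm -rf "$tmp"
```

For the real case, `find . -name '*.txt' -execdir acmd {} \;` would run acmd beside each file, with {} expanding to ./notes.txt-style names.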

shell script does not find the directory

I'm starting out in shell scripting. I need to make checksums of a lot of files, so I thought I would automate the process using a shell script.
I made two scripts. The first runs a recursive ls command piped through egrep -v, taking the file path I enter as a parameter; its output is saved as a string in a variable, and a for loop then cuts that string into lines and passes each line as a parameter to the second script. The second script passes that parameter on to the hashdeep command, whose output is in turn saved as a string in another variable and cut up using IFS; finally I take the field of interest and put it in a text file.
The output is:
/home/douglas/Trampo/shell_scripts/2016-10-27-001757.jpg: No such file
or directory
----Checksum FILE: 2016-10-27-001757.jpg
----Checksum HASH:
The issue is: I set the directory ~/Pictures as the parameter, but the error output shows another directory, /home/douglas/Trampo/shell_scripts/ (the script's own directory). The file 2016-10-27-001757.jpg is in the ~/Pictures directory, so why is the script looking in its own directory?
First script:
#!/bin/bash
arquivos=$(ls -R $1 | egrep -v '^d')
for linha in $arquivos
do
bash ./task2.sh $linha
done
second script:
#!/bin/bash
checksum=$(hashdeep $1)
concatenado=''
for i in $checksum
do
concatenado+=$i
done
IFS=',' read -ra ADDR <<< "$concatenado"
echo
echo '----Checksum FILE:' $1
echo '----Checksum HASH:' ${ADDR[4]}
echo
echo ${ADDR[4]} >> ~/Trampo/shell_scripts/txt2.txt
I think that's it... sorry about the English grammar errors.
I hope the question has become clear.
Thanks in advance!
There are several things wrong in the first script alone.
When running ls in recursive mode using -R, the output is grouped per directory, and each file is listed relative to its parent rather than by full pathname.
ls -R doesn't list entries in long format, so filtering with egrep -v '^d' (which seems intended to keep files and drop directories) doesn't do what you want.
In your specific case, the missing file 2016-10-27-001757.jpg is in a subdirectory, but you lost its location by using ls -R.
Do not parse the output of ls. Use find and you won't have the same issue.
First script can be replaced by a single line.
Try this:
#!/bin/bash
find "$1" -type f -exec ./task2.sh "{}" \;
Or if you prefer using xargs, try this:
#!/bin/bash
find "$1" -type f -print0 | xargs -0 -n1 -I{} ./task2.sh "{}"
Note: enclosing {} in quotes ensures that task2.sh receives a complete filename even if it contains spaces.
In task2.sh the parameter $1 should also be quoted "$1".
If task2.sh is executable, you are all set. If not, add bash in the line so it reads as:
find "$1" -type f -exec bash ./task2.sh "{}" \;
Alternatively, if task2.sh is missing the execute permission, add it by running chmod:
chmod a+x task2.sh
Good luck.
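To see concretely why parsing ls -R loses the directory, compare the two listings on a throwaway layout (hypothetical names mirroring the question):

```shell
tmp=$(mktemp -d)
mkdir "$tmp/Pictures"
touch "$tmp/Pictures/2016-10-27-001757.jpg"
ls -R "$tmp"         # file names appear relative to their parent directory
find "$tmp" -type f  # full paths, safe to hand to another script
rm -rf "$tmp"
```

The ls -R output prints the Pictures header on one line and the bare file name on another, so a line-by-line loop sees only 2016-10-27-001757.jpg and resolves it against the script's own working directory; find emits one complete path per file.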

putting find in a bash_profile function

I want to make a bash function in my .bash_profile that basically does a find ./ -name $1. Very simple idea, but it seems not to work. My attempts don't print things the right way, i.e.:
find_alias() {
`find ./ -name $1 -print`
}
alias ff='find_alias $1'
With the above, if I do something like ff *.xml, I get the following one-liner:
bash: .pom.xml: Permission denied
The following after that:
find_alias() {
echo -e `find ./ -name $1 -print`
}
alias ff='find_alias $1'
does find them all, but puts the output onto one massive long line. What am I doing wrong here?
find_alias() {
find ./ -name $1 -print
}
You don't need, nor want, the backticks. That would try to execute what the find command returns.
Backticks make the shell treat the output of what's inside them as a command to be executed. If you ran echo "ls" inside backticks, the shell would first execute echo "ls", take the output, which is the text ls, and then execute that, listing all files.
In your case you are executing the textual result of find ./ -name *.xml -print, which is a list of matched files. Of course this makes no sense, because matched file names are (in most cases) not commands.
The output you are getting means two things:
- you tried to execute a script from pom.xml (as if you had typed ./pom.xml), which makes no sense
- you don't have execute rights for that file
So the simple solution to your problem, as @Mat suggested, is to remove the backticks and let the output of find be displayed in your terminal.
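One further point worth adding, hedged as a suggestion: even with the backticks removed, quoting "$1" inside the function, and quoting the pattern at the call site, keeps the shell from expanding *.xml against the current directory before find ever sees it. A small self-contained sketch (the directory layout is made up):

```shell
find_alias() {
  # "$1" quoted so find receives the literal pattern
  find ./ -name "$1" -print
}
# demo layout (hypothetical)
tmp=$(mktemp -d); mkdir "$tmp/sub"; touch "$tmp/a.xml" "$tmp/sub/b.xml"
# quote the pattern at the call site, too
( cd "$tmp" && find_alias '*.xml' | sort )
rm -rf "$tmp"
```

Called as ff '*.xml', this prints one matched path per line, including matches in subdirectories.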

Pipe ls output to get path of all directories

I want to list all directories (ls -d *) in the current directory and print all their full paths. I know I need to pipe the output to something, but I'm just not sure what. I don't know if I can pipe the output to pwd or something.
The desired result would be the following.
$ cd /home/
$ ls -d *|<unknown>
/home/Directory 1
/home/Directory 2
/home/Directory 3
<unknown> being the part which needs to pipe to pwd or something.
My overall goal is to create a script which will allow to me construct a command for each full path supplied to it. I'll type build and internally it will run the following command for each.
cd <full directory path>; JAVA_HOME=jdk/Contents/Home "/maven/bin/mvn" clean install
Try simply:
$ ls -d $PWD/*/
Or
$ ls -d /your/path/*/
find `pwd` -maxdepth 1 -type d -name '[^.]*'
Note: -maxdepth is placed before the tests, as GNU find expects. The above command works in bash or sh, not in csh. (Bash is the default shell on Linux and Mac OS X.)
ls -d $PWD/* | xargs -I{} echo 'cd {} && JAVA_HOME=jdk/Contents/Home "/maven/bin/mvn" clean install' >> /foo/bar/buildscript.sh
will generate the script for you. (Note the && so the build only runs once the cd has succeeded.)
Might I also suggest using -- within your ls construct, so that ls -d $PWD/*/ becomes ls -d -- $PWD/*/ (with an extra -- inserted)? This will help with those instances where a directory or filename starts with the - character:
/home/-dir_with_leading_hyphen/
/home/normal_dir/
In this instance, ls -d */ results in:
ls: illegal option -- -
usage: ls [-ABCFGHLOPRSTUWabcdefghiklmnopqrstuwx1] [file ...]
However, ls -d -- */ will result in:
-dir_with_leading_hyphen/ normal_dir/
And then, of course, you can use the script indicated above (so long as you include the -- any time you call ls).
No piping necessary:
find $(pwd) -maxdepth 1 -type d -printf "%H/%f\n"
To my surprise, a command substitution in the -printf format works too (the shell expands $(pwd) before find runs):
find -maxdepth 1 -type d -printf "$(pwd)/%f\n"
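For the overall build goal, a plain loop avoids generating an intermediate script entirely. A sketch under stated assumptions: echo stands in for the real mvn invocation, and the JAVA_HOME/maven paths are the hypothetical ones from the question.

```shell
# demo layout (hypothetical directory names, spaces included on purpose)
tmp=$(mktemp -d)
mkdir "$tmp/Directory 1" "$tmp/Directory 2"
for d in "$tmp"/*/; do
  # cd in a subshell so each iteration starts from the original directory
  ( cd "$d" && echo "would run: JAVA_HOME=jdk/Contents/Home /maven/bin/mvn clean install in $PWD" )
done
rm -rf "$tmp"
```

Quoting "$d" keeps directory names containing spaces (like "Directory 1") intact, which a pipe through xargs without -0 would mangle.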

help using xargs to pass multiple filenames to shell script

Can someone show me to use xargs properly? Or if not xargs, what unix command should I use?
I basically want to supply more than one file name for the <localfile> input parameter.
For example:
1. use `find` to get list of files
2. use each filename as input to shell script
Usage of shell script:
test.sh <localdir> <localfile> <projectname>
My attempt, but not working:
find /share1/test -name '*.dat' | xargs ./test.sh /staging/data/project/ '{}' projectZ \;
Edit:
After some input from everybody and trying -exec, I am finding that the <localfile> name coming from find also gives me the full path, /path/filename.dat, instead of filename.dat. Is there a way to get the basename from find? I think this will have to be a separate question.
I'd just use find -exec here:
% find /share1/test -name '*.dat' -exec ./test.sh /staging/data/project/ {} projectZ \;
This will invoke ./test.sh with your three arguments once for each .dat file under /share1/test.
xargs would pack up all of these filenames and pass them into one invocation of ./test.sh, which doesn't look like your desired behaviour.
If you want to execute the shell script for each file (as opposed to execute in only once on the whole list of files), you may want to use find -exec:
find /share1/test -name '*.dat' -exec ./test.sh /staging/data/project/ '{}' projectZ \;
Remember:
find -exec is for when you want to run a command on one file, for each file.
xargs instead runs a command only once, using all the files as arguments.
xargs stuffs as many files as it can onto the end of the command line.
Do you want to execute the script on one file at a time or all files? For one at a time, use file's exec, which it looks like you're already using the syntax for, and which xargs doesn't use:
find /share1/test -name '*.dat' -exec ./test.sh /staging/data/project/ '{}' projectZ \;
xargs does not have to combine arguments; it's just the default behavior. This properly uses xargs to execute the commands, as intended.
find /share1/test -name '*.dat' -print0 | xargs -0 -I'{}' ./test.sh /staging/data/project/ '{}' projectZ
When piping find to xargs, NUL termination is usually preferred, so I recommend appending the -print0 option to find. You must then add -0 to xargs so it expects NUL-terminated arguments. This ensures proper handling of filenames containing whitespace or newlines. It's not POSIX proper, but it is well supported. You can always drop the NUL-terminating options if your commands lack support.
Remember: while find's purpose is finding files, xargs is much more generic; I often use xargs to process non-filename arguments.
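On the follow-up edit about getting just the filename: one option is to strip the path inside the find action itself, e.g. with basename via -exec (GNU find users could also use -printf '%f\n'). A small sketch with a hypothetical file:

```shell
tmp=$(mktemp -d)
touch "$tmp/sample.dat"
# basename strips the leading directories from each match
find "$tmp" -name '*.dat' -exec basename {} \;   # prints sample.dat
rm -rf "$tmp"
```

The same idea composes with the test.sh invocation by wrapping it in a small sh -c, though at that point -execdir (which hands the script ./filename-style names) may be the cleaner tool.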
