I am running into the problem of commands failing because I expect them to be executed in a certain directory, and that is not the case.
For example I want to do:
pdfcrop --margins '0 0 -390 0' $pag "$pag"_1stCol.pdf
to create a new pdf document, and then
mv `\ls /home/dir | grep '_1stCol'` /home/gmanglano/dir/columns
The problem is that the mv command is failing: although it finds the document, it tries to move that file FROM the directory where I executed the script, not from where it was found.
This is happening to me somewhat often and I feel there is a concept I am missing, or I am thinking about this the wrong way around.
The error I get is:
mv: cannot stat '1stCol.pdf': No such file or directory
There is, in fact, said file; it just is not in the directory I launched the script from.
Instead of monkeying with ls and backticks and all that, just use the find command. It's built to find files and then execute a command based on the results:
find /home/dir -name "*_1stCol.pdf" -exec mv {} /home/gmanglano/dir/columns \;
This finds files in /home/dir whose names match *_1stCol.pdf and then moves them. The {} is the placeholder token for each found file.
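Applied to the original script, a minimal sketch (assuming $pag holds an absolute path such as /home/dir/page1.pdf; the paths here are illustrative):
#!/bin/bash
# Assumption: $pag is an absolute path to the source PDF.
pag=/home/dir/page1.pdf
pdfcrop --margins '0 0 -390 0' "$pag" "${pag%.pdf}_1stCol.pdf"
# find works on absolute paths, so the script's current directory no longer matters.
find /home/dir -name '*_1stCol.pdf' -exec mv {} /home/gmanglano/dir/columns \;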
Don't parse the output of ls: if you simplify the mv command to
mv /home/dir/*_1stCol.pdf /home/gmanglano/dir/columns
then you won't have an issue with being in the wrong directory.
Related
I would like to find txt files with the find command, move to the directory of each found file, and then apply a command to the file, using a bash one-liner.
For example, this command works, but the acmd is executed in the current directory.
$ find . -name "*.txt" | xargs acmd
I would like to run acmd in each txt file's directory.
Does anyone have a good idea?
From the find man page:
-execdir command ;
-execdir command {} +
    Like -exec, but the specified command is run from the subdirectory containing the matched file, which is not normally the directory in which you started find. This is a much more secure method for invoking commands, as it avoids race conditions during resolution of the paths to the matched files. As with the -exec action, the `+' form of -execdir will build a command line to process more than one matched file, but any given invocation of command will only list files that exist in the same subdirectory. If you use this option, you must ensure that your $PATH environment variable does not reference `.'; otherwise, an attacker can run any commands they like by leaving an appropriately-named file in a directory in which you will run -execdir. The same applies to having entries in $PATH which are empty or which are not absolute directory names. If find encounters an error, this can sometimes cause an immediate exit, so some pending commands may not be run at all. The result of the action depends on whether the + or the ; variant is being used; -execdir command {} + always returns true, while -execdir command {} ; returns true only if command returns 0.
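Applied to the question (with acmd standing in for the asker's command), -execdir runs it from each file's own directory:
find . -name '*.txt' -execdir acmd {} \;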
Just for completeness, the other option would be to do:
$ find . -name \*.txt | xargs -i sh -c 'echo "for file $(basename {}), the directory is $(dirname '{}')"'
for file schedutil.txt, the directory is ./Documentation/scheduler
for file devices.txt, the directory is ./Documentation/admin-guide
for file kernel-parameters.txt, the directory is ./Documentation/admin-guide
for file gdbmacros.txt, the directory is ./Documentation/admin-guide/kdump
...
i.e. have xargs "defer to a shell". In use cases where -execdir suffices, prefer it.
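If you do want the shell route but need the command to run in each file's directory, a hedged variant (acmd again being the placeholder) that passes the file name as an argument rather than splicing it into the shell string:
find . -name '*.txt' -print0 | xargs -0 -I{} sh -c 'cd "$(dirname "$1")" && acmd "$(basename "$1")"' _ {}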
I'm new to Linux (using bash) and I wanted to ask about something that I do often while I work. I'll give two examples.
Deleting multiple specific folders inside a certain directory.
Copying multiple specific folders into a certain directory.
I successfully did this with files, using find with some regex and then -exec and -delete. But for folders I found it more problematic, because I had problems piping the list of folders to the cp/rm command successfully, each time getting a "No such file or directory" error.
Looking online I found the following command (in my case for copying all folders starting with a Z):
cp -r $(ls -A | grep "Z*") destination
But when I execute it, it prints nothing and the prompt won't come back until I hit Ctrl+C, and nothing is copied.
How can I achieve what I'm looking for? For both cp and rm.
Thanks in advance!
First of all, you are trying to grep "Z*", but as a regular expression that means "zero or more Z characters", so it matches every line; to match names starting with Z you want grep "^Z".
Also, if you execute ls -A interactively you will get multiple columns; ls -1A prints one result per line (ls does this anyway when its output goes to a pipe, but -1 makes it explicit).
So for your command try something like:
cp -r $(ls -1A | grep "^Z") destination
or
cp -r $(ls -1A | grep "^Z") -t destination
But all the above is just to correct syntax of your example.
It is much better to use find, which passes each found name to cp as a single argument, with no quoting problems:
find <PATH_FROM> -type d -exec cp -r {} -t target \;
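A sketch matching the original requirement (folders starting with Z; destination is assumed to exist):
find . -mindepth 1 -maxdepth 1 -type d -name 'Z*' -exec cp -r {} -t destination \;
and the rm counterpart:
find . -mindepth 1 -maxdepth 1 -type d -name 'Z*' -exec rm -r {} +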
By using the command :
rm /file_path/*.csv
I can delete all csv files in the required folder. However, if the directory is empty or there are no csv files I get the following error:
No such file or directory
How do I avoid this error? I have this logic in a script with certain downstream dependencies, so throwing this error will cause the rest of my code to stop. What's the best way in bash to delete files only if they exist in the directory?
Another variant is to check whether your folder is empty before running your script:
find file_path/ -type d -empty
It returns the name of your folder if it is empty.
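A sketch of using that check in a script: skip the rm when the folder is empty (note a non-empty folder may still contain no .csv files, so this narrows but does not fully remove the failure case):
if [ -z "$(find file_path/ -maxdepth 0 -type d -empty)" ]; then
    rm file_path/*.csv
fi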
Or use the -f option with the rm command if you only want to suppress the error message:
Without:
rm -r file_path/*.csv
rm: cannot remove ‘file_path/*.csv’: No such file or directory
With:
rm -rf file_path/*.csv
See Test whether a glob has any matches in bash for ways to check if /file_path/*.csv matches anything. However, even if you do such a test before running the rm command it may fail if the directory has a very large number of CSV files. See Argument list too long error for rm, cp, mv commands.
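One such test, as a minimal sketch using bash's nullglob option:
shopt -s nullglob
files=(/file_path/*.csv)
if ((${#files[@]} > 0)); then
    rm -- "${files[@]}"
fi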
If you have a modern version of find, this is a reliable and efficient option:
find /file_path -maxdepth 1 -type f -name '*.csv' -delete
You can do: [ -f /file_path/*.csv ] && rm /file_path/*.csv (beware: this only works reliably when at most one file matches; with several matches the glob expands to multiple words and [ -f ... ] errors out).
I want to look through all files in my directory and its subdirectories,
then delete files with a specific name.
Here is my code
for filename in $1*;do
if("$filename" == "hello.txt");then
echo "WOW!"
fi
done
My test directory is TEST/ and there are two files, one named "hello.txt" and the other "world.txt". However, when I run the code I receive:
noStrange.sh: line 2: TEST/hello.txt: Permission denied
noStrange.sh: line 2: TEST/world.txt: Permission denied
I tried the command chmod u+x scriptname, but it doesn't work.
This is what I input
sh scriptname TEST/
Can anyone tell me what is wrong with the script?
Use the basename command to get the base name of a file from a file path variable.
for filename in "$1"*; do if [[ $(basename "$filename") == "hello.txt" ]]; then echo "wow"; fi; done
Or
Use the find command. This searches through all the files in the current folder as well as its subfolders.
find . -name 'hello.txt'
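To delete the matches rather than just list them (with GNU find):
find . -name 'hello.txt' -delete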
The immediate answer is that your syntax for tests is wrong; you should have
if [ "$filename" == "hello.txt" ]; then
(note the brackets and the spaces around them; the if (...) form you used runs the parenthesized text as a command in a subshell, which is why the shell tried to execute TEST/hello.txt and reported Permission denied). However, there are a few issues with your code. Since $filename will match TEST/hello.txt instead of hello.txt, you probably won't get the behavior you want. Also, if you're looking to just delete files with certain names, you probably want a normal UNIX command like
rm TEST/hello.txt
If there are patterns you want to delete, you can use globs/wildcards, or a combination of find, xargs and rm. E.g.
find TEST -name 'hello*.txt' | xargs rm
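A more robust variant, in case matched names contain spaces or other special characters:
find TEST -name 'hello*.txt' -print0 | xargs -0 rm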
I have a script called idk.sh at the root of a folder called autograder.
I also have a subdirectory in autograder called hw1 which contains some .sh files. I tried to print out each file's name and contents, but failed. I actually tried /hw1, /hw1/, and /hw1/*, and failed each time. I don't really understand why I failed to fetch the files, and hope someone could answer me, as I looked around the web and found that the approach should be /hw1/*. Thank you.
#!/bin/sh
for file in /hw1/*
do
    echo $file
    if [ -f $file ]
    then
        cat $file
        echo $file
    fi
done
I would simply use find to achieve this:
find ./hw1 -type f -print -exec cat {} \;
A directory path starting with / means an absolute path, that is, a path from the root of the filesystem. Relative paths start with any character other than / (and \0, but that's a technicality). You'll also want to use a reference to the directory of the script, to be able to run the script from other directories.
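Putting that together, a sketch of the script resolving hw1 relative to its own location, so it works no matter where it is launched from:
#!/bin/sh
# Resolve hw1 relative to the script's directory, not the caller's current directory.
dir=$(dirname "$0")
for file in "$dir"/hw1/*
do
    echo "$file"
    if [ -f "$file" ]
    then
        cat "$file"
        echo "$file"
    fi
done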
See also:
How do I determine the location of my script?
Bash Pitfalls
Linux Filesystem Tree Overview