Rename a file in a directory without retyping the directory name - shell

Say we have a file test.txt in my_directory that I want to rename to yeah.txt.
Is there a way with zsh (or even just bash, just to know) to avoid retyping my_directory?
I find the following a bit long:
mv my_directory/test.txt my_directory/yeah.txt
Thanks.

I'd do it with brace expansion:
mv my_directory/{test,yeah}.txt
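Brace expansion happens before mv ever runs, so the command above is exactly equivalent to the long form in the question. You can preview what any brace expression expands to by putting echo in front of it:
echo mv my_directory/{test,yeah}.txt
# prints: mv my_directory/test.txt my_directory/yeah.txt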

I have copy-prev-shell-word assigned to ^P
% cp my_directory/test.txt [^P] # expands to....
% cp my_directory/test.txt my_directory/test.txt
Then I just manually edit the last argument. For me this is a better solution than brace expansion, but I reckon it is just a preference.
If you're interested, have a look at these functions (man zshzle):
copy-prev-word (ESC-^_) (unbound) (unbound)
Duplicate the word to the left of the cursor.
copy-prev-shell-word
Like copy-prev-word, but the word is found by using shell parsing,
whereas copy-prev-word looks for blanks. This makes a difference when the
word is quoted and contains spaces.
To set up the binding for that function I use: bindkey -M emacs "^p" copy-prev-shell-word

Related

bash: using rename to left pad filenames with a zero when their prefix is too short

I'm using a naming convention with number prefixes to track some files, but I'm running out of room with a 2-digit prefix. So, instead of 11.abc 12.def I want to move to 011.abc 012.def. I already have some 013.xxx 014.yyy.
Trying this in an empty directory:
touch 11.abc 12.def 013.xxx 014.yyy
ls -1 gives:
013.xxx
014.yyy
11.abc
12.def
Try #1:
This should match anything that starts with 2 digits, but not 3.
rename -n 's/^\d\d[^\d]/0$1/' *
Now I was kind of hoping that $1 would hold the match, like 11, with 0$1 giving me 011.
No such luck:
Use of uninitialized value $1 in concatenation (.) or string at (eval 2) line 1.
'11.abc' would be renamed to '0abc'
Use of uninitialized value $1 in concatenation (.) or string at (eval 2) line 1.
'12.def' would be renamed to '0def'
On the positive side, it's willing to leave 013 and 014 alone.
Try #2: rename -n 's/^\d\d[^\d]/0/' *
'11.abc' would be renamed to '0abc'
'12.def' would be renamed to '0def'
Since this is regex based, can I somehow save the match group 11 and 12?
If I can't use rename I'll probably write a quick Python script; I don't want to loop with mv for this. And, actually, my naming convention is 2-3 digits followed by a dot, so this is a good match too:
rename -n 's/^\d\d\./<whatever needs to go here>/' *
For what it's worth, I am using the Homebrew version of rename, as I am on a mac.
Try this:
rename 's/^(\d{2}\..*)/0$1/' *
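For what it's worth, $1 was uninitialized in Try #1 because that pattern has no capturing parentheses; $1 only refers to a (...) group. A variant with a tighter capture group (same idea as the answer above; it matches two digits followed by a dot, so the existing 013/014 files are left alone) would be:
rename -n 's/^(\d{2}\.)/0$1/' *
Keep -n until the reported renames look right, then drop it to rename for real.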
rename is problematic because it's not part of POSIX (so it isn't normally available on many Unix-like systems), and there are two very different forms of it in widespread use. See Why is the rename utility on Debian/Ubuntu different than the one on other distributions, like CentOS? for more information.
This Bash code does the renaming with mv (which is part of POSIX):
#! /bin/bash -p
shopt -s nullglob # Patterns that match nothing expand to nothing.
for f in [0-9][0-9].* ; do
mv "$f" "0$f"
done
shopt -s nullglob is to prevent problems if the code is run in a directory that has no files that need to be renamed. If nullglob isn't enabled the code would try to rename a file called '[0-9][0-9].*', which would have unwanted consequences whether or not such a file existed.
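If you want to preview the renames first, the same loop with echo in front of mv just prints each command instead of running it:
for f in [0-9][0-9].* ; do echo mv "$f" "0$f"; done
Remove the echo once the output looks right.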

Renaming the file Directory which contains Space based on CSV in Shell

I need to rename files inside a folder whose path contains a space, e.g. Deco/main library/file1.txt
code:
while IFS="," read orig new pat
do
mv -v $pat$new $pat$orig
done < new.csv
csv file:
newname,file1.txt,Deco/main\\\ library/
error:
mv: invalid option -- '\'
Welcome to Stack Overflow!
First: use quotes around your variables. Except in very rare cases you should always write "$foo" instead of $foo, because with the unquoted form the shell will split the value at spaces into separate words, which you rarely want and in your case definitely do not.
Second: your CSV file seems to contain backslashes to quote the spaces, and some additional step seems to have added another level of quotation, so that you now end up with three backslashes and a space for each original space. If this really is the case (please double check that what you wrote in your question is correct, otherwise my answer doesn't fit), you need to unquote this before you can use it.
There are security issues involved in using eval, so do not use it lightly (this disclaimer is necessary whenever proposing eval), but if you trust the input you are handling not to contain anything nasty, you can do it with this code:
while IFS="," read orig new pat
do
eval eval mv -v "$pat$new" "$pat$orig"
done < new.csv
Using this, two levels of quotation are evaluated (that's what eval does) before the mv command is executed.
I strongly suggest doing a dry run first by adding echo before the mv; then, instead of being executed, the commands are merely printed.
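Rendering that suggestion literally (the same loop as above with echo inserted so nothing is moved yet, assuming the same new.csv as in the question):
while IFS="," read orig new pat
do
eval eval echo mv -v "$pat$new" "$pat$orig"
done < new.csv
Once the printed mv commands look right, remove the echo and run it for real.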

parameter expansion using bang dollar (`!$`)

Is there any way to use !$ in a parameter expansion context? The usage motivating this question is rapid (in terms of keystrokes) alteration of a file name. For example, instead of saving the file name in a variable and running rsvg-convert $svg > ${svg/.svg/.png}, one could write rsvg-convert !$ > !${/.svg/.png}, where !${/.svg/.png} is erroneous syntax intimating the desired effect. When the file in question was the last token on the preceding line, such a command can often be typed more quickly than alternatives like tab completion (awkward when files share prefixes of varying length) or copying and pasting the file name with the mouse. As far as I can tell, there is no way to employ !$ in such a context, but perhaps through some chicanery a similar effect could be achieved.
Depending on how sophisticated you want the substitution, history expansion does support replacing the first occurrence of a string with another. You just precede the substitution with : like:
rsvg-convert !$ > !$:s/.svg/.png
You can see all the history modifiers in the Bash manual, under HISTORY EXPANSION, in the Modifiers section.
At least in emacs mode, bash will also insert the last argument of the previous command inline (literally, rather than as something that gets expanded when you run the command) if you press Alt+. (yank-last-arg). So in this case it might be fastest to type:
rsvg-convert
then Alt+., then >, then Alt+. again, then delete the extension it just inserted with Alt+Backspace and type the new extension: png
If you look further into the modifiers in Eric's example, you could also do:
rsvg-convert !$ > !$:r.png
Assuming .svg is a suffix of course
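For reference, :r strips the trailing suffix from the word. With a hypothetical file called image.svg as the last argument of the previous command, the command from this answer
rsvg-convert !$ > !$:r.png
expands to
rsvg-convert image.svg > image.png
since !$ becomes image.svg, !$:r becomes image, and the literal .png is appended.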

Removing an optional / (directory separator) in Bash

I have a Bash script that takes in a directory as a parameter, and after some processing will do some output based on the files in that directory.
The command would be like the following, where dir is a directory with the following structure inside
dir/foo
dir/bob
dir/haha
dir/bar
dir/sub-dir
dir/sub-dir/joe
> myscript ~/files/stuff/dir
After some processing, I'd like the output to be something like this
foo
bar
sub-dir/joe
The code I have to remove the path passed in is the following:
shopt -s extglob
for file in $files ; do
filename=${file#${1}?(/)}
This gets me to the following, but for some reason the optional / is not being taken care of. Thus, my output looks like this:
/foo
/bar
/sub-dir/joe
The reason I'm making it optional is because if the user runs the command
> myscript ~/files/stuff/dir/
I want it to still work. And, as it stands, if I run that command with the trailing slash, it outputs as desired.
So, why does my ?(/) not work? Based on everything I've read, that should be the right syntax, and I've tried a few other variations as well, all to no avail.
Thanks.
that other guy's helpful answer solves your immediate problem, but there are two things worth noting:
enumerating filenames with an unquoted string variable (for file in $files) is ill-advised, as sjsam's helpful answer points out: it will break with filenames with embedded spaces and filenames that look like globs; as stated, storing filenames in an array is the robust choice.
there is no strict need to change global shell option shopt -s extglob: parameter expansions can be nested, so the following would work without changing shell options:
# Sample values:
file='dir/sub-dir/joe'
set -- 'dir/' # set $1; value 'dir' would have the same effect.
filename=${file#${1%/}/} # -> 'sub-dir/joe'
The inner parameter expansion, ${1%/}, removes a trailing (%) / from $1, if any; the outer expansion then strips that value plus the following / from the start of $file.
I suggest changing files to an array, which is a possible workaround for non-standard filenames that may contain spaces.
files=("dir/A/B" "dir/B" "dir/C")
for filename in "${files[#]}"
do
echo ${filename##dir/} #replace dir/ with your param.
done
Output
A/B
B
C
Here's the documentation from man bash under "Parameter Expansion":
${parameter#word}
${parameter##word}
Remove matching prefix pattern. The word is
expanded to produce a pattern just as in pathname
expansion. If the pattern matches the beginning of
the value of parameter, then the result of the
expansion is the expanded value of parameter with
the shortest matching pattern (the ``#'' case) or
the longest matching pattern (the ``##'' case)
deleted.
Since # tries to delete the shortest match, it will never include any trailing optional parts.
You can just use ## instead:
filename=${file##${1}?(/)}
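A quick way to see why ## is needed here (a small sketch using the question's pattern; the file name is just an example):
file='dir/sub-dir/joe'
shopt -s extglob
echo "${file#dir?(/)}" # shortest match stops before the optional slash: /sub-dir/joe
echo "${file##dir?(/)}" # longest match swallows it: sub-dir/joe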
Depending on what your script does and how it works, you can also just rewrite it to cd to the directory to always work with paths relative to .

Tricky brace expansion in shell

When using a POSIX shell, the following
touch {quick,man,strong}ly
expands to
touch quickly manly strongly
Which will touch the files quickly, manly, and strongly, but is it possible to dynamically create the expansion? For example, the following illustrates what I want to do, but does not work because of the order of expansion:
TEST=quick,man,strong #possibly output from a program
echo {$TEST}ly
Is there any way to achieve this? I do not mind restricting myself to Bash if need be. I would also like to avoid loops. The expansion should be given as complete arguments to any arbitrary program (i.e. the program cannot be called once for each file; it can only be called once for all files). I know about xargs, but I'm hoping it can all be done from the shell somehow.
... There is so much wrong with using eval. What you're asking is only possible with eval, BUT what you might want is easily possible without having to resort to bash bug-central.
Use arrays! Whenever you need to keep multiple items in one datatype, you need (or, should use) an array.
TEST=(quick man strong)
touch "${TEST[#]/%/ly}"
That does exactly what you want without the thousand bugs and security issues introduced and concealed in the other suggestions here.
The way it works is:
"${foo[#]}": Expands the array named foo by expanding each of its elements, properly quoted. Don't forget the quotes!
${foo/a/b}: This is a type of parameter expansion that replaces the first a in foo's expansion by a b. In this type of expansion you can use % to signify the end of the expanded value, sort of like $ in regular expressions.
Put all that together and "${foo[#]/%/ly}" will expand each element of foo, properly quote it as a separate argument, and replace each element's end by ly.
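If you want to check the result before handing it to touch (or any other program), printf shows each expanded argument on its own line:
TEST=(quick man strong)
printf '%s\n' "${TEST[@]/%/ly}"
This prints quickly, manly and strongly one per line, confirming that each element arrives as a separate, already-suffixed argument.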
In bash, you can do this:
#!/bin/bash
TEST=quick,man,strong
eval echo $(echo {$TEST}ly)
#eval touch $(echo {$TEST}ly)
That last line is commented out but will touch the specified files.
Zsh can easily do that:
TEST=quick,man,strong
print ${(s:,:)^TEST}ly
The variable's content is split at commas, then each element is distributed to the string around the braces:
quickly manly strongly
Taking inspiration from the answers above:
$ TEST=quick,man,strong
$ touch $(eval echo {$TEST}ly)
