Batch remove the first of two file extensions - bash

I am working with many pictures on macOS 10.12.
In order to do some image analyses I need to change the format from .JPG to .gif.
Using ImageMagick I did it relatively quickly, and now I have multiple files with the double extension *.JPG.gif.
I would like to remove the ".JPG" part from the file names, but for some reason what I am doing is not working. (I should say that this step is probably not critical to what I have to do next, but since I have a lot of files, simplifying their names as much as possible is probably best. I should also say that I have superuser permissions and none of the file names contain breaks or spaces, so even adding quotes to my code doesn't change anything.)
Here is what I am trying using a bash script:
#!/bin/bash
for file in /folder/*.JPG.gif
do
mv $file ${file#.JPG}
done
My understanding is that this code should remove the .JPG part from $file starting the match from the front of the file's name. And yet when I call the ls command to see if the program did what it is supposed to do, all the names are still there with the double extension.
Any help is greatly appreciated.

Modify your mv command like this:
#!/bin/bash
for file in /folder/*.JPG.gif
do
mv "$file" "${file/\.JPG}"
done
Your initial code uses ${file#.JPG}, an expansion that removes a matching pattern only from the beginning of the string; since the value starts with /folder/, nothing matches and nothing is removed. The expansion above, ${file/.JPG}, removes the first match of .JPG anywhere in the string.
Please note that this is not very robust. If you have ".JPG" in your path or filenames anywhere except at the end of your filenames, it will not do what you want. Quoting, even if not necessary in your case for now, is still good practice, as things change and code gets copied and pasted.
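For illustration, here is a minimal sketch (the sample path is made up) contrasting the two expansions, plus a variant that anchors the match at the end of the name, so a ".JPG" elsewhere in the path cannot be touched:
#!/bin/bash
file="/folder/photo.JPG.gif"
echo "${file#.JPG}"   # prefix removal: no change, the value does not start with ".JPG"
echo "${file/.JPG}"   # substitution: the first ".JPG" anywhere is removed -> /folder/photo.gif
# more defensive: strip the whole known suffix and re-append ".gif"
for f in /folder/*.JPG.gif; do
mv "$f" "${f%.JPG.gif}.gif"
done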

Related

Deleting files from a list using shell

I am a beginner with AppleScript & shell and am writing a script that at a certain point requires me to delete files that are listed within a .txt file. I have searched extensively on Stack Overflow and was able to come up with the following command that I am running from within my AppleScript...
do shell script "while read name; do
rm -r \"$name"\
done < ~Documents/Script\\ Test/filelist.txt"
It seems to recognize and read the file, but I get an error that says this and I cannot figure out why:
error "rm: ~/Documents/Script Test/filetodelete.rtf: No such file or directory" number 1
That said, I can navigate to that exact directory and verify that a file by that name with that extension does indeed exist. Can someone help shed some light on why this error might be occurring?
You have a typo. The path to the file is most probably ~/Documents, not ~Documents (which in Bash would be the home directory of a user whose account name is Documents).
If your shell is not Bash, it might not even support ~ for $HOME.
In the data file, you also cannot use ~ to refer to your home directory. You could augment the loop with a simple substitution to support this:
while read -r file; do
case $file in '~'*) file=$HOME${file#\~};; esac
rm -r "$file"
done < ~/"Documents/Script Test/filelist.txt"
Notice also the use of read -r to avoid some pesky problems with the legacy default behavior of read.
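For instance, with hypothetical contents of filelist.txt like these (the second path is made up):
~/Documents/Script Test/filetodelete.rtf
/tmp/old-report.txt
the case substitution rewrites the first line to $HOME/Documents/Script Test/filetodelete.rtf before rm runs, and passes the second one through unchanged.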

nemo script for torrents

Hi, I am new to scripting, and I do mean a complete newbie. I am working on a script to automatically make a torrent with nemo scripts.
#!/bin/bash
DIR="$NEMO_SCRIPT_SELECTED_FILE_PATHS"
BNAME=$(basename "$DIR")
TFILE="$BNAME.torrent"
TTRACKER="http://tracker.com/announce.php"
USER="USERNAME"
transmission-create -o "/home/$USER/Desktop/$TFILE" -t $TTRACKER "$DIR"
It does not work.
However if I replace
DIR="$NEMO_SCRIPT_SELECTED_FILE_PATHS"
with
DIR="absolutepath"
then it works like a charm. It creates the torrent on the desktop with the tracker I want. I think this would come in handy for many people. I don't really know what else to put. If you have questions, please ask. Again, complete newbie.
The $NEMO_SCRIPT_SELECTED_FILE_PATHS is the same as $NAUTILUS_SCRIPT_SELECTED_FILE_PATHS. It's populated by nemo/nautilus when you run the script and contains a newline-delimited (I think) list of the selected files/folders. Assuming you are selecting only one file or folder, I don't really see why it wouldn't work, unless the newline character is in there and causing problems. If that's the case, you may be able to strip it with sed. I'm not running nemo or nautilus, so I can't test it.
I finally found the solution to your problem and mine: https://askubuntu.com/questions/243105/nautilus-scripts-nautilus-script-selected-file-paths-have-problems-with-spac
The variable $NEMO_SCRIPT_SELECTED_FILE_PATHS/$NAUTILUS_SCRIPT_SELECTED_FILE_PATHS is a list of paths/filenames separated by a newline. This messes up anything that assumes it's just one filename, even if it is.
#!/bin/bash
echo "$NEMO_SCRIPT_SELECTED_FILE_PATHS" | while read DIR; do
BNAME=$(basename "$DIR")
TFILE="$BNAME.torrent"
TTRACKER="http://tracker.com/announce.php"
USER="USERNAME"
transmission-create -o "/home/$USER/Desktop/$TFILE" -t $TTRACKER "$DIR"
done
Notice it seems to do an extra pass for the trailing newline. You either need to filter that out or add a check that the file/folder exists.
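If that stray empty line is the culprit, a small guard inside the loop should be enough. Here is a sketch along the lines of the script above (the tracker URL and username are placeholders, as before):
#!/bin/bash
echo "$NEMO_SCRIPT_SELECTED_FILE_PATHS" | while read -r DIR; do
[ -e "$DIR" ] || continue   # skip the trailing empty line and anything that does not exist
BNAME=$(basename "$DIR")
TFILE="$BNAME.torrent"
TTRACKER="http://tracker.com/announce.php"
USER="USERNAME"
transmission-create -o "/home/$USER/Desktop/$TFILE" -t "$TTRACKER" "$DIR"
done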

How do I remove all lines matching a pattern from a set of files?

I've got an irritating closed-source tool which writes specific information into its configuration file. If you then try to use the configuration on a different file, it loads the old file. Grrr...
Luckily, the configuration files are text, so I can version control them, and it turns out that if one just removes the offending line from the file, no harm is done.
But the tool keeps putting the lines back in. So every time I want to check in new versions of the config files, I have to remove all lines containing the symbol openDirFile.
I'm about to construct some sort of bash command to run grep -v on each file, store the result in a temporary file, and then delete the original and rename the temporary, but I wondered if anyone knew of a nice clean solution, or had already concocted and debugged a similar invocation.
For extra credit, how can this be done without destroying a symbolic link in the same directory (favourite.rc->signals.rc)?
sed -i '/openDirFile/d' *.conf
This does the removal on all .conf files.
You can also combine the line with the find command if your conf files are located in different paths (see the sketch below).
Note that -i does the removal "in place".
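If the .conf files live under more than one directory, a find-based variant along these lines should work (the search root is a placeholder, and it assumes the same sed -i as above):
find /path/to/configs -type f -name '*.conf' -exec sed -i '/openDirFile/d' {} +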
This was the bash-spell that I came up with:
for i in *.rc ; do
TMP=$(mktemp)
grep -v openDirFile "$i" >"$TMP" && mv "$TMP" "$i"
done
(You can obviously turn this into a one-liner by replacing the newlines with semicolons, except after do.)
Kent's answer is clearly superior.

Save file in a different folder in vim

I'm working with several files in gvim on Windows 7. I need to test the files (Python scripts) in Linux. So apart from their original location I want to also save the files in a folder called linux. I want to do this with new files that I will be creating/modifying. That's why I want to use a mapping with the % sign to get the name of the current file into the new path.
The problem I'm having is that the % sign is escaped with a backslash, so this doesn't work:
:w C:\projects\linux\%:t
The original location being:
C:\projects\foo\
Is there a simple way to just save the current file in a different folder? (I have read that the % sign is a filename character, so I could erase it from the string isfname and it should work, but I think I am making it more complicated than it really is.)
Sorry, late to the party, but you could also use the workaround
:exe 'w C:\projects\linux\' . expand('%:t')
My recollection is that you can escape the backslash by doubling it (but I'm not on Windows at present so I can't confirm it immediately). You don't need to escape them all, just the one which is causing trouble:
:w C:\projects\linux\\%:t
You might be able to do this sort of thing fairly automatically using the autocmd feature.
The following (untested) line in your platform's equivalent of ~/.vimrc will update a copy of a file when gvim makes modifications:
" clear commands
autocmd!
" when writing buffers, save a copy -- see :help filename-modifiers
autocmd BufWritePost c:/path/to/source/directory w %:t
The :t will take just the tail of the pathname; if you're working with multi-level directories, perhaps :p:. would be better. See the documentation for more details.
If you change the last backslash to a forward slash it will work:
:w C:\projects\linux/%:t

Protect against accidental deletion

Today I saw first-hand the potential for a partial accidental deletion of a colleague's home directory (2 hours lost in a critical phase of a project).
I was worried enough about it to start thinking about the problem and a possible solution.
In his case a file named '~' somehow ended up in a test folder, which he later deleted with rm -rf... when rm got to that file, bash expanded it to his home folder (he managed to Ctrl-C almost in time).
A similar problem could happen if one has a file named '*'.
My first thought was to prevent creation of files with "dangerous names", but that would still not solve the problem, as mv or other corner-case situations could lead to the risky situation as well.
My second thought was creating a listener (I don't know if this is even possible) or an alias of rm that checks which files it processes and, if it finds a dangerous one, skips it and prints a message.
Something similar to this:
take all non-option arguments (so as to get the files one wants to delete)
cycle over these items
check if the current item is equal to a dangerous item (say, for example, '~' or '*'); I don't know if this works: at this point, is the item already expanded or not?
if so, echo a message and don't do anything to the file
proceed with the iteration
Third thought: has anyone already done or dealt with this? :]
There's actually pretty good justification for having critical files in your home directory checked into source control. As well as protecting against the situation you've just encountered, it's nice being able to version control .bashrc, etc.
Since the shell probably expands the parameter, you can't really catch 'dangerous' names like that.
You could alias 'rm -rf' to 'rm -rfi' (interactive), but that can be pretty tedious if you actually mean 'rm -rf *'.
You could alias 'rm' to 'mv "$@" $HOME/.thrash', and have a separate command to empty the thrash, but that might cause problems if you really mean to remove the files because of disk quotas or similar.
Or, you could just keep proper backups or use a file system that allows "undeletion".
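A rough sketch of that idea (the function names are made up, and ".thrash" is just the directory mentioned above); note that an alias cannot substitute its arguments into the middle of a command like that, so a small shell function is the usual way:
thrash() { mkdir -p "$HOME/.thrash" && mv -- "$@" "$HOME/.thrash"/; }
empty_thrash() { rm -rf "$HOME/.thrash" && mkdir -p "$HOME/.thrash"; }
You would then alias rm to thrash (alias rm=thrash) and run empty_thrash explicitly once you are sure nothing in there is still needed.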
Accidents do happen. You can only reduce their impact.
Both version control (regular checkins) and backups are of vital importance here.
If I can't check in (because it does not work yet), I back up to a USB stick.
And as the deadline approaches, the backup frequency increases, because Murphy strikes at the most inappropriate moment.
One thing I do is always have a file called "-i" in my $HOME.
My other tip is to always use "./*" or find instead of plain "*".
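For what it's worth, the "-i" trick works because a file literally named "-i" usually sorts first when "*" expands, so an accidental "rm *" becomes "rm -i file1 file2 ..." and rm starts prompting. Creating it needs "--" (or "./-i") so that touch does not treat it as an option; the directory here is just an example:
cd /some/important/directory
touch -- -i   # or: touch ./-i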
The version control suggestion gets an upvote from me. I'd recommend that for everything, not just source.
Another thought is a shared drive on a server that's backed up and archived.
A third idea is buying everyone an individual external hard drive that lets them back up their local drive. This is a good thing to do because there are two kinds of hard drives: those that have failed and those that will in the future.
You could also create an alias for rm that runs through a simple script that escapes all characters, effectively stopping you from using wildcards. Then create another alias that runs the real rm without escaping. You would only use the second if you are really sure. But then again, that's kinda the point of rm -rf.
Another option I personally like is to create an alias that redirects through a script and then passes everything on to rm. If the script finds any dangerous characters, it prompts you Y/N whether you want to continue, with N cancelling the operation and Y continuing on as normal.
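A bare-bones sketch of that kind of wrapper, written here as a shell function rather than an alias plus a separate script (the list of "dangerous" patterns is only an example, and by the time the function runs the shell has already expanded any globs, so it can only inspect the resulting arguments):
rm() {
    local arg answer
    for arg in "$@"; do
        case $arg in
            /|"$HOME"|"$HOME"/|'~')
                read -r -p "rm: '$arg' looks dangerous, continue? [y/N] " answer
                [ "$answer" = y ] || { echo "aborted"; return 1; }
                ;;
        esac
    done
    command rm "$@"
}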
At one company where I worked, we had a cron job which ran every half hour and copied all the source code files from everyone's home directory to a backup directory structure elsewhere on the system, just using find.
This wouldn't prevent actual deletion but it did minimise the work lost on a number of occasions.
This is pretty odd behaviour really - why is bash expanding twice?
Once * has expanded to
old~
this~
~
then no further substitution should happen!
I bravely tested this on my Mac, and it just deleted ~, and not my home directory.
Is it possible your colleague somehow wrote code that expanded it twice?
e.g.
ls | xargs | rm -rf
You may disable file name generation (globbing):
set -f
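For example, turned on just around a risky command and then restored:
set -f    # disable globbing
rm -rf *  # the literal "*" is passed to rm; only a file actually named "*" would be removed
set +f    # re-enable globbing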
Escaping special chars in file paths could be done with Bash builtins:
filepath='/abc*?~def'
filepath="$(printf "%q" "${filepath}")"
filepath="${filepath//\~/\\~}"
printf "%s\n" "${filepath}"
I use this in my ~/.bashrc
alias rm="rm -i"
rm prompts before deleting anything, and the alias can be circumvented either with the -f flag, or by escaping, e.g.
\rm file
Degrades the problem, yes; solves it, no.
