Script in Bash to delete files - bash

I am trying to delete all files containing the name TRAR in the filename. This is on a Linux system, and it is my first time writing such a script. Below is what I have tried, but it does not work:
cd /appl/virtuo/gways/input_d
rm -rf TRAR*
When I manually enter the directory and run rm -rf TRAR*, all the files are removed. I need this script to work so that it can be run via a cron job.
VENDOR=ericsson-msc
RELEASE=R13.2
BASE_DIR=/appl/virtuo/gways
RAW_DIR=${BASE_DIR}/config/${VENDOR}/${RELEASE}/trdipfile_raw_landing_area
#rm -rf $RAW_DIR/*
cd ${RAW_DIR}
ssh netperf@10.76.26.1 "cd /var/opt/ericsson/sgw/outputfiles/apgfiles/oms ; find . -newer ~/msc-trdif-timestamp -type f | egrep TRDIP | cpio -oc ; touch ~/msc-trdif-timestamp" 2>/dev/null | cpio -icdu 2>/dev/null

If you run this script via crontab, then you should add
#!/bin/sh as the first line of the file, and change the permissions of the file, for example:
chmod 755 script.sh
Or you can add the command to crontab as /bin/sh /<folder with scripts>/script.sh
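For example, a sketch of the setup this answer describes (the hourly schedule below is a placeholder assumption, not something from the question):
chmod 755 /<folder with scripts>/script.sh
and a crontab entry such as:
0 * * * * /bin/sh /<folder with scripts>/script.sh
Note that with the /bin/sh form the execute bit is not strictly required, but the #!/bin/sh first line still documents which shell the script expects.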

Related

How to delete a sub directory out of many directories using shell script

I want to delete a subdirectory which can be in any of several directories, using a shell script.
For example:
The main directory has 3 directories, a, b and c, and the test folder can be in any of them. I now want to delete the test directory. How can I do this?
From within main directory:
find . -type d -name 'test' -exec rm -rf {} \;
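A side note (an addition, not part of the original answer): with this form, find will also try to descend into each test directory after deleting it and may print "No such file or directory" warnings. Adding -prune avoids that, and -exec ... + batches the deletions:
find . -type d -name test -prune -exec rm -rf {} +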
You have different options to do it, but I like to use globstar:
rm -r **/subfolder
Full example:
$ cd /tmp
$ mkdir foo
$ cd foo/
$ mkdir -p bar/zzz
$ mkdir -p bar/aaa
$ mkdir -p bar/bbb
$ mkdir -p xxx/aaa
$ mkdir -p xxx/ccc
$ mkdir -p xxx/ddd
$ rm -r **/aaa
$ ls
bar xxx
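One assumption worth spelling out: in bash, the ** pattern only recurses into subdirectories when the globstar shell option is enabled (bash 4.0+); without it, ** behaves like a single * and matches only one level:
shopt -s globstar   # enable recursive ** matching in bash
rm -r **/subfolder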
You can try to find and then delete it like this:
find . -name test -type d -print0|xargs -0 rm -r --
using find:
find -type d -a -name test
will list all directories named test; then you can run
find -type d -a -name test|xargs rm -r
to remove them.
If your directory layout is that simple, you don't need a complex find pipeline; you can use pathname expansion directly:
$ rm -r [abc]/test

Ruby chmod works, but not for one directory called "js/"

I've been putting together a Ruby script that deploys a git repository on my webserver (running gitolite) with a post-receive hook.
After checking out the files I chmod the directories first and the files afterwards, like this:
FileUtils.chmod_R(0755, Dir.glob("#{deploy_to_dir}/**/*/"))
FileUtils.chmod_R(0644, Dir.glob("#{deploy_to_dir}/**/*"))
The first command works for all directories but one: js/. It just doesn't set +x on this directory, while at the same time setting +r.
Here's what happens:
Before: dr-------- js/
Script does chmod 755 on js/
After: drw-r--r-- js/
Expected: drwxr-xr-x js/
I checked the attributes with lsattr. It gives only -----------------e- ./js/, which shows nothing special. Is there anything else that could be wrong?
Changing it in bash directly works fine. What does Ruby do to this single directory?
Try reversing the order:
FileUtils.chmod_R(0644, Dir.glob("#{deploy_to_dir}/**/*"))
FileUtils.chmod_R(0755, Dir.glob("#{deploy_to_dir}/**/*/"))
Otherwise all files and directories are matched by the 0644 chmod, which undoes the execute bit you just set on the directories.
In the end the problem was that js/ or en/ were the first directories in the glob. => Now it's a bash script and it works.
#!/bin/bash
# post-receive
# 1. Read STDIN (Format: "from_commit to_commit branch_name")
read from to branch
if [[ $branch =~ master$ ]] ; then
deploy_to_dir='/var/www/virtual/whnr/vectoflow'
GIT_WORK_TREE="$deploy_to_dir" git checkout -f master
elif [[ $branch =~ development$ ]] ; then
deploy_to_dir='/var/www/virtual/whnr/vectotest'
GIT_WORK_TREE="$deploy_to_dir" git checkout -f development
else
echo "Received branch $branch, not deploying."
exit 0
fi
# 3. chmod +r whole deploy_to_dir
find "$deploy_to_dir" -type d -print0 | xargs -0 chmod 755
echo "DEPLOY: Changed Permissions on all directories 755"
find "$deploy_to_dir" -type f -print0 | xargs -0 chmod 644
echo "DEPLOY: Changed Permissions on all files 644"

Delete script in Linux

I am trying to edit the following script to add a part that removes all files within a folder that contain the word TRAR in their filename. This is a Linux system, and I want to add a part like this:
cd /appl/virtuo/gways/config/input_d
rm -rf TRAR*
The above I cannot get to run, but when I try it manually it works. I want to add it into the script below and am quite lost, as this is my first time writing such a script:
VENDOR=ericsson-msc
RELEASE=R13.2
BASE_DIR=/appl/virtuo/gways
RAW_DIR=${BASE_DIR}/config/${VENDOR}/${RELEASE}/trdipfile_raw_landing_area
cd ${RAW_DIR}
ssh netperf@10.76.26.1 "cd /var/opt/ericsson/sgw/outputfiles/apgfiles/oms ; find . -newer ~/msc-trdif-timestamp -type f | egrep TRDIP | cpio -oc ; touch ~/msc-trdif-timestamp" 2>/dev/null | cpio -icdu 2>/dev/null
I tried it both with and without quotes; try quotes and it should work (just tested). Also, if you want to delete all files containing "TRAR", you should use a wildcard before and after:
rm -rf *"TRAR"*
You can also save the pattern to a variable and test it:
delit="TRAR"
rm -rf "$delit"*

Parallel processing of untar/remove in unix shell script

Question:
I want to untar a tar file which contains many tar files, remove the files in all of those tar files, and have all of these processes run in parallel in a Unix bash script.
Conditions:
The script should return an error if any untar/remove process has any error.
It should only return success after all N (untar and remove) processes complete successfully.
Proposed solution:
mkdir a
tar -C a -xvf b.tar
cd a
for i in *
do
rm -r $i &
done
If you have GNU Parallel (http://www.gnu.org/software/parallel/) installed, you can do this:
tar xvf foo.tgz | perl -ne 'print $l;$l=$_;END{print $l}' | parallel rm
The perl one-liner delays the output by one line, so rm only sees a file name after tar has finished writing that file. This is useful if you do not have space to extract the full tar.gz file but need to process the files as you unpack them:
tar xvf foo.tgz | perl -ne 'print $l;$l=$_;END{print $l}' | parallel do_stuff {}\; rm {}
You can install GNU Parallel simply by:
wget http://git.savannah.gnu.org/cgit/parallel.git/plain/src/parallel
chmod 755 parallel
cp parallel sem
Watch the intro videos for GNU Parallel to learn more:
https://www.youtube.com/playlist?list=PL284C9FF2488BC6D1
mkdir a
tar -C a -xvf b.tar
cd a
success=$(for i in *
do
rm -r "$i" || echo failed & # if a job fails, "failed" is echoed
done
wait)
# if any of the jobs failed, success will be set to a value other than ""
[[ -z "$success" ]] && exit 0 || exit 1
The answer tar xvf a.tar | tac | xargs -P 4 rm -rv was inspired by Burton Samograd's comment about xargs -P. (tac reverses tar's listing so that files are removed before the directories that contain them.)
$ mkdir -p a/b/c/d
mkdir: created directory `a'
mkdir: created directory `a/b'
mkdir: created directory `a/b/c'
mkdir: created directory `a/b/c/d'
$ touch a/1 a/2 a/3 a/b/4 a/b/5
$ tar cf a.tar a
$ rm -rfv a
removed directory: `a/b/c/d'
removed directory: `a/b/c'
removed `a/b/4'
removed `a/b/5'
removed directory: `a/b'
removed `a/3'
removed `a/1'
removed `a/2'
removed directory: `a'
$ tar xvf a.tar | tac | xargs -P 4 rm -rv
removed `a/2'
removed `a/1'
removed `a/3'
removed `a/b/5'
removed `a/b/4'
removed directory: `a/b/c/d'
removed directory: `a/b/c'
removed directory: `a/b'
removed directory: `a'

How do I write a shell script to remove the unzipped files in a wrong directory?

I accidentally unzipped files into the wrong directory; actually there are hundreds of files. Now the directory is a mess of the original files and the wrongly unzipped files. I want to pick out the unzipped files and remove them using a shell script, e.g.
$unzip foo.zip -d test_dir
$cd target_dir
$ls test_dir | rm -rf
Nothing happened and no files were deleted. What's wrong with my command? Thanks!
The following script has two main benefits over the other answers thus far:
It does not require you to unzip a whole 2nd copy to a temp dir (I just list the file names)
It works on files that may contain spaces (read -r _ _ _ file puts the whole remainder of each line into file, whereas parsing ls will break on spaces)
while read -r _ _ _ file; do
arr+=("$file")
done < <(unzip -qql foo.zip)
rm -f "${arr[#]}"
The right way to do this is with xargs:
$find ./test_dir -print | xargs rm -rf
Edited: thanks to SiegeX for explaining the OP's question to me.
This reads the wrongly unzipped file names from the test dir and removes them from the target dir.
$unzip foo.zip -d /path_to/test_dir
$cd target_dir
(cd /path_to/test_dir ; find ./ -type f -print0 ) | xargs -0 rm
I use find with -print0 because filenames can contain blanks and newlines. If that is not your case, you can run it with ls:
$unzip foo.zip -d /path_to/test_dir
$cd target_dir
(cd /path_to/test_dir ; ls ) | xargs rm -rf
Before executing, you should test the script by replacing rm with echo.
Try
for file in $( unzip -qql FILE.zip | awk '{ print $4 }'); do
rm -rf DIR/YOU/MESSED/UP/$file
done
unzip -l lists the contents with a bunch of information about the zipped files. You just have to grep the file names out of it.
EDIT: using -qql as suggested by SiegeX
The following worked for me (bash)
unzip -l filename.zip | awk '{print $NF}' | xargs rm -Rf
Do this:
$ ls test_dir | xargs rm -rf
You need ls test_dir | xargs rm -rf as your last command
Why:
rm doesn't take input from stdin, so you can't pipe the list of files to it. xargs takes the output of the ls command and passes it to rm as arguments so that rm can delete the files.
Compacting the previous one, run this command in /DIR/YOU/MESSED/UP:
unzip -qql FILE.zip | awk '{print "rm -rf " $4 }' | sh
enjoy
