Save the current bash script again

Is it possible to re-save the running bash script?
I am running a loop in a bash file to perform certain operations. After the operations have completed, I want to re-save the current bash file without adding any extra code; I just want the file's last-modified date to change.
The current run.sh looks something like:
#!/bin/bash
FILES=/home/shell/test/*
for f in $FILES
do
    if [[ "$f" != *\.* ]]
    then
        DO STUFF
    fi
done
After the loop is done, I want run.sh to carry the current date. Is it possible to do this from within the script itself?

You can use the touch command in your script:
touch "$0"
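For instance, a minimal sketch of the whole script with touch at the end (the loop body is a placeholder for the question's DO STUFF):

```shell
#!/bin/bash
# Process files without a dot in their name, then re-save this script's
# own modification time. The loop body is a placeholder.
for f in /home/shell/test/*; do
    if [[ "$f" != *.* ]]; then
        : # DO STUFF (placeholder)
    fi
done
# $0 is the path this script was invoked as; touch updates its mtime
# without changing its contents.
touch -- "$0"
```

Note that this only changes the timestamp; the file's contents are untouched, so the running shell is unaffected.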

Related

Getting the exit code of a Python script launched in a subshell with Bash

I want to run a Bash script every minute (through a CRON entry) to launch a series of Python scripts in a granular (time-wise) fashion.
So far, this is the script I've made:
# set the current date
DATE=`date +%Y-%m-%d`
# set the current system time (HH:MM)
SYSTIME=`date +%H-%M`
# parse all .py script files in the 'daily' folder
for f in daily/*.py; do
    if [[ -f $f ]]; then
        # set the script name
        SCRIPT=$(basename $f)
        # get the script time
        SCRTIME=`echo $SCRIPT | cut -c1-5`
        # execute the script only if its intended execution time and the system time match
        if [[ $SCRTIME == $SYSTIME ]]; then
            # ensure the directory exists
            install -v -m755 -d done/$DATE/failed
            # execute the script
            python3 evaluator.py $f > done/$DATE/daily-$SCRIPT.log &
            # evaluate the result
            if [ $? -eq 0 ]; then
                # move the script to the 'done' folder
                cp $f done/$DATE/daily-$SCRIPT
            else
                # log the failure
                mv done/$DATE/daily-$SCRIPT.log done/$DATE/failed/
                # move the script to the 'retry' folder for retrial
                cp $f retry/daily-$SCRIPT
            fi
        fi
    fi
done
Let's say we have the following files in a folder called daily/ (for daily execution):
daily/08-00-script-1.py
daily/08-00-script-2.py
daily/08-05-script-3.py
daily/09-20-script-4.py
The idea for granular execution is that a CRON task runs this script every minute. I fetch the system time and extract the execution time for each script and when the time matches between the system and the script file, it gets handed over to Python. So far, so good.
I know this script is not right, in the sense that it spawns a subshell for each script to be executed. The code below is wrong, though, because (if I read correctly while searching on Google) Bash automatically returns 0 on subshell invocation. What I need is for each subshell to execute the code below, so that if a script fails it gets sent to another folder (retry/), which is handled by another Bash script that runs checks every 30 minutes for retrial (it's the same script as this one, minus the checking part).
So, basically, I need to run this:
# evaluate the result
if [ $? -eq 0 ]; then
    # move the script to the 'done' folder
    cp $f done/$DATE/daily-$SCRIPT
else
    # log the failure
    mv done/$DATE/daily-$SCRIPT.log done/$DATE/failed/
    # move the script to the 'retry' folder for retrial
    cp $f retry/daily-$SCRIPT
fi
for every subshelled execution. How can I do this the right way?
Bash may return 0 for every sub-shell invocation, but if you wait for the result, you get the job's real exit status. Note the trailing ampersand on the python3 line: it backgrounds the command, so $? is checked before the script has even finished; drop the & (or wait for the job) and the check becomes meaningful. If the python3 command relays the exit code of the script, your code will then work. If it still does not catch an error, that is the fault of python3 and you need to add error communication; redirecting stderr might be helpful, but first verify that your code actually fails to work.
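One way to keep the background execution while still checking each script's real exit status is to wait on the job's PID before sorting the file into done/ or retry/. A minimal, self-contained sketch (run_and_sort and the dummy job are illustrative stand-ins, not the question's evaluator):

```shell
#!/bin/bash
# Sketch: recover the exit status of a backgrounded command with wait.
# The dummy job stands in for: python3 evaluator.py "$f" > "done/$DATE/daily-$SCRIPT.log"
run_and_sort() {
    local f=$1
    bash -c "exit ${f##*-}" &      # dummy job: exits with the number after '-'
    local pid=$!
    # wait returns the job's exit status, so the if/else sees the real result
    if wait "$pid"; then
        echo "done: $f"            # here: cp "$f" done/"$DATE"/daily-"$SCRIPT"
    else
        echo "retry: $f"           # here: mv the log to failed/ and cp to retry/
    fi
}

run_and_sort "job-0"   # prints "done: job-0"
run_and_sort "job-1"   # prints "retry: job-1"
```

With one wait per job you lose parallelism inside the loop; collecting all PIDs first and waiting on each afterwards keeps the scripts running concurrently.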

How to make folders for individual files within a directory via bash script?

So I've got a movie collection that's dumped into a single folder (I know, bad practice in retrospect). I want to organize things a bit so I can use Radarr to grab all the appropriate metadata, but I need all the individual files in their own folders. I created the script below to try to automate the process a bit, but I get the following error.
Script
#!/bin/bash
for f in /the/path/to/files/*; do
    [[ -d $f ]] && continue
    mkdir "${f%.*}"
    mv "$f" "${f%.*}"
done
EDIT
So I've now run the script through Shellcheck.net, per the suggestion of Benjamin W. It doesn't throw any errors according to the site, though I still get the same errors when I try running the command.
EDIT 2
No errors now, but the script does nothing when executed.
Assignments are evaluated only once, and not whenever the variable being assigned to is used, which I think is what your script assumes.
You could use a loop like this:
for f in /path/to/all/the/movie/files/*; do
    mkdir "${f%.*}"
    mv "$f" "${f%.*}"
done
This uses parameter expansion instead of cut to get rid of the file extension.
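A slightly more defensive variant of the same idea (nullglob, the no-extension guard, and mkdir -p are additions of mine, not part of the answer; the demo directory is throwaway):

```shell
#!/bin/bash
# Sketch: one folder per file, named after the file minus its extension.
shopt -s nullglob                     # empty directory -> the loop body never runs
organize() {
    local dir=$1 f base
    for f in "$dir"/*; do
        [[ -d $f ]] && continue       # skip anything that is already a directory
        base=${f%.*}                  # /path/Movie.mkv -> /path/Movie
        [[ $base == "$f" ]] && continue   # no extension: nothing to strip, skip
        mkdir -p "$base"              # -p: harmless if the folder already exists
        mv "$f" "$base"/
    done
}

# Demo in a throwaway directory:
demo=$(mktemp -d)
touch "$demo/Movie One (2001).mkv" "$demo/Movie Two.avi"
organize "$demo"
ls "$demo"
```

The no-extension guard matters because mkdir would otherwise try to create a directory with the same name as the file itself.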

loop over files with bash script

I have a js script that converts kml location-history files to csv, and I wrote a bash script to loop through all the files in a directory. The converter works when I execute it from the command line: ./index.js filename.kml > filename.csv
But nothing happens when I execute the bash file that is supposed to loop through all files.
I know it probably is a simple mistake but I can't spot it.
#!/bin/bash
# file: foo.sh
for f in *.kml; do
    test -e "${f%.kml}" && continue
    ./index.js "$f" > "-fcsv"
done
Just delete the "&& continue"; if I'm not wrong, you're skipping the current iteration with the "continue" keyword, and that's why nothing happens.
EDIT
Also, you shouldn't test whether the file exists; the for loop is enough to ensure that "f" will be a valid .kml file. Anyway, if you still want the test, do it like this:
#!/bin/bash
# file: foo.sh
for f in *.kml; do
    if [ -e "$f" ]; then
        ./index.js "$f" > "$f.csv"
    fi
done
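If you'd rather name each output after its input minus the .kml extension, the same parameter expansion works; a small sketch (the echo is a dry-run placeholder for the question's ./index.js call):

```shell
#!/bin/bash
shopt -s nullglob                # no .kml files -> the loop simply doesn't run
for f in *.kml; do
    out=${f%.kml}.csv            # history.kml -> history.csv
    echo "converting $f -> $out"
    # ./index.js "$f" > "$out"   # the actual call from the question
done
```

This avoids double extensions like filename.kml.csv.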

Multiple bash script with different parameters

I have the following bash script, that I launch using the terminal.
dataset_dir='/home/super/datasets/Carpets_identification/data'
dest_dir='/home/super/datasets/Carpets_identification/augmented-data'
# if dest_dir does not exist -> create it
if [ ! -d ${dest_dir} ]; then
    mkdir ${dest_dir}
fi
# for all folders of the dataset
for folder in ${dataset_dir}/*; do
    curr_folder="${folder##*/}"
    echo "Processing $curr_folder category"
    # get all files
    for item in ${folder}/*; do
        # if the class dir in dest_dir does not exist -> create it
        if [ ! -d ${dest_dir}/${curr_folder} ]; then
            mkdir ${dest_dir}/${curr_folder}
        fi
        # for each file
        if [ -f ${item} ]; then
            # echo ${item}
            filename=$(basename "$item")
            extension="${filename##*.}"
            filename=`readlink -e ${item}`
            # get a certain number of patches
            for i in {1..100}
            do
                python cropper.py $(unknown) ${i} ${dest_dir}
            done
        fi
    done
done
Given that it needs at least an hour to process all the files: what happens if I change the '100' to '1000' in the last for loop and launch another instance of the same script? Will the first process count to 1000, or will it continue counting to 100?
The already running process will count to its original value, 100. I think the file is read-only while a bash process executes it, but you can force the change. You also have to take care with the results: both instances write to the same output directory, so you have to expect side effects.
"When you make changes to your script, you make the changes on the disk (hard disk, the permanent storage); when you execute the script, the script is loaded into your memory (RAM)."
(see https://askubuntu.com/questions/484111/can-i-modify-a-bash-script-sh-file-while-it-is-running )
BUT: "You'll notice that the file is being read in at 8KB increments, so Bash and other shells will likely not load a file in its entirety; rather, they read it in blocks."
(see https://unix.stackexchange.com/questions/121013/how-does-linux-deal-with-shell-scripts )
So, in your case, the whole script is loaded into RAM by the interpreter and then executed, meaning that if you change the value and launch the script again, the first instance will still have the "old" value.
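A way to sidestep the question entirely is to pass the patch count as a parameter instead of editing the script, so each launched instance keeps its own value. An illustrative sketch (the parameter handling is mine; the loop body is a placeholder for the cropper call):

```shell
#!/bin/bash
# Sketch: take the patch count as an argument rather than hard-coding it.
count=${1:-100}                  # default 100; pass 1000 for the second run
for i in $(seq 1 "$count"); do   # note: {1..$count} would NOT expand the variable
    : # python cropper.py ... "$i" ... (placeholder for the real work)
done
echo "ran $count iterations"     # with no arguments: "ran 100 iterations"
```

Launching "./script.sh 100" and "./script.sh 1000" then runs two independent instances, no on-disk edit needed (though the shared output directory caveat above still applies).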

read the contents of a directory using shell script

I'm trying to get the contents of a directory using shell script.
My script is:
for entry in `ls`; do
    echo $entry
done
However, my current directory contains many files with whitespace in their names, and in that case this script fails.
What is the correct way to loop over the contents of a directory in shell scripting?
PS: I use bash.
for entry in *
do
    echo "$entry"
done
Don't parse directory contents using ls in a for loop; you will run into whitespace problems. Use shell expansion instead:
for file in *
do
    if [ -f "$file" ]; then
        echo "$file"
    fi
done
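Two details worth adding on top of this (my additions, not from the answers): with no matches a bare glob expands to itself, and dotfiles are skipped by default; shopt fixes both. A runnable sketch in a throwaway directory:

```shell
#!/bin/bash
shopt -s nullglob dotglob   # nullglob: empty dir -> zero iterations, not a literal '*'
                            # dotglob: also match hidden files such as .config
d=$(mktemp -d)              # demo directory
touch "$d/plain file.txt" "$d/.hidden"
for entry in "$d"/*; do
    echo "$entry"           # quoting keeps names with spaces intact
done
```

Both entries, including the hidden one, are printed; without dotglob only the .txt file would appear.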