Bash script to run chmod and skip files with badstr - bash

Apologies for the question, but I am a beginner who is just starting to code, and I am still learning. I took a community course and struggled with the homework. I have done 2 of the 3 assignments and I am stuck on number 3.
The assignment is:
"Write a Bash script to run chmod 644 on files /folder/file1, /folder/file2, up to /folder/file28, and skip all files containing the string badstr."
I have no clue how to do it; I have been searching and reading all morning and still haven't figured it out. Can someone please help me?

Go step by step (the three steps are put together in the sketch after this list):
Create a loop over the files (which means that you have to generate that list of files) -> for
Check if every file being processed contains the forbidden string -> grep
Perform the chmod if the file passes the test -> if, test, chmod
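Putting the three steps together, a minimal sketch could look like this. It assumes "containing the string" refers to the file's contents, which is what grep checks; if the assignment means file names containing badstr, test the name with [[ $f == *badstr* ]] instead:
#!/bin/bash
# Run chmod 644 on /folder/file1 .. /folder/file28,
# skipping any file whose contents contain "badstr".
for n in {1..28}; do
    f="/folder/file$n"
    if grep -q 'badstr' "$f"; then
        echo "Skipping $f (contains badstr)"
    else
        chmod 644 "$f"
    fi
done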

So based on your answer I got this now. Hopefully I am on the right track:
#!/bin/bash
FILES=/path/to/*
for f in $FILES
do
echo "Processing $f file..."
cat $f
while read -r str
do
echo "$str"
grep "$str" /path/to/other/files
done < inputfile
chmod g+w `cat inputfile`
done

Related

How to make folders for individual files within a directory via bash script?

So I've got a movie collection that's dumped into a single folder (I know, bad practice in retrospect.) I want to organize things a bit so I can use Radarr to grab all the appropriate metadata, but I need all the individual files in their own folders. I created the script below to try and automate the process a bit, but I get the following error.
Script
#!/bin/bash
for f in /the/path/to/files/*; do
    [[ -d $f ]] && continue
    mkdir "${f%.*}"
    mv "$f" "${f%.*}"
done
EDIT
So I've now run the script through Shellcheck.net per the suggestion of Benjamin W. It doesn't throw any errors according to the site, though I still get the same errors when I try running the command.
EDIT 2
No errors now, but the script does nothing when executed.
Assignments are evaluated only once, and not whenever the variable being assigned to is used, which I think is what your script assumes.
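To illustrate with a made-up snippet (not from the question): an assignment captures its value once, at the moment it runs, and is never re-evaluated later:
var=$(date +%s)   # the command substitution runs once, right here
sleep 2
echo "$var"       # prints the same timestamp; the assignment is not recomputed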
You could use a loop like this:
for f in /path/to/all/the/movie/files/*; do
    mkdir "${f%.*}"
    mv "$f" "${f%.*}"
done
This uses parameter expansion instead of cut to get rid of the file extension.
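For reference, ${f%.*} deletes the shortest suffix matching .*, i.e. the final dot and everything after it (hypothetical file name):
$ f="/movies/Some Movie (2019).mkv"
$ echo "${f%.*}"
/movies/Some Movie (2019)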

Trying to parse a file, check its size, then send report.. this fails. :(

Noob question, I'm sure, but this noob needs help!
I have a script that downloads a ham radio file, does some stuff with it, then sends me an email with its final work. The problem is, sometimes it comes up short. I'm not sure where it is dropping data, and frankly I'm not sure I care (though I probably should). My solution was to run the script, check the size of the output file, and if it's over 96k, email it; if not, re-run the script.
It fails on the 'until' loop even if the file is above the correct size.
While I'm sure it can be done in other languages, Bash is what I'm currently familiar enough with to try to make this work, so a bash solution is what I'm looking for. I'm also fine with any streamlining, though the script is not intensive to run as it is!
Here's what I have:
dt=`date '+%D %T.%6N'`
#
wget -O ~/file1 "https://www.radioid.net/static/users.csv"
egrep 'Washington,United States|Oregon,United States|Idaho,United States|Montana,United States|British Columbia' ~/file1 > ~/PNW1
awk 'BEGIN {FS=OFS=","} {sub(/ .*/, "", $3)} {gsub("Washington", "WA",$5}{gsub("Idaho", "ID",$5)} {gsub("Montana", "MT",$5)} {gsub("Oregon", "OR",$5)} {gsub("Brit$
sed "s/'/ /g" ~/PNW_Contact.txt > ~/PNW_Contacts.txt
rm ~/PNW_Contact.txt
rm ~/file1
rm ~/PNW1
sudo cp ~/PNW_Contacts.txt /var/www/html/PNW_Contacts.txt
until [[ $(find /home/kc7aad/PNW_Contacts.txt -type f -size +96000c 2>/dev/null) ]]; do
echo "$dt" - Failed >> ~/ids.log
sleep 10
done
echo "$dt" - Success >> ~/ids.log
mail -s "PNW DMR Contacts Update" kc7aad#gmail.com -A ~/PNW_Contacts.txt < /home/kc7aad/PNW_Message.txt
If I run this script manually, it succeeds. If I let cron try to run it, it fails.
I think that's all the detail that is needed. Please let me know if there are any questions!
Thanks.
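A minimal sketch of the retry idea, for comparison: it recomputes the timestamp on every log line (in the script above, dt is set once at the top and never changes), checks the size with GNU stat instead of find, and actually re-runs the download inside the loop. fetch_and_build is a hypothetical function standing in for the wget/egrep/awk/sed steps; note also that cron runs with a minimal PATH, so commands like wget and mail may need absolute paths there.
#!/bin/bash
# Hypothetical sketch: retry until the output file exceeds 96 kB.
outfile=/home/kc7aad/PNW_Contacts.txt

fetch_and_build() {
    # stand-in for the wget/egrep/awk/sed pipeline from the question
    :
}

fetch_and_build
until [[ -f "$outfile" && $(stat -c %s "$outfile") -gt 96000 ]]; do
    echo "$(date '+%D %T.%6N') - Failed, retrying" >> ~/ids.log
    sleep 10
    fetch_and_build
done
echo "$(date '+%D %T.%6N') - Success" >> ~/ids.log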

Bash script sudo and variables [duplicate]

This question already has an answer here:
Bash foreach loop works differently when executed from .sh file [duplicate]
(1 answer)
Closed 4 years ago.
Totally new to Bash here; actually, I've avoided it like the plague for 10 years.
Today there is no way around it.
After a few hours of beating my head against the keyboard, I discovered that under sudo, the brace expansion in my commands does not get expanded.
So I have something like
somescript.sh
for i in {1..5}
do
    filename=somefilenumber"$i".txt
    echo "$filename"
done
on the command line now if I run it
user@deb:~$ ./somescript.sh
I get the expected
somefilenumber1.txt
somefilenumber2.txt
somefilenumber3.txt
somefilenumber4.txt
somefilenumber5.txt
but if I run with sudo, like
user@deb:~$ sudo ./somescript.sh
I'll get this
somefilenumber{1..5}.txt
This is a huge problem because I'm trying to cp files and rm files in a loop with the variable.
So here is the code with cp and rm
for i in {1..10}
do
    filename=somefilenumber"$i".txt
    echo "$filename"
    cp "$filename" "someotherfilename.txt"
    rm "$filename"
done
I end up getting
cp: cannot stat 'somefilenumber{1..5}.txt': No such file or directory
rm: cannot remove 'somefilenumber{1..5}.txt': No such file or directory
I need to run sudo also because of other programs that require it.
Is there any way around this?
Even if nothing else required sudo and I didn't use it, the rm command would prompt me for every file, asking whether I'm sure I want to remove it. The whole point is not to be tied to the computer while it runs through hundreds of files.
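A side note on those prompts: rm asks for confirmation when a file is write-protected and stdin is a terminal; the -f flag suppresses the prompt (and also silences errors for files that do not exist):
rm -f "$filename"    # never prompts, ignores missing files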
You could try to replace {1..10} with seq 1 10:
for i in $(seq 1 10)
do
    filename=somefilenumber"$i".txt
    echo "$filename"
    cp "$filename" "someotherfilename.txt"
    rm "$filename"
done
Your problem sounds like the environment is wrong for root, perhaps a different shell running the script. Do you start the script with:
#!/bin/bash
?
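One way this symptom arises: brace expansion is a bash feature, not a POSIX one, so if the script ends up running under a plainer /bin/sh (dash on Debian, for example), the braces are left literal. A quick check on such a system:
$ bash -c 'echo file{1..3}.txt'
file1.txt file2.txt file3.txt
$ sh -c 'echo file{1..3}.txt'
file{1..3}.txt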

Looping through all .sh files in a directory to find specific text in each one

I have a bunch of .sh files in a directory. I need to find out how many stored procedures each one of them calls. What would be the best way to do that?
I'm REALLY new to bash scripting, so this is all very new to me. From what I found online I hacked up a starting point (I think), but I have no idea how I would open each file, find "something.sql" in it, and then output the number of times it was found in each file.
Here's what I have:
#!/bin/sh
for i in 'ls *.sh'
echo -e "\n **** START****"
do
echo -e " \n Filename: $i"
done
echo -e "\n **** END ****"
done
Thanks for any help!
Try this:
grep -nc sql *.sh
See how that moves you. You can add -i in case the files are named like file.SQL too. Or, if they all have a .sql extension:
grep -nc '\.sql' *.sh
For the comment you added, try this:
for i in *.sh
do
    grep -Hc '\.sql' "$i"
    grep '\.sql' "$i"
done
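One caveat: grep -c counts matching lines, not matches, so two procedure calls on one line are counted once. If you want total occurrences per file, one sketch is grep -o piped to wc -l:
for i in *.sh
do
    # -o prints each match on its own line; wc -l then counts them
    printf '%s: %s\n' "$i" "$(grep -o '\.sql' "$i" | wc -l)"
done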

Automate Simple Terminal Commands

So here's an easy one for all you talented people out there. :) What I would like to do is create an Automator application that performs the following simple terminal commands when you drop a .command file on it whose file name starts with "abc-123". It's important that it throws an error and does not try to run the script if these criteria are not met.
chmod 777 file.command
./file.command
That's it! I don't have much experience with this, and having tried to Google my way to an answer for 2 hours now, I thought I'd just ask, since it's probably quite simple... me hopes. :)
Like this, in a bash shell:
for f in "$#"
do
name=${f##*/};
if [[ "$name" = abc-123*.command ]];then
chmod 777 "$f"
"$f"
else
exit 1;
fi
done
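In Automator this would go in a Run Shell Script action with "Pass input: as arguments", which is what makes the dropped files show up in "$@". Testing the same logic from a terminal might look like this (handler.sh and the file names are hypothetical):
$ ./handler.sh /tmp/abc-123-demo.command
$ ./handler.sh /tmp/other.command; echo "exit: $?"
exit: 1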
