.SH/.COMMAND Bash File Mac OSX Not working - macos

I keep getting this error for my script below when I try to run it:
The file “test.command” could not be executed because you do not have appropriate access privileges.
I tried changing the file from read-only to read/write, but that didn't work.
for domain in $(pwgen -1A0B 6 10000); do
    echo -ne "$domain.com "
    if [ -z "$(whois $domain.com | grep -o 'No match for')" ]; then
        echo -ne "Not "
    fi
    echo "Available for register"
done >> domains.txt
Any ideas?

You need to give it execute permission as well before you can run it:
chmod u+x file
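A minimal illustration of the fix (the script body here is just an example, run in a scratch directory):

```shell
# Create a sample script, add the owner execute bit, and run it.
cd "$(mktemp -d)"                                # work in a throwaway directory
printf '#!/bin/bash\necho hello\n' > test.command
chmod u+x test.command                           # add execute permission for the owner
ls -l test.command                               # the mode should now start with -rwx
./test.command                                   # prints: hello
```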

Related

How to know if a file exists in a Samba share

I wrote a shell script as follows, to check for a file in a Samba share:
date_gen=$(date --date="3 days ago" +"%-Y%m%d")
fileName=${date_gen}"_Combined Reg Report.xlsx"
if [ ! -f smb://nfs/carboard/"${fileName}" -U ]
then
echo "File does not exist in Bash"
else
echo ${fileName}
fi
exit 1
Can someone please tell me what is wrong with this? I am always getting "File does not exist in Bash", even though the file is there in the folder.
Thanks,
Art
You should check whether the share is mounted, and then check for the file:
if mount | grep -q /nfs/cardboard
then
if [[ ! -f /nfs/cardboard/"${fileName}" ]]
then
...
fi
else
echo "not mounted"
fi
Checking the existence of a file with smbclient:
filename="$(date --date='3 days ago' '+%Y%m%d')_Combined Reg Report.xlsx"
if smbclient -A smbauth.conf '//nfs/carboard' -c "ls \"$filename\"" > /dev/null 2>&1
then
echo the file exists
else
echo the file is not there
fi
where smbauth.conf is a file storing your credentials in the following format:
username=myuser
password=mypassword
domain=MYDOMAIN
I don't know exactly how escaping works with smbclient (some characters, like ", seem impossible to escape), but in your case double-quoting is enough.

shell script runs manually, but fails on crontab

I have written a shell script to copy a file from a remote server and convert it to another format. After conversion, I edit the file using the sed command. The script runs successfully when executed manually, but fails when executed through crontab.
Crontab entry is:
*/1 * * * * /script/testshell.sh
Below is the shell script code:
#!/bin/bash
file="/script/test_data.csv"
if [ -f "$file" ]
then
    echo " file is present in the local machine "
else
    echo " file is not present in the local machine "
    echo " checking the file is present in the remote server "
    ssh user@IP 'if [ -f /$path ]; then echo File found ; else echo File not found; fi'
fi
if [ -f "$file" ]
then
    rm -f test_data.csv
fi
scp -i /server.pem user@IP:/$path
file="/script/test_data.csv"
if [ -f "$file" ]
then
    echo "$file found."
else
    echo "$file not found."
fi
if [ -f "$file" ]
then
    echo " converting csv to json format ....."
fi
./csvjson.sh input.csv output.json
sed -e '/^$/d' -e 's/.*/&,/; 1 i\[' ./output.json | sed ' $ a \]' > hello.json
When I run the script manually, it works perfectly, but it does not work from crontab.
What exactly fails from cron? What is the output, and are there any errors from cron?
A few things to try:
cron doesn't run your profile by default, so if your script needs anything set in it, include it in the crontab command, e.g.
". ./.bash_profile; /script/testshell.sh"
I can't see $path set anywhere, although it's used to test for a file's existence. Are you setting it manually somewhere, so that it won't be set when run from cron?
Some of your scripts and files are specified as being in the current dir (./). From cron, that will be your home folder. Is that right, or do you need to change directory in the script or use full paths for them?
Hope that helps
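The profile-sourcing advice could look like this as a crontab entry (the log file path is an assumption, added so errors from cron get captured somewhere):

```shell
# crontab -e
# */1 * * * * . "$HOME/.bash_profile"; /script/testshell.sh >> /tmp/testshell.log 2>&1
```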

Create an auto cPanel backup script

I want to create a backup script on a cPanel/Linux server, and I want it to check for all possible errors.
#!/bin/bash
cpanel_username=$1
domain=$2
if [ -z "$cpanel_username" ]
then
echo "no username given"
elif [ -z "$domain" ]
then
echo "no domain given"
else
/scripts/pkgacct $cpanel_username /backup/
mv /backup/cpmove-$cpanel_username.tar.gz /backup/$cpanel_username.tar.gz
FILE="/backup/$cpanel_username.tar.gz"
FTPFILE="$cpanel_username.tar.gz"
if [ -f "$FILE" ]
then
### FTP login credentials below ###
FTPU="ftpuser"
FTPP="ftpass"
FTPS="host"
lftp -u $FTPU,$FTPP -e "cd /hosting_backups;put $FILE;quit" $FTPS
######HERE I WANT TO CHECK IF THE FILE IS UPLOADED
rm /backup/$cpanel_username.tar.gz
else
echo "couldn't take backup of $domain"
fi
fi
Also, I would like to know whether I can check if there is enough space to take the backup. I want this check done before the backup command.
I have tried for many hours, but I couldn't make it work. Any help is appreciated.
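A sketch of both checks, assuming the FTP variables from the script above. The free-space threshold is an assumption, and the demo checks /tmp so it can run anywhere; on the server you would point it at /backup and raise the threshold:

```shell
#!/bin/bash
# Free-space check before the backup; directory and threshold are assumptions.
backup_dir="/tmp"                  # use /backup on the real server
required_kb=$((1 * 1024))          # assumed minimum free space (1 MB for this demo)

avail_kb=$(df -Pk "$backup_dir" | awk 'NR==2 {print $4}')
if [ "$avail_kb" -lt "$required_kb" ]; then
    echo "not enough space in $backup_dir"
    exit 1
fi
echo "space ok: ${avail_kb} KB free"

# Upload check: lftp exits non-zero when a command passed with -e fails,
# so the script can delete the local copy only after a successful put
# (shown commented out because it needs the real credentials):
# if lftp -u "$FTPU,$FTPP" -e "cd /hosting_backups; put $FILE; quit" "$FTPS"; then
#     rm "/backup/$cpanel_username.tar.gz"
# else
#     echo "couldn't upload backup of $domain"
# fi
```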

How do I check to see if a file exists on a remote server using shell

I have done a lot of searching and I can't seem to find out how to do this using a shell script. Basically, I am copying files down from remote servers, and I want to do something else if a file doesn't exist. I have an array below; I tried to reference it directly, but it still returns false.
I am brand new at this, so please be kind :)
declare -a array1=('user1@user1.user.com');
for i in "${array1[@]}"
do
if [ -f "$i:/home/user/directory/file" ];
then
do stuff
else
Do other stuff
fi
done
Try this:
ssh -q $HOST '[[ -f /home/user/directory/file ]]' && echo "File exists" || echo "File does not exist"
or like this:
if ssh $HOST stat $FILE_PATH \> /dev/null 2\>\&1
then
echo "File exists"
else
echo "File not exist"
fi
Assuming you are using scp and ssh for remote connections something like this should do what you want.
declare -a array1=('user1@user1.user.com');
for i in "${array1[@]}"; do
if ssh -q "$i" "test -f /home/user/directory/file"; then
scp "$i:/home/user/directory/file" /local/path
else
echo 'Could not access remote file.'
fi
done
Alternatively, if you don't necessarily need to care about the difference between the remote file not existing and other possible scp errors then the following would work.
declare -a array1=('user1@user1.user.com');
for i in "${array1[@]}"; do
if ! scp "$i:/home/user/directory/file" /local/path; then
echo 'Remote file did not exist.'
fi
done

how to find a file exists in particular dir through SSH

How do I find out whether a file exists in a particular dir over SSH?
For example:
host1 has the dir /home/tree/TEST
From host2: ssh to host1 and find out whether the TEST file exists or not, using bash.
ssh will return the exit code of the command you ask it to execute:
if ssh host1 stat /home/tree/TEST \> /dev/null 2\>\&1
then
echo File exists
else
echo Not found
fi
You'll need to have key authentication set up, of course, so you avoid the password prompt.
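Setting that up is a one-time step; a sketch follows, where "user@host1" is a placeholder and the network commands are commented out because they need a real remote host:

```shell
# Generate a key pair if one doesn't exist yet (no passphrase here, for
# unattended use; consider ssh-agent instead on shared machines).
mkdir -p "$HOME/.ssh"
keyfile="$HOME/.ssh/id_ed25519"
[ -f "$keyfile" ] || ssh-keygen -t ed25519 -N "" -f "$keyfile" -q
# Install the public key on the remote host (interactive, run once):
# ssh-copy-id user@host1
# Afterwards this should connect without a password prompt:
# ssh user@host1 true
```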
This is what I ended up doing after reading and trying out the stuff here:
FileExists=`ssh host "test -e /home/tree/TEST && echo 1 || echo 0"`
if [ "${FileExists}" = 0 ]
then
    ... # do something because the file doesn't exist
fi
More info about test: http://linux.die.net/man/1/test
An extension to Erik's accepted answer.
Here is my bash script for waiting on an external process to upload a file. This will block current script execution indefinitely until the file exists.
Requires key-based SSH access although this could be easily modified to a curl version for checks over HTTP.
This is useful for uploads via external systems that use temporary file names:
rsync
transmission (torrent)
Script below:
#!/bin/bash
set -vx
#AUTH="user@server"
AUTH="${1}"
#FILE="/tmp/test.txt"
FILE="${2}"
while sleep 60; do
    if ssh ${AUTH} stat "${FILE}" > /dev/null 2>&1; then
        echo "File found"
        exit 0
    fi
done
No need for echo. Can't get much simpler than this :)
ssh host "test -e /path/to/file"
if [ $? -eq 0 ]; then
# your file exists
fi
