cron chmod script does not change owner - bash

I need to run a cron job that changes the owner and group of selected files.
I have a script for this:
#!/bin/bash
filez=`ls -la /tmp | grep -v zend | grep -v textfile | awk '$3 == "www-data" {print $8}'`
for ff in $filez; do
/bin/chown -R tm:tm /tmp/$ff
done
If I run it manually, it works perfectly. If I add this to root's cron:
* * * * * /home/scripts/do_script
it does not change the owner/group. The file has permissions "-rwsr-xr-x".
Any idea how this might be solved?

On my system, field $8 of the ls -la output is the hour/year, not the filename. Maybe that's the case for your root user as well. This is why you should never try to parse ls. Even if you fix this issue, half a dozen more will remain to break the system in the future.
Use find instead:
find /tmp ! -name '*zend*' ! -name '*textfile*' -user www-data \
-exec chown -R tm:tm {} \;
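If you only want to touch the top-level entries of /tmp (which is what the original ls-based loop did), GNU find's -mindepth/-maxdepth options can restrict the depth; a possible variant:
find /tmp -mindepth 1 -maxdepth 1 ! -name '*zend*' ! -name '*textfile*' \
-user www-data -exec chown -R tm:tm {} +
The {} + form hands several matches to a single chown call instead of spawning one process per file.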

If you are adding the job to root's cron (/etc/crontab), be aware that the syntax is different from a normal user's crontab: there is an extra user field before the command.
# m h dom mon dow user command
* * 1 * * root /usr/bin/selfdestruct --immediately
Also give the whole path to your command: cron does not provide a rich environment.
Make sure that the commands in your script also use full paths and don't rely on environment variables.
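Applied to the job from the question, a sketch of the /etc/crontab line would be (in a normal per-user crontab the root field must be omitted):
# m h dom mon dow user command
* * * * * root /home/scripts/do_script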

Related

sh script works manually but not via cron

I'm trying to run a sh script from crontab. If I run the script manually it works perfectly, but when I run it from crontab, I get errors.
The script:
#!/bin/bash
sudo tar -zcvf /var/www/nextcloud/data/nextcloud/files/backup.tar.gz /home/beno/stuff/
sudo -u www-data /usr/bin/php /var/www/nextcloud/occ file:scan --all >> /var/www/nextcloud/data/nextcloud/files/backup_log.txt
The script is supposed to make a tar backup of a folder, put it in the Nextcloud folder, and run the command files:scan so that Nextcloud rescans the filesystem and starts synchronization, as I read here:
https://doc.owncloud.org/server/9.0/admin_manual/configuration_server/occ_command.html#file-operations-label
When crontab runs the script, backup.tar.gz is created, then I get the following error:
An unhandled exception has been thrown:
Doctrine\DBAL\DBALException: Failed to connect to the database: An exception occured in driver: SQLSTATE[HY000] [2002] No such file or directory in /var/www/nextcloud/lib/private/DB/Connection.php:60
I'm using Ubuntu 16 and Nextcloud 11. Please help!
Following your request:
Instead of using your own crontab, you have to use the specific user's crontab, by running the crontab command under sudo:
sudo -s
then
crontab -e
Now you could add root's entry for the first command (prefix it with a time spec such as 01 2 * * *):
tar -zcvf /var/www/nextcloud/data/nextcloud/files/backup.tar.gz /home/beno/stuff/
and once finished
exit
sudo -u www-data -s
... and so on...
Or (if you can't run a shell under www-data):
sudo -u www-data crontab - <<<'01 2 * * * /usr/bin/php /var/www...'
Note: this will overwrite www-data's crontab with that single line!
To prevent this, you could run
sudo -u www-data crontab -l
to see what the current crontab contains, then
sudo -u www-data crontab -l |
sed -e '$a 01 2 * * * /usr/bin/php /var/www...'
to append your line, and finally
sudo -u www-data crontab -l |
sed -e '$a 01 2 * * * /usr/bin/php /var/www...' |
sudo -u www-data crontab -
to replace the current crontab with the modified one.
But it could be simpler to
sudo vi /etc/cron.d/mybackups
and add your rules there, with the user name following the time spec:
21 22 * * * root tar -zcvf /var/w...
22 23 * * * www-data /usr/bin/php /var/www...
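For example, a complete /etc/cron.d/mybackups could look like the sketch below; the paths are taken from the question, while the schedule and the PATH line are only illustrative:
# /etc/cron.d/mybackups -- system-style crontab: the user field comes before the command
SHELL=/bin/bash
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
# m h dom mon dow user command
21 22 * * * root tar -zcvf /var/www/nextcloud/data/nextcloud/files/backup.tar.gz /home/beno/stuff/
22 23 * * * www-data /usr/bin/php /var/www/nextcloud/occ files:scan --all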
Further documentation:
see man -P 'less +/SYSTEM' 5 crontab

Crontab script su root / user -c does not execute

I've been troubleshooting for hours and can't find out why my shell script won't execute properly when run from a root crontab.
I'm on a vServer equipped with
Ubuntu 14.04.4 LTS
kernel 3.13.0-042stab113.11
My script is a chmod 711 file:
/usr/local/sbin/bckup_script
and looks like this:
#!/bin/bash
DATE=`date +%Y-%m-%d_%H_%M_%S`
su - -c "chgrp postgres /backup/db"
su - -c "chmod 770 /backup/db"
su - -c "chown user /backup/db"
su - postgres -c "pg_dump db_name > /backup/db/${DATE}db_name.sql && pg_dumpall > /backup/db/${DATE}_all_db.out"
su - -c "rsync -a /home/user/value /backup/"
The crontab is edited using
crontab -e
as the root user.
The cron job executes, as far as I can tell from syslog.
When executed manually as the root user (not via crontab), the script does what it's told to. Also, my PATH is set properly and working.
I have no idea what I am doing wrong.
Solution:
Thanks to Jay jargot I found out what was wrong. To complete the question, here are the outputs you asked for:
crontab -l
#m h dom mon dow command
* * * * * bckup_script
The output of the cron job was
/bin/sh: bckup_script: command not found
which led me to the conclusion that I should use the absolute path to the file, which solved the problem.
My crontab -l now looks as follows, and everything works like a charm!
# m h dom mon dow command
49 20 * * 1-5 /usr/local/sbin/bckup_script
Thanks very much!
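An alternative to hard-coding the absolute path is to extend cron's minimal PATH at the top of the crontab; a sketch (the directory list is an assumption, adjust it to where the script actually lives):
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
# m h dom mon dow command
49 20 * * 1-5 bckup_script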

Cron - gsutil not found

gsutil has been installed here:
/usr/local/bin/gsutil
My crontab looks like this (I'm logged in as root):
*/1 * * * * /home/deploy/cron/job.sh >> /home/deploy/cron/test.log 2>&1
job.sh:
#!/bin/sh
PATH="$PATH":/usr/local/bin/gsutil
now=$(date +"%m_%d_%y_%R");
cp /home/deploy/testfile.txt /tmp/testfile_$now.txt;
gsutil cp /home/deploy/testfile.txt gs://testbucket/testfile_$now.txt;
echo "saved file at $now";
When I look in my log file I see this:
/home/deploy/cron/job.sh: 5: /home/deploy/cron/job.sh: gsutil: not found
saved file at 07_20_15_13:03
Any idea what I'm doing wrong?
I had the same issue; you need to specify the full path when you call gsutil.
In your case:
/usr/local/bin/gsutil/gsutil cp /home/deploy/testfile.txt gs://testbucket/testfile_$now.txt;
Yes, you have to use the full path to the gsutil command in crontab:
/root/gcloud/gsutil cp ...
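The PATH line in job.sh is also worth a second look: entries in PATH have to be directories. If /usr/local/bin/gsutil is the gsutil executable itself, append its parent directory instead; if it is a directory containing the gsutil script (as the answers above assume), the full-path call is the way to go. A sketch of the first case:
#!/bin/sh
# Append the directory that holds the gsutil binary, not the binary itself.
PATH="$PATH:/usr/local/bin"
export PATH
now=$(date +"%m_%d_%y_%R")
cp /home/deploy/testfile.txt "/tmp/testfile_$now.txt"
gsutil cp /home/deploy/testfile.txt "gs://testbucket/testfile_$now.txt"
echo "saved file at $now"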

Searching Linux Files for a string (i.e. root credentials)

As part of our audit policy, I need to search all files on a Linux machine for any file that contains the root credentials.
This command will be run by a non-root account, thus the result will include many "Permission denied" messages.
Any suggestion for the proper syntax to search all files and filter the result to show only useful matches?
I tried:
grep - "root" / | grep "password"
However, as this command is run using non-root accounts, the biggest part of the result is "permission denied".
Thanks
The permission errors are output to stderr, so you can simply redirect that to /dev/null. E.g.:
grep -R "root" . 2> /dev/null
You would go:
grep -lir "root" /
The -l switch outputs only the names of files in which the text occurs (instead of each line containing the text), the -i switch ignores the case, and the -r descends into subdirectories.
EDIT 1:
Running it as non-root will be fine, as long as you're not trying to read other users' files.
EDIT 2:
To show only useful results, filter out the permission errors (grep prints them on stderr, so merge it into stdout first):
grep -lir "root" / 2>&1 | grep -v "Permission denied"
The -v switch inverts the sense of matching, so the lines containing "Permission denied" are dropped.
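Since the question actually pipes into a second grep for "password", both filters can be combined while discarding the permission errors; a sketch assuming GNU grep and xargs:
grep -lir "root" / 2>/dev/null |
xargs -r -d '\n' grep -li "password" 2>/dev/null
The first grep lists the files containing "root"; the second keeps only those that also contain "password".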
However, as this command is run using non-root accounts, the biggest part
of the result is "permission denied"
Use sudo to run this recursive grep:
cd /home
sudo grep -ir 'root' *
You can suppress the warnings with a redirection to /dev/null.
This solution uses find to walk the whole (accessible) filesystem:
find / -readable -exec grep -H root '{}' \; 2>/dev/null | grep password

SHELL script to make ftp connection and get xml files

I need a shell script that will log in to a remote FTP server, get the list of files present in the root folder only, identify the XML files, and download them to the local system.
Login credentials can be specified in the script itself. This script must run only once a day.
Please help me with a UNIX Bash shell script.
Thanks
Script:
#!/bin/bash
SERVER=ftp://myserver
USER=user
PASS=password
EXT=xml
DESTDIR=/destinationdir
# list the root folder and keep only the *.xml entries
listOfFiles=$(curl "$SERVER" --user "$USER:$PASS" 2> /dev/null | awk '{ print $9 }' | grep -E "\.$EXT$")
for file in $listOfFiles
do
curl "$SERVER/$file" --user "$USER:$PASS" -o "$DESTDIR/$file"
done
For a scheduled run every day, use cron:
crontab -e
to edit your current jobs, and add, for example:
0 0 * * * bash /path/to/script
which will run the script every day at midnight.
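A slightly more robust variant of the listing step in the script above is to let curl return a name-only listing with its --list-only option instead of parsing the long listing with awk; a sketch under the same assumptions (server, credentials and destination directory as above):
#!/bin/bash
SERVER=ftp://myserver
USER=user
PASS=password
DESTDIR=/destinationdir
# --list-only asks the FTP server for a name-only listing (NLST)
for file in $(curl -s --list-only "$SERVER/" --user "$USER:$PASS" | grep -E '\.xml$'); do
curl -s "$SERVER/$file" --user "$USER:$PASS" -o "$DESTDIR/$file"
done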
If you can install ncftpget, this is a one-line operation:
ncftpget -u user -p password ftp.remote-host.com /my/local/dir '/*.xml'
