rm -r in shell script (my first *.sh) - bash

I would like to delete some folder on an Ubuntu 8.04 Server.
I would like to start a script to delete this folder.
I start an ssh session to the server.
My script looks like this:
#!/bin/bash
rm -r /var/lib/backuppc/pc/PC1/
rm -r /var/lib/backuppc/pc/PC2/
I run the script like this:
sh scriptname.sh
But I get this message:
rm: cannot remove `/var/lib/backuppc/pc/PC1/\r': No such file or directory
rm: cannot remove `/var/lib/backuppc/pc/PC1/\r': No such file or directory
I'm sorry, but I have never used a shell script on Linux before.
I think it's my fault because I don't know the basics :-(
Can somebody help me? I have to delete ~80 folders... :-(

It looks like there are some "junk" characters after your folder names (namely, \r — a carriage return). To be sure, run cat -A scriptname.sh and check whether you can see some weird characters at the end of the lines. If so, I think the easiest thing for you (since you have few lines) is to manually delete the end of those lines and retype them. (I'm talking about the last two or three characters only.)
Run cat -A scriptname.sh again and see if the characters have disappeared. If so, you should be good to go with your code.
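If the whole file was saved with Windows line endings, stripping the carriage returns in one pass is easier than retyping. A small sketch (the file contents here are just a stand-in for your script):

```shell
# Demo: a script line saved with a Windows CRLF ending, then fixed.
printf 'rm -r /var/lib/backuppc/pc/PC1/\r\n' > scriptname.sh

sed -i 's/\r$//' scriptname.sh   # strip the trailing carriage return from every line
# dos2unix scriptname.sh         # alternative, if dos2unix is installed

cat -A scriptname.sh             # lines now end in plain "$" instead of "^M$"
```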

Related

Bash script not copying files

I have a bash script which is pretty simple (or so I thought - but I don't write them very often):
cp -f /mnt/storage/vhosts/domain1.COM/private/auditbaseline.php /mnt/storage/vhosts/domain1.COM/httpdocs/modules/mod_monitor/tmpl/audit.php
cp -f /mnt/storage/vhosts/domain1.COM/private/auditbaseline.php /mnt/storage/vhosts/domain2.org/httpdocs/modules/mod_monitor/tmpl/audit.php
The script copies the contents of auditbaseline to both domain 1 and domain 2.
For some reason it won't work. When I have only the first line in, it's okay, but when I add the second line the script locks up and can't be accessed.
Any help would be really appreciated.
Did you perhaps create this script on a Windows machine? You should make sure that there are no CRLF line breaks in the file. Try using dos2unix (http://www.linuxcommand.org/man_pages/dos2unix1.html) to convert the file in that case.
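If you're not sure whether the file actually has CRLF endings, file(1) will usually tell you outright (the filename below is just an example):

```shell
# Demo: file(1) flags DOS line endings explicitly.
printf 'cp -f /mnt/a /mnt/b\r\n' > copy.sh
file copy.sh   # typically reports "ASCII text, with CRLF line terminators"
```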

Find and replace with CGI script

I'm trying to use a CGI script to run a find and replace command on a specific text file.
I currently have a CGI script (foo.sh) which then executes a non-CGI shell script (bar.sh). In the non-CGI shell script (bar.sh), I'm able to perform a number of simple bash commands such as wget, mkdir and cd, and I'm also able to execute a .js file with the standard dot-slash bash syntax.
However, I can't get any find and replace commands to work when executed with CGI. I've tried sed, awk and perl, all of which work perfectly when used either directly on the command line or if I execute bar.sh from the command line. But once I try to execute the CGI script from the browser, the find and replace commands no longer work.
The syntax I've tried is below. Any suggestions appreciated.
sed -i 's/foo/bar/g' text.txt
{ rm text.txt && awk '{gsub("foo", "bar", $0); print}' > text.txt; } < text.txt
perl -p -i -e 's/foo/bar/g' text.txt
Just a few thoughts:
If run by the web server, the script will probably be executed as a different user, which could lead to permission-related problems.
Maybe the awk/sed etc. commands are not in the PATH used by the web server process (try an absolute path here as well).
Is there anything in the web server's error log?
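One way to check the first two points: a throwaway diagnostic CGI (the name is made up) that just prints the user and PATH the server hands it:

```shell
#!/bin/sh
# Hypothetical diagnostic CGI: prints the effective user and the PATH
# that the web server process passes to the script.
echo "Content-Type: text/plain"
echo
id                   # e.g. shows uid=33(www-data) under a default Apache setup
echo "PATH=$PATH"
```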
Thanks for the input Michael, and sorry for the late response. You were correct in that it was a permissions issue caused by the fact that the CGI script runs not as me but as the "www-data" user.
The problem occurred because that user didn't have write privileges to execute a sed find and replace command on the target folder (in this case a directory within /usr/share/).
I changed permissions on the target folder to full read/write everyone, and now the script runs successfully.
Thanks again.

nemo script for torrents

Hi, I am new to scripting, and I do mean a complete noobie. I am working on a script to automatically make a torrent with nemo scripts.
#!/bin/bash
DIR="$NEMO_SCRIPT_SELECTED_FILE_PATHS"
BNAME=$(basename "$DIR")
TFILE="$BNAME.torrent"
TTRACKER="http://tracker.com/announce.php"
USER="USERNAME"
transmission-create -o "/home/$USER/Desktop/$TFILE" -t $TTRACKER "$DIR"
It does not work.
However if I replace
DIR="$NEMO_SCRIPT_SELECTED_FILE_PATHS"
with
DIR="absolutepath"
then it works like a charm. It creates the torrent on the desktop with the tracker I want. I think this would come in handy for many people. I don't really know what else to put. If you have questions, please ask. Again, complete noobie.
The $NEMO_SCRIPT_SELECTED_FILE_PATHS is the same as $NAUTILUS_SCRIPT_SELECTED_FILE_PATHS. It's populated by nemo/nautilus when you run the script and contains a newline-delimited (I think) list of the selected files/folders. Assuming you are selecting only one file or folder, I don't really see why it wouldn't work - unless the newline character is in there and causing problems. If that's the case, you may be able to strip it with sed. Not running nemo or nautilus, so I can't test it.
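If the trailing newline is indeed the culprit (and only one file is selected), note that plain command substitution already strips trailing newlines, so something like this minimal sketch can sanitize the value:

```shell
# Demo: command substitution strips trailing newlines from a value.
VAR='/home/user/file
'                                  # value ending in a newline, like the nemo variable
DIR=$(printf '%s' "$VAR")          # $(...) removes any trailing newlines
[ "$DIR" = "/home/user/file" ] && echo stripped
```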
I finally found the solution to your problem and mine: https://askubuntu.com/questions/243105/nautilus-scripts-nautilus-script-selected-file-paths-have-problems-with-spac
The variable $NEMO_SCRIPT_SELECTED_FILE_PATHS/$NAUTILUS_SCRIPT_SELECTED_FILE_PATHS is a list of paths/filenames separated by a newline. This messes up anything that assumes it's just one filename, even if it is.
#!/bin/bash
echo "$NEMO_SCRIPT_SELECTED_FILE_PATHS" | while read -r DIR; do
    [ -e "$DIR" ] || continue   # skip the blank pass produced by the trailing newline
    BNAME=$(basename "$DIR")
    TFILE="$BNAME.torrent"
    TTRACKER="http://tracker.com/announce.php"
    USER="USERNAME"
    transmission-create -o "/home/$USER/Desktop/$TFILE" -t "$TTRACKER" "$DIR"
done
Notice that the loop would otherwise do an extra pass for the trailing newline; the existence check above filters that out.

Remove whitespaces in shell .SH file

Due to processes out of my control I need run multiple SH files which contains lengthy CURL commands. Problem is that whichever process created these commands seems to have included one line of whitespace at the very end. If I call it as is - it fails. If I physically open the file and hit backspace on the first full empty line and save the file - it works perfectly.
Any way to put some kind of command into the SH file so that it removes any unnecessary stuff?
More info would be helpful, but the following might work:
If you need to put something into each of the files that contain the curl commands as you mention, you could try putting exit as the last line of the curl script (this also depends on how you're calling the 'curl files'):
exit
If you can run a separate script against the files that have a blank line, perhaps sed the blank lines away?
sed -i '/^[[:space:]]*$/d' "$fileWithLineOfSpaces"
edit:
Or (after thinking about it), perhaps simply delete the last line of the file....
sed -i '$d' "$file"
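For what it's worth, a quick check of the whitespace-only-line approach on a throwaway file:

```shell
# Demo: a script whose last line is a single space, then cleaned up.
printf 'curl http://example.com\n \n' > demo.sh
sed -i '/^[[:space:]]*$/d' demo.sh   # drop whitespace-only lines
wc -l < demo.sh                      # 1
```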

Running bash shell in Maemo

I have attempted to run the following bash script on my internet tablet (Nokia N810 running on Maemo Linux). However, it doesn't seem that it is running, and I have no clue of what's wrong with this script (it runs on my Ubuntu system if I change the directories). It would be great to receive some feedback on this or similar experiences of this issue. Thanks.
WORKING="/home/user/.gpe"
SVNPATH="/media/mmc1/gpe/"
cp calendar categories contacts todo $WORKING
What actually happens when you run your script? It's helpful if you include details of error messages or behavior that differs from what's expected and in what way.
If $WORKING contains the name of a directory, hidden or not, then the cp should copy those four files into it. Then ls -l /home/user/.gpe should show them plus whatever else is in there, regardless of whether it's "hidden".
By the way, the initial dot in a file or directory name doesn't really "hide" the entry, it's just that ls and echo * and similar commands don't show them, while these do:
ls -la
ls -d .*
ls -d {.*,*}
echo .*
echo {.*,*}
The cp command can copy multiple sources to a single destination, if the destination is a directory.
Does the directory /home/user/.gpe exist?
Bear in mind that the leading dot in the name can make it hidden unless you use ls -a
I tried your commands in cygwin:
But I used .gpe instead of /home/user/.gpe
I did a touch calendar categories contacts todo to create the files.
It worked fine.
If that's the entirety of your script, it's missing two, possibly three, things:
A shebang line, such as #!/bin/sh at the start
Use of $SVNPATH. You probably want to cd "$SVNPATH" before the cp command. Your script should not assume the current working directory is correct.
Possibly execute permission on the script: chmod a+x script
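Putting those points together, a corrected sketch; to keep it runnable anywhere, temporary directories stand in here for /media/mmc1/gpe/ and /home/user/.gpe:

```shell
#!/bin/sh
# Sketch of the fixed script; mktemp dirs stand in for the real paths.
SVNPATH=$(mktemp -d)    # stands in for /media/mmc1/gpe/
WORKING=$(mktemp -d)    # stands in for /home/user/.gpe
touch "$SVNPATH/calendar" "$SVNPATH/categories" "$SVNPATH/contacts" "$SVNPATH/todo"

cd "$SVNPATH" || exit 1                          # use SVNPATH instead of assuming the cwd
cp calendar categories contacts todo "$WORKING"
ls -a "$WORKING"                                 # the four files are there, hidden dir or not
```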
Do you already have the /home/user/.gpe directory present? And also, try adding a -R parameter so that the directories are copied recursively.
