how to correctly call unix command from other dirs - bash

I have a relatively simple question that I can't figure out, and I can't work out the right search query to find the info I need on Google, so I thought I would ask the collective.
In short:
cd /var/www/config
./deploy.sh - works!
but
./var/www/config/deploy.sh
doesn't :(
deploy.sh calls another bash script, and it seems the called script can't find the libs it needs because it looks for them relative to the directory it was called from, which in this case would be / instead of /var/www as it expects.
I'm trying to call this from a Capistrano script, so I need a way to call it without having to cd first. Does anyone know a simple way to achieve this?
EDIT: Thanks for your quick suggestions; it's still playing up. deploy.sh calls another bash file called sake. I have uploaded a copy here: http://tinypaste.com/25fc8
Cheers guys!

Don't put a . (period) in front of your command. Just use:
$ /var/www/config/deploy.sh

You can also wrap it in a subshell so you can return to your existing directory afterwards; some programs like to pick up the working directory ($PWD) to work in, so it might be worth setting it explicitly:
( cd /var/www/config/ && ./deploy.sh )

If you want to remain where you are after the command is done:
(cd /var/www/config; ./deploy.sh)
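Since the underlying problem is that the script called by deploy.sh looks for its libs relative to the caller's working directory, another option is to make deploy.sh itself location-independent. A minimal sketch, assuming deploy.sh (not shown here) only needs to run from its own directory; the ./sake call is just a placeholder for whatever deploy.sh actually invokes:
#!/bin/bash
# Resolve the directory this script lives in, regardless of where it was called from.
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
# Work from the script's own directory so relative paths to libs resolve as expected.
cd "$SCRIPT_DIR" || exit 1
# Placeholder: the question mentions deploy.sh calls a script named "sake".
./sake
With that in place, /var/www/config/deploy.sh can be invoked from anywhere, including a Capistrano task, without a cd first.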

Related

How to run shell script within shell script with fixed arguments

I have a simple script that loops over another script and passes the parameters and arguments directly to it; the loop comes into play because the script is supposed to run over several files. The way I wrote it, it's currently not working, so how should I pass these parameters? I'm fairly new to bash, so any help will be appreciated a lot!
#!/bin/bash
SCRIPT_PATH="xx.sh"
for x in {001..031}; do
"$SCRIPT_PATH" /data/raw/"$x"_AE data/processed/"$x"_AE 5 --info
done
There may be a path issue in your script: the first path starts with '/' (/data/raw/...), so it is absolute, but that is NOT the case for the second one (data/processed/...); is that intentional?
Also ensure there is no directory/path issue (where is xx.sh located?), and that the user who launches the script has access permissions on the /data directories and sub-directories.
Let me know if that fixes your issue. A cleaned-up sketch of the loop follows below.
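A minimal cleaned-up sketch of the loop, assuming both paths are meant to be absolute and that xx.sh sits next to the wrapper script (adjust SCRIPT_PATH to the real location):
#!/bin/bash
# Placeholder path; point this at wherever xx.sh really lives.
SCRIPT_PATH="./xx.sh"
# Fail early if the target script is missing or not executable.
[ -x "$SCRIPT_PATH" ] || { echo "cannot execute $SCRIPT_PATH" >&2; exit 1; }
for x in {001..031}; do
    "$SCRIPT_PATH" "/data/raw/${x}_AE" "/data/processed/${x}_AE" 5 --info
done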

Change directory to similar numbered folder. (shell scripting)

I need a solution for changing into the latest directory among directories that share the same naming convention. For example, the folder names are as follows:
parent_directory/
    folder100.000.200
    folder100.000.201
    folder100.000.202
    folder100.000.203
Using the Linux shell, I'm trying to navigate to parent_directory and then into the latest folder (folder100.000.203) without having to change the script every single day (folders are added regularly).
The old solution was:
FOLDER=folder100.000.$1
cd parent_folder/${FOLDER}
But this no longer works. I understand that $1 is a positional parameter, but I fail to understand why it worked previously and not now. Any help with this would be greatly appreciated.
EDIT:
So I figured out the old script was passing a value into the script, which is where $1 was coming from. When I ran the script, I forgot I had to pass in the version number, but is there a way around me having to specify this?
./script.sh 203
This is the current working solution, but I still want to automate finding the number and passing it in.
If the folder you want is the most recently created (regardless of its name) then you can use ls -c to sort by ctime descending. So, perhaps something like:
cd $(ls -c | head -1)
I think I have found a solution, but I wanted more eyes on it to verify.
cd $(ls -dr */ | grep 'folder100.000' | head -1)
This solution lists directories (-d */) in reverse order (-r), keeps those with 'folder100.000' in their name (grep 'folder100.000'), and takes the first entry (head -1), which is the highest build number. So this code returns:
folder100.000.203
Since the only thing that changes is the final 3 numbers, is this a proper solution or should I find another way of finding the latest version folder?
Side note: These commands will be running in ftp
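If these commands end up running in a normal shell rather than inside an ftp client, a slightly more robust sketch (assuming GNU sort, whose -V flag does version sorting) would be:
# Change into the parent directory, then into whichever matching folder
# sorts highest when compared as a version number.
cd parent_directory || exit 1
latest=$(ls -d folder100.000.* | sort -V | tail -n 1)
cd "$latest" || exit 1
Version sorting keeps working even if the build number gains an extra digit (for example .999 to .1000), where a plain reverse lexical sort would pick the wrong folder.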

BASH variable in string

This script1 is not working as intended. I will explain below:
#!/bin/bash
### SETUP ###
USER="MYUSER"
DIRS="MYDIR"
BUCKET="mybucket"
DOACCESS="ACCESSKEY"
DOSECRET="SECRETKEY"
NAME="FILENAME"
EXPIRE="7 days"
NOW=$(date +"%d-%m-%Y")
DAY=$(date +"%a")
# ...lots of code that is working great...
### CLEAN OLD FILE FROM BUCKET ###
### This is the line that I am having issues with.
sh ./s3-del-old.sh ''"${BUCKET}"'/backup' '"${EXPIRE}"'
END
The script2 got copied from here.
What I had prior to following some instructions on Bash: Variable in single quote linked below was:
sh ./s3-del-old.sh "$BUCKET/backup" "$EXPIRE"
This did not work and was ignored when running the bash script.
I attempted to leave out the stuff that doesn't matter to the question below, although I believe I may have confused things; for this I apologize. Very simply, I have a line in script1 that calls another script, script2, and I use variables to meet the needs of the script. It is not working, and I cannot find an easy-to-understand solution online, hence the question.
----END OF UPDATE----
I have looked at some of the answered questions, but I am not finding a solution that fits my needs or one that I can understand fully to use for my needs.
I have tried following this, although I need a little more help.
This is what I am trying to do:
I have a backup script that uses DreamHost's DreamObjects to store my backups. The annoying part with DreamObjects is that it doesn't have any built in features for removing files created x days ago. Hence my problem. I would like to add a call to a bash file from my bash file. If that makes sense. :) If not, the code in question is below, you should be able to understand then.
I would really like to be able to add the code to my current script instead of using a separate file. I just don't know how to rewrite it properly without spending more time than I have on it. I found the code at.
My variables that matter for this problem:
BUCKET="mybucket"<br>
EXPIRE="7 days"
This is the line that calls the file:
sh ./s3-del-old.sh ''"${BUCKET}"'/backup' '"${EXPIRE}"'
This gives me an error from date:
invalid date `-"${EXPIRE}"'
The file uses the following syntax to work:
s3-del-old "bucket" "30 days"
It works perfectly when I use it on the command line on its own; I just want to add the call to one file so that I can use one cron job instead of two. Plus, this way I can use the script with any of my domains/buckets by changing the variables. :)
The "other script" that you need to call is a bash script.
A bash script (usually) will not work if called as sh, as you are doing:
sh ./s3-del-old.sh ''"${BUCKET}"'/backup' '"${EXPIRE}"'
Please call the script with bash:
bash ./s3-del-old.sh "${BUCKET}"/backup "${EXPIRE}"
Or even better, let the script choose the shell that should run it:
./s3-del-old.sh "${BUCKET}"/backup "${EXPIRE}"
With the shebang of the file s3-del-old.sh:
#!/bin/bash
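To see why the quoting matters, here is a tiny hypothetical stand-in for s3-del-old.sh (not the real deletion logic) that just prints the arguments it receives:
#!/bin/bash
# Hypothetical stub, only to show how the arguments arrive in the called script.
echo "bucket/prefix: $1"
echo "expire:        $2"
Called as ./s3-del-old.sh "${BUCKET}/backup" "${EXPIRE}", it prints expire: 7 days, i.e. the two words arrive as a single second argument. The nested-single-quote version from the question passes the literal text "${EXPIRE}" instead, which is exactly what shows up in the invalid date error.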
Sometimes I amaze myself at how difficult I make things... s3cmd has an expire function for files by creation date... that will be a lot easier...
Really a big thank you to all that helped!
My bad, this was supposed to be a comment, not an answer. :)

Using variables between files in shell / bash scripting

This question has been posted here many times, but it never seems to answer my question.
I have two scripts. The first one contains one or multiple variables, the second script needs those variables. The second script also needs to be able to change the variables in the first script.
I'm not interested in sourcing (where the first script containing the variables runs the second script) or exporting (using environment variables). I just simply want to make sure that the second script can read and change (get and set) the variables available in the first script.
(PS. If I misunderstood how sourcing or exporting works, and it applies to my scenario, please let me know. I'm not completely closed to those methods, after what I've read, I just don't think those things will do what I want)
Environment variables are per process. One process can not modify the variables in another. What you're asking for is not possible.
The usual workaround for scripts is sourcing, which works by running both scripts in the same shell process, but you say you don't want to do that.
I've also given this some thought: I would use files as variables. For example, in script 1 you write the variable values to files:
echo $varnum1 > /home/username/scriptdir/vars/varnum1
echo $varnum2 > /home/username/scriptdir/vars/varnum2
And in script 2 you read the values from the files back into the variables:
varnum1=$(cat /home/username/scriptdir/vars/varnum1)
varnum2=$(cat /home/username/scriptdir/vars/varnum2)
Both scripts can read or write the variables at any given time. In theory, two scripts could try to access the same file at the same time; I'm not sure exactly what would happen, but since each file only contains one value, the time to read or write should be extremely short.
To reduce those times even further, you can use a ramdisk.
I think this is much better than scripts editing each other (yuck!). Live editing of scripts can mess them up, and it only takes effect when you run the script again after the edit was made.
Good luck!
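A minimal sketch of that idea as two runnable scripts (the vars directory path is just an example):
#!/bin/bash
# script1.sh - writes its variable values out, one file per variable.
VARDIR="/home/username/scriptdir/vars"
mkdir -p "$VARDIR"
varnum1=42
varnum2="some value"
echo "$varnum1" > "$VARDIR/varnum1"
echo "$varnum2" > "$VARDIR/varnum2"

#!/bin/bash
# script2.sh - reads the values back into variables of the same name.
VARDIR="/home/username/scriptdir/vars"
varnum1=$(cat "$VARDIR/varnum1")
varnum2=$(cat "$VARDIR/varnum2")
echo "varnum1=$varnum1 varnum2=$varnum2"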
So after a long search on the web and a lot of trying, I finally found some kind of a solution. Actually, it's quite simple.
There are some prerequisites though.
The variable you want to set already has to exist in the file you're trying to set it in (I'm guessing the variable can be created as well when it doesn't exist yet, but that's not what I'm going for here).
The file you're trying to set the variable in has to exist (obviously. I'm guessing again this can be done as well, but again, not what I'm going for).
Write
sudo sed -i 's/^\(VARNAME=\).*/\1VALUE/' FILENAME
So, for example, setting the variable called Var1 to the value 5 in the file test.ini:
sudo sed -i 's/^\(Var1=\).*/\15/' test.ini
Read
sudo grep -Po '(?<=VARNAME=).*' FILENAME
So, for example, reading the variable called Var1 from the file test.ini:
sudo grep -Po '(?<=Var1=).*' test.ini
Just to be sure
I've noticed some issues when running the script that sets variables from a different folder than the one where your script is located.
To make sure this always goes right, you can do one of two things:
sudo sed -i 's/^\(VARNAME=\).*/\1VALUE/' `dirname $0`/FILENAME
So basically, just put `dirname $0`/ (including the backticks) in front of the filename.
The other option is to make `dirname $0`/ a variable (again including the backticks), which would look like this.
my_dir=`dirname $0`
sudo sed -i 's/^\(VARNAME=\).*/\1VALUE/' $my_dir/FILENAME
So basically, if you've got a file named test.ini containing the line Var1= (in my tests the variable can start out empty and you will still be able to set it; mileage may vary), you will be able to set and get the value of Var1.
I can confirm that this works (for me), but since all of you, with way more experience in scripting than me, didn't come up with this, I'm guessing it's not a great way to do it.
Also, I couldn't tell you the first thing about what's happening in those commands above; I only know that they work.
So if I'm doing something stupid, or if you can explain what's happening in the commands above, please let me know. I'm very curious to hear what you think of this solution.
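For what it's worth, here is a minimal sketch that wraps those two commands into get/set helper functions, using the dirname $0 trick from above to locate the file. sudo is only needed if your user cannot write the file, and values containing / or & would need extra escaping for sed:
#!/bin/bash
# Locate the config file relative to this script's own directory.
my_dir=$(dirname "$0")
CONF="$my_dir/test.ini"

# Read a variable's value:  get_var Var1
get_var() {
    grep -Po "(?<=$1=).*" "$CONF"
}

# Set a variable's value:   set_var Var1 5
set_var() {
    sed -i "s/^\($1=\).*/\1$2/" "$CONF"
}

set_var Var1 5
echo "Var1 is now: $(get_var Var1)"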

Running bash shell in Maemo

I have attempted to run the following bash script on my internet tablet (a Nokia N810 running Maemo Linux). However, it doesn't seem to run, and I have no clue what's wrong with it (it runs on my Ubuntu system if I change the directories). It would be great to get some feedback on this or to hear of similar experiences with this issue. Thanks.
WORKING="/home/user/.gpe"
SVNPATH="/media/mmc1/gpe/"
cp calendar categories contacts todo $WORKING
What actually happens when you run your script? It's helpful if you include details of error messages or behavior that differs from what's expected and in what way.
If $WORKING contains the name of a directory, hidden or not, then the cp should copy those four files into it. Then ls -l /home/user/.gpe should show them plus whatever else is in there, regardless of whether it's "hidden".
By the way, the initial dot in a file or directory name doesn't really "hide" the entry, it's just that ls and echo * and similar commands don't show them, while these do:
ls -la
ls -d .*
ls -d {.*,*}
echo .*
echo {.*,*}
The cp command can copy multiple sources to a single destination, provided the destination is a directory.
Does the directory /home/user/.gpe exist?
Bear in mind that the leading dot in the name can make it hidden unless you use ls -a
I tried your commands in cygwin:
But I used .gpe instead of /home/user/.gpe
I did a touch calendar categories contacts todo to create the files.
It worked fine.
If that's the entirety of your script, it's missing two, possibly three, things (a corrected sketch follows this list):
A shebang line, such as #!/bin/sh at the start
Use of $SVNPATH. You probably want to cd $SVNPATH before the cp command. Your script should not assume the current working directory is correct.
Possibly execute permission on the script: chmod a+x script
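Putting those points together, a minimal corrected sketch of the script might look like this:
#!/bin/sh
# Source and destination directories from the original script.
WORKING="/home/user/.gpe"
SVNPATH="/media/mmc1/gpe/"
# Work from the source directory instead of assuming the caller's cwd is right.
cd "$SVNPATH" || exit 1
cp calendar categories contacts todo "$WORKING"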
Do you already have the /home/user/.gpe directory present? And also, try adding a -R parameter so that the directories are copied recursively.