I would like some insight on how to get started, or what general direction to look in, when trying to make a script or makefile that will run 3 make commands at once, all taking the same input. These three commands ask for the same input but output different Excel files, because each one manipulates the pulled data in a different way. If I could create a script or makefile that ran all three commands after entering the input only once, it would save me a ton of time.
This is all being done over PuTTY, pretty much (in terms of the commands).
Thanks,
NP
You want to use a shell script.
For instance, you can create run.sh with:
#!/bin/bash
# "$@" forwards this script's own arguments to each make, preserving quoting
# ($* would split any argument that contains spaces).
make FLAG1=ON "$@"
make FLAG2=ON "$@"
make FLAG3=ON "$@"
Make it executable and run `./run.sh MYCOMMONFLAG1=ON MYCOMMONFLAG2=OFF ...`
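If the three builds are independent and you really do want them running at once, a parallel variant is also possible. A sketch (it assumes the three make invocations don't write to the same intermediate files):
#!/bin/bash
# Start all three builds in the background, then wait for all of them to finish.
make FLAG1=ON "$@" &
make FLAG2=ON "$@" &
make FLAG3=ON "$@" &
wait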
Related
I have 3 scripts for example: first.ksh, second.ksh, third.ksh.
I run all of those scripts one by one, manually: when the first is done I run the second, and then the third. Those scripts take time to run, and doing them manually is time-consuming because it requires me to be in front of the computer.
How can I write a script which runs those scripts one by one, each starting automatically after the previous one finishes?
Assuming the scripts are in the current working directory,
./first.ksh ; ./second.ksh ; ./third.ksh
where ; is the separator for a "sequential list": the next command starts as soon as the previous one exits.
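Since the goal is not having to sit in front of the computer, you could also detach the whole chain from the terminal. A sketch, assuming the scripts are in the current directory (run.log is a made-up name):
# nohup keeps the chain alive after you log out; all output goes to run.log.
# && instead of ; stops the chain if one of the scripts fails.
nohup sh -c './first.ksh && ./second.ksh && ./third.ksh' > run.log 2>&1 &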
The topic you're looking for is "shell programming". There are many Unix shells, and they share the basic features defined by POSIX and the Single UNIX Specification. Most of them have additional features as well as online documentation.
I have a bash script that executes a series of commands, some involving redirection. See cyrus-mark-ham-spam.
I want the script to have a test mode, where all the commands run are printed instead of executing them. As you can see, I have tried to do that by just putting "echo" on the front of each command in test mode.
Unfortunately this doesn't deal with redirection - any redirections are still done, so the program leaves lots of temp files littered about the place when run in test mode.
I have tried various ways to get round this, like quoting the whole command and passing it to a function that either prints it or runs it, but either the redirections work in test mode, or they don't work in run mode.
I thought this must have come up before, and wonder if there is a known solution which does not involve every command being repeated inside an if TEST wrapper?
Please note, this is NOT a duplicate of show commands without executing them because neither that question, nor its answers, covers redirection (which is the essence of this question).
I see that it is not a duplicate, but there is no general solution to this; you need to look at each command separately.
As long as the command doesn't have arguments containing spaces, like
cmd -a -b -c > filename
you can quote it:
echo 'cmd -a -b -c > filename'
But real-life code is more complex, of course.
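One common workaround is to pass each command, redirections included, as a single quoted string to a helper that either prints it or evals it. A minimal sketch (the run name and the TEST variable are made up here):
run() {
    if [ -n "$TEST" ]; then
        printf '%s\n' "$*"   # test mode: just show the command
    else
        eval "$*"            # run mode: eval re-parses the string, so the redirection happens
    fi
}

run 'cmd -a -b -c > filename'
The catch is the same one noted above: because the string goes through eval, any embedded quoting is re-parsed, so complex commands still need care.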
This question has been posted here many times, but it never seems to answer my question.
I have two scripts. The first one contains one or multiple variables, the second script needs those variables. The second script also needs to be able to change the variables in the first script.
I'm not interested in sourcing (where the first script, containing the variables, runs the second script) or exporting (using environment variables). I simply want the second script to be able to read and change (get and set) the variables available in the first script.
(PS. If I misunderstood how sourcing or exporting works, and it applies to my scenario, please let me know. I'm not completely closed to those methods, after what I've read, I just don't think those things will do what I want)
Environment variables are per process. One process can not modify the variables in another. What you're asking for is not possible.
The usual workaround for scripts is sourcing, which works by running both scripts in the same shell process, but you say you don't want to do that.
I've also given this some thought, and I would use files as variables. For example, in script 1 you write the variable values to files:
echo "$varnum1" > /home/username/scriptdir/vars/varnum1
echo "$varnum2" > /home/username/scriptdir/vars/varnum2
And in script 2 you read the values from the files back into variables:
# note: no $ before the variable name when assigning
varnum1=$(cat /home/username/scriptdir/vars/varnum1)
varnum2=$(cat /home/username/scriptdir/vars/varnum2)
Both scripts can read or write the variables at any given time. In theory, two scripts could try to access the same file at the same time; I'm not sure exactly what would happen, but since each file only contains one value, the time to read or write should be extremely short.
To reduce those times even further, you can use a ramdisk.
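On most Linux systems, /dev/shm is already a RAM-backed tmpfs mount, so no extra setup is needed. A sketch, reusing the variable names above:
# /dev/shm is typically tmpfs on Linux, so these reads and writes never touch the disk
echo "$varnum1" > /dev/shm/varnum1
varnum1=$(cat /dev/shm/varnum1)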
I think this is much better than the scripts editing each other (yuck!). Live editing can mess up a script, and the change only takes effect once the script is started again after the edit.
Good luck!
So after a long search on the web and a lot of trying, I finally found some kind of a solution. Actually, it's quite simple.
There are some prerequisites though.
The variable you want to set already has to exist in the file you're trying to set it in (I'm guessing the variable can be created as well when it doesn't exist yet, but that's not what I'm going for here).
The file you're trying to set the variable in has to exist (obviously; I'm guessing again that this could be handled as well, but again, not what I'm going for).
Write
sudo sed -i 's/^\(VARNAME=\).*/\1VALUE/' FILENAME
So, for example, setting the variable called Var1 to the value 5, in the file test.ini:
sudo sed -i 's/^\(Var1=\).*/\15/' test.ini
Read
sudo grep -Po '(?<=VARNAME=).*' FILENAME
So, for example, reading the variable called Var1 from the file test.ini:
sudo grep -Po '(?<=Var1=).*' test.ini
Just to be sure
I've noticed some issues when the script that sets variables is run from a different folder than the one it is located in.
To make sure this always goes right, you can do one of two things:
sudo sed -i 's/^\(VARNAME=\).*/\1VALUE/' `dirname "$0"`/FILENAME
So basically, just put `dirname "$0"`/ (including the backticks) in front of the filename; quoting "$0" keeps this working even if the script's path contains spaces.
The other option is to put `dirname "$0"` in a variable (again including the backticks), which would look like this:
my_dir=`dirname "$0"`
sudo sed -i 's/^\(VARNAME=\).*/\1VALUE/' "$my_dir"/FILENAME
So basically, if you've got a file named test.ini which contains the line Var1=, you will be able to set and get the value of Var1. (In my tests, the variable can start empty and you will still be able to set it; mileage may vary.)
I can confirm that this works (for me), but since all of you, with way more experience in scripting than me, didn't come up with this, I'm guessing it's not a great way to do it.
Also, I couldn't tell you the first thing about what's happening in those commands above; I only know that they work.
So if I'm doing something stupid, or if you can explain what's happening in the commands above, please let me know. I'm very curious to hear what you think of this solution.
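For what it's worth: in the sed command, ^\(VARNAME=\) captures "VARNAME=" at the start of a line, .* matches the rest of the line, and \1VALUE writes the captured "VARNAME=" back followed by the new value. In the grep command, -P enables Perl-style regexes, (?<=VARNAME=) is a lookbehind that matches only text preceded by "VARNAME=", and -o prints just the matched text. A sketch wrapping the two as functions (set_var and get_var are made-up names; sudo is dropped, since it's only needed when the file isn't writable by the current user):
# set_var NAME VALUE FILE: replace whatever follows "NAME=" with VALUE
set_var() {
    sed -i "s/^\($1=\).*/\1$2/" "$3"
}
# get_var NAME FILE: print whatever follows "NAME=" on its line
get_var() {
    grep -Po "(?<=$1=).*" "$2"
}
Usage would look like set_var Var1 5 test.ini and x=$(get_var Var1 test.ini). Note the sketch assumes the name and value contain no characters special to sed (such as /).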
I'm trying to create a shell script that calls two commands in 2 separate directories and then shows their output. To call a command, I'm guessing it would be something like this: ./directory/ ./script.sh
Thanks in advance for your replies.
If you want to sequentially invoke the commands:
/path/to/command1; /path/to/command2
If you want to call the second command only if the first one succeeded:
/path/to/command1 && /path/to/command2
If you want to run them in parallel:
/path/to/command1 &
/path/to/command2
The output of the commands will go to standard output (most likely the terminal). If you run the two commands in parallel and they both produce output, you might want to redirect it to different files.
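To run each command from its own directory and keep their output apart, something like this sketch may help (the directory and log file names are made up):
# Each subshell cd's into its own directory, so the two cd's don't interfere.
(cd /path/to/dir1 && ./script.sh) > out1.log 2>&1 &
(cd /path/to/dir2 && ./script.sh) > out2.log 2>&1 &
wait    # block until both background jobs have finished
cat out1.log out2.log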
I have a load of bash scripts that back up different directories to different locations. I want each one to run every day, but I want to make sure they don't run simultaneously.
I've written a script that basically just calls each script in succession and sits in cron.daily, but I want this script to keep working when I add and remove backup scripts, without having to edit it manually.
So what I need to do is generate a list of the scripts (e.g. "dir -1 /usr/bin/backup*.sh") and then run each script it finds in turn.
Thanks.
#!/bin/sh
for script in /usr/bin/backup*.sh
do
    "$script"
done
#!/bin/bash
for SCRIPT in /usr/bin/backup*.sh
do
    [ -x "$SCRIPT" ] && [ -f "$SCRIPT" ] && "$SCRIPT"
done
If your system has run-parts then that will take care of it for you. You can name your scripts like "10script", "20anotherscript" and they will be run in order in a manner similar to the rc*.d hierarchy (which is run via init or Upstart, however). On some systems it's a script. On mine it's a binary executable.
It is likely that your system is using it to run hourly, daily, etc., cron jobs just by dropping scripts into directories such as /etc/cron.hourly/
Pay particular attention, though, to how you name your scripts. (Don't use dots, for example.) Check the man page specific to your system, since file naming restrictions may vary.
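For example, with a Debian-style run-parts (a sketch; the custom directory name is made up):
# --test lists the scripts run-parts would execute, in order, without running them
run-parts --test /etc/cron.daily
# run everything executable in a custom directory, in lexical order
run-parts /usr/local/backup-scripts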