How to write a wrapper script in Unix that calls other shell scripts sequentially? - shell

I am trying to write a wrapper script that calls other shell scripts in a sequential manner.
There are 3 shell scripts that pick up .csv files matching a particular pattern from a specified location and process them.
I need to run them sequentially by calling them from one wrapper script.
Let's consider 3 scripts,
a.ksh, b.ksh and c.ksh, that run sequentially in that order.
The requirement is that the wrapper should fail if a.ksh fails but continue if b.ksh fails.
Please suggest.
Thanks in advance!

Something like:
./a.ksh || exit 1; ./b.ksh; ./c.ksh
This makes the wrapper exit if a.ksh fails, while a failure of b.ksh is ignored and c.ksh still runs. Do test with sample scripts that fail/pass before using.
See: http://www.gnu.org/software/bash/manual/bashref.html#Lists
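A minimal, self-contained sketch of that wrapper logic, with shell functions standing in for the hypothetical a.ksh, b.ksh and c.ksh so the snippet can be run as-is:

```shell
#!/bin/sh
# Stand-ins for a.ksh, b.ksh and c.ksh from the question.
a() { echo "a ran"; }
b() { echo "b ran"; return 1; }   # simulate b.ksh failing
c() { echo "c ran"; }

a || exit 1                        # wrapper stops here if a.ksh fails
b || echo "b failed, continuing"   # b.ksh's exit status is tolerated
c                                  # c.ksh runs either way
```

Replacing the function calls with `./a.ksh`, `./b.ksh` and `./c.ksh` gives the real wrapper.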

Related

Bash Script: Wait Until Google Compute Job Done?

I have a bunch of bash scripts that I run sequentially. I'm going to consolidate to a single script but there's one part that's a bit tricky. Specifically, script C launches a Google Compute Engine job and I only want script D (the one immediately following it) to execute once that's done.
Is there a good way of doing this?
In case it helps, my new script would be:
source script_A.sh
source script_B.sh
source script_C.sh
**wait until cloud job has finished**
source script_D.sh
Thanks!
After gcloud ... & is called, use gcloudpid=$! (I don't think you have to export it, but it wouldn't hurt) to grab its PID. Then your main script will be:
source script_C.sh
wait $gcloudpid
source script_D.sh
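Putting the pieces together, a self-contained sketch of the background-then-wait pattern (`sleep 1` stands in for the gcloud command, which is assumed from the question):

```shell
#!/bin/sh
# "sleep 1" stands in for the long-running "gcloud ... &" job.
sleep 1 &              # in script_C.sh: launch the job in the background
gcloudpid=$!           # capture its process ID right away
wait "$gcloudpid"      # in the wrapper: block until that PID exits
echo "job done, running script_D.sh next"
```

`wait` also returns the background job's exit status, so the wrapper can still abort if the cloud job failed.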

Basic Shell Script call command from another directory with feedback

I'm trying to create a shell script that calls two commands in 2 separate directories and then shows their output. To call a command I'm guessing it would be something like this: ./directory/ ./script.sh
Thanks in advance for your replies.
If you want to sequentially invoke the commands:
/path/to/command1; /path/to/command2
If you want to call the second command only if the first one succeeded:
/path/to/command1 && /path/to/command2
If you want to run them in parallel:
/path/to/command1 &
/path/to/command2
The output of the commands will be the standard output (most likely the terminal). If you run the two commands in parallel and they produce some output, you might want to redirect it to different files.
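For example, a self-contained sketch of the parallel case with a separate log file per command (`echo` stands in for the placeholder `/path/to/command1` and `/path/to/command2`):

```shell
#!/bin/sh
# Run two commands in parallel, giving each its own log file so their
# output doesn't interleave on the terminal.
tmp=$(mktemp -d)
echo "output of command1" > "$tmp/cmd1.log" &
echo "output of command2" > "$tmp/cmd2.log" &
wait                       # block until both background jobs finish
cat "$tmp/cmd1.log" "$tmp/cmd2.log"
```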

Redirect bash output called from batch file to console

I have a batch file (build.bat) which calls a bash script (makelibs.sh). The bash script contains several commands which build 20 libraries from source.
If I run makelibs.sh from MSYS, I get continuous output. If I call it from the batch file, then I see the full output only at the end of every single command.
This makes it difficult to assess the current status of the process.
Is it possible to redirect the output of makelibs.sh in order to get a continuous feedback on the execution?
I have a batch file (build.bat) which calls a bash script (makelibs.sh)
I strongly advise against doing this. You are calling a script with a script, when you could simply open up Bash and run
makelibs.sh
However, if you insist on doing this, then perhaps start would work:
start bash.exe makelibs.sh

Ensuring Programs Run In Ordered Sequence

This is my situation:
I want to run Python scripts sequentially, starting with scriptA.py. When scriptA.py finishes, scriptB.py should run, followed by scriptC.py. After these scripts have run in order, I need to run an rsync command.
I plan to create bash script like this:
#!/bin/sh
python scriptA.py
python scriptB.py
python scriptC.py
rsync blablabla
Is this the best solution for performance and stability?
To run a command only after the previous command has completed successfully, you can use a logical AND:
python scriptA.py && python scriptB.py && python scriptC.py && rsync blablabla
Because the whole statement will be true only if all are true, bash "short-circuits" and only starts the next statement when the preceding one has completed successfully; if one fails, it stops and doesn't start the next command.
Is that the behavior you're looking for?
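The short-circuit behavior is easy to see with `true` and `false` standing in for the scripts (a sketch, not the asker's actual commands):

```shell
#!/bin/sh
# && only runs the right-hand side if the left-hand side exited with 0.
true  && echo "ran after success"   # printed
false && echo "ran after failure"   # never printed; the chain stops
echo "the failed chain's exit status was $?"   # 1, from false
```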
If you have some experience with Python, it will almost certainly be better to write a Python script that imports and executes the relevant functions from the other scripts. That way you will be able to use Python's exception handling. Also, you can run the rsync from within Python.

batch script print the command that would be executed rather than executing

Is it possible to set a cmd.exe shell / batch file to print what would be executed but not actually execute it?
For example, given a batch file that takes some arguments, based on those arguments selects some other batch files to run, those batch files execute some commands, may or may not call other files/commands etc.
I would like to be able to run the top level batch file with all possible combinations of its input arguments and capture what each arg combination would execute - without actually trying to execute it.
e.g. conceptually would want to be able to produce something like:
mybatchfile.bat 1 2 3 > mybatchfile_1_2_3.bat
mybatchfile.bat 99 3 42 > mybatchfile_99_3_42.bat
where mybatchfile_99_3_42.bat is the list of everything that WOULD be executed when running mybatchfile.bat 99 3 42 (NOT the output of executing those commands)
If this can't be done solely using cmd.exe is there someway to achieve this by running the batch script in cygwin bash shell
In bash we would use -x to print each command before executing it (see: how to make bash scripts print out every command before executing). The problem is that, to my knowledge, there's no exact equivalent for batch scripts. I would suggest you try placing:
@echo on
at the beginning of your script and:
@echo off
at the end of your script; that's the best starting place. Note that @echo on still executes the commands, it just echoes each one first.
If you never want the batch file to actually execute the commands, you can insert echo before each command. It's not a perfect solution by any means, but it may be a work-around for fairly simple scripts.
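If the Cygwin-bash route mentioned in the question is taken, bash's own options cover both halves of the request; a quick sketch (the demo file is a throwaway created just for illustration):

```shell
#!/bin/sh
# bash -n parses a script without executing anything (a dry syntax check);
# bash -x executes it but prints each command (to stderr) before it runs.
demo=$(mktemp)
printf 'echo hello\n' > "$demo"
bash -n "$demo"            # no output, nothing executed
bash -x "$demo"            # prints "+ echo hello" to stderr, then "hello"
```

Neither option is a true "print what would run, without running it" for arbitrary scripts, since which branches are taken can depend on executing earlier commands.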
