Redirect bash output called from batch file to console - bash

I have a batch file (build.bat) which calls a bash script (makelibs.sh). The bash script contains several commands which build 20 libraries from source.
If I run makelibs.sh from MSYS, I get continuous output. If I call it from the batch file, the output of each command appears only after that command has finished.
This makes it difficult to assess the current status of the process.
Is it possible to redirect the output of makelibs.sh so that I get continuous feedback on the execution?

I have a batch file (build.bat) which calls a bash script (makelibs.sh)
I strongly advise against doing this. You are calling a script from another script, when you could simply open Bash and run
makelibs.sh
However, if you insist on doing it this way, then perhaps start would work:
start bash.exe makelibs.sh
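If the goal is simply to watch the build progress while build.bat keeps control, one possible variation of that start line (a sketch only, assuming bash.exe is on PATH and makelibs.sh sits in the current directory) is:
REM Variation of the answer above: the quoted string is the console window title,
REM and /wait keeps build.bat blocked until makelibs.sh has finished.
start "makelibs" /wait bash.exe makelibs.sh
The separate console shows each library build as it happens, rather than everything at once when a command completes.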

Related

Perl script is running slow when calling bash

I am running a Perl script that calls another script via the command line, but it runs extremely slowly. The backticks make it run via the command line.
for my $results (@$data) {
    `/home/data/push_to_web $results->{file}`;
}
If I run the same command via bash (/home/data/push_to_web book.txt), the same script runs extremely fast. If I build a bash file that contains
/home/data/push_to_web book_one.txt
/home/data/push_to_web book_two.txt
/home/data/push_to_web book_three.txt
the code executes extremely fast. Is there any secret to speeding Perl up when it calls another script?
Your perl script fires up a new bash shell for each element in the array, whereas running the commands in bash from a file doesn't have to create any new shells.
Depending on how many files you have, and what's in your bash startup files, this could add a significant overhead.
One option would be to build a list of semicolon-separated commands in the for loop, and then run one system command at the end to execute them all in one bash process.
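A minimal Perl sketch of that suggestion, reusing the push_to_web path and the $data array reference from the question (no error handling beyond a warning):
my @cmds;
for my $results (@$data) {
    # collect the commands instead of spawning a shell per file
    push @cmds, "/home/data/push_to_web $results->{file}";
}
# a single shell process runs the whole batch
system(join('; ', @cmds)) == 0
    or warn "push_to_web batch reported failure: $?";
For a very large number of files the joined command line can get long, so splitting @cmds into chunks and running one system call per chunk would keep it manageable.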

How to write a wrapper script in unix that calls other shell scripts sequentially?

I am trying to write a wrapper script that calls other shell scripts in a sequential manner
There are 3 shell scripts that pick .csv files of a particular pattern from a specified location and process them.
I need to run them sequentially by calling them from one wrapper script
Let's consider 3 scripts
a.ksh, b.ksh and c.ksh that run sequentially in the same order.
The requirement is that the wrapper should fail if a.ksh fails but continue if b.ksh fails.
Please suggest.
Thanks in advance!
Something like:
./a.ksh && ./b.ksh; ./c.ksh
I haven't tried this out. Do test with sample scripts that fail/pass before using.
See: http://www.gnu.org/software/bash/manual/bashref.html#Lists
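A slightly more explicit wrapper sketch, assuming the three scripts live in the current directory and that "fail" means a non-zero exit status:
#!/bin/ksh
./a.ksh || exit 1                                 # abort the wrapper if a.ksh fails
./b.ksh || print "b.ksh failed, continuing" >&2   # report the failure but keep going
./c.ksh
This behaves like the one-liner above, except that a failure in a.ksh also stops c.ksh from running.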

Instead of giving command for batch mode, give .scm file path?

It is possible to supply batch commands directly with the -b flag, but if the commands become very long, this is no longer an option. Is there a way to give the path to an .scm script that was written to a file, without having to move the file into the scripts directory?
Not as far as I know. What you give to the -b flag is a Scheme statement, which implies that your function has already been loaded by the script executor process. You can of course add more directories that are searched for scripts via Edit>Preferences>Folders>Scripts.
If you write your script in Python, the problem is a bit different, since you can alter the Python path before loading the script code, but the command line remains a bit long.
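For illustration, a typical batch invocation then looks roughly like this, where my-registered-script is a hypothetical function that has already been installed in one of GIMP's script folders:
gimp -i -b '(my-registered-script "input.png")' -b '(gimp-quit 0)'
The -b arguments are plain Scheme statements, which is why the function has to be known to the Script-Fu interpreter before the command runs.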

Getting continuous output from shell script that is run by Applescript in Cocoa

I have a shell script that is using echo to give continuous output (the progress of an rsync) that I am using AppleScript to run with administrator privileges. Before, I was using NSTask to run the shell script, but I couldn't find a way to run it with the privileges that it needed, so now I am using AppleScript to run it. When it was running via NSTask, I could use an output pipe and waitForDataInBackgroundAndNotify to get the continuous output and put it into a text field, but now that I am using AppleScript, I cannot seem to find a way to accomplish this. The shell script is still using echo, but it seems to get lost in the AppleScript "wrapper." How do I make sure that the AppleScript sees the output from the shell script and passes it on to the application? Remember, this isn't one single output, but continuous output.
Zero is correct. When you use do shell script, you can consider it similar to using backticks in Perl. The command will be executed, and everything sent to STDOUT will be returned as the result.
The only workaround would be to have your command write its output to a temporary file and launch it so that do shell script returns without waiting (for example, by redirecting the output and putting the command in the background). From there, you can read from the file sequentially using native AppleScript commands. It's clunky, but it'll work in a pinch.
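A rough AppleScript sketch of that workaround, with hypothetical paths; per Apple's Technical Note TN2065, redirecting the output and backgrounding the command is what lets do shell script return immediately:
-- start the long-running script without blocking
do shell script "/path/to/rsync_wrapper.sh > /tmp/rsync_progress.log 2>&1 &" with administrator privileges
-- later, e.g. from a repeating timer, read the latest progress line
set progressLine to do shell script "tail -n 1 /tmp/rsync_progress.log"
The text field can then be refreshed from progressLine each time the timer fires.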

batch script print the command that would be executed rather than executing

Is it possible to set a cmd.exe shell / batch file to print what would be executed but not actually execute it?
For example, given a batch file that takes some arguments, based on those arguments selects some other batch files to run, those batch files execute some commands, may or may not call other files/commands etc.
I would like to be able to run the top-level batch file with all possible combinations of its input arguments and capture what each argument combination would execute - without actually trying to execute it.
e.g. conceptually I would want to be able to produce something like:
mybatchfile.bat 1 2 3 > mybatchfile_1_2_3.bat
mybatchfile.bat 99 3 42 > mybatchfile_99_3_42.bat
where mybatchfile_99_3_42.bat is the list of everything that WOULD be executed when running mybatchfile.bat 99 3 42 (NOT the output of executing those commands)
If this can't be done solely using cmd.exe, is there some way to achieve this by running the batch script in a Cygwin bash shell?
In bash you would use something like -x, which prints each command before it is executed (see "how to make bash scripts print out every command before executing"). The problem is that, to my knowledge, there is no exact equivalent for batch scripts. I would suggest you try placing:
@echo on
at the beginning of your script and:
@echo off
at the end of your script; that's the best starting place.
If you never want the batch file to actually execute the commands, you can insert echo before each command. It's not a perfect solution by any means, but it may be a work-around for fairly simple scripts.
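One way to make that echo-prefix idea switchable is sketched below; DRYRUN, input.csv and process_files.bat are made-up names for illustration:
@echo off
REM When DRYRUN is defined, _DO expands to "echo " and each command is only printed.
if defined DRYRUN (set "_DO=echo ") else (set "_DO=")
%_DO%copy input.csv staging\
%_DO%call process_files.bat 99 3 42
Calling the batch file with DRYRUN set prints the command lines; calling it without DRYRUN runs them normally.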
