Matlab script execution from bash

I am able to execute the following from the terminal:
matlab -nojvm < span.m
This works fine and produces the required output.
However, in the same directory, if I write a bash script:
#!/bin/bash
matlab -nojvm < span.m
I get the following error when I execute it:
wallShearStresswallsconstant=importdata("wallShearStress_wallBottom.raw");
|
Error: The input character is not valid in MATLAB statements or expressions.
Undefined function 'wallShearStresswallsconstant' for input arguments of type
'double'.
Please let me know what I am doing incorrectly.
The matlab script is as follows. It reads a file (wallShearStress_wallBottom.raw) with 6 columns and 45288 rows (all numbers); for testing purposes it doesn't matter what the numbers are.
clear all
clc
wallShearStresswallsconstant=importdata("wallShearStress_wallBottom.raw");
ly=110;%64; %nz
lx=407;%239;%nx
ShearStress=zeros(lx,5);
%Uinf=15.894579;
Uinf=16.77;
i=1;
j=1;
k=1;
while i<lx+1
    while j<ly+1
        ShearStress(i,1)=wallShearStresswallsconstant(k,1);
        ShearStress(i,2)=wallShearStresswallsconstant(k,2);
        ShearStress(i,3)=wallShearStresswallsconstant(k,3);
        if wallShearStresswallsconstant(k,4) < 0
            ShearStress(i,4)=ShearStress(i,4)+1;
        else
            ShearStress(i,5)=ShearStress(i,5)-1;
        end
        j=j+1;
        k=k+1;
    end
    j=1;
    i=i+1;
end
SS = ShearStress;
SS(:,5) = SS(:,4)-SS(:,5);
SS(:,4) = SS(:,4)./SS(:,5);
plot(SS(:,1),SS(:,4))
SS = SS';
fileID = fopen('new.txt', 'w');
fprintf(fileID,'%f %f %f %f %f\n',SS);
fclose(fileID);

The double quotes in the importdata call are the problem; older MATLAB parsers do not accept double-quoted strings, which is what the "input character is not valid" error points at. Please use single quotes instead:
wallShearStresswallsconstant=importdata('wallShearStress_wallBottom.raw');
And the common bash command for executing a MATLAB script file is like this:
matlab -nodisplay -nojvm -nosplash -nodesktop -r \
"try, span, catch, exit(1), end, exit(0);"
where span is the name of your .m file (without the .m extension).

Related

capture all bash commands as parameters for a custom runner

I have a custom 'runner'-script that I need to use to run all of my terminal commands. Below you can see the general idea in the script.
#!/usr/bin/env bash
echo "Running '$@'"
# do stuff before running cmd
"$@"
echo "Done"
# do stuff after running cmd
I can use the script in bash as follows:
$ ./run.sh echo test
Running 'echo test'
test
Done
$
I would like to use it like this:
$ echo test
Running 'echo test'
test
Done
$
Bash has trap ... DEBUG and PROMPT_COMMAND, which let me execute something before and after a command, but is there something that would allow me to execute something instead of the command?
There is also command_not_found_handle, which would work if I had an empty PATH environment variable, but that seems too dirty.
After some digging I ended up looking at the bash source code, and found that bash does not support custom executors. Below is a patch that adds a new handle working similarly to command_not_found_handle.
diff --git a/eval.c b/eval.c
index f02d6e40..8d32fafa 100644
--- a/eval.c
+++ b/eval.c
@@ -52,6 +52,10 @@
extern sigset_t top_level_mask;
#endif
+#ifndef EXEC_HOOK
+# define EXEC_HOOK "command_exec_handle"
+#endif
+
static void send_pwd_to_eterm __P((void));
static sighandler alrm_catcher __P((int));
@@ -172,7 +176,15 @@ reader_loop ()
executing = 1;
stdin_redir = 0;
- execute_command (current_command);
+ SHELL_VAR *hookf = find_function (EXEC_HOOK);
+ if (hookf == 0) {
+ execute_command (current_command);
+ } else {
+ char *command_to_print = make_command_string (current_command);
+ WORD_LIST *og = make_word_list(make_word(command_to_print), (WORD_LIST *)NULL);
+ WORD_LIST *wl = make_word_list(make_word(EXEC_HOOK), og);
+ execute_shell_function (hookf, wl);
+ }
exec_done:
QUIT;
One can then define command_exec_handle() { eval "$1"; }, which will be executed instead of the original command given at the prompt; the original command arrives in full as the first parameter. The command_exec_handle function can be defined in .bashrc and it works as expected.
Notice: this is very dangerous! If you mess up and put a bad command_exec_handle in your .bashrc, you might end up with a shell that does not execute commands. It will be quite hard to fix without booting from a live CD.
It seems you have the same problem listed here. If you want to run some commands when your original command was not found, Bash 4's command_not_found_handle will certainly fit your needs.
Try to be more specific, maybe with some code snippets that do or do not work, in order to help us help you...

Command not found when running a local executable from csh script

My current project involves the use of a .go executable written in Fortran 77 in the mid-eighties. My only access to it at the moment is through ssh to a server running csh. I have written the following script:
set inpdir = $argv[1]
mkdir ${inpdir}"_out"
set j = 1
while ($j <= 5)
    set i = 0
    while ($i <= 20)
        "tms96-fnl.go <./"${inpdir}"/inp"${j}"0"${i}".d> ./"${inpdir}"_out/out"${j}"0"${i}
        set i = i + 1
    end
    set j = j + 1
end
The result is the message:
tms96-fnl.go <./fftf/inp100.d> ./fftf_out/out100 -Command not found
Syntax error
If I were to type the contents of that message (sans the "-Command not found") while in the same working directory as the script, it executes as expected.
The problem is the arrangement of quotes. You have:
"tms96-fnl.go <./"${inpdir}"/inp"${j}"0"${i}".d> ./"${inpdir}"_out/out"${j}"0"${i}
which csh interprets as a single command whose name begins with tms96-fnl.go <./, because the redirections sit inside the quotes and become part of the command word. I would do:
tms96-fnl.go < ./"${inpdir}"/inp"${j}"0"${i}".d > ./"${inpdir}"_out/out"${j}"0"${i}"

How can I call GAP functions from a shell script?

I want to get the result of a function of the GAP software. This is an interactive command-line tool mainly for mathematicians who work on group-theory-related topics.
The documentation/FAQ states, under 8.1: Can I call GAP functions from another programme?, that this is in general not possible; however, by running GAP as a child process and communicating with it using pipes, pseudo-ttys, UNIX FIFOs or some similar device, it can be done.
An example session using a package called CrystCat (Crystallographic Groups Catalog) looks like:
$ gap
gap> LoadPackage( "CrystCat" );
gap> DisplaySpaceGroupType( "P1" );
#I Space-group type (3,1,1,1,1); IT(1) = P1; orbit size 1; fp-free
gap> quit;
$ # exited 'gap' and back in my shell
As I am not familiar with these techniques, can someone show me a minimal example with the following functionality:
$ ./script.sh "P1"
#I Space-group type (3,1,1,1,1); IT(1) = P1; orbit size 1; fp-free
$
UPDATE: The accepted answer of this question doesn't work.
Answer by gap-support (using the stdin read-in capability of gap):
#!/bin/sh
if [ "$#" != "1" ]; then
    echo "Usage: test.sh <string>"
    exit 1
fi
gap -r -b -q << EOI
LoadPackage( "CrystCat" );
DisplaySpaceGroupType( "$1" );
EOI
It works exactly as asked, namely:
$ ./script.sh P1
#I Space-group type (3,1,1,1,1); IT(1) = P1; orbit size 1; fp-free

Using R from within a bash terminal

I have a set of *.txt files in a specific directory. I have written an R script called SampleStatus.r which contains a single function that reads and processes data and writes the results to an output file.
The function is like:
format_windpro(import_file="in.txt", export_file="out.txt")
I would like to use bash commands to read and process every file in one command using my R file.
Use Rscript. Example code:
for f in ${INPUT_DIR}/*.txt; do
    base=$(basename "$f")
    Rscript SampleStatus.R "$f" "${OUTPUT_DIR}/$base"
done
While in your SampleStatus.R you handle the command-line arguments like this:
#!/usr/bin/env Rscript
# ...
argv <- commandArgs(trailingOnly = TRUE)
# error checking...
import_file <- argv[1]
export_file <- argv[2]
# your function call
format_windpro(import_file, export_file)

bash time output processing

I know that time sends its timing statistics to stderr. But somehow I couldn't capture them, either in a bash script or in a file via redirection:
time $cmd 1>/dev/null 2>file
$output=`cat file`
Or
$output=`time $cmd 1>/dev/null`
I'm only interested in the timing, not the direct output of the command. I've read some posts on here but still had no luck finding a viable solution. Any suggestions?
Thanks!
Try:
(time $cmd) 1>/dev/null 2>file
so that (time $cmd) is executed in a subshell environment and you can then redirect its output.
(Using GNU time /usr/bin/time rather than the bash builtin) (Thanks @Michael Krelin)
(Or invoke it as \time) (Thanks @Sorpigal; if I ever knew this, I'd entirely forgotten it)
How about using GNU time's -o and maybe -a command-line options:
-o FILE, --output=FILE
Do not send the results to stderr, but overwrite the specified file.
-a, --append
(Used together with -o.) Do not overwrite but append.
I had a similar issue where I wanted to benchmark optimizations. The idea was to run the program several times, then compute statistics on the run durations.
I used the following command lines:
1st run: (time ./myprog) 2> times.log
Next runs: (time ./myprog) 2>> times.log
Note that my (bash?) built-in time outputs statistics in the form:
real 0m2.548s
user 0m7.341s
sys 0m0.007s
Then I ran the following Perl script to retrieve statistics:
#!/usr/bin/perl -w
open FH, './times.log' or die "ERROR: ", $!;
my $usercpt = 0;   # number of "user" lines seen
my $useracc1 = 0;  # sum of user times
my $useracc2 = 0;  # sum of squared user times
my $usermean = 0;
my $userdev = 0;
my $temp = 0;
while (<FH>)
{
    if ("$_" =~ /user/)
    {
        if ("$_" =~ /(\d+)m(\d{1,2})\.(\d{3})s/)
        {
            $usercpt++;
            $temp = $1*60 + $2 + $3*0.001;
            $useracc1 += $temp;
            $useracc2 += $temp**2;
        }
    }
}
close FH;
if ($usercpt != 0)
{
    $usermean = $useracc1 / $usercpt;
    $userdev = sqrt($useracc2 / $usercpt - $usermean**2);
    $usermean = int($usermean*1000)/1000;
    $userdev = int($userdev*1000)/1000;
}
else
{
    $usermean = "---";
    $userdev = "---";
}
print "User: ", $usercpt, " runs, avg. ", $usermean, "s, std.dev. ", $userdev, "s\n";
Of course, the regular expressions may require adjustments depending on your time output format. The script can also easily be extended to include real and system statistics.
