How do I write a shell script that repeats a java program a specific number of times? - bash

Essentially I am looking to write a shell script, likely using a for loop, that would allow me to repeat a program call multiple times without having to do it by hand (in other words, I want to run the java TestFile command in the terminal multiple times without typing it each time).
I am trying to write it for the UNIX shell in bash, if that helps at all.
My program outputs a set of numbers that I want to analyze for end behavior, so I need to run many tests with many different inputs, and I want to streamline the process. I have a pretty basic understanding of shell scripting - I tried to teach myself today, but I couldn't really work out the syntax of the for loop or of the java call, though I could write them in a shell script with a little help.

This will do:
#!/bin/bash
javac TestFile.java   # compile the program once
for ((i = 1; i <= 50; i++))
do
    echo "Output of Iteration $i" >> outfile
    java TestFile >> outfile
done
This compiles your Java program and runs it 50 times, appending the output to a file named outfile. Change the 50 to however many iterations you want.
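If each run also needs a different input, the same loop can feed the program a changing argument. A minimal sketch, assuming TestFile accepts its input as a command-line argument (the input values here are made up; adjust to however your program actually reads input):
#!/bin/bash
javac TestFile.java   # compile once
# Hypothetical input values -- replace with the inputs you want to analyze.
for input in 10 100 1000 10000
do
    echo "Output for input $input" >> outfile
    java TestFile "$input" >> outfile
done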

#!/bin/bash
for i in {1..10}
do
    java TestFile   # insert your run command here (an empty loop body is a syntax error)
done

#!/bin/bash
LOOPS=50
# Brace expansion like {1..$LOOPS} does not work with variables,
# so use the C-style for loop instead.
for ((i = 1; i <= LOOPS; i++))
do
    java TestFile >> out.log
done

Related

Only partial creation of a csv file using a crontab command

I have a problem automating the generation of a csv file. The bash code used to produce the csv runs in parallel on 3 cores to reduce the run time; it first produces several csv files, which are then combined into a single csv file. The core of the code is this loop:
...
waitevery=3
for j in `seq 1 24`; do
    if ((j == 1)); then
        printf '%s\n' A B C D E | paste -sd ',' >> code${namefile}01${rr}.csv
    fi
    j=$(printf "%02d" $j)
    ../src/thunderstorm --mask-file=mask.grib const_${namefile}$j${rr}.grib surf_${namefile}$j${rr}.grib ua_${namefile}$j${rr}.grib hl_const.grib out &
    if ! ((c % waitevery)); then
        wait
    fi
    c=$((c+1))
done
...
where ../src/thunderstorm is a Fortran (.F90) program which produces the second and subsequent files.
If I run this code manually it produces the right csv file, but when it is run from a scheduled crontab entry it generates a csv file containing only the header A B C D E.
Any suggestions?
Thanks!
cron runs your script in an environment that often does not match your expectations.
Check that the PATH is correct and that the script is called from the correct location: ../src is obviously relative, but relative to what?
I find cron scripts to be much more reliable when they use full paths for input, output and programs.
As @umläute points out, cron runs your scripts but does not perform the typical initializations that happen when you open a terminal session. That means you should make no assumptions about your environment.
For scripts that may be invoked from the shell or from cron, I usually add something like this at the beginning:
BIN_DIR=/home/myhome/bin
PATH=$PATH:$BIN_DIR
Also, make sure you do not use relative paths to executables like ../src/thunderstorm. The working directory of a script invoked by cron may not be what you think. Use $BIN_DIR/../src/thunderstorm instead. If you want to save typing, add the relevant directories to the PATH.
The same logic goes for all other shell variables.
Doing a good initialization at the beginning of your script will allow you to run it from the shell for testing (or manual execution) and also as a cron job.
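A minimal sketch of such a preamble, using hypothetical directories (substitute your real layout):
#!/bin/bash
# Hypothetical absolute locations -- adjust to your setup.
BIN_DIR=/home/myhome/bin
SRC_DIR=/home/myhome/src
DATA_DIR=/home/myhome/data
export PATH=$PATH:$BIN_DIR
cd "$DATA_DIR" || exit 1   # pin the working directory explicitly
# Call the program by absolute path instead of ../src/thunderstorm:
"$SRC_DIR/thunderstorm" --mask-file="$DATA_DIR/mask.grib"   # remaining arguments as in the loop above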

First line in file is not always printed in bash script

I have a bash script that prints a line of text into a file and then calls a second script that prints some more data into the same file. Let's call them script1.sh and script2.sh. The reason it's split into two scripts is that I have different versions of script2.sh.
script1.sh:
rm -f output.txt
echo "some text here" > output.txt
source script2.sh
script2.sh:
./read_time >> output.txt
./run_program
./read_time >> output.txt
Variations on the three lines in script2.sh are repeated.
This seems to work most of the time, but every once in a while the file output.txt does not contain the line "some text here". At first I thought it was because I was calling script2.sh like this: ./script2.sh. But even using source the problem still occurs.
The problem is not reproducible, so even when I try to change something I don't know if it's actually fixed.
What could be causing this?
Edit:
The scripts are very simple. script1 is exactly as you see here, but with different file names. script 2 is what I posted, but then the same 3 lines repeated, and ./run_program can have different arguments. I did a grep for the output file, and for > but it doesn't show up anywhere unexpected.
The way these scripts are used is that script1 is created by a program (the only difference between the versions is the source script2.sh line). This script1.sh is then run on a different computer (Linux on an FPGA, actually) using ssh. Before that is done, the output file is also deleted using ssh. I don't know why; I didn't write all of this. Also, I've checked the code running on the host. The only mention of the output file is when it is deleted using ssh and when it is copied back to the host after script1 is done.
Edit 2:
I finally managed to make the problem reproducible at a reasonable rate by stripping script2.sh of everything but a single line that prints into the file. This also let me test a bit faster. Once I had this, the problem occurred between 1 and 4 times per 10 runs. Removing the command that deleted the file over ssh before the script was run seems to have solved the problem. I will test some more to be sure, but I think it's solved, although I'm still not sure why it was a problem: I thought the ssh command would not return before all the remove commands had finished.
It is hard to tell without seeing the real code. The most likely explanation is that you have a typo, > instead of >>, somewhere in one of the script2.sh files.
To verify this, set the noclobber option with set -o noclobber. The shell will then report an error instead of silently truncating an existing file with >.
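For example, a quick demonstration in an interactive shell:
set -o noclobber
echo first > output.txt    # creates the file
echo second > output.txt   # fails: cannot overwrite existing file
echo third >> output.txt   # appending is still allowed
echo forced >| output.txt  # >| deliberately overrides noclobber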
Another possibility is that the file is removed under certain rare conditions. Or it is damaged by some command that has random access to it - look for commands using this file without >>. Or it is used by some command as both input and output, and the two step on each other - look for the file being used with <.
Lastly, you could have a race condition with a command that writes to the file in the background and was started before that echo.
Can you grep all your scripts for 'output.txt'? What about scripts called inside read_time and run_program?
It looks like something in one of the script2.sh scripts must be either overwriting, truncating or doing a substitution on output.txt.
For example, there could be a '> output.txt' buried inside a conditional for a condition that rarely holds. Just a guess, but it would explain why you don't always see it.
This is an interesting problem. Please post the solution when you find it!
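As a concrete starting point, a recursive grep over the directory that holds the scripts (the path here is just an example) lists every place the file name appears:
grep -rn 'output.txt' .
Matches that append with >> are expected; any match with a single >, a removal, or a use as input with < deserves a closer look.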

Is it possible to start a program from the command line with input from a file without terminating the program?

I have a program that users can run using the command line. Once running, it receives and processes commands from the keyboard. It's possible to start the program with input from disk like so: $ ./program < startScript.sh. However, the program quits as soon as the script finishes running. Is there a way to start the program with input from disk using < that does not quit the program and allows additional keyboard input to be read?
(cat foo.txt; cat) | ./program
I.e., create a subshell (that's what the parentheses do) which first outputs the contents of foo.txt and after that just copies whatever the user types. Then, take the combined output of this subshell and pipe it into stdin of your program.
Note that this also works for other combinations. Say you want to start a program that always asks the same questions. The best approach would be to use "expect" and make sure the questions didn't change, but for a quick workaround, you can also do something like this:
(echo y; echo $file; echo n) | whatever
Use system("pause")(in bash it's just "pause") in your program so that it does not exit immediatly.
There are alternatives such as
dummy read
infinite loop
sigsuspend
many more
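A minimal sketch of the dummy-read variant (some_command is a placeholder for whatever the script actually runs):
#!/bin/bash
./some_command   # placeholder for the real work
# The dummy read blocks until the user presses Enter -- a rough
# shell analogue of Windows' pause.
read -r -p "Press Enter to exit..."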
Why not try something like this:
BODY=$(./startScript.sh)
if [ -n "$BODY" ]
then printf '%s\n' "$BODY" | ./program
fi
That depends on how the program is coded; it cannot be achieved from within startScript.sh itself, if that is what you're trying to do.
What you could do is write a callingScript.sh that asks for the input first and then calls the program < startScript.sh.
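One way to read that idea, reusing the subshell trick from the accepted answer (callingScript.sh and the prompt text are hypothetical):
#!/bin/bash
# callingScript.sh: ask for the extra input up front, then feed the
# canned script followed by that input into the program.
read -r -p "Command to run after the script: " extra
(cat startScript.sh; printf '%s\n' "$extra") | ./program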

Help with aliases in shell scripts

I have the following code, which is intended to run a Java program on some input and check its output against an expected-results file for verification.
#!/bin/bash
java Program ../tests/test"$#".tst > test"$#".asm
spim -f test"$#".asm > temp
diff temp ../results/test"$#".out
The gist of the above code is to:
Run Program on a test file in another directory, and redirect the output into an assembly file.
Run the SPIM MIPS simulator on that output, redirecting it into a file called temp.
Run diff on the output I generated and some expected output.
I made this shell script to help me automate checking of my homework assignment for class. I didn't feel like manually checking things anymore.
I must be doing something wrong: although this script works with one argument, it fails with more than one. The output I get when I use $# is:
./test.sh: line 2: test"$#".asm: ambiguous redirect
Cannot open file: `test0'
EDIT:
Ah, I figured it out. This code fixed the problem:
#!/bin/bash
for arg in "$@"
do
    java Parser ../tests/test"$arg".tst > test"$arg".asm
    spim -f test"$arg".asm > temp
    diff temp ../results/test"$arg".out
done
It turns out that $# expands to the number of arguments; "$@" is what expands to each command-line argument in turn, which is what the loop needs.
$# expands to the number of command-line arguments, not to the arguments themselves, so with several arguments all your commands were pointing at files named after the argument count. To iterate over the actual arguments, use "$@".
What do you expect to happen for multiple arguments?
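A quick way to see the difference between $# and "$@" (args.sh is a hypothetical name; run it as ./args.sh one two three):
#!/bin/bash
echo "Number of arguments: $#"   # prints 3
for arg in "$@"                  # expands to: one two three
do
    echo "Argument: $arg"
done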

Making a command loop in shell with a script

How can one loop a command/program in a Unix shell without writing the loop into a script or other application?
For example, I wrote a script that outputs a light sensor value, but I'm still testing it right now, so I want to run it in a loop by invoking the executable repeatedly.
Maybe I'd also like to run "ls" or "df" in a loop. I know I can do this easily in a few lines of bash code, but being able to type a one-liner in the terminal for any given command would be just as useful to me.
You can write the exact same loop you would write in a shell script on a single line, using semicolons instead of newlines, as in
for NAME [in LIST ]; do COMMANDS; done
From there you could write a shell script called, for example, repeat that, given a command, runs it N times, simply by replacing COMMANDS with the command passed in as arguments (see the sketch below).
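A minimal sketch of such a repeat script (the name and calling convention are just one possibility):
#!/bin/bash
# Usage: repeat N command [args...]
# Runs the given command N times.
count=$1
shift
for ((i = 1; i <= count; i++))
do
    "$@"
done
With this saved as repeat and made executable, repeat 50 java TestFile would run the program 50 times.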
I recommend using "watch"; it does exactly what you want, and it clears the terminal before each execution of the command, so it's easy to monitor changes.
You probably have it already; just try watch ls or watch ./my_script.sh. You can control how long to wait between executions, in seconds, with the -n option, and you can use -d to highlight the differences between consecutive runs.
Try:
Run ls every second:
watch -n 1 ls
Run my_script.sh every 3 seconds, and highlight differences:
watch -n 3 -d ./my_script.sh
watch program man page:
http://linux.die.net/man/1/watch
This doesn't exactly answer your question, but I felt it was relevant. One of the great things about shell looping is that some commands return lists of items. It's obvious, of course, but something you can do with a for loop is execute a command on each item in that list.
for file in $(find . -name '*.wma'); do cp "$file" ./new/location/; done
You can get creative and do some very powerful stuff.
Aside from accepting arguments, anything you can do in a script can be done on the command line. Earlier I typed this directly into bash to watch a directory fill up as I transferred files:
while sleep 5s
do
    ls photos
done
