Changing the parameters passed to a sub-script inside a loop - bash

I have a bash script that takes a file and performs an operation on it; the operation produces an out_file. When it's done, I call another script (script_2) from within my script to perform a further operation on the out_file. The problem I have is passing parameters to script_2, which are different for each initial file:
#!/bin/bash
for i in $(ls folder); do
./operation.sh folder/$i # this step produces out_file.$i
./script_2 out_file.$i parameter_1 parameter_2
done
Thus, parameter_1 and parameter_2 should be different for each out_file. Is it possible to pass different parameters on each pass through the loop, rather than launching script_2 separately for each file?

Without any more information it's hard to know exactly what your purpose is, but here is a minimal working example:
$ ls
script1.sh script2.sh script3.sh testfiles
$ ls ./testfiles/
file1.txt file2.txt
$ cat script1.sh
#!/bin/bash
for i in $(ls ./testfiles/); do
./script2.sh $i
./script3.sh ./testfiles/out_file.$i parameter_1 parameter_2
done
$ cat script2.sh
#!/bin/bash
touch ./testfiles/out_$1.txt
exit
$ cat script3.sh
#!/bin/bash
echo "dollar1: $1
dollar2 $2
dollar3 $3 "
$ ./script1.sh
dollar1: ./testfiles/out_file.file1.txt
dollar2 parameter_1
dollar3 parameter_2
dollar1: ./testfiles/out_file.file2.txt
dollar2 parameter_1
dollar3 parameter_2
$ ls ./testfiles/
file1.txt file2.txt out_file1.txt.txt out_file2.txt.txt
As you can see, it loops through all the files in the folder, creates the out file, and then passes it into script 3.
I wouldn't advise running the script a second time in its current form (it would then loop over the out files as well), but you get the idea.
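To the original question of varying parameter_1 and parameter_2 per file: one option (a sketch, not part of the example above; the file names and parameter values are invented placeholders) is a lookup table keyed by file name, such as a bash 4+ associative array:
#!/bin/bash
# Sketch: per-file parameters kept in an associative array (bash 4+).
# The keys and values here are hypothetical.
declare -A params=(
    [file1.txt]="alpha 10"
    [file2.txt]="beta 20"
)
for f in ./testfiles/*.txt; do
    i=${f##*/}                      # bare file name, directory stripped
    ./script2.sh "$i"               # produces the out file
    set -- ${params[$i]}            # deliberately unquoted: split "p1 p2" into $1 and $2
    ./script3.sh "./testfiles/out_file.$i" "$1" "$2"
done
The same mapping could equally live in a side file read line by line with a while read loop; the array just keeps the sketch self-contained.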

bash: cat command's output only if it succeeds

I'm trying to redirect a command's output into a file only if the command has been successful, because I don't want to erase the file's contents when it fails.
(The command is reading the file as input.)
I'm currently using
cat <<< $( <command> ) > file;
Which erases the file if it fails.
It's possible to do what I want by storing the output in a temp file, like this:
<command> > temp_file && cat temp_file > file
But it looks kind of messy to me; I want to avoid manually creating temp files (I know the <<< redirection creates a temp file).
I finally came up with this trick
cat <<< $( <command> || cat file) > file;
Which will not change the contents of the file... but which is even messier, I guess.
Perhaps capture the output into a variable, and echo the variable into the file if the exit status is zero:
output=$(command) && echo "$output" > file
Testing
$ out=$(bash -c 'echo good output') && echo "$out" > file
$ cat file
good output
$ out=$(bash -c 'echo bad output; exit 1') && echo "$out" > file
$ cat file
good output
Remember, the > operator replaces the existing contents of the file with the output of the command. If you want to save the output of multiple commands to a single file, you’d use the >> operator instead. This will append the output to the end of the file.
For example, the following command will append output information to the file you specify:
ls -l >> /path/to/file
So, to log the command's output only if it succeeds, you can apply the same capture-and-test pattern with the append operator:
output=$(command) && echo "$output" >> /path/to/file
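If the output is too large to buffer comfortably in a variable, a temp-file variant is another option (a sketch: the cleanup policy is an assumption, and the rename is only atomic when the temp file and the target sit on the same filesystem, so mktemp's default /tmp location may need overriding):
tmp=$(mktemp) &&
if command > "$tmp"; then
    mv "$tmp" file      # replace the file only on success
else
    rm -f "$tmp"        # discard output from the failed run
fi
Unlike a plain > file redirection, the original file is never touched when the command fails.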

loop over files with bash script

I have a js script that converts kml location-history files to csv. I wrote a bash script to loop through all the files in a directory. The js script works when I execute it from the command line: ./index.js filename.kml > filename.csv
But nothing happens when I execute the bash file that is supposed to loop through all files.
I know it probably is a simple mistake but I can't spot it.
#!/bin/bash
# file: foo.sh
for f in *.kml; do
test -e "${f%.kml}" && continue
./index.js "$f" > "$f.csv"
done
Just delete the "&& continue"; if I'm not wrong, you're skipping the current iteration with the "continue" keyword, and that's why nothing happens.
EDIT
Also, you shouldn't need to test whether the file exists; the for loop is enough to be sure that "f" will be a valid .kml file (as long as the glob matches something). Anyway, if you still want to do it, do it like this:
#!/bin/bash
# file: foo.sh
for f in *.kml; do
if [ -e "$f" ]; then
./index.js "$f" > "$f.csv"
fi
done
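One caveat worth adding (not part of the original answer): if the directory contains no .kml files, the unmatched glob expands to the literal string *.kml and index.js gets called on a file that doesn't exist. Setting nullglob avoids that, and stripping the extension gives a cleaner output name:
#!/bin/bash
# Sketch: with nullglob set, an unmatched *.kml glob expands to
# nothing, so the loop body simply never runs.
shopt -s nullglob
for f in *.kml; do
    ./index.js "$f" > "${f%.kml}.csv"   # foo.kml -> foo.csv
done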

Why am I not finding any output files in the desired location?

I am trying to write a processing script and I am stuck at the beginning. The script does not seem to be wrong, but I cannot work out where the error is: it completes execution without complaint yet produces no output. Any debugging help?
#!/bin/sh
#
# Call with following arguments
# sh test.sh <output_basename> <fastq folder> <output_folder_loc>
#
#
bn=$1
floc=$2
outloc=$3
#Create an output directory
#opdir=$bn"_processed"
mkdir $outloc/$bn"_processed"
echo "output directory for fastq $outloc/$bn"_processed" ..."
fout=$outloc/$bn"_processed"
echo "$fout ..."
echo "performing assembly to create one fastq file for each read mates ..."
zcat $floc/*R1*.fastq.gz > $fout/$bn_R1.fastq
zcat $floc/*R2*.fastq.gz > $fout/$bn_R2.fastq
echo "done"
Run command:
sh test.sh S_13_O1_122 /home/vdas/data/floc/Sample_S_13_O1_122_S12919 /home/vdas/data/OC/RNA-Seq/STAR_run/mut_out
I do not see anything wrong in the code and it also runs without error, but I am still not getting any output. Can anyone point me to the problem?
First try to change two lines like this:
mkdir -p "$outloc/${bn}_processed"
fout="$outloc/${bn}_processed"
mkdir -p is good when the $outloc directory doesn't exist yet.
you could also test your arguments (the following may be bash-only, but it does work when bash is invoked as /bin/sh):
var=$1
if [ ${#var} -eq 0 ]; then
echo "var is not defined" >&2
exit 1
fi
That will test that the variable has some length; you might want to test other aspects as well. For instance, does
ls $floc/*R1*.fastq.gz
produce any output?
#!/bin/sh
#
# Call with following arguments
# sh test.sh <output_basename> <fastq folder> <output_folder_loc>
#
#
bn=$1
floc=$2
outloc=$3
#Create an output directory
#opdir=$bn"_processed"
mkdir $outloc/$bn"_processed"
echo "output directory for fastq $outloc/$bn"_processed" ..."
fout=$outloc/$bn"_processed"
echo "$fout ..."
echo "performing assembly to create one fastq file for each read mates ..."
echo $floc/*R1*.fastq.gz
echo $fout/${bn}_R1.fastq
zcat -v $floc/*R1*.fastq.gz > $fout/${bn}_R1.fastq
zcat -v $floc/*R2*.fastq.gz > $fout/${bn}_R2.fastq
echo "done"
This may be what you want. Note that in the original script, $bn_R1 is parsed as one variable name (bn_R1), which is unset, so both zcat lines redirect to the same hidden file, $fout/.fastq; writing ${bn}_R1 and ${bn}_R2 fixes that.
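As a general debugging aid (an addition, not part of either answer), shell tracing and unset-variable checking make this class of bug visible immediately:
#!/bin/sh
# Sketch: add these near the top of test.sh while debugging.
set -u    # referencing an unset variable (like $bn_R1) becomes a fatal error
set -x    # print each command after expansion, showing where output really goes
With set -u, the first zcat line aborts with a "parameter not set" style error instead of silently redirecting into the hidden file, and set -x prints every expanded redirection as it happens. The same effect is available without editing the script by running it as sh -ux test.sh with the usual arguments.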

Retrieving path to dotted-in script

Getting the path to a running script in bash is trivial via the $0 variable. However, this doesn't work if your script is being dotted in (sourced) by another script; instead, you get the path to the calling script. Consider the example:
#!/bin/bash
# script1.sh
echo $(readlink -f $0)
...
#!/bin/bash
# script2.sh
. /tmp/script1.sh
echo $(readlink -f $0)
The output from the above script is:
/tmp/script2.sh
/tmp/script2.sh
However, if $0 in a dotted-in script emitted the path to that script, the output would instead be:
/tmp/script1.sh
/tmp/script2.sh
How can I get that correct value?
To overcome this you can use ${BASH_SOURCE[0]} (specific to bash):
#!/bin/bash
# script1.sh
real_dollar_zero=${BASH_SOURCE[0]}
echo $(readlink -f $real_dollar_zero)
Now the output is:
/tmp/script1.sh
/tmp/script2.sh
Viola! - The strung musical instrument often confused with a violin!
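A common follow-on idiom (an addition, not part of the original answer) resolves the directory containing the current file whether it is executed or sourced, since ${BASH_SOURCE[0]} falls back to the same value as $0 when the script is run directly:
#!/bin/bash
# Directory containing this file, regardless of how it was invoked.
script_dir=$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)
echo "$script_dir"
This is handy for sourcing sibling files relative to the script's own location rather than the caller's working directory.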

ksh: How to pass arguments containing white space between scripts?

I have two scripts (one ksh, the other Perl), and one calls the other. I have to handle the scenario where someone accidentally enters white space in a file name, and report it as an error (only when the file does not exist). It looks like p.sh, which uses $* to forward all of its arguments to p.pl, doesn't handle quoted arguments the way it should. Any ideas how to fix this? Let's just say one could enter multiple spaces in the filename, too.
p.sh:
#!/usr/bin/env ksh
/tmp/p.pl $* 1>/tmp/chk.out 2>&1
print "Script exited with value $?"
print "P.PL OUTPUT:"
cat /tmp/chk.out
exit 0
p.pl:
#!/usr/bin/perl -w
use Getopt::Std;
getopts ("i:", \ %options);
if ($options{i} && -e $options{i}) {
print "File $options{i} Exists!\n";
}
else {
print "File $options{i} DOES NOT exist!\n";
}
Test cases (when there is an actual file '/tmp/a b.txt' (with a space in it) on the system):
[test] /tmp $ p.pl -i /tmp/a b.txt
File /tmp/a DOES NOT exist!
[test] /tmp $ p.pl -i "/tmp/a b.txt"
File /tmp/a b.txt Exists!
[test] /tmp $ ./p.sh -i "/tmp/a b.txt"
Script exited with value 0
P.PL Check OUTPUT:
File /tmp/a DOES NOT exist!
[test] /tmp $ ./p.sh -i "/tmp/ a b.txt"
Script exited with value 0
P.PL Check OUTPUT:
File /tmp/ Exists!
It's the last two scenarios I'm trying to fix. Thank you.
To preserve whitespace that was passed into the script, use the "$@" parameter, quoted:
/tmp/p.pl "$@" 1>/tmp/chk.out 2>&1
The quotation marks are necessary to make sure that quoted whitespace is seen by p.pl.
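The difference between the forms is easy to demonstrate (a sketch; printargs is a made-up helper, and the behavior is the same in ksh and bash):
#!/usr/bin/env ksh
# Print each received argument wrapped in <> so boundaries are visible.
printargs() { printf '<%s>\n' "$@"; }
set -- -i "/tmp/a b.txt"
printargs $*      # <-i> </tmp/a> <b.txt>  : re-split on whitespace
printargs "$*"    # <-i /tmp/a b.txt>      : joined into one word
printargs "$@"    # <-i> </tmp/a b.txt>    : boundaries preserved
Only "$@" reproduces the original arguments one-for-one, which is why it is the right way to forward them.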
