I am working with a program that combines individual files, and I am incorporating it into a Bash pipeline that I'm putting together. The program requires a flag for each file, like so:
program -V file_1.g.vcf -V file_2.g.vcf -V file_3.g.vcf -O combined_output.g.vcf
To allow the script to work with any number of samples, I would like to read the individual file names within a directory and expand the path for each file after a '-V' flag.
I have tried adding the file paths to a variable with the following, but have not had success with proper expansion:
GVCFS=('-V' `ls gvcfs/*.g.vcf`)
Any help is greatly appreciated!
You can do this by using a loop to populate an array with the options:
options=()
for file in gvcfs/*.g.vcf; do # Don't parse ls, just use a direct wildcard expression
options+=(-V "${file##*/}") # If you want the full path, leave off ##*/
done
program "${options[#]}" -O combined_output.g.vcf
printf can help:
options=( $(printf -- "-V %s " gvcfs/*.g.vcf ) )
Though this will not deal gracefully with whitespace in filenames.
Also consider realpath to generate absolute filenames.
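For example, folding realpath into the loop-based answer above (a sketch; it assumes realpath is installed, which is common on Linux but not guaranteed by POSIX):
options=()
for file in gvcfs/*.g.vcf; do
    options+=(-V "$(realpath "$file")")   # store the absolute path after each -V flag
done
program "${options[@]}" -O combined_output.g.vcf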
I want to read a list of file names stored in a file, and the top level directory is a macro, since this is for a script that may be run in several environments.
For example, there is a file file_list.txt holding the following fully qualified file paths:
$TOP_DIR/subdir_a/subdir_b/file_1
$TOP_DIR/subdir_x/subdir_y/subdir_z/file_2
In my script, I want to tar the files, but in order to do that, tar must know the actual path.
How can I get the string containing the file path to expand the macro to get the actual path?
In the code below the string value echoed is exactly as in the file above.
I tried using actual_file_path=`eval $file_path` and while eval does evaluate the macro, it returns a status, not the evaluated path.
for file_path in `cat $input_file_list`
do
echo "$file_path"
done
Since the question is tagged ksh, I assume you do not have the envsubst utility.
When the number of variables in $input_file_list is very limited, you can substitute them with awk:
awk -v top_dir="${TOP_DIR}" '{ sub(/\$TOP_DIR/, top_dir); print }' "${input_file_list}"
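To actually hand the substituted paths to tar, the awk output can be captured with command substitution (a rough sketch; archive.tar is a placeholder name, and like the rest of this approach it relies on word splitting, so it assumes the paths contain no whitespace):
tar -cvf archive.tar $(awk -v top_dir="${TOP_DIR}" '{ sub(/\$TOP_DIR/, top_dir); print }' "${input_file_list}")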
I was using eval incorrectly. The solution is to use an assignment on the right side of eval as follows:
for file_path in `cat $input_file_list`
do
eval myfile=$file_path
echo "myfile = $myfile"
done
$myfile now has the actual expansion of the macro.
Below is a simple bash script. It takes file types as command-line arguments, queries the current directory, and prints the files of the specified type.
I would like to be able to query two different file types and therefore need two boolean expressions to represent this.
Below is my code for querying just one file type:
#!/bin/bash
for x in $(ls *$1); do
echo $x;
done
Now what I would like to be able to do is (in pseudocode)
command line args fileName .sh .c
for x in (current directory files of *.sh) OR (in current directory files of *.c) do
print .sh files
print .c files
done
I've tried using ||, but I get syntax errors, and I cannot find any evidence that || can be used to combine two expressions in a for loop.
I've also tried two nested for loops, but they do not work and yield errors.
Is there any way I can accomplish this using the same for loop structure?
Thank you.
Sounds like you want something like:
for extension in "$@"; do
printf 'Files ending in %s:\n' "$extension"
printf '%s\n' *"$extension"
done
This loops through all the arguments passed to the script and, for each extension, prints a header followed by every matching file name on its own line.
Note that printf is a much more useful tool than echo, as it allows you to control the format of each thing it prints.
ls doesn't do anything useful here either; it is the shell that expands the * to the list of files matching the pattern.
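Put together as a complete script (list_by_ext.sh is a hypothetical name, and the shopt -s nullglob line is an optional addition so that an extension with no matches doesn't echo the literal pattern back):
#!/bin/bash
shopt -s nullglob                   # an unmatched glob expands to nothing instead of itself
for extension in "$@"; do
    printf 'Files ending in %s:\n' "$extension"
    printf '%s\n' *"$extension"     # the shell expands the glob; no ls needed
done
You would then invoke it as, for example, ./list_by_ext.sh .sh .c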
I have a bash script which gets its arguments from an external 'source.txt' file.
The source file contains, for instance, 10 rows of arguments (a mix of files and directories).
One function should use the source file entirely. I achieved this with $(<source.txt) and it works OK.
The second function, however, should use the same 'source.txt' file only partially, filtering the arguments with a regex or something similar.
Source file:
/etc/sysconfig/network-scripts/
/etc/ntp.conf
/etc/localtime
/etc/sysconfig/iptables-config
/etc/resolv.conf
/sbin/ifup-local
/sbin/ifdown-local
/usr/local/sbin
/var/spool/cron/
/boot
The second function must take only the sources matching '^/etc/[a-z][A-Z]*', with all of their content recursively.
How do I do it?
You can simply grep it, like this:
$(grep '^/etc/[a-z][A-Z]*' < source.txt)
Note, though, that if your arguments happen to contain spaces or quotes, the command substitution approach might fail for you.
To workaround this you can use readarray (mapfile) instead:
readarray -t args < <(grep '^/etc/[a-z][A-Z]*' source.txt)
your_function "${args[@]}"
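For completeness, a sketch of what the receiving function might look like; your_function is just a placeholder, and since each filtered path arrives as a separate argument, the function can simply loop over "$@":
your_function() {
    for path in "$@"; do
        # do whatever recursive processing is needed for each path here
        printf 'processing %s\n' "$path"
    done
}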
I have a shell script in which I first read all the .s files in a specified folder, then compile them to object files in a loop, and after that link them into an executable file.
Like this:
FILES=PTscalar_1.0/mibenchforpt/security/sha/*.s
for sfile in $FILES
do
echo "------------------------------------------------"
echo $sfile
objectFile="${sfile%.s}.o"
exefile="${objectFile%.o}.ex"
simplescalar/bin/sslittle-na-sstrix-as -o $objectFile $sfile
done
But I have a problem: in the sha MiBench program there are two files, each of which goes through this flow:
.c -> .s -> .o
At the last stage, the two .o files should be linked into one executable file.
How can I get the two file names at the same time and build the command to link them?
The main command is this:
simplescalar/bin/sslittle-na-sstrix-ld -o __sha.ex _sha.o _sha_driver.o
Is there any way to index into the files matched by FILES, like this:
OFILES=PTscalar_1.0/mibenchforpt/security/sha/*.o
simplescalar/bin/sslittle-na-sstrix-ld -o $exefile OFILES[0] OFILES[1]
and then do that in a loop for all files matching this pattern?
The first file is like *.o or *_main.o;
the second is *_driver.o.
Thanks
Obviously this is possible in shell. However, many people find that the make utility is better suited than shell scripts for building software, precisely because of dependencies like these. Take a look at GNU Make; its documentation contains numerous examples of what you're trying to do.
Caveat: Your tags "linux shell" do not specify a particular shell. POSIX sh, the standard specifying the minimum required behavior for /bin/sh, does not support arrays; you should use a specific shell, such as bash or ksh, which does. To do this, start your script with an appropriate shebang (such as #!/bin/bash instead of #!/bin/sh) and do any manual invocations with the correct shell (so bash -x myscript if you would otherwise use sh -x myscript... though if you've set the shebang correctly and the script has +x permissions, you can always just run ./myscript).
# this is broken
FILES=PTscalar_1.0/mibenchforpt/security/sha/*.s
...does not create an array.
# this works in bash, ksh, and zsh
files=( PTscalar_1.0/mibenchforpt/security/sha/*.s )
does create an array, which can be expanded as "${files[@]}". So:
# this works in bash and ksh, and probably zsh
for file in "${files[@]}"; do
...
done
However, in this particular case, you don't have a reason to use an array at all:
# this works with absolutely any POSIX-compatible shell
for sfile in PTscalar_1.0/mibenchforpt/security/sha/*.s; do
echo "$sfile"
objectFile=${sfile%.s}.o
exefile=${objectFile%.o}.ex
simplescalar/bin/sslittle-na-sstrix-as -o "$objectFile" "$sfile"
done
Note a few corrections made in the above:
The right-hand side of an assignment with no literal whitespace in its syntax does not need to be quoted.
All expansions (such as $objectFile) do need to be quoted, so: "$objectFile".
...yes, this does include echo; to test this, run s='*' and compare the output of echo $s to echo "$s".
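As a quick demonstration of that last test:
s='*'
echo $s      # unquoted: the shell expands the * and lists the files in the current directory
echo "$s"    # quoted: prints a literal *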
To address the follow-up question you edited in:
ofiles=( PTscalar_1.0/mibenchforpt/security/sha/*.o )
simplescalar/bin/sslittle-na-sstrix-ld -o "$exefile" "${ofiles[0]}" "${ofiles[1]}"
...is a literal answer, but it would need to be edited if you had more than two object files. Much better to do it this way instead:
ofiles=( PTscalar_1.0/mibenchforpt/security/sha/*.o )
simplescalar/bin/sslittle-na-sstrix-ld -o "$exefile" "${ofiles[@]}"
I created this file and it worked:
#!/bin/bash
#compile to assembly:
FILES=*_driver.s
for sdriverfile in $FILES
do
echo "------------------------------------------------"
# s file
echo $sdriverfile
sfile="${sdriverfile%_driver.s}.s"
echo $sfile
# object files
obj="${sfile%.s}.o"
obj_driver="${sdriverfile%.s}.o"
#exe file
exefile="${sfile%.s}_as.ex"
echo $exefile
#compile
/home/mahdi/programs/simplescalar/bin/sslittle-na-sstrix-as -o $obj $sfile
/home/mahdi/programs/simplescalar/bin/sslittle-na-sstrix-as -o $obj_driver $sdriverfile
#link
/home/mahdi/programs/simplescalar/bin/sslittle-na-sstrix-ld -o $exefile $obj $obj_driver -L /home/mahdi/programs/simplescalar/sslittle-na-sstrix/lib -lc -L /home/mahdi/programs/simplescalar/lib/gcc-lib/sslittle-na-sstrix/2.7.2.3/ -lgcc
done
Thanks for the answers.
These lines work when copy-pasted to the shell but don't work in a script:
ls -l file1 > /path/`echo !#:2`.txt
ls -l file2 > /path/`echo !#:2`.txt
ls -l file1 > /path/$(echo !#:2).txt
ls -l file2 > /path/$(echo !#:2).txt
What's the syntax for doing this in a bash script?
If possible, I would like to know how to do this for one file and for all files with the same extension in a folder.
Non-interactive shells have history expansion disabled.
Add the following two lines to your script to enable it:
set -o history
set -o histexpand
(UPDATE: I misunderstood the original question as referring to arguments to the script, not arguments to the current command within the script; this is a rewritten answer.)
As @choroba said, history is disabled by default in scripts, because it's not really the right way to do things like this in a script.
The preferred way to do things like this in a script is to store the item in question (in this case the filename) in a variable, then refer to it multiple times in the command:
fname=file1
ls -l "$fname" > "/path/$fname.txt"
Note that you should almost always put variable references inside double-quotes (as I did above) to avoid trouble if they contain spaces or other shell metacharacters. If you want to do this for multiple files, use a for loop:
for fname in *; do # this will repeat for each file (or directory) in the current directory
ls -l "$fname" > "/path/$fname.txt"
done
If you want to operate on files someplace other than the current directory, things are a little more complicated. You can use /inputpath/*, but it'll include the path along with each filename (e.g. it'd run the loop with "/inputpath/file1", "/inputpath/file2", etc), and if you use that directly in the output redirect you'll get something like > /path/inputpath/file1.txt (i.e. the two different paths will get appended together), probably not what you want. In this case, you can use the basename command to strip off the unwanted path for output purposes:
for fpath in /inputpath/*; do
ls -l "$fpath" > "/path/$(basename "$fpath").txt"
done
If you want a list of files with a particular extension, just use *.foo or /inputpath/*.foo as appropriate. However, in this case you'll wind up with the output going to files named e.g. "file1.foo.txt"; if you don't want stacked extensions, basename has an option to trim that as well:
for fpath in /inputpath/*.foo; do
ls -l "$fpath" > "/path/$(basename "$fpath" .foo).txt"
done
Finally, it might be neater (depending how complex the actual operation is, and whether it occurs multiple times in the script) to wrap this in a function, then use that:
doStuffWithFile() {
ls -l "$1" > "/path/$(basename "$1" "$2").txt"
}
for fpath in /inputpath/*.foo; do
doStuffWithFile "$fpath" ".foo"
done
doStuffWithFile /otherpath/otherfile.bar .bar