bash: trying to get all the files with a specific extension

Hi, I am trying to work on files which have an extension like .p8, .p16, .p32.
./pack_vectors $@
for var in "$@"
do
if [ -f $var ];
then
pack_list="${var/.dat/.p}"
echo $pack_list
# below line doesn't work
for f in $pack_list+([:digit:]);do
The output I am getting is:
./wrapper.sh: line 10: syntax error near unexpected token `('
./wrapper.sh: line 10: `for f in $pack_list+([:digit:]);do'
Why?

An easier way to do this is to use find:
find . -name "*.p8" -o -name "*.p16" -o -name "*.p32"
The -o is the equivalent of a boolean OR.
To assign it to a variable, do this:
myvar=$(find . -name "*.p8" -o -name "*.p16" -o -name "*.p32")
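One caveat worth adding (a sketch, not from the original answer): -name binds more tightly than -o, so if you combine these tests with an action or want to loop over the matches, group them with escaped parentheses, and prefer a null-delimited read loop over command substitution so filenames containing spaces survive intact:
find . \( -name "*.p8" -o -name "*.p16" -o -name "*.p32" \) -print0 |
while IFS= read -r -d '' f; do
echo "found: $f" # replace the echo with whatever processing each file needs
done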

A couple of possible issues. You have to have extglob turned on if you want to do the extended pattern matching you're trying to do in bash. So put
shopt -s extglob
before your for loop. You're also looking for [[:digit:]] if you want to use the posix character class in bash. So putting that together, try
shopt -s extglob
for f in ".p"+([[:digit:]]); do
Not quite sure what "$pack_list" is, so I replaced it with ".p" above.

It is working now, after a slight change to BroSlow's answer:
for f in "$pack_list"+([[:digit:]]); do

Related

Glob for file suffix not matching

I have some Unix command finding mov-files that have a corresponding jpg file:
find . -name '*.[Mm][Oo][Vv]' -exec sh -c '
for mov; do
for jpg in "${mov%.*}".[Jj][Pp][Gg]; do
if test -f "$jpg"; then
echo "$mov"
fi
break
done
done' sh {} +
The current code just searches for .jpg (or uppercase) as the file extension, but I need to extend this to also support files that end with ".jpeg".
I modified the code to say:
for jpg in "${mov%.*}".[Jj][Pp][Ee]?[Gg]; do
which I believed should make it possible to have an optional "E or e", but this does not work.
I was able to use this instead
for jpg in "${mov%.*}".[Jj][Pp]*[Gg]; do
which is not very safe because it will accept a lot more than just e and E in that position.
Any ideas on how to modify the expression to add the optional e/E?
The extglob feature suffices for this. Running shopt -s extglob when using bash (not sh) will let you use ?([Ee]) to refer to zero-or-one instances of [Ee].
Even better, while we're setting shopt flags, we can set nocaseglob so you can use *.jp?(e)g, without the explicit character classes. (The find equivalent for this is changing -name to -iname, which the following does in addition).
find . -iname '*.mov' -exec bash -c '
shopt -s extglob nocaseglob
for mov; do
for jpg in "${mov%.*}".jp?(e)g; do
if test -f "$jpg"; then
printf "%s\n" "$mov"
fi
break
done
done' bash {} +

bash: command works when inputted manually but not on script [duplicate]

Using the pattern match !("file1") does not work within a bash script but will work on the command line.
For example:
ls !("file1"|"file2")
This will list all files in the directory except file1 and file2.
When that line is executed in a script this error is displayed:
./script.sh: line 1: syntax error near unexpected token `('
./script.sh: line 1: ` ls !("file1"|"file2") '
Regardless of what command is used, e.g. rm -v !("file1"), the same error takes place. What is going on here? Why does this not work in a script?
The extended glob syntax you are trying to use is turned off by default; you have to enable it separately in each script where you want to use it.
shopt -s extglob
Scripts should not use ls, though I imagine you were using it merely as a placeholder here.
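For example, a minimal sketch of a script using the same pattern (printf in place of ls, with file1 and file2 standing in for the names from the question):
#!/bin/bash
shopt -s extglob
# list everything in the current directory except file1 and file2
printf '%s\n' !(file1|file2)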
Globbing doesn't work that way unless you enable the extglob shell option. Instead, I recommend using find (note the escaped parentheses, which group the two -name tests so that -not and -delete apply as intended):
find . -maxdepth 1 -not \( -name '<NAME>' -or -name '<NAME>' \) -delete
Before running this command with -delete, run it without -delete first and ensure the output is correct.
Method with default settings and no external procs:
for f in *; do [[ $f =~ ^file[12]$ ]] || echo "$f"; done

BASH script returning command not found

I am very new to bash programming and wanted to create a script that would store each result of find individually in an array. Now I want the Command variable to expand in the statement MYRA=($(${Command} $1)).
Command = 'find . -iname "*.cpp" -o -iname "*.h"'
declare -a MYRA
MYRA=($(${Command} $1))
echo ${#MYRA[@]}
However when I try this script I get the result
$ sh script.sh
script.sh: line 1: Command: command not found
0
Any suggestions on how I can fix this ?
Shell assignment statements may not have whitespace around the =. This is valid:
Command='some command'
This is not:
Command = 'some command'
In the second form, bash will interpret Command as a command name.
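Run interactively, the same mistake reproduces the error from the question (a sketch of a session; the exact error prefix depends on your shell):
$ Command = 'find . -iname "*.cpp" -o -iname "*.h"'
bash: Command: command not found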
All of the below requires a #!/bin/bash shebang (which should come as no surprise since you're using arrays, which are not available in plain POSIX sh).
Also, see http://mywiki.wooledge.org/BashFAQ/050 for comprehensive discussion.
A best-practices implementation would look something like this:
# commands should be encapsulated in functions where possible
find_sources() { find . '(' -iname '*.cpp' -o -iname '*.h' ')' -print0; }
declare -a source_files
while IFS= read -r -d '' filename; do
source_files+=( "$filename" )
done < <(find_sources)
Now, if you really need to store the command in an array (maybe you're building it up dynamically), doing that would look like this:
# store literal argv for find command in array
# ...if you wanted to build this up dynamically, you could do so.
find_command=( find . '(' -iname '*.cpp' -o -iname '*.h' ')' -print0 )
declare -a source_files
while IFS= read -r -d '' filename; do
source_files+=( "$filename" )
done < <("${find_command[@]}")
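Either way, once the array is populated you can do what the original script was aiming for, counting and iterating over the results (a short usage sketch reusing the source_files name from above):
printf 'found %d files\n' "${#source_files[@]}"
for f in "${source_files[@]}"; do
printf '%s\n' "$f"
done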

Syntax error: "(" unexpected assigning an array in bash

Within a bash script, I'm trying to pull all files with an extension '.jstd' into an array, loop over that array and carry out some action.
My script is failing to copy the path of each script into the array.
I have the following script.
#!/bin/bash
IFS=$'\n'
file_list=($(find '/var/www' -type f -name "*.jstd"))
for i in "${file_list[#]}"; do
echo "$i"
done
echo $file_list
unset IFS
The line file_list=($(find '/var/www' -type f -name "*.jstd")) works fine in the terminal, but fails in the script with:
Syntax error: "(" unexpected
I've googled, but failed. All ideas gratefully received.
Edit: in case it helps with reproduction or provides any clues, I'm running Ubuntu 12.04, with GNU bash, version 4.2.25(1)-release (i686-pc-linux-gnu).
This is precisely the error you would get if your shell were /bin/sh on Ubuntu, not bash:
$ dash -c 'foo=( bar )'
dash: 1: Syntax error: "(" unexpected
If you're running your script with sh yourscript -- don't. You must invoke bash scripts with bash.
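For example (a sketch, assuming the script is saved as yourscript):
bash yourscript # explicit interpreter
# or rely on the #!/bin/bash shebang:
chmod +x yourscript
./yourscript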
That being given, though -- the better way to read a file list from find would be:
file_list=( )
while IFS= read -r -d '' filename; do
file_list+=( "$filename" )
done < <(find '/var/www' -type f -name "*.jstd" -print0)
...the above approach working correctly with filenames containing spaces, newlines, glob characters, and other corner cases.

Repeated input redirection to c++ executable in bash

I have written an executable in c++, which is designed to take input from a file, and output to stdout (which I would like to redirect to a single file). The issue is, I want to run this on all of the files in a folder, and the find command that I am using is not cooperating. The command that I am using is:
find -name files/* -exec ./stagger < {} \;
From looking at examples, it is my understanding that {} replaces the file name. However, I am getting the error:
-bash: {}: No such file or directory
I am assuming that once this is ironed out, in order to get all of the results into one file, I could simply use the pattern Command >> outputfile.txt.
Thank you for any help, and let me know if the question can be clarified.
The problem that you are having is that redirection is processed before the find command. You can work around this by spawning another bash process in the -exec call:
find files/* -exec bash -c '/path/to/stagger < "$1"' -- {} \;
The < operator is interpreted as a redirect by the shell prior to running the command. The shell tries redirecting input from a file named {} to find's stdin, and an error occurs if the file doesn't exist.
The argument to -name is unquoted and contains a glob character. The shell applies pathname expansion and gives nonsensical arguments to find.
Filenames can't contain slashes, so the argument to -name can't work even if it were quoted. If GNU find is available, -path can be used to match a glob pattern such as files/*, but that still doesn't mean "files in directories named files"; for that you need -regex. Portable solutions are harder.
You need to specify one or more paths for find to start from.
Assuming what you really wanted was to have a shell perform the redirect, here's a way with GNU find:
find . -type f -regex '.*/files/[^/]*$' -exec sh -c 'for x; do ./stagger <"$x"; done' -- {} +
This is probably the best portable way using find (-depth and -prune won't work for this):
find . -type d -name files -exec sh -c 'for x; do for y in "$x"/*; do [ -f "$y" ] && ./stagger <"$y"; done; done' -- {} +
If you're using Bash, this problem is a very good candidate for just using a globstar pattern instead of find.
#!/usr/bin/env bash
shopt -s extglob globstar nullglob
for x in **/files/*; do
[[ -f "$x" ]] && ./stagger <"$x"
done
Note that simply escaping the less-than symbol, as in
find files/* -exec ./stagger \< {} \;
does not work: find performs no redirection of its own, so the escaped < is passed to ./stagger as a literal argument rather than being treated as a redirect.
