I am trying to adapt the loop code from bash to csh:
while read file; do
    if ! UPLOAD_FILE=$(ls "$UPLOAD_ARCHIVE_DIR"/*$file* 2>/dev/null); then
        echo "could not find $file"
        (( MISSING_COUNTER++ ))
        continue
    fi
    ls -l "$UPLOAD_FILE" | sed 's/^[^ ]* *[^ ]* *[^ ]* *[^ ]* *[^ ]* *//'
    cp "$UPLOAD_FILE" "$PROCESS_FILES_DIR"
    DOS_FILE=$(basename "$UPLOAD_FILE")
    SOR_FILE=$(echo $DOS_FILE | sed 's/.dat/.sor/')
    unix2dos "$PROCESS_FILES_DIR"/"$DOS_FILE" 1>/dev/null 2>&1
    mv "$PROCESS_FILES_DIR"/"$DOS_FILE" "$UPLOAD_PROCESS_DIR"
    (( FOUND_COUNTER++ ))
done < "$1"
The input information comes from the variable $input and contains a list; here is a sample record from the list: PL000000002002638908
So far, I have tried foreach and while loops without success; any help is greatly appreciated!
You almost certainly don't need to translate your script from bash to csh.
/bin/sh is the Bourne shell, or something that works very much like it. This should be the case even on systems where csh is the default interactive shell and bash is unavailable. (On some systems, /bin/sh may be a symlink to ksh, for example.)
bash is derived from the Bourne shell, and adds some bash-specific features.
You can confirm that /bin/sh is a Bourne-like shell by typing the following command at any shell prompt:
/bin/sh -c 'if true ; then echo Yes, this is a Bourne-like shell ; fi'
In the unlikely event that /bin/sh is really csh, the above will give you a syntax error.
It would be easier to modify your script to avoid bash-specific features than to translate it to csh. Read the bash manual, look for Bash-specific features, and modify your script to avoid them. (If that's even necessary; it may be that /bin/sh is something that supports at least some of them.)
Change the first line of your script to
#!/bin/sh
and see what error messages you get.
You might need to replace the
$(command)
syntax by the older
`command`
and this:
(( FOUND_COUNTER++ ))
to something using expr, perhaps:
FOUND_COUNTER=`expr $FOUND_COUNTER + 1`
The while read file ... loop should work as it is.
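Putting those suggestions together, here is a rough sketch of the loop using only Bourne/POSIX features (untested, and it keeps your original variable names; the ls failure is detected via the empty result rather than with the ! keyword, which very old Bourne shells lack):
#!/bin/sh
MISSING_COUNTER=0
FOUND_COUNTER=0
while read file; do
    # backticks instead of $( ); expr instead of (( )) arithmetic
    UPLOAD_FILE=`ls "$UPLOAD_ARCHIVE_DIR"/*$file* 2>/dev/null`
    if [ -z "$UPLOAD_FILE" ]; then
        echo "could not find $file"
        MISSING_COUNTER=`expr $MISSING_COUNTER + 1`
        continue
    fi
    ls -l "$UPLOAD_FILE" | sed 's/^[^ ]* *[^ ]* *[^ ]* *[^ ]* *[^ ]* *//'
    cp "$UPLOAD_FILE" "$PROCESS_FILES_DIR"
    DOS_FILE=`basename "$UPLOAD_FILE"`
    SOR_FILE=`echo "$DOS_FILE" | sed 's/\.dat$/.sor/'`
    unix2dos "$PROCESS_FILES_DIR/$DOS_FILE" >/dev/null 2>&1
    mv "$PROCESS_FILES_DIR/$DOS_FILE" "$UPLOAD_PROCESS_DIR"
    FOUND_COUNTER=`expr $FOUND_COUNTER + 1`
done < "$1"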
Please excuse me for my amateur language.
I have written a simple one-line command to pull out the content I want from a huge text file:
cat file | grep "value" | cut -f1 -d":"
It outputs lines of file paths.
I want it to go on doing this:
cd into the paths one line at a time.
each time after cd run this command:
ls | grep .fileformat
Then let me run 3 commands of my choosing; if I press [return] with no value, it should skip that one and wait for the next command. After all three are done, it should move on.
cd into the directory from the next line, repeating until the last line.
Sorry, I couldn't figure this later part out, as I didn't even know where to start googling.
Thank you for looking!
Edit 1: output of my initial command gives paths like this:
/home/user/path/to/file
So there is no ~; it should be able to cd into them without a problem?
A slightly modified version of Allan Wind's answer that does what I THINK the OP wants (tried to reply as a comment to Allan but couldn't figure out code formatting properly):
sed -n '/value/s/\([^:]*\):.*/\1/p' file | while read -r d
do
    echo "Entering $d"
    cd "$d" || continue
    ls | grep '\.fileformat'
    for i in {1..3}; do
        echo -e "Type your command #${i}:\n"
        read -p '>' input < /dev/tty
        [ -z "$input" ] && echo -e "\nDoing nothing this time...\n" && continue
        eval "$input"
    done
done
(Usual caveats of reading interactively + using eval being dangerous)
Here is the skeleton of an answer to help you ask a better question:
sed -n '/value/s/\([^:]*\):.*/\1/p' file | while read -r d
do
    (
        cd "$d"
        ls | grep '\.fileformat'
        read -p "3 commands? " c < /dev/tty
        [ -z "$c" ] && continue
        cd "$d"
        eval "$c"
    )
done
(Usual caveats of reading interactively + using eval being dangerous)
I'm writing a short bash script to find if a file exists in a directory or not. Here is the script:
#!/bin/bash
> oldFiles.txt
files=$(grep ' jane ' ../data/list.txt | cut -d ' ' -f 3)
for f in $files; do
    f=~$f
    if [ -e $f ]; then
        echo $f Found
        echo $f>>oldFiles.txt
    else
        echo $f Not found
    fi
done
Three relevant file names are found by the grep and searched over in the for loop. It appears, however, that every time my script runs the if test, it comes back false regardless of whether the file exists or not. (A screenshot showed the results of the script, test -e for each of the individual files, and a list of the files in the relevant directory.) The referenced list.txt file is posted below.
001 jane /data/jane_profile_07272018.doc
002 kwood /data/otherfile.doc
003 pchow /data/pfile.doc
004 janez /data/zfile.doc
005 jane /data/jane_pic_07282018.jpg
006 kwood /data/kwood.jpg
007 pchow /data/pchow.jpg
008 jane /data/jane_contact_07292018.csv
009 kwood /data/kwfile.csv
010 pchow /data/p2file.csv
I know what I'm missing has to be simple, but what is it?
Here's a refactoring which hopefully fixes most of the errors, with inline comments.
#!/bin/bash
# No need for this when you overwrite the file anyway
#> oldFiles.txt
# Don't capture a variable when you only need the result once
# POSIX calls this grep -F
grep -F ' jane ' ../data/list.txt |
# Let read -r do the splitting
#cut -d ' ' -f 3)
# Prefer while read over for
while read -r x y f; do
    # XXX what's this for?
    #f=~$f
    # Fix quoting
    if [ -e "$f" ]; then
        # Diagnostics to stderr
        echo "$0: $f Found" >&2
        # Redirect only once, below
        echo "$f"
    else
        # stderr
        echo "$0: $f Not found" >&2
    fi
# Redirect only once; overwrite
done >oldFiles.txt
If you want to look in your home directory and $f doesn't already have a leading slash, I guess the mystery line should be
f=~/"$f"
If the file names contain a tilde already (as suggested by your screen shot) the shell will not expand that inside double quotes. You can fix this by manually expanding any leading tilde:
case $f in
'~/'*) f=~/"${f#~/}";;
esac
The parameter expansion ${f#pattern} produces the value of $f with any prefix matching the glob expression pattern trimmed off from the beginning. (There's also ${f%pattern} for trimming from the end, and a good number of other convenient string manipulation facilities.)
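For example (a quick illustration; the value is made up):
f='~/data/report.dat'
echo "${f#~/}"    # data/report.dat  (leading ~/ trimmed)
echo "${f%.dat}"  # ~/data/report    (trailing .dat trimmed)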
If you want /data/stuff to search for a file $HOME/data/stuff or ./data/stuff then you have to add the leading component yourself. The path /data refers to a path directly in the root directory. Perhaps see also Difference between ./ and ~/ which has an answer by yours truly about this.
Don't rely on tilde expansions in your script. In general, in your interactive shell, you can use ~ instead of $HOME since it's easier to type. bash has many other useful tilde expansions, such as ~foo (which expands to the home directory of user foo), ~-, ~+, etc., but best practice is to use these only interactively. In your script, just use $HOME instead of ~.
I'd love to give a definitive answer, and the documentation states that no tilde expansion is done in posix mode, but I find that bash will sometimes do tilde expansions when I don't expect it. As a result of that missed expectation, I follow the general practice of simply never using ~ in scripts.
In this case, however, the problem is that bash is doing tilde expansion before variable expansion, and since there's no user named $f (note, that's the literal string $f, not the expansion of the variable f), the tilde prefix ~$f is unchanged by tilde expansion. Then variable expansion is applied and expands the $f, but the ~ remains unchanged. For example:
$ cat a.sh
#!/bin/sh
a=williamp
echo ~williamp
echo ~$a
$ ./a.sh
/Users/williamp
~williamp
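For the script in the question, the corresponding fix is to build the path with $HOME explicitly (a sketch; the file names in list.txt already begin with a slash):
f=/data/jane_profile_07272018.doc
f="$HOME$f"
echo "$f"    # e.g. /home/jane/data/jane_profile_07272018.doc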
This is almost the exact same question as in this post, except that I do not want to use eval.
To keep the question short, I want to execute the command echo aaa | grep a by first storing it in a string variable, Command='echo aaa | grep a', and then running it without using eval.
In the post above, the selected answer used eval, and that works for me too. What concerns me a lot is that there are plenty of warnings about eval below it, followed by some attempts to circumvent it. However, none of them solve my problem (which is essentially the OP's). I have commented below their attempts, but since that post has been there for a long time, I think it is better to post the question again with the restriction of not using eval.
Concrete Example
What I want is a shell script that runs my command when I am happy:
#!/bin/bash
# This script run-this-if.sh runs the commands when I am happy
# Warning: the following script does not work (on nose)
if [ "$1" == "I-am-happy" ]; then
"$2"
fi
$ run-if.sh I-am-happy [insert-any-command]
Your sample usage can't ever work with an assignment, because assignments are scoped to the current process and its children. Because there's no reason to try to support assignments, things get suddenly far easier:
#!/bin/sh
if [ "$1" = "I-am-happy" ]; then
    shift; "$@"
fi
This then can later use all the usual techniques to run shell pipelines, such as:
run-if-happy "$happiness" \
sh -c 'echo "$1" | grep "$2"' _ "$untrustedStringOne" "$untrustedStringTwo"
Note that we're passing the execve() syscall an argv with six elements:
sh (the shell to run; change to bash etc if preferred)
-c (telling the shell that the following argument is the code for it to run)
echo "$1" | grep "$2" (the code for sh to parse)
_ (a constant which becomes $0)
...whatever the shell variable untrustedStringOne contains... (which becomes $1)
...whatever the shell variable untrustedStringTwo contains... (which becomes $2)
Note here that echo "$1" | grep "$2" is a constant string -- in single-quotes, with no parameter expansions or command substitutions -- and that untrusted values are passed into the slots that fill in $1 and $2, out-of-band from the code being evaluated; this is essential to have any kind of increase in security over what eval would give you.
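As a concrete illustration (a sketch; it assumes the script above is saved as run-if-happy, is executable, and that $happiness contains I-am-happy):
happiness=I-am-happy
untrustedStringOne='aaa'
untrustedStringTwo='a'
./run-if-happy "$happiness" \
    sh -c 'echo "$1" | grep "$2"' _ "$untrustedStringOne" "$untrustedStringTwo"
# prints: aaa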
I want to get the PID of bash and highlight it if it's in some textfile (assume it is). So when I'm typing this in my shell:
grep -o $(pidof bash) test.txt
it just works fine and gives me the desired output, the PID of bash.
Then why is this script not working:
#!/bin/bash
PID=$(grep -o $(pidof bash) test.txt)
echo $PID
I only get some lines with:
grep: xxx: file or directory not found
xxx are random numbers, but usually the last one is the one I'm looking for.
How do I achieve this and why is the above not working?
Has this something to do with creating a new process by the shell when calling grep in the script?
Thank you.
I don't have pidof, so I'm assuming that it's roughly equivalent to pgrep -x bash, printing a list of PIDs, one on each line, with a newline between them.
If that's so, consider this:
egrep -o "$(pgrep -x bash | tr '\n' '|')" test.txt
Assume that the output of pgrep -x bash is:
123
456
789
Your original code would do this:
egrep -o 123 456 789 test.txt
...thus, searching for 123 in a file named 456, in a file named 789, and in a file named test.txt.
Now, compare to what happens when you replace that whitespace with pipe symbols:
egrep -o "123|456|789" test.txt
...as executed by the pipeline suggested earlier in this answer, is exactly what you were looking for. (BTW, the quotes here are purely syntactic -- that is, they're for consumption by the shell when it works out how things are parsed; they are not passed to egrep.)
That said, if you're looking for the current bash process, use either $$ (the PID of the main shell, even when read inside a subshell) or $BASHPID (the PID of the current shell process, including subshells), rather than using so inexact a tool as pgrep or pidof.
When you run pidof inside your shell script, there are at least two instances of bash running, so it will return multiple numbers. The way grep is designed, it can only search for one pattern at a time so the first number is interpreted as a pattern and the other numbers are mistakenly interpreted as file or directory names.
I'm not sure which bash PID you care about, but one solution is to use something like grep, head, or tail to filter the output of pidof so you just get one number. Another solution is to use the special variable $$, which is the PID of the bash instance that evaluates it.
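For example (a sketch; which of the bash PIDs you actually want depends on your situation):
# pidof prints all the PIDs on one line, so keep only the first field
PID=$(grep -o "$(pidof bash | cut -d' ' -f1)" test.txt)

# or skip pidof entirely and look for the PID of the script's own shell
PID=$(grep -o "$$" test.txt)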
In the future, you can debug this better for yourself. To debug, I would start by running this command inside your script to see exactly what arguments are being passed to grep:
echo grep -o $(pidof bash) test.txt
I have read a line of bash code from the file, and I want to send it to log. To make it more useful, I'd like to send the variable-expanded version of the line.
I want to expand only shell variables. I don't want the pipes to be interpreted, nor do I want to spawn any side processes, as would happen when expanding a line containing $( rm -r /).
I know that variable expansion is woven very deeply into bash. I hope there is a way to perform just the expansion, without any of the side effects that would come from pipes, external programs and, perhaps, here-documents.
Maybe there is something like eval?
#!/bin/bash
linenr=42
line=`sed "${linenr}q;d" my-other-script.sh`
shell-expand $line >>mylog.log
I want a way to expand only the shell variables.
If x=a and a="example", then I want the following expansions:
echo $x should be echo a.
echo ${a} should be echo example
touch ${!x}.txt should be touch example.txt
if [ (( ${#a} - 6 )) -gt 10 ]; then echo "Too long string" should be if [ 1 -gt 10 ]; then echo "Too long string"
echo "\$a and \$x">/dev/null should be echo "\$a and \$x>dev/null"
For those arriving five years later: although it is not the best answer to the OP's problem, one answer to the question is as follows. Bash can do indirect parameter expansion:
some_param=a
a=b
echo ${!some_param}
# b
echo $BASH_VERSION
# 5.0.18(1)-release
You are absolutely right: it is very dangerous to do this in bash.
In fact, your own command suffers from the very problem you describe.
Let us discuss your script in detail:
#!/bin/bash
line=`sed "42q;d" my-other-script.sh`
shell-expand $line >>mylog.log
The sed may produce many lines of output, so it is misleading to use the name line.
Then you did not quote $line, which may have obscure effects:
$ x='| grep x'
$ ls -l $x
ls: cannot access |: No such file or directory
ls: cannot access grep: No such file or directory
-rw-rw-r-- 1 foo bar 34493 Nov 19 18:51 x
$
In this case the pipe is not executed, but passed to the program ls.
If you have untrusted input, it is very hard to program a robust shell script.
Using eval is evil - I would never suggest using it, especially for such a purpose!
An alternative would be to do it in Perl: iterate over the %ENV hash and replace each environment variable name with its value. That way you have more control over what can happen.
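A minimal sketch of that Perl idea (my own illustration, not tested against every corner case; note that a separate perl process only sees exported variables, since it receives them through the environment):
line='echo $USER and ${HOME} | wc -c'
printf '%s\n' "$line" |
perl -pe 's{\$\{(\w+)\}|\$(\w+)}{
    my $k = defined $1 ? $1 : $2;      # variable name, with or without braces
    exists $ENV{$k} ? $ENV{$k} : $&;   # substitute if set, otherwise keep the text
}ge'
# the pipe and the command words pass through as plain text; nothing is executed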