How to use bash tail command inside a custom pipe command script? - bash

I want to use tail in my custom pipe command.
For example, I want to execute this command:
>ls -1 | tail -n 1 | awk '{print "last file is "$1}'
>last file is test.txt
And I want to make it short by making my own custom script. It looks like this:
>ls -1 | myscript
>last file is test.txt
I know myscript can get input from "ls -1" by this code:
while read line; do
echo last file is $line
done
But I don't know how to use "tail -n 1" in the custom pipe command code above.
Is there a way to use a pipe command in another pipe command script?
Or do I have to implement the code which does the same process as "tail -n 1" myself?
I hope bash has some solution for this.

Try putting just this in myscript
tail -n 1 | awk '{print "last file is "$1}'
This works because the first command (tail) consumes the stdin of your script. In general, scripts behave as though you had typed their contents as-is at the terminal.
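For instance, myscript could contain just this (a minimal sketch; the filenames and message are the ones from the question):

```shell
#!/bin/bash
# myscript: report the last line of stdin as "last file is <name>"
# tail reads the script's own stdin, i.e. whatever was piped into the script
tail -n 1 | awk '{print "last file is "$1}'
```

Then `chmod +x myscript` and run `ls -1 | ./myscript`.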

Related

Writing a script that gets command and executes it

I want to write a script that takes a command as its argument and executes it. For example, if the script is called ex_script, writing
ex_script "cat file1.txt | wc -l"
and the ex_script is:
var=`"${1}"`
echo $var
will assign the number of lines in file1.txt in var and then print it.
But it gives me
./ex_script: line 3: cat file1.txt | wc -l: command not found
How do I write this correctly?
Use eval
var=$(eval "$1")
echo "$var"
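A quick way to see why eval is needed (wrapping it in a function keeps the example self-contained; the command string here is made up):

```shell
# run_cmd: execute the command string passed as $1; eval re-parses the
# string through the shell, so pipes and redirections work
run_cmd() {
    eval "$1"
}

n=$(run_cmd "printf 'one\ntwo\nthree\n' | wc -l")
echo "$n"
```

Without eval, the whole string "printf ... | wc -l" would be looked up as a single command name, which is exactly the "command not found" error above.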

how to read a value from filename and insert/replace it in the file?

I have to run many Python scripts which differ in just one parameter. I name them runv1.py, runv2.py, ..., runv20.py. I have the original script, say runv1.py. Then I make all the copies that I need by
cat runv1.py | tee runv{2..20..1}.py
So I have runv1.py, ..., runv20.py. But the parameter is still v=1 in all of them.
Q: How can I also replace the v parameter so it is read from the file name? So e.g. in runv4.py, v=4. I would like to know if there is a one-line shell command or combination of commands. Thank you!
PS: Editing each file directly is not a proper solution when there are too many files.
The for loop below will serve your purpose, I think:
for i in `ls | grep "runv[0-9][0-9]*.py"`
do
l=`echo $i | tr -d 'a-z.'`
sed -i "s/^v=.*/v=$l/" runv$l.py
done
The command below passes the parameter extracted from the filename to the script itself:
ls | grep "runv[0-9][0-9]*.py" | tr -d 'a-z.' | awk '{print "./runv"$0".py "$0}' | xargs -L 1 sh
At the end, instead of sh you can use python, bash, or ksh.
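A variant of the loop that avoids parsing ls output, using a glob and bash parameter expansion instead (a sketch; it assumes each file contains a parameter line like v=1 to rewrite):

```shell
# Loop over matching files with a glob instead of ls | grep
for f in runv[0-9]*.py; do
    [ -e "$f" ] || continue        # skip when nothing matches the glob
    n=${f//[!0-9]/}                # keep only the digits from the filename
    sed -i "s/^v=.*/v=$n/" "$f"    # rewrite the parameter line in place
done
```

This also handles filenames containing spaces, which the `for i in \`ls ...\`` form would split apart.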

bash, execute "edmEventSize" command but it is not found when I type bash script.sh

I have a file from which, using the command "edmEventSize", I can extract a piece of information (it is a number). But now I have 700 files on which I have to execute that command, and I am trying to do it in a bash script. I cannot even do it for just one file, since I get "edmEventSize: command not found". I have already looked for more information, but since I am new at bash I cannot solve this task.
Thank you in advance
This is my script:
#/usr/bin/env sh
for i in {1..700};
do
FILE="Py6_BstoJpsiKs0_7TeV_RECO_Run-0${i}.root"
edmEventSize... $FILE.root > salida${i}.log
done
head *.log | grep "^File" | cut -f4 > a.txt
rm *.log
As everyone would suggest, you can simplify your script like this ("command not found" means edmEventSize is not in your PATH, so call it by its full path):
#!/bin/bash
for i in {1..700}; do
FILE="Py6_BstoJpsiKs0_7TeV_RECO_Run-0${i}.root"
/path/to/EdmEventSize "$FILE"
done | awk -F $'\t' '/^File/{print $4}' > a.txt
If your files actually are in the format of Py6_BstoJpsiKs0_7TeV_RECO_Run-####.root maybe the command you really need is:
printf -v FILE 'Py6_BstoJpsiKs0_7TeV_RECO_Run-%04d.root' "$i"
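For example, assuming the run numbers really are zero-padded to four digits:

```shell
# printf -v (a bash builtin feature) stores the formatted string in the
# named variable; %04d left-pads the run number with zeros to four digits
i=7
printf -v FILE 'Py6_BstoJpsiKs0_7TeV_RECO_Run-%04d.root' "$i"
echo "$FILE"   # Py6_BstoJpsiKs0_7TeV_RECO_Run-0007.root
```
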

pipe tail output into another script

I am trying to pipe the output of a tail command into another bash script to process:
tail -n +1 -f your_log_file | myscript.sh
However, when I run it, the $1 parameter (inside the myscript.sh) never gets reached. What am I missing? How do I pipe the output to be the input parameter of the script?
PS - I want tail to run forever and continue piping each individual line into the script.
Edit
For now the entire contents of myscript.sh are:
echo $1;
Generally, here is one way to handle standard input to a script:
#!/bin/bash
while read line; do
echo $line
done
That is a very rough bash equivalent to cat. It does demonstrate a key fact: each command inside the script inherits its standard input from the shell, so you don't really need to do anything special to get access to the data coming in. read takes its input from the shell, which (in your case) is getting its input from the tail process connected to it via the pipe.
As another example, consider this script; we'll call it 'mygrep.sh'.
#!/bin/bash
grep "$1"
Now the pipeline
some-text-producing-command | ./mygrep.sh bob
behaves identically to
some-text-producing-command | grep bob
$1 is set if you call your script like this:
./myscript.sh foo
Then $1 has the value "foo".
The positional parameters and standard input are separate; you could do this
tail -n +1 -f your_log_file | myscript.sh foo
Now standard input is still coming from the tail process, and $1 is still set to 'foo'.
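To see both at once, a script like the following (the name and prefix are illustrative) uses $1 and standard input together:

```shell
#!/bin/bash
# Prefix every line of stdin with the first positional argument
prefix="$1"
while read -r line; do
    echo "$prefix: $line"
done
```

Running `tail -n +1 -f your_log_file | ./myscript.sh foo` with this body prints each log line as `foo: <line>`.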
Perhaps you were thinking of awk?
tail -n +1 -f your_log_file | awk '{
print $1
}'
would print the first column from the output of the tail command.
In the shell, a similar effect can be achieved with:
tail -n +1 -f your_log_file | while read first junk; do
echo "$first"
done
Alternatively, you could put the whole while ... done loop inside myscript.sh
Piping connects the output (stdout) of one process to the input (stdin) of another process. stdin is not the same thing as the arguments sent to a process when it starts.
What you want to do is convert each line of output from your first process into arguments for the second process. This is exactly what the xargs command is for.
All you need to do is insert xargs -L 1 between the tail command and your script, so that each line is passed as the arguments of a fresh invocation of the script:
tail -n +1 -f your_log_file | xargs -L 1 ./myscript.sh
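A minimal check of the xargs -L 1 pattern (using seq in place of tail so the pipeline terminates):

```shell
# -L 1 makes xargs run the command once per input line,
# passing that line as the command's arguments
seq 3 | xargs -L 1 echo "got line:"
# got line: 1
# got line: 2
# got line: 3
```

Note that plain `xargs` with no options would batch all the input into a single invocation, which never happens while `tail -f` keeps the pipe open.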

Bash/Awk: How can I run a command using bash or awk

How can I run a command in bash, read the output it returns and check if there's the text "xyz" in there in order to decide if I run another command or not?
Is it easy?
Thanks
if COMMAND | grep -q xyz; then
#do something
fi
EDIT: Made it quiet.
For example:
command1 | grep "xyz" >/dev/null 2>&1 && command2
run command1,
filter its output with grep,
discard the output from grep,
and if the grep was successful (so it found the string),
execute command2.
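A self-contained check of the pattern, with echo standing in for command1:

```shell
# grep -q prints nothing and exits 0 on a match, non-zero otherwise;
# && runs the second command only when the match succeeded
echo "abc xyz def" | grep -q xyz && echo "found, running command2"
# found, running command2
```
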
You can pipe the output of the command to grep or grep -e.
Your specification is very loose, but here is an idea to try
output="$(cmd args ....)"
case "${output}" in
*targetText* ) otherCommand args ... ;;
*target2Text* ) other2Command ... ;;
esac
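A quick demonstration of the case approach with fixed strings (the strings and patterns here are made up):

```shell
# case matches the whole output against glob patterns in order;
# *targetText* matches any output containing that substring
output="build finished with targetText warnings"
case "${output}" in
    *targetText*  ) msg="matched first pattern"  ;;
    *target2Text* ) msg="matched second pattern" ;;
    *             ) msg="no match"               ;;
esac
echo "$msg"   # matched first pattern
```
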
I hope this helps.
While you can accomplish this task many different ways, this is a perfect use for awk.
prev_cmd | awk '/xyz/ {print "cmd"}' | bash
this is what you want
For example, I have a text file called temp.txt that contains only 'xyz'.
If I run the following command, I expect the output "found it":
$ cat temp.txt | awk '/xyz/ {print "echo found it"}' | bash
> found it
So what I am doing is piping the output of my previous command into awk, which looks for the pattern xyz (/xyz/). awk prints the command, in this case echo found it, and pipes it to bash, which executes it. A simple one-liner doing what you asked. Note that you can customize the regex that awk looks for.
