How can I run a command from a line inside a text file? One line of my text file looks like this:
echo "RouterPing;`ping -c4 -w4 -q DeviceIP| tail -2 |awk '{print}' ORS=' '`;$(date)" >> somefile.txt &
I have a file with thousands of lines generated by an external program, and I want to execute every line in it. I need each line to run exactly as if I were running it from a bash shell.
You can just run:
bash file.txt
You can use the script below, but I would strongly advise against executing 1000 commands from a file:
#!/usr/bin/bash
filename="$1"
# Run each line of the file, one at a time.
# Note: $line is only word-split; pipes and quotes are not interpreted (see below).
while read -r line
do
    $line
done < "$filename"
How to use
./this_file_name.sh file_with_commands
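Note that plain $line only gets word-split; pipes, quotes, and redirections inside a line (like the ping example above) are not interpreted as shell syntax. If your lines contain full shell commands, a sketch along these lines, handing each line to a shell, behaves much closer to typing it in bash:

#!/usr/bin/bash
filename="$1"
while IFS= read -r line
do
    # Parse the whole line as shell syntax (pipes, quotes, redirections).
    bash -c "$line"
done < "$filename"

Using eval "$line" instead of bash -c "$line" would keep state such as variables across lines, with the usual eval caveats.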
you$ bash somefile.txt
Note that bash somefile.txt does not require the file to be executable; chmod +x somefile.txt is only needed if you want to run it directly as ./somefile.txt.
Or with dot ('.'). (I removed the & because it won't work if the command runs in the background.)
echo "RouterPing;`ping -c4 -w4 -q DeviceIP| tail -2 |awk '{print}' ORS=' '`;$(date)" >> somefile.txt
. somefile.txt
I'm currently monitoring a log file, and my ultimate goal is to write a script that uses tail -n0 -f and executes a certain command once grep finds a match. My current code:
tail -n 0 -f $logfile | grep -q $pattern && echo $warning > $anotherlogfile
This works but only once, since grep -q stops when it finds a match. The script must keep searching and running the command, so I can update a status log and run another script to automatically fix the problem. Can you give me a hint?
Thanks
Use a while loop:
tail -n 0 -f "$logfile" | while IFS= read -r LINE; do
    echo "$LINE" | grep -q "$pattern" && echo "$warning" > "$anotherlogfile"
done
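Since the question also mentions running another script to automatically fix the problem, a minimal sketch of that (fix_problem.sh is a hypothetical stand-in for your repair script):

tail -n 0 -f "$logfile" | while IFS= read -r LINE; do
    if echo "$LINE" | grep -q "$pattern"; then
        echo "$warning" > "$anotherlogfile"   # update the status log
        ./fix_problem.sh "$LINE"              # hypothetical: react to the match
    fi
done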
awk will let us continue to process lines and take actions when a pattern is found. Something like:
tail -n0 -f "$logfile" | awk -v pattern="$pattern" '$0 ~ pattern {print "WARN" >> "anotherLogFile"}'
If you need to pass in the warning message and the path to anotherLogFile, you can use more -v flags to awk. Alternatively, you could have awk take the action you want directly: it can run commands via the system() function, to which you pass the shell command to run.
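For example, a minimal sketch of that approach (fix_problem.sh is again a hypothetical placeholder for whatever command you want to run on a match):

tail -n 0 -f "$logfile" |
awk -v pattern="$pattern" -v warn="$warning" -v out="$anotherlogfile" '
    $0 ~ pattern {
        print warn >> out             # append the warning to the status log
        fflush(out)                   # flush so the line appears immediately (gawk/mawk/BSD awk)
        system("./fix_problem.sh")    # run a shell command via system()
    }'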
I have a file like this (text.txt):
ls -al
ps -au
export COP=5
clear
Each line corresponds to a command. In my script, I need to read each line and launch each command.
PS: I tried all of these options, and with all of them I have the same problem with the export command. The file contains export COP=5, but after running the script, if I do echo $COP in the same terminal, no value is displayed.
while IFS= read -r line; do eval "$line"; done < text.txt
Be careful with this: using eval is generally discouraged, as it is quite powerful and just as easy to abuse.
However, if there is no risk of influence from unprivileged users on text.txt, it should be OK.
cat text.txt | xargs -L1 bash -c '"$@"' echo
(xargs feeds one line at a time to bash -c: the trailing echo becomes $0 and the line's words become "$@", which bash then runs as a single command. Shell syntax such as pipes or redirections inside the lines is not interpreted.)
In order to avoid confusion I would simply rename the file from text.txt to text and add a shebang (e.g. #!/bin/bash) as the first line of the file. Make sure it is executable by calling chmod +x text. Afterwards you can execute it as expected.
$ cat text
#!/bin/bash
ls -al
ps -au
clear
$ chmod +x text
$ ./text
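Note that this still doesn't solve the export COP=5 problem from the question: ./text runs in a child process, and environment changes never propagate back to the parent shell. For the export to take effect in your current terminal, source the file instead of executing it (assuming it still contains the export COP=5 line):

$ source ./text    # or equivalently: . ./text
$ echo $COP
5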
So I've made a script which is collecting data from many different files:
#!/bin/bash
mkdir DATAPOOL"$1"
grep achi *out>runner
grep treat *out>>runner
cat runner | grep Primitive *gout | grep '= '|awk '{print $1,$6}' > CellVolume"$1".txt
cat runner | grep ' c ' *gout | grep 'Angstrom '|awk '{print $1,$3}' > Cellc"$1".txt
cat runner | grep 'Final energy ' *gout |awk '{print $1,$5}' > CellEnergy"$1".txt
etc etc
cat runner |awk '{print "~/xtlanal",$1," > ",$1}' >runner2
vi runner2
:1,$s/gout:/xtl/
:1,$s/gout:/dat/
:wq
source runner2
grep Summary *dat | grep 'CAT-O ' |awk '{print $1,$6}' > AVE_NaO_"$1".txt
mv *txt DATAPOOL"$1"
So I end up with all the required text files when I run it without the vi part, and so I know it all works. Furthermore, when I run it with the vi commands, it just stops at the vi command, and I can then manually enter the 3 commands and end up with the correct results. What I'm struggling with is that I can't get vi to run the commands on its own, so that I can just execute the file multiple times in different directories without having to enter the commands manually time and time again.
Any help would be greatly appreciated.
Cheers
Something like this as a bash script:
#!/bin/bash
vi filename.txt -c ':g/^/m0' -c ':wq'
where -c executes a command. Here the command reverses the lines in a text file. Once done, :wq saves and exits. (See man vi for more about -c.)
If you don't want to type -c twice, you can do it this way:
vi -c "g/^/m0 | wq" filename.txt
For scripted editing tasks, you can use ed instead of vi:
ed runner2 <<'END'
1,$s/gout:/xtl/
1,$s/gout:/dat/
w
q
END
For global line-oriented search and replace, sed is a good choice:
sed -i 's/gout:/xtl/; s/gout:/dat/' runner2
Tested on VIM - Vi IMproved 8.0 (2016 Sep 12, compiled Apr 10 2018 21:31:58)
The vi -c "g/^/m0 | wq" filename.txt may appear to work, but it actually does not!
Typing vi -c "g/^/m0 | wq" filename.txt will result in vi writing and quitting before the intended changes are made to the file: the pipe makes wq part of the command that :g executes for each matching line, so vi writes and quits before the whole operation completes.
To see a demonstration, try it without the q and watch how slowly it works, writing line by line:
vi -c "g/^/m0 | w" filename.txt
The more efficient way is to use -c as B. Kocis states, or to use +.
As B. Kocis stated:
#!/bin/bash
vi filename.txt -c ':g/^/m0' -c ':wq'
or
vi filename.txt +g/^/m0 +wq
I am writing the output of a command to a file from bash. The command gradually produces output, and I am using grep to retrieve specific lines and tee to write them to the file. Right now, all the lines end up in the file. I want the file to be truncated every time the command produces a new line of output, so that there is always exactly one line in the file. How can I achieve this?
The command I am using is:
2>&1 zypper -x -n in geany | grep -o --line-buffered "percent=\"[0-9]*\"" | tee /var/log/oneclick.log
This produces output like percent="10" and so on. Each time, only one line should exist in the file.
If you need to overwrite the file for each line:
2>&1 zypper -x -n in geany |
grep -o --line-buffered "percent=\"[0-9]*\"" |
while IFS= read -r line; do
    echo "$line" > /var/log/oneclick.log   # ">" truncates, so only this line remains
    echo "$line"                           # pass the line through to stdout as well
done
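An alternative sketch with the same effect lets awk reopen, and thereby truncate, the log file for every line:

2>&1 zypper -x -n in geany |
grep -o --line-buffered "percent=\"[0-9]*\"" |
awk '{
    print > "/var/log/oneclick.log"    # ">" truncates when the file is (re)opened
    close("/var/log/oneclick.log")     # close so the next line truncates again
    print                              # pass the line through to stdout
}'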
I'd like to write a bash script myscript such that issuing this command:
myscript > filename.txt
would print the name of the file its output is being redirected to, filename.txt. Is this possible?
If you are running on Linux, check where /proc/self/fd/1 links to.
For example, the script can do the following:
#!/bin/bash
readlink /proc/self/fd/1
And then run it:
$ ./myscript > filename.txt
$ cat filename.txt
/tmp/filename.txt
Note that if you want to save the name of the output file to a variable or something, you can't use /proc/self, since it will be different in the subshell, but you can still use $$:
outputfile=$(readlink /proc/$$/fd/1)
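For instance, a minimal sketch of a script using this (the message is illustrative):

#!/bin/bash
# Capture the redirection target, then report it on stderr so the
# message itself doesn't end up in the output file.
outputfile=$(readlink /proc/$$/fd/1)
echo "output is going to: $outputfile" >&2
echo "some actual output"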
Using lsof:
outfile=$(lsof -p $$ | awk '/1w/{print $NF}')
echo "$outfile"