echo to standard input or stdin - shell

How can I echo to stdin?
For example, I have an alias called 'replace'. When I run 'replace' it should echo
sed 's/~/~\\n/g'
into my command line. Notice the sed command above doesn't have a file target; I'll supply the file target after it echoes the sed into the command line.

Create an alias like this:
alias replace="sed 's/~/~\\n/g' "
And then run it like this:
replace file
PS: It won't echo the full sed command on your tty but it will effectively run:
sed 's/~/~\\n/g' file
EDIT: On your question of echoing the command before executing it: create a script sedscript.sh like this:
#!/bin/sh
set -x
# do some sanity check on $1
sed 's/~/~\\n/g' "$1"
Now make the script executable and point your alias at it:
chmod +x sedscript.sh
alias replace="/path/to/sedscript.sh "
Now every time you run:
replace file
It will echo this line before sed command execution:
++ sed 's/~/~\\n/g' file
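As a minimal sketch of what `set -x` does (the file name `/tmp/trace_demo.txt` is just for illustration): every command is printed to stderr, prefixed with `+` (the default `PS4`), before it runs.

```shell
# Throwaway input file for the demo.
printf 'a~b~c\n' > /tmp/trace_demo.txt
set -x                               # from here on, each command is traced to stderr
sed 's/~/~\n/g' /tmp/trace_demo.txt  # with GNU sed, \n in the replacement inserts a newline
set +x                               # turn tracing back off
```

The trace lines go to stderr, so they can be captured or discarded independently of the command's normal output.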

Related

How to capture shell script output

I have a Unix shell script. I have put -x in the shell to see all the execution steps. Now I want to capture these in one log file on a daily basis.
Please see the script below.
#!/bin/ksh -x
logfile=path.log.date
print "copying file" | tee $logfile
scp -i key source destination | tee -a $logfile
exit 0
The first line of a shell script is known as the shebang; it tells the system which interpreter should execute the script. Since it starts with #, every other interpreter treats that line as a comment.
To capture the output, redirect it when running the script:
ksh -x scriptname >> output_file
Note: it will log what your script is doing, line by line.
There are two cases: if ksh is your interactive shell, you do the IO redirection there directly; if you use some other shell to execute a .ksh script, the redirection is done by that shell. The following method should work for most shells.
$ cat somescript.ksh
#!/bin/ksh -x
printf "Copy file \n";
printf "Do something else \n";
Run it:
$ ./somescript.ksh 1>some.log 2>&1
some.log will contain,
+ printf 'Copy file \n'
Copy file
+ printf 'Do something else \n'
Do something else
In your case, there is no need to specify a logfile or use tee inside the script. It would look something like this:
#!/bin/ksh -x
printf "copying file\n"
scp -i key user@server:/path/to/file .
exit 0
Run it:
$ ./myscript 1>/path/to/logfile 2>&1
1>logfile sends stdout to logfile, and 2>&1 then sends stderr to the same place. Note the order: the stdout redirection must come first.
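Redirections are processed left to right, so the order matters. A sketch (the log file names are illustrative):

```shell
# 1>both.log first, then 2>&1: stderr is duplicated onto the file too.
sh -c 'echo out; echo err >&2' > both.log 2>&1

# Reversed: 2>&1 duplicates stderr onto the *current* stdout (the tty),
# and only afterwards is stdout sent to only.log; stderr stays on the tty.
sh -c 'echo out; echo err >&2' 2>&1 > only.log
```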
I would prefer explicitly redirecting the output inside the script (including stderr with 2>&1, because set -x sends its output to stderr).
This keeps the shebang short, and you don't have to cram the redirection and filename-building into the invocation.
#!/bin/ksh
logfile=path.log.date
exec >> $logfile 2>&1 # redirecting all output to logfile (appending)
set -x # switch on debugging
# now start working
echo "print something"

Is there a way to 'inject' a command line output into another command?

for ex :
grep -R "requests" /some/really/long/path/to/type/out
I would like to do something like this
grep -R "requests" (pwd)
Basically, using the output of pwd sort of like a pipe (a pipe doesn't do it).
Use command substitution:
grep -R "requests" $(pwd)
The output of the command in $(...) is used as an argument list to the command. If you want the output to be treated as one word, wrap it in double quotes:
ls "$( command-that-produces-dirname-containing-whitespace )"
In bash you can use backticks for this:
grep -R "requests" `pwd`
pwd will be executed and its stdout will be used as the third argument of the grep command. (Backticks are the older form of command substitution; $(...) is generally preferred because it nests more cleanly.)
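A small sketch of the mechanics: command substitution replaces `$(...)` (or backticks) with the command's stdout, trailing newlines stripped, and an unquoted result then undergoes word splitting:

```shell
here=$(pwd)              # stdout of pwd, trailing newline stripped
echo "searching: $here"

words=$(printf 'one two')
set -- $words            # unquoted: split into two arguments
echo "$#"                # 2
set -- "$words"          # quoted: kept as a single argument
echo "$#"                # 1
```

This is why the answer above quotes the substitution when the output may contain whitespace.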

Bash script - Run commands that correspond to the lines of a file

I have a file like this (text.txt):
ls -al
ps -au
export COP=5
clear
Each line corresponds to a command. In my script, I need to read each line and launch each command.
PS: I tried all these options, and with all of them I have the same problem with the export command. The file contains export COP=5, but after running the script, if I do echo $COP in the same terminal, no value is displayed.
while IFS= read line; do eval $line; done < text.txt
Be careful with it: eval is generally discouraged, as it is quite powerful and easy to abuse.
However, if there is no risk that unprivileged users can influence text.txt, it should be OK.
cat text.txt | xargs -L1 bash -c '"$@"' echo
In order to avoid confusion I would simply rename the file from text.txt to text and add a shebang (e.g. #!/bin/bash) as the first line of the file. Make sure it is executable by calling chmod +x text. Afterwards you can execute it as expected.
$ cat text
#!/bin/bash
ls -al
ps -au
clear
$ chmod +x text
$ ./text
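None of the approaches above will fix the export COP=5 problem from the PS, by the way: each of them runs the commands in a child process, and exported variables die with that process. To make them land in the current shell, the file has to be sourced. A sketch (using a throwaway file name):

```shell
printf 'export COP=5\n' > /tmp/env_demo.txt  # stand-in for the question's text.txt
. /tmp/env_demo.txt                          # or: source /tmp/env_demo.txt in bash
echo "$COP"                                  # now visible in the current shell
```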

How to refer to redirection file from within a bash script?

I'd like to write a bash script myscript such that issuing this command:
myscript > filename.txt
would return the name of the file that its output is being redirected to, filename.txt. Is this possible?
If you are running on Linux, check where /proc/self/fd/1 links to.
For example, the script can do the following:
#!/bin/bash
readlink /proc/self/fd/1
And then run it:
$ ./myscript > filename.txt
$ cat filename.txt
/tmp/filename.txt
Note that if you want to save the name of the output file to a variable, you can't use /proc/self, since it will refer to the subshell; but you can still use $$:
outputfile=$(readlink /proc/$$/fd/1)
Using lsof:
outfile=$(lsof -p $$ | awk '/1w/{print $NF}')
echo "$outfile"

Executing commands containing space in Bash

I have a file named cmd that contains a list of Unix commands as follows:
hostname
pwd
ls /tmp
cat /etc/hostname
ls -la
ps -ef | grep java
cat cmd
I have another script that executes the commands in cmd as:
IFS=$'\n'
clear
for cmds in `cat cmd`
do
if [ $cmds ] ; then
$cmds;
echo "****************************";
fi
done
The problem is that commands in cmd without spaces run fine, but those with spaces are not correctly interpreted by the script. Following is the output:
patrick-laptop
****************************
/home/patrick/bashFiles
****************************
./prog.sh: line 6: ls /tmp: No such file or directory
****************************
./prog.sh: line 6: cat /etc/hostname: No such file or directory
****************************
./prog.sh: line 6: ls -la: command not found
****************************
./prog.sh: line 6: ps -ef | grep java: command not found
****************************
./prog.sh: line 6: cat cmd: command not found
****************************
What am I missing here?
Try changing the one line to eval $cmds rather than just $cmds
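For example, eval makes the shell re-parse the stored string, so pipes and redirection inside it work (a minimal sketch):

```shell
cmds='printf "x\ny\n" | wc -l'
eval "$cmds"   # the shell parses the pipe and runs the pipeline: prints the line count, 2
```

Without eval, the `|` would just be passed to printf as a literal argument.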
You can replace your script with the command
sh cmd
The shell’s job is to read commands and run them! If you want output/progress indicators, run the shell in verbose mode
sh -v cmd
I personally like this approach better: I don't want to munge IFS if I don't have to. You do need eval if you are going to use pipes in your commands, because the pipe needs to be processed by the shell, not passed to the command; the shell parses pipes before expanding strings.
Note that if your cmd file contains commands that take input, there will be an issue. (But you can always create a new fd for the read command to read from.)
clear
while read cmds
do
if [ -n "$cmds" ] ; then
eval $cmds
echo "****************************";
fi
done < cmd
Edit: Turns out this fails on pipes and redirection. Thanks, Andomar.
You need to change IFS back inside the loop so that bash knows where to split the arguments:
IFS=$'\n'
clear
for cmds in `cat cmd`
do
if [ $cmds ] ; then
IFS=$' \t\n' # the default
$cmds;
echo "****************************";
IFS=$'\n'
fi
done
EDIT: The comment by Ben Blank pointed out that my old answer was wrong, thanks.
Looks like you're executing commands as a single string, so bash sees them as the script/executable name.
One way to avoid that would be to invoke bash on the command. Change:
if [ $cmds ] ; then
$cmds;
echo "****************************";
fi
to
if [ $cmds ] ; then
bash -c "$cmds"
echo "****************************";
fi
sed 'aecho "-----------"' cmd > cmd.sh; bash cmd.sh
sed 'aXX' appends a line XX after every line of cmd. This will not work for multiline commands like:
for f in *
do
something $f
done
but for single-line commands it should work in most cases.