How to embed and run a multi-line perl script stored in a bash variable in a bash script (without immediately running perl) [duplicate]

This question already has answers here:
Run Perl Script From Unix Shell Script
(2 answers)
Closed 3 years ago.
How do I replace [RUN_ABOVE_PERL_SORTING_SCRIPT_HERE] with something that runs this perl script stored in a bash variable?
#!/usr/bin/env bash
# The perl script to sort getfacl output:
# https://github.com/philips/acl/blob/master/test/sort-getfacl-output
find /etc -name .git -prune -o -print | xargs getfacl -peL | [RUN_ABOVE_PERL_SORTING_SCRIPT_HERE] > /etc/.facl.nogit.txt
Notes:
I do not want to employ 2 files (a bash script and a perl script) to solve this problem; I want the functionality to be stored all in one bash script file.
I do not want to immediately run the perl script when storing the perl-script variable, because I want to run it later in the getfacl(1) bash pipeline shown above.
There are many similar Stack Overflow questions and answers, but none that I can find (with clean-reading code, anyway) that solve both a) the multi-line and b) the delayed-execution portions of this problem.
And to clarify: this problem is not specifically about getfacl(1), which is simply a catalyst to explore how to embed perl scripts (and possibly other scripting languages like python) into bash variables for delayed execution in a bash script.

Employ the bash read command, which reads the perl script into a variable that's executed later in the bash script.
#!/usr/bin/env bash
# sort getfacl output: the following code is copied from:
# https://github.com/philips/acl/blob/master/test/sort-getfacl-output
read -r -d '' SCRIPT <<'EOS'
#!/usr/bin/env perl -w
undef $/;
print join("\n\n", sort split(/\n\n/, <>)), "\n\n";
EOS
find /etc -name .git -prune -o -print | xargs getfacl -peL | perl -e "$SCRIPT" > /etc/.facl.nogit.txt
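Note: because -d '' tells read to consume input until a NUL byte that never arrives, read exits nonzero here even though SCRIPT is populated; if your script runs under set -e, guard the line (e.g. read -r -d '' SCRIPT <<'EOS' || true).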

This is covered by Run Perl Script From Unix Shell Script.
As the answers there apply here:
You can pass the code to Perl using -e/-E.
perl -e"$script"
or
perl -e"$( curl "$url" )"
You can pass the code via STDIN.
printf %s "$script" | perl
or
curl "$url" | perl
(This won't work for you, because your pipeline already needs STDIN for the getfacl output.)
You can create a virtual file.
perl <( printf %s "$script" )
or
perl <( curl "$url" )
You can take advantage of perl's -x option.
(Not applicable if you want to download the script dynamically.)
All of the above assume the following command has already been executed:
url='https://github.com/philips/acl/blob/master/test/sort-getfacl-output'
Some of the above assume the following command has already been executed:
script="$( curl "$url" )"
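Putting it together for the original pipeline, here is a minimal sketch using the virtual-file option (assuming $SCRIPT holds the perl code, as in the first answer above):
find /etc -name .git -prune -o -print | xargs getfacl -peL | perl <( printf %s "$SCRIPT" ) > /etc/.facl.nogit.txt
This leaves STDIN free for the getfacl output, which is why it works where the pass-via-STDIN option does not.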

Related

Difference between bash script.sh and ./script.sh [duplicate]

This question already has answers here:
History command works in a terminal, but doesn't when written as a bash script
(3 answers)
Closed 2 years ago.
Suppose we have env.sh file that contains:
echo $(history | tail -n2 | head -n1) | sed 's/[0-9]* //' #looking for the last typed command
When executing this script with bash env.sh, the output is empty, but when we execute the script with ./env.sh, we get the last typed command.
I just want to know the difference between them.
Notice that if we add #!/bin/bash at the beginning of the script, running ./env.sh will no longer output anything.
History is disabled by bash in non-interactive shells by default. If you want to enable it, you can do so like this:
#!/bin/bash
echo $HISTFILE # will be empty in a non-interactive shell
HISTFILE=~/.bash_history # set it again
set -o history
# the command will work now
history
The reason this is done is to avoid cluttering the history with commands run by shell scripts.
Adding a hashbang (meaning the file is to be interpreted as a script by the program specified in the hashbang) makes ./env.sh invoke your script with the binary /bin/bash, i.e. run it in a fresh non-interactive bash, thus again printing no history.

perl how to finish the script by 'cd $newdir'

I have a perl script that creates a directory $newdir based on some input passed as a parameter, and I would like the script to finish its execution by doing:
cd $newdir
So that the next command in bash Linux 64bit (here program2) is executed from the $newdir working directory.
E.g.:
perl $HOME/import_script.pl -i someparameter && $HOME/program2 .
You can't.
Any cd (or similar) you run in the perl script will affect only the perl script (or a sub-shell spawned from the perl script).
It can't affect the parent shell directly.
The only thing you could do would be to output the directory and then cd to it, or similar. (E.g. cd "$(perl "$HOME"/import_script.pl -i someparameter)" && "$HOME/program2" ., but realize that this means the perl script can't output anything else to standard output, or it will confuse cd.)
Or have perl run the second command also, etc.
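For example, a minimal sketch of the output-the-directory alternative, wrapped in a bash function (import_and_run is a made-up name, and it assumes import_script.pl prints only $newdir on standard output):
import_and_run() {
  local newdir
  newdir="$(perl "$HOME/import_script.pl" -i "$1")" || return
  cd "$newdir" && "$HOME/program2" .
}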
Just adding another potential solution here: you can have your perl script output the bash you want to run, and run it with bash's eval. For example:
File: do.pl:
#!/usr/bin/env perl
use strict;
use warnings;
use File::Temp qw(tempdir);
my $dir = tempdir();
print "echo Whatever I print out will be evaluated in my shell;";
print "cd $dir"; # Separate multiple commands with ';', see ^
From Bash
[~] > eval `./do.pl`
[~] > Whatever I print out will be evaluated in my shell
[/tmp/BcZI6ZaRB] > _
You can make things even easier by adding an alias, or bash function to your shell.
Add to File: ~/.bashrc
doit() {
  eval $(./do.pl)
}
From Bash
[~] > doit
[~] > Whatever I print out will be evaluated in my shell
[/tmp/ejzVGauPXx] > _

Pretend to be a tty in bash for any command [duplicate]

This question already has answers here:
How to trick an application into thinking its stdout is a terminal, not a pipe
(9 answers)
Closed 5 years ago.
Whenever I use grep and pipe it to another program, the --color option is not respected. I know I could use --color=always, but the same issue comes up with other commands, where I would like the exact output I would get in a tty.
So my question is: is it possible to trick a command into thinking it is being run inside a tty?
For example, running
grep --color word file # Outputs some colors
grep --color word file | cat # Doesn't output any colors
I'd like to be able to write something like:
IS_TTY=TRUE grep --color word file | cat # Outputs some colors
This question seems to have a tool that might do what I want: empty - run processes and applications under pseudo-terminal (PTY). But from what I could read in the docs, I'm not sure it can help with my problem.
There are a number of options, as outlined by several other Stack Overflow answers (see Caarlos's comment). I'll summarize them here though:
Use script + printf, requires no extra dependencies:
0<&- script -qefc "ls --color=auto" /dev/null | cat
Or make a bash function faketty to encapsulate it:
faketty () {
  script -qefc "$(printf "%q " "$@")" /dev/null
}
faketty ls --color=auto | cat
Or in the fish shell:
function faketty
  script -qefc (printf "%q " $argv) /dev/null
end
faketty ls --color=auto | cat
(credit goes to this answer)
http://linux.die.net/man/1/script
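Applied to the question's original example, the faketty wrapper keeps grep's colors alive across the pipe:
faketty grep --color word file | cat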
Use the unbuffer command (part of the expect suite of commands); unfortunately this requires an extra package install, but it's the easiest solution:
sudo apt-get install expect-dev # or brew install expect
unbuffer -p ls --color=auto | cat
Or if you use the fish shell:
function faketty
unbuffer -p $argv
end
faketty ls --color=auto | cat
http://linux.die.net/man/1/unbuffer
This is a great article on how TTYs work and what pseudo-TTYs (PTYs) are; it's worth a look if you want to understand how the Linux shell uses file descriptors to pass around input, output, and signals: http://www.linusakesson.net/programming/tty/index.php

What is wrong with this simple history script?

I am missing something really simple I think:
$ cat hs.sh
#!/bin/bash
echo $1
history | grep -i $1
echo $#
exit
$
Here is the output:
$ ./hs.sh sed
sed
1
$
I'm trying to create a script which I can use in the form './hs.sh sed' to search for all sed commands in my history. I can create an alias using this which works fine, but not this script.
Here is the alias:
alias hg='history | grep -i $1'
Interactive shells have history; scripted shells do not have history. You can only ask for history from an interactive shell, which is why the alias works but the script does not.
When you run this as a shell script, it spawns a new shell that has no history.
Try running it in the same shell like this:
source ./hs.sh sed
and it should work.
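If you do want it to work as a standalone script, one option (a sketch, assuming your history is saved in ~/.bash_history) is to borrow the HISTFILE trick from the bash/./env.sh answer above and add history -r to actually read the file:
#!/bin/bash
HISTFILE=~/.bash_history
set -o history            # enable the history mechanism
history -r "$HISTFILE"    # load the saved history into this shell
history | grep -i -- "$1"
Note this only searches entries that interactive shells have already written to the file, not the current session's unsaved commands.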

Makefile - "$" not taken into account for shell command

I have a Makefile with the following rule:
bash -c "find . |grep -E '\.c$|\.h$|\.cpp$|\.hpp$|Makefile' | xargs cat | wc -l"
I'm expecting make to run the quoted bash script and return the number of lines in my project.
Running the command directly in a terminal works, but it doesn't work in the makefile.
If I remove the $ from the script, it does run... but not as expected (since I only want *.{c,cpp,h,hpp,Makefile}).
Why doesn't bash -c run my script correctly?
If you write the rule like the following, it should produce the result you want:
target:
	@echo $(shell find . | grep -E '\.c$$|\.h$$|\.cpp$$|\.hpp$$|Makefile' | xargs cat | wc -l)
In Makefiles, $ is used for make variables such as $(HEADERS), where HEADERS would have been defined previously using =.
To use a literal $ in the inline bash, you have to double it to escape it: $$VAR will refer to a shell variable, and .c$$ and so on escape the $ for the regex you're working with.
The following should suffice to escape the $'s for what you're trying to accomplish:
bash -c "find . |grep -E '\.c$$|\.h$$|\.cpp$$|\.hpp$$|Makefile' | xargs cat | wc -l"
Additionally, you can use bash globally in your Makefile, as opposed to the default /bin/sh, by adding this declaration:
SHELL = /bin/bash
With the above, you should be able to use the find command without needing the bash -c and quotes. The following should work if SHELL is defined as above:
find . |grep -E '\.c$$|\.h$$|\.cpp$$|\.hpp$$|Makefile' | xargs cat | wc -l
Also, note that you can, and often will see, subshells used for this purpose; these are created with (). Any variables defined by the inner shell are local to that shell and its group of commands.
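For instance (a small sketch, not from the original answer):
( count=42; echo "inside: $count" )   # prints: inside: 42
echo "outside: ${count:-unset}"       # prints: outside: unset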
