How to run "source" command (Linux) from a perl script? - bash

I am trying to source a script from a Perl script (script.pl).
system ("source /some/generic/script");
Please note that this generic script could be a shell, python or any other script. Also, I cannot replicate the logic present inside this generic script into my Perl script. I tried replacing system with backticks (``), exec, and qx//. Each time I got the following error:
Can't exec "source": No such file or directory at script.pl line 18.
I came across many forums on the internet, which discussed various reasons for this problem. But none of them provided a solution. Is there any way to run/execute source command from a Perl script?

In bash etc., source is a builtin that means "read this file and interpret it locally" (a little like a #include).
In this context that makes no sense - you either need to remove source from the command and have a shebang (#!) line at the start of the shell script that tells the system which shell to use to execute that script, or you need to explicitly tell system which shell to use, e.g.
system "/bin/sh", "/some/generic/script";
[with no comment about whether it's actually appropriate to use system in this case].
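For instance, a minimal Perl sketch of that second option (the path /some/generic/script is the one from the question; the error check is just for illustration):
#!/usr/bin/perl
use strict;
use warnings;

# run the generic script under an explicit interpreter rather than "source"
my $rc = system('/bin/sh', '/some/generic/script');
die "script failed: exit status $?\n" if $rc != 0;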

There are a few things going on here. First, a child process can't change the environment of its parent, so anything that source sets in a child would only last as long as that child process is around.
Here's a short program that sets and exports an environment variable.
#!/bin/sh
echo "PID" $$
export HERE_I_AM="JH";
Running the file does not export the variable to the calling shell. The file runs in its own process. The process IDs ($$) are different in set_stuff.sh and the shell:
$ chmod 755 set_stuff.sh
$ ./set_stuff.sh
PID 92799
$ echo $$
92077
$ echo $HERE_I_AM # empty
source is different. It reads the file and evaluates it in the shell. The process IDs are the same in set_stuff.sh and the shell, so the file is actually affecting its own process:
$ unset HERE_I_AM # start over
$ source set_stuff.sh
PID 92077
$ echo $$
92077
$ echo $HERE_I_AM
JH
Now on to Perl. Calling system creates a child process (there's an exec in there somewhere) so that's not going to affect the Perl process.
$ perl -lwe 'system( "source set_stuff.sh; echo \$HERE_I_AM" );
print "From Perl ($$): $ENV{HERE_I_AM}"'
PID 92989
JH
Use of uninitialized value in concatenation (.) or string at -e line 1.
From Perl (92988):
Curiously, this works even though your version doesn't. I think the difference is that your command has no special shell metacharacters in it, so Perl tries to exec the program directly, skipping the shell it just used for my more complicated string:
$ perl -lwe 'system( "source set_stuff.sh" ); print $ENV{HERE_I_AM}'
Can't exec "source": No such file or directory at -e line 1.
Use of uninitialized value in print at -e line 1.
But you don't really want a single string there anyway. The list form is more secure, but it never involves a shell at all, and source isn't a file that anything can execute on its own:
$ which source # nothing
$ perl -lwe 'system( "source", "set_stuff.sh" ); print "From Perl ($$): $ENV{HERE_I_AM}"'
Can't exec "source": No such file or directory at -e line 1.
Use of uninitialized value in concatenation (.) or string at -e line 1.
From Perl (93766):
That is, you can only use source through something that itself invokes a shell.
Back to your problem. There are various ways to tackle this, but we need to get the output of the program. Instead of system, use backticks. That's a double-quoted context, so I need to protect some literal $s that I want to pass as part of the shell command:
$ perl -lwe 'my $o = `echo \$\$ && source set_stuff.sh && echo \$HERE_I_AM`; print "$o\nFrom Perl ($$): $ENV{HERE_I_AM}"'
Use of uninitialized value in concatenation (.) or string at -e line 1.
93919
From Shell PID 93919
JH
From Perl (93918):
Inside the backticks, you get what you want: the shell program can see the variable. Once back in Perl, it can't. But I have the output now. Let's get fancier and drop the PID stuff, because I don't need to see that now:
#!/bin/sh
export HERE_I_AM="JH";
And the shell command creates some output that has the name and value:
$ perl -lwe 'my $o = `source set_stuff.sh && echo HERE_I_AM=\$HERE_I_AM`; print $o'
HERE_I_AM=JH
I can parse that output and set variables in Perl. Now Perl has imported part of the environment of the shell program:
$ perl -lwe 'my $o = `source set_stuff.sh && echo HERE_I_AM=\$HERE_I_AM`; for(split/\R/,$o){ my($k,$v)=split/=/; $ENV{$k}=$v }; print "From Perl: $ENV{HERE_I_AM}"'
From Perl: JH
Let's get the entire environment, though. env outputs every value in the way I just processed it:
$ perl -lwe 'my $o = `source set_stuff.sh && env | sort`; print $o'
...
DISPLAY=:0
EC2_PATH=/usr/local/ec2/ec2-api-tools
EDITOR=/usr/bin/vi
...
I have a few hundred variables set in the shell, and I don't want to expose most of them. They all come from the Perl process's own environment, though, so I can temporarily clear out %ENV:
$ perl -lwe 'local %ENV=(); my $o = `source set_stuff.sh && env | sort`; print $o'
HERE_I_AM=JH
PWD=/Users/brian/Desktop/test
SHLVL=1
_=/usr/bin/env
Put that together with the post processing code and you have a way to pass that information back up to the parent.
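Putting that together, a rough standalone sketch of the whole round trip (illustrative rather than hardened; it uses the portable . instead of source, and set_stuff.sh is the example file from above):
#!/usr/bin/perl
use strict;
use warnings;

# run the file in a shell, dump the resulting environment, and read it back
my $path = $ENV{PATH};
my $out = do {
    local %ENV = ( PATH => $path );   # hide everything except PATH
    `. ./set_stuff.sh && env`;
};

for my $line ( split /\R/, $out ) {
    my ( $k, $v ) = split /=/, $line, 2;   # values may themselves contain '='
    $ENV{$k} = $v;
}

print "From Perl: $ENV{HERE_I_AM}\n";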
This is, by the way, similar to how you'd pass variables back up to a parent shell process. Since that output is already something the shell understands, you use the shell's eval instead of parsing it.

You can't. source is a shell function that 'imports' the contents of that script into your current environment. It's not an executable.
You can replicate some of its functionality by rolling your own - run or parse whatever you're 'sourcing' and capture the result:
print `. file_to_source; echo \$somevar`;
or similar.

Related

Assigning a variable in a shell script for use outside of the script

I have a shell script that sets a variable. I can access it inside the script, but I can't outside of it. Is it possible to make the variable global?
Accessing the variable before it's created returns nothing, as expected:
$ echo $mac
$
Creating the script to create the variable:
#!/bin/bash
mac=$(cat \/sys\/class\/net\/eth0\/address)
echo $mac
exit 0
Running the script gives the current mac address, as expected:
$ ./mac.sh
12:34:56:ab:cd:ef
$
Accessing the variable after it's created returns nothing, NOT expected:
$ echo $mac
$
Is there a way I can access this variable at the command line and in other scripts?
A child process can't affect the parent process like that.
You have to use the . (dot) command — or, if you like C shell notations, the source command — to read the script (hence . script or source script):
. ./mac.sh
source ./mac.sh
Or you generate the assignment on standard output and use eval $(script) to set the variable:
$ cat mac.sh
#!/bin/bash
echo mac=$(cat /sys/class/net/eth0/address)
$ bash mac.sh
mac=12:34:56:ab:cd:ef
$ eval $(bash mac.sh)
$ echo $mac
12:34:56:ab:cd:ef
$
Note that if you use no slashes in specifying the script for the dot or source command, then the shell searches for the script in the directories listed in $PATH. The script does not have to be executable; readable is sufficient (and being read-only is beneficial in that you can't run the script accidentally).
It's not clear what all the backslashes in the pathname were supposed to do other than confuse; they're unnecessary.
See ssh-agent for precedent in generating a script like that.

How to export environment variable set in perl script to batch shell?

I am executing a perl script on windows, using a batch script.
I am setting below variable in batch script:
SET PATH_VAR=C:\Users\
I am able to access PATH_VAR in perl as below:
my $path1 = $ENV{'PATH_VAR'}
I would like to also export environment variables set in perl to batch. Like the inverse of what I am doing now.
Is there a way to do that?
PS:
I tried this, but it doesn't work:
$ENV{'PATH_Z'}="Hello World";
Changes to environment variables cannot affect the parent process; that's part of how they work. So nothing you do in the Perl script can change the environment variables of the batch script. However, any child process started with exec(), system() or backticks will see the changes you made in the Perl script.
The only way to do this is to have the Perl script output shell statements, and for the shell to evaluate the output.
Bash example:
$ export FOO=123
$ echo $FOO
123
$ perl -e 'print "export FOO=456\n"' ; echo $FOO
123
$ $(perl -e 'print "export FOO=789\n"') ; echo $FOO
789
Edit: I see OP is using Windows, so this answer doesn't apply :-(
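For completeness, the closest Windows batch analogue of that trick is for /f, which runs a command and lets you act on each line of its output; a rough, untested sketch:
@echo off
rem run the Perl one-liner, then execute each line it prints (here: set PATH_Z=...)
for /f "delims=" %%i in ('perl -e "print q(set PATH_Z=Hello World)"') do %%i
echo %PATH_Z%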

Why doesn't LIMIT=\`ulimit -u\` work in bash?

In my program I need to know the maximum number of processes I can run, so I wrote a script. It works when I run it in a shell, but not when my program runs it using system("./limit.sh"). I work in bash.
Here is my code:
#/bin/bash
LIMIT=\`ulimit -u\`
ACTIVE=\`ps -u | wc -l \`
echo $LIMIT > limit.txt
echo $ACTIVE >> limit.txt
Anyone can help?
Why The Original Fails
Command substitution syntax doesn't work if escaped. When you run:
LIMIT=\`ulimit -u\`
...what you're doing is running a command named
-u`
...with the environment variable named LIMIT containing the value
`ulimit
...and unless you actually have a command that starts with -u and contains a backtick in its name, this can be expected to fail.
This is because escaping the backticks turns characters which would otherwise be syntax into literals, and running a command with one or more var=value pairs preceding it treats those pairs as variables to export in the environment for the duration of that single command.
Doing It Better
#!/bin/bash
limit=$(ulimit -u)
active=$(ps -u | wc -l)
printf '%s\n' "$limit" "$active" >limit.txt
Leave off the backticks.
Use modern $() command substitution syntax.
Avoid multiple redirections.
Avoid all-caps names for your own variables (all-caps names are used for variables with meaning to the OS or shell; lowercase names are reserved for application use).
Doing It Right
#!/bin/bash
exec >limit.txt # open limit.txt as output for the rest of the script
ulimit -u # run ulimit -u, inheriting that FD for output
ps -u | wc -l # run your pipeline, likewise with output to the existing FD
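To pick those numbers up in the calling shell afterwards, one possible sketch (it assumes the script above was saved as limit.sh and made executable):
#!/bin/bash
# run the script, then read the two lines it wrote into variables
./limit.sh
{ read -r limit; read -r active; } < limit.txt
echo "limit=$limit, active=$active"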
You have a typo on the very first line: #/bin/bash should be #!/bin/bash - this is often known as a "shebang" line, for "hash" (#) + "bang" (!)
Without that syntax written correctly, the script is run through the system's default shell, which will see that line as just a comment.
As pointed out in comments, that also means you get only the standardised options of that shell's built-in ulimit command, which don't include -u.

Passing a variable into awk within a shell script

I have a shell script that I'm writing to search for a process by name and return output if that process is over a given value.
I'm working on finding the named process first. The script currently looks like this:
#!/bin/bash
findProcessName=$1
findCpuMax=$2
#echo "parameter 1: $findProcessName, parameter2: $findCpuMax"
tempFile=`mktemp /tmp/processsearch.XXXXXX`
#echo "tempDir: $tempFile"
processSnapshot=`ps aux > $tempFile`
findProcess=`awk -v pname="$findProcessName" '/pname/' $tempFile`
echo "process line: "$findProcess
`rm $tempFile`
The error is occurring when I try to pass the variable into the awk command. I checked my version of awk and it definitely does support the -v flag.
If I hard-code the process name in place of the '/pname/' portion of the findProcess assignment, the script works.
I checked my syntax and it looks right. Could anyone point out where I'm going wrong?
The processSnapshot will always be empty: the ps output is going to the file
when you pass the pattern as a variable, use the pattern match operator:
findProcess=$( awk -v pname="$findProcessName" '$0 ~ pname' $tempFile )
only use backticks when you need the output of a command. This
`rm $tempFile`
executes the rm command, returns the output back to the shell and, if the output is non-empty, the shell attempts to execute that output as a command.
$ `echo foo`
bash: foo: command not found
$ `echo whoami`
jackman
Remove the backticks.
Of course, you don't need the temp file at all:
pgrep -fl $findProcessName
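Putting those points together, a sketch of the original script with no temporary file at all (same positional parameters as before):
#!/bin/bash
findProcessName=$1
findCpuMax=$2

# pipe ps straight into awk and match each line against the pattern variable
findProcess=$(ps aux | awk -v pname="$findProcessName" '$0 ~ pname')
echo "process line: $findProcess"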

Can you wrapper each command in GNU's make?

I want to inject a transparent wrapping command on each shell command in a make file, something like the time shell command. (However, not the time command; this is a completely different command.)
Is there a way to specify some sort of wrapper or decorator for each shell command that gmake will issue?
Kind of. You can tell make to use a different shell.
SHELL = myshell
where myshell is a wrapper like
#!/bin/sh
time /bin/sh "$@"
However, the usual way to do that is to prefix a variable to all command calls. While I can't see any show-stopper for the SHELL approach, the prefix approach has the advantage that it's more flexible (you can specify different prefixes for different commands, and override prefix values on the command line), and could be visibly faster.
# Set Q=@ to not display command names
TIME = time
foo:
	$(Q)$(TIME) foo_compiler
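Since Q and TIME are ordinary make variables, they can also be overridden per invocation, for example (the strace flags below are only an illustration):
$ make foo TIME="strace -f -o foo.trace"
$ make foo Q=@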
And here's a complete, working example of a shell wrapper:
#!/bin/bash
RESULTZ=/home/rbroger1/repos/knl/results
if [ "$1" == "-c" ] ; then
shift
fi
strace -f -o `mktemp $RESULTZ/result_XXXXXXX` -e trace=open,stat64,execve,exit_group,chdir /bin/sh -c "$@" | awk '{if (match($0, "Process PID=[0-9]+ runs in (64|32) bit") == 0) {print $0}}'
# EOF
I don't think there is a way to do what you want within GNU Make itself.
I have done things like modify the PATH env variable in the Makefile so that a directory containing my script, linked under the names of all the bins I wanted wrapped, was searched first and my script was executed rather than the actual bin. The script would then look at how it was called and exec the actual bin with the wrapping command,
i.e. exec time "$0" "$@"
These days I usually just update the targets in the Makefile itself. Keeping all your modifications to one file is usually better IMO than managing a directory of links.
Update
I defer to Gilles' answer. It's a better answer than mine.
The program that GNU make(1) uses to run commands is specified by the SHELL make variable. It will run each command as
$SHELL -c <command>
You cannot get make to not put the -c in, since that is required for most shells. -c is passed as the first argument ($1) and <command> is passed as a single argument string as the second argument ($2).
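A quick way to see exactly what make hands the shell is a throwaway wrapper that prints its arguments before running the command (just a debugging sketch):
#!/bin/sh
# show each argument make passed in, then run the command normally
printf 'arg: %s\n' "$@" >&2
exec /bin/sh "$@"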
You can write your own shell wrapper that prepends the command that you want, taking into account the -c:
#!/bin/sh
eval time "$2"
That will cause time to be run in front of each command. You need eval since $2 will often not be a single command and can contain all sorts of shell metacharacters that need to be expanded or processed.
