How to pass variables in a shell script as arguments? - bash

I have a shell script in which I pass the same value many times as an argument. Instead of hardcoding this redundant value for every Python script it calls, I wanted to see if I can use variables and pass them to the Python scripts as arguments.
Here is the script.sh:
python A.py --month=10
python B.py --country=USA --month=10
I would like something like this:
#Setting variables to pass into args
country=USA
month=10
python A.py --month=month
python B.py --country=country --month=month
How would I do this?

To access the value of a variable in a shell script, use the $ sign.
So this is how you can do it:
#Setting variables to pass into args
country=USA
month=10
python A.py --month="$month"
python B.py --country="$country" --month="$month"

If you need to use arguments passed to the script, try something like this:
./script.sh SomeMonth SomeCountry
Then, inside the script:
python A.py --month="$1"
python B.py --country="$2" --month="$1"
The positional parameters:
$0 = ./script.sh (the script name)
$1 = SomeMonth (first argument)
$2 = SomeCountry (second argument)
And so on.
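Putting it together, a minimal sketch of script.sh using positional parameters (the argument-count check is just an illustrative addition, not required by the approach):
#!/bin/bash
# usage: ./script.sh SomeMonth SomeCountry
if [ "$#" -lt 2 ]; then
    echo "Usage: $0 MONTH COUNTRY" >&2
    exit 1
fi
python A.py --month="$1"
python B.py --country="$2" --month="$1"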

Just to add something to the other (correct) answers: if you use those arguments many times, you can also embed the argument name together with its value, which saves typing and avoids errors:
# Setting variables to pass into args
p_country="--country=USA"
p_month="--month=10"
python A.py $p_month
python B.py $p_country $p_month
Note 1: beware of values containing spaces, or any other "strange" characters (meaningful to the shell). Ordinary words or numbers, like "USA" and "10" pose no problems.
Note 2: you can also store multiple "arg=value" in a single variable; it can save even more typing. For example:
p_m_and_c="--country=USA --month=10"
...
python B.py $p_m_and_c
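If the values do contain spaces or other characters special to the shell, a bash array is a safer variant of the same idea; this is only a sketch with made-up values:
# each array element stays a single argument, even if it contains spaces
p_args=(--country="United States" --month=10)
python B.py "${p_args[@]}"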

Related

CMD to bash script conversion semicolon

I am trying to convert a CMD script (.bat) into a .sh script on the Linux side.
I could not find proper documentation, for instance for the following lines:
set PATH="${PATH1}%;${PATH_NAME};"
another_script.bat -create "%LocalDestination%TEST;%LocalDestination%" -e %GenericEnvironementName% -d "%SettingsPath%/Env"
For the first one I think it is an export, but I do not know whether it is something like this:
${PATH1}=${PATH_NAME}
export PATH=$PATH1
For the second one, is the expression
"%LocalDestination%TEST;%LocalDestination%" an assignment? Why is the % also at the end?
$LocalDestination$TEST = $LocalDestination
%GenericEnvironementName% will be $GenericEnvironementName
%SettingsPath%/Env >>> $SettingsPath/Env?
Variables in DOS .bat files are delimited with %, before AND after. So %VAR% is replaced by the value of VAR.
set PATH="${PATH1};${PATH_NAME};" assigns the values of PATH1 and PATH_NAME to variable PATH, separated by ;.
In Bash you would write: export PATH="$PATH1;$PATH_NAME"
Therefore, yes, any variable reference in bash is done with $ before the variable name. So %TATA% becomes $TATA.
Example: %SettingsPath%/Env --> ${SettingsPath}/Env
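For completeness, a hedged sketch of how the two lines from the question might look in bash, assuming another_script has been ported to a shell script as well, and keeping in mind that on Linux the entries of PATH are separated by ':' rather than ';':
# sketch only: PATH1, PATH_NAME, LocalDestination, GenericEnvironementName
# and SettingsPath are assumed to be set earlier in the script
export PATH="${PATH1}:${PATH_NAME}"
./another_script.sh -create "${LocalDestination}TEST;${LocalDestination}" \
    -e "$GenericEnvironementName" -d "${SettingsPath}/Env"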

csh scripting - passing variables and calling csh script from another csh script

I am trying to understand passing two variables from one csh script to another csh script, performing a simple arithmetic operation there, and then passing the sum variable back to the original csh script, which writes the sum to standard output and to a file.
script1.sh
#!/bin/csh
setenv num1 3
setenv num2 2
source script2.sh
set total=$sum
echo $total > total.txt
echo $total
script2.sh
#!/bin/csh
set int1=$num1
set int2=$num2
set sum=$int1 + $int2
With 'csh', the 'set' command performs simple (string) assignment and takes a single word. The assignment 'set sum=$int1 + $int2' has two issues:
It does not use a single word.
Even if the expression were combined into a single word (set sum="$int1 + $int2"), set does not evaluate expressions.
Consider instead using the '@' command, which does take an expression:
@ sum = $int1 + $int2
Side note: for 'csh' scripts, the common convention is to use a '.csh' suffix. The '.sh' suffix is commonly used for files associated with sh-like shells (sh, bash, ash, ...).

How to pass make flags stored in text files in command line?

I have a text file called OPTIONS.txt storing all flags of Makefile:
arg1=foo arg2="-foo -bar"
I want to pass all flags in this file to make. However,
make `cat OPTIONS.txt`
fails with make: invalid option -- 'a'. It seems that the shell interprets it as three separate arguments:
make arg1=foo arg2="-foo -bar"
     ^argv[1] ^argv[2]   ^argv[3]
Is there any way to have it interpreted as two, keeping the quoted value together?
make arg1=foo arg2="-foo -bar"
     ^argv[1] ^-----argv[2]---
Since you control the options file, store the options one per line:
arg1=foo
arg2="-foo -bar"
Then in the shell, you'll read the file into an array, one element per line:
readarray -t opts < OPTIONS.txt
Now you can invoke make and keep the options whole:
make "${opts[#]}"
If you want the shell to interpret quotes after backtick expansion you need to use eval, like this:
eval make `cat OPTIONS.txt`
however just be aware that this evaluates everything, so if you have quoted content outside of the backticks you'll get the same issue:
eval make `cat OPTIONS.txt` arg4="one two"
will give an error. You'd have to double-quote the arg4, something like this:
eval make `cat OPTIONS.txt` arg4='"one two"'
In general it's tricky to do stuff like this from the command line, outside of scripts.
ETA (edited to add):
The real problem here is that we don't have a set of requirements. Why do you want to put these into a file, and what kind of things are you adding; are they only makefile variable assignments, or are there other make options here as well such as -k or similar?
IF the OP controls (can change) the format of the file AND the file contains content only used by make AND the OP doesn't care about the variables being command line assignments vs. regular assignments AND there are only variable assignments and not other options, then they can just (a) put each variable assignment on its own line, (b) remove all quotes, and (c) use include OPTIONS.txt from inside the makefile to "import" them.
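A minimal sketch of that last variant, under the stated assumptions (only variable assignments, no other make options); the makefile content is shown inline via here-documents purely for illustration:
# illustrative only: create the options file and a makefile that includes it
cat > OPTIONS.txt <<'EOF'
arg1=foo
arg2=-foo -bar
EOF
cat > Makefile <<'EOF'
include OPTIONS.txt
$(info arg1 is [$(arg1)] and arg2 is [$(arg2)])
all: ;
EOF
make    # prints: arg1 is [foo] and arg2 is [-foo -bar]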

Replacement of a variable in a shell script from another file - preferably python

My shell script:
#!/bin/sh
.....
a='30'
b='2'
c='0.9'
.....
Is there a way to replace the values of my variables using a Python script or any other language? I should also be able to replace the values with multiple values, for example setting b='2 1 3' and a='0.9 0.8'.
It's simple: add a line to the top of your Bash script:
source thevars.sh
Then you can write all the variable assignments you require using a trivial Python script to create thevars.sh.
The Bash script can be modified slightly to have default arguments in case thevars.sh does not exist:
test -f thevars.sh && source thevars.sh
a=${a-30}
b=${b-2}
c=${c-0.9}
Then if thevars.sh has any variables assigned, they will be used instead of the defaults--for example:
a="foo bar"
b=9001
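For illustration only, here is what a generated thevars.sh could look like for the values from the question (the answer suggests generating it with a small Python script, but any tool that writes plain assignments will do). One caveat: with the question's #!/bin/sh shebang, the portable way to load the file is ". thevars.sh" rather than "source thevars.sh":
# thevars.sh -- plain assignments only, written by whatever tool you prefer
a="0.9 0.8"
b="2 1 3"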

How Can I access bash variables in tcl(expect) script

How can I access bash variables in a Tcl (expect) script?
I have a bash file, say f1.sh, which sets some variables like:
export var1=a1
export var2=a2
I need to use these variables in my expect script.
I tried using this in my script, which does not work:
system "./f1.sh"
puts "var1 is $::env(var1)"
puts "var2 is $::env(var2)"
But this does not seem to work; I see that none of the variables from f1.sh are getting set as environment variables.
system "./f1.sh"   <-- Is this command in my script right?
How do I access these bash variables from the Tcl file?
I would say that this problem is rather general. I first met it when I wanted to initialize the Microsoft Visual Studio environment (which is done using a .cmd script) from PowerShell. Later I faced the same problem with other scripting languages in various combinations (Bash, Tcl, Python, etc.).
The solution provided by Hai Vu is good. It works well if you know from the beginning which variables you need. However, if you are going to use the script to initialize some environment, it may contain dozens of variables (which you don't even need to know about, but which are needed for normal operation of the environment).
In general, the solution to the problem is the following:
Execute the script, print ALL environment variables at the end, and capture the output.
Match the lines of output against a pattern like "variable=value", where the value is what you want to extract.
Set the environment variables using the facilities of your language.
I do not have a ready-made solution, but I guess something similar to this should work (note that the snippets below were not tested; they are aimed only at giving an idea of the solution):
Execute the script; print the vars and capture the output (argument expansion, {*}, requires Tcl 8.5; we could go without it here, but I prefer to use it):
set bashCommand [list bash -c {myScriptName arg1 arg2 2>&1 >/dev/null && export -p}]
if {[catch {exec {*}$bashCommand} output]} {
    set errMsg "ERROR: Failed to run script."
    append errMsg "\n" $output
    error $errMsg
}
;# If we get here, output contains the output of the "export -p" command
Parse the output of the command:
set vars [dict create]
foreach line [split $output "\n"] {
    regexp -- {^declare -x ([[:alpha:]_]*)="(.*)"$} $line dummy var val
    ;# 3. Store the var-val pair or set the env var (see step 3 below)
}
Store var-val pair or set env var. Here several approaches can be used:
3.1. Set Tcl variables and use them like this (depending on context):
set $var $val
or
variable $var $val
3.2. Set environment variable (actually, sub-case of 3.1):
global ::env
set ::env($var) $val
3.3 Set dict or array and use it within your application (or script) without modification of global environment:
set myEnv($var) $val ;# set array
dict set myEnvDict $var $val ;# set dict
I'd like to repeat that this is only the idea of the recipe. More importantly, since most modern scripting languages support regexes, this recipe can provide a bridge between almost any pair of languages, not only Bash<->Tcl.
You can use a here-document, like this:
#!/bin/bash
process=ssh
expect <<EOF
spawn $process
...
EOF
Exported variables are only passed from a parent process to its children, not the other way around. The script f1.sh (actually the bash instance that's running the script) gets its own copies of var1 and var2, and it doesn't matter if it changes them; the changes are lost when it exits. For variable exporting to work, you would need to start the expect script from the bash script.
In f1.sh, printf what you want to return...
printf '%s\n%s\n' "$var1" "$var2"
...and read it with exec in Tcl:
lassign [split [exec ./f1.sh] \n] var1 var2
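For reference, a sketch of the matching f1.sh under this approach (assumed content; instead of exporting, it prints the values one per line for the Tcl side to capture):
#!/bin/bash
var1=a1
var2=a2
# one value per line, in the order the Tcl script expects
printf '%s\n%s\n' "$var1" "$var2"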
Perhaps I did not look hard enough, but I don't see any way to do this. When you execute the bash script, you create a different process. What happens in that process does not propagate back to the current process.
We can work around this issue by doing the following (thanks to potrzebie for the idea):
Duplicate the bash script to a temp script
Append to the temp script some commands at the end to echo a marker, and a list of variables and their values
Execute the temp script and parse the output
The result is a list of alternating names and values. We use this list to set the environment variables for our process.
#!/usr/bin/env tclsh
package require fileutil
# Execute a bash script and extract some environment variables
proc getBashVar {bashScript varsList} {
    # Duplicate the bash script to a temp script
    set tempScriptName [fileutil::tempfile getBashVar]
    file copy -force $bashScript $tempScriptName

    # Append a marker to the end of the script. We need this marker to
    # identify where in the output to begin extracting the variables.
    # After that, append the list of specified variables and their values.
    set f [open $tempScriptName a]
    set marker "#XXX-MARKER"
    puts $f "\necho \\$marker"
    foreach var $varsList {
        puts $f "echo $var \\\"$$var\\\" "
    }
    close $f

    # Execute the temp script and parse the output
    set scriptOutput [exec bash $tempScriptName]
    append pattern $marker {\s*(.*)}
    regexp $pattern $scriptOutput all vars

    # Set the environment
    array set ::env $vars

    # Finally, delete the temp script to clean up
    file delete $tempScriptName
}
# Test
getBashVar f1.sh {var1 var2}
puts "var1 = $::env(var1)"
puts "var2 = $::env(var2)"
