I would like to be able to create and set the value of constants in an Expect script.
In Bash scripts and a great many other programming languages, one can create and assign a value to a constant. I have searched many online sites for how to do this in Expect, but surprisingly I have not been able to find this basic information.
Creating and assigning a variable named name in Bash: name="Bob"
Creating and assigning a constant named CONFIG_FILE in Bash: readonly CONFIG_FILE="Configuration.ini"
Creating and assigning a variable named copyPath in Expect: set copyPath "/home/bob/tmp"
Creating and assigning a constant in Expect: ?????
How does one create and set the value of constants in an Expect script?
Like Python, Tcl (the scripting language Expect is built on) doesn't have constants.
I would suggest going the Python route: define them in all caps, don't change them, and hope everyone else gets the hint.
set COPYPATH "/home/bob/tmp"
set PI 3.14159265359
If you really think you need them, there are some hacks in the link glenn jackman posted in the comments https://wiki.tcl-lang.org/page/constants.
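For example, one of those hacks uses a write trace to complain whenever the variable is touched. Below is a minimal sketch of that idea; const is a made-up helper name for illustration, not a built-in Expect/Tcl command:
# Hypothetical "const" helper: set the variable, then attach a write trace
# that raises an error on any later assignment.
proc const {name value} {
    uplevel 1 [list set $name $value]
    uplevel 1 [list trace add variable $name write \
        {error "attempt to modify a constant" ;#}]
}

const PI 3.14159265359
puts $PI        ;# 3.14159265359
# set PI 3      ;# would now raise "attempt to modify a constant"
Note that the trace fires after the write, so the new value is already in place when the error is raised; this is a deterrent, not a real language-level constant.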
The help text for set -a is somewhat cryptic to me, but maybe I just don't understand what bash is doing behind the scenes with variables. Can someone explain why "marking" variables is needed (what are "unmarked" variables)?
My bash man page uses a slightly different wording for this: "Each variable or function that is created or modified is given the export attribute and marked for export to the environment of subsequent commands." That sounds clearer to me. It means:
Each variable you define in your script is placed in the environment of any child process you create, and each function you define is likewise available to every Bash child process you create.
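A small, self-contained illustration of the effect (the variable names are arbitrary):
FOO=unexported           # defined before set -a: not exported

set -a                   # from here on, every variable created or modified is exported
BAR=exported             # exported automatically, as if written "export BAR=exported"
set +a                   # switch automatic exporting off again

BAZ=also_unexported      # defined after set +a: not exported

bash -c 'echo "FOO=$FOO BAR=$BAR BAZ=$BAZ"'   # prints: FOO= BAR=exported BAZ=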
I'm writing an experimental Bash module system that would allow local function namespaces, and my first idea was to write a Bash function parser that would read the function code line by line and prepend each function/variable name with <module-name>. (i.e. function func in module module would become module.func - which could again be imported in another module like module_2.module.func and so on; variables inside functions would be name-mangled - variable var within function func in module module would become __module_func_var).
However, in order to do that, I need a way to detect which names are variables and replace all their occurrences in the function with the transported import-name. Trivial cases like variable=[...] are easily parsable, but there are countless other cases where it's not that trivial - what about while read variable; do [...] done and variable2="asdf${variable//_/+}"?
It seems to me that in order to do this I need to dive into the parsing mechanisms of Bash or read a book on programming languages - but where do I start in order to achieve what I have explained above?
I need a way to detect which names are variables
I'm sorry to say this, but in general it's impossible.
Supporting only the static cases where variables can occur is possible but very tricky. Consider only variable assignments: Besides x= there are declare x=, printf -v x, read x, mapfile x, readarray x and probably many more. Even mature tools like shellcheck still have problems parsing all these cases correctly (for instance, see this issue).
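To make that concrete, each of the following lines creates or assigns the variable x, and each uses a different syntax a parser would have to recognize:
x=1                        # plain assignment
declare x=2                # declare builtin
printf -v x '%s' 3         # printf writing into a variable
read -r x <<< 4            # read assigning from input
mapfile -t x < <(echo 5)   # mapfile/readarray creating an array variable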
However, even if you mastered parsing all the static cases correctly, there could still be dynamically constructed variable names, for instance:
x=$(someCommand)
declare "$x=something"
In this example you cannot know the name of the new variable without executing someCommand. Other things which are equally bad (or even worse) are bash's indirection operator ${!x}, implicit indirection in arithmetic contexts (e.g. x=y; echo $((x))), and eval.
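A few concrete cases of those constructs:
n=42
name=n
echo "${!name}"            # indirection: looks up the variable whose name is stored in name -> 42
echo $(( name ))           # arithmetic context: name -> n -> 42, another implicit indirection
eval "dynamic_$n=hello"    # eval builds the variable name at run time
echo "$dynamic_42"         # prints: hello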
tl;dr: The only way to get all the variables in a script is to interpret/execute the script.
But here comes another problem: executing the script is also not an option if there is non-determinism (declare "$(tr -cd a-z < /dev/urandom | head -c1)=..."). Note that user input is also non-deterministic (read x; declare "var$x=..."). You would have to write a static analyzer. But this is also not an option, because of the halting problem: from it we can deduce that it is (in general) impossible to tell whether a given bash script has a finite number of variables.
To implement your module system you could use another approach. For instance, if someone wants to implement a module for your framework, then they have to specify the functions/variables in that module in an easily parsable format.
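One possible shape for such a declaration, purely as a sketch (all names here are made up):
# module.sh -- a hypothetical module that declares its own interface,
# so the module system never has to parse the function bodies.
MODULE_NAME="module"
MODULE_FUNCTIONS=(func helper)
MODULE_VARIABLES=(var config_path)

func()   { echo "hello from ${MODULE_NAME}.func"; }
helper() { echo "hello from ${MODULE_NAME}.helper"; }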
I need to refer to the same variable in several functions in a Python script (the write context of a CGPDFDocument).
I can either write global writeContext in every function,
or add the variable as an argument to every function,
both of which seem like excessive duplication.
Other answers to questions about global variables in Python suggest that they are "bad".
So what's the better way of handling it?
I am using bash.
There is an environment variable that I want to either append if it is already set like:
PATH=$PATH":/path/to/bin"
Or if it doesn't already exist I want to simply set it:
PATH="/path/to/bin"
Is there a one-line statement to do this?
Obviously the PATH environment variable is pretty much always set, but it was the easiest example to use for this question.
A little improvement on Michael Burr's answer. This works with set -u (set -o nounset) as well:
PATH=${PATH:+$PATH:}/path/to/bin
PATH=${PATH}${PATH:+:}/path/to/bin
${PATH} evaluates to nothing if PATH is unset or empty; otherwise it evaluates to the current path
${PATH:+:} evaluates to nothing if PATH is unset or empty; otherwise it evaluates to ":"
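A quick sanity check of both one-liners, run in subshells so the real PATH is untouched:
( PATH="/usr/bin"; PATH=${PATH:+$PATH:}/path/to/bin;   echo "$PATH" )   # /usr/bin:/path/to/bin
( unset PATH;      PATH=${PATH}${PATH:+:}/path/to/bin; echo "$PATH" )   # /path/to/bin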
The answers from Michael Burr and user spbnick are already excellent and illustrate the principle. I just want to add two more details:
In their versions, the new path is added to the end of PATH. This is what the OP asked, but it is a less common practice. Adding to the end means a command there will only be picked up if no command of the same name exists in an earlier directory of PATH. More commonly, users add new directories to the front of PATH. This is not what the OP asked, but for other users coming here it may be closer to what they expect. Since the syntax is different I'm highlighting it here.
Also, in the previous versions, PATH is not quoted. While it's unlikely on most Un*x-like operating systems to have spaces in PATH, it is still better practice to always quote.
My slightly improved version, for most typical use cases, is
PATH="/path/to/bin${PATH:+:$PATH}"
What does VAR_NAME=${VAR_NAME:-"/some/path/file"} mean in a shell script?
This is for an init script. I'm writing a custom one to get some of our startup operations into init scripts so that we can start them automatically on boot, but I don't have much experience with shell scripting, so I'm using a startup script for an unrelated piece of software that we've customized in the past.
The path pointed to is to a file that contains configuration values that override defaults set in the script.
I'm having trouble figuring out what that construct really means (the :- part in particular).
The script I'm working off of also seems to chain this operation together to resolve which value to use such as:
LOG_FILE=${LOG_FILE:-${LOGFILE:-$DEFAULT_LOG_FILE}}
${parameter:-word}
Use Default Values. If parameter is unset or null, the expansion of word shall be substituted; otherwise, the value of parameter shall be substituted.
It sets VAR_NAME to its own current value if it is already set and non-empty, or to /some/path/file if it isn't.
Chaining it would only make sense if the variable names were different going down the chain.
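A short demonstration of both the single expansion and the chained form (the default path here is just an example value):
DEFAULT_LOG_FILE=/var/log/example.log

unset LOG_FILE LOGFILE
LOG_FILE=${LOG_FILE:-${LOGFILE:-$DEFAULT_LOG_FILE}}
echo "$LOG_FILE"     # /var/log/example.log  -- neither override is set, the default wins

unset LOG_FILE
LOGFILE=/tmp/custom.log
LOG_FILE=${LOG_FILE:-${LOGFILE:-$DEFAULT_LOG_FILE}}
echo "$LOG_FILE"     # /tmp/custom.log       -- LOGFILE overrides the default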