Following is the issue I am encountering.
I have a shell script named my_script.sh.IN. I use configure_file() to expand the script and thus create my_script.sh.
I then need to run my_script.sh. For this purpose, I am using execute_process().
I need all of this at CMake time.
Issue: The problem is that when I run "cmake", it complains that it could not find "my_script.sh". It seems that execute_process() dependencies are evaluated before the configure_file() call runs.
When I run the "cmake" command a second time, everything goes fine. Does anybody know how I can make configure_file() run before execute_process()?
You should try something like this:
set_source_files_properties("path/to/my_script.sh" PROPERTIES GENERATED true)
It tells CMake not to check for the existence of the file too early. You will probably have to use the variable containing the path of your generated shell script instead of typing its path directly.
But your issue is more probably related to the order in which CMake executes the commands. You should ensure configure_file() is run BEFORE execute_process() when CMake parses the file.
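For example, a minimal sketch (the exact paths and the @ONLY substitution mode are assumptions, adjust them to your project): with both calls in the same CMakeLists.txt in this order, the script is generated before it is executed, all at CMake time.

configure_file(
  ${CMAKE_CURRENT_SOURCE_DIR}/my_script.sh.IN
  ${CMAKE_CURRENT_BINARY_DIR}/my_script.sh
  @ONLY)

execute_process(
  COMMAND /bin/sh ${CMAKE_CURRENT_BINARY_DIR}/my_script.sh
  WORKING_DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}
  RESULT_VARIABLE script_result)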
I installed Haskell on my macOS system using the ghcup installer. It worked, because if I type ghci I am dropped into the interactive shell. However, I got this message in the terminal after doing the install:
In order to run ghc and cabal, you need to adjust your PATH variable.
You may want to source '/Users/user1/.ghcup/env' in your shell
configuration to do so (e.g. ~/.bashrc).
Detected bash shell on your system...
If you want ghcup to automatically add the required PATH variable to "/Users/user1/.bashrc"
answer with YES, otherwise with NO and press ENTER.
YES
grep: /Users/user1/.bashrc: No such file or directory
My shell is bash 3.2. But as you can see, when I typed YES it said there is no such file. How do I find my shell configuration file, or resolve this? I'd like to complete the setup correctly.
And I have to be honest about my level of knowledge here: I don't truly understand what this is asking exactly. Is the PATH variable 'env'?
On macOS, .bashrc does not exist by default. ghcup will create this file, so the command you ran will have worked correctly. However, one of ghcup's subcommands expected to find the file before it was created, and therefore reported that error message. You can safely ignore this.
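If you want to double-check, verify that the file now exists and contains a line that sources ghcup's env file, then load it into the current shell (the exact line ghcup writes may differ between versions; this sketch assumes the default install location):

cat ~/.bashrc
# expected to contain something like:
# [ -f "$HOME/.ghcup/env" ] && . "$HOME/.ghcup/env"

. ~/.bashrc       # or simply open a new terminal
ghc --version     # ghc and cabal should now be on your PATH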
I'm trying to run a Makefile for building a kernel module in QtCreator. I can successfully invoke the Makefile from the command line.
My assumption was that it shouldn't be a problem to set this up in QtCreator as well, by defining the build step as a custom make command.
However, it seems that QtCreator is introducing some other working paths instead.
As the screenshot above shows, both the working directory and the script's absolute path are set to /home/user/module, which is the directory where the correct Makefile resides.
However, QtCreator seems to be searching for the Makefile elsewhere and reports: /home/user/Qt/Tools/QtCreator/bin/Makefile: No such file or directory.
Am I missing a setting somewhere or is this a bug?
You are using the PWD environment variable in your makefiles. This environment variable is updated only by a shell though, and custom process steps are not executed in a shell by default, but started directly as a child process. This means that PWD will stay as it is shown in the "Run Environment" section of the run configuration instead of being changed to the working directory of the step.
If your custom step depends on features of the shell, you should run it in a shell, i.e. set the "Command" to /bin/sh (or /bin/bash or whatever you prefer), and the "Arguments" to -c make (or whatever you need to pass to your preferred shell to execute a command).
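For example, with the paths from the question, the custom process step could be configured like this (a sketch; the exact field names depend on your QtCreator version). Because the shell initializes PWD to its working directory on startup, $(PWD) inside the Makefile then resolves as expected:

Command:           /bin/sh
Arguments:         -c make
Working directory: /home/user/module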
I have a script that does a bunch of stuff. It sources a bunch of functions that are in the directory the script is being run from, i.e.:
/home/me/script.sh
/home/me/function1
/home/me/function2
If I cd into /home/me and run ./script.sh, everything works fine. The functions are sourced and do what needs to be done.
However, if I try to run this as a cron job, it runs up until the point where I try to source the functions, and then it just stops and the process is terminated (if I run it directly from the directory, at least I get some errors).
Likewise, if I try to run this from another directory, I get a bunch of errors, e.g.:
cd /opt/
/home/me/script.sh
function1: command not found
function2: command not found
I'm sure this has something to do with environment variables, but I have no idea which ones. I have tried setting (in crontab):
PATH=/home/me
SHELL=/bin/bash
But that doesn't work either. Any help is appreciated. I don't want to hard-code the paths to the functions; instead, I'd like them to be resolved relative to the path the script is in (preferably the same directory).
Please let me know if you need any more information.
You are most probably aware of this, but just to be clear: a shell function does not have a path. Functions just need to be loaded into the current shell by sourcing the script that contains them:
source /path/to/functions
or
cd /path/to/functions
source functions
If you are talking about shell programs (scripts) instead, then you need to account for the fact that on Unix-like OSes the current directory is never in the PATH by default:
/path/to/functions/function1
or
cd /path/to/functions
./function1
You tagged your question Bash, but note that to be POSIX-compatible (e.g. if using sh), you have to use the . command (whereas Bash accepts either . or source), and the same restrictions regarding the PATH as for command execution apply, see dot:
. ./function1
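For the cron problem specifically, a common approach (a sketch assuming the layout from the question, where function1 and function2 are files that define shell functions) is to have script.sh determine its own directory and source the files relative to it, so neither cron's working directory nor the PATH matters:

#!/bin/bash
# Resolve the directory this script lives in, even when invoked from elsewhere.
script_dir=$(cd "$(dirname "$0")" && pwd)

# Source the function files relative to that directory.
. "$script_dir/function1"
. "$script_dir/function2"

# ... the rest of the script can now call the sourced functions ...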
What does 'export' do when used on the command line?
For example (and this is only one example), I build a number of C++ libraries, and for a library such as zlib-1.2.8 I need to specify the install directories.
To do this I need to run the following in the MSYS command line interface. This is just one example:
export LIBRARY_PATH="c/libraries/libs;$LIBRARY_PATH"
Would anyone know what the command 'export' actually does in this instance?
Does it permanently install a record for MSYS to use later on when looking for dependencies such as zlib? When I run make install, the zlib library file is placed in this directory.
Or, when I close MSYS, is this LIBRARY_PATH lost from MSYS's memory?
Thanks
This is the bash syntax for setting an environment variable. Using export makes the variable visible to child processes started from the shell in which it is defined.
Environment variables only affect the MSYS process and any child processes started from that shell. If you want the variable to persist after you close the command line and start a new one, you will need to put the export line into a startup script such as ~/.bashrc.
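A minimal sketch (reusing the path from the question) that illustrates both points:

# export makes the variable visible to child processes of the current shell:
export LIBRARY_PATH="c/libraries/libs;$LIBRARY_PATH"
bash -c 'echo "$LIBRARY_PATH"'    # a child process sees the exported value

# To make the setting survive closing MSYS, append the same line to ~/.bashrc,
# which every new interactive shell reads on startup:
echo 'export LIBRARY_PATH="c/libraries/libs;$LIBRARY_PATH"' >> ~/.bashrc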
I followed this instruction:
http://smlnj.cs.uchicago.edu/dist/working/110.70/NOTES/INSTALL
and installed smlnj on my laptop (Ubuntu). However, when I want to run sml, I have to change to /usr/share/smlnj/bin/ and run ./sml before I can use it. I read somewhere that I can add it to my PATH, so that I can run sml without going to that directory?
Well, either add /usr/share/smlnj/bin to your PATH environment variable, or make a symlink to the command in a standard folder like /usr/bin, or write a small wrapper script, which would also allow you to make additional adjustments like the working directory, further environment variables and the like...
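For example (the PATH line goes into ~/.bashrc so it survives new sessions; the symlink and wrapper locations are common choices, not requirements):

# Option 1: add the directory to your PATH for future shells
echo 'export PATH="/usr/share/smlnj/bin:$PATH"' >> ~/.bashrc

# Option 2: symlink the command into a folder that is already on the PATH
sudo ln -s /usr/share/smlnj/bin/sml /usr/bin/sml

# Option 3: a small wrapper script, e.g. saved as /usr/local/bin/sml and made executable:
#   #!/bin/sh
#   exec /usr/share/smlnj/bin/sml "$@"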