My Jenkins job, running on Kubuntu 14.04.3, consists of two build steps:
Setting environment variables in bash via "Execute shell"
Executing a "CMake Build" process
My CMake scripts are very system-dependent, which means that I make extensive use of $ENV{UNIX_ENVIRONMENT_VARIABLE} lookups.
The problem is that the variables I set in the first step cannot be seen by CMake in the second one.
I've tried different solutions:
Setting the variables with "export VAR=VAL" in the "Execute shell" build step
Setting the variables with "export VAR=VAL" in the .bashrc of the jenkins user
For all configuration steps, and as the common shell in Jenkins, I use /bin/bash.
It definitely works with the "EnvInject" plugin, and also if I add the variables to the "/etc/environment" file.
So my question is, what is wrong with the first two solutions?
Environment variables are per-process: child processes inherit them at startup, but changes never propagate back to the parent or across to sibling processes.
So, since "Execute shell" runs in its own shell, the variables it exports won't be visible to any other spawned process.
The .bashrc of the jenkins user would only work if the spawned shell were an interactive shell (which it almost certainly isn't).
Try .bash_login for that instead.
That being said, it would seem that EnvInject is likely the better idea.
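To make the scoping concrete, here is a minimal sketch (MY_VAR is just an example name), assuming two separate "Execute shell" build steps:

# Build step 1 - this shell process dies when the step ends
export MY_VAR=hello

# Build step 2 - a brand-new process; step 1's export never reached it
echo "${MY_VAR:-unset}"   # prints "unset"

EnvInject works because Jenkins itself injects the variables into the environment of every later build step, so each new process (including CMake, where $ENV{MY_VAR} would then be set) inherits them at startup.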
I'm trying to run a Makefile for building a kernel module in QtCreator. I can successfully invoke the Makefile from the command line.
My assumption was that this shouldn't be a problem in QtCreator either: just define the build step as a custom command running make.
It seems, however, that QtCreator is introducing some other working path instead.
As shown above, both the working directory and the script's absolute path are set to /home/user/module, which is the path where the correct Makefile resides.
However, QtCreator seems to be searching for the Makefile elsewhere: /home/user/Qt/Tools/QtCreator/bin/Makefile: No such file or directory.
Am I missing a setting somewhere or is this a bug?
You are using the PWD environment variable in your makefiles. This environment variable is updated only by a shell though, and custom process steps are not executed in a shell by default, but started directly as a child process. This means that PWD will stay as it is shown in the "Run Environment" section of the run configuration instead of being changed to the working directory of the step.
If your custom step depends on features of the shell, you should run it in a shell, i.e. set the "Command" to /bin/sh (or /bin/bash or whatever you prefer), and the "Arguments" to -c make (or whatever you need to pass to your preferred shell to execute a command).
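If you want to see the difference for yourself, here is a quick terminal check (printenv and the /bogus value are only for illustration):

# A directly spawned process simply inherits whatever PWD happens to contain:
env PWD=/bogus printenv PWD                # prints /bogus
# A shell fixes up PWD at startup to match the real working directory:
env PWD=/bogus sh -c 'printenv PWD'        # prints the actual current directory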
What we're doing:
We're doing an automated deployment using a tool called Nolio. One of the steps we need to do is to set a few environment variables for applications that are being deployed - for example, JAVA_HOME pointing to our preferred java install directory.
We're using the SET command to permanently set the environment variables, and in most ways it works great. If I right-click My Computer and go into the environment variables dialog, they all appear perfectly.
The problem:
Unfortunately, later in the deployment, some command line commands are executed that rely on the environment variables, and the environment variables appear to not be set. Using SET without parameters verifies this by displaying all currently available variables.
Now, if I restart the computer, the command line commands work fine. So, the issue is that while the variables are permanently set and do appear in the GUI, they are not propagated to the command prompts until I reboot.
Another interesting tidbit: If I put the commands in a BAT file and double click it, it runs fine, but if I execute it in the command prompt the variables don't resolve prior to a reboot.
Does anyone know a way around this?
First, what version of Nolio do you use?
The environment variables you set in the context of one Nolio action stay in the scope of that action. (It's like opening a different shell for every action.)
The best practice for this case is to use the environment-variable array inputs of the Nolio 'Run Command Line' action: supply two parallel arrays of variable names and values as input to the action.
It appears your variables are not in scope for the command prompt. At what point in your deployment process are you using the SET command? Interesting that the GUI recognizes the values, but the command prompt doesn't until you've restarted.
Also, I'm not clear as to why using a .bat file is undesired. I can come up with my own reasons, but what are yours?
EDIT
I've found this article that shows a step that you didn't mention. Have you tried:
rem Set the JAVA_HOME environment variable and insert it into the system path.
rem This will make the javac and java commands reachable from the command line.
set "JAVA_HOME=C:\Program Files\Java\jdk1.5.0_14"
set "PATH=%JAVA_HOME%\bin;%PATH%"
I'm not entirely sure why the command prompt won't pick up the variables while batch files do, but you could use SETX as an alternative to SET to see if that resolves your issues.
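For example, a sketch using the same paths as above; note that SETX, unlike SET, writes the value away for future sessions and does not change the prompt it runs in:

rem SETX persists the value, so newly opened command prompts inherit it.
setx JAVA_HOME "C:\Program Files\Java\jdk1.5.0_14"
rem Caution: this merges and flattens PATH and may truncate very long values.
setx PATH "C:\Program Files\Java\jdk1.5.0_14\bin;%PATH%"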
I need to set some environment variables in Ubuntu. I do the following and it works:
export PATH="/home/vagrant/ns-allinone-2.35/bin:/home/vagrant/ns-allinone-2.35/tcl8.5.10/unix:/home/vagrant/ns-allinone-2.35/tk8.5.10/unix:$PATH"
export LD_LIBRARY_PATH="/home/vagrant/ns-allinone-2.35/otcl-1.14:/home/vagrant/ns-allinone-2.35/lib"
export TCL_LIBRARY="/home/vagrant/ns-allinone-2.35/tcl8.5.10/library"
But when I move the same commands into a script envexport.sh and execute it, the environment variables do not get set.
Where am I going wrong? How to accomplish this?
Thanks.
If you just run the script, the environment variables it sets are lost as soon as the script finishes.
Use . envexport.sh. That way the commands get executed in the current shell (environment).
When you run a script in the shell, the shell creates a subprocess (child process) for it. Any environment variables defined or changed in the subprocess are lost to the parent process when the subprocess exits.
However if you source a script, you force the script to run in the current process. That means environment variables in the script you ran will not be lost.
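A minimal demonstration, assuming the commands from the question are saved in an executable envexport.sh:

./envexport.sh          # runs in a child process; the exports die with it
echo "$TCL_LIBRARY"     # prints an empty line
. ./envexport.sh        # sourced: the commands run in the current shell
echo "$TCL_LIBRARY"     # now prints /home/vagrant/ns-allinone-2.35/tcl8.5.10/library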
One more thing that may help: if you want those variables set for all of your sessions, you can place the same commands in your .bashrc file. Open it by running the following command and paste the lines in:
vim ~/.bashrc
and then run
source ~/.bashrc
in any terminals you currently have running. Any new terminals you start will automatically have your directories added to their PATH.
How can I always run Ruby scripts with warnings turned on by default, by modifying my Unix or Windows environment variables?
Ideally this should work even when I'm running a script indirectly such as through Rake, not just when I'm running it directly.
Based on a comment in this answer.
The RUBYOPT environment variable defines default options like warnings, etc.
Unix/OS X/etc:
export RUBYOPT=-w
You can put this in your startup script in Unix so it's set for new shells.
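For example, to confirm the option is picked up (the exact warning text varies by Ruby version):

export RUBYOPT=-w
ruby -e 'x = 1'   # warning: assigned but unused variable - x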
Windows:
set RUBYOPT=-w
Use the system properties dialog to set it for new shells/command windows.
I've tried executing the following:
#!C:\cygwin\bin\bash.exe
ls ${WORKSPACE}
But that doesn't find ls (even if it's on the Windows PATH). Is there any way to set this up?
UPDATE: In other words, I want to be able to set up a build step that uses Cygwin bash instead of Windows cmd, like this page shows you how to do with Python.
So put your Cygwin bin directory in your PATH.
In case you don't know how to do it (Control Panel -> System -> Advanced -> Environment Variables), see: http://support.microsoft.com/kb/310519
That shell script has two errors: the hash-bang line should be "#!/bin/bash", and ${WORKSPACE} is not a shell variable. Hudson has a bunch of variables of its own, which are expanded in the commands you specify to run (i.e. when you add commands in the web GUI).
If you want to run a Hudson build step on the Cygwin command line, you need to figure out what command Hudson runs and in which directory.
To give a more specific answer, you need to show us how your project is configured and what steps you want to run separately.
Provided cygwin's bin folder is in your path, the following works for me:
#!/bin/sh
ls ${WORKSPACE}
I find Hudson does not pick up environment variable changes unless you restart the server.
You might want to try giving a full path to ls:
/cygdrive/c/cygwin/bin/ls
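Putting that together with the script from the question (the /cygdrive path assumes a default Cygwin install):

#!C:\cygwin\bin\bash.exe
/cygdrive/c/cygwin/bin/ls "${WORKSPACE}"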
One other thing that seems to work is to use this:
#!C:\cygwin\bin\bash.exe
export PATH=$PATH:/usr/bin
ls
But it would be nice not to have to modify the path for every script.
Have you thought about PowerShell? As much as I like Cygwin, it has always been a little flaky. PowerShell is a solid, fully functional shell on Windows. Another option is Windows Services for UNIX; it gives you the Korn shell or C shell. Not quite as nice as bash, but it gets the job done.
You will need to pass the --login (aka -l) option to bash so that it sources Cygwin's /etc/profile and sets up the PATH variable correctly. This will change the current directory to the default "home", but you can set the environment variable CHERE_INVOKING to 1 before running bash -l, and it will stay in the current directory if you need to preserve that.
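For example, as an "Execute Windows batch command" step (a sketch; it assumes Cygwin is installed at C:\cygwin):

rem Keep bash in the job's workspace instead of cd'ing to the Cygwin home
set CHERE_INVOKING=1
rem -l sources /etc/profile so PATH is set up and ls resolves
C:\cygwin\bin\bash.exe -l -c "ls '%WORKSPACE%'"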