Substitute variable as an input from the STDOUT of a console application - cmd

In our release pipeline, we have a console app that generates an encryption key and writes it to STDOUT. We need to use this value in a variable during deployment (to update a configuration file with the result from the console app). We've tried the Output Variables option on the command-line task in Azure DevOps, but we need the value in a different format, and it just doesn't seem to work as expected.
E.g., our command-line tool writes 908321093RANDOMLYGENERATEDKEY3422543 to STDOUT.
The key's name in our config file is something like Settings.Security.OurKey, but the output variable in the command-line task does not allow periods (.), so it is set to SettingsSecurityOurKey. We've also tried SETTINGS_SECURITY_OURKEY, but the variable value is never set by the task.
Is it possible to somehow set the Azure DevOps variable to the value of the output variable from the command line or a PowerShell script? Something like:
set $(Settings.Security.OurKey) = SettingsSecurityOurKey
Or is there a simpler method of achieving this? It seems like it shouldn't be that difficult.
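One route worth trying is the task.setvariable logging command, which a script step can emit to publish a pipeline variable for later tasks. A minimal sketch, assuming a Bash agent; the printf stands in for the real key-generation tool, and whether a variable name with periods round-trips cleanly should be checked against the Azure DevOps docs:

```shell
# Stand-in for the console app's STDOUT (the real tool would run here)
key=$(printf '908321093RANDOMLYGENERATEDKEY3422543')

# Publish it as a pipeline variable; later tasks can then read $(Settings.Security.OurKey)
echo "##vso[task.setvariable variable=Settings.Security.OurKey]$key"
```

The logging command takes effect for subsequent tasks in the job, not the one that emits it.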

This sounds like a PowerShell issue rather than an issue with Azure DevOps.
# Variable name with special characters
$VariableName.That.Contains.Periods # This will NOT work.
${VariableName.That.Contains.Periods} # This will work.
Refer to this post for more information: https://blog.danskingdom.com/accessing-powershell-variables-with-periods-in-their-name/

If you want a PowerShell variable to contain the standard output from a command, just assign it:
$yourVariableName = your_command_that_writes_to_stdout
If the output is only one line, the PowerShell variable will contain a single string; otherwise it will contain an array.
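For comparison, a rough bash analogue of that capture behaviour (the sample strings here are illustrative): command substitution always yields one string, and splitting it back into per-line elements is an explicit step.

```shell
# Command substitution captures all of stdout as a single string
out=$(printf 'one\ntwo\n')

# mapfile splits it into an array, one element per line
mapfile -t lines <<< "$out"
echo "${#lines[@]}"   # prints 2
```

In PowerShell the one-line/array distinction happens automatically; in bash you choose the representation yourself.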

Related

How to "pipe" console values into interactive Windows PowerShell command (to make it non-interactive)

Does Windows PowerShell have anything similar to piping user-supplied values into interactive Unix/Linux bash commands so that they run non-interactively?
What I mean is something like this:
https://www.igorkromin.net/index.php/2017/04/12/invoking-interactive-shell-scripts-non-interactively
I need to supply values directly via PowerShell instead of letting the user type them on the console.
What I mean is not an equivalent of xargs in bash, as that would only let me supply command-line parameters. For the specific command I have in mind, default values can be specified on the command line, but they are not suitable for this task, and there are no other parameters that can be given on the command line; values other than the defaults are normally given via user input.
The only thing I tried out was similar to how you would do it in bash:
echo VALUE | CMD
This didn't work as the command still asked for user input on the console.
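Whether the pipe works depends on how the program reads its input: if it reads standard input, a pipe or here-string is enough; if it opens the console device directly (as many password prompts do), it ignores the pipe entirely, which would explain the behaviour above. A minimal sketch, with `read` standing in for an interactive prompt:

```shell
# 'read' consumes stdin, so the piped value is accepted non-interactively
printf 'VALUE\n' | { read -r answer; echo "got: $answer"; }
```

The PowerShell analogue is piping a string into the command ("VALUE" | CMD), which likewise only helps when the program actually reads stdin.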

Set a Variable in Concourse Pipeline

I would like to run a command in my pipeline and then save the result in a variable to be used later on in the pipeline. The command I want to run is
gh release view | head -n 1 | cut -f 1
I can log into Github and everything else, so that is not a problem. My only issue is saving the result to a variable and using that variable.
How can I do this?
Unfortunately not. You must write the contents of the variable to a file and use inputs and outputs to communicate between tasks. If you need to use the output between jobs, you'll also need a resource, as described in this excerpt from https://docs.concourse.farm/power-tutorial/00-core-concepts:
When inputs are passed between steps within a job they can remain just that: inputs/outputs. For passing inputs/outputs between jobs, you must use resources. A resource is an input/output set whose state is retrieved/stored externally by a job, e.g. a git repo or an S3 object.
Of course, once a task receives an input from the previous task, it can then be read into a variable.
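A sketch of the task-script side of that, with a printf standing in for the `gh release view | head -n 1 | cut -f 1` call; the directory name is illustrative, and it must be declared as an output of the first task and an input of the second in the task YAML:

```shell
# Task A: write the value into a declared output directory
mkdir -p release-info
printf '%s\n' 'v1.2.3' > release-info/tag   # stand-in for the gh pipeline

# Task B (receiving release-info as an input): read it back into a variable
tag=$(cat release-info/tag)
echo "$tag"
```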

Environment variables not working properly

I'm trying to run an application that reads an environment variable containing a JSON document of about 22k characters. The project setup tells me to use $(cat ./path/to/file) to configure it, but as I'm on Windows, this command does not work.
I've tried copying the contents of the file into the variable using the Environment Variables GUI, but its input truncates the value to a limit that is not even half the file.
After this I tried setting the variable using the Powershell with the command:
$env:myvar = iex '$(type path/to/file)'
and then saving the result with:
[System.Environment]::SetEnvironmentVariable('MYVAR', $env:MYVAR, [System.EnvironmentVariableTarget]::Machine)
After these commands, PowerShell is able to print the result correctly, but CMD still prints only part of the value when I echo it.
This is very odd because the regedit shows the correct value as suggested here.
The application still can't process the value because it is not complete.
Is there any fix for this?
Note: This answer applies to Windows.
tl;dr
While you can store up to 32,766 characters in a single environment variable, the standard retrieval mechanisms in cmd.exe and PowerShell / .NET (as of v7.1 / 5.0) support only up to 4,095.
A workaround in PowerShell is possible, but ultimately it comes down to whether the target executable that is meant to read an environment-variable value supports reading values up to the technical maximum length.
The technical limit for the number of characters in a single environment variable is 32,766 (32KB = 32768, minus 2).
Starting with Windows Server 2008 / Windows Vista, there is no longer a limit on the overall size of the environment block - see the docs.
However, depending on how the environment variable is retrieved, the limit may be lower:
Both cmd.exe and PowerShell, as of v7.1 / .NET 5.0, support retrieving at most 4,095 characters.
However, in PowerShell you can retrieve longer values, assuming the variable of interest is defined persistently in the registry, and assuming that you know whether it is defined at the machine or user level; e.g., for a MYVAR environment variable:
At the machine level:
Get-ItemPropertyValue 'registry::HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Environment' MYVAR
At the user level:
Get-ItemPropertyValue registry::HKEY_CURRENT_USER\Environment MYVAR
Try the type command; it is the Windows equivalent of the Unix cat command. That means storing the JSON in a separate file and using the command "type <path_to_file>".
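For reference, this is what the original setup instruction does in a POSIX shell (the file path and JSON payload here are illustrative); on Windows, `type` plays the role of `cat` as the answer notes:

```shell
# Write a sample JSON payload, then load the whole file into an env var
printf '{"key": "value"}' > /tmp/myvar.json
export MYVAR="$(cat /tmp/myvar.json)"
echo "${#MYVAR}"   # length of the loaded value, prints 16 here
```

Loading the value this way sidesteps any GUI input limit, though the 4,095-character retrieval limit described above can still bite whichever program reads the variable.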

Jenkins environment variables picking up stray commas, how do I prevent this?

I have a problem using Jenkins environment variables. I am writing a batch file, called when a build runs, that writes a file into the build directory with info about how the files were generated (branch, date/time, Git revision, etc.).
The batch file takes some of the Jenkins environment variables as command-line parameters and writes them to a text file. Here is the batch call I make via the Execute Shell step during the build:
c:\temp\~BuildStamper.bat "$GIT_COMMIT", "$BUILD_URL", "$JOB_NAME", "$BUILD_ID", "$WORKSPACE", "$GIT_BRANCH", "$BUILD_USER"
I have noticed that for arguments that contain white space, an extra comma is being appended inside the quoted delimiters. Sample line from generated text file:
Job Name: "Departure Board Build and Publish,"
I know that it isn't a problem with the processing I am doing inside the batch file, because I can see the values that are passed into the batch file in the job logs Jenkins generates, and the commas exist in the values when they are passed to my batch file.
It almost looks like Jenkins is incorrectly splitting a comma delimited string when it encounters strings with white space, but I couldn't find anything on the net about a problem with Jenkins of that nature.
Anyone else seen this? Am I doing something wrong? I tried passing the vars to Jenkins sans quotes, but then the batch file starts reading each word as a separate argument.
KeepCalmAndCarryOn nailed it: the commas are extraneous. Remove them from the call and the arguments come through clean.
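The behaviour is easy to reproduce outside Jenkins: a comma inside shell quotes is just another character, so it travels into the argument rather than acting as a separator. A minimal demonstration (the argument value is illustrative):

```shell
# The comma sits inside the quotes, so it becomes part of $1
set -- "Departure Board Build and Publish," "next-arg"
echo "$1"   # prints: Departure Board Build and Publish,
```

The shell only splits on unquoted whitespace; it never treats commas as delimiters, which is why quoting without commas is the right form.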

AWS Data Pipeline: setting local variable in shell command

I am trying to use the uuid library within a shell command invoked by an AWS Data Pipeline. The uuid function itself works fine, but when I try to assign its value to a variable, the data is lost.
A snippet of my testing script below:
sudo yum -y install uuid-devel
myExportId=$(uuid)
echo 'myExportId:' $myExportId
uuid
When I look at the activity log for the pipeline, I see that the uuid function works, but the variable does not seem to contain anything:
myExportId:
b6cf791a-1d5e-11e6-a581-122c089c2e25
I notice this same behavior with other local variables in my scripts. Am I expressing these incorrectly?
My pipeline parameters are working fine within the script, so no issues there.
Shortly after posting this, I realized that I had wrapped some of the above steps in subshells within my pipeline definition. Debugging shell scripts embedded in JSON is difficult, so I will move to the scriptUri parameter pointing at a Bash file.
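The subshell explanation is easy to verify directly: an assignment made inside `( ... )` does not survive into the parent shell, which matches the empty `myExportId:` line in the log above (the value here stands in for a real uuid):

```shell
# Assignment inside a subshell is lost when the subshell exits
( myExportId=$(printf 'demo-uuid') )
echo "after subshell: '$myExportId'"      # prints: after subshell: ''

# The same assignment in the current shell persists
myExportId=$(printf 'demo-uuid')
echo "in current shell: '$myExportId'"    # prints: in current shell: 'demo-uuid'
```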
