I am running this Octopus community script for creating or updating a lambda function.
When we hard-code values for the parameters, the script works as advertised. However, when we define variables for use in the parameters, it always injects the name of the Octopus variable instead of the value.
For example, a variable named AWS_Dash_OrderOnline_Lambda_Function_Name comes through as the literal text #{AWS_Dash_OrderOnline_Lambda_Function_Name} instead of the actual variable value.
What's going on and what do I need to change?
Pertinent script code is below:
# Get the parameters.
$functionName = $OctopusParameters['FunctionName']
...
Write-Output $functionName
Output:
#{AWS_Dash_OrderOnline_Lambda_Function_Name}
You'll get the variable name back as the value, like this, when a value cannot be determined. This could be because of a typo in the name, or because there is no value for the variable that matches the scope of the current deployment.
For example, you may have a value for a variable called Foo defined for Prod and not for Dev. When you run a deployment into Dev you'll get #{Foo} but in Prod you'd get the actual value.
A technique I've used is to provide an unscoped value of something like "UnscopedFoo"; if you then see that value, you know you've got the name entered correctly and it's a scoping issue. If you don't, then the name isn't entered correctly.
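To make that check explicit in the script itself, here is a minimal sketch (the guard and its message are my own addition, not part of the community script) that fails fast when the value still looks like an unsubstituted Octopus token:
# Get the parameter as before.
$functionName = $OctopusParameters['FunctionName']
# Fail fast if substitution did not happen (typo in the variable name or a scope mismatch).
if ($functionName -match '^#\{.*\}$') {
    throw "FunctionName was not substituted (got '$functionName'); check the variable name and its scope."
}
Write-Output $functionName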
If you do not properly define your variable's scope, you will get the result as indicated in the question.
Define your scope (environment, roles, channels, etc.) properly and your variable values will resolve correctly.
I am experiencing strange behavior with YAML variables, parameters, and Azure pipeline resource references. Below is the original implementation, which works, compared to my new implementation, which fails after a single-line change.
Working Implementation
Template A (makes a call to template B):
- template: Templates\TemplateB.yml
  parameters:
    serviceBuildResourceName: resourceName
Template B (uses serviceBuildResourceName param to get pipeline run information):
$projectId = '$(resources.pipeline.${{ parameters.serviceBuildResourceName }}.projectID)'
$pipelineId = '$(resources.pipeline.${{ parameters.serviceBuildResourceName }}.PipelineID)'
Template B goes on to use the values in $projectId and $pipelineId (along with other values not listed here, since they are irrelevant) to successfully retrieve information about a pipeline run from the specific pipeline resource, serviceBuildResourceName. Note that all pipeline resources are correctly defined at the beginning of the YAML file for the pipeline. In the implementation above, everything works perfectly.
Failing Implementation
Template A (makes a call to template B):
- template: Templates\TemplateB.yml
  parameters:
    serviceBuildResourceName: $(ServiceBuildResourceName)
Template B (uses serviceBuildResourceName param to get pipeline run information):
$projectId = '$(resources.pipeline.${{ parameters.serviceBuildResourceName }}.projectID)'
$pipelineId = '$(resources.pipeline.${{ parameters.serviceBuildResourceName }}.PipelineID)'
Note that the only difference is the following: instead of passing the hard-coded string into the serviceBuildResourceName parameter, I pass in a variable that has the same value as before, resourceName. The variable is defined in an earlier template as follows:
- name: ServiceBuildResourceName
  value: resourceName
I feel it should still work the same, but I now get the following error in my pipeline run:
WARNING: 2023-02-12 15:52:29.5071 Response body: {"$id":"1","innerException":null,"message":"The value is not an integer.
$(resources.pipeline.resourceName.PipelineID)
I know that the variable is being correctly populated since the error message above contains "resourceName" in resources.pipeline.resourceName.PipelineID, as it should.
However, for reasons unknown to me, it now throws an error. It seems like it doesn't recognize the pipeline resource, and instead treats it as a plain string.
Any help or insight here would be greatly appreciated, thanks!
As far as I can tell, this is because of how predefined variables work in YAML. Since resources.pipeline.* is a predefined variable, it gets resolved at compile time, and the ${{ parameters.serviceBuildResourceName }} template expression is expanded at compile time as well. A macro variable like $(ServiceBuildResourceName) is only resolved at runtime, so it can't be used here: instead of being resolved as a predefined variable, the reference ends up as a plain string at runtime.
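If the resource name genuinely needs to come from a variable, one workaround is to keep everything resolvable at compile time. A minimal sketch (the names come from the question; the template-expression variable syntax assumes ServiceBuildResourceName is defined in YAML rather than set at queue time):
# Works: the parameter value is a literal, known when templates are expanded.
- template: Templates\TemplateB.yml
  parameters:
    serviceBuildResourceName: resourceName

# Also works if ServiceBuildResourceName is a YAML-defined variable, because
# ${{ variables.ServiceBuildResourceName }} is expanded at compile time, unlike $(...).
- template: Templates\TemplateB.yml
  parameters:
    serviceBuildResourceName: ${{ variables.ServiceBuildResourceName }}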
I am looking to re-use a particular value across multiple step definitions in my Cypress/Cucumber test.
I was thinking of using a normal variable, but the problem is that the step definitions are stored in different files.
So I am wondering if I could assign the value to an environment variable & reference that in the other file.
I was trying to do something like Cypress.env('myUsername') = 'testUser', but that gives me a lint error.
Cypress environment variables can be set during a test by passing in the desired value as the second argument.
Cypress.env('HOST', 'asdf');
According to the documentation, the env API syntax looks like this:
- Cypress.env()
- Cypress.env(name)
- Cypress.env(name, value)
- Cypress.env(object)
In your case the following will work:
Cypress.env('myUsername', 'testUser')
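Since the value needs to be shared across step definitions stored in different files, here is a minimal sketch (the file names are hypothetical) of setting it in one file and reading it back in another during the same run:
// inside a step definition in steps/login.steps.js (hypothetical file): store the value
Cypress.env('myUsername', 'testUser');

// later, inside a step definition in steps/profile.steps.js: read it back
const username = Cypress.env('myUsername');
cy.log(`current user: ${username}`);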
I have a requirement: I am trying to pass a context variable defined in Talend (name=nomeFile, value=context.nome_file) from Unix to my parent job, and then I want to pass it between my jobs. In this case the variable must be read by mainALF and passed to subLoad_Alf. I developed both the parent job and the child job, and after that I built myParentJob. At this point, from the Unix terminal, I ran the following command:
./myParentJob.sh --context_param nomeFile="myDirectory/FileName.txt"
It does not work; I get "No such file or directory".
This is my main Job (myParentJob):
On the contrary, if I build myChildJob (subLoad_Alf) and run the same command, ./myChildJob.sh --context_param nomeFile="myDirectory/FileName.txt", it works.
I think I am not able to read the context variable (nomeFile) in the ParentJob and send its value ("myDirectory/FileName.txt") to myChildJob.
This is my childJob config:
Can anyone help me figure out how to achieve this?
The problem is that you are overriding your nomeFile context variable when you pass it to the child job.
The global variable (globalMap.get("context.nomeFile")) that you are using to override the context variable does not exist, hence the empty context.nomeFile that is passed to your child job.
Checking "Transmit whole context" on the child job's tRunJob will take care of passing the nomeFile that was passed to your parent job. You don't need to explicitly specify a value for that context variable so you need to remove it from context params table.
My JMeter test receives a parameter that specifies the test environment, such as PROD or DEV.
Results from both test environments will be sent to the same GraphiteHost, but I need to separate the results of each environment by using rootMetricsPrefix.
For example, results from PROD will use prefix global.myapp.performance.prod. while the results from DEV will use prefix global.myapp.performance.dev..
So I set the rootMetricsPrefix in my Backend Listener as global.myapp.performance.${__groovy($__P(env).toLowerCase())}..
Unfortunately, it doesn't work.
The data in Graphite doesn't contain the environment name.
Can anyone tell me how to solve this?
You can use the newer __changeCase function to lower-case your value:
${__changeCase(${__P(env)},LOWER,)}
It will read the property and then lower-case the value.
There is an error in your expression; it should be:
global.myapp.performance.${__groovy("${__P(env)}".toLowerCase())}
This will also work:
global.myapp.performance.${__groovy(props.get("env").toLowerCase())}
But for performance, it is better to use the solution provided by @user7294900.
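Putting the pieces together, a sketch of the whole flow (the property name env comes from the question; the test plan file name is hypothetical):
# Pass the environment name as a JMeter property on the command line
jmeter -n -t test.jmx -Jenv=PROD

# rootMetricsPrefix in the Backend Listener
global.myapp.performance.${__changeCase(${__P(env)},LOWER,)}.
# which resolves to: global.myapp.performance.prod.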
How can I declare a variable name by using the value of a property?
For example, I have the property propertyName with the value propertyValue. I want to declare a variable with the name propertyValue.
I've tried something like ${${__P(variableName)}}, but such a construction doesn't work.
You may need to evaluate the property name, using the ${__V()} function.
Thus, you'd probably end up with something like ${__V(${__P(propertyName)})}, which would only give you a variable with a null value until something assigns it.
Basics on properties & command line:
If you need to pass variables through the command line, properties are indeed the correct choice.
The flag to set a property is -JpropertyName=value, and the function to read a property is ${__P(propertyName)}.
For full details, see:
http://wiki.apache.org/jakarta-jmeter/JMeterFAQ#How_do_I_pass_parameters_into_my_Test_scripts.3F_I_want_to_be_able_to_use_the_same_script_to_test_with_different_numbers_of_threads_and_loops.2C_and_I_don.27t_want_to_have_to_change_the_script_each_time.
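As a concrete sketch of the whole chain (propertyName and propertyValue are just the placeholder names from the question):
# On the command line: give the property a value
jmeter -n -t test.jmx -JpropertyName=propertyValue

# Inside the test plan:
${__P(propertyName)}              # evaluates to propertyValue
${__V(${__P(propertyName)})}      # evaluates the JMeter variable named propertyValue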
Give up on properties files and try the Variables From CSV plugin instead. It is a pretty simple and robust way to load variables from a file.
Property files are great! For my requirement, I have created a simple config element for JMeter to read property files.
Please check here.
http://www.testautomationguru.com/jmeter-property-file-reader-a-custom-config-element.