Running a TalendJob from a Unix shell

I have an issue. I built myTalendJob and I run myShell successfully by passing a context variable. The command I use is:
./mainJob_run.sh --context_param myVar="/myDirectory/file.txt"
Is it possible to simply run ./mainJob_run.sh and have --context_param myVar="/myDirectory/file.txt" passed dynamically, to avoid retyping it every time?
Thank you in advance!

I am not sure I understand your question but this is my attempt to answer.
Either:
When exporting your job, override the context variable "myVar" with the given value, or
Write a caller script that invokes mainJob_run.sh and appends this additional parameter. I prefer this one as it gives more flexibility; see the sketch below.
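For the second option, a minimal wrapper might look like this (the wrapper's file name is an assumption for illustration; it also forwards any extra arguments):
#!/bin/sh
# call_mainJob.sh -- hypothetical caller script wrapping mainJob_run.sh
./mainJob_run.sh --context_param myVar="/myDirectory/file.txt" "$@"
Running ./call_mainJob.sh is then enough; the context parameter is appended for you.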
Implicit Context Load

You can read your context params from a file.
With this, you don't need to pass the context params on the shell command line; instead, the job reads them from a file while it is executing. Ideally, you should put this in your tPreJob.
After reading the values, you can also pass the context params through a tJavaRow for further processing. This way you can format your context params, or generate new context params based on the input values.
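The context file itself is just key/value rows. A sketch, assuming the common "key=value" format and a made-up file path:
# /myDirectory/context.txt -- hypothetical file read at job startup
myVar=/myDirectory/file.txt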
TalendByExample has provided a great guide on how to build a reusable context loading job which you can call from any of your jobs.
https://www.talendbyexample.com/talend-reusable-context-load-job.html

Related

Passing a context variable from a Unix shell to a parent job and then between jobs

I have a requirement. I am trying to pass a context variable defined in Talend (name=nomeFile, value=context.nome_file) from Unix to my parent job and then pass it between my jobs. In this case the variable must be read by mainALF and passed to subLoad_Alf. I developed both the parent job and the child job, and then built myParentJob. At that point, from a Unix terminal, I ran the following command:
./myParentJob.sh --context_param nomeFile="myDirectory/FileName.txt" but it does not work. I got "No such file or directory".
This is my main Job (myParentJob):
By contrast, if I build myChildJob (subLoad_Alf) and run the same command: ./myChildJob.sh --context_param nomeFile="myDirectory/FileName.txt", it works.
I think I am not able to read the context variable (nomeFile) in the ParentJob and send its value ("myDirectory/FileName.txt") to myChildJob.
This is my childJob config:
Can anyone help me figure out how to achieve this requirement?
The problem is that you are overriding your nomeFile context variable when you pass it to the child job.
The global variable (globalMap.get("context.nomeFile")) that you are using to override the context variable does not exist, hence the empty context.nomeFile that is passed to your child job.
Checking "Transmit whole context" on the child job's tRunJob will take care of passing the nomeFile that was passed to your parent job. You don't need to explicitly specify a value for that context variable so you need to remove it from context params table.

List all active Configurations for Config Admin in the Gogo shell

I would like to show on screen the Configuration list returned by the org.osgi.service.cm.ConfigurationAdmin.listConfigurations method via the Gogo shell. I tried the following:
g! _sref = $.context getServiceReference "org.osgi.service.cm.ConfigurationAdmin"
g! _srv = $.context getService $_sref
g! $_srv listConfigurations
but it fails with the following error:
gogo: IllegalArgumentException: Cannot coerce listconfigurations() to any of [(String)]
What is the right syntax here? Is it possible to do that?
Thanks!
The listConfigurations method takes a String parameter, which is a filter. If you just want an unfiltered list, then you can pass null, e.g.:
$_srv listConfigurations null
This returns an array of Configuration objects, which you will probably want to iterate over with the each command.
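For example, a sketch assuming Gogo's built-in each and echo commands (_cfgs is just a name chosen here, following the assignment style above):
g! _cfgs = $_srv listConfigurations null
g! each $_cfgs { echo $it }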
However, this kind of thing quickly gets too complex for Gogo scripting. For example, you are not releasing the service reference with ungetService anywhere. It's probably better to build a reusable Gogo command in Java as a Declarative Services component.
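For illustration, here is a minimal sketch of such a command as a Declarative Services component (the package, class, scope, and function names are made up):

package com.example.gogo;

import org.osgi.service.cm.Configuration;
import org.osgi.service.cm.ConfigurationAdmin;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;

// Registered as a service with the Gogo command properties,
// so the shell can invoke it as: g! example:listconfigs
@Component(
    service = ListConfigsCommand.class,
    property = {
        "osgi.command.scope=example",
        "osgi.command.function=listconfigs"
    }
)
public class ListConfigsCommand {

    @Reference
    private ConfigurationAdmin configAdmin;

    public void listconfigs() throws Exception {
        // A null filter returns all configurations; the result may be null if none exist
        Configuration[] configs = configAdmin.listConfigurations(null);
        if (configs == null) {
            System.out.println("No configurations found.");
            return;
        }
        for (Configuration config : configs) {
            System.out.println(config.getPid() + " -> " + config.getProperties());
        }
    }
}

Since DS manages the ConfigurationAdmin reference for you, there is no ungetService bookkeeping to worry about.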
It's probably a lot easier to use the following shell commands to achieve that:
https://bitbucket.org/pjtr/net.luminis.cmc
Which has, amongst other things, a command called:
cm list

HP UFT API Test - Saving Response/Checkpoint values

Is there a way to capture and store (or write to a file) the values returned in the Response? (Checkpoint values)
Using HP UFT 11.52
Thanks,
Lynn
I figured it out. In UFT API, under Standard Activities, there are File function modules, including "Write to File". I added the module to the test, set the path and other properties, passed the variable to the file, and it worked! Couldn't be easier.
I mentioned this in my other answer: you can also write it programmatically if you have a dynamic array response. Please refer to:
https://stackoverflow.com/a/28012383/3972994
After running a test, in the test folder, you can find a Snapshots/LastIteration directory.
In it you can find the return value for each step saved in a txt file.
Note that if you data-drive the step, only the last iteration is saved to file.
However, in the test's log (Test dir/Log/vtd_user.log) you can find all the iterations persisted.
Thanks,
Yossi
You do not need to use the standard activities if you do this:
// response body of the current activity
var iResponse = this.Activity.responsebody;
// write it to a file; the path below is a placeholder, and the file is overwritten on each run
System.IO.File.WriteAllText(@"directorypath\FileName", iResponse.ToString());
The above writes the response to the file, rewriting it on every run.

How to make the JMeter While Controller work

I'm having trouble getting the While Controller to work in JMeter.
I have a feeling I read that it doesn't re-evaluate user defined variables, so I am trying to use properties instead.
I start off by using a BSF assertion to set a property called keepLooping
${__setProperty(keepLooping, true)};
This seems to work as it enters the While controller with a condition of
${__property(keepLooping)}
But I cannot for the life of me get it to change that property to something else. I want it to change the property depending on the resulting text of an HTTP request.
So I am using a Regular Expression Extractor to set a variable, which I can see is getting set. Then I am trying to use a BSF Assertion to set the keepLooping property based on the variable I have set, using JavaScript as follows:
log.info("IM IN HERE");
log.info("props is "+props);
//log.info("props keep looping is "+props["keepLooping"]);
if (${surveyRequired} == false){
log.info("IM IN HERE 1A and props is "+props);
${__setProperty(keepLooping, true)};
log.info("IM IN HERE 1B");
}
else {
log.info("IM IN HERE 2A");
${__setProperty(keepLooping, false)};
log.info("IM IN HERE 2B");
}
I can't figure out how to set the property from JavaScript - I've tried several things. Can anyone help? Many thanks!
Also, can anyone recommend a good resource that explains the many 'quirks' of JMeter? Many thanks!
"I've a feeling that I read that it doesn't re-evalute user defined variables" -- I use JMeter 2.9 and it really does. I use user defined variable in order to count number of loops. It looks like: ${__javaScript(${MY_USER_DEFINED_VARIABLE}>0)}. The only one annoying thing is that I have to get value of variable, increment it, cast to string (toString() in Groovy), and then put new value into MY_USER_DEFINED_VARIABLE (by using vars.putObject("MY_USER_DEFINED_VARIABLE",localBSFVariable))
Using vars.put or props.put will help, as explained in detail in this JMeter thread.
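For instance, a sketch of the assertion script in JavaScript, assuming the Regular Expression Extractor stored its match in the JMeter variable surveyRequired:
// read the variable set by the Regular Expression Extractor (JMeter variables are strings)
var surveyRequired = vars.get("surveyRequired");
// properties (unlike variables) are visible to the While Controller condition
if (surveyRequired == "false") {
    props.put("keepLooping", "true");
} else {
    props.put("keepLooping", "false");
}
log.info("keepLooping is now " + props.get("keepLooping"));
The While Controller condition can then stay ${__property(keepLooping)}.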

Jenkins - passing variables between jobs?

I have two jobs in jenkins, both of which need the same parameter.
How can I run the first job with a parameter so that when it triggers the second job, the same parameter is used?
You can use the Parameterized Trigger Plugin, which will let you pass parameters from one task to another.
You also need to add the parameter you passed from upstream to the downstream job.
1. Post-Build Actions > select "Trigger parameterized build on other projects"
2. Enter the environment variable with its value. The value can also be a Jenkins build parameter.
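For example, the "Predefined parameters" field of that trigger might contain a line like this (the variable name is made up):
MY_PARAM=$MY_PARAM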
Detailed steps can be seen here:
https://itisatechiesworld.wordpress.com/jenkins-related-articles/jenkins-configuration/jenkins-passing-a-parameter-from-one-job-to-another/
Hope it's helpful :)
The accepted answer here does not work for my use case. I needed to be able to dynamically create parameters in one job and pass them into another. As Mark McKenna mentions, there is seemingly no way to export a variable from a shell build step to the post-build actions.
I achieved a workaround using the Parameterized Trigger Plugin by writing the values to a file and using that file as the parameters to import, via 'Add post-build action' -> 'Trigger parameterized build...' and then selecting 'Add Parameters' -> 'Parameters from properties file'.
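A sketch of that workaround (the file and variable names are assumptions): in an 'Execute shell' build step, write the dynamic values to a properties file,
echo "MY_DYNAMIC_PARAM=$(date +%Y%m%d%H%M)" > params.properties
and then point 'Parameters from properties file' at params.properties in the trigger configuration.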
I think the answer above needs some updating:
I was trying to create a dynamic directory to store my upstream build artifacts, so I wanted to pass my upstream job's build number to the downstream job. I tried the above steps but couldn't make it work. Here is how it worked:
I copied the artifacts from my current job using the Copy Artifact plugin.
In the post-build action of the upstream job I added the variable "SOURCE_BUILD_NUMBER=${BUILD_NUMBER}" and configured it to trigger the downstream job.
Everything worked except that my downstream job was not able to read $SOURCE_BUILD_NUMBER to create the directory.
So I found out that to use this variable I had to define the same variable in the downstream job as a parameter variable, like in the picture below:
This is because newer versions of Jenkins require you to define the variable in the downstream job as well. I hope this helps.
(for fellow googlers)
If you are building a serious pipeline with the Build Flow Plugin, you can pass parameters between jobs with the DSL like this:
Supposing an available string parameter "CVS_TAG", in order to pass it to other jobs:
build("pipeline_begin", CVS_TAG: params['CVS_TAG'])
parallel (
// will be scheduled in parallel.
{ build("pipeline_static_analysis", CVS_TAG: params['CVS_TAG']) },
{ build("pipeline_nonreg", CVS_TAG: params['CVS_TAG']) }
)
// will be triggered after previous jobs complete
build("pipeline_end", CVS_TAG: params['CVS_TAG'])
Hint for displaying available variables / params:
// output values
out.println '------------------------------------'
out.println 'Triggered Parameters Map:'
out.println params
out.println '------------------------------------'
out.println 'Build Object Properties:'
build.properties.each { out.println "$it.key -> $it.value" }
out.println '------------------------------------'
Just adding my answer in addition to Nigel Kirby's, as I can't comment yet:
In order to pass a dynamically created parameter, you can also export the variable in the 'Execute Shell' tile and then pass it through 'Trigger parameterized build on other projects' => 'Predefined parameters' => 'YOUR_VAR=$YOUR_VAR'. My team uses this feature to pass the npm package version from the build job to deployment jobs.
UPDATE: the above only works for Jenkins-injected parameters; a parameter created from shell still needs to use the same method, e.g. echo YOUR_VAR=${YOUR_VAR} > variable.properties, and pass that file downstream.
I faced the same issue when I had to pass a pom version to a downstream Rundeck job.
What I did was use parameter injection via a properties file, as follows:
1) Create the properties in a properties file via shell:
Build actions:
Execute a shell script
Inject environment variables
E.g.: properties definition
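A sketch of such a shell step (the property name follows this answer; the Maven invocation assumes a recent maven-help-plugin):
echo "POM_VERSION=$(mvn -q help:evaluate -Dexpression=project.version -DforceStdout)" > version.properties
The 'Inject environment variables' step then reads version.properties.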
2) Pass the defined properties to the downstream job:
Post-Build Actions:
Trigger parameterized build on other projects
Add parameters: Current build parameters
Add parameters: Predefined parameters
E.g.: properties sending
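For example, the 'Predefined parameters' box could then simply forward the injected value:
POM_VERSION=$POM_VERSION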
3) It was then possible to use $POM_VERSION as such in the downstream Rundeck job.
/!\ Jenkins version: 1.636
/!\ For some reason, when creating the triggered build, it was necessary to add the option 'Current build parameters' for the properties to be passed.
Reading through the answers, I don't see another option that I like, so I'll offer it as well. I love the parameterization of jobs, but it doesn't always scale well. If you have jobs which are not directly downstream of the first job but farther down the pipeline, you don't really want to parameterize every job in the pipeline just to pass the parameters all the way through. And if you have a large number of parameters used by a variety of other jobs (especially ones not necessarily tied to one parent or master job), parameterization again doesn't work.
In these cases, I favor outputting the values to a properties file and then injecting that in whatever job I need using the EnvInject plugin. This can be done dynamically, which is another way to solve the issue from another answer above where parameterized jobs were still used. This solution scales very well in many scenarios.
This can be done via a Groovy function:
Upstream Jenkinsfile - the param CREDENTIALS_ID is passed downstream:
pipeline {
    agent any
    stages {
        stage('trigger downstream') {
            steps {
                build job: "my_downsteam_job_name",
                    parameters: [string(name: 'CREDENTIALS_ID', value: 'other_credentials_id')]
            }
        }
    }
}
Downstream Jenkinsfile - if the param CREDENTIALS_ID is not passed from upstream, the function returns a default value:
def getCredentialsId() {
    if (params.CREDENTIALS_ID) {
        return params.CREDENTIALS_ID
    } else {
        return "default_credentials_id"
    }
}
pipeline {
    environment {
        TEST_PASSWORD = credentials("${getCredentialsId()}")
    }
}
You can use Hudson Groovy builder to do this.
First Job in pipeline
Second job in pipeline
I figured it out!
After almost two hours of trial and error, this WORKS and is what you do to pass variables to a remote job:
def handle = triggerRemoteJob(remoteJenkinsName: 'remoteJenkins', job: 'RemoteJob', parameters: "param1=${env.PARAM1}\nparam2=${env.param2}")
Use \n to separate two parameters, with no spaces.
As opposed to
parameters: '''someparams'''
we use
parameters: "someparams"
The " ... " is what gets us the values of the desired variables (these are double quotes, not two single quotes).
The ''' ... ''' or ' ... ' forms will not get us those values (three single quotes or plain single quotes).
All parameters here are defined in the environment{} block at the start of the pipeline and are modified in stages > steps > script wherever necessary.
I also tested and found that when you use " ... " you cannot use something like ''' ... "..." ''' or "... '..' ..." or any combination of them.
The catch here is that when you are using "..." in the parameters section, you cannot pass a literal string parameter; for example, this WILL NOT WORK:
def handle = triggerRemoteJob(remoteJenkinsName: 'remoteJenkins', job: 'RemoteJob', parameters: "param1=${env.PARAM1}\nparam2='param2'")
If you want to pass something like the one above, you will need to set an environment variable param2='param2' and then use ${env.param2} in the parameters section of the remote trigger step.
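As a sketch, using the same made-up names as above:
pipeline {
    agent any
    environment {
        PARAM1 = 'first-value'
        param2 = 'param2'
    }
    stages {
        stage('trigger remote') {
            steps {
                script {
                    // both values now come from the environment{} block
                    def handle = triggerRemoteJob(remoteJenkinsName: 'remoteJenkins', job: 'RemoteJob',
                        parameters: "param1=${env.PARAM1}\nparam2=${env.param2}")
                }
            }
        }
    }
}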
You can also make a job write to a properties file somewhere and have another job read it. One way to do that is to inject variables via the EnvInject plugin.
