Sync multiple flows with ALM - ipaf

I have set up a test with multiple flows, i.e. Flow1, Flow2, Flow3, but when I run it, only one flow is picked up.
How can we sync multiple flows with ALM?

Create a flowmap.properties file in the src folder and add the flow IDs and the paths of the flow.xml files that need to be synced.
e.g.:
flow1=./sample_xml/flow.xml
flow2=./sample_xml/flow.xml
In the run configuration, under ALM sync → Arguments, pass:
createTestCase flow_map flowmap (the properties file name).
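For reference, the mapping file follows standard Java properties syntax, so the sync tool can be expected to read it much like the sketch below (FlowMapDemo and parseFlowMap are illustrative names, not part of ipaf):

```java
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.Map;
import java.util.Properties;
import java.util.TreeMap;

public class FlowMapDemo {
    // Parse flowmap.properties-style content into an ordered flowId -> path map.
    static Map<String, String> parseFlowMap(String content) {
        Properties props = new Properties();
        try {
            props.load(new StringReader(content));
        } catch (IOException e) {
            throw new UncheckedIOException(e); // cannot happen for an in-memory reader
        }
        Map<String, String> flows = new TreeMap<>();
        for (String flowId : props.stringPropertyNames()) {
            flows.put(flowId, props.getProperty(flowId));
        }
        return flows;
    }

    public static void main(String[] args) {
        // Same shape as the flowmap.properties example above
        String flowMap = "flow1=./sample_xml/flow1.xml\nflow2=./sample_xml/flow2.xml\n";
        for (Map.Entry<String, String> e : parseFlowMap(flowMap).entrySet()) {
            System.out.println(e.getKey() + " -> " + e.getValue());
        }
    }
}
```

Each key on the left of the `=` is a flow ID, and each value is the path of the flow.xml to sync for that ID.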


Dropins folder in WebSphere has more than one WAR file: how to generate separate log files for each WAR

When I run my server, which is on WebSphere Liberty, it generates a log file at "C:\Softwares\wlp\usr\servers\theory\logs\messages.logs". There is only theory.war in C:\Softwares\wlp\usr\servers\theory\dropins.
When I place more than one WAR file there, such as theory.war, epic.war, and success.war, and start the server, it still generates a single log file at C:\Softwares\wlp\usr\servers\theory\logs\messages.logs.
I need to generate a separate log file for each WAR, like:
C:\Softwares\wlp\usr\servers\theory\logs\theory.logs
C:\Softwares\wlp\usr\servers\theory\logs\epic.logs
C:\Softwares\wlp\usr\servers\theory\logs\success.logs
I'm using the java.util.logging package.
If you want separate log files, you have to either write a custom FileHandler for JUL or use a third-party logging framework such as Log4j.
You could also use binary logging and then use the binaryLog tool to filter messages for a given app.
However, I'd STRONGLY recommend NOT doing any of that: new apps that follow the twelve-factor principles should write all messages to the standard out and standard error streams, which is what JUL does by default. Those streams are consumed by containers, container orchestrators, and the logging infrastructure built on top of them. That infrastructure lets you filter log messages on various parameters without needing separate log files, whereas separate log files would have to be integrated manually.
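If you do go the custom-handler route, a minimal sketch could look like this: a JUL Handler that routes each record to a per-application file keyed by the first segment of the logger name (PerAppFileHandler is an illustrative name, not a Liberty API, and wiring it into Liberty's logging configuration is a separate step):

```java
import java.io.IOException;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.logging.FileHandler;
import java.util.logging.Handler;
import java.util.logging.LogRecord;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;

// Routes each record to a per-application log file, keyed by the first
// segment of the logger name (e.g. loggers named "theory.*" go to theory.log).
public class PerAppFileHandler extends Handler {
    private final Map<String, FileHandler> handlers = new ConcurrentHashMap<>();
    private final String logDir;

    public PerAppFileHandler(String logDir) {
        this.logDir = logDir;
    }

    @Override
    public void publish(LogRecord record) {
        String loggerName = record.getLoggerName();
        String app = (loggerName == null || loggerName.isEmpty())
                ? "default" : loggerName.split("\\.")[0];
        handlers.computeIfAbsent(app, a -> {
            try {
                FileHandler fh = new FileHandler(logDir + "/" + a + ".log", true);
                fh.setFormatter(new SimpleFormatter());
                return fh;
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
        }).publish(record);
    }

    @Override public void flush() { handlers.values().forEach(FileHandler::flush); }
    @Override public void close() { handlers.values().forEach(FileHandler::close); }

    public static void main(String[] args) {
        Logger.getLogger("").addHandler(new PerAppFileHandler("."));
        // Creates theory.log and epic.log in the current directory
        Logger.getLogger("theory.servlet").info("from theory.war");
        Logger.getLogger("epic.servlet").info("from epic.war");
    }
}
```

This assumes each WAR uses logger names prefixed with its own name; if they don't, you would need some other key (for example, a context-derived value) to decide which file a record belongs to.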

Config file content as checkbox list in Jenkins

I have a config file in Jenkins that contains a list of servers. When running a parameterized pipeline job, I would like to list the servers and choose which ones the code should be deployed to.
How can I write that in the "groovy template for pipeline" area as XML?
Config file content (file.txt):
Server1
Server2
Server3
A checkbox list should appear so that the servers can be selected.
Make a new pipeline project.
Enable the option 'This project is parameterized', then add a parameter that supports checkboxes. I used an Active Choices Reactive Parameter.
There the choice type can be set to Checkbox. Save it once.
Now view the project's XML via the job history; from there the XML can be copied and reused.
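For reference, the copied XML for such a parameter looks roughly like the fragment below. The element names come from the Active Choices (uno-choice) plugin but are simplified and version-dependent, so copy the XML from your own job rather than reusing this verbatim; the file path is a placeholder:

```xml
<org.biouno.unochoice.ChoiceParameter>
  <name>SERVERS</name>
  <description>Servers to deploy to</description>
  <choiceType>PT_CHECKBOX</choiceType>
  <script class="org.biouno.unochoice.model.GroovyScript">
    <secureScript>
      <!-- Groovy script that produces the checkbox entries -->
      <script>return new File('/path/to/file.txt').readLines()</script>
      <sandbox>true</sandbox>
    </secureScript>
  </script>
</org.biouno.unochoice.ChoiceParameter>
```

In the pipeline, the checked entries then typically arrive as a single comma-separated string in params.SERVERS.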

Configure Control-M jobs through the backend?

We have a number of jobs configured in Control-M, and we now have a new environment in which to configure the same set of jobs.
Is there any way to create Control-M jobs from the backend server by writing shell (or other) scripts?
Are there any other ways to avoid spending time re-creating the same jobs for each environment?
Control-M provides functionality to export/import jobs across environments: https://communities.bmc.com/docs/DOC-50416
There are a couple of ways:
From the Planning domain, load the job folders you want to migrate into a blank workspace.
From the top bar, click Export and save. This produces an .xml file that you can edit in a text editor and then load back via Import Workspace. You can import the .xml into the same Control-M if both environments are on one CTM EM server, or into a different CTM EM server.
OR
From the Planning domain, load the job folders you want to migrate, right-click the folder, and select Duplicate, then update the new folder through the GUI (if doing this, I'd unload the source folder afterwards just to ensure nothing is overwritten there). You can use the Find & Update option. This will only work if your environments are on the same CTM EM server.
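On the scripting question itself: newer Control-M versions also ship the Automation API, whose ctm CLI builds and deploys job definitions written as JSON, so the same file can be deployed to each environment from the command line. A minimal sketch, assuming the Automation API is available in your version (server, folder, and job names are placeholders):

```json
{
  "SampleFolder": {
    "Type": "Folder",
    "ControlmServer": "my-ctm-server",
    "SampleJob": {
      "Type": "Job:Command",
      "Command": "echo deployed",
      "RunAs": "ctmuser"
    }
  }
}
```

With a definition like this saved as jobs.json, `ctm build jobs.json` validates it and `ctm deploy jobs.json` creates or updates the jobs in the target environment.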

JMeter - Multiple users reading requests from a directory

I have a directory of XML files, and I want to configure JMeter so that multiple users (threads) read the XML files (SOAP requests) concurrently (round-robin or otherwise) and submit them to a web service endpoint. In other words, the threads should share the input files: if the directory contains 100 XML files, all of the configured threads should share the load and jointly process those 100 files, rather than each thread processing all 100 files independently.
Is there any way to test the above scenario?
Thanks,
Siva
I believe the Directory Listing Config plugin would be the easiest option to use.
Just provide the path to the directory where your XML files live and configure the plugin according to your test scenario; that's it. You will be able to use just one HTTP Request sampler and refer to the file name (or path) as $(unknown) where required.
You can install the Directory Listing Config plugin and keep it up to date via the JMeter Plugins Manager.

Specify multiple files for FileWatcher Autosys job

I would like to set up a FileWatcher job that waits for multiple signal files to be present before it kicks off. Is there any way to check for the presence of multiple files before the child job is triggered, or would I have to create multiple file-watcher jobs, one for each file?
File Trigger (R11) and File Watcher (legacy) jobs can only take one value in the watch_file attribute. R11 does allow wildcards, but that is probably not what you want. I would create a separate job for each signal file, put those jobs in a box, and run the downstream job on the success of the box.
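The box approach above can be written as JIL along these lines (job names, machine, and file paths are placeholders):

```
/* Box that succeeds only once every watcher inside it has succeeded */
insert_job: signal_box
job_type: BOX

/* One legacy file-watcher job per signal file */
insert_job: fw_signal_a
job_type: FW
box_name: signal_box
machine: myhost
watch_file: /data/signals/a.done

insert_job: fw_signal_b
job_type: FW
box_name: signal_box
machine: myhost
watch_file: /data/signals/b.done

/* Downstream job fires only when the whole box (all files) has succeeded */
insert_job: child_job
job_type: CMD
command: /opt/app/run.sh
machine: myhost
condition: success(signal_box)
```

Adding a watcher for a new signal file then only means adding one more FW job to the box; the downstream condition stays unchanged.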
