Packaging/Hosting Selenium scripts - selenium-rc

A test script of mine needs to be run periodically by the "operations" team. My script uses the following components -
1. TestNG
2. Excel (for the input specifications)
3. Selenium RC, of course.
It currently runs in Eclipse.
Is there a way I can package and host it in a web-accessible location, ideally one that folks in operations can click on to run it and review results?
Thanks.

I ended up writing an Ant script to control execution.
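The Ant route can work well here. A minimal sketch of such a build file, assuming the TestNG, Selenium, and POI jars sit in lib/ and the compiled test classes in bin/ (all paths and names are illustrative):

```xml
<project name="selenium-tests" default="test">
  <!-- classpath with the TestNG/Selenium/POI jars plus the compiled tests -->
  <path id="test.classpath">
    <fileset dir="lib" includes="*.jar"/>
    <pathelement location="bin"/>
  </path>

  <!-- makes the <testng> task available to Ant -->
  <taskdef resource="testngtasks" classpathref="test.classpath"/>

  <target name="test">
    <testng classpathref="test.classpath" outputdir="test-output">
      <xmlfileset dir="." includes="testng.xml"/>
    </testng>
  </target>
</project>
```

TestNG writes an HTML report into test-output/, which can be copied to a shared web server so the operations team can review results in a browser.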

Related

How to package & share Selenium scripts with non-technical team?

I have a few python/selenium scripts that automate tedious work in browser-based apps (e.g., data entry from a CSV file).
I would like to share the scripts with my team members who are tech-savvy but don't know how to code.
How can I package up the scripts with all the dependencies so that "it just works" when downloaded?
Dependencies are:
Install python and add to PATH
Install selenium, pandas, autopylogger
Install selenium chromedrivers
...and a few other simple changes to variables/file paths in the scripts
My first instinct was to dockerize the environment, but because the automation needs to use the user's Chrome profile and Docker runs on WSL, I imagine that would cause issues. And setting up WSL + Docker Desktop isn't exactly simplifying things.
Jenkins, Replit, Maven, and other hosted options would have a similar issue using the user's Chrome profile (I think?)
I'm currently thinking I'll write a .BAT script that sets up the environment, but I haven't done that before and am hoping there's an easier way.
I would say:
1) Create a Jenkins job.
In Selenium, you can pass a Chrome user profile as an argument. Your team members can save their Chrome user profile from the Chrome browser and pass it as an argument through Jenkins.
Below is sample code showing how to pass a custom Chrome profile:
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.chrome.service import Service

options = Options()
# point Chrome at an existing user profile
options.add_argument(r"user-data-dir=C:\Users\AtechM_03\AppData\Local\Google\Chrome\User Data\Profile 2")

# Selenium 4 style; in Selenium 3 use executable_path=... and chrome_options=... instead
service = Service(r"C:\path\to\chromedriver.exe")
driver = webdriver.Chrome(service=service, options=options)
driver.get("https://www.google.co.in")
Let them store their profile in an accessible path, where it can be passed as an argument to Jenkins.
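As an alternative to a .BAT bootstrap, the dependency checks the asker describes could be sketched in Python itself (a sketch only; the package names come from the question, and it assumes pip is available via the running interpreter):

```python
import importlib.util
import subprocess
import sys

REQUIRED = ["selenium", "pandas", "autopylogger"]

def missing_packages(packages):
    """Return the subset of packages that cannot be imported."""
    return [p for p in packages if importlib.util.find_spec(p) is None]

def bootstrap(packages, install=False):
    """Report missing packages; optionally try to pip-install them."""
    missing = missing_packages(packages)
    if install and missing:
        subprocess.check_call([sys.executable, "-m", "pip", "install", *missing])
    return missing

if __name__ == "__main__":
    print("missing:", bootstrap(REQUIRED))
```

A one-line launcher (python bootstrap.py) is then all the non-coders need to run, and the same file can hold the variable/file-path tweaks mentioned in the question.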

HP-UFT 12.02: How to set list of test to run without ALM managing tool

I have UFT 12.02. A number of tests have already been created. Now I am trying to run all these tests, one after another, without using ALM. Is it possible? If yes, then how?
Check out the Test Batch Runner.

Running Powershell Script

We have a third-party tool that is run via a PowerShell script; once it has run, it returns results back via the script (as well as XML files).
My question is: is there any way within SonarQube to run a PowerShell script (or the Windows command line), or would I have to write a brand new plugin?
Russell
You don't mention what the 3rd party tool is, or what kind of results it yields. So at a guess: yes, you'll have to write a new plugin to read the results generated by this tool and import them as part of your analysis.
Notice, I don't mention firing the tool. That can be done, but in general plugins simply import tool reports, and because plugin development is done in Java, it is likely to be simpler to include kicking the tool off as part of your build script/process. However, if you feel you really must fire it from your plugin, the StyleCop plugin may point you in the right direction. But while the StyleCop plugin fires StyleCop directly, you'll be firing a script to fire your tool. Or maybe you'll be incorporating your script logic into your plugin...? (Probably fewer headaches in the long run that way.)

Deploying/Re-Deploying SAS job in DIS via Script

Is there a way to deploy or redeploy a SAS Data Integration Studio job via a shell script?
Also, is there a way to create its SPK file via script?
You can deploy DI jobs from the command line; see here:
http://support.sas.com/documentation/cdl/en/etlug/65807/HTML/default/viewer.htm#p1jxhqhaz10gj2n1pyr0hbzozv2f.htm
I have imported and exported objects into SAS DIS via shell script using the SAS ExportPackage utility. I personally find it much more convenient than the window method. However, for it to work you need an X Windows environment; I used Xming for it.
As for deploying jobs, I never tried it.
To redeploy jobs, DI Studio versions 4.901 and higher have a DeployJobs tool which is designed to perform this function: read more in the SAS documentation. It is available on the server. Older versions had a similar but much more restrictive client tool using Ant.
Also see Paper 1067-2017, An Introduction to the Improved SAS® Data Integration Studio Batch Deployment Utility on UNIX by Jeff Dyson, The Financial Risk Group, which gives a run-through on how to use it.

Jenkins Timeout because of long script execution

I have some issues with Jenkins running a PowerShell script. Long story short: the script takes 8x longer to execute through Jenkins than when run manually on the server (slave), where it takes just a few minutes.
I'm wondering why.
The script contains functions which invoke commands like & msbuild.exe or & svn commit. I found out that the script hangs at the lines where the aforementioned commands are executed. The result is that Jenkins times out because the script takes that long. I could raise the timeout threshold in the Jenkins job configuration, but I don't think that is the solution to the problem.
There are no error outputs or any information on why it takes that long, and I have no further ideas about the reason. Maybe one of you could tell me how Jenkins internally invokes those commands.
This is what Jenkins does (Windows batch plugin):
powershell -File %WORKSPACE%\ScriptHead\DeployOrRelease.ps1
I created my own PowerShell CI service before I found that Jenkins supports its own such plugin. But in my implementation and in my current job configs we follow a step-segregation rule: more, smaller steps are better. I found that my CI service works better when it is separated into different steps (in case of error it is also a lot easier to do root-cause analysis). The single responsibility principle is helpful here too. So, as in Jenkins, we have pre-, post-, build, and email steps as separate scripts.
About msbuild.exe: as far as I remember, in my case there were issues related to operations on file-system paths. When the script was divided into different functions we had better performance (additional checks of params).
Use the "divide and conquer" technique. You have two choices: modify your script so that it displays what it is doing and how long every step takes, or make smaller scripts that perform actions like:
get the code source,
compile/build the application,
run the test,
create a package,
send the package,
archive the logs
send notification.
The most problematic is usually the first step: getting the source code from Git, SVN, Mercurial, or whatever you have as a version control system. Make sure this step is not embedded in your script.
During the job run, Jenkins captures the output and uses AJAX to display the result in your browser. In the script, make sure you flush standard output after every step or every few steps. Some languages buffer standard output, so otherwise you see the results only at the end.
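The flushing advice looks like this in Python (the PowerShell script in question would use its own output cmdlets; this is only a sketch of the pattern, with hypothetical step names):

```python
import time

def run_step(name, action):
    """Run one build step, printing progress with an explicit flush
    so the CI console shows output as it happens, not at the end."""
    print(f"[step] {name} ...", flush=True)
    start = time.time()
    action()
    print(f"[step] {name} done in {time.time() - start:.1f}s", flush=True)

run_step("build", lambda: time.sleep(0.1))
run_step("test", lambda: None)
```

The same per-step progress lines also make it obvious, straight from the Jenkins console, which step is the slow one.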
You can also create log files, which are helpful for archiving and verifying the activity status of older runs. From my experience, using Jenkins with more than 10 steps requires a specialized application that can run multiple steps, like Robot Framework.
