Store data in Selenium WebDriver - Maven

I'm looking for a way to store data in Selenium for use in future tests.
I'm using Jenkins, Maven + Selenium and TestNG.
How can I store some data? Let's say I want to run a test, get some data from a website (a weather forecast), store it somewhere, and the next day run a test to check whether the forecast matches today's weather.
I could store it in a txt file and parse it with a regex, but I'm sure there is a better way to do it.

You have to consider what "Selenium WebDriver" is in this context. It is a Java application: it "exists" only while it is running. Once the run stops, it is purged from memory, including all the data it held. If you are using JUnit or TestNG (as you specified), this data is purged even more frequently: after every test class.
To accomplish what you are asking, you will need something external to your tests. You can certainly utilize a txt file as you suggested. A spreadsheet might do for your purposes. Most applications utilize an entire database.
You also mentioned Jenkins. This will make the external storage a more interesting problem, as Jenkins often purges the current working directory before each run.
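To make that last point concrete, here is a minimal sketch of the plain-file approach in Java, using a properties file kept outside the workspace instead of a txt file parsed by regex; the class name, storage path, and key format are illustrative assumptions, not anything from the original question:

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Properties;

public class ForecastStore {
    // Kept in the user's home directory so a workspace wipe cannot delete it (illustrative path).
    private static final Path STORE =
            Paths.get(System.getProperty("user.home"), "forecast-history.properties");

    // Record the forecast scraped today, keyed by date (e.g. "2016-11-20").
    public static void save(String date, String forecast) throws IOException {
        Properties props = load();
        props.setProperty(date, forecast);
        try (OutputStream out = Files.newOutputStream(STORE)) {
            props.store(out, "scraped forecasts");
        }
    }

    // Read back the forecast stored on a previous run, or null if none exists.
    public static String read(String date) throws IOException {
        return load().getProperty(date);
    }

    private static Properties load() throws IOException {
        Properties props = new Properties();
        if (Files.exists(STORE)) {
            try (InputStream in = Files.newInputStream(STORE)) {
                props.load(in);
            }
        }
        return props;
    }
}

A TestNG test could call ForecastStore.save(...) at the end of today's run and ForecastStore.read(...) at the start of tomorrow's run to compare the stored forecast with the observed weather.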

Related

Which approach should I choose for testing my Spring Batch Job?

Currently I'm working on some integration tests for a Spring Batch application. The application reads from a SQL table, writes to another table and, at the end, generates a report as a .txt file.
Initially I thought of just keeping another file with the expected output, comparing it with the report file, and checking the table content.
(For some context, I'm not very experienced with Spring.)
But, after reading some articles on Baeldung, I'm having doubts about my initial methodology.
Should I manipulate the table content in my code to ensure that I have the expected input? Should I use the Spring Test framework tools? Am I able to run the job from my test without them?
The correct approach for batch job integration testing is to test the job as a black box. If the job reads data from a table and writes to another table or a file, you can proceed as follows:
Put some test data in the input table (Given)
Run your job (When)
Assert on the output table/file (Then)
You can find more details in the End-To-End Testing of Batch Jobs section of the reference documentation. Spring Batch provides some test utilities that might help in testing your jobs (like mocking batch domain objects, asserting on file content, etc). Please refer to the org.springframework.batch.test package.
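As a rough illustration of that Given/When/Then flow using the utilities from org.springframework.batch.test, here is a sketch of such a test; the configuration class, table name, and file paths are assumptions made up for the example:

import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;
import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.test.AssertFile;
import org.springframework.batch.test.JobLauncherTestUtils;
import org.springframework.batch.test.context.SpringBatchTest;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.io.FileSystemResource;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.test.context.junit.jupiter.SpringJUnitConfig;

@SpringBatchTest
@SpringJUnitConfig(ReportJobConfiguration.class) // hypothetical job configuration class
class ReportJobIntegrationTest {

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;

    @Autowired
    private JdbcTemplate jdbcTemplate;

    @Test
    void jobProducesExpectedReport() throws Exception {
        // Given: known rows in the input table (table and columns are made up)
        jdbcTemplate.update("INSERT INTO input_table (id, amount) VALUES (1, 100)");

        // When: run the whole job end to end
        JobExecution execution = jobLauncherTestUtils.launchJob();

        // Then: the job completed and the report matches a reference file
        assertEquals(ExitStatus.COMPLETED, execution.getExitStatus());
        AssertFile.assertFileEquals(
                new FileSystemResource("src/test/resources/expected-report.txt"),
                new FileSystemResource("target/report.txt"));
    }
}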

Distributed JMeter test fails with java error but test will run from JMeter UI (non-distributed)

My goal is to run a load test using 4 Azure servers as load generators and 1 Azure server to initiate the test and gather results. I had the distributed test running and I was getting good data. But today, when I remote-start the test, 3 of the 4 load generators fail, with all of the HTTP transactions erroring. The failed transactions log the following error:
Non HTTP response message: java.lang.ClassNotFoundException: org.apache.commons.logging.impl.Log4jFactory (Caused by java.lang.ClassNotFoundException: org.apache.commons.logging.impl.Log4jFactory)
I confirmed the presence of commons-logging-1.2.jar in the jmeter\lib folder on each machine.
To try to narrow down the issue I set up one Azure server to both initiate the load and run JMeter-server but this fails too. However, if I start the test from the JMeter UI on that same server the test runs OK. I think this rules out a problem in the script or a problem with the Azure machines talking to each other.
I also simplified my test plan down to where it only runs one simple http transaction and this still fails.
I've gone through all the basics: reinstalled jmeter, updated java to the latest version (1.8.0_111), updated the JAVA_HOME environment variable and backed out the most recent Microsoft Security update on the server. Any advice on how to pick this problem apart would be greatly appreciated.
I'm using JMeter 3.0r1743807 and Java 1.8
The Azure servers are running Windows Server 2008 R2
I did get a resolution to this problem. It turned out to be a conflict between some extraneous code in a jar file and a component of JMeter, and it was "spooky" because something influenced the load order of the referenced jar files and JMeter components.
I had included a jar file in my JMeter script using the "Add directory or jar to classpath" function in the Test Plan. This jar file had a piece of code I needed for my test along with many other components, and one of those components, probably a similar logging function, conflicted with a logging function in JMeter. The test had run fine for months but started failing at the most inconvenient time. The problem was revealed by creating a very simple JMeter test that would load and run just fine. If I opened the simple test in JMeter and then, without closing JMeter, opened my problem test, the problem test would not fail. If I reversed the order, opening the problem test followed by the simple test, then the simple test would fail too. Given that the problem followed the order in which things loaded, I started looking at the jar files and found my suspect.
When I built the script I had left the jar file alone, thinking that the functions I needed might have dependencies on other pieces within the jar. Once things broke I needed to find out whether that was true, and happily it is not. So, to fix the problem, I changed the extension on my jar file to zip and edited it in 7-Zip, removing all the code except what I needed. I kept all the folders in the path to my needed code for two reasons: I did not have to update the code that called the functions, and when I tried changing the path the functions did not work.
Next I changed the extension on the file back to jar and changed the reference in JMeter’s “Add directory or jar to classpath” function to point to the revised jar. I haven’t seen the failure since.
Many thanks to the folks who looked at this. I hope the resolution will help someone out.

Simple Local Database Solution for Ruby?

I'm attempting to write a simple Ruby/Nokogiri scraper to get event information from multiple pages and then output it to a CSV that is attached to an email sent out weekly.
I have completed the scraping components and the CSV component and it's working perfectly. However, I now realize that I need to know when new events are added, which means I need some sort of database. Ideally I would just store this locally.
I've dabbled a bit with using the ruby gem 'sequel', but the data does not seem to persist beyond the running of the program. Do I need to download some database software to work with 'sequel'? Also I'm not using the Rails framework, just Ruby.
Any and all guidance is deeply appreciated!
I'm guessing you did Sequel.sqlite, as in the first example in the Sequel README, which creates an in-memory SQLite database. To create a database in your filesystem instead of memory, just pass it a path, e.g.:
Sequel.sqlite("./my-database.db")
This is, of course, assuming that you have the sqlite3 gem installed. If the given file doesn't exist, it will be created.
This is covered in the Sequel docs.

Jenkins plugin auto import

Fairly new to the use of Jenkins, but I am looking for a way to get test results and feed them back into Jenkins.
What this means is, I have a bash script that collects metrics on a number of applications and checks whether the expected files exist. I collect this data in a plain text file, basically as counters: 1/5, 2/5, 5/10, etc.
The output can be in whatever format I want, but I was wondering if there is a good/clean process that can take this data file and display it nicely inside the Jenkins web interface?
We also use Trac, so if there is a Trac plugin that can do something similar, that would be good too.
Best practice would be to escape the values and pass them as parameters to a parameterized Jenkins build, or to save them to a file and have the build pick that file up. Parameters are finicky and subject to URL encoding, so I would personally pass a file via a shared filesystem such as S3. Your best bet is the Remote Access API: https://wiki.jenkins-ci.org/display/JENKINS/Remote+access+API
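If you go the parameterized-build route, here is a sketch of triggering such a build through the Remote Access API from Java; the Jenkins URL, job name, SUMMARY parameter, and credentials are all placeholder assumptions for illustration:

import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class PushMetricsToJenkins {
    public static void main(String[] args) throws IOException {
        String jenkinsUrl = "https://jenkins.example.com";  // placeholder server
        String jobName = "metrics-report";                  // hypothetical parameterized job
        String summary = "2/5";                             // a counter produced by the bash script

        // Pass the counter as a build parameter, URL-encoded as the answer warns.
        String query = "SUMMARY=" + URLEncoder.encode(summary, StandardCharsets.UTF_8.name());
        URL url = new URL(jenkinsUrl + "/job/" + jobName + "/buildWithParameters?" + query);

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        // Authenticate with a Jenkins user name and API token via basic auth.
        String token = Base64.getEncoder()
                .encodeToString("user:api-token".getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + token);
        conn.getOutputStream().close(); // empty POST body

        System.out.println("Jenkins responded with HTTP " + conn.getResponseCode());
        conn.disconnect();
    }
}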

Move records from Development to Production in CloudKit

I did something silly and made hundreds of records in the Development environment in CloudKit. A previous thread mentioned that the records could be downloaded into a file and re-uploaded to the Production environment. Is there any other way I could do this, and if not, how would I go about downloading the records and storing them in a file?
Thanks in advance!
There is no option to do it in one run. You need an app that is connected to the development environment for reading your records. If you then want to write to the production environment, you can only do that by re-signing your app. So indeed you first need to download all the data, store it somewhere, and then write it back to your production database.
Since CKRecord conforms to the NSCoding protocol, you can write the results of your query directly to a file using:
NSKeyedArchiver.archiveRootObject(records, toFile: filePath)
Then if you want to read that file you can use:
let result = NSKeyedUnarchiver.unarchiveObjectWithFile(filePath)
