Running load tests via Jenkins on a slave EC2 instance that starts and stops with the build - maven

Ideally, we'd like to run load tests on an EC2 Jenkins slave that starts and stops with our build.
Are there any tools out there (without writing our own plugins) that currently solve this?
I've come across this, but it seems to only be triggered based on the load of Jenkins in general, and not tied to a build.
This configuration is environment-specific, not project-specific, so I'd prefer to keep it maintained within Jenkins rather than within Maven and the project itself, although I'm open to suggestions in that realm.

You can check out the WebLOAD Jenkins plugin; it runs RadView's WebLOAD load testing tool, triggered by Jenkins. WebLOAD itself can launch EC2 cloud machines as needed, if that's what you need.
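For what it's worth, the Amazon EC2 plugin can also cover roughly this without writing your own plugin: you configure a cloud with an AMI and a label, Jenkins provisions an instance whenever a build requests that label, and the plugin terminates the instance after its configured idle time. A minimal declarative-pipeline sketch, where the label name and the Maven load-test goal are only placeholder assumptions:

```groovy
// Jenkinsfile sketch: 'ec2-load-test' is assumed to be a label bound to an
// Amazon EC2 plugin cloud, so requesting it starts an instance on demand and
// the plugin terminates it again after the configured idle timeout.
pipeline {
    agent { label 'ec2-load-test' }   // hypothetical label managed by the EC2 plugin
    stages {
        stage('Load tests') {
            steps {
                // placeholder Maven invocation; substitute your own load-test goal/profile
                sh 'mvn -B verify -Pload-tests'
            }
        }
    }
    post {
        always {
            // keep whatever reports the load-test run produces
            archiveArtifacts artifacts: 'target/load-test-reports/**', allowEmptyArchive: true
        }
    }
}
```

Because the environment-specific part (AMI, instance type, idle timeout) lives in the Jenkins cloud configuration rather than in the project, this keeps the setup maintained in Jenkins, which seems to be what you're after.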

Related

Jenkins + Docker Compose + Integration Tests

I have a crazy idea to run integration tests (xUnit in .NET) in the Jenkins pipeline using Docker Compose. The goal is to create the testing environment ad hoc and run the integration tests from Jenkins (and Visual Studio) without using databases etc. on a physical server. In my previous project there were occasionally cases where one build overwrote the test data of another build, and I would like to avoid that.
The plan is the following:
Add a Dockerfile for each test project
Add references in the docker-compose file (with creation of the DBs in Docker)
Add a step in Jenkins that will run the integration tests
I don't have much experience with containerization, so I cannot predict what problems might appear.
The questions are:
Does it make any sense?
Is it possible?
Can it be done more simply?
I suppose the Visual Studio test runner won't be able to get results from the Docker images. Am I right?
It looks like development of the tests will be more difficult, because the tests will run in Docker. Am I right?
Thanks for all your suggestions.
Depends very much on the details. In a small project - no; in a big project with multiple microservices and many devs - sure.
Absolutely. Anything that can be done with shell commands can be automated with Jenkins.
Yes, just have a test DB running somewhere. Or just run it locally with a simple script. Automation and containerization are the opposite of simple; you would only do it if the overhead is worth it in the long run.
Normally it wouldn't even run on the same machine, so that could be tricky. I am no VS Code expert, though.
The goal of containers is to make things simpler because the environment does not change, but they add configuration overhead. Most days it shouldn't make a difference, but whenever you make a big change it will cost some time.
I'd say running Jenkins on your local machine is rarely worth it; you could just use Docker locally with scripts (bash or WSL).
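To make the Jenkins step concrete, here is a rough sketch of what that stage could look like, very much in the "anything you can do with shell commands" spirit above. The compose file name, the 'tests' service, and the per-build project name are assumptions for illustration, not something from the question:

```groovy
// Jenkinsfile sketch: ad-hoc test environment via docker-compose.
// docker-compose.test.yml is a hypothetical compose file that defines the
// databases plus a 'tests' service built from the test project's Dockerfile.
pipeline {
    agent any
    stages {
        stage('Integration tests') {
            steps {
                // -p gives each build its own compose project, so parallel
                // builds get separate containers and separate test data
                sh 'docker-compose -p it-$BUILD_NUMBER -f docker-compose.test.yml build'
                sh 'docker-compose -p it-$BUILD_NUMBER -f docker-compose.test.yml run --rm tests'
            }
        }
    }
    post {
        always {
            // always tear down containers and volumes, even when tests fail
            sh 'docker-compose -p it-$BUILD_NUMBER -f docker-compose.test.yml down -v'
        }
    }
}
```

The per-build project name (-p) is what addresses the "two builds overwrite each other's test data" problem: each run talks only to its own throwaway database containers.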

Run newman on the local build instead of deploying to a test environment using TeamCity

I am looking to be able to run my Postman scripts using Newman during a TeamCity build.
Instead of deploying the build to a test environment, I'd like to run the Postman scripts against that particular build, so it isn't deployed to an environment used by other developers, which could potentially break it.
My current build chain in TeamCity is:
Build main project (contains the REST Api and all required code)
Run Postman scripts using Newman on that project
I have the collection and environment file, along with the CLI command to call it. When I try to point the environment at a local build, it does not work.
I am thinking of running an IIS Express server on the agent and then, with that active port, running the tests, but I have been unsuccessful.
Any ideas on how to approach this would be appreciated!
I have looked at "How do I integrate my Postman Integration Tests with TeamCity", but this uses a test environment, which is not what I am after.
I looked at https://ie.com.au/a-how-set-up-automated-api-testing and this was helpful, but I think it is still reliant on setting up a test environment.
TeamCity isn't really equipped to handle what you are trying to do. You are trying to run API tests against a build; in order to do that, you'll need an environment. You need something to run your project in order to query against it.
The only potential path you might try is containerizing your project, in Docker or something similar, then running your image after it's built and querying against that. However, this isn't a great practice and bloats the build time.
A good practice would be: build your project > deploy it to a test environment - set up a separate 'test' or 'dev' environment that it is OK to break > after the deploy, trigger a service to run your tests against 'dev'.

Is it possible to download and configure jenkins with a script?

I want to set up continuous integration with one or more scripts, locally at first and then on a server.
For that I need Jenkins. I installed Jenkins in a Docker container, but would it be possible to configure it with a script so that the configuration can be used on any computer that runs it? When I talk about configuration, I'm talking about Jenkins jobs and plugins.
You can use a configuration management tool like Chef or Ansible to install and configure Jenkins in an automated way. If using Chef, you can use the community cookbook. If you are only looking to create jobs in an automated way, check this thread. Similarly, you can write a Groovy script to install plugins as well.
Also take a look at this article.
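For the plugin half specifically, a minimal sketch of such a Groovy script (dropped into init.groovy.d in the Jenkins home, or pasted into the script console) might look like this. The plugin list here is only an example, and some plugins will still need a restart to activate:

```groovy
// init.groovy.d sketch: install a fixed list of plugins on Jenkins startup.
import jenkins.model.Jenkins

def instance = Jenkins.instance
def pluginManager = instance.pluginManager
def updateCenter = instance.updateCenter

// refresh update-center metadata so the plugin lookups below succeed
updateCenter.updateAllSites()

// example plugin short names; replace with whatever your jobs need
['git', 'workflow-aggregator', 'copyartifact'].each { shortName ->
    if (pluginManager.getPlugin(shortName) == null) {
        def plugin = updateCenter.getPlugin(shortName)
        if (plugin != null) {
            // deploy() returns a Future; get() waits for the install to finish
            plugin.deploy().get()
        }
    }
}
instance.save()
```

Because the script lives in a file, you can bake it into your Jenkins Docker image (or mount it into /usr/share/jenkins/ref/init.groovy.d), so any machine that runs the container ends up with the same plugins.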

Using MongoDB (in a container?) in Visual Studio Team Services pipelines

I have a node.js server that communicates with a MongoDB database. As part of the continuous-integration process I'd like to spin up a MongoDB database and run my tests against the server + DB.
With bitbucket pipelines I can spin up a container that has both node.js and MongoDB. I then run my tests against this setup.
What would be the best way to achieve this with Visual Studio Team Services? Some options that come to mind:
1) Hosted pipelines seem easiest but they don't have MongoDB on them. I could use Tool Installers, but there's no mention of a MongoDB installer, and in fact I don't see any tool installer in my list of available tasks. Also, it is mentioned that there is no admin access to the hosted pipeline machines and I believe MongoDB requires admin access. Lastly, downloading and installing Mongo takes quite a bit of time.
2) Set up my own private pipeline - i.e. a VM with Node + Mongo, and install the pipeline agent on it. Do I have to spin up a dedicated Azure instance for this? Will this instance be torn down and set up again on each test run, or will it remain up between test runs (meaning I have to take extra care to clean it up)?
3) Magically use a container in the pipeline through an option that I haven't yet discovered...?
I'd really like to use a container to run my tests because then I can use the same container locally during the development process, rather than having to maintain multiple environments. Can this be done?
So as it turns out, VSTS now has Docker support in its pipeline (when I wrote my question it was in beta and I didn't find it for whatever reason). It can be found at https://marketplace.visualstudio.com/items?itemName=ms-vscs-rm.docker.
This task allows you to spin up a container of your choice and run a single command in it. If this command is to run synchronously as part of the pipeline, then Run in Background needs to be unchecked (this will be the case for regular build commands, I guess). I ended up pushing a build script into my git repository and running it in a container.
And re. my question in (2) above - machines in private pipelines aren't cleaned up between pipeline runs.

Jenkins - Build, Deploy and Promote

Recently, I started learning how to use Jenkins CI, so I am a little bit of a noob at Jenkins. I am about to try to do the following:
I have set up a Maven multi-module job on Jenkins, which builds, tests, and finally creates 4 separate WAR applications. I archive the WAR artifacts as part of this job. These WAR files will only ever be built once; they contain properties for multiple environments, and the WAR file, along with each environment's server, will manage the profile it runs in, e.g. dev, test, staging, prod, etc.
I have another job on jenkins which will deal with the deployment to multiple environments.
This second job, uses the copy artifact plugin, and uses a post build action to deploy to a dev environment.
The job in step 2 will hopefully be able to have multiple promotions, allowing deployment to multiple environments: test/staging/performance/production etc.
I have searched Stack Overflow and Google, and all the posts I see use the parameterized build plugin, specifying a parameter for the environment. This means there is a separate build for each environment, which I don't like.
Can anyone tell me if this is the right way to go? Or direct me to some tutorial on how to do this properly.
Looks like what you need is a matrix-project build.
P.S.
A good introduction to Jenkins can be found in Jenkins: The Definitive Guide.
After playing around with the Jenkins configuration, I have this working very nicely now.
In the deployment job, I had initially missed the "Add another promotion process" button, which allows me to promote the same build to multiple environments, manually or automatically.
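For reference, if you ever move from freestyle jobs plus the promoted-builds plugin to a Pipeline job, the same build-once/promote-many idea can be sketched roughly like this. The upstream job name, deploy script, and environment names are placeholders, and each further environment just repeats the promotion stage:

```groovy
// Deployment pipeline sketch: copy the WARs built once by the upstream Maven
// job, deploy to dev automatically, then gate each further environment behind
// a manual promotion step.
pipeline {
    agent any
    stages {
        stage('Fetch artifacts') {
            steps {
                // 'maven-multi-module-build' is a placeholder upstream job name
                copyArtifacts projectName: 'maven-multi-module-build',
                              selector: lastSuccessful(),
                              filter: '**/*.war'
            }
        }
        stage('Deploy to dev') {
            steps {
                sh './deploy.sh dev'   // placeholder deploy script
            }
        }
        stage('Promote to test') {
            steps {
                // pauses the run until someone approves the promotion
                input message: 'Promote this build to test?'
                sh './deploy.sh test'
            }
        }
    }
}
```

Either way, the WARs are built exactly once and only the target environment changes at promotion time, which avoids the separate-build-per-environment pattern you wanted to get away from.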
