Jenkins + Docker Compose + Integration Tests - visual-studio

I have a crazy idea to run integration tests (xUnit in .NET) in the Jenkins pipeline by using Docker Compose. The goal is to create the testing environment ad hoc and run the integration tests from Jenkins (and Visual Studio) without using DBs etc. on a physical server. In my previous project there were cases where one build overwrote the test data of another build, and I would like to avoid that.
The plan is the following:
Add a dockerfile for each test project
Add references in the docker-compose file (with creation of the DBs in Docker)
Add a step in Jenkins that will run the integration tests
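For illustration, a minimal sketch of the shell steps such a Jenkins stage could run; the compose file name and the test service name are assumptions about the project layout:

```bash
# Hypothetical commands for a Jenkins stage; docker-compose.tests.yml and the
# integration-tests service name are assumptions, not part of the question.
docker compose -f docker-compose.tests.yml build
docker compose -f docker-compose.tests.yml up \
  --abort-on-container-exit \
  --exit-code-from integration-tests     # stage exit code mirrors the test container
TESTS_EXIT=$?
docker compose -f docker-compose.tests.yml down -v   # remove containers and DB volumes
exit $TESTS_EXIT
```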
I don't have much experience with containerization, so I can't predict what problems might appear.
The questions are:
Does it make any sense?
Is it possible?
Can it be done simpler?
I suppose that the Visual Studio test runner won't be able to get results from the Docker containers. Am I right?
It looks like developing the tests will be more difficult, because the tests will run inside Docker. Am I right?
Thanks for all your suggestions.

Depends very much on the details. In a small project, no; in a big project with multiple microservices and many devs, sure.
Absolutely. Anything that can be done with shell commands can be automated with Jenkins.
Yes, just have a test DB running somewhere. Or just run it locally with a simple script. Automation and containerization are the opposite of simple; you would only do it if the overhead is worth it in the long run.
Normally it wouldn't even run on the same machine, so that could be tricky. I am no VS Code expert, though.
The goal of containers is to make it simpler because the environment does not change, but they add configuration overhead. Most days it shouldn't make a difference but whenever you make a big change it will cost some time.
I'd say running Jenkins on your local machine is rarely worth it; you could just use Docker locally with scripts (bash or WSL).
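For example, a minimal local script in the spirit of that suggestion, assuming a Postgres test DB and an xUnit test project; image tag, port, and project path are placeholders:

```bash
# Hypothetical local run without Jenkins: start only the database in Docker,
# run the tests from the host, then clean up. Names and ports are assumptions.
docker run -d --name it-postgres -p 5432:5432 -e POSTGRES_PASSWORD=test postgres:15
dotnet test tests/MyApp.IntegrationTests    # tests read localhost:5432
docker rm -f it-postgres
```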

Related

Should I use a Docker container as a GitLab Runner?

Newbie to GitLab CI/CD here.
I'd like to use a Linux base container as the Runner. The reason being I only have access to a Windows VM, and I'd rather not have to write PowerShell scripts to build/test/deploy. Also, the build uses Spring Boot + Maven, and I'm doubtful that the build scripts provided by Spring will run on Windows.
To complicate the problem further, the build done by Maven + Spring spins up a container to execute the build. In other words, my build would run a container in a container, which seems feasible based on this blog.
Any feedback is much appreciated.
Edit based on #JoSSte's feedback:
Are there any better approaches to setup a runner besides needing a container inside a container? For instance, if WSL enables running bash scripts, can the Windows VM act as the build server?
And to satisfy #JoSSte's request to make this question less opinion based, are there any best practices to approach a problem like this?
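One possible direction, sketched below: register a Docker-executor runner on the Windows VM (assuming Docker Desktop with the WSL2 backend is available), so jobs run inside Linux containers rather than PowerShell. The URL, token, and image are placeholders, and --docker-privileged is only needed if the Maven/Spring build really does start containers of its own:

```bash
# Hypothetical runner registration; all values are placeholders, not real credentials.
gitlab-runner register \
  --non-interactive \
  --url "https://gitlab.example.com/" \
  --registration-token "$RUNNER_TOKEN" \
  --executor docker \
  --docker-image "maven:3-eclipse-temurin-17" \
  --docker-privileged    # required for docker-in-docker style builds
```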

Use docker compose with compiling

I want to deploy a Maven application in a Docker container and, if possible, also test it with Docker, but I have some problems.
Because I am using Java, I need to compile my application before using it.
During compilation, unit tests are also run, and they need a database connection.
For testing I used a database container started by hand, running on localhost:5432.
If I start docker-compose now, this causes an error because the container can't reach localhost:5432 any more. If I write postgres:5432 in my application.properties, it does not compile because of the unknown host postgres.
How do I handle this? Is there a way to start one container with Maven and one with Postgres at build time?
As you can see, I am new to docker-compose and don't have a workflow yet.
Thanks for your help
You should use your existing desktop-oriented build process to build and test the application and only use Docker to build the final deployment artifact. If you are hard-coding the database location in your source code, there is lurking trouble there of exactly the sort you describe (what will you do if you have separate staging and production databases hosted by your cloud provider?) and you should make that configurable.
During the docker build phase there's no way to guarantee that any particular network environment, external services, or DNS names will be present, so you can't do things like run integration tests that depend on an external database. Fortunately, that's a problem the software engineering community spent a long time addressing in the decades before Docker existed. While many Docker setups are very enthusiastic about mounting application source code directly into containers, that's much less useful for compiled languages and not really appropriate for controlled production deployments.
In short: run Maven the same way you did before you had Docker, and then just have your Dockerfile COPY the resulting (fully-tested) .jar file into the image.
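A minimal sketch of that workflow, assuming a Spring Boot app whose datasource URL is configurable; the SPRING_DATASOURCE_URL variable, image tags, and names are assumptions about the project:

```bash
# Hypothetical build sequence: test against a hand-started DB, then only
# package the already-tested jar into the image.
docker run -d --name build-db -p 5432:5432 -e POSTGRES_PASSWORD=test postgres:15
SPRING_DATASOURCE_URL=jdbc:postgresql://localhost:5432/postgres mvn clean package
docker rm -f build-db
docker build -t myapp:latest .   # Dockerfile only COPYs the built jar
```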

Run newman on the local build instead of deploying to a test environment using TeamCity

I am looking to be able to run my postman scripts using newman during a TeamCity build.
Instead of deploying the build to a test environment, I'd like to run the postman scripts on that particular build, so it isn't deployed to an environment used by other developers which could potentially break it.
My current build chain in TeamCity is:
Build main project (contains the REST Api and all required code)
Run Postman scripts using Newman on that project
I have the collection and environment file, along with the CLI command to call it. When I try to point the environment at a local build, it does not work.
I am thinking of running an IIS Express server on the agent and then, with that active port, running the tests, but I have been unsuccessful so far.
Any ideas on how to approach this would be appreciated!
I have looked at How do I integrate my Postman Integration Tests with TeamCity and this uses a test environment, which is not what I am after.
I looked at https://ie.com.au/a-how-set-up-automated-api-testing and this was helpful, but I think it is still reliant on setting up a test environment.
TeamCity isn't really equipped to handle what you are trying to do. You are trying to run API tests against a build; in order to do that, you'll need an environment. You need something to run your project so you can query against it.
The only potential path you might try looking at is containerizing your project, in Docker or something similar, then running your image after it's built and querying against that. However, this isn't a great practice and bloats the build time.
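If you do go down that path, a rough sketch of the steps; the image name, port mapping, and collection/environment file names are assumptions:

```bash
# Hypothetical: build the API image, run it, point newman at it, tear it down.
docker build -t api-under-test .
docker run -d --name api-under-test -p 8080:80 api-under-test
newman run collection.json -e local-build.postman_environment.json
docker rm -f api-under-test
```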
A good practice would be: build your project > deploy it to a test environment - set up a separate 'test' or 'dev' environment that is OK being broken > after the deploy, trigger a service to run your tests against that 'dev' environment.

Using MongoDB (in a container?) in Visual Studio Team Services pipelines

I have a node.js server that communicates with a MongoDB database. As part of the continuous-integration process I'd like to spin up a MongoDB database and run my tests against the server + DB.
With bitbucket pipelines I can spin up a container that has both node.js and MongoDB. I then run my tests against this setup.
What would be the best way to achieve this with Visual Studio Team Services? Some options that come to mind:
1) Hosted pipelines seem easiest but they don't have MongoDB on them. I could use Tool Installers, but there's no mention of a MongoDB installer, and in fact I don't see any tool installer in my list of available tasks. Also, it is mentioned that there is no admin access to the hosted pipeline machines and I believe MongoDB requires admin access. Lastly, downloading and installing Mongo takes quite a bit of time.
2) Set up my own private pipeline - i.e. a VM with Node + Mongo, and install the pipeline agent on it. Do I have to spin up a dedicated Azure instance for this? Will this instance be torn down and set up again on each test run, or will it remain up between test runs (meaning I have to take extra care to clean it up)?
3) Magically use a container in the pipeline through an option that I haven't yet discovered...?
I'd really like to use a container to run my tests because then I can use the same container locally during the development process, rather than having to maintain multiple environments. Can this be done?
So as it turns out, VSTS now has Docker support in its pipeline (when I wrote my question it was in beta and I didn't find it for whatever reason). It can be found at https://marketplace.visualstudio.com/items?itemName=ms-vscs-rm.docker.
This command allows you to spin up a container of your choice and run a single command on it. If this command is to be synchronously run as part of the pipeline, then Run in Background needs to be unchecked (this will be the case for regular build commands, I guess). I ended up pushing a build script into my git repository and running it on a container.
And re. my question in (2) above - machines in private pipelines aren't cleaned up between pipeline runs.
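For reference, a minimal sketch of the kind of build/test script that approach implies: start MongoDB in a container, run the tests against it, clean up. The image tag, port, and MONGO_URL variable are assumptions about the project:

```bash
# Hypothetical CI/local test script; the app is expected to read MONGO_URL.
docker run -d --name ci-mongo -p 27017:27017 mongo:6
npm ci
MONGO_URL="mongodb://localhost:27017/test" npm test
docker rm -f ci-mongo
```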

How to automatically test configuration managing scripts?

I am using tools like Puppet/Chef/Ansible to set up and config development environments and production servers.
Whenever I update the configuration, I run the tool against my development environment and log in to check manually whether things work as expected.
But this is tedious to do, and I can't test everything every time, so is there any way I can automate the testing?
There are Infrastructure Testing Frameworks for this:
ServerSpec / InSpec - Ruby-based. Famous, big community, nice looking and best in its class.
BATS - Bash Automated Testing System, which is a bit easier (a short example follows after this list).
TestInfra - Python-based infra testing framework. Still pretty young, very small community. Intro.
Goss - Fast (written in Go), small tool for validating server/infra configuration. Test scenarios are written in yaml.
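To give a flavour of the BATS option mentioned above, a tiny hypothetical test file; the nginx package and port 80 are just example targets, not anything from the question:

```bash
#!/usr/bin/env bats
# Hypothetical BATS checks; run with `bats test_server.bats` on the target node.

@test "nginx package is installed" {
  run dpkg -s nginx
  [ "$status" -eq 0 ]
}

@test "something is listening on port 80" {
  run bash -c "ss -tln | grep -q ':80 '"
  [ "$status" -eq 0 ]
}
```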
Automation:
There is the interesting Molecule project - some automation for testing Ansible roles, designed by Cisco. I have never tried it yet.
A step further would be using Test Kitchen, which handles the automation to spin up a Vagrant, Docker, or even AWS instance and test Puppet/Chef/Ansible with RSpec/BATS against the freshly spun-up machines.
So what you need is to pick a framework, write tests, and run your playbooks/recipes & tests against mock VMs.
Ideally, keep your "infra as code" in a VCS and configure CI like TravisCI to run your tests for every PR that brings new changes into your repository.
You can even follow TDD here: write tests first, watch them fail, then write the actual implementation in your favorite configuration management tool and see whether that change makes the tests pass.
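In practice the loop described above is driven by a single command once the tool is configured; assuming Molecule or Test Kitchen is already set up for the role/cookbook, it looks roughly like:

```bash
# Hypothetical invocations: each one creates an instance (Docker/Vagrant),
# converges it with the role/recipe, runs the verifier tests, and destroys it.
molecule test     # for an Ansible role with Molecule configured
kitchen test      # for a Chef/Puppet setup with Test Kitchen configured
```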
MOAR Infrastructure Testing & Automation!
If you can let us know what you want to test, we can help better.
But,
Did you check the dry-run mode? I think Puppet and Ansible support it; you can have a cron job or some automated script which runs all the puppet/ansible modules against a single (test) node.
More info:
1. http://docs.ansible.com/ansible/playbooks_checkmode.html
2. Check the noop mode in https://docs.puppet.com/puppet/latest/reference/man/agent.html
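For example, the dry-run invocations of both tools look roughly like this; the inventory and playbook names are placeholders:

```bash
# Hypothetical dry runs: report what would change without changing anything.
ansible-playbook -i test-inventory site.yml --check --diff
puppet agent --test --noop
```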
