Using MongoDB (in a container?) in Visual Studio Team Services pipelines

I have a node.js server that communicates with a MongoDB database. As part of the continuous-integration process I'd like to spin up a MongoDB database and run my tests against the server + DB.
With bitbucket pipelines I can spin up a container that has both node.js and MongoDB. I then run my tests against this setup.
What would be the best way to achieve this with Visual Studio Team Services? Some options that come to mind:
1) Hosted pipelines seem easiest but they don't have MongoDB on them. I could use Tool Installers, but there's no mention of a MongoDB installer, and in fact I don't see any tool installer in my list of available tasks. Also, it is mentioned that there is no admin access to the hosted pipeline machines and I believe MongoDB requires admin access. Lastly, downloading and installing Mongo takes quite a bit of time.
2) Set up my own private pipeline - i.e. a VM with Node + Mongo, and install the pipeline agent on it. Do I have to spin up a dedicated Azure instance for this? Will this instance be torn down and set up again on each test run, or will it remain up between test runs (meaning I have to take extra care to clean it up)?
3) Magically use a container in the pipeline through an option that I haven't yet discovered...?
I'd really like to use a container to run my tests because then I can use the same container locally during the development process, rather than having to maintain multiple environments. Can this be done?

So as it turns out, VSTS now has Docker support in its pipeline (when I wrote my question it was in beta and I didn't find it for whatever reason). It can be found at https://marketplace.visualstudio.com/items?itemName=ms-vscs-rm.docker.
This task lets you spin up a container of your choice and run a single command in it. If the command should run synchronously as part of the pipeline, then "Run in Background" needs to be unchecked (which will be the case for regular build commands, I'd guess). I ended up pushing a build script into my git repository and running it in a container.
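In my case the task effectively boiled down to something like this (the image name and script path are just examples from my setup, not anything the task requires):
docker run --rm -v "$(pwd):/src" -w /src node:8 bash ./ci/build-and-test.sh
An image that bundles Node and MongoDB (or a separate MongoDB container) lets the script start the database before running the tests.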
And regarding my question in (2) above: machines in private pipelines aren't cleaned up between pipeline runs.

Related

Jenkins + Docker Compose + Integration Tests

I have a crazy idea to run integration tests (xUnit in .NET) in a Jenkins pipeline using Docker Compose. The goal is to create the testing environment ad hoc and run the integration tests from Jenkins (and Visual Studio) without using databases etc. on a physical server. In a previous project there were cases where two builds overwrote each other's test data, and I would like to avoid that.
The plan is the following:
Add a Dockerfile for each test project
Reference them in the docker-compose file (along with the databases they need, created in Docker)
Add a Jenkins step that runs the integration tests (a rough sketch of the commands is below)
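Roughly, I imagine the Jenkins step would execute something like this (the db and integration-tests service names are just examples, not decided yet):
docker-compose -p build-${BUILD_NUMBER} up -d db
docker-compose -p build-${BUILD_NUMBER} run --rm integration-tests
docker-compose -p build-${BUILD_NUMBER} down -v
Using the Jenkins build number as the Compose project name should also keep parallel builds from touching each other's test data.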
I don't have much experience with containerization, so I can't predict what problems may appear.
The questions are:
Does it make any sense?
Is it possible?
Can it be done more simply?
I suppose the Visual Studio test runner won't be able to get results from the Docker containers. Am I right?
It looks like developing the tests will be harder, because they will run inside Docker. Am I right?
Thanks for all your suggestions.
Depends very much on the details. In a small project - no; in a big project with multiple microservices and many devs - sure.
Absolutely. Anything that can be done with shell commands can be automated with Jenkins.
Yes, just have a test DB running somewhere. Or just run it locally with a simple script. Automation and containerization are the opposite of simple; you would only do them if the overhead is worth it in the long run.
Normally it wouldn't even run on the same machine, so that could be tricky. I am no VS Code expert though.
The goal of containers is to make things simpler because the environment does not change, but they add configuration overhead. Most days it shouldn't make a difference, but whenever you make a big change it will cost some time.
I'd say running Jenkins on your local machine is rarely worth it; you could just use Docker locally with scripts (bash or WSL).
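For example, a local script along those lines could be as small as this (the image and port are placeholders for whatever database the tests need):
docker run -d --rm --name test-db -p 27017:27017 mongo:4
dotnet test
docker stop test-db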

How can I create a script to get code, publish and run it on an empty machine (.NET Core WebApi)

I have a question.
How can I create scripts to:
Get my code from repository (GitHub, GitLab...)
Build
Publish
Test
Run in IIS
The script should run on Windows or Linux, and assume that I start from an empty VM.
The application is a .NET Core WebApi.
I searched the web but couldn't find a template for getting the code from a repository.
This is doable with scripts, like @Scott said, but you should consider using existing solutions for this, because there are some great free ones out there, like TeamCity with Octopus Deploy integration. Here is what you need to consider if you decide to write the scripts yourself.
The VM you have is empty, so the runtimes need to be installed and checked for compatibility with the code you are trying to deploy.
The scripts for some parts of the deployment will need to run under a user with sufficient privileges.
You will also need to handle the web server configuration with the scripts.
And those are only a few of the things on the list for that path. Having said that, there is also the container path, which handles most of this through code and can be deployed to all of the environments you mentioned. You only need to make sure a container runtime is present on the VMs you want to deploy to, and it will be much easier to handle since, as I mentioned, it is all in code and easily changed, unlike scripts.
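If you do go down the script path, a minimal sketch of the core steps might look like this (the repository URL, paths and site name are placeholders):
git clone https://github.com/your-org/your-api.git
cd your-api
dotnet restore
dotnet build -c Release
dotnet test
dotnet publish -c Release -o ./publish
On Windows you would then point an IIS site at the publish folder (which requires the ASP.NET Core Hosting Bundle), for example:
%windir%\system32\inetsrv\appcmd.exe add site /name:"MyApi" /physicalPath:"C:\inetpub\MyApi" /bindings:http/*:8080: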

How to configure ASP.NET Core with Docker using VSTS to build, run tests and deploy to Azure with unit tests and environment variables

I'm trying to do something I thought would be incredibly trivial, but apparently it has to be hard. And yes, there are bits and pieces throughout Stack Overflow, but they're either out of date or don't actually work.
I've got an ASP.NET Core site that I've dockerized with the Add > Docker Support (Linux) command.
In VSTS I can build the image and publish it with 2 docker-compose items.
And then I can release the image with Release Management.
What I can't figure out how to do:
run dotnet test on my image and report the results to VSTS
Set up environment variables on the Azure App Service container that get properly passed into the image when it's run
On #1, I cannot find any up-to-date documentation on how to set things up so that, while developing, unit tests don't run unless explicitly requested (and if I tell Visual Studio to run the tests, they should run in the Docker image!). I can get them to run always, but that's a waste of time while developing if they run every time you start debugging.
And I cannot figure out how to run the tests using either docker-compose or the new Visual Studio 2017 15.8 approach with plain docker run commands. It seems to me that I would need a separate Dockerfile just for the tests, build a throwaway image from it and then discard it, but I can't figure out how to do this or even whether this is the right way.
How should this be set up to run unit tests? (I've gone through 5 pages of Google search results and none of them work right.)
On #2, setting an application setting in the App Service does not pass the value to docker run. I've tried everything and the values never get passed. How do you pass environment variables on Azure so that the run command gets the right -e parameters?
For #1 you could use the dotnet test command. This will generate a .trx file that VSTS can pick up and render as a nice test report. You just need to set up the "Publish Test Results" task.
dotnet test --logger trx --results-directory /var/temp
For more details, please take a look at this blog: Running your unit tests with Visual Studio Team Services and Docker Compose
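Roughly, if the tests run inside a container, the flow might look like this (the image name, Dockerfile and paths are assumptions, not something the task dictates):
docker build -t myapp-tests -f Dockerfile.tests .
docker run --name myapp-tests-run myapp-tests dotnet test --logger trx --results-directory /var/temp
docker cp myapp-tests-run:/var/temp ./TestResults
docker rm myapp-tests-run
Then point the "Publish Test Results" task at ./TestResults/*.trx.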
For #2 I don't totally get your point, but if you want to override environment variable values on VSTS and use them on the Azure App Service container, please try the solution via a PowerShell script described here: How to override values of environment variables on VSTS tasks
Besides that, I suggest you also go through this blog, which shows how to do a Docker deployment to Azure App Service (Linux) using VSTS, including both CI and CD. It may be helpful to you.
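As a side note, app settings defined on an App Service (Linux) container are exposed to the container as environment variables. As an alternative sketch, you could also set them with the Azure CLI (resource group, app name and setting name are placeholders):
az webapp config appsettings set --resource-group my-rg --name my-app --settings MY_SETTING=my-value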

Running a process with GitLab CI

I have a gitlab runner installed on one of my test servers.
I want to build and deploy my app on every commit.
The server is a Windows server and the app is a .NET Core 1.1 app.
My build script works fine, but eventually it runs dotnet MyApp.dll, which obviously makes the pipeline wait until it finishes (but of course my app won't finish - I want it to keep running).
I tried running start dotnet MyApp.dll, but that still doesn't work, as GitLab's runner won't stop until all of its child processes exit.
I am certain I'm using GitLab CI in a non-idiomatic way, but I fail to understand how to deploy locally correctly.
Any suggestions?
Windows doesn't offer any easy way to disown a process, and you probably don't want to task yourself with stopping the process on your next deploy. What you should do is use SRVANY.EXE to create a service out of your application, and then use GitLab CI to stop the service, replace the files and start it again. It's been a while since I used Windows, so I'm sorry, but I can't provide the exact commands to run.
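From memory, the setup might look roughly like this (the service name and paths are assumptions, and your SRVANY location will differ):
sc create MyApp binPath= "C:\Tools\srvany.exe" start= auto
reg add "HKLM\SYSTEM\CurrentControlSet\Services\MyApp\Parameters" /v Application /t REG_SZ /d "C:\Program Files\dotnet\dotnet.exe C:\apps\MyApp\MyApp.dll"
The GitLab CI deploy job would then roughly do:
sc stop MyApp
xcopy /Y /E /I .\publish C:\apps\MyApp
sc start MyApp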

Running load tests via Jenkins on a slave EC2 instance that starts and stops with the build

Ideally, we'd like to run load tests on an EC2 Jenkins slave that starts and stops with our build.
Are there any tools out there (without writing our own plugins) that currently solve this?
I've come across this, but it seems to only be triggered based on the load of Jenkins in general, and not tied to a build.
This configuration is environment specific, and not project specific, so I would prefer to keep this maintained within Jenkins instead of within Maven and the project itself. Although, I'm open to suggestions in that realm.
You can check out the WebLOAD Jenkins plugin; it runs RadView's WebLOAD load-testing tool, triggered by Jenkins. WebLOAD itself can launch EC2 cloud machines as needed, if that's what you need.