Azure DevOps / Executing the pipelines locally with Visual Studio 2019

I would like to replicate the build process (the pipelines) from Azure DevOps on my PC with Visual Studio 2019.
This could save time and Git round-trips while debugging and evolving the pipelines.
Is that possible?

This has been requested many times in many places, and I think that, given Microsoft's push toward GitHub, you may not see a lot of effort in this direction.
Right now, I don't think you'll be able to use VS 2019 to directly debug pipelines, but you can do a couple of things as a workaround:
1. Put your build process into a PowerShell script and debug that script using VS Code or the PowerShell ISE. Then set up your Azure Pipelines build to invoke that script. Your pipeline and your local build then use the same mechanism to build (a minimal sketch follows at the end of this answer).
2. The second option is more about being able to test outside any existing pipelines you may have:
2a. Create a new agent pool for testing local pipelines.
2b. Install the Azure Pipelines agent locally and, when configuring it, point it at the agent pool you just created.
2c. In a topic branch for your project, change the pool in your azure-pipelines.yml file to use the agent pool you just created.
2d. Commit and push that branch.
2e. Manually queue up a build of your project, selecting the branch you just pushed.
2f. Debug. Wash. Rinse. Repeat.
2g. Once you've got it all sorted out, you can revert the azure-pipelines.yml pool change and commit/PR the PowerShell script you created.
The second option doesn't really speak directly to your question; rather, it adds one more thing you can do to help with the first option: getting everything tested locally.
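As a rough illustration of option 1 (a sketch only; the solution and test project paths are placeholders), a script like the following can be stepped through locally in VS Code or the PowerShell ISE and then invoked unchanged by a PowerShell task in the pipeline:

    # build.ps1 - hypothetical shared build script used by both the pipeline and local debugging.
    param(
        [string]$Configuration = "Release"
    )
    $ErrorActionPreference = "Stop"

    # Restore, build, and test. MySolution.sln and the test project path are placeholders.
    dotnet restore .\MySolution.sln
    dotnet build .\MySolution.sln --configuration $Configuration --no-restore
    dotnet test .\tests\MyProject.Tests\MyProject.Tests.csproj --configuration $Configuration --no-build
    # Propagate a test failure to the pipeline.
    if ($LASTEXITCODE -ne 0) { exit $LASTEXITCODE }

While testing option 2, the pool entry in azure-pipelines.yml would point at the newly created agent pool, and a PowerShell task would simply run build.ps1.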

Related

Using MongoDB (in a container?) in Visual Studio Team Services pipelines

I have a node.js server that communicates with a MongoDB database. As part of the continuous-integration process I'd like to spin up a MongoDB database and run my tests against the server + DB.
With bitbucket pipelines I can spin up a container that has both node.js and MongoDB. I then run my tests against this setup.
What would be the best way to achieve this with Visual Studio Team Services? Some options that come to mind:
1) Hosted pipelines seem easiest but they don't have MongoDB on them. I could use Tool Installers, but there's no mention of a MongoDB installer, and in fact I don't see any tool installer in my list of available tasks. Also, it is mentioned that there is no admin access to the hosted pipeline machines and I believe MongoDB requires admin access. Lastly, downloading and installing Mongo takes quite a bit of time.
2) Set up my own private pipeline - i.e. a VM with Node + Mongo, and install the pipeline agent on it. Do I have to spin up a dedicated Azure instance for this? Will this instance be torn down and set up again on each test run, or will it remain up between test runs (meaning I have to take extra care to clean it up)?
3) Magically use a container in the pipeline through an option that I haven't yet discovered...?
I'd really like to use a container to run my tests because then I can use the same container locally during the development process, rather than having to maintain multiple environments. Can this be done?
So as it turns out, VSTS now has Docker support in its pipeline (when I wrote my question it was in beta and I didn't find it for whatever reason). It can be found at https://marketplace.visualstudio.com/items?itemName=ms-vscs-rm.docker.
This task lets you spin up a container of your choice and run a single command in it. If the command needs to run synchronously as part of the pipeline, Run in Background needs to be unchecked (which I assume will be the case for regular build commands). I ended up pushing a build script into my Git repository and running it in a container.
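A rough script-level illustration of the same idea (not the Docker task itself; it assumes Docker and npm are available on the agent, and MONGO_URL is just an assumed variable name the tests read):

    # Hypothetical test script: start a throwaway MongoDB container, run the Node.js tests
    # against it, then clean up regardless of the outcome.
    docker run -d --name test-mongo -p 27017:27017 mongo
    try {
        Start-Sleep -Seconds 5                               # crude wait for mongod to accept connections
        $env:MONGO_URL = "mongodb://localhost:27017/testdb"  # assumed variable consumed by the tests
        npm test
        if ($LASTEXITCODE -ne 0) { throw "Tests failed" }
    }
    finally {
        docker rm -f test-mongo
    }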
And regarding my question in (2) above: machines in private pipelines aren't cleaned up between pipeline runs.

TFS Team Build with VSO - Git Commit/Push files that have been altered by the build process

I have created a build definition that uses the default template (GitTemplate.12.xaml). I have a pre-build script that updates the version numbers for all of the assemblies in the build.
I would like to be able to commit and push the files that have been altered by the build definition to the git repository.
I have tried doing this using a PowerShell script but was not able to, because Git with VSO requires that you pass in user credentials.
I have downloaded the template to see if I can customise it to complete this task but can see no obvious way of solving my problem.
My next step will be to investigate writing a custom piece of code that can be called by the template. Before I delve into this any deeper, I just wanted to find out whether I am wasting my time.
I am using VSO and VS2013.
Does anybody have a solution to my problem?
You should not commit those changes, as doing so would allow a developer to build an identically version-numbered assembly locally. It is not good practice to push build results back into source control.
You should set all of your AssemblyInfo.* files to 0.0.0.0 and push. Then the only way for your assemblies to get a "good" version number is through the build process.
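For the build-side stamping, a pre-build step along these lines is one common approach (a sketch only; in practice the version would come from the build number rather than a hard-coded default):

    # Hypothetical pre-build step: stamp the real version into every AssemblyInfo.cs at build time,
    # so the committed files can stay at 0.0.0.0.
    param([string]$Version = "1.0.0.0")

    Get-ChildItem -Recurse -Filter "AssemblyInfo.cs" | ForEach-Object {
        (Get-Content $_.FullName) `
            -replace 'AssemblyVersion\("[^"]*"\)', "AssemblyVersion(""$Version"")" `
            -replace 'AssemblyFileVersion\("[^"]*"\)', "AssemblyFileVersion(""$Version"")" |
            Set-Content $_.FullName
    }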
If you do want to go ahead, you will need to authenticate using the "alternate" credentials that you can enable from your profile page.
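A rough sketch of what that push could look like from a build script (the account URL, repository, and credential parameters are placeholders; the credentials should come from secret build variables, not be hard-coded):

    # Hypothetical post-build step: commit the version-stamped files and push them back
    # using VSO alternate credentials.
    param([string]$AltUser, [string]$AltPassword)

    git config user.email "build@example.com"
    git config user.name "Automated Build"
    git add -A
    git commit -m "Update assembly versions from build"
    # Special characters in the password must be URL-encoded for the URL below to be valid.
    git push "https://${AltUser}:${AltPassword}@myaccount.visualstudio.com/DefaultCollection/_git/MyRepo" HEAD:master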

AppOffline rule issue w/ Continuous Integration/Build

I've connected Visual Studio Team Services to an Azure Website to enable automatic deployments. New Relic is running as a system process, and therefore NewRelic.Agent.Core.dll is locked, which prevents successful builds from being deployed.
I've tried adding a wpp.targets file to the solution in order to use MSDeploy to copy an app_offline file to the site before deployment and then delete it when deployment is done, as seen here. However, it doesn't seem to be executing; I don't see anything in the build logs, and my deployments continue to fail.
How do I take the app offline when using the VS Team Services/Azure CI process?
I came across this old post, and there are now build tasks to start and stop an app service. See the Azure App Service Manage task under the deployment tasks in the build task catalog.
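A scripted equivalent of the same stop/deploy/start idea, for comparison (a sketch only, assuming the Azure CLI is available; the resource group and site name are placeholders):

    # Hypothetical scripted equivalent of the stop -> deploy -> start flow.
    az webapp stop  --resource-group MyResourceGroup --name my-site
    # ... run the deployment here ...
    az webapp start --resource-group MyResourceGroup --name my-site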
Slightly different from what you are asking, but what you could do is log in to your Azure website and set COR_ENABLE_PROFILING to 0 before your build runs. You then deploy as normal. Once done, you set COR_ENABLE_PROFILING back to 1.
Changing the setting causes an IIS reset, and setting it to zero prevents the file from being locked again before the publish finishes.
Source: https://discuss.newrelic.com/t/visual-studio-online-azure-website-continuous-integration-fails/3825
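If you want to script that flip rather than doing it in the portal, a sketch with the Azure CLI (the resource group and site name are placeholders) could look like this:

    # Hypothetical: disable the profiler before deploying, then re-enable it afterwards.
    # Each change triggers the IIS reset mentioned above.
    az webapp config appsettings set --resource-group MyResourceGroup --name my-site --settings COR_ENABLE_PROFILING=0
    # ... deploy as normal here ...
    az webapp config appsettings set --resource-group MyResourceGroup --name my-site --settings COR_ENABLE_PROFILING=1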

Run command before pulling from SVN in TeamCity

I'm having an issue with TeamCity, which relates to the fact that it runs the source control step before it runs the build steps. My project is a Windows service, so there are complications with this.
TeamCity often decides to delete the entire contents of the project directory, even though I have the clean build option unchecked. However, since this is a Windows service, that does not fly: when it tries to delete the DLLs it errors out because they're in use:
Error while applying patch: Failed to delete: F:\PathToService\bin\Release\Library.dll
The most frustrating part is that the DLLs aren't even under source control; TeamCity seems to have a mind of its own and decides to delete them anyway.
Is there a way to get around this, i.e. to run a build step BEFORE doing the SVN checkout so that I can stop the Windows service first?
I would try to set up your CI environment so that it uninstalls the Windows service once you are done testing it; I am not aware of a TeamCity pre-checkout hook.
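A sketch of what that uninstall step could look like (the service name is a placeholder):

    # Hypothetical cleanup step run at the end of the build: stop and remove the test service
    # so its binaries are no longer locked when the next checkout tries to patch the directory.
    $serviceName = "MyWindowsService"
    if (Get-Service -Name $serviceName -ErrorAction SilentlyContinue) {
        Stop-Service -Name $serviceName -Force
        sc.exe delete $serviceName
    }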
The answer was to split each service into a separate working directory. That prevents TeamCity from deleting the DLLs.

How does one version control the configuration of a TeamCity project?

In my CruiseControl instances, I have version controlled the ccnet.config file.
When I want to update CruiseControl, I run an "update config" job which fetches the config from version control.
In this manner, the very build process of a release is configuration managed.
I am wondering how to achieve these goals effectively under TeamCity.
I try to keep whatever CI system I am using as light as possible and put as much of the build as possible into an MSBuild or NAnt script, including running tests, code coverage, etc.
The benefits of this are:
The build file is version controlled.
You can run the script in any environment.
Easier to move between CI environments.
Everyone becomes responsible for the build.
This has been introduced in TeamCity 9. Also answered in another post:
Version control (e.g. in TFS) build configuration for TeamCity - is it possible?
I've been wanting a way to source-control the TeamCity config for a long time. I ended up writing a Windows service which monitors the configuration directory and commits changes to Git.
The project is on GitHub: https://github.com/grenade/teamcity-config-monitor
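The rough idea, expressed as a scheduled script rather than a Windows service (the path is the default Windows data directory and is an assumption, and it presumes the config directory has already been made a Git repository with a remote):

    # Hypothetical snapshot script: commit any drift in the TeamCity config directory to Git.
    Set-Location "C:\ProgramData\JetBrains\TeamCity\config"
    git add -A
    if (git status --porcelain) {
        git commit -m "TeamCity config snapshot $(Get-Date -Format s)"
        git push origin master
    }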
You might try looking at the folders that are backed up prior to an upgrade (or when restoring TeamCity), as those represent the configuration and changes you've made since the initial installation.
http://confluence.jetbrains.net/display/TCD4/TeamCity+Data+Backup
Some of the relevant data actually lives in a database (and in fact the documentation advises you to point TeamCity at a real database like MySQL instead of the default embedded database it uses).
You could try checking those folders into SVN, but you'll want to stop TeamCity for any check-in actions.
