OctoPack only creating one of two artifacts in TeamCity using Octopus Deploy

I have a multiple project solution and am using Octopus Deploy and TeamCity for deployment.
I have installed the OctoPack NuGet package on the two projects that I wish to be created as artifacts during the CI build.
I have set the following in my build step:
and have checked the csproj files of both projects and can see:
<Import Project="..\packages\OctoPack.2.0.26\targets\OctoPack.targets" />
However, after the build runs, one of the projects has been packaged as an artifact and the other one hasn't, and I can't see what the difference between the two is.
Is there something else I should be doing in TeamCity or in my solution?

I had to set the following in configuration manager on the project that wasn't producing the artifact:
Integration|Mixed Platforms.ActiveCfg = Integration|Any CPU
Integration|Mixed Platforms.Build.0 = Integration|Any CPU
TeamCity then happily produced the package.

Make sure you check in the \packages\OctoPack.2.0.26\targets\OctoPack.targets file to your source control.
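For reference, OctoPack only packages a project when the build actually asks it to; a minimal sketch of the kind of MSBuild invocation a TeamCity build step typically passes (solution name and version number here are placeholders):
msbuild MySolution.sln /t:Build /p:Configuration=Release /p:RunOctoPack=true /p:OctoPackPackageVersion=1.0.%build.counter%
If only one project produces a package, the usual suspects are the solution configuration not building the other project (as in the answer above) or the OctoPack.targets import not being restored for it.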


Is it possible to get the Visual Studio build folder name in Azure DevOps?

We are using Azure DevOps for CI/CD, but we have a problem with our build pipelines.
When creating a Visual Studio build, we would like to get the specific folder where the build output is created so we can copy that folder (adding an outputPath parameter to the Visual Studio Build task doesn't work with some projects).
The main problem is that the name of the project and the repo can be different.
We would need something like:
$(agent.builddirectory)\s\PROJECT-NAME\bin\Release
Is there a way to get the project name or the output folder?
$(agent.builddirectory) isn't working.
EDIT: We need to use the build pipelines at this point, so currently we don't work with artifacts or release pipelines (we know we should...)
It seems like you are using Azure Pipelines to do what MSBuild / the dotnet CLI are designed to help you with.
You can instead use MSBuild / the dotnet CLI to build and pack your project output to a designated location, such as an output folder in the repository root, that is common across all projects that use the pipeline.
If you only check out one repository, all tasks will be executed in the repository folder, so you do not need to mess with $(agent.builddirectory)\s\xxxx. Accessing the output is as simple as output/ in your pipeline.
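As a rough sketch (the project path and output folder name are assumptions), a command-line step in the pipeline could publish straight into a known folder at the repository root:
dotnet publish src/PROJECT-NAME/PROJECT-NAME.csproj -c Release -o output
The output then always lands in output/ relative to the repository root, no matter how the project or repository is named, so later copy steps can use a fixed path.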

Deploy Azure WebJob using VSTS

I'm having some issues deploying an Azure WebJob using Visual Studio Team Services (VSTS).
The WebJob seems to be deployed successfully but it breaks the Azure website that is hosted in the same App Service! I don't have this problem if I deploy using VS2013.
This is my build task that generates the WebJob deployment package:
And this is my deployment task:
There are no errors when I deploy the Azure WebJob. If I go to the Azure Portal I see the WebJob is there, and it runs successfully. WebJob files are copied into the wwwroot\App_Data\jobs\triggered\RemoveExpiredDids folder as expected, but the problem is that some other files will be copied into the wwwroot\App_Data\bin folder, which will break the existing website that was already deployed into that App Service!!!
So I decided to find out why this was happening. After downloading and extracting the deployment package I saw there are 2 folders (app_data and bin) and the scheduler file (settings.job):
This explains why some assemblies are copied into the wwwroot\App_Data\bin of the App Service. The strange thing is that this doesn't happen when deploying from VS2013!!! I took a look into the MSBuild log and found the following line:
Object dirPath ([app service name]\bin) skipped due to skip directive 'SkipBinFolderOnDeploy'.
In conclusion, the bin folder is included when deploying the Azure WebJob from VSTS but is excluded when deploying it from VS2013.
So my question is: how to prevent the bin folder from being deployed when using VSTS? Is there any MSBuild parameter/flag to do this?
I've run into this particular problem as well.
The latest method I found is using Web Deploy Operation Settings, -skip:Directory= (in this case it would be -skip:Directory='\\bin'), when you create your Azure deploy task in the release definition (Additional arguments). I've seen that this does exclude the bin folder from the update actions.
Let me know if this helps you in any way.
Refer to these steps to deploy a WebJob to Azure:
1. Modify the Visual Studio Build task to publish the WebJob with the FileSystem method (MSBuild Arguments: /p:DeployOnBuild=true /p:WebPublishMethod=FileSystem /p:publishUrl="$(build.artifactstagingdirectory)\WebJob" /p:DeployDefaultTarget=WebPublish).
2. Add a Delete Files task to the release definition to delete the bin folder (Source Folder: $(System.DefaultWorkingDirectory)/WebJobVnext/drop/WebJob; Contents: bin).
3. Modify the Azure App Service Deploy task (uncheck the "Publish using Web Deploy" option; set Package or folder to $(System.DefaultWorkingDirectory)/[artifact name]/drop/WebJob).
I was finally able to fix it, thanks @starain-MSFT for pointing me in the right direction. I had to make some minor changes, though. This is the task that creates the deployment package:
MSBuild arguments:
/p:DeployOnBuild=true /p:WebPublishMethod=FileSystem /p:DeployDefaultTarget=WebPublish /p:Configuration=$(BuildConfiguration) /p:OutputPath=.\bin\ /p:publishUrl="$(build.artifactstagingdirectory)\temp\WebJob"
The difference here compared to @starain-MSFT's answer is that I had to add the /p:OutputPath= parameter; otherwise I'd get the following error:
The OutputPath property is not set for project
After generating the package, I delete the bin folder and zip it (this reduces the build time).
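A minimal PowerShell sketch of that cleanup step, assuming the publish folder from the arguments above (paths and file names are illustrative):
# Remove the bin folder from the publish output, then zip the WebJob package
Remove-Item -Recurse -Force "$env:BUILD_ARTIFACTSTAGINGDIRECTORY\temp\WebJob\bin"
Compress-Archive -Path "$env:BUILD_ARTIFACTSTAGINGDIRECTORY\temp\WebJob\*" -DestinationPath "$env:BUILD_ARTIFACTSTAGINGDIRECTORY\WebJob.zip"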
This is my deployment task:
Please note that $(DeploymentPackagePath) is the path to the zip file that contains the deployment package, as mentioned before. It doesn't matter if you deploy the package as a zip file or if you unzip it and deploy the folder, it works both ways.

Building zip file with Visual Studio

I am using Visual Studio 2013 to develop a website. The website is on github, and I have a server for continuous integration set up with Teamcity.
I am trying to get the website to automatically deploy to AWS when I change it on github. I have Teamcity hooked up, but the AWS CLI is having some issues, so I need to compile the solution in Teamcity into a zip file so that I can deploy to AWS using a workaround.
I've tried editing the project files for an MSBuild fix...I managed to get a zip file output. However, I ran into problems with general compilation.
What I am wondering is, since I can publish a website package from Visual Studio, is it possible to compile as if I was publishing using the build commands from TeamCity (or the command line) so that the result is the compiled project and the website files needed to run the site in a zip file?
You can create a zipped artifact in TeamCity. Simply build the project, then set the artifact paths for the build like this:
outputFolder\*.dll=>myzipfile.zip
outputFolder\*.whatever=>myzipfile.zip
etc
Obviously you'll need to change outputFolder to be where the files are actually output by the build, and the patterns to match the files you want.
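If the project is a web application built with an OutDir override, the deployable site usually ends up under a _PublishedWebsites folder, so a rule along these lines (folder and site names are assumptions) zips the whole site in one go:
outputFolder\_PublishedWebsites\MyWebsite\** => MyWebsite.zip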

Skip step in personal builds in TeamCity

On my CI server running TeamCity 8.0 I have a build configuration whose last steps are the creation and the push of a new version of a NuGet package.
I wonder if there is a way to inhibit these two steps if the current build is a personal one.
Any clue?
There's an environment variable exposed in TeamCity that can tell you if this is a personal build: BUILD_IS_PERSONAL.
See http://confluence.jetbrains.com/display/TCD7/Predefined+Build+Parameters
E.g. using the MSBuild runner (you just need to supply the NuGet path):
<Project DefaultTargets="Pack" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<Target Name="Pack" Condition="'$(BUILD_IS_PERSONAL)' != 'True'">
<Message Text="Personal: $(BUILD_IS_PERSONAL)"/>
<Exec Command="$(NUGETPATH)\nuget pack $(NugetProject)"/>
</Target>
</Project>
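A hypothetical invocation of that project from the TeamCity MSBuild runner, supplying the two custom properties it expects (paths and project name are placeholders):
msbuild Pack.proj /p:NUGETPATH=C:\Tools\NuGet /p:NugetProject=MyProject.csproj
Because TeamCity sets BUILD_IS_PERSONAL for personal builds, the Pack target is skipped there and runs for every other build.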
Alternatively, you could halt subsequent build steps using an extra build step, thereby making them conditional:
Add an extra build step that uses PowerShell:
if (([environment]::GetEnvironmentVariable("BUILD_IS_PERSONAL","Process")) -eq "True")
{
    # Fail this step so that later steps set to run only on success are skipped
    throw "Personal build - skipping remaining steps"
}
On each build step there is the option "Execute step":
If you choose "If all previous steps finished successfully", the step will be skipped once the gate step fails.
If you choose "Even if some of the previous steps failed", it will still execute.
TeamCity 2020.1 has added support for conditional build steps, and there is a quick shortcut to skip the build step for personal builds.
You could create a new build configuration that leaves out those two steps. Then install the TeamCity Visual Studio plugin (assuming you are using VS) and run a personal build, choosing the build configuration you'd like to use.
Similar to Jordan's response, I think the best approach is to separate the compilation from the packaging / deployment build configurations. In fact, if you use Octopus to deploy, then you have to keep the TeamCity Octopus deployment steps in a separate build configuration from the compilation, as the NuGet feed doesn't get populated until after the build configuration has completed successfully.
If you make your packaging / deployment build configuration a dependency of the compilation build configuration and set the build trigger to fire only after a successful build of the compilation build configuration, then it will not trigger after a personal build, even if that build is successful.
This way you are always calling the same build configuration for compilation, whether developers trigger it from Visual Studio using the add-in or the release team triggers it via the web UI.
Hope this helps
We have a similar issue where we only wanted to build and deploy our local Apache Maven repository when it had changed.
One possible solution: if your build step is a Command Line step (or can be), create a tiny shell script that decides whether the step should run or not, as sketched below.
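A small PowerShell sketch of such a gate step, reusing the personal-build flag from the earlier answer (the project name, package name, and feed URL are placeholders):
# Skip packaging and pushing when TeamCity runs a personal build
if ($env:BUILD_IS_PERSONAL -eq 'True') {
    Write-Host "Personal build detected - skipping NuGet pack and push"
    exit 0
}
& nuget pack MyProject.csproj
& nuget push MyProject.1.0.0.nupkg -Source https://my-feed.example/nuget -ApiKey $env:NUGET_API_KEY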

Deploying .NET with Jenkins/Hudson

I've been using Jenkins/Hudson CI for deploying my .NET web site project. I've been using the MSBuild plugin to build my project and then xcopy to copy it out to the server.
I've noticed that if I use the publish feature in Visual Studio I get a different set of files. I've got the config transforms working, but I end up with all the .cs files, and a WinMerge compare shows the binaries being different.
So, I'd like to either get Jenkins working just like the publish feature, or confirm that an xcopy deploy is functionally the same thing.
I've had good experiences with using Web Deploy, with a final build step in Jenkins running a bat file containing:
msdeploy.exe -verb:sync -source:package=%PACKAGE% -dest:auto,ComputerName=%TARGETHOST%
You'll have to install the Web Deploy package on your build server and the extension on IIS.
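For completeness, the package referenced by %PACKAGE% can be produced in an earlier build step with the Web Deploy packaging target, roughly like this (project name and output path are placeholders):
msbuild MySite.csproj /t:Package /p:Configuration=Release /p:PackageLocation=C:\packages\MySite.zip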
I'm using the MSBuild Jenkins plugin to build and then deploy the project. As mentioned in other answers, you need to have Web Deploy installed.
In the project configuration page in Jenkins, you need to add the following to the Command Line Arguments field:
/p:Configuration=Debug /p:DeployOnBuild=true /p:PublishProfile=publishProfileName
Of course, you first need to create the publish profile, either in VS or by exporting it from IIS, and you also need to specify the solution file path in the MSBuild Build File field.
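Put together, the MSBuild plugin then effectively runs something along these lines (solution and profile names are placeholders):
msbuild MySolution.sln /p:Configuration=Debug /p:DeployOnBuild=true /p:PublishProfile=publishProfileName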
