Include node modules in Azure deployment via VSTS

I have an Azure App Service which I need to deploy as part of a release definition in VSTS.
To provide some context, it is an ASP.NET MVC app that uses Angular 2, and it has a package.json file. I have a VSTS build definition which includes an 'npm install' task to install all dependencies from package.json, so that I don't need to check in all the node modules. At the end of the build, the files are dropped to a share without the node_modules folder.
I have a corresponding release definition that web-deploys that build to Azure. However, I am not sure how to get the node_modules folder onto the Azure machine.
Can someone help or provide suggestions for this scenario? I was hoping the packages could be npm-installed on the prod machine in some way.

You can do it by using the Kudu API with PowerShell. For example (assuming package.json is in the wwwroot folder):
Add an Azure PowerShell step/task (Script Arguments: -resourceGroupName VS-starain2-Group -webAppName tempappstarain -dir "site\wwwroot" -command "npm install")
PowerShell script:
param(
    [string]$resourceGroupName,
    [string]$webAppName,
    [string]$slotName = "",
    [string]$dir,
    [string]$command
)

function Get-AzureRmWebAppPublishingCredentials($resourceGroupName, $webAppName, $slotName = $null){
    if ([string]::IsNullOrWhiteSpace($slotName)){
        $resourceType = "Microsoft.Web/sites/config"
        $resourceName = "$webAppName/publishingcredentials"
    }
    else{
        $resourceType = "Microsoft.Web/sites/slots/config"
        $resourceName = "$webAppName/$slotName/publishingcredentials"
    }
    $publishingCredentials = Invoke-AzureRmResourceAction -ResourceGroupName $resourceGroupName -ResourceType $resourceType -ResourceName $resourceName -Action list -ApiVersion 2015-08-01 -Force
    Write-Host $publishingCredentials
    return $publishingCredentials
}

function Get-KuduApiAuthorisationHeaderValue($resourceGroupName, $webAppName, $slotName = $null){
    $publishingCredentials = Get-AzureRmWebAppPublishingCredentials $resourceGroupName $webAppName $slotName
    Write-Host $publishingCredentials.Properties.PublishingUserName
    Write-Host $publishingCredentials.Properties.PublishingPassword
    return ("Basic {0}" -f [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $publishingCredentials.Properties.PublishingUserName, $publishingCredentials.Properties.PublishingPassword))))
}

function RunCommand($dir, $command, $resourceGroupName, $webAppName, $slotName = $null){
    $kuduApiAuthorisationToken = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
    $kuduApiUrl = "https://$webAppName.scm.azurewebsites.net/api/command"
    $Body = @{
        "command" = $command;
        "dir" = $dir
    }
    $bodyContent = @($Body) | ConvertTo-Json
    Write-Host $bodyContent
    Invoke-RestMethod -Uri $kuduApiUrl `
        -Headers @{"Authorization" = $kuduApiAuthorisationToken; "If-Match" = "*"} `
        -Method POST -ContentType "application/json" -Body $bodyContent
}

RunCommand $dir $command $resourceGroupName $webAppName $slotName
Related article: Interacting with Azure Web Apps Virtual File System using PowerShell and the Kudu API
You can also just deploy the node_modules folder and files to the Azure web app by using the Azure App Service Deploy task (select the 3.* version of the step/task, do not check the "Publish using Web Deploy" option, and set "Package or folder" to the node_modules folder path).

Related

Powershell DSC Hangs

So here's my issue: I am trying to use PowerShell DSC to install some basic packages, but whenever I run my script it looks like it starts and never finishes. I tried passing the -Force parameter as recommended, but it seems to just stack operations, and these processes keep getting stuck.
Here is my script
Configuration WebServer {
    # Import the module that contains the resources we're using.
    Import-DscResource -ModuleName PsDesiredStateConfiguration

    Node "localhost" {
        Package InstallAspNetWebPages2
        {
            Ensure    = "Present"
            Path      = "C:\Users\jryter\Documents\WebServerInstalls\AspNetWebPages2Setup.exe"
            Name      = "Microsoft ASP.NET Web Pages 2 Runtime"
            ProductID = "EA63C5C1-EBBC-477C-9CC7-41454DDFAFF2"
        }
    }
}

WebServer -OutputPath "C:\DscConfiguration"
Start-DscConfiguration -Wait -Force -Verbose -Path "C:\DscConfiguration"
The current LCM state shows it is performing a consistency check.
I tried following this link:
https://powershell.org/forums/topic/stop-dsc-configuration-which-is-runningstuck/
but to no avail.
Is there some base configuration I missed in order to run this properly? Has anyone had this issue?
DSC probably started the exe and it just sits there waiting for your input. You need to add arguments for silent install.
Package InstallAspNetWebPages2
{
    Ensure    = "Present"
    Path      = "path\file.exe"
    Name      = "Microsoft ASP.NET Web Pages 2 Runtime"
    ProductID = "EA63C5C1-EBBC-477C-9CC7-41454DDFAFF2"
    Arguments = "/quiet"   # or "/silent", depending on the installer
}
I don't know the proper silent-install argument for this particular exe, though.

Automatically create list of nuget packages and licenses

Is there any way to get an automatically updated list of all used nuget packages in my solution, including a link to the corresponding license, which I can display within my app?
Running the following from Package Manager Console within Visual Studio gives me the required information:
Get-Project | Get-Package | select Id, Version, LicenseUrl
How do I a) get this list automatically updated on each change and b) get it into my app?
Target is to have an Info/About dialog showing all this data.
I found one way, pretty sure it has some limitations...
I'm calling this in the pre-build event:
powershell.exe -ExecutionPolicy Bypass -File $(ProjectDir)\Resources\tools\PreBuildScript.ps1 $(ProjectDir) $(SolutionDir) $(TargetDir)
And here is what resources\tools\PreBuildScript.ps1 looks like:
param (
    [Parameter(Mandatory=$True)]
    [string]$ProjectDir,
    [Parameter(Mandatory=$True)]
    [string]$SolutionDir,
    [Parameter(Mandatory=$True)]
    [string]$TargetDir
)

[Reflection.Assembly]::LoadWithPartialName('System.IO.Compression.FileSystem')

$nupkgs = Get-ChildItem -Recurse -Filter *.nupkg -Path "$SolutionDir\packages"
$nuspecs = $nupkgs | %{ [IO.Compression.ZipFile]::OpenRead($_.FullName).Entries | where {$_.FullName.EndsWith('.nuspec')} }
$metadata = $nuspecs | %{
    ([xml]([System.IO.StreamReader]$_.Open()).ReadToEnd()) | %{ New-Object PSObject -Property @{
        Version    = $_.package.metadata.version
        Authors    = $_.package.metadata.authors
        Title      = if ([string]::IsNullOrWhitespace($_.package.metadata.title)) { $_.package.metadata.id } else { $_.package.metadata.title }
        LicenseUrl = $_.package.metadata.licenseUrl
    }}
}
$metadata | %{ '{0} {1}{4}Author(s): {2}{4}License: {3}{4}{4}' -f $_.Title, $_.Version, $_.Authors, $_.LicenseUrl, [Environment]::NewLine } | Out-File "$ProjectDir\Resources\ThirdPartyLicenseOverview.txt"
This gives me an (ugly) text file, Resources\ThirdPartyLicenseOverview.txt, that I can include as an embedded resource to use within my app.
Not the final solution but one step on the way...
Is there any way to get an automatically updated list of all used NuGet packages in my solution, including a link to the corresponding license?
As far as I am aware, there is nothing currently available that keeps the list automatically updated on each change and gets it into your app.
We cannot get the license information directly from the command line as part of a CI build; you need to create an application (or script) that opens each .nupkg zip file, extracts the license URL from the .nuspec file, and downloads the license from that URL.
Alternatively, you could use the Package Manager Console window inside Visual Studio and, with a bit of PowerShell, download the license files. But the Package Manager Console cannot run as part of your app, so that route will not get the licenses into the app either.
Besides, we could use a PowerShell script to get each package's Id and Version and download the license files, but someone still has to run the script. If you want the list automatically updated on each change, you would need PowerShell to monitor packages.config and re-run the script after any package change, which is difficult to achieve.
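To illustrate the download step described above, here is a rough, untested sketch that consumes the $metadata objects produced by PreBuildScript.ps1 earlier in this thread (the Resources\Licenses folder name and the file-naming scheme are my own invention, not part of the original answer):

```powershell
# Download each license referenced in $metadata (built by PreBuildScript.ps1).
# Assumes each object exposes Title and LicenseUrl properties.
$licenseDir = Join-Path $ProjectDir 'Resources\Licenses'
if (!(Test-Path $licenseDir)) {
    New-Item -ItemType Directory -Path $licenseDir | Out-Null
}

foreach ($package in $metadata | Where-Object { $_.LicenseUrl }) {
    # Sanitize the package title so it is usable as a file name.
    $fileName = ($package.Title -replace '[\\/:*?"<>|]', '_') + '.txt'
    try {
        Invoke-WebRequest -Uri $package.LicenseUrl -OutFile (Join-Path $licenseDir $fileName)
    }
    catch {
        # A dead license URL should not fail the whole build.
        Write-Warning "Could not download license for $($package.Title): $_"
    }
}
```

The downloaded files could then be embedded as resources alongside ThirdPartyLicenseOverview.txt.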

How do I set up TeamCity CI so that it unpacks Xamarin components?

In Visual Studio everything works and a Components directory is created with the appropriate dlls. However, TeamCity is not able to retrieve the Android Support Library dlls because the trigger for the restore is a Xamarin VS plugin that runs when loading the solution. The equivalent of nuget package restore for Xamarin is xamarin-component. I have placed the xamarin-component.exe in my C:\Windows directory. To configure TeamCity, I prepended a Command Line build step with
Command executable: xamarin-component
Command parameters: restore mysolution.sln
TeamCity runs as NT Authority\System. So using PsExec,
psexec -i -s %SystemRoot%\system32\cmd.exe
If I then run 'xamarin-component login'
INFO (login): Computed cookie jar path: C:\Windows\system32\config\systemprofile\.xamarin-credentials
INFO (login): Computed cookie jar path: C:\Windows\system32\config\systemprofile\.xamarin-credentials
INFO (login): Credentials successfully stored.
When I go to my solution in cmd and attempt the restore, I get an attempt to download the component, and then a JSON parsing error. This is the same error I get in TeamCity.
I get the error if I use 'Administrator' (which stores the credential in C:\Users\Administrator). Earlier, when I was using my personal account, it did work. However, once I deleted C:\Users\tim\AppData\Local\Xamarin\Cache\Components, the same issue emerged. Fiddler shows that rather than getting JSON back (as we do when we enter an invalid token), we are getting a 302 redirect that says "Object moved here" - and "here" is the Xamarin login page, which is obviously not JSON.
Tried:
1. Set COOKIE_JAR_PATH to C:\Users\tim\.xamarin-credentials - xpkg picks it up, but same error
2. Copy .xamarin-credentials from config\system32 to D:\, set COOKIE_JAR_PATH to D:\.xamarin-credentials - xpkg picks it up, but same error
3. Move .xamarin-credentials to C:\, set COOKIE_JAR_PATH - same error
4. Re-login as NT Authority with COOKIE_JAR_PATH set to C:\.xamarin-credentials - same error
My temporary idea now is to figure out where the NT Authority xamarin-component looks for Cache and put the files there.
C:\Windows\system32\config\systemprofile\AppData\Local\Xamarin\Cache\Components\xamandroidsupportv4-18-4.18.1.xam
The version of my xamarin-component is 0.99 - for 100, we try harder...
I’ve had trouble actually getting the cookie jar to load correctly from the system32 path. I think this is a path virtualization issue that I just don't understand well enough to make heads or tails of.
I ended up adding an environment variable that the tool will read from (I'm its principal author at Xamarin :-) that specifies the cookie jar path to read from, and this solved the problem for others using TeamCity. The environment variable is COOKIE_JAR_PATH.
You can set it from TeamCity's environment settings to point to a cookie jar path outside of the system32 profile directory (I think in my original testing, I put it in the root of the C: drive, but it can be anywhere, really).
As a hack, I copied the Cache folder from
C:\Users\tim\AppData\Local\Xamarin
to
C:\Windows\system32\config\systemprofile\AppData\Local\Xamarin\
That bypassed communication with the Xamarin server.
Update. I suspect it might be a bad link or setup on their server side. When xamarin-component restore is called, a call is made to
GET /api/available_versions?alias=xamandroidsupportv4-18 HTTP/1.1
which returns "Object moved to here" where "here" is nowhere.
If you start Visual Studio after deleting the Cache and Components folder (next to the solution), Xamarin makes a call to
GET /api/download/xamandroidsupportv4-18/4.18.1 HTTP/1.0
which has a similar looking Object moved to, but this time it directs you to xamarin-components.s3.amazonaws.com/
GET /fdca922d2b77799fe208a08c9f3444fe/xamandroidsupportv4-18-4.18.1.xam HTTP/1.0
Perhaps something changed, or the available_versions API has changed.
Thanks very much for this question and your answers to it. I didn't really like the idea of storing an auth cookie on the build node or having to copy a cache there manually, so I hacked around this problem with a quick PowerShell script that mimics the behaviour of the xamarin-component.exe restore action:
param
(
    [Parameter(Mandatory=$true)]
    $authCookie,
    [Parameter(Mandatory=$true)]
    $componentDirectory,
    [Parameter(Mandatory=$true)]
    $project
)

[void]([System.Reflection.Assembly]::LoadWithPartialName('System.IO.Compression.FileSystem'))

$xml = [xml] $(cat $project);
$components = $xml.Project.ItemGroup.XamarinComponentReference | ? { $_.Include.Length -gt 0 } | % { $_.Include };

if (!(test-path $componentDirectory))
{
    echo "$componentDirectory didn't exist, so it was created.";
    [void](mkdir $componentDirectory);
}

foreach ($component in $components)
{
    $source = "http://components.xamarin.com/download/$component";
    $destination = "$componentDirectory\$component.zip";

    if (test-path $destination)
    {
        echo "$destination already exists, skipping...";
        continue;
    }

    echo "Downloading $component from $source to $destination...";
    $client = New-Object System.Net.WebClient
    $client.Headers.Add([System.Net.HttpRequestHeader]::Cookie, "XAM_AUTH=$authCookie");
    try
    {
        $client.DownloadFile($source, $destination);
    }
    catch
    {
        # The error message will be on one of these lines hopefully:
        write-error "Failed to download! Errors are below:";
        write-error $_
        write-error $_.Exception
        write-error $_.Exception.InnerException
        write-error $_.Exception.InnerException.InnerException
        exit 1;
    }

    if (!(test-path $destination))
    {
        write-error "$destination doesn't exist - the download must have failed!";
        exit 1;
    }

    echo "Decompressing $source to $componentDirectory"
    [System.IO.Compression.ZipFile]::ExtractToDirectory($destination, $componentDirectory)
    echo ""
}
echo "Done!";
echo "Done!";
The -authCookie parameter can be extracted from either the XAM_AUTH cookie in your browser or from the .xamarin-credentials "cookiejar" in your home directory. It's nice to have it parameterised like this so you can store it as a secret variable in TeamCity.
The componentDirectory parameter must be the full path to the component directory - it will be created if it doesn't exist.
The project parameter should be the path to your project that you want to restore packages for - if you have multiple projects that need this then you'll have to execute the script for each one. Don't specify your solution as it won't work.
Unfortunately, this isn't very resilient to Xamarin's whims - a simple API change could render this useless, so obviously the best solution is to wait for Xamarin to fix this. I e-mailed Xamarin support to complain about this problem but I don't imagine I'll get a timely response (they seem very very busy these days). I hope this is useful!
Create a directory and put that directory's path in the environment variable XAMARIN_CACHEPATH.

After a NuGet package uninstall PS script modifies the project, the project has to be manually saved

I'm a bit confused about what I need to do in a NuGet uninstall script when I am removing items from the project's XML. First, the basic script below works, in that it correctly removes items that I'd added in the install script. But when the uninstall process finishes, the project is still marked dirty and has to be saved. I'd like that to all be taken care of when I'm done. Here is the code I've got:
Import-Module (Join-Path $toolsPath msbuild.psm1)

#
# Get the project
#
$project = Get-Project
$buildProject = Get-MSBuildProject

#
# Next, remove the import statements
#
$imports = $buildProject.XML.Imports | ? { ([System.IO.FileInfo] $_.Project).Name -eq "LINQTargets.targets" }
if ($imports)
{
    foreach ($i in $imports)
    {
        $buildProject.XML.RemoveChild($i)
    }
}

$project.Save() # persists the changes
$buildProject.Save()
After my NuGet package has been uninstalled, I then have to "save" the project. How can I avoid that? I'd like it to be left in a clean state so the user has no option of not saving the file after the uninstall (as that would leave the project file in a very inconsistent state!).

Forcebuild CruiseControl.net from remote command line

Is there a command-line switch to force-build CruiseControl.NET remotely? I am trying to avoid going to CCTray and force-building it manually every morning. It seems I would have to create a custom hook on the CruiseControl server by writing my own custom web service.
What about writing a PowerShell wrapper around ThoughtWorks.CruiseControl.Remote.dll? We do something very similar in a project we call CruiseHydra, which emulates the ability to split multiple tasks across several build servers. I have attempted to extract the portions that should be relevant to you here. Please note that I have not tested this exact code (our library wraps this deep in its own abstraction), but the gist of it is here:
using ThoughtWorks.CruiseControl.Remote;

public void ForceBuild(String ServerAddress, String projectToExecute)
{
    RemoteCruiseManagerFactory rcmf = new RemoteCruiseManagerFactory();
    ICruiseManager ccnetServer = rcmf.GetCruiseManager(ServerAddress);
    ccnetServer.ForceBuild(projectToExecute, "Forced By Programmatic Wrapper");
}
You can obviously change the second argument to ForceBuild to be the name of your task. It's what's shown under the 'Integration Request' section on the dashboard.
If you're building every morning, why not set up a schedule trigger instead?
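A minimal trigger block for that approach might look like this (the time, name, and weekday list are example values; see the CC.NET scheduleTrigger documentation for the full attribute set):

```xml
<triggers>
  <!-- Force a build every weekday morning at 06:00 -->
  <scheduleTrigger time="06:00" buildCondition="ForceBuild" name="MorningBuild">
    <weekDays>
      <weekDay>Monday</weekDay>
      <weekDay>Tuesday</weekDay>
      <weekDay>Wednesday</weekDay>
      <weekDay>Thursday</weekDay>
      <weekDay>Friday</weekDay>
    </weekDays>
  </scheduleTrigger>
</triggers>
```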
UPDATE BASED ON NEW INFORMATION:
If your PowerShell script can be modified to update an internally accessible web page (e.g. a timestamp in the HTML), then you can use the urlTrigger.
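A sketch of such a trigger (the URL is a placeholder for your internal page; seconds is the polling interval):

```xml
<triggers>
  <!-- Poll the page and force a build when its content changes -->
  <urlTrigger url="http://intranet/build-stamp.html" seconds="60" buildCondition="ForceBuild" name="PageChanged" />
</triggers>
```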
There is a tool called CCCmd which is included in the CC.NET installer. It is a command-line interface that allows forcing a build remotely.
Run this in the same directory as the ccnet.config file:
"C:\Program Files (x86)\CruiseControl.NET\server\ccnet.exe" -r off -p [Project Name]
I had a similar requirement: to trigger a project from NAnt/C# code. With the help of Fiddler, I found out what happens when we click 'Force Build' on the project's web dashboard.
You can send this URL to the build server. Do note the parameter in the URL: "ForceBuild=Force".
http://your-build-server/ccnet/server/local/project/your-project-name/ViewProjectReport.aspx?ForceBuild=Force
The "local" in the URL could vary depending on your configuration. To check, first fetch the project report from CCTray and see what the URL of your CruiseControl.NET project is, then modify that URL to trigger the project.
Good luck!
What about splitting the problem? Set up a new CCNET project that has a PowerShell task and a ForceBuild publisher which triggers the original project:
<cruisecontrol>
  <project name="OriginalProject">
    <!-- ... -->
  </project>
  <project name="NewProject">
    <tasks>
      <powershell>
        <script>CreateDatabase.ps</script>
        <!-- ... -->
      </powershell>
    </tasks>
    <publishers>
      <forcebuild>
        <project>OriginalProject</project>
      </forcebuild>
    </publishers>
  </project>
</cruisecontrol>
In case you want to run the original project only if the PowerShell task went through without any errors, just move the forcebuild block from the publishers to the tasks section.
You can trigger directly by submitting a HTTP post. No need to create a separate URL or URL trigger. If Powershell is an option, this works for us (note that our builds have parameters which ccnet prefixes with "param_" in the post variable names, you can omit or tailor the parameters with this prefix for your needs):
function Build-CCNetProject {
    param(
        [string] $hostname,
        [string] $server,
        [string] $username,
        [string] $password,
        [string] $project,
        [string] $param_environment,
        [string] $param_build_version,
        [string] $param_request_id
    )
    $securePassword = ConvertTo-SecureString "$password" -AsPlainText -Force
    $credential = New-Object System.Management.Automation.PSCredential ($username, $securePassword)
    $postParams = @{projectName="$project";serverName="$server";ForceBuild='Force';param_environment="$param_environment";param_build_version="$param_build_version";param_request_id="$param_request_id";submit='Build'}
    $postUrl = "http://{0}/ccnet/server/{1}/project/{2}/ViewProjectReport.aspx" -f $hostname, $server, $project
    Invoke-WebRequest -Uri $postUrl -Method POST -Body $postParams -Credential $credential
}
# Usage:
Build-CCNetProject -hostname "teamcity" -server "somehost" -username "foo\bar" -password "baz" -project "awesome-app" -param_environment "uat" -param_build_version "1.0.1.123" -param_request_id "1"