Batch script to queue builds in TFS 2015/2017

I was trying to queue builds using a batch script. I wrote one, but I'm getting this error:
please define the Build definition name.
tfsbuild start /collection:https://tfs.prod.dcx.int.bell.ca/tfs/bellca/Consumer/builds/All Definitions/{Release}/{Project-name}/{Build definition name}
How can I fix this?

The tfsbuild command line tool is only for XAML builds. For modern builds, you'll need to use the REST API or the C# wrapper for the REST API.
The documentation has good examples, but basically POST https://{instance}/DefaultCollection/{project}/_apis/build/builds?api-version={version}
with an appropriate body:
{
    "definition": {
        "id": 25
    },
    "sourceBranch": "refs/heads/master",
    "parameters": "{\"system.debug\":\"true\",\"BuildConfiguration\":\"debug\",\"BuildPlatform\":\"x64\"}"
}

Yes, just as Daniel said, you need to use the REST API; see Queue a build.
You could simply use the PowerShell script below to queue the builds (just replace the parameters accordingly):
Param(
[string]$collectionurl = "http://server:8080/tfs/DefaultCollection",
[string]$projectName = "ProjectName",
[string]$keepForever = "true",
[string]$BuildDefinitionId = "34",
[string]$user = "username",
[string]$token = "password"
)
# Base64-encodes the Personal Access Token (PAT) appropriately
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $user,$token)))
function CreateJsonBody
{
    $value = @"
{
    "definition": {
        "id": $BuildDefinitionId
    },
    "parameters": "{\"system.debug\":\"true\",\"BuildConfiguration\":\"debug\",\"BuildPlatform\":\"x64\"}"
}
"@
    return $value
}
$json = CreateJsonBody
$uri = "$($collectionurl)/$($projectName)/_apis/build/builds?api-version=2.0"
$result = Invoke-RestMethod -Uri $uri -Method Post -Body $json -ContentType "application/json" -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}
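Saved as, say, QueueBuild.ps1 (the file name is hypothetical), the script could then be invoked from a command prompt or batch script along these lines, with your own collection URL, project, definition id, and credentials substituted in:
.\QueueBuild.ps1 -collectionurl "https://tfs.prod.dcx.int.bell.ca/tfs/bellca" -projectName "Consumer" -BuildDefinitionId "25" -user "username" -token "PersonalAccessToken"
The build definition id is the numeric id of the definition (visible in its URL), not the definition name.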

Related

Sample ansible custom fact on Windows

Does anyone have a sample of a working PowerShell script to create a custom fact on Windows for Ansible? I tried something similar on Linux and it works fine, but on Windows I cannot get it to work. It does not help that I cannot find any example on the internet either!
I have tried a number of variations of the following PowerShell script, but am unable to get it to work. To be clear, I am able to display the actual custom fact, but am unable to access it using, say, ansible_facts.some_var:
$myjson = #"
{
"some_var": "some_value"
}
"#
Write-Output $myjson
Variations attempted:
Tried converting to JSON: Write-Output $myjson | ConvertTo-Json
Added/removed quotes
I've wrestled with Windows facts in the past. This is what I do:
This is a test facts file you need to place on your Windows server. I use Ansible to put it in place:
$InstanceType = $(Invoke-RestMethod -uri http://169.254.169.254/latest/meta-data/instance-type)
$AvailZone = $(Invoke-RestMethod -uri http://169.254.169.254/latest/meta-data/placement/availability-zone)
$AMI_ID = $(Invoke-RestMethod -uri http://169.254.169.254/latest/meta-data/ami-id)
@{
    local = @{
        local_facts = @{
            cloud = 'AWS'
            instance_type = $InstanceType
            avail_zone = $AvailZone
            ami_id = $AMI_ID
            region = 'eu-west-1'
            environment = 'DIT'
            Support_Team = 'Win_Team'
            Callout = '6-8'
        }
    }
}
You don't need to nest local and local_facts; that's just my personal preference.
At the top of the file, I've added a few variables to grab the AWS metadata but you can change these to something else.
I put my file in C:\TEMP\facts
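To copy the facts file into place with Ansible (as mentioned above), an ad-hoc call to the win_copy module along these lines should work; the local source path and the file name local.ps1 are assumptions:
ansible win -m win_copy -a "src=files/local.ps1 dest=C:/TEMP/facts/local.ps1"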
Then run ansible:
ansible win -m setup -a fact_path=C:/TEMP/facts
and you should get output like this:
"ansible_local": {
"local": {
"local_facts": {
"Callout": "6-8",
"Support_Team": "Win_Team",
"ami_id": "ami-07d8c98607d0b1326",
"avail_zone": "eu-west-1c",
"cloud": "AWS",
"environment": "DIT",
"instance_type": "t2.micro",
"region": "eu-west-1"
}
}
},
If you put the JSON into a file, you should be able to query it with a command like this:
cat file1 | jq '.[] .ansible_facts.ansible_local.local.local_facts'
{
    "ami_id": "ami-0c95efaa8fa6e2424",
    "avail_zone": "eu-west-1c",
    "region": "eu-west-1",
    "Callout": "6-8",
    "environment": "DIT",
    "instance_type": "t2.micro",
    "Support_Team": "Win_Team",
    "cloud": "AWS"
}
It is a late answer, but it works for me.
I use the following PowerShell script, named installed_software.ps1, to get the installed software:
# get installed software
Get-WmiObject -Class Win32_Product | select Name, Version
Put the PowerShell script in the path C:\TEMP\facts.
I call it with:
ansible win -m setup -a 'fact_path=C:/TEMP/facts gather_timeout=20 gather_subset=!hardware,!network,!ohai,!facter'
Read the documentation here: https://docs.ansible.com/ansible/latest/collections/ansible/builtin/setup_module.html
The most important thing is: do not output JSON. The setup module on Windows expects raw hashtables, arrays, or other primitive objects.
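For illustration, a facts script that returns a hashtable directly (rather than writing out a JSON string) could look like this minimal sketch; the file name disk_info.ps1 and the fact key are assumptions, and per the naming shown above the fact would surface as ansible_disk_info:
# disk_info.ps1 - return a hashtable instead of printing JSON
@{
    system_drive_free_gb = [math]::Round((Get-PSDrive C).Free / 1GB, 1)
}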
With the installed_software.ps1 script above, my output within the gathered facts looks like this:
...
"ansible_installed_software": [
{
"Name": "Office 16 Click-to-Run Extensibility Component",
"Version": "16.0.15427.20178"
},
{
"Name": "Office 16 Click-to-Run Localization Component",
"Version": "16.0.14131.20278"
},
...

Azure Databricks API: import entire directory with notebooks

I need to import many notebooks (both Python and Scala) to Databricks using the Databricks REST API 2.0.
My source path (local machine) is ./db_code and my destination (Databricks workspace) is /Users/dmitriy@kagarlickij.com
I'm trying to build a 2.0/workspace/import call, so my body is: { "content": "$SOURCES_PATH", "path": "$DESTINATION_PATH", "format": "SOURCE", "language": "SCALA", "overwrite": true }
However, I'm getting the error Could not parse request object: Illegal character, and as per the documentation, content must be "The base64-encoded content".
Should I encode all notebooks to base64?
Maybe there are some examples of importing a directory to Databricks using the API?
After some research I've managed to get it to work:
Write-Output "Task: Create Databricks Directory Structure"
Get-ChildItem "$SOURCES_PATH" -Recurse -Directory |
Foreach-Object {
$DIR = $_.FullName.split("$WORKDIR_PATH/")[1]
$BODY = #{
"path" = "$DESTINATION_PATH/$DIR"
}
$BODY_JSON = $BODY | ConvertTo-Json
Invoke-RestMethod -Method POST -Uri "https://$DATABRICKS_REGION.azuredatabricks.net/api/2.0/workspace/mkdirs" -Headers $HEADERS -Body $BODY_JSON | Out-Null
}
Write-Output "Task: Deploy Scala notebooks to Databricks"
Get-ChildItem "$SOURCES_PATH" -Recurse -Include *.scala |
Foreach-Object {
$NOTEBOOK_NAME = $_.FullName.split("$WORKDIR_PATH/")[1]
$NOTEBOOK_BASE64 = [Convert]::ToBase64String([IO.File]::ReadAllBytes("$_"))
$BODY = #{
"content" = "$NOTEBOOK_BASE64"
"path" = "$DESTINATION_PATH/$NOTEBOOK_NAME"
"language" = "SCALA"
"overwrite" = "true"
"format" = "SOURCE"
}
$BODY_JSON = $BODY | ConvertTo-Json
Invoke-RestMethod -Method POST -Uri "https://$DATABRICKS_REGION.azuredatabricks.net/api/2.0/workspace/import" -Headers $HEADERS -Body $BODY_JSON | Out-Null
}
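The $HEADERS variable isn't defined in the snippets above; assuming authentication with a Databricks personal access token (the token value here is a placeholder), it could be built like this:
# Placeholder token - replace with a real Databricks personal access token
$DATABRICKS_TOKEN = "dapi0000000000000000"
$HEADERS = @{
    "Authorization" = "Bearer $DATABRICKS_TOKEN"
    "Content-Type"  = "application/json"
}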
Yes, the notebooks must be encoded to base64. You can use PowerShell to achieve this. Try the below:
$BinaryContents = [System.IO.File]::ReadAllBytes("$SOURCES_PATH")
$EncodedContents = [System.Convert]::ToBase64String($BinaryContents)
The body for the REST call would look like this:
{
    "format": "SOURCE",
    "content": "$EncodedContents",
    "path": "$DESTINATION_PATH",
    "overwrite": "true",
    "language": "Scala"
}
With respect to importing whole directories, you could loop through the files and folders in your local directory using Powershell and make a REST call for each notebook.

How to update configuration in all environments in UrbanCode using a REST call?

I am trying to make a REST API call to update configuration in all environments in UrbanCode.
Is there any REST client, or do we need to write any custom code?
How do I start? Your kind suggestions, please.
Are you asking about updating environment properties? If so, I do this with PowerShell.
$webUrl = "https://ibm-ucd.myCompany.com"
$ucdApiUserName = "yourCLIAccount"
$ucdApiPass = "yourCLIpassword"
$appName = "MyApplication"
$environment = "ADT"
$propertyNewValue = "myNewValue"
$credential = New-Object System.Management.Automation.PSCredential ($ucdApiUserName,(ConvertTo-SecureString $ucdApiPass -AsPlainText -Force))
####################################################################
## Bypass Cert Issues with connecting to HTTPS API
####################################################################
$certData = [string][System.Net.ServicePointManager]::CertificatePolicy
if ($certData -ne "TrustAllCertsPolicy")
{
add-type #"
using System.Net;
using System.Security.Cryptography.X509Certificates;
public class TrustAllCertsPolicy : ICertificatePolicy {
public bool CheckValidationResult(
ServicePoint srvPoint, X509Certificate certificate,
WebRequest request, int certificateProblem) {
return true;
}
}
"#
[System.Net.ServicePointManager]::CertificatePolicy = New-Object TrustAllCertsPolicy
}
$hash = [ordered] @{
    "application" = "$AppName";
    "environment" = "$environment";
    "isSecure" = "false";
    "name" = "myEnvProperty";
    "value" = "$propertyNewValue"
}
$newValuesJson = $hash | ConvertTo-Json
Write-Host "Updating uDeploy environment properties for $environment"
$uri = "$WebUrl/cli/environment/propValue?"
Invoke-RestMethod -Method 'PUT' -ContentType "application/json" -Credential $credential -Uri $uri -Body $newValuesJson | Out-Null
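Since the goal is to update the property in all environments, one option is to loop over the environment names and reuse the same PUT call; the list of environment names below is a placeholder:
# Hypothetical list of environments - replace with your own environment names
$environmentNames = @("ADT", "SIT", "PROD")
foreach ($envName in $environmentNames) {
    $hash["environment"] = $envName
    $newValuesJson = $hash | ConvertTo-Json
    Write-Host "Updating uDeploy environment properties for $envName"
    Invoke-RestMethod -Method 'PUT' -ContentType "application/json" -Credential $credential -Uri $uri -Body $newValuesJson | Out-Null
}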

Problems when calling an API

We have two APIs, one doing a POST and one a GET request. Both of them used to work perfectly fine, but the API that does the POST started giving an error:
Invoke-WebRequest : The underlying connection was closed: An unexpected error occurred on a receive.
I have been researching for a few days, and all the KBs point to some sort of SSL/TLS issue and suggest adding this piece of code:
[Net.ServicePointManager]::SecurityProtocol = "SystemDefault,Tls12, Tls11, Tls, Ssl3"
but I already had this code from the start. However, I cannot find a solution to my problem.
OS : Windows 2012
PowerShell Version: 4.0
function funName ($Val1, $Val2) {
[Net.ServicePointManager]::SecurityProtocol = "SystemDefault,Tls12, Tls11, Tls, Ssl3"
#[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
#[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls -bor [Net.SecurityProtocolType]::Tls11 -bor [Net.SecurityProtocolType]::Tls12
$url = "https://someAPI/post.request/do-something"
$user = "username"
$pass = "password"
$pair = "$($user):$($pass)"
$encodedCreds = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($pair))
$basicAuthValue = "Basic $encodedCreds "
$Headers = @{
    "Accept" = "application/json"
    Authorization = $basicAuthValue
}
$Body = @{
    '@type' = 'x'
    parm1 = $Val1
    parm2 = $Val2
}
#[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
Invoke-WebRequest -Uri $url -Headers $Headers -Method Post -Body $Body | Out-Null
}
## Deactivation Request ##
funName -RequestID Val1 adID Val2
As stated earlier, this used to work up until last week.
Add this at the top of your script:
Add-Type #"
using System.Net;
using System.Security.Cryptography.X509Certificates;
namespace myTrust
{
public class TrustAllCertsPolicy : ICertificatePolicy
{
public bool CheckValidationResult( ServicePoint srvPoint, X509Certificate certificate, WebRequest request, int certificateProblem)
{
return true;
}
}
}
"#
$AllProtocols = [System.Net.SecurityProtocolType]'Ssl3,Tls,Tls11,Tls12'
[System.Net.ServicePointManager]::SecurityProtocol = $AllProtocols
[System.Net.ServicePointManager]::CertificatePolicy = New-Object myTrust.TrustAllCertsPolicy
I was working on a similar request to retrieve data from an API, and I found out that I was calling my function as funName -val1 1234 val2 9999, missing the "-" on the second parameter. As soon as I fixed that, it started working again: funName -val1 1234 -val2 9999. Thanks, Stack Overflow community, for the help on this.

TFS 2015 API - 401 - Unauthorized: Access is denied due to invalid credentials

I am trying to call the REST API to get a previous build's details, but when I try to run the script that calls the API, I get the error in the title:
401 - Unauthorized: Access is denied due to invalid credentials
It's using the credentials of the build agent on the build server. The build server can see the TFS URL, because it's able to build successfully. And if I try to call the API using my credentials, it works. It just won't work with the account that the build agent is running under.
Any ideas?
How did you set the Authorization in your script?
You can use the OAuth token to access the REST API.
To enable your script to use the build process OAuth token, go to the Options tab of the build definition and select Allow Scripts to Access OAuth Token.
Below script works on my side:
$url = "$($env:SYSTEM_TEAMFOUNDATIONCOLLECTIONURI)$env:SYSTEM_TEAMPROJECTID/_apis/build/builds/14?api-version=2.0"
Write-Host "URL: $url"
$result = Invoke-RestMethod -Uri $url -Headers @{
    Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN"
}
Write-Host "$result = $($result | ConvertTo-Json -Depth 1000)"
You can also set the authorization in the script like below (hardcoding your credentials in the script), e.g.:
Param(
[string]$collectionurl = "http://server:8080/tfs/DefaultCollection",
[string]$projectName = "ProjectName",
[string]$keepForever = "true",
[string]$BuildId = "8",
[string]$user = "UserName",
[string]$token = "Password"
)
# Base64-encodes the Personal Access Token (PAT) appropriately
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $user,$token)))
$uri = "$($collectionurl)/$($projectName)/_apis/build/builds/$($BuildId)?api-version=2.0"
$result = Invoke-RestMethod -Uri $uri -Method Get -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}
Write-Host "$result = $($result | ConvertTo-Json -Depth 1000)"
