How to set a build configuration-dependent variable in AppVeyor - visual-studio

For the Debug build configuration of Visual Studio, I want to exclude certain tests from CTest. My idea was to do something like
matrix:
  - configuration: Release
    environment:
      EXCLUDETESTS: ""
  - configuration: Debug
    environment:
      EXCLUDETESTS: "solver"
i.e. creating a corresponding string environment variable called EXCLUDETESTS per configuration.
But this gives me Error parsing appveyor.yml: "matrix" section must be a mapping. (Line: 15, Column: 3), though the syntax should be fine according to http://yaml-online-parser.appspot.com/
The complete appveyor.yml file reads
version: "{build}"
os:
  - Visual Studio 2017
  - Visual Studio 2015
# x64 is a CMake-compatible solution platform name.
# This allows us to pass %PLATFORM% to CMake -A.
platform:
  - x64
# Build Configurations, i.e. Debug, Release, etc.
# EXCLUDETESTS determines which tests will not be run
matrix:
  - configuration: Release
    environment:
      EXCLUDETESTS: ""
  - configuration: Debug
    environment:
      EXCLUDETESTS: "solver"
environment:
  - PYTHON: "C:\\Python36-x64"
# Cmake will autodetect the compiler, but we set the arch
before_build:
  - set PATH=%PYTHON%;%PATH%
  - set CXXFLAGS=%additional_flags%
  - cmake -H. -BBuild -A%PLATFORM% -DUI_CXX_USE_QT=OFF
# Build with MSBuild
build:
  project: Build\spirit.sln  # path to Visual Studio solution or project
  parallel: true             # enable MSBuild parallel builds
  verbosity: normal          # MSBuild verbosity level {quiet|minimal|normal|detailed}
install:
  - "%PYTHON%/Scripts/pip.exe install numpy"
test_script:
  - cd Build
  - ctest --output-on-failure -C %CONFIGURATION% -E %EXCLUDETESTS%

Please check the Exclude configuration from the matrix section of the AppVeyor documentation. Something like this should work for you:
configuration:
  - Debug
  - Release

environment:
  matrix:
    - EXCLUDETESTS: solver
    - EXCLUDETESTS:

matrix:
  exclude:
    - configuration: Release
      EXCLUDETESTS: solver
    - configuration: Debug
      EXCLUDETESTS:
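Folded into the appveyor.yml from the question, the changed sections might look roughly like this. This is only a sketch: it assumes AppVeyor's documented environment global/matrix split so the PYTHON variable can coexist with the per-job EXCLUDETESTS values, and the remaining sections (os, platform, before_build, build, install, test_script) stay exactly as posted:

configuration:
  - Debug
  - Release

environment:
  global:
    PYTHON: "C:\\Python36-x64"   # common variable for every job
  matrix:
    - EXCLUDETESTS: solver       # paired with Debug via the exclude rules below
    - EXCLUDETESTS:              # paired with Release via the exclude rules below

matrix:
  exclude:
    - configuration: Release
      EXCLUDETESTS: solver
    - configuration: Debug
      EXCLUDETESTS: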

Related

GitHub Actions on Windows host (PowerShell?): exit code of previous lines being ignored

I had this step in a macOS lane:
jobs:
  macOS_build:
    runs-on: macOS-latest
    steps:
      - uses: actions/checkout@v1
      - name: Build in DEBUG and RELEASE mode
        run: ./configure.sh && make DEBUG && make RELEASE
Then I successfully split it up this way:
jobs:
  macOS_build:
    runs-on: macOS-latest
    steps:
      - name: Build in DEBUG and RELEASE mode
        run: |
          ./configure.sh
          make DEBUG
          make RELEASE
This conversion works because if make DEBUG fails, make RELEASE won't be executed and the whole step is marked as FAILED by GitHub Actions.
However, trying to convert this from the Windows lane:
jobs:
  windows_build:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v1
      - name: Build in DEBUG and RELEASE mode
        shell: cmd
        run: configure.bat && make.bat DEBUG && make.bat RELEASE
To this:
jobs:
  windows_build:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v1
      - name: Build in DEBUG and RELEASE mode
        shell: cmd
        run: |
          configure.bat
          make.bat DEBUG
          make.bat RELEASE
This doesn't work because, strangely enough, only the first line is executed. So I tried changing the shell attribute to powershell:
jobs:
  windows_build:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v1
      - name: Build in DEBUG and RELEASE mode
        shell: powershell
        run: |
          configure.bat
          make.bat DEBUG
          make.bat RELEASE
However this fails with:
configure.bat : The term 'configure.bat' is not recognized as the name
of a cmdlet, function, script file, or operable program. Check the
spelling of the name, or if a path was included, verify that the path
is correct and try again.
Then I saw this other SO answer, so I converted it to:
jobs:
windows_build:
runs-on: windows-latest
steps:
- uses: actions/checkout#v1
- name: Build in DEBUG and RELEASE mode
shell: powershell
run: |
& .\configure.bat
& .\make.bat DEBUG
& .\make.bat RELEASE
This finally launches all batch files independently, however it seems to ignore the exit code (so if configure.bat fails, it still runs the next lines).
Any idea how to separate lines properly in a GitHub Actions workflow?
In PowerShell, you'll have to check the automatic $LASTEXITCODE variable after each call if you want to take action on the (nonzero) exit code of the most recently executed external program or script:
if ($LASTEXITCODE) { exit $LASTEXITCODE }
If you want to keep the code small, you could check for intermediate success vs. failure via the automatic $? variable, a Boolean that contains $true if the most recent command or expression succeeded; for external programs, success is inferred from an exit code of 0:
.\configure.bat
if ($?) { .\make.bat DEBUG }
if ($?) { .\make.bat RELEASE }
exit $LASTEXITCODE
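Put back into the workflow from the question, that step might look like this (a sketch; the job and step names are simply taken from the question):

jobs:
  windows_build:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v1
      - name: Build in DEBUG and RELEASE mode
        shell: powershell
        run: |
          # run each batch file only if the previous one succeeded,
          # then surface the final exit code to GitHub Actions
          .\configure.bat
          if ($?) { .\make.bat DEBUG }
          if ($?) { .\make.bat RELEASE }
          exit $LASTEXITCODE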
Note that if you were to use PowerShell (Core) 7+, you could use the bash-like approach, since && and ||, the pipeline-chain operators, are now supported; as long as you end each statement-internal line with &&, you can place each call on its own line:
# PSv7+
.\configure.bat &&
.\make.bat DEBUG &&
.\make.bat RELEASE
However, note that any nonzero exit code is mapped onto 1 when the PowerShell CLI is called via -Command, which is what I presume happens behind the scenes, and assuming that an external program is called last. That is, the specific nonzero exit code is lost. If it is of interest, append an exit $LASTEXITCODE line to the above.
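On the hosted windows-latest runners, which also ship PowerShell 7, you could select it explicitly via shell: pwsh and use the pipeline-chain operators directly. A small sketch of how the step might look:

      - name: Build in DEBUG and RELEASE mode
        shell: pwsh
        run: |
          .\configure.bat &&
          .\make.bat DEBUG &&
          .\make.bat RELEASE
          exit $LASTEXITCODE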

Stop a GitHub Actions matrix case

I want to use a GitHub Actions matrix for different build types, but there's one case of the matrix that I'm not interested in supporting. How do I stop this case from running but still get the build marked as successful?
In this particular case I want to build Windows and Ubuntu, 32bit and 64bit but I'm not interested in supporting 32bit on Ubuntu. So my matrix would be:
strategy:
  fail-fast: false
  matrix:
    os: [windows-latest, ubuntu-latest]
    platform: ['x64', 'x86']
My current solution is to stop each step from running by adding an if expression:
- name: Build Native
  if: ${{ ! (matrix.os == 'ubuntu-latest' && matrix.platform == 'x86') }}
While this works okay, I feel there ought to be a more elegant way of solving this. Can anyone help make my yaml script more beautiful?
Perhaps the strategy.matrix.exclude directive is suitable?
From the documentation:
You can remove a specific configuration defined in the build matrix using the exclude option. Using exclude removes a job defined by the build matrix.
So in your case, probably something like this:
strategy:
  matrix:
    os: [windows-latest, ubuntu-latest]
    platform: ['x64', 'x86']
    exclude:
      - os: ubuntu-latest
        platform: x86
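For completeness, here is roughly how that slots into the job from the question; the excluded combination is never scheduled, so the per-step if expression can be dropped. This is only a sketch: the job name and the build command are placeholders.

jobs:
  build:
    runs-on: ${{ matrix.os }}
    strategy:
      fail-fast: false
      matrix:
        os: [windows-latest, ubuntu-latest]
        platform: ['x64', 'x86']
        exclude:
          - os: ubuntu-latest
            platform: x86
    steps:
      - name: Build Native
        run: echo "building ${{ matrix.platform }} on ${{ matrix.os }}"  # placeholder build command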
There are situations where one wants to include or exclude specific matrix coordinates so as not to run some of them, yet (stretching the question a bit) still wants the job to run for a couple of these coordinates, so as to track their evolution across commits without blocking the whole process.
In that situation, continue-on-error at the job level, combined with matrix include and exclude, is very useful:
Prevents a workflow run from failing when a job fails. Set to true to allow a workflow run to pass when this job fails.
This is similar to GitLab CI's allow_failure, although at time of writing GitHub Actions UI only has two states (red failed and green passed) whereas GitLab introduces a third one (orange warning) in that case.
Here is a real-life workflow example:
jobs:
  linux:
    continue-on-error: ${{ matrix.experimental }}
    strategy:
      fail-fast: false
      matrix:
        os:
          - ubuntu-20.04
        container:
          - 'ruby:2.0'
          - 'ruby:2.1'
          - 'ruby:2.2'
          - 'ruby:2.3'
          - 'ruby:2.4'
          - 'ruby:2.5'
          - 'ruby:2.6'
          - 'ruby:2.7'
          - 'ruby:3.0'
          - 'ruby:2.1-alpine'
          - 'ruby:2.2-alpine'
          - 'ruby:2.3-alpine'
          - 'ruby:2.4-alpine'
          - 'ruby:2.5-alpine'
          - 'ruby:2.6-alpine'
          - 'ruby:2.7-alpine'
          - 'jruby:9.2-jdk'
        experimental:
          - false
        include:
          - os: ubuntu-20.04
            container: 'ruby:3.0.0-preview2'
            experimental: true
          - os: ubuntu-20.04
            container: 'ruby:3.0.0-preview2-alpine'
            experimental: true

How to set up CI/CD for NativeScript using Visual Studio Online/Azure DevOps tools

I'm trying to set up a CI/CD pipeline for a NativeScript app. I added the commands to install Node and run npm install, but NativeScript has dependencies beyond that. How do I handle this on Azure DevOps dynamically, without having to create a VM that has NativeScript and all of its dependencies installed and set up?
So far I have used a VM with NativeScript installed on it and an agent connected to that machine to build the solution. I have done the same using Jenkins, but Jenkins was running on the VM; now I want to move the whole pipeline to Azure DevOps.
command used in build step: tns build android
If you don't want to use a VM, you'll have to install everything NativeScript needs on the hosted agent each time you create a build for your app.
A couple of important things to note. First, the name of your repository is changed to 's', which messes with the naming of your entitlement file... or at least it did for me. I fix this with a bash file, added to my repository, that changes the path in build.xcconfig for the CODE_SIGN_ENTITLEMENTS variable; I added an npm run entitle command to my package.json to run it before building. Second, you'll want to store all files and secure passwords in the Library section under Pipelines in your Azure DevOps project. Third, the classic editor is your best friend for figuring out YAML, as most jobs have an option to view the YAML; you can also use the classic editor as an alternative to the YAML file.
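If you keep those passwords in a Library variable group, one way to pull them into a YAML pipeline is a group reference at the top of the variables block; a minimal sketch (the group name is made up, and the group would hold the variables referenced below, such as MyPlayStoreKeystore and myCertificatePasswordP12):

variables:
  - group: 'mobile-release-secrets'  # hypothetical Library variable group holding the secrets
  - name: scheme
    value: 's'
  - name: sdk
    value: 'iphoneos'
  - name: configuration
    value: 'Release'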
The YAML and bash file below show an example of how you can build an ipa and an apk file that are stored as artifacts. You can then use those to trigger a release pipeline that pushes to the Play Store and the App Store.
# YAML File
name: Release Build
trigger:
  - release/*  # will start a build for pull requests into release branches, e.g. release/version_1_0_0, release/version_2_0_0
pool:
  vmImage: 'macOS-10.13'
variables:
  scheme: 's'  # default name/scheme created on this machine for ipa
  sdk: 'iphoneos'
  configuration: 'Release'
steps:
  - task: NodeTool@0
    inputs:
      versionSpec: '10.14'
    displayName: 'Install Node.js'

  # Download Secure File for Android
  # Note: if multiple secure files are downloaded... variable name will change and break pipeline
  - task: DownloadSecureFile@1
    displayName: 'download android keystore file'
    inputs:
      secureFile: myKeystore.keystore

  # Install Apple Certificate (Distribution)
  - task: InstallAppleCertificate@2
    displayName: 'Install an Apple certificate Distribution (yourcertificate.p12)'
    inputs:
      certSecureFile: '00000000-0000-0000-0000-000000000000'  # got id from viewing file in classic editor for pipeline
      certPwd: '$(myCertificatePasswordP12)'  # password stored in Library

  # Install Apple Provisioning Profile (Distribution)
  - task: InstallAppleProvisioningProfile@1
    displayName: 'Apple Provisioning Profile (myProvisioningProfile.mobileprovision)'
    inputs:
      provisioningProfileLocation: 'secureFiles'  # Options: secureFiles, sourceRepository
      provProfileSecureFile: '00000000-0000-0000-0000-000000000000'  # Required when provisioningProfileLocation == secureFiles

  # General Set Up
  - script: |
      npm install -g nativescript@latest
      npm install
    displayName: 'Install native script and node Modules'

  # variable explanation
  # $DOWNLOADSECUREFILE_SECUREFILEPATH is the keystore file downloaded earlier
  # $KEYSTORE_PASSWORD refers to the environment variable in this script which references a Library variable
  # $(MyPlayStoreAlias) refers to the Library variable for your app's alias
  # $BUILD_SOURCESDIRECTORY is the location the apk is built to

  # Android
  - script: |
      tns build android --env.production --release --key-store-path $DOWNLOADSECUREFILE_SECUREFILEPATH --key-store-password $KEYSTORE_PASSWORD --key-store-alias $(MyPlayStoreAlias) --key-store-alias-password $KEYSTORE_PASSWORD --bundle --copy-to $BUILD_SOURCESDIRECTORY # creates apk
    displayName: 'Build Android Release apk'
    env:
      KEYSTORE_PASSWORD: $(MyPlayStoreKeystore)

  # create apk artifact
  - task: PublishBuildArtifacts@1
    inputs:
      pathtoPublish: '$(Build.SourcesDirectory)/app-release.apk'
      artifactName: 'apkDrop'
    displayName: 'Publishing apkDrop artifact'

  # have to use Xcode 10.1 to meet minimum standards for uploading the ipa... the default version for this machine was lower than 10.1
  # changing Xcode version
  - script: |
      xcodebuild -version
      /bin/bash -c "echo '##vso[task.setvariable variable=MD_APPLE_SDK_ROOT;]'/Applications/Xcode_10.1.app;sudo xcode-select --switch /Applications/Xcode_10.1.app/Contents/Developer"
      xcodebuild -version
    displayName: 'changing xcode to 10.1'

  # Optional... was running into build issues with the latest version
  # downgrading CocoaPods version
  - script: |
      sudo gem uninstall cocoapods
      sudo gem install cocoapods -v 1.5.3
    displayName: 'Using cocoapods version 1.5.3'

  # iOS
  - script: |
      xcodebuild -version # making sure the correct Xcode version is being used
      pip install --ignore-installed six # fixes pip six error
      npm run entitle # custom bash script used to change the entitlement file
      tns run ios --provision # see what provisioning profile and certificate are installed... helpful for debugging
      tns build ios --env.production --release --bundle # creates xcworkspace
    displayName: 'Build ios Release xcworkspace'

  # build and sign ipa
  - task: Xcode@5
    displayName: 'Xcode sign and build'
    inputs:
      sdk: '$(sdk)'  # custom var
      scheme: '$(scheme)'  # must be provided if setting manual path to xcworkspace
      configuration: '$(configuration)'  # custom var
      xcodeVersion: 'specifyPath'
      xcodeDeveloperDir: '/Applications/Xcode_10.1.app'  # using Xcode 10.1
      xcWorkspacePath: 'platforms/ios/s.xcworkspace'
      exportPath: '$(agent.buildDirectory)/output/$(sdk)/$(configuration)'  # location where ipa file will be stored
      packageApp: true  # create ipa
      signingOption: manual
      signingIdentity: '$(APPLE_CERTIFICATE_SIGNING_IDENTITY)'  # distribution certificate
      provisioningProfileUuid: '$(APPLE_PROV_PROFILE_UUID)'  # distribution profile

  # creating ipa artifact
  - task: PublishBuildArtifacts@1
    displayName: 'Publishing ipaDrop artifact'
    inputs:
      pathtoPublish: '$(agent.buildDirectory)/output/$(sdk)/$(configuration)/s.ipa'
      artifactName: 'ipaDrop'
Bash file
#!/usr/bin/env bash
# filename: pipeline-entitlements.sh
echo "Editing build.xcconfig"
TARGET_KEY="CODE_SIGN_ENTITLEMENTS"
REPLACEMENT_VALUE="s\/Resources\/YOURENTITLEMENTFILENAME.entitlements"
CONFIG_FILE="./app/App_Resources/iOS/build.xcconfig"
echo "Editing $TARGET_KEY and replaceing value with $REPLACEMENT_VALUE"
sed -i.bak "s/\($TARGET_KEY *= *\).*/\1$REPLACEMENT_VALUE/" $CONFIG_FILE
echo "Finished editing build.xcconfig"

Change compose file generated by Visual Studio Docker Tools

I have an ASP.NET Core 2 project that uses the UseWebpackDevMiddleware from Microsoft.AspNetCore.SpaServices.Webpack.
Unfortunately the aspnet webpack node plugin is complaining about Error: ENOENT: no such file or directory, lstat 'c:\ContainerMappedDirectories'. See NodeJS Issue for details.
There's a workaround, but I can't try it out because I cannot change the docker compose file that Visual Studio is generating. The file it generates under obj\Docker\docker-compose.vs.debug.g.yml always creates a volume mapping from my project to C:\app, but for the workaround I need it to point to G:\.
Any idea how I can force Visual Studio to use different values when it generates these debugger compose files?
This is what the generated file looks like:
version: '3.6'
services:
  employeemap.app:
    image: employeemapapp:dev
    environment:
      - DOTNET_USE_POLLING_FILE_WATCHER=1
      - NUGET_PACKAGES=C:\.nuget\packages
      - NUGET_FALLBACK_PACKAGES=c:\.nuget\fallbackpackages
    volumes:
      - C:\Users\nswimberghe\projects\EmployeeMap\EmployeeMap.App:C:\app
      - C:\Users\nswimberghe\onecoremsvsmon\15.0.27428.1:C:\remote_debugger:ro
      - C:\Users\nswimberghe\.nuget\packages\:c:\.nuget\packages:ro
      - C:\Program Files\dotnet\sdk\NuGetFallbackFolder:c:\.nuget\fallbackpackages:ro
    entrypoint: C:\\remote_debugger\\x64\\msvsmon.exe /noauth /anyuser /silent /nostatus /noclrwarn /nosecuritywarn /nofirewallwarn /nowowwarn /timeout:2147483646
    labels:
      com.microsoft.visualstudio.debuggee.program: "\"C:\\Program Files\\dotnet\\dotnet.exe\""
      com.microsoft.visualstudio.debuggee.arguments: " --additionalProbingPath c:\\.nuget\\packages --additionalProbingPath c:\\.nuget\\fallbackpackages bin\\Debug\\netcoreapp2.0\\EmployeeMap.App.dll"
      com.microsoft.visualstudio.debuggee.workingdirectory: "C:\\app"
      com.microsoft.visualstudio.debuggee.killprogram: "C:\\remote_debugger\\x64\\utils\\KillProcess.exe dotnet.exe"
There are other containers in the compose file which I removed for simplicity.
Only the employeemap.app service should use G:\.
Add a new file "docker-compose.vs.debug.yml" to your docker-compose project and fill it like this:
version: '3.6'
services:
  employeemap.app:
    volumes:
      - .\EmployeeMap.App:G:\
    labels:
      com.microsoft.visualstudio.debuggee.workingdirectory: "G:\\"

GitLab CI. Path in yml for different users

I'm trying to set up GitLab CI for a .NET project, and I'm now writing the script in the yml file. What I want to know: the paths to msbuild.exe and mstest.exe may differ between team members, so how can the same yml script work for different users?
Or maybe I misunderstand how GitLab CI works?
The paths to mstest.exe and all other referenced executables and files are based on the machine that runs the GitLab Runner.
What's on your machine or anyone else's doesn't matter; only the build server matters, so write your GitLab .yml accordingly.
Sample .net yml file
##variables:
## increase indentation carefully, one space per cascade level.
## THIS IS YAML. NEVER USE TABS.
stages:
  - build
  - deploy

#BUILD
# Builds all working branches
working:
  stage: build
  except:
    - master
  script:
    - echo "Build Stage"
    - echo "Restoring NuGet Packages..."
    - '"c:\nuget\nuget.exe" restore "SOLUTION PATH"'
    # - '"c:\nuget\nuget.exe" restore "ANOTHER ABSOLUTE PATH TO YOUR SOLUTION"'
    - ''
    - echo "Building Solutions..."
    - C:\Windows\Microsoft.NET\Framework64\v4.0.30319\msbuild.exe /consoleloggerparameters:ErrorsOnly /maxcpucount /nologo /property:Configuration=Release /verbosity:quiet "SOLUTION PATH"

# Builds all stable/master pushes
stable:
  stage: build
  only:
    - master
  script:
    - echo "Build Stage"
    - echo "Restoring NuGet Packages..."
    - '"c:\nuget\nuget.exe" restore "SOLUTION PATH"'
    # - '"c:\nuget\nuget.exe" restore "ANOTHER ABSOLUTE PATH TO YOUR SOLUTION"'
    - ''
    - echo "Building Solutions..."
    - C:\Windows\Microsoft.NET\Framework64\v4.0.30319\msbuild.exe /consoleloggerparameters:ErrorsOnly /maxcpucount /nologo /property:Configuration=Release /verbosity:quiet "SOLUTION PATH"

#DEPLOY
# deploys the dev branch (job name is a placeholder; the original snippet omitted it)
deploy_dev:
  stage: deploy
  only:
    - dev
  script:
    - echo "Deploy Stage"
    # SEND TO YOUR DEV SERVER

## deploy latest master to the correct servers (job name is a placeholder; the original snippet omitted it)
deploy_production:
  stage: deploy
  only:
    - master
  script:
    - echo "Deploy Stage"
    # SEND TO YOUR PRODUCTION SERVER
  tags:
    - .NET
    # put tags here you put on your runners so you can hit the right runners when you push your code
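If the long tool paths bother you (or the runner machine ever changes), one option is to hoist them into variables at the top of the file and reference them in the scripts. A small sketch, assuming the Windows runner uses the cmd shell so %VAR% expansion applies:

variables:
  MSBUILD: 'C:\Windows\Microsoft.NET\Framework64\v4.0.30319\msbuild.exe'
  NUGET: 'c:\nuget\nuget.exe'

working:
  stage: build
  except:
    - master
  script:
    - echo "Build Stage"
    - '"%NUGET%" restore "SOLUTION PATH"'
    - '"%MSBUILD%" /property:Configuration=Release /verbosity:quiet "SOLUTION PATH"'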
