I have a list of jobs A, B, C, D... that run regardless of whether the previous job succeeded.
The last of them is the artifact publish, which is consumed by the Release pipeline. I would like it not to run if none of the previous jobs succeeded.
I need to deploy every service that does not fail (so the failed service creations must be skipped over), but prevent the automatic launch of the linked Release pipeline if all of them failed.
Here is the YAML code given by Azure:
pool:
  name: Default
  demands:
  - msbuild
  - visualstudio

steps:
- task: VSBuild@1
  displayName: 'Generate Solution'
  inputs:
    solution: LisaMES.sln
    platform: '$(PlatformName)'
    configuration: '$(ConfigurationName)'

- task: ArchiveFiles@2
  displayName: 'Create Service A'
  inputs:
    rootFolderOrFile: '$(ServiceName)\bin\$(ConfigurationName)'
    includeRootFolder: false
    archiveFile: 'Bin_Services/$(ServiceName)_$(ConfigurationName).zip'
  condition: succeededOrFailed()

- task: ArchiveFiles@2
  displayName: 'Create Service B'
  inputs:
    rootFolderOrFile: '$(ServiceName)\bin\$(ConfigurationName)'
    includeRootFolder: false
    archiveFile: 'Bin_Services/$(ServiceName)_$(ConfigurationName).zip'
  condition: succeededOrFailed()

... C D E F like this

- task: PublishPipelineArtifact@0
  displayName: 'Publish Pipeline Artifact'
  inputs:
    artifactName: Services
    targetPath: 'Bin_Services'
  condition: succeededOrFailed()
Remove the condition on the PublishPipelineArtifact@0 task. As it currently reads, this task will run regardless of whether the preceding tasks succeeded or failed, even if a previous dependency has failed, unless the run was canceled: that is exactly the behavior succeededOrFailed() specifies in YAML.
Unless there is a reason you'd want to publish only the subset of artifacts that succeeded (which could lead to an inconsistent build state, since the same build could produce different artifacts), I would recommend removing the condition from all the tasks. The default behavior is to run each task only if the preceding tasks were successful.
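If you do need "publish when at least one service was archived", one hedged workaround (the checking step, the variable name anyServiceArchived, and the conditions are my assumptions, not part of the original pipeline) is to test whether any zip was actually produced and gate the publish step on that:

# Hypothetical sketch: set a flag only if at least one service archive exists,
# then publish only when that flag is set and the run was not canceled.
- powershell: |
    if (Test-Path 'Bin_Services\*.zip') {
      Write-Host '##vso[task.setvariable variable=anyServiceArchived]true'
    }
  displayName: 'Check for archived services'
  condition: succeededOrFailed()

- task: PublishPipelineArtifact@0
  displayName: 'Publish Pipeline Artifact'
  inputs:
    artifactName: Services
    targetPath: 'Bin_Services'
  condition: and(not(canceled()), eq(variables['anyServiceArchived'], 'true'))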
I have tried to use the ManualValidation task in my pipeline, but it did not pass because it is not installed from the Marketplace. However, when I go to the Marketplace page, I cannot find any extension for manual validation of pipelines. Can someone provide me with information about this?
My azure-pipeline.yml looks like this:
trigger:
- main

jobs:
- job: waitForValidation
  displayName: Wait for external validation
  pool: server
  timeoutInMinutes: 4320
  steps:
  - task: ManualValidation@0
    timeoutInMinutes: 1440
    inputs:
      notifyUsers: |
        myemail@name.com
      instructions: 'Please validate the build configuration and resume'
      onTimeout: 'resume'
A task is missing. The pipeline references a task called 'ManualValidation'. This usually indicates the task isn't installed, and you may be able to install it from the Marketplace: https://marketplace.visualstudio.com.
I'm working on a Kotlin Multiplatform project which is building fine locally but I can't get it to work on an Azure DevOps pipeline.
Some good things to know:
not using CocoaPods
using the embedAndSignAppleFrameworkForXcode gradlew command in Build Phases
all commands using fastlane work for multiple developers locally
we use custom configurations like ProjectADebug/ProjectARelease, but we defined KOTLIN_FRAMEWORK_BUILD_TYPE for all of them
I'm trying to get an Azure DevOps pipeline to build and upload to App Store Connect using fastlane. We are using match for signing, which works great. Archiving fails, and it looks like it fails while building the shared KMM framework.
Anybody with the same problem who could help me out? Or any tips on how I can view those gym logs on the Azure VM? I assume they say what actually went wrong, instead of this generic error.
▸ Running script 'Build Kotlin Common'
▸ Copying /Users/runner/Library/Developer/Xcode/DerivedData/Project-ffubndppzitzbxhibjgeavrhnzpw/Build/Intermediates.noindex/ArchiveIntermediates/Project/BuildProductsPath/ProjectRelease-iphoneos/Airship_AirshipCore.bundle
▸ Copying /Users/runner/Library/Developer/Xcode/DerivedData/Project-ffubndppzitzbxhibjgeavrhnzpw/Build/Intermediates.noindex/ArchiveIntermediates/Project/BuildProductsPath/Project Release-iphoneos/Airship_AirshipAutomation.bundle
** ARCHIVE FAILED **
The following build commands failed:
PhaseScriptExecution Build\ Kotlin\ Common /Users/runner/Library/Developer/Xcode/DerivedData/Project-ffubndppzitzbxhibjgeavrhnzpw/Build/Intermediates.noindex/ArchiveIntermediates/Project/IntermediateBuildFilesPath/Project.build/ProjectRelease-iphoneos/Project.build/Script-2F4970EC27CD16A000E32F91.sh (in target 'Project' from project 'Project')
(1 failure)
ERROR [2022-05-10 13:04:32.36]: Exit status: 65
ERROR [2022-05-10 13:04:32.53]: ⬆️ Check out the few lines of raw `xcodebuild` output above for potential hints on how to solve this error
WARN [2022-05-10 13:04:32.53]: 📋 For the complete and more detailed error log, check the full log at:
WARN [2022-05-10 13:04:32.53]: 📋 /Users/runner/Library/Logs/gym/Project-Project.log
This is the lane in the Fastfile:
lane :azure_beta do |options|
  label = options[:label].capitalize
  git_url = "someURL"

  match(
    type: "appstore",
    readonly: true,
    git_url: git_url,
    keychain_name: ENV["MATCH_KEYCHAIN_NAME"],
    keychain_password: ENV["MATCH_KEYCHAIN_PASSWORD"],
    verbose: true
  )

  build_app(
    project: "../Project/Project.xcodeproj",
    configuration: "#{label}Release",
    scheme: label
  )
  # fails on the build_app step...

  changelog = changelog_from_git_commits(
    pretty: "- (%ae) %s",
    date_format: "short",
    merge_commit_filtering: "exclude_merges"
  )

  upload_to_testflight(
    changelog: changelog,
    app_identifier: label == "Project" ? idsProjectA : idsProjectB,
    skip_waiting_for_build_processing: true
  )

  version_number = get_version_number(
    xcodeproj: "../Project/Project.xcodeproj",
    target: "Project", # hardcoded because we have multiple targets; label is specified in the build_app configuration
    configuration: "#{label}Release"
  )

  add_git_tag(
    includes_lane: false,
    prefix: "ios-#{label.downcase}-#{version_number}-",
    build_number: number_of_commits
  )

  delete_keychain(name: ENV["MATCH_KEYCHAIN_NAME"])
end
And this is my pipeline YAML:
pool:
  vmImage: 'macos-latest'

variables:
- group: fastlane

jobs:
- job: testflight
  steps:
  - task: Bash@3
    displayName: fastlane update
    inputs:
      targetType: 'inline'
      script: |
        gem update fastlane
        fastlane --version

  - task: JavaToolInstaller@0
    inputs:
      versionSpec: '11'
      jdkArchitectureOption: 'x64'
      jdkSourceOption: 'PreInstalled'

  - task: Bash@3
    displayName: 'Update Dependencies'
    inputs:
      targetType: 'inline'
      script: HOMEBREW_NO_AUTO_UPDATE=1 brew bundle

  - task: Bash@3
    displayName: "Set build properties"
    inputs:
      targetType: 'inline'
      script: |
        echo "sdk.dir=/Users/runner/Library/Android/sdk"
        echo "INCLUDE_MOCKER=false" >> local.properties
        echo "INCLUDE_ANDROID=false" >> local.properties
        echo "INCLUDE_TESTER=false" >> local.properties
        echo "APP_LABEL=$(APP_LABEL)" >> local.properties
    env:
      APP_LABEL: $(APP_LABEL)

  - task: Gradle@2
    displayName: 'Clean label common'
    inputs:
      workingDirectory: ''
      tasks: "common:cleanLabel"
    env:
      APP_LABEL: $(APP_LABEL)

  - task: Bash@3
    displayName: fastlane ios
    env:
      MATCH_PASSWORD: $(MATCH_PASSWORD)
      FASTLANE_PASSWORD: $(FASTLANE_PASSWORD)
      FASTLANE_SESSION: $(FASTLANE_SESSION)
      FASTLANE_APPLE_APPLICATION_SPECIFIC_PASSWORD: $(FASTLANE_APPLE_APPLICATION_SPECIFIC_PASSWORD)
    inputs:
      targetType: 'inline'
      script: |
        sudo xcode-select -s /Applications/Xcode_13.2.app
        cd ios/Project
        fastlane azure_beta label:Project app_identifier:project.bundle.id itc_team_id:itc.team.id team_id:team.id git_match_branch:master username:me@myself.com
As it turned out, there was an error in building the common KMM layer. I would probably have found it by doing a clean checkout, but I found it by using a self-hosted agent on Azure DevOps, which let me navigate to /Users/runner/Library/Logs/gym/Project-Project.log as Pylyp Dukhov suggested.
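For Microsoft-hosted agents, where you cannot browse the file system afterwards, a hedged alternative (the failed() condition and the artifact name are assumptions, not from the original pipeline) is to publish the gym log directory as a pipeline artifact whenever the job fails:

# Hedged sketch: upload fastlane's gym logs as a build artifact on failure,
# so the full xcodebuild output can be downloaded from the run summary page.
- task: PublishPipelineArtifact@1
  displayName: 'Publish gym logs'
  condition: failed()
  inputs:
    targetPath: '/Users/runner/Library/Logs/gym'
    artifactName: 'gym-logs'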
Context: performing Android instrumentation (UI) tests with Azure Pipelines.
There are 2 jobs: one does the testing (launches an emulator and runs the tests), and the other reports an error if the first job fails for some reason.
I have the following simple setup in my Azure Pipelines:
jobs:
- job: SmokeTesting
  displayName: Smoke testing
  timeoutInMinutes: 60
  pool:
    vmImage: 'macOS-latest'
  steps:
  - script: meta/scripts.sh launch_avd
    displayName: Launch AVD
    workingDirectory: ''
  - task: Gradle@2
    displayName: Run smoke tests
    inputs:
      workingDirectory: ''
      gradleWrapperFile: 'gradlew'
      publishJUnitResults: true
      tasks: ':app:connectedAndroidTest'

- job: ReportFailure
  displayName: Report failure
  dependsOn:
  - SmokeTesting
  condition: or(failed(), canceled())
  steps:
  - script: meta/scripts.sh report_smoke_tests_error
    workingDirectory: ''
    env:
      BUILD_ID: $(Build.BuildId)
It all works as expected: if there is an error, the second job runs. In this case, the log in the Azure Pipelines web UI contains very useful information that I would like to have access to in the second job:
* What went wrong:
Execution failed for task ':app:stripDebugDebugSymbols'.
> No version of NDK matched the requested version 22.0.7026061. Versions available locally: 18.1.5063045, 21.3.6528147, 21.3.6528147
How do I get the "What went wrong" message in my second job?
My idea is to use a multi-stage variable to record the message in the first job and then read it in the second one. Unfortunately, I haven't figured out how to capture this message in the first place.
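For reference, the cross-job output variable mechanism alluded to above looks roughly like this (a sketch; the step name setMsg and the placeholder value are hypothetical, and capturing the actual Gradle error text is exactly the part that remains unsolved):

jobs:
- job: SmokeTesting
  steps:
  # Hypothetical: a named step exposes an output variable on failure...
  - script: echo "##vso[task.setvariable variable=errorMsg;isOutput=true]<captured message>"
    name: setMsg
    condition: failed()

- job: ReportFailure
  dependsOn: SmokeTesting
  condition: or(failed(), canceled())
  variables:
    # ...which the dependent job maps in through the dependencies context.
    errorMsg: $[ dependencies.SmokeTesting.outputs['setMsg.errorMsg'] ]
  steps:
  - script: echo "$(errorMsg)"
    displayName: Print the captured message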
As a workaround, you can use the Build Timeline API to get detailed build information. The API response contains an issues property; you can check there whether any errors were recorded:
https://dev.azure.com/{org}/{pro}/_apis/build/builds/3838/timeline/?api-version=6.0
If issues does not contain the error message you want, you can locate the record for the Gradle@2 task in the response body and obtain the log URL from its log property.
By calling this log URL, you can get the log of the Gradle@2 task and then parse it for the desired message.
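A minimal sketch of that approach, run from the ReportFailure job (it assumes jq is available on the agent and that the job is authorized to use the predefined System.AccessToken):

- script: |
    # Query the build timeline and print the message of every recorded issue.
    curl -s -u ":$SYSTEM_ACCESSTOKEN" \
      "$(System.CollectionUri)$(System.TeamProject)/_apis/build/builds/$(Build.BuildId)/timeline?api-version=6.0" \
      | jq -r '.records[] | select(.issues != null) | .issues[].message'
  displayName: Print issues from the build timeline
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)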
The pipeline YAML file is very simple: 1) make, 2) run the exe file.
The pipeline returns PASS even if the exe file returns a non-zero return code.
The return code is not even displayed in the job log.
How do I check the execution return code and fail the pipeline?
trigger:
- main

pool:
  vmImage: 'windows-latest'

steps:
- script: make
  displayName: '[step] make'
- script: main.exe
  displayName: '[step] execute main()'
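One way to surface and enforce the return code explicitly (a sketch, assuming the script step runs under cmd.exe on the windows-latest image):

- script: |
    main.exe
    echo main.exe returned %ERRORLEVEL%
    rem propagate a non-zero return code so the step, and the pipeline, fails
    if %ERRORLEVEL% NEQ 0 exit /b %ERRORLEVEL%
  displayName: '[step] execute main()'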
Situation
I'm working on an Azure DevOps Server 2020 with only one agent.
I have 2 build pipelines:
Build pipeline (yaml)
Merge pipeline (yaml)
Each pipeline contains multiple stages, each containing only one job (because all tasks of a stage must run on the same agent).
Current behavior
If I run the two pipelines at the same time, the agent runs both in a "fake" parallel mode that makes both builds very slow.
Example of the agent's processing order:
build-stage1, build-stage2, merge-stage1, merge-stage2, merge-stage3, merge-stage4, build-stage3...
Wanted behavior
This is not unexpected when there are more agents than build executions, but that will never be my case.
So I would prefer to lock the agent to the current build (similar to the built-in behavior in Jenkins).
Example of the wanted processing order:
build-stage1, build-stage2, build-stage3, build-stage4, build-stage5(latest), merge-stage1, merge-stage2, merge-stage3, merge-stage4, merge-stage5(latest)
Is it possible to set the agent's work attribution policy?
This occurs because a job is not added to the agent queue while it still depends on something (and by default each stage depends on the previous one).
Using dependsOn: [] tells Azure DevOps that a stage depends on nothing, so all jobs are added to the queue right away and are executed in FIFO order.
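Applied to the Build Pipeline example further below, that would look roughly like this (stage names taken from that example; note that with more than one agent, dependsOn: [] also allows the stages to run in parallel, so this only yields FIFO order on a single agent):

stages:
- stage: Build_Stage1
  ....
- stage: Build_Stage2
  dependsOn: []  # queued immediately instead of waiting for Build_Stage1
  ....
- stage: Build_Stage3
  dependsOn: []  # with a single agent, queued jobs still run in FIFO order
  ....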
Agree with Dom.
Based on my test, I could reproduce this behavior, but I am afraid there is no built-in way to lock the agent so that it completes one pipeline before taking another one.
As a workaround, you could use a pipeline trigger to run the Merge Pipeline after the Build Pipeline:
Build Pipeline:
pool:
  name: Default
stages:
- stage: Build_Stage1
  displayName: Stage1
  ....
- stage: Build_Stage2
  displayName: Stage2
  dependsOn: Build_Stage1
  ....
- stage: Build_Stage3
  displayName: Stage3
  dependsOn: Build_Stage2
  ....
Merge Pipeline:
resources:
  pipelines:
  - pipeline: TestTrigger
    source: ABC
    trigger:
      branches:
      - '*'

pool:
  name: Default
stages:
- stage: Merge_Stage1
  displayName: Merge1
  ...
- stage: Merge_Stage2
  displayName: Merge2
  dependsOn: Merge_Stage1
  ...
- stage: Merge_Stage3
  displayName: Merge3
  dependsOn: Merge_Stage2
  ...
In this case, you could queue the Build Pipeline alone; the Merge Pipeline will then be triggered after the Build Pipeline completes.
The process: build-stage1, build-stage2, build-stage3, build-stage4, build-stage5(latest) -> trigger -> merge-stage1, merge-stage2, merge-stage3, merge-stage4, merge-stage5(latest)
On the other hand, this requirement is a valuable one. To get this feature, you could add a request for it on our UserVoice site, which is our main forum for product suggestions. Thank you for helping us build a better Azure DevOps.