I am trying to build a Docker image at build time on an x86 host, for runtime on an ARM host. To do this you need to use QEMU for the cross-build (emulating the ARM binaries during the build). I also want to use Alpine Linux since the image size is so small.
However I am encountering an unusual error that only happens at build time: a problem installing bash.
My understanding is that when running apk -U add bash, apk updates the package list from the repositories and then installs the latest version of the requested package. It then runs post-install scripts. It seems these post-install scripts fail.
However, when I built the image without bash, ran the container interactively on the ARM host, and did apk fix && apk -U add bash, it did the trick. Running the same command at build time fails, however.
How can I add bash at build time?
Dockerfile
FROM armhf/alpine:3.5
ENV CONSUL_PREFIX __CONSUL_PREFIX__
ENV CONSUL_SECRET_PREFIX __CONSUL_SECRET_PREFIX__
ENV QEMU_EXECVE 1
COPY deploy/qemu/qemu-arm-static /usr/bin/
RUN ["qemu-arm-static","/sbin/apk","fix"]
RUN ["qemu-arm-static","/sbin/apk","add","-U","bash"]
RUN ["qemu-arm-static","/sbin/apk","-U","add", \
"postgresql-client",\
"curl","vim",\
"tzdata","bc"]
RUN ["qemu-arm-static","/bin/cp","usr/share/zoneinfo/America/Los_Angeles","/etc/localtime"]
RUN ["qemu-arm-static","/bin/echo","America/Los_Angeles",">","/etc/timezone"]
RUN ["qemu-arm-static","/bin/rm","-rf","/var/cache/apk/*"]
RUN ["qemu-arm-static","/bin/sh"]
COPY deploy /usr/local/deploy
COPY deploy/default/bashrc /root/.bashrc
COPY deploy/default/vimrc /root/.vimrc
COPY src /src
Build log / Error
#C02NN3NBG3QT:dev-resources $ ./publish-image
+ : router-logs
+ : quay.io
+ : quay.io/skilbjo/router-logs
+ : skilbjo@github.com
++ echo router-logs
++ tr - _
+ : router_logs/config
++ echo router-logs
++ tr - _
+ : router_logs/secrets
+ cat ../deploy/default/Dockerfile
+ sed 's;__CONSUL_PREFIX__;router_logs/config;'
+ sed 's;__CONSUL_SECRET_PREFIX__;router_logs/secrets;'
+ IMAGE_TAG=dev
+ cd ..
++ git rev-parse HEAD
+ echo 0a865e3918d584b4377fad9afe9ba28a1dbe5968
+ docker build --rm -t quay.io/skilbjo/router-logs:dev .
Sending build context to Docker daemon 8.713 MB
Step 1 : FROM armhf/alpine:3.5
---> 3ddfeafc01f0
Step 2 : ENV CONSUL_PREFIX router_logs/config
---> Using cache
---> e2aae782f6d8
Step 3 : ENV CONSUL_SECRET_PREFIX router_logs/secrets
---> Using cache
---> 71c863da2558
Step 4 : ENV QEMU_EXECVE 1
---> Using cache
---> a7e80415d0d4
Step 5 : COPY deploy/qemu/qemu-arm-static /usr/bin/
---> Using cache
---> 265df9b6575f
Step 6 : RUN qemu-arm-static /sbin/apk fix
---> Using cache
---> def74ac67891
Step 7 : RUN qemu-arm-static /sbin/apk add -U bash
---> Running in 6f62d2ecd6b3
fetch http://dl-cdn.alpinelinux.org/alpine/v3.5/main/armhf/APKINDEX.tar.gz
fetch http://dl-cdn.alpinelinux.org/alpine/v3.5/community/armhf/APKINDEX.tar.gz
(1/5) Installing ncurses-terminfo-base (6.0-r7)
(2/5) Installing ncurses-terminfo (6.0-r7)
(3/5) Installing ncurses-libs (6.0-r7)
(4/5) Installing readline (6.3.008-r4)
(5/5) Installing bash (4.3.46-r5)
Executing bash-4.3.46-r5.post-install
ERROR: bash-4.3.46-r5.post-install: script exited with error 1
Executing busybox-1.25.1-r0.trigger
ERROR: busybox-1.25.1-r0.trigger: script exited with error 1
1 errors; 7 MiB in 16 packages
The command 'qemu-arm-static /sbin/apk add -U bash' returned a non-zero code: 1
Project repo is here: https://github.com/skilbjo/router-logs
I had a similar error using Buildx's multiarch option. It was fixed thanks to the following commands:
docker run --rm --privileged multiarch/qemu-user-static --reset -p yes
docker buildx rm builder
docker buildx create --name builder --driver docker-container --use
docker buildx inspect --bootstrap
Thanks to this answer right here.
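For reference, after resetting the binfmt handlers and recreating the builder as above, the cross-build itself is typically kicked off with a command along these lines (the image name, tag and target platform here are placeholders, not taken from the question):
docker buildx build --platform linux/arm/v7 -t myregistry/myimage:dev --push .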
It turns out FROM armhf/alpine:3.5 is not good and FROM resin/armhf-alpine:3.5 will do the trick! I'd love to be able to see the commands from scratch that resulted in the armhf image being borked, but for now, this works!
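As a minimal sketch of that fix, assuming the rest of the original Dockerfile stays unchanged, only the base image line needs to change:
FROM resin/armhf-alpine:3.5
ENV QEMU_EXECVE 1
COPY deploy/qemu/qemu-arm-static /usr/bin/
# the step that previously failed under armhf/alpine
RUN ["qemu-arm-static","/sbin/apk","add","-U","bash"]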
Related
I made a Dockerfile to build my Spring Boot project natively with GraalVM; everything went correctly.
Here is the Dockerfile
FROM ghcr.io/graalvm/graalvm-ce:22.3.1 AS buildnative
WORKDIR /app
COPY mvnw pom.xml ./
COPY .mvn/ .mvn
COPY src ./src
RUN ./mvnw clean package -Pnative
FROM ubuntu:23.04
EXPOSE 8080
COPY --from=buildnative /app/target/spring-boot-project /build/app
CMD ["/build/app"]
This runs perfectly locally, but in the GitLab runner, I always have the same error.
JAVA_HOME is not defined correctly.
We cannot execute /opt/graalvm-ce-java17-22.3.1/bin/java
The command '/bin/sh -c ./mvnw clean package -Pnative' returned a non-zero code: 1
So I decided to add some logs within the Maven wrapper, and here is what I got:
Step 7/11 : RUN ./mvnw clean package -Pnative
---> Running in 81e0558130f3
------------> /opt/graalvm-ce-java17-22.3.1/bin/java
------------> JAVA_HOME is /opt/graalvm-ce-java17-22.3.1
Error: JAVA_HOME is not defined correctly.
We cannot execute /opt/graalvm-ce-java17-22.3.1/bin/java
The command '/bin/sh -c ./mvnw clean package -Pnative' returned a non-zero code: 1
Cleaning up project directory and file based variables
In the logs I added, we can see that JAVA_HOME is defined, and defined correctly. It is the same as locally, where everything works perfectly.
I tried adding the line RUN chmod +x mvnw before running it, but it did not change anything.
I am out of ideas. Does anyone have an idea of what is happening?
Edit:
I decided to dive deeper into the issue, so I modified the mvnw script and added logs to see why it does not work.
I have added this to mvnw
if [ -e "$JAVACMD" ] ; then
  echo "------------> THE FILE EXIST" >&2
else
  echo "------------> THE FILE DOES NOT EXIST" >&2
fi
if [ -x "$JAVACMD" ] ; then
  echo "------------> THE FILE IS EXECUTABLE" >&2
else
  echo "------------> THE FILE IS NOT EXECUTABLE" >&2
fi
Results:
Here is in local:
------------> JAVACMD /opt/graalvm-ce-java17-22.3.1/bin/java
------------> THE FILE EXIST
------------> THE FILE IS EXECUTABLE
Here is in the gitlab-runner:
------------> JAVACMD /opt/graalvm-ce-java17-22.3.1/bin/java
------------> THE FILE EXIST
------------> THE FILE IS NOT EXECUTABLE
Makes no sense to me
Is your GitLab runner configured to use a non-root user when executing the Dockerfile?
As @jilliss pointed out, it seems that it's the Java binary that needs execute permission, but maybe only root has that permission (which would explain why it works locally, since by default you are running as root there).
If the Ops team have configured the runner to build the Dockerfile as another user, that could explain why /opt/graalvm-ce-java17-22.3.1/bin/java is no longer executable.
Try adding a whoami log and see which user is running when it runs in GitLab.
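As a rough sketch of that check (this RUN line is purely a hypothetical debugging step, not part of the original Dockerfile), you could add something like this to the build stage right before the Maven call:
# print the build user and the permissions on the GraalVM java binary
RUN whoami && id && ls -l /opt/graalvm-ce-java17-22.3.1/bin/java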
The correct compile command is
mvn -Pnative native:compile
You can see more details and the full docs here. After that you will see the GraalVM build result in the logs. So you need to change your Dockerfile as below.
FROM ghcr.io/graalvm/graalvm-ce:22.3.1 AS buildnative
WORKDIR /app
COPY mvnw pom.xml ./
COPY .mvn/ .mvn
COPY src ./src
RUN ./mvnw native:compile -Pnative
FROM ubuntu:23.04
EXPOSE 8080
COPY --from=buildnative /app/target/spring-boot-project /build/app
CMD ["/build/app"]
example build log from my local build
[INFO] --- native-maven-plugin:0.9.19:compile (default-cli) @ demo ---
Downloading: Component catalog from www.graalvm.org
Processing Component: Native Image
Downloading: Component native-image: Native Image from github.com
Installing new component: Native Image (org.graalvm.native-image, version 22.3.1)
[INFO] Found GraalVM installation from JAVA_HOME variable.
[INFO] [graalvm reachability metadata repository for ch.qos.logback:logback-classic:1.4.5]: Configuration directory not found. Trying latest version.
[INFO] [graalvm reachability metadata repository for ch.qos.logback:logback-classic:1.4.5]: Configuration directory is ch.qos.logback/logback-classic/1.4.1
[INFO] Executing: /opt/graalvm-ce-java17-22.3.1/bin/native-image -cp /app/target/classes:/root/.m2/repository/org/springframework/spring-aop/6.0.4/spring-aop-6.0.4.jar:/root/.m2/repository/org/springframework/boot/spring-boot-starter-logging/3.0.2/spring-boot-starter-logging-3.0.2.jar:/root/.m2/repository/org/springframework/spring-context/6.0.4/spring-context-6.0.4.jar:/root/.m2/repository/org/springframework/spring-core/6.0.4/spring-core-6.0.4.jar:/root/.m2/repository/org/apache/logging/log4j/log4j-api/2.19.0/log4j-api-2.19.0.jar:/root/.m2/repository/org/springframework/spring-expression/6.0.4/spring-expression-6.0.4.jar:/root/.m2/repository/org/apache/logging/log4j/log4j-to-slf4j/2.19.0/log4j-to-slf4j-2.19.0.jar:/root/.m2/repository/ch/qos/logback/logback-core/1.4.5/logback-core-1.4.5.jar:/root/.m2/repository/jakarta/annotation/jakarta.annotation-api/2.1.1/jakarta.annotation-api-2.1.1.jar:/root/.m2/repository/org/springframework/spring-beans/6.0.4/spring-beans-6.0.4.jar:/root/.m2/repository/ch/qos/logback/logback-classic/1.4.5/logback-classic-1.4.5.jar:/root/.m2/repository/org/springframework/boot/spring-boot-starter/3.0.2/spring-boot-starter-3.0.2.jar:/root/.m2/repository/org/springframework/spring-jcl/6.0.4/spring-jcl-6.0.4.jar:/root/.m2/repository/org/springframework/boot/spring-boot-autoconfigure/3.0.2/spring-boot-autoconfigure-3.0.2.jar:/root/.m2/repository/org/slf4j/jul-to-slf4j/2.0.6/jul-to-slf4j-2.0.6.jar:/root/.m2/repository/org/yaml/snakeyaml/1.33/snakeyaml-1.33.jar:/root/.m2/repository/org/springframework/boot/spring-boot/3.0.2/spring-boot-3.0.2.jar:/root/.m2/repository/org/slf4j/slf4j-api/2.0.6/slf4j-api-2.0.6.jar --no-fallback -H:Path=/app/target -H:Name=demo -H:ConfigurationFileDirectories=/app/target/graalvm-reachability-metadata/160481799c4b6c37cde925c9aebf513c32245dcf/ch.qos.logback/logback-classic/1.4.1
========================================================================================================================
GraalVM Native Image: Generating 'demo' (executable)...
========================================================================================================================
[1/7] Initializing... (6.6s @ 0.23GB)
Version info: 'GraalVM 22.3.1 Java 17 CE'
Java version info: '17.0.6+10-jvmci-22.3-b13'
C compiler: gcc (redhat, x86_64, 11.3.1)
Garbage collector: Serial GC
1 user-specific feature(s)
- org.springframework.aot.nativex.feature.PreComputeFieldFeature
Field org.apache.commons.logging.LogAdapter#log4jSpiPresent set to true at build time
Field org.apache.commons.logging.LogAdapter#log4jSlf4jProviderPresent set to true at build time
Field org.apache.commons.logging.LogAdapter#slf4jSpiPresent set to true at build time
Field org.apache.commons.logging.LogAdapter#slf4jApiPresent set to true at build time
Field org.springframework.core.NativeDetector#imageCode set to true at build time
Field org.springframework.format.support.DefaultFormattingConversionService#jsr354Present set to false at build time
Field org.springframework.core.KotlinDetector#kotlinPresent set to false at build time
Field org.springframework.core.KotlinDetector#kotlinReflectPresent set to false at build time
Field org.springframework.cglib.core.AbstractClassGenerator#imageCode set to true at build time
Field org.springframework.boot.logging.log4j2.Log4J2LoggingSystem$Factory#PRESENT set to false at build time
Field org.springframework.boot.logging.java.JavaLoggingSystem$Factory#PRESENT set to true at build time
Field org.springframework.boot.logging.logback.LogbackLoggingSystem$Factory#PRESENT set to true at build time
Field org.springframework.boot.logging.logback.LogbackLoggingSystemProperties#JBOSS_LOGGING_PRESENT set to false at build time
Field org.springframework.context.event.ApplicationListenerMethodAdapter#reactiveStreamsPresent set to false at build time
[2/7] Performing analysis... [*******] (60.8s @ 2.15GB)
8,903 (88.31%) of 10,082 classes reachable
13,147 (64.27%) of 20,456 fields reachable
40,485 (56.88%) of 71,181 methods reachable
365 classes, 115 fields, and 1,191 methods registered for reflection
64 classes, 70 fields, and 55 methods registered for JNI access
4 native libraries: dl, pthread, rt, z
[3/7] Building universe... (8.9s @ 2.03GB)
[4/7] Parsing methods... [***] (9.6s @ 0.82GB)
[5/7] Inlining methods... [***] (3.7s @ 2.16GB)
[6/7] Compiling methods... [*******] (48.2s @ 1.97GB)
[7/7] Creating image... (5.6s @ 1.60GB)
17.53MB (49.48%) for code area: 25,460 compilation units
17.60MB (49.66%) for image heap: 215,829 objects and 25 resources
312.40KB ( 0.86%) for other data
35.44MB in total
------------------------------------------------------------------------------------------------------------------------
Top 10 packages in code area: Top 10 object types in image heap:
936.87KB java.util 3.72MB byte[] for code metadata
594.27KB java.lang.invoke 2.07MB java.lang.String
469.21KB c.s.org.apache.xerces.internal.impl.xs.traversers 2.06MB java.lang.Class
455.78KB java.lang 1.68MB byte[] for general heap data
423.05KB com.sun.org.apache.xerces.internal.impl 1.60MB byte[] for java.lang.String
407.79KB com.sun.crypto.provider 765.10KB com.oracle.svm.core.hub.DynamicHubCompanion
375.00KB org.springframework.beans.factory.support 576.38KB java.util.HashMap$Node
371.71KB java.io 513.09KB int[][]
354.25KB java.util.concurrent 394.88KB java.lang.String[]
328.57KB java.text 394.65KB byte[] for reflection metadata
12.74MB for 405 more packages 3.23MB for 1771 more object types
------------------------------------------------------------------------------------------------------------------------
12.0s (8.0% of total time) in 35 GCs | Peak RSS: 3.04GB | CPU load: 5.08
------------------------------------------------------------------------------------------------------------------------
Produced artifacts:
/app/target/demo (executable)
/app/target/demo.build_artifacts.txt (txt)
========================================================================================================================
Finished generating 'demo' in 2m 29s.
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:11 min
[INFO] Finished at: 2023-01-27T18:05:23Z
[INFO] ------------------------------------------------------------------------
I've never tried the GitLab runner myself, but have you tried forcing the JAVA_HOME env path before the Maven command?
ENV JAVA_HOME=/opt/graalvm-ce-java17-22.3.1
RUN ./mvnw clean package -Pnative
hope it helps
It's saying that it doesn't recognize ganache-cli as a command, despite installing it and everything else as directed.
Using:
brownie v1.17.2
node v17.2.0 (npm v8.1.4)
nvm 0.39.0
Python 3.9.7
Ganache CLI v6.12.2 (ganache-core: 2.13.2)
As part of the Solidity course here, specifically lesson 5. Github repo here.
x@y brownie_simple_storage % brownie run scripts/deploy.py
Brownie v1.17.2 - Python development framework for Ethereum
BrownieSimpleStorageProject is the active project.
Launching 'ganache-cli --port 8545 --gasLimit 12000000 --accounts 10 --hardfork istanbul --mnemonic brownie'...
File "brownie/_cli/__main__.py", line 64, in main
importlib.import_module(f"brownie._cli.{cmd}").main()
File "brownie/_cli/run.py", line 44, in main
network.connect(CONFIG.argv["network"])
File "brownie/network/main.py", line 50, in connect
rpc.launch(active["cmd"], **active["cmd_settings"])
File "brownie/network/rpc/__init__.py", line 93, in launch
raise RPCProcessError(cmd, uri)
RPCProcessError: Unable to launch local RPC client.
Command: ganache-cli
URI: http://127.0.0.1:8545
Looks like this can be resolved using nvm v 16.
nvm install 16
nvm use 16
node --version
v16.13.1
x@y brownie_simple_storage % brownie run scripts/deploy.py
Brownie v1.17.2 - Python development framework for Ethereum
BrownieSimpleStorageProject is the active project.
Launching 'ganache-cli --port 8545 --gasLimit 12000000 --accounts 10 --hardfork istanbul --mnemonic brownie'...
Running 'scripts/deploy.py::main'...
Hello!
Terminating local RPC client...
Most likely the issue you're dealing with is that ganache is already running in another active project. To have Brownie recognize ganache, make sure the project running the node is the only environment running ganache; in this case that other project is most likely the web3 simple storage one, not the newly created Brownie one.
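As a quick sketch of how to check for that (the port comes from the launch command in the question; the PID placeholder is whatever the first command reports):
lsof -i :8545    # show any process already listening on Brownie's RPC port
kill <pid>       # stop it so this project can launch its own ganache-cli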
I want to build a signed Android APK and receive the release APK through a Slack channel. I tried the script below, but it's not working because my app is written with JDK 8.
This is the script which I used.
image: jangrewe/gitlab-ci-android

cache:
  key: ${CI_PROJECT_ID}
  paths:
    - .gradle/

before_script:
  - export GRADLE_USER_HOME=$(pwd)/.gradle
  - chmod +x ./gradlew

stages:
  - build

assembleDebug:
  stage: build
  only:
    - development
    - tags
  script:
    - ./gradlew assembleDebug
    - |
      curl \
        -F token="${SLACK_CHANNEL_ACCESS_TOKEN}" \
        -F channels="${SLACK_CHANNEL_ID}" \
        -F initial_comment="Hello team! Here is the latest APK" \
        -F "file=@$(find app/build/outputs/apk/debug -name 'MyApp*')" \
        https://slack.com/api/files.upload
  artifacts:
    paths:
      - app/build/outputs/apk/debug
But it is showing that some Java classes are not found. (Those Java classes are deprecated in Java 11.)
First, you need to set up the Slack authentication keys.
1. Create an app in Slack.
2. Go to the Authentication section and generate an authentication key.
3. Get the ID of the channel on which you want to receive messages.
4. Mention your app name in your Slack thread and add the app to the channel.
5. Set up those keys in your GitLab CI settings variables:
SLACK_CHANNEL_ACCESS_TOKEN = access token generated by the Slack app
SLACK_CHANNEL_ID = channel ID (check the last section of the channel URL for the ID)
6. Copy your existing keystore file to the repository. (Please do this only if your project is private.)
7. Change the GitLab script's content to the code below.
Make sure to change the certificate password, key password and alias.
image: openjdk:8-jdk

variables:
  # ANDROID_COMPILE_SDK is the version of Android you're compiling with.
  # It should match compileSdkVersion.
  ANDROID_COMPILE_SDK: "29"
  # ANDROID_BUILD_TOOLS is the version of the Android build tools you are using.
  # It should match buildToolsVersion.
  ANDROID_BUILD_TOOLS: "29.0.3"
  # ANDROID_SDK_TOOLS is the version of the command line tools we're going to download from the official site.
  # Official site -> https://developer.android.com/studio/index.html
  # There, look down below at the cli tools only; the sdk tools package is of format:
  # commandlinetools-os_type-ANDROID_SDK_TOOLS_latest.zip
  # when the script was last modified for the latest compileSdkVersion, it was the version written down below
  ANDROID_SDK_TOOLS: "6514223"

# Packages installation before running script
before_script:
  - apt-get --quiet update --yes
  - apt-get --quiet install --yes wget tar unzip lib32stdc++6 lib32z1
  # Setup path as android_home for moving/exporting the downloaded sdk into it
  - export ANDROID_HOME="${PWD}/android-home"
  # Create a new directory at the specified location
  - install -d $ANDROID_HOME
  # Here we are installing the android SDK tools from the official source
  # (the key thing here is the url from where you are downloading the sdk tools for the command line,
  # so please do note this url pattern there and here as well),
  # after that unzipping those tools and
  # then running a series of SDK manager commands to install the necessary android SDK packages that'll allow the app to build
  - wget --output-document=$ANDROID_HOME/cmdline-tools.zip https://dl.google.com/android/repository/commandlinetools-linux-${ANDROID_SDK_TOOLS}_latest.zip
  # move to the archive at ANDROID_HOME
  - pushd $ANDROID_HOME
  - unzip -d cmdline-tools cmdline-tools.zip
  - popd
  - export PATH=$PATH:${ANDROID_HOME}/cmdline-tools/tools/bin/
  # Nothing fancy here, just checking sdkmanager version
  - sdkmanager --version
  # use yes to accept all licenses
  - yes | sdkmanager --sdk_root=${ANDROID_HOME} --licenses || true
  - sdkmanager --sdk_root=${ANDROID_HOME} "platforms;android-${ANDROID_COMPILE_SDK}"
  - sdkmanager --sdk_root=${ANDROID_HOME} "platform-tools"
  - sdkmanager --sdk_root=${ANDROID_HOME} "build-tools;${ANDROID_BUILD_TOOLS}"
  # Not necessary, but just for surety
  - chmod +x ./gradlew

# Make Project
assembleDebug:
  interruptible: true
  stage: build
  only:
    - tags
  script:
    - ls
    - last_v=$(git describe --abbrev=0 2>/dev/null || echo '')
    - tag_message=$(git tag -l -n9 $last_v)
    - echo $last_v
    - echo $tag_message
    - ./gradlew assembleRelease
      -Pandroid.injected.signing.store.file=$(pwd)/Certificate.jks
      -Pandroid.injected.signing.store.password=123456
      -Pandroid.injected.signing.key.alias=key0
      -Pandroid.injected.signing.key.password=123456
    - |
      curl \
        -F token="${SLACK_CHANNEL_ACCESS_TOKEN}" \
        -F channels="${SLACK_CHANNEL_ID}" \
        -F initial_comment="$tag_message" \
        -F "file=@$(find app/build/outputs/apk/release -name 'app*')" \
        https://slack.com/api/files.upload
  artifacts:
    paths:
      - app/build/outputs/
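Since the job above only runs for tags (only: tags) and posts $tag_message as the Slack comment, a build would typically be triggered by pushing an annotated tag; the tag name and message here are just examples:
git tag -a v1.0.0 -m "Release 1.0.0: signed APK for Slack"
git push origin v1.0.0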
I am trying to run basic commands given in https://github.com/iron-io/functions.
I created func.go and func.yaml files successfully, but when I try to execute fn build, I get the error below:
Running prebuild command: docker run --rm -v /home/evr:/go/src/github.com/x/y -w /go/src/github.com/x/y iron/go:dev go build -o func
can't load package: package github.com/x/y: C source files not allowed when not using cgo or SWIG: swap_sll.c
error running docker build: exit status 1
I found the cause of the above issue.
It was looking for the IronFunctions data directory, for which I had specified a fixed path instead of ${PWD}, and I was executing fn build in a different working directory. So fn build has to be executed in the directory where the data directory is present (the specific path I specified).
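A minimal sketch of that fix (the path is illustrative): run fn build from the directory that actually contains the function and its data directory, instead of from an unrelated working directory.
cd /path/to/my-function   # directory containing func.go, func.yaml and the data directory
fn build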
I'm having trouble getting the Windows build agent to run a build. The agent is unable to check out my source code. (I'm using Windows 10.) See the GitHub issue.
I am seeing the following error when running a build:
Buildkite Error: There was an error running `git clone -v -- git@github.com:myorg/myrepo.git .` (exec: "git": executable file not found in %PATH%)
I have installed git using Chocolatey, and git is accessible in CMD and PowerShell on the agent's host; I can see it in my path if I run gci env:Path in PowerShell. Git's directory is at the end here:
C:\Program Files\Docker\Docker\Resources\bin;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Program Files\Git\cm...
From a build's logs in BK's web UI, I see the following environment variables printed out:
BUILDKITE=true
BUILDKITE_AGENT_ACCESS_TOKEN=xxx
BUILDKITE_AGENT_DEBUG=true
BUILDKITE_AGENT_ENDPOINT=https://agent.buildkite.com/v3
BUILDKITE_AGENT_ID=xxx
BUILDKITE_AGENT_NAME=DESKTOP-1
BUILDKITE_AGENT_PID=5180
BUILDKITE_ARTIFACT_PATHS=
BUILDKITE_BIN_PATH=C:\Users\Jason\Downloads\buildkite-agent-windows-amd64-3.0-beta.27
BUILDKITE_BRANCH=fix/build
BUILDKITE_BUILD_CHECKOUT_PATH=builds\DESKTOP-1\myorg\myrepo
BUILDKITE_BUILD_CREATOR=Jason
BUILDKITE_BUILD_CREATOR_EMAIL=myemail@gmail.com
BUILDKITE_BUILD_ID=xxx
BUILDKITE_BUILD_NUMBER=18
BUILDKITE_BUILD_PATH=builds
BUILDKITE_BUILD_URL=https://buildkite.com/myorg/myrepo/builds/18
BUILDKITE_COMMAND=msbuild
BUILDKITE_COMMAND_EVAL=true
BUILDKITE_COMMIT=HEAD
BUILDKITE_GIT_CLEAN_FLAGS=-fxdq
BUILDKITE_GIT_CLONE_FLAGS=-v
BUILDKITE_HOOKS_PATH=hooks
BUILDKITE_JOB_ID=xxx
BUILDKITE_MESSAGE=First build
BUILDKITE_ORGANIZATION_SLUG=myorg
BUILDKITE_PIPELINE_DEFAULT_BRANCH=master
BUILDKITE_PIPELINE_PROVIDER=github
BUILDKITE_PIPELINE_SLUG=myrepo
BUILDKITE_PLUGINS_PATH=plugins
BUILDKITE_PROJECT_PROVIDER=github
BUILDKITE_PROJECT_SLUG=myorg/myrepo
BUILDKITE_PULL_REQUEST=false
BUILDKITE_PULL_REQUEST_REPO=
BUILDKITE_REPO=git@github.com:myorg/myrepo.git
BUILDKITE_REPO_SSH_HOST=github.com
BUILDKITE_RETRY_COUNT=0
BUILDKITE_SCRIPT_PATH=msbuild
BUILDKITE_SOURCE=ui
BUILDKITE_SSH_FINGERPRINT_VERIFICATION=true
BUILDKITE_TAG=
BUILDKITE_TIMEOUT=false
CI=true
PATH=C:\Users\Jason\Downloads\buildkite-agent-windows-amd64-3.0-beta.27;
PATHEXT=.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC;.CPL
Note that PATH in that output is not the same as my PATH in PowerShell and does not include the path to git's binary.
Full build output:
Build environment variables 0s
BUILDKITE=true
BUILDKITE_AGENT_ACCESS_TOKEN=xxx
BUILDKITE_AGENT_DEBUG=true
BUILDKITE_AGENT_ENDPOINT=https://agent.buildkite.com/v3
BUILDKITE_AGENT_ID=xxx
BUILDKITE_AGENT_NAME=DESKTOP-1
BUILDKITE_AGENT_PID=5180
BUILDKITE_ARTIFACT_PATHS=
BUILDKITE_BIN_PATH=C:\Users\Jason\Downloads\buildkite-agent-windows-amd64-3.0-beta.27
BUILDKITE_BRANCH=fix/build
BUILDKITE_BUILD_CHECKOUT_PATH=builds\DESKTOP-1\myorg\myrepo
BUILDKITE_BUILD_CREATOR=Jason
BUILDKITE_BUILD_CREATOR_EMAIL=myemail@gmail.com
BUILDKITE_BUILD_ID=xxx
BUILDKITE_BUILD_NUMBER=18
BUILDKITE_BUILD_PATH=builds
BUILDKITE_BUILD_URL=https://buildkite.com/myorg/myrepo/builds/18
BUILDKITE_COMMAND=msbuild
BUILDKITE_COMMAND_EVAL=true
BUILDKITE_COMMIT=HEAD
BUILDKITE_GIT_CLEAN_FLAGS=-fxdq
BUILDKITE_GIT_CLONE_FLAGS=-v
BUILDKITE_HOOKS_PATH=hooks
BUILDKITE_JOB_ID=xxx
BUILDKITE_MESSAGE=First build
BUILDKITE_ORGANIZATION_SLUG=myorg
BUILDKITE_PIPELINE_DEFAULT_BRANCH=master
BUILDKITE_PIPELINE_PROVIDER=github
BUILDKITE_PIPELINE_SLUG=myrepo
BUILDKITE_PLUGINS_PATH=plugins
BUILDKITE_PROJECT_PROVIDER=github
BUILDKITE_PROJECT_SLUG=myorg/myrepo
BUILDKITE_PULL_REQUEST=false
BUILDKITE_PULL_REQUEST_REPO=
BUILDKITE_REPO=git@github.com:myorg/myrepo.git
BUILDKITE_REPO_SSH_HOST=github.com
BUILDKITE_RETRY_COUNT=0
BUILDKITE_SCRIPT_PATH=msbuild
BUILDKITE_SOURCE=ui
BUILDKITE_SSH_FINGERPRINT_VERIFICATION=true
BUILDKITE_TAG=
BUILDKITE_TIMEOUT=false
CI=true
PATH=C:\Users\Jason\Downloads\buildkite-agent-windows-amd64-3.0-beta.27;
PATHEXT=.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC;.CPL
Running global environment hook 0s
# Skipping, no hook script found at "hooks\environment.bat"
Running global pre-checkout hook 0s
# Skipping, no hook script found at "hooks\pre-checkout.bat"
Preparing build directory 0s
# Changing working directory to "builds\DESKTOP-1\myorg\myrepo"
⚠ Buildkite Warning: Could not perform `ssh-keygen` (exec: "ssh-keygen": executable file not found in %PATH%)
> git clone -v -- git@github.com:myorg/myrepo.git .
🚨 Buildkite Error: There was an error running `git clone -v -- git@github.com:myorg/myrepo.git .` (exec: "git": executable file not found in %PATH%)
Your build output doesn't seem to have those chocolatey paths:
PATH=C:\Users\Jason\Downloads\buildkite-agent-windows-amd64-3.0-beta.27;
You might need to add an agent environment hook which adds the right directories to the path. Or try updating to the latest beta which might fix the issue.
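As a sketch of such a hook (the install paths below are assumptions based on a default Chocolatey Git install; adjust them, and the hooks directory, to whatever your agent is configured to use), hooks\environment.bat could look like:
@echo off
REM Prepend Git (and its bundled ssh tools, including ssh-keygen) to PATH for every job this agent runs
set "PATH=C:\Program Files\Git\cmd;C:\Program Files\Git\usr\bin;%PATH%"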