How to sign an APK in GitLab CI and send the release to Slack - continuous-integration

I want to build a signed Android APK and receive the release APK through a Slack channel. I tried the script below, but it's not working because my app is written for JDK 8.
This is the script I used:
image: jangrewe/gitlab-ci-android

cache:
  key: ${CI_PROJECT_ID}
  paths:
    - .gradle/

before_script:
  - export GRADLE_USER_HOME=$(pwd)/.gradle
  - chmod +x ./gradlew

stages:
  - build

assembleDebug:
  stage: build
  only:
    - development
    - tags
  script:
    - ./gradlew assembleDebug
    - |
      curl \
        -F token="${SLACK_CHANNEL_ACCESS_TOKEN}" \
        -F channels="${SLACK_CHANNEL_ID}" \
        -F initial_comment="Hello team! Here is the latest APK" \
        -F "file=@$(find app/build/outputs/apk/debug -name 'MyApp*')" \
        https://slack.com/api/files.upload
  artifacts:
    paths:
      - app/build/outputs/apk/debug
But it fails with some Java classes not found. (Those Java classes were deprecated in Java 11.)

First, you need to set up the Slack authentication keys.
1. Create an app in Slack.
2. Go to the authentication section and generate an authentication key.
3. Get the ID of the channel in which you want to receive messages.
4. Mention your app's name in your Slack thread and add the app to the channel.
5. Set those keys as variables in your GitLab CI settings:
SLACK_CHANNEL_ACCESS_TOKEN = the access token generated by the Slack app
SLACK_CHANNEL_ID = the channel ID (check the last section of the channel URL)
6. Copy your existing keystore file to the repository. (Only do this if your project is private.)
7. Change the GitLab script's content to the code below.
Make sure to change the certificate password, key password, and alias.
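A common failure mode is one of the two CI variables being unset or misnamed. A minimal sketch of a fail-fast check you could add before the upload step (the function name and the placeholder values are my own, not from the original post):

```shell
#!/usr/bin/env sh
# Fail fast with a clear message if either Slack variable is missing.
# ${VAR:?message} aborts the script when VAR is unset or empty.
check_slack_vars() {
  : "${SLACK_CHANNEL_ACCESS_TOKEN:?SLACK_CHANNEL_ACCESS_TOKEN is not set}"
  : "${SLACK_CHANNEL_ID:?SLACK_CHANNEL_ID is not set}"
  echo "slack vars ok: channel=${SLACK_CHANNEL_ID}"
}

# Example with placeholder values (in CI these come from the GitLab settings):
SLACK_CHANNEL_ACCESS_TOKEN="xoxb-placeholder"
SLACK_CHANNEL_ID="C0123456789"
check_slack_vars
```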
image: openjdk:8-jdk

variables:
  # ANDROID_COMPILE_SDK is the version of Android you're compiling with.
  # It should match compileSdkVersion.
  ANDROID_COMPILE_SDK: "29"
  # ANDROID_BUILD_TOOLS is the version of the Android build tools you are using.
  # It should match buildToolsVersion.
  ANDROID_BUILD_TOOLS: "29.0.3"
  # ANDROID_SDK_TOOLS is the version of the command-line tools to download from the official site:
  # https://developer.android.com/studio/index.html
  # There, look under "Command line tools only"; the package name has the format
  # commandlinetools-<os>-<ANDROID_SDK_TOOLS>_latest.zip
  # The version below was the latest when this script was last updated.
  ANDROID_SDK_TOOLS: "6514223"

# Install packages before running the script
before_script:
  - apt-get --quiet update --yes
  - apt-get --quiet install --yes wget tar unzip lib32stdc++6 lib32z1
  # Set ANDROID_HOME to the directory the SDK will be downloaded into
  - export ANDROID_HOME="${PWD}/android-home"
  # Create a new directory at the specified location
  - install -d $ANDROID_HOME
  # Download the Android SDK command-line tools from the official source
  # (the key thing here is the download URL; its pattern must match the package name above),
  # unzip them, and then run a series of sdkmanager commands to install
  # the SDK packages the app needs to build
  - wget --output-document=$ANDROID_HOME/cmdline-tools.zip https://dl.google.com/android/repository/commandlinetools-linux-${ANDROID_SDK_TOOLS}_latest.zip
  # unpack the archive inside ANDROID_HOME
  - pushd $ANDROID_HOME
  - unzip -d cmdline-tools cmdline-tools.zip
  - popd
  - export PATH=$PATH:${ANDROID_HOME}/cmdline-tools/tools/bin/
  # Nothing fancy here, just checking the sdkmanager version
  - sdkmanager --version
  # use yes to accept all licenses
  - yes | sdkmanager --sdk_root=${ANDROID_HOME} --licenses || true
  - sdkmanager --sdk_root=${ANDROID_HOME} "platforms;android-${ANDROID_COMPILE_SDK}"
  - sdkmanager --sdk_root=${ANDROID_HOME} "platform-tools"
  - sdkmanager --sdk_root=${ANDROID_HOME} "build-tools;${ANDROID_BUILD_TOOLS}"
  # Not strictly necessary, but just to be safe
  - chmod +x ./gradlew

# Build the project
assembleDebug:
  interruptible: true
  stage: build
  only:
    - tags
  script:
    - ls
    - last_v=$(git describe --abbrev=0 2>/dev/null || echo '')
    - tag_message=$(git tag -l -n9 $last_v)
    - echo $last_v
    - echo $tag_message
    - ./gradlew assembleRelease
      -Pandroid.injected.signing.store.file=$(pwd)/Certificate.jks
      -Pandroid.injected.signing.store.password=123456
      -Pandroid.injected.signing.key.alias=key0
      -Pandroid.injected.signing.key.password=123456
    - |
      curl \
        -F token="${SLACK_CHANNEL_ACCESS_TOKEN}" \
        -F channels="${SLACK_CHANNEL_ID}" \
        -F initial_comment="$tag_message" \
        -F "file=@$(find app/build/outputs/apk/release -name 'app*')" \
        https://slack.com/api/files.upload
  artifacts:
    paths:
      - app/build/outputs/
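The last_v / tag_message lines in the script can be tried outside CI. A sketch using a throwaway repository (the tag name and message here are made up for the example):

```shell
#!/usr/bin/env sh
set -e
# Create a throwaway repo with one commit and one annotated tag
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "first"
# An annotated tag carries a message, which is what `git tag -n` prints
git -c user.email=ci@example.com -c user.name=ci tag -a v1.0.0 -m "Release notes for Slack"

# Same two lines as the CI script above
last_v=$(git describe --abbrev=0 2>/dev/null || echo '')
tag_message=$(git tag -l -n9 "$last_v")
echo "$last_v"        # v1.0.0
echo "$tag_message"   # v1.0.0  Release notes for Slack
```

Note that `git describe` only finds annotated tags by default, which is why the CI job posts an empty comment when the pipeline runs on a lightweight tag.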

Related

How to setup CI/CD for nativescript using Visual studio online/Azure dev ops tools

I'm trying to set up a CI/CD pipeline for a NativeScript app. I added the commands to install Node and ran npm install, but NativeScript has dependencies that it needs. How do I do this dynamically on Azure DevOps without having to create a VM that has NativeScript and all its dependencies installed and set up?
So I have used a VM, installed NativeScript on it, and used an agent to connect to the machine and build the solution. I have done the same using Jenkins, but Jenkins was running on the VM; now I want to move the whole pipeline to Azure DevOps.
Command used in the build step: tns build android
If you don't want to use a VM, you'll have to install everything NativeScript needs on their hosted agent each time you create a build for your app.
A couple of important things to note. First, the name of your repository is changed to 's'; this messes with the naming of your entitlement file, or at least it did for me. I fixed this with a bash file added to my repository that changes the path in build.xcconfig for the CODE_SIGN_ENTITLEMENTS variable, and I added an npm run entitle command to my package.json file to run it before building. Second, you'll want to store all files and secure passwords in the Library section under Pipelines in your Azure DevOps project. Third, the classic editor is your best friend for figuring out YAML, as most jobs have an option to view the YAML; you can also use the classic editor as an alternative to the YAML file.
The YAML and bash file below show an example of how you can build an ipa and an apk file that are stored as artifacts. You can then use those to trigger a release pipeline that pushes to the Play Store and App Store.
# YAML File
name: Release Build
trigger:
- release/* # will start a build for pull requests into a release branch, e.g. release/version_1_0_0, release/version_2_0_0
pool:
  vmImage: 'macOS-10.13'
variables:
  scheme: 's' # default name/scheme created on this machine for the ipa
  sdk: 'iphoneos'
  configuration: 'Release'
steps:
- task: NodeTool@0
  inputs:
    versionSpec: '10.14'
  displayName: 'Install Node.js'
# Download secure file for Android
# Note: if multiple secure files are downloaded, the variable name will change and break the pipeline
- task: DownloadSecureFile@1
  displayName: 'download android keystore file'
  inputs:
    secureFile: myKeystore.keystore
# Install Apple certificate (distribution)
- task: InstallAppleCertificate@2
  displayName: 'Install an Apple certificate Distribution (yourcertificate.p12)'
  inputs:
    certSecureFile: '00000000-0000-0000-0000-000000000000' # got the id from viewing the file in the classic editor for the pipeline
    certPwd: '$(myCertificatePasswordP12)' # password stored in the Library
# Install Apple provisioning profile (distribution)
- task: InstallAppleProvisioningProfile@1
  displayName: 'Apple Provisioning Profile (myProvisioningProfile.mobileprovision)'
  inputs:
    provisioningProfileLocation: 'secureFiles' # Options: secureFiles, sourceRepository
    provProfileSecureFile: '00000000-0000-0000-0000-000000000000' # Required when provisioningProfileLocation == secureFiles
# General setup
- script: |
    npm install -g nativescript@latest
    npm install
  displayName: 'Install NativeScript and node modules'
# Variable explanation:
# $DOWNLOADSECUREFILE_SECUREFILEPATH is the keystore file downloaded earlier
# $KEYSTORE_PASSWORD refers to the environment variable in this script, which references a Library variable
# $(MyPlayStoreAlias) refers to the Library variable for your app's alias
# $BUILD_SOURCESDIRECTORY is the location the apk is built to
# Android
- script: |
    tns build android --env.production --release --key-store-path $DOWNLOADSECUREFILE_SECUREFILEPATH --key-store-password $KEYSTORE_PASSWORD --key-store-alias $(MyPlayStoreAlias) --key-store-alias-password $KEYSTORE_PASSWORD --bundle --copy-to $BUILD_SOURCESDIRECTORY # creates the apk
  displayName: 'Build Android Release apk'
  env:
    KEYSTORE_PASSWORD: $(MyPlayStoreKeystore)
# Create the apk artifact
- task: PublishBuildArtifacts@1
  inputs:
    pathtoPublish: '$(Build.SourcesDirectory)/app-release.apk'
    artifactName: 'apkDrop'
  displayName: 'Publishing apkDrop artifact'
# Have to use Xcode 10.1 to meet the minimum standards for uploading an ipa;
# the default version on this machine was lower than 10.1.
# Change the Xcode version:
- script: |
    xcodebuild -version
    /bin/bash -c "echo '##vso[task.setvariable variable=MD_APPLE_SDK_ROOT;]'/Applications/Xcode_10.1.app;sudo xcode-select --switch /Applications/Xcode_10.1.app/Contents/Developer"
    xcodebuild -version
  displayName: 'changing xcode to 10.1'
# Optional; was running into build issues with the latest version.
# Downgrade the CocoaPods version:
- script: |
    sudo gem uninstall cocoapods
    sudo gem install cocoapods -v 1.5.3
  displayName: 'Using cocoapods version 1.5.3'
# iOS
- script: |
    xcodebuild -version # making sure the correct Xcode version is being used
    pip install --ignore-installed six # fixes a pip six error
    npm run entitle # custom bash script used to change the entitlement file
    tns run ios --provision # see which provisioning profile and certificate are installed; helpful for debugging
    tns build ios --env.production --release --bundle # creates the xcworkspace
  displayName: 'Build ios Release xcworkspace'
# Build and sign the ipa
- task: Xcode@5
  displayName: 'Xcode sign and build'
  inputs:
    sdk: '$(sdk)' # custom var
    scheme: '$(scheme)' # must be provided if setting a manual path to the xcworkspace
    configuration: '$(configuration)' # custom var
    xcodeVersion: 'specifyPath'
    xcodeDeveloperDir: '/Applications/Xcode_10.1.app' # using Xcode 10.1
    xcWorkspacePath: 'platforms/ios/s.xcworkspace'
    exportPath: '$(agent.buildDirectory)/output/$(sdk)/$(configuration)' # location where the ipa file will be stored
    packageApp: true # create the ipa
    signingOption: manual
    signingIdentity: '$(APPLE_CERTIFICATE_SIGNING_IDENTITY)' # distribution certificate
    provisioningProfileUuid: '$(APPLE_PROV_PROFILE_UUID)' # distribution profile
# Create the ipa artifact
- task: PublishBuildArtifacts@1
  displayName: 'Publishing ipaDrop artifact'
  inputs:
    pathtoPublish: '$(agent.buildDirectory)/output/$(sdk)/$(configuration)/s.ipa'
    artifactName: 'ipaDrop'
Bash file
#!/usr/bin/env bash
# filename: pipeline-entitlements.sh
echo "Editing build.xcconfig"
TARGET_KEY="CODE_SIGN_ENTITLEMENTS"
REPLACEMENT_VALUE="s\/Resources\/YOURENTITLEMENTFILENAME.entitlements"
CONFIG_FILE="./app/App_Resources/iOS/build.xcconfig"
echo "Editing $TARGET_KEY and replacing its value with $REPLACEMENT_VALUE"
sed -i.bak "s/\($TARGET_KEY *= *\).*/\1$REPLACEMENT_VALUE/" $CONFIG_FILE
echo "Finished editing build.xcconfig"
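The substitution can be checked locally against a throwaway config file before adding it to the pipeline (the old entitlement path here is a stand-in I made up; the replacement value and sed command are the same as in the script above):

```shell
#!/usr/bin/env bash
# Create a stand-in build.xcconfig with a made-up existing entitlement path
CONFIG_FILE=$(mktemp)
echo 'CODE_SIGN_ENTITLEMENTS = oldApp/Resources/old.entitlements' > "$CONFIG_FILE"

TARGET_KEY="CODE_SIGN_ENTITLEMENTS"
# Slashes are escaped because sed's s/// delimiter is also a slash
REPLACEMENT_VALUE="s\/Resources\/YOURENTITLEMENTFILENAME.entitlements"

# Same sed as the pipeline script: keep "KEY = " and replace everything after it
sed -i.bak "s/\($TARGET_KEY *= *\).*/\1$REPLACEMENT_VALUE/" "$CONFIG_FILE"
cat "$CONFIG_FILE"   # CODE_SIGN_ENTITLEMENTS = s/Resources/YOURENTITLEMENTFILENAME.entitlements
```

The `-i.bak` form works on both GNU and BSD sed, which matters since the pipeline runs on a macOS agent.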

extract or unzip files in gitlab using ci/cd command line

Actually, I have my zip files on GitLab, and I want to extract those files using GitLab CI/CD. I have tried this in .gitlab-ci.yml:
image: docker
stages:
  - build
  - test
services:
  - docker:dind
build:
  before_script:
    - apk add p7zip
  script:
    - cd \kmfs
    - 7z x -oChassisA ChassisA.zip
OUTPUT
$ apk add p7zip
fetch http://dl-cdn.alpinelinux.org/alpine/v3.8/main/x86_64/APKINDEX.tar.gz
fetch http://dl-cdn.alpinelinux.org/alpine/v3.8/community/x86_64/APKINDEX.tar.gz
(1/3) Installing libgcc (6.4.0-r9)
(2/3) Installing libstdc++ (6.4.0-r9)
(3/3) Installing p7zip (16.02-r3)
Executing busybox-1.28.4-r3.trigger
OK: 11 MiB in 17 packages
$ cd \kmfs
$ 7z x -oChassisA ChassisA.zip
7-Zip [64] 16.02 : Copyright (c) 1999-2016 Igor Pavlov : 2016-05-21
p7zip Version 16.02 (locale=C.UTF-8,Utf16=on,HugeFiles=on,64 bits,1 CPU Intel(R) Xeon(R) CPU @ 2.30GHz (306F0),ASM,AES-NI)
Scanning the drive for archives:
1 file, 3638943 bytes (3554 KiB)
Extracting archive: ChassisA.zip
--
Path = ChassisA.zip
Type = zip
Physical Size = 3638943
Everything is Ok
Folders: 1
Files: 4
Size: 24070952
Compressed: 3638943
Job succeeded
It executes successfully, but the extracted files are not reflected in the GitLab repository, and I am not able to access those files from the Node.js code written in the test stage.
So it would be great if someone could suggest a way to extract the .zip files on GitLab itself, either with commands or from Node.js code.
You need to use artifacts as part of your job to declare the path of the extracted zip. This allows the subsequent jobs to access those files.
artifacts:
  when: always
  expire_in: 1 week
  paths:
    - <folder structure of the zip files after extraction>
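A minimal sketch of the two jobs wired together, assuming the archive extracts into a ChassisA/ folder under kmfs/ (folder names taken from the question; the test job's script is a placeholder):

```yaml
build:
  image: docker
  stage: build
  before_script:
    - apk add p7zip
  script:
    - cd kmfs
    - 7z x -oChassisA ChassisA.zip
  artifacts:
    when: always
    expire_in: 1 week
    paths:
      - kmfs/ChassisA/

test:
  stage: test
  script:
    # Artifacts from earlier stages are downloaded automatically,
    # so the extracted files are available to this job's scripts
    - ls kmfs/ChassisA
```

Note the extracted files still never land in the repository itself; artifacts only pass them between jobs in the pipeline.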

Install bash on Alpine/Docker using qemu for ARM host

I am trying to create, at build time on an x86 host, a Docker container
for runtime on an ARM host. To do this you need to use qemu for cross compilation. I also want to use Alpine Linux, since the image size is so small.
However, I am encountering an unusual error that only happens at build time: a problem
installing bash.
My understanding is that when running apk -U add bash, apk updates the package list
from the repositories and then installs the latest version of the requested package.
It then runs post-install scripts. It seems these post-install scripts fail.
However, when I built the image without bash, ran the container interactively
on the ARM host, and did apk fix && apk -U add bash, it did the trick. Running this
command at build time fails, however.
How can I add bash at build time?
Dockerfile
FROM armhf/alpine:3.5
ENV CONSUL_PREFIX __CONSUL_PREFIX__
ENV CONSUL_SECRET_PREFIX __CONSUL_SECRET_PREFIX__
ENV QEMU_EXECVE 1
COPY deploy/qemu/qemu-arm-static /usr/bin/
RUN ["qemu-arm-static","/sbin/apk","fix"]
RUN ["qemu-arm-static","/sbin/apk","add","-U","bash"]
RUN ["qemu-arm-static","/sbin/apk","-U","add", \
"postgresql-client",\
"curl","vim",\
"tzdata","bc"]
RUN ["qemu-arm-static","/bin/cp","usr/share/zoneinfo/America/Los_Angeles","/etc/localtime"]
RUN ["qemu-arm-static","/bin/echo","America/Los_Angeles",">","/etc/timezone"]
RUN ["qemu-arm-static","/bin/rm","-rf","/var/cache/apk/*"]
RUN ["qemu-arm-static","/bin/sh"]
COPY deploy /usr/local/deploy
COPY deploy/default/bashrc /root/.bashrc
COPY deploy/default/vimrc /root/.vimrc
COPY src /src
Build log / Error
@C02NN3NBG3QT:dev-resources $ ./publish-image
+ : router-logs
+ : quay.io
+ : quay.io/skilbjo/router-logs
+ : skilbjo@github.com
++ echo router-logs
++ tr - _
+ : router_logs/config
++ echo router-logs
++ tr - _
+ : router_logs/secrets
+ cat ../deploy/default/Dockerfile
+ sed 's;__CONSUL_PREFIX__;router_logs/config;'
+ sed 's;__CONSUL_SECRET_PREFIX__;router_logs/secrets;'
+ IMAGE_TAG=dev
+ cd ..
++ git rev-parse HEAD
+ echo 0a865e3918d584b4377fad9afe9ba28a1dbe5968
+ docker build --rm -t quay.io/skilbjo/router-logs:dev .
Sending build context to Docker daemon 8.713 MB
Step 1 : FROM armhf/alpine:3.5
---> 3ddfeafc01f0
Step 2 : ENV CONSUL_PREFIX router_logs/config
---> Using cache
---> e2aae782f6d8
Step 3 : ENV CONSUL_SECRET_PREFIX router_logs/secrets
---> Using cache
---> 71c863da2558
Step 4 : ENV QEMU_EXECVE 1
---> Using cache
---> a7e80415d0d4
Step 5 : COPY deploy/qemu/qemu-arm-static /usr/bin/
---> Using cache
---> 265df9b6575f
Step 6 : RUN qemu-arm-static /sbin/apk fix
---> Using cache
---> def74ac67891
Step 7 : RUN qemu-arm-static /sbin/apk add -U bash
---> Running in 6f62d2ecd6b3
fetch http://dl-cdn.alpinelinux.org/alpine/v3.5/main/armhf/APKINDEX.tar.gz
fetch http://dl-cdn.alpinelinux.org/alpine/v3.5/community/armhf/APKINDEX.tar.gz
(1/5) Installing ncurses-terminfo-base (6.0-r7)
(2/5) Installing ncurses-terminfo (6.0-r7)
(3/5) Installing ncurses-libs (6.0-r7)
(4/5) Installing readline (6.3.008-r4)
(5/5) Installing bash (4.3.46-r5)
Executing bash-4.3.46-r5.post-install
ERROR: bash-4.3.46-r5.post-install: script exited with error 1
Executing busybox-1.25.1-r0.trigger
ERROR: busybox-1.25.1-r0.trigger: script exited with error 1
1 errors; 7 MiB in 16 packages
The command 'qemu-arm-static /sbin/apk add -U bash' returned a non-zero code: 1
Project repo is here: https://github.com/skilbjo/router-logs
I had a similar error using Buildx's multiarch option. It was fixed thanks to the following commands:
docker run --rm --privileged multiarch/qemu-user-static --reset -p yes
docker buildx rm builder
docker buildx create --name builder --driver docker-container --use
docker buildx inspect --bootstrap
Thanks to this answer right here.
It turns out FROM armhf/alpine:3.5 is not good, and FROM resin/armhf-alpine:3.5 does the trick! I'd love to be able to see the commands from scratch that resulted in the armhf image being borked, but for now, this works!

GitLab CI. Path in yml for different users

I'm trying to set up GitLab CI for a .NET project, and now I'm writing the script in the yml file. What I want to know: the paths to msbuild.exe and mstest.exe may be different for different team members, so how can the same yml script work for different users?
Or maybe I'm misunderstanding how GitLab CI works?
The path to mstest.exe and all other referenced executables and files is based on the machine that runs the GitLab runner.
What's on your machine or anyone else's doesn't matter; only the build server matters, so write your GitLab .yml accordingly.
Sample .net yml file
##variables:
## increase indentation carefully, one space per cascade level.
## THIS IS YAML. NEVER USE TABS.
stages:
  - build
  - deploy

#BUILD
# Builds all working branches
working:
  stage: build
  except:
    - master
  script:
    - echo "Build Stage"
    - echo "Restoring NuGet Packages..."
    - '"c:\nuget\nuget.exe" restore "SOLUTION PATH"'
    # - '"c:\nuget\nuget.exe" restore "ANOTHER ABSOLUTE PATH TO YOUR SOLUTION"'
    - ''
    - echo "Building Solutions..."
    - C:\Windows\Microsoft.NET\Framework64\v4.0.30319\msbuild.exe /consoleloggerparameters:ErrorsOnly /maxcpucount /nologo /property:Configuration=Release /verbosity:quiet "SOLUTION PATH"

# Builds all stable/master pushes
stable:
  stage: build
  only:
    - master
  script:
    - echo "Build Stage"
    - echo "Restoring NuGet Packages..."
    - '"c:\nuget\nuget.exe" restore "SOLUTION PATH"'
    # - '"c:\nuget\nuget.exe" restore "ANOTHER ABSOLUTE PATH TO YOUR SOLUTION"'
    - ''
    - echo "Building Solutions..."
    - C:\Windows\Microsoft.NET\Framework64\v4.0.30319\msbuild.exe /consoleloggerparameters:ErrorsOnly /maxcpucount /nologo /property:Configuration=Release /verbosity:quiet "SOLUTION PATH"

#DEPLOY
deploy-dev:
  stage: deploy
  only:
    - dev
  script:
    - echo "Deploy Stage"
    #SEND TO YOUR DEV SERVER

## deploy latest master to the correct servers
deploy-master:
  stage: deploy
  script:
    - echo "Deploy Stage"
    #SEND TO YOUR PRODUCTION SERVER
  only:
    - master
  tags:
    - .NET
    #put tags here that you put on your runners so you can hit the right runners when you push your code

Xcode CI - Bot script for uploading to Fabric gives error "Failed to Detect Build Environment"

Trying to set up an Xcode CI bot to build and upload my app to Fabric for beta distribution.
The bot builds and archives the app just fine, but fails on the Fabric upload script. Any suggestions?
Log:
IPA Path: /Users/XcodeServer/Library/Caches/XCSBuilder/Integration-c7216425c354c42adb04283fc31b6348/ExportedProduct/MyApp.ipa
2016-11-17 12:40:23.967 uploadDSYM[55991:2048496] Fabric.framework/run 1.6.2 (205)
2016-11-17 12:40:23.972 uploadDSYM[55991:2048496] Launched uploader in validation mode
error: Fabric: Failed to Detect Build Environment
Script:
IPA_PATH="${XCS_PRODUCT}"
echo "IPA Path: ${IPA_PATH}"
"${XCS_PRIMARY_REPO_DIR}"/MyApp/Pods/Fabric/run <API> <KEY> -ipaPath "${IPA_PATH}" -emails me@email.com
Solved it. I was using the wrong script (pulled from the app's build phase when setting up Fabric). You have to use the Crashlytics script:
"${XCS_PRIMARY_REPO_DIR}"/MyApp/Pods/Crashlytics/submit <API> <KEY> -ipaPath "${IPA_PATH}" -emails me@test.com
I'm using this script in the Post-Integration Script Triggers
"${XCS_PRIMARY_REPO_DIR}/Pods/Crashlytics/submit" <API> <KEY> -ipaPath "${XCS_PRODUCT}"
Tested in Xcode Server 10
# Make sure the encoding is correct
export LANG=en_US.UTF-8
# Remove & copy assets (-f so the first run doesn't fail when ipa/ doesn't exist yet)
rm -rf ${XCS_SOURCE_DIR}/ipa
cp -R ${XCS_OUTPUT_DIR}/ExportedProduct/Apps/ ${XCS_SOURCE_DIR}/ipa/
# Release the archive
${XCS_PRIMARY_REPO_DIR}/Pods/Crashlytics/submit <API> <Key> -ipaPath ${XCS_SOURCE_DIR}/ipa/AppName.ipa -groupAliases groupName -notifications YES
In my case, Xcode Server deletes all assets after the archive, so I added a 'copy' command to the script.
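The remove-and-copy step can be rehearsed locally with stand-in directories (XCS_SOURCE_DIR and XCS_OUTPUT_DIR are real Xcode Server variables, but the temp paths and AppName.ipa here are assumptions for the demo):

```shell
#!/usr/bin/env bash
set -e
# Stand-in directories; on a real bot these are provided by Xcode Server
XCS_SOURCE_DIR=$(mktemp -d)
XCS_OUTPUT_DIR=$(mktemp -d)

# Simulate the exported product that Xcode Server would otherwise delete
mkdir -p "${XCS_OUTPUT_DIR}/ExportedProduct/Apps"
touch "${XCS_OUTPUT_DIR}/ExportedProduct/Apps/AppName.ipa"

# Remove any stale copy; -f avoids an error on the first run,
# when the ipa directory does not exist yet
rm -rf "${XCS_SOURCE_DIR}/ipa"

# Copy the exported apps before the bot's cleanup removes them
cp -R "${XCS_OUTPUT_DIR}/ExportedProduct/Apps/" "${XCS_SOURCE_DIR}/ipa/"

# The .ipa is now in a stable location to hand to the submit script
ls "${XCS_SOURCE_DIR}/ipa"
```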