I am trying to build and deploy an Angular app every time I push my code to GitHub. The Travis CI build passes, but it doesn't deploy to Surge for some reason. I added the SURGE_LOGIN and SURGE_TOKEN environment variables to the repository settings, but it still doesn't work. Shouldn't the build log mention that deployment failed? Any idea what is going wrong here and how I can fix it? Also, I don't get emails when a build passes/fails, even though this is set in the config file and my email address is the one I added to the SURGE_LOGIN environment variable.
The end of my Travis build log:
42.29s$ ng build -prod
Date: 2017-11-02T11:04:25.130Z
Hash: 5dadab3e49327d48aac1
Time: 38060ms
chunk {0} polyfills.d8d3d78a4deb2ab66856.bundle.js (polyfills) 66.1 kB {4} [initial] [rendered]
chunk {1} styles.4d93494871bdc47b353f.bundle.css (styles) 115 kB {4} [initial] [rendered]
chunk {2} main.725afabe80d80f10fd14.bundle.js (main) 8.08 kB {3} [initial] [rendered]
chunk {3} vendor.4400ceca3ce00f041a26.bundle.js (vendor) 434 kB [initial] [rendered]
chunk {4} inline.fe3295955bbd9314430c.bundle.js (inline) 1.45 kB [entry] [rendered]
The command "ng build -prod" exited with 0.
Done. Your build exited with 0.
My .travis.yml:
#travis CI build configuration
#build language
language: node_js
#node_js versions
node_js:
- "6.11.2"
#before running the build
before-script:
- npm install #install all dependencies
- npm install -g surge #global surge install
#actual build step
script:
- ng build -prod
#build only on push not on pull requests.
deploy:
provider: surge
skip_cleanup: true
project: ./dist/ #build output path
domain: gaping-feeling.surge.sh #surge domain
#notifications
notifications:
email:
on_success: change #default: change
on_failure: change #default: change
Your issue is the YAML indentation. YAML is indentation-sensitive, so

a:
b:

and

a:
  b:

have different meanings. In the first, a and b are both top-level attributes, while in the second, b is a child attribute of a. Your YAML should be:
#travis CI build configuration
#build language
language: node_js
#node_js versions
node_js:
  - "6.11.2"
#before running the build
before_script:
  - npm install #install all dependencies
  - npm install -g surge #global surge install
#actual build step
script:
  - ng build -prod
#build only on push not on pull requests.
deploy:
  provider: surge
  skip_cleanup: true
  project: ./dist/ #build output path
  domain: gaping-feeling.surge.sh #surge domain
notifications:
  email:
    on_success: change
    on_failure: change
If you still face issues, let me know, but this should fix the problem.
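A quick way to catch indentation mistakes like this before pushing is to lint the config; the Travis CLI has a lint command (this tooling tip is mine, not part of the original answer):

gem install travis          # the Travis CI command-line client (a Ruby gem)
travis lint .travis.yml     # flags unexpected or misplaced keys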
Related
I made a Dockerfile to build my Spring Boot project natively with GraalVM; everything went correctly locally.
Here is the Dockerfile:
# Build stage: compile the native binary with GraalVM
FROM ghcr.io/graalvm/graalvm-ce:22.3.1 AS buildnative
WORKDIR /app
COPY mvnw pom.xml ./
COPY .mvn/ .mvn
COPY src ./src
RUN ./mvnw clean package -Pnative

# Runtime stage: ship only the native executable
FROM ubuntu:23.04
EXPOSE 8080
COPY --from=buildnative /app/target/spring-boot-project /build/app
CMD ["/build/app"]
This runs perfectly locally, but in the GitLab runner I always get the same error:
JAVA_HOME is not defined correctly.
We cannot execute /opt/graalvm-ce-java17-22.3.1/bin/java
The command '/bin/sh -c ./mvnw clean package -Pnative' returned a non-zero code: 1
So I decided to add some logs to the Maven wrapper, and here is what I get:
Step 7/11 : RUN ./mvnw clean package -Pnative
---> Running in 81e0558130f3
------------> /opt/graalvm-ce-java17-22.3.1/bin/java
------------> JAVA_HOME is /opt/graalvm-ce-java17-22.3.1
Error: JAVA_HOME is not defined correctly.
We cannot execute /opt/graalvm-ce-java17-22.3.1/bin/java
The command '/bin/sh -c ./mvnw clean package -Pnative' returned a non-zero code: 1
Cleaning up project directory and file based variables
In the logs I added, we can see that JAVA_HOME is defined, and defined correctly. It is the same as locally, where everything works perfectly.
I tried adding the line RUN chmod +x mvnw before running it, but it did not change anything.
I am out of ideas. Does anyone have an idea of what is happening?
Edit:
I decided to dive deeper into the issue and modified the mvnw script, adding logs to see what was happening.
I added this to mvnw:
if [ -e "$JAVACMD" ] ; then
echo "------------> THE FILE EXIST" >&2
else
echo "------------> THE FILE DOES NOT EXIST" >&2
fi
if [ -x "$JAVACMD" ] ; then
echo "------------> THE FILE IS EXECUTABLE" >&2
else
echo "------------> THE FILE IS NOT EXECUTABLE" >&2
fi
Results:
Locally:
------------> JAVACMD /opt/graalvm-ce-java17-22.3.1/bin/java
------------> THE FILE EXIST
------------> THE FILE IS EXECUTABLE
In the GitLab runner:
------------> JAVACMD /opt/graalvm-ce-java17-22.3.1/bin/java
------------> THE FILE EXIST
------------> THE FILE IS NOT EXECUTABLE
This makes no sense to me.
Is your GitLab runner configured to use a non-root user when executing the Dockerfile?
As @jilliss pointed out, it is the Java binary that needs execute permission, but maybe only root has that permission (which would explain why it works locally, since by default the build runs as root).
If the Ops team run the Dockerfile as another user, that could explain why /opt/graalvm-ce-java17-22.3.1/bin/java is no longer executable.
Try adding a whoami log and see which user is running the build in GitLab.
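For instance, a couple of throwaway debug instructions in the build stage would confirm both the build user and the permissions on the binary (my illustration, not from the original answer):

# Debug only: show which user runs the build and the JDK binary's permissions
RUN whoami && id
RUN ls -l /opt/graalvm-ce-java17-22.3.1/bin/java
RUN ./mvnw clean package -Pnative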
The correct compile command is
mvn -Pnative native:compile
You can see more details and the full documentation here. After that you will see the GraalVM build result in the logs. So you need to change your Dockerfile as below:
FROM ghcr.io/graalvm/graalvm-ce:22.3.1 AS buildnative
WORKDIR /app
COPY mvnw pom.xml ./
COPY .mvn/ .mvn
COPY src ./src
RUN ./mvnw native:compile -Pnative
FROM ubuntu:23.04
EXPOSE 8080
COPY --from=buildnative /app/target/spring-boot-project /build/app
CMD ["/build/app"]
Example build log from my local build:
[INFO] --- native-maven-plugin:0.9.19:compile (default-cli) @ demo ---
Downloading: Component catalog from www.graalvm.org
Processing Component: Native Image
Downloading: Component native-image: Native Image from github.com
Installing new component: Native Image (org.graalvm.native-image, version 22.3.1)
[INFO] Found GraalVM installation from JAVA_HOME variable.
[INFO] [graalvm reachability metadata repository for ch.qos.logback:logback-classic:1.4.5]: Configuration directory not found. Trying latest version.
[INFO] [graalvm reachability metadata repository for ch.qos.logback:logback-classic:1.4.5]: Configuration directory is ch.qos.logback/logback-classic/1.4.1
[INFO] Executing: /opt/graalvm-ce-java17-22.3.1/bin/native-image -cp /app/target/classes:/root/.m2/repository/org/springframework/spring-aop/6.0.4/spring-aop-6.0.4.jar:/root/.m2/repository/org/springframework/boot/spring-boot-starter-logging/3.0.2/spring-boot-starter-logging-3.0.2.jar:/root/.m2/repository/org/springframework/spring-context/6.0.4/spring-context-6.0.4.jar:/root/.m2/repository/org/springframework/spring-core/6.0.4/spring-core-6.0.4.jar:/root/.m2/repository/org/apache/logging/log4j/log4j-api/2.19.0/log4j-api-2.19.0.jar:/root/.m2/repository/org/springframework/spring-expression/6.0.4/spring-expression-6.0.4.jar:/root/.m2/repository/org/apache/logging/log4j/log4j-to-slf4j/2.19.0/log4j-to-slf4j-2.19.0.jar:/root/.m2/repository/ch/qos/logback/logback-core/1.4.5/logback-core-1.4.5.jar:/root/.m2/repository/jakarta/annotation/jakarta.annotation-api/2.1.1/jakarta.annotation-api-2.1.1.jar:/root/.m2/repository/org/springframework/spring-beans/6.0.4/spring-beans-6.0.4.jar:/root/.m2/repository/ch/qos/logback/logback-classic/1.4.5/logback-classic-1.4.5.jar:/root/.m2/repository/org/springframework/boot/spring-boot-starter/3.0.2/spring-boot-starter-3.0.2.jar:/root/.m2/repository/org/springframework/spring-jcl/6.0.4/spring-jcl-6.0.4.jar:/root/.m2/repository/org/springframework/boot/spring-boot-autoconfigure/3.0.2/spring-boot-autoconfigure-3.0.2.jar:/root/.m2/repository/org/slf4j/jul-to-slf4j/2.0.6/jul-to-slf4j-2.0.6.jar:/root/.m2/repository/org/yaml/snakeyaml/1.33/snakeyaml-1.33.jar:/root/.m2/repository/org/springframework/boot/spring-boot/3.0.2/spring-boot-3.0.2.jar:/root/.m2/repository/org/slf4j/slf4j-api/2.0.6/slf4j-api-2.0.6.jar --no-fallback -H:Path=/app/target -H:Name=demo -H:ConfigurationFileDirectories=/app/target/graalvm-reachability-metadata/160481799c4b6c37cde925c9aebf513c32245dcf/ch.qos.logback/logback-classic/1.4.1
========================================================================================================================
GraalVM Native Image: Generating 'demo' (executable)...
========================================================================================================================
[1/7] Initializing... (6.6s @ 0.23GB)
Version info: 'GraalVM 22.3.1 Java 17 CE'
Java version info: '17.0.6+10-jvmci-22.3-b13'
C compiler: gcc (redhat, x86_64, 11.3.1)
Garbage collector: Serial GC
1 user-specific feature(s)
- org.springframework.aot.nativex.feature.PreComputeFieldFeature
Field org.apache.commons.logging.LogAdapter#log4jSpiPresent set to true at build time
Field org.apache.commons.logging.LogAdapter#log4jSlf4jProviderPresent set to true at build time
Field org.apache.commons.logging.LogAdapter#slf4jSpiPresent set to true at build time
Field org.apache.commons.logging.LogAdapter#slf4jApiPresent set to true at build time
Field org.springframework.core.NativeDetector#imageCode set to true at build time
Field org.springframework.format.support.DefaultFormattingConversionService#jsr354Present set to false at build time
Field org.springframework.core.KotlinDetector#kotlinPresent set to false at build time
Field org.springframework.core.KotlinDetector#kotlinReflectPresent set to false at build time
Field org.springframework.cglib.core.AbstractClassGenerator#imageCode set to true at build time
Field org.springframework.boot.logging.log4j2.Log4J2LoggingSystem$Factory#PRESENT set to false at build time
Field org.springframework.boot.logging.java.JavaLoggingSystem$Factory#PRESENT set to true at build time
Field org.springframework.boot.logging.logback.LogbackLoggingSystem$Factory#PRESENT set to true at build time
Field org.springframework.boot.logging.logback.LogbackLoggingSystemProperties#JBOSS_LOGGING_PRESENT set to false at build time
Field org.springframework.context.event.ApplicationListenerMethodAdapter#reactiveStreamsPresent set to false at build time
[2/7] Performing analysis... [*******] (60.8s @ 2.15GB)
8,903 (88.31%) of 10,082 classes reachable
13,147 (64.27%) of 20,456 fields reachable
40,485 (56.88%) of 71,181 methods reachable
365 classes, 115 fields, and 1,191 methods registered for reflection
64 classes, 70 fields, and 55 methods registered for JNI access
4 native libraries: dl, pthread, rt, z
[3/7] Building universe... (8.9s @ 2.03GB)
[4/7] Parsing methods... [***] (9.6s @ 0.82GB)
[5/7] Inlining methods... [***] (3.7s @ 2.16GB)
[6/7] Compiling methods... [*******] (48.2s @ 1.97GB)
[7/7] Creating image... (5.6s @ 1.60GB)
17.53MB (49.48%) for code area: 25,460 compilation units
17.60MB (49.66%) for image heap: 215,829 objects and 25 resources
312.40KB ( 0.86%) for other data
35.44MB in total
------------------------------------------------------------------------------------------------------------------------
Top 10 packages in code area: Top 10 object types in image heap:
936.87KB java.util 3.72MB byte[] for code metadata
594.27KB java.lang.invoke 2.07MB java.lang.String
469.21KB c.s.org.apache.xerces.internal.impl.xs.traversers 2.06MB java.lang.Class
455.78KB java.lang 1.68MB byte[] for general heap data
423.05KB com.sun.org.apache.xerces.internal.impl 1.60MB byte[] for java.lang.String
407.79KB com.sun.crypto.provider 765.10KB com.oracle.svm.core.hub.DynamicHubCompanion
375.00KB org.springframework.beans.factory.support 576.38KB java.util.HashMap$Node
371.71KB java.io 513.09KB int[][]
354.25KB java.util.concurrent 394.88KB java.lang.String[]
328.57KB java.text 394.65KB byte[] for reflection metadata
12.74MB for 405 more packages 3.23MB for 1771 more object types
------------------------------------------------------------------------------------------------------------------------
12.0s (8.0% of total time) in 35 GCs | Peak RSS: 3.04GB | CPU load: 5.08
------------------------------------------------------------------------------------------------------------------------
Produced artifacts:
/app/target/demo (executable)
/app/target/demo.build_artifacts.txt (txt)
========================================================================================================================
Finished generating 'demo' in 2m 29s.
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:11 min
[INFO] Finished at: 2023-01-27T18:05:23Z
[INFO] ------------------------------------------------------------------------
I've never tried a GitLab runner myself, but have you tried forcing the JAVA_HOME env path before the Maven command?
ENV JAVA_HOME=/opt/graalvm-ce-java17-22.3.1
RUN ./mvnw clean package -Pnative
Hope it helps.
I have a (1) Dockerfile and a (2) C++ project built with Bazel.
I want to create a Docker image that has the Bazel targets pre-built, so that when I spin up new containers the targets are already built and I can just do bazel run //hello:hello_world from the container's bash.
Dockerfile
# Copy my project with Bazel files to a Docker image, and the
...
RUN bazel --output_user_root=/tmp/hello_project/bazel build //...
...
During the Docker image build, I get the following output, which is expected:
Loading:
Loading: 0 packages loaded
Analyzing: 2 targets (1 packages loaded, 0 targets configured)
Analyzing: 2 targets (11 packages loaded, 18 targets configured)
INFO: Analyzed 2 targets (15 packages loaded, 60 targets configured).
INFO: Found 2 targets...
[0 / 11] [Prepa] BazelWorkspaceStatusAction stable-status.txt
INFO: Elapsed time: 6.333s, Critical Path: 0.37s
INFO: 11 processes: 6 internal, 5 processwrapper-sandbox.
INFO: Build completed successfully, 11 total actions
INFO: Build completed successfully, 11 total actions
When I run a new container from the Docker image built previously and then, in the container, I run
bazel run //hello:hello_world
it rebuilds the targets instead of using the existing pre-built ones, which should not be required.
Result I expect (not get): Everything is pre-built and just needs to run
INFO: Analyzed target //hello:hello_world (0 packages loaded, 0 targets configured).
INFO: Found 1 target...
Target //hello:hello_world up-to-date:
bazel-bin/hello/hello_world
INFO: Elapsed time: 0.163s, Critical Path: 0.01s
INFO: 1 process: 1 internal.
INFO: Build completed successfully, 1 total action
INFO: Build completed successfully, 1 total action
Hello World!
Result I get: it rebuilds the binaries
[root@4a6bdb57fd79 test-rc]# bazel run //hello:hello_world
Extracting Bazel installation...
Starting local Bazel server and connecting to it...
INFO: Analyzed target //hello:hello_world (15 packages loaded, 60 targets configured).
INFO: Found 1 target...
Target //hello:hello_world up-to-date:
bazel-bin/hello/hello_world
INFO: Elapsed time: 6.255s, Critical Path: 0.38s
INFO: 7 processes: 4 internal, 3 processwrapper-sandbox.
INFO: Build completed successfully, 7 total actions
INFO: Build completed successfully, 7 total actions
Hello World!
How can I make sure that bazel run uses the same pre-built targets and does not build them again before running?
This sounds like non-determinism - in the second execution of Bazel, something is different causing a cache miss.
Some things that can cause it:
different options passed to bazel build vs. bazel run - check your .bazelrc file
actions that include VCS info or a timestamp - make sure you have rules_docker 0.22.0 or later to pick up https://github.com/bazelbuild/rules_docker/commit/2b35b2dd56f0be6cc6b8df957332a31435f6b3ce
One step to diagnose is to use the --explain=log.txt and --verbose_explanations flags to Bazel, then the log file will say why it was rebuilt. However it doesn't have much detail, just something like "a source file has changed".
If you want the power tool for this, there is a way to find out exactly why Bazel didn't get a cache hit - read https://georgi.hristozov.net/til/2020/04/20/compare-bazel-execlogs-to-find-non-deterministic-parts-of-the-build
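Concretely, the explain flags can be passed straight to the run inside the container (a sketch; the log path is arbitrary):

# Record why each action was (re)executed, then inspect the reasons
bazel run //hello:hello_world --explain=/tmp/explain.log --verbose_explanations
cat /tmp/explain.log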
I have a problem when I try to run ng deploy on a recently created Angular project. The version I'm using is 9.1.6 (I just upgraded from 9.1.5, where I got the same error). ng serve works properly.
(base) paul@Pauls-MacBook-Pro must-have % ng deploy
📦 Building "must-have"
Another process, with id 4426, is currently running ngcc.
Waiting up to 250s for it to finish.
Generating ES5 bundles for differential loading...
ES5 bundle generation complete.
chunk {0} runtime-es2015.1eba213af0b233498d9d.js (runtime) 1.45 kB [entry] [rendered]
chunk {0} runtime-es5.1eba213af0b233498d9d.js (runtime) 1.45 kB [entry] [rendered]
chunk {2} polyfills-es2015.690002c25ea8557bb4b0.js (polyfills) 36.1 kB [initial] [rendered]
chunk {3} polyfills-es5.9e286f6d9247438cbb02.js (polyfills-es5) 129 kB [initial] [rendered]
chunk {1} main-es2015.658a3e52de1922463b54.js (main) 467 kB [initial] [rendered]
chunk {1} main-es5.658a3e52de1922463b54.js (main) 545 kB [initial] [rendered]
chunk {4} styles.c86817c326e37bf011e3.css (styles) 62 kB [initial] [rendered]
Date: 2020-05-16T16:06:16.041Z - Hash: a4ca944a427d0323601e - Time: 68781ms
Cannot read property 'printf' of undefined
(base) paul@Pauls-MacBook-Pro must-have %
My versions:
Package Version
-----------------------------------------------------------
@angular-devkit/architect 0.900.7
@angular-devkit/build-angular 0.901.6
@angular-devkit/build-optimizer 0.901.6
@angular-devkit/build-webpack 0.901.6
@angular-devkit/core 9.1.6
@angular-devkit/schematics 9.1.6
@angular/cdk 9.2.3
@angular/cli 9.1.6
@angular/fire 6.0.0
@angular/material 9.2.3
@ngtools/webpack 9.1.6
@schematics/angular 9.1.6
@schematics/update 0.901.6
rxjs 6.5.5
typescript 3.8.3
webpack 4.42.0
The fix is available in version 6.0.0-rc.2 of @angular/fire. Upgrade to 6.0.0-rc.2 by
ng add @angular/fire@next
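To confirm the upgrade actually landed, you can check the resolved version afterwards (my suggestion, not part of the original answer):

npm ls @angular/fire   # should now report 6.0.0-rc.2 or later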
I had this problem, and even ng add @angular/fire@next did not help.
I am using Angular 8, AngularFire and Firebase Hosting. So I build the project using ng build, and then deploy to Firebase Hosting using firebase deploy.
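In other words, the workaround is to skip ng deploy and run the two steps manually (this sketch assumes the Firebase CLI is installed and firebase init hosting has already been run for the project):

ng build --prod                  # outputs the production build to dist/
firebase deploy --only hosting   # uploads the built app to Firebase Hosting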
I'm trying to set up a CI/CD pipeline for a NativeScript app. I added the commands to install Node and ran npm install, but NativeScript has its own dependencies that it needs. How do I go about this on Azure DevOps dynamically, without having to create a VM that has NativeScript and all its dependencies installed and set up?
So far I have used a VM, installed NativeScript on it and used an agent to connect to the machine and build the solution. I have done the same using Jenkins, but Jenkins was running on the VM; now I want to move the whole pipeline to Azure DevOps.
Command used in the build step: tns build android
If you don't want to use a VM, you'll have to install everything NativeScript needs on the hosted agent each time you create a build for your app.
A couple of important things to note. First, the name of your repository is changed to 's'; this messes with the naming of your entitlements file, or at least it did for me. I fixed this with a bash file I added to my repository that changes the path in build.xcconfig for the CODE_SIGN_ENTITLEMENTS variable, and I added an npm run entitle command to my package.json file to run it before building (see the sketch after this paragraph). Second, you'll want to store all files and secure passwords in the Library section under Pipelines in your Azure DevOps project. Third, the classic editor is your best friend for figuring out YAML, as most jobs have an option to view the YAML; you can also use the classic editor as an alternative to the YAML file.
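For reference, the npm hook could be wired up like this (the exact script entry is my guess; the filename matches the bash file at the end of this answer):

# package.json (scripts section) - hypothetical entry:
#   "scripts": { "entitle": "bash pipeline-entitlements.sh" }
npm run entitle   # runs the entitlements fix before building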
The YAML and bash file below show an example of how you can build an ipa and an apk that are stored as artifacts. You can then use those to trigger a release pipeline that pushes to the Play Store and App Store.
# YAML File
name: Release Build

trigger:
  - release/* # will start a build for pull requests into the release branch, i.e. release/version_1_0_0, release/version_2_0_0

pool:
  vmImage: 'macOS-10.13'

variables:
  scheme: 's' # default name/scheme created on this machine for ipa
  sdk: 'iphoneos'
  configuration: 'Release'

steps:
  - task: NodeTool@0
    inputs:
      versionSpec: '10.14'
    displayName: 'Install Node.js'

  # Download Secure File for Android
  # Note: if multiple secure files are downloaded... variable name will change and break pipeline
  - task: DownloadSecureFile@1
    displayName: 'download android keystore file'
    inputs:
      secureFile: myKeystore.keystore

  # Install Apple Certificate (Distribution)
  - task: InstallAppleCertificate@2
    displayName: 'Install an Apple certificate Distribution (yourcertificate.p12)'
    inputs:
      certSecureFile: '00000000-0000-0000-0000-000000000000' # got id from viewing file in classic editor for pipeline
      certPwd: '$(myCertificatePasswordP12)' # password stored in Library

  # Install Apple Provisioning Profile (Distribution)
  - task: InstallAppleProvisioningProfile@1
    displayName: 'Apple Provisioning Profile (myProvisioningProfile.mobileprovision)'
    inputs:
      provisioningProfileLocation: 'secureFiles' # Options: secureFiles, sourceRepository
      provProfileSecureFile: '00000000-0000-0000-0000-000000000000' # Required when provisioningProfileLocation == secureFiles

  # General Set Up
  - script: |
      npm install -g nativescript@latest
      npm install
    displayName: 'Install NativeScript and node modules'

  # variable explanation
  # $DOWNLOADSECUREFILE_SECUREFILEPATH is the keystore file downloaded earlier
  # $KEYSTORE_PASSWORD refers to the environment variable in this script, which references a Library variable
  # $(MyPlayStoreAlias) refers to the Library variable for your app's alias
  # $BUILD_SOURCESDIRECTORY is the location the apk is built to

  # Android
  - script: |
      tns build android --env.production --release --key-store-path $DOWNLOADSECUREFILE_SECUREFILEPATH --key-store-password $KEYSTORE_PASSWORD --key-store-alias $(MyPlayStoreAlias) --key-store-alias-password $KEYSTORE_PASSWORD --bundle --copy-to $BUILD_SOURCESDIRECTORY # creates apk
    displayName: 'Build Android Release apk'
    env:
      KEYSTORE_PASSWORD: $(MyPlayStoreKeystore)

  # create apk artifact
  - task: PublishBuildArtifacts@1
    inputs:
      pathtoPublish: '$(Build.SourcesDirectory)/app-release.apk'
      artifactName: 'apkDrop'
    displayName: 'Publishing apkDrop artifact'

  # have to use Xcode 10.1 to meet the minimum standards for uploading an ipa... the default version on this machine was lower than 10.1
  # changing xcode version
  - script: |
      xcodebuild -version
      /bin/bash -c "echo '##vso[task.setvariable variable=MD_APPLE_SDK_ROOT;]'/Applications/Xcode_10.1.app;sudo xcode-select --switch /Applications/Xcode_10.1.app/Contents/Developer"
      xcodebuild -version
    displayName: 'changing xcode to 10.1'

  # Optional... was running into build issues with the latest version
  # downgrading cocoapods version
  - script: |
      sudo gem uninstall cocoapods
      sudo gem install cocoapods -v 1.5.3
    displayName: 'Using cocoapods version 1.5.3'

  # iOS
  - script: |
      xcodebuild -version # making sure the correct xcode version is being used
      pip install --ignore-installed six # fixes pip 6 error
      npm run entitle # custom bash script used to change the entitlements file
      tns run ios --provision # see what provisioning profile and certificate are installed... helpful for debugging
      tns build ios --env.production --release --bundle # creates xcworkspace
    displayName: 'Build ios Release xcworkspace'

  # build and sign ipa
  - task: Xcode@5
    displayName: 'Xcode sign and build'
    inputs:
      sdk: '$(sdk)' # custom var
      scheme: '$(scheme)' # must be provided if setting manual path to xcworkspace
      configuration: '$(configuration)' # custom var
      xcodeVersion: 'specifyPath'
      xcodeDeveloperDir: '/Applications/Xcode_10.1.app' # using xcode 10.1
      xcWorkspacePath: 'platforms/ios/s.xcworkspace'
      exportPath: '$(agent.buildDirectory)/output/$(sdk)/$(configuration)' # location where the ipa file will be stored
      packageApp: true # create ipa
      signingOption: manual
      signingIdentity: '$(APPLE_CERTIFICATE_SIGNING_IDENTITY)' # distribution certificate
      provisioningProfileUuid: '$(APPLE_PROV_PROFILE_UUID)' # distribution profile

  # creating ipa artifact
  - task: PublishBuildArtifacts@1
    displayName: 'Publishing ipaDrop artifact'
    inputs:
      pathtoPublish: '$(agent.buildDirectory)/output/$(sdk)/$(configuration)/s.ipa'
      artifactName: 'ipaDrop'
Bash file
#!/usr/bin/env bash
# filename: pipeline-entitlements.sh
echo "Editing build.xcconfig"
TARGET_KEY="CODE_SIGN_ENTITLEMENTS"
REPLACEMENT_VALUE="s\/Resources\/YOURENTITLEMENTFILENAME.entitlements"
CONFIG_FILE="./app/App_Resources/iOS/build.xcconfig"
echo "Editing $TARGET_KEY and replaceing value with $REPLACEMENT_VALUE"
sed -i.bak "s/\($TARGET_KEY *= *\).*/\1$REPLACEMENT_VALUE/" $CONFIG_FILE
echo "Finished editing build.xcconfig"
Actually, I have my zip files on GitLab, and I want to extract those files using GitLab CI/CD. I have tried this in .gitlab-ci.yml:
image: docker

stages:
  - build
  - test

services:
  - docker:dind

build:
  before_script:
    - apk add p7zip
  script:
    - cd \kmfs
    - 7z x -oChassisA ChassisA.zip
OUTPUT
$ apk add p7zip
fetch http://dl-cdn.alpinelinux.org/alpine/v3.8/main/x86_64/APKINDEX.tar.gz
fetch http://dl-cdn.alpinelinux.org/alpine/v3.8/community/x86_64/APKINDEX.tar.gz
(1/3) Installing libgcc (6.4.0-r9)
(2/3) Installing libstdc++ (6.4.0-r9)
(3/3) Installing p7zip (16.02-r3)
Executing busybox-1.28.4-r3.trigger
OK: 11 MiB in 17 packages
$ cd \kmfs
$ 7z x -oChassisA ChassisA.zip
7-Zip [64] 16.02 : Copyright (c) 1999-2016 Igor Pavlov : 2016-05-21
p7zip Version 16.02 (locale=C.UTF-8,Utf16=on,HugeFiles=on,64 bits,1 CPU Intel(R) Xeon(R) CPU @ 2.30GHz (306F0),ASM,AES-NI)
Scanning the drive for archives:
1 file, 3638943 bytes (3554 KiB)
Extracting archive: ChassisA.zip
--
Path = ChassisA.zip
Type = zip
Physical Size = 3638943
Everything is Ok
Folders: 1
Files: 4
Size: 24070952
Compressed: 3638943
Job succeeded
It executes successfully, but the extracted files are not reflected in the GitLab repository, and I am not able to access those files from the Node.js code written in the test stage.
So it would be great if someone could suggest a way to extract the .zip files on GitLab itself, either with commands or from Node.js code.
You need to use artifacts as part of your job to declare the path of the extracted zip. This will allow subsequent jobs to access those files.
artifacts:
  when: always
  expire_in: 1 week
  paths:
    - kmfs/ChassisA/ # the folder structure of the zip after it is extracted
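Putting it together with the job from the question, a minimal sketch might look like this (the test job and its image are my assumptions for illustration):

build:
  before_script:
    - apk add p7zip
  script:
    - cd kmfs
    - 7z x -oChassisA ChassisA.zip
  artifacts:
    when: always
    expire_in: 1 week
    paths:
      - kmfs/ChassisA/   # extracted files, passed on to later stages

test:
  image: node:alpine     # assumed image for the Node.js test code
  script:
    - ls kmfs/ChassisA   # artifacts from build are restored into the workspace here
    - node test.js       # hypothetical script that reads the extracted files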