fabric8: Add a configmap - maven
I've been able to build and deploy my Spring Boot service image into OpenShift using the fabric8 Maven plugin.
I need to add a ConfigMap.
I've tried adding a straightforward configmap.yml into src/main/fabric8.
Currently, I'm getting this message:
[INFO] --- fabric8-maven-plugin:4.4.1:resource (default-cli) @ connector ---
[INFO] F8: Using Container image name of namespace: arxius-linia
[INFO] F8: Running generator spring-boot
[INFO] F8: spring-boot: Using Container image fabric8/java-centos-openjdk11-jdk:1.6.3 as base / builder
[INFO] F8: using resource templates from /home/jeusdi/projects/arxius-linia/connector/src/main/fabric8
[INFO] F8: fmp-controller: Adding a default Deployment
[INFO] F8: fmp-service: Adding a default service 'connector' with ports [8080]
[WARNING] F8: fmp-git: Could not detect any git remote
[WARNING] F8: fmp-git: Could not detect any git remote
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 20.839 s
[INFO] Finished at: 2020-11-25T14:08:29+01:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal io.fabric8:fabric8-maven-plugin:4.4.1:resource (default-cli) on project connector: Execution default-cli of goal io.fabric8:fabric8-maven-plugin:4.4.1:resource failed.: NullPointerException -> [Help 1]
My configmap.yml is:
data:
  application.properties: |
    spring.profiles.active=dev
My current related pom.xml configuration is:
<build>
  <plugins>
    <plugin>
      <groupId>io.fabric8</groupId>
      <artifactId>fabric8-maven-plugin</artifactId>
      <version>4.4.1</version>
    </plugin>
  </plugins>
</build>
It's a zero-configuration setup.
Any ideas about how to add my configmap.yml?
I'm from the Fabric8/Eclipse JKube team.
As I mentioned in the comments, you should try switching to Eclipse JKube, since that is the project that will be supported going forward. Migrating to Eclipse JKube is as simple as running this goal:
$ mvn org.eclipse.jkube:kubernetes-maven-plugin:migrate
I tried out your ConfigMap resource fragment with this version of kubernetes-maven-plugin in one of my demo projects:
<plugin>
  <groupId>org.eclipse.jkube</groupId>
  <artifactId>kubernetes-maven-plugin</artifactId>
  <version>1.1.0</version>
  <executions>
    <execution>
      <goals>
        <goal>resource</goal>
      </goals>
    </execution>
  </executions>
</plugin>
I added your ConfigMap to my src/main/jkube directory:
eclipse-jkube-sample-with-resource-fragments : $ cat src/main/jkube/second-configmap.yaml
data:
  application.properties: |
    spring.profiles.active=dev
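(As an aside: JKube, like FMP, infers the resource kind and name from the fragment's file name, so second-configmap.yaml becomes a ConfigMap named second. If you prefer to be explicit, a fully spelled-out version of the same fragment would look roughly like this; the explicit name here is only for illustration.)
apiVersion: v1
kind: ConfigMap
metadata:
  name: second   # otherwise derived from the fragment file name
data:
  application.properties: |
    spring.profiles.active=dev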
When I ran mvn k8s:resource k8s:apply, I could see the ConfigMap being created:
eclipse-jkube-sample-with-resource-fragments : $ mvn k8s:resource k8s:apply
[INFO] Scanning for projects...
[INFO]
[INFO] -------< org.eclipse.jkube.quickstarts.maven:external-resources >-------
[INFO] Building Eclipse JKube :: Quickstarts :: Maven :: External Resources 1.1.0
[INFO] --------------------------------[ jar ]---------------------------------
[INFO]
[INFO] --- kubernetes-maven-plugin:1.1.0:resource (default-cli) @ external-resources ---
[INFO] k8s: Running generator spring-boot
[INFO] k8s: spring-boot: Using Docker image quay.io/jkube/jkube-java-binary-s2i:0.0.9 as base / builder
[INFO] k8s: Using resource templates from /home/rohaan/work/repos/eclipse-jkube-sample-with-resource-fragments/src/main/jkube
[INFO] k8s: jkube-controller: Adding a default Deployment
[INFO] k8s: jkube-healthcheck-spring-boot: Adding readiness probe on port 8080, path='/health', scheme='HTTP', with initial delay 10 seconds
[INFO] k8s: jkube-healthcheck-spring-boot: Adding liveness probe on port 8080, path='/health', scheme='HTTP', with initial delay 180 seconds
[INFO] k8s: jkube-service-discovery: Using first mentioned service port '80'
[INFO] k8s: jkube-revision-history: Adding revision history limit to 2
[INFO] k8s: validating /home/rohaan/work/repos/eclipse-jkube-sample-with-resource-fragments/target/classes/META-INF/jkube/kubernetes/ribbon-serviceaccount.yml resource
[INFO] k8s: validating /home/rohaan/work/repos/eclipse-jkube-sample-with-resource-fragments/target/classes/META-INF/jkube/kubernetes/external-resources-service.yml resource
[INFO] k8s: validating /home/rohaan/work/repos/eclipse-jkube-sample-with-resource-fragments/target/classes/META-INF/jkube/kubernetes/game-config-env-file-configmap.yml resource
[INFO] k8s: validating /home/rohaan/work/repos/eclipse-jkube-sample-with-resource-fragments/target/classes/META-INF/jkube/kubernetes/second-configmap.yml resource
[INFO] k8s: validating /home/rohaan/work/repos/eclipse-jkube-sample-with-resource-fragments/target/classes/META-INF/jkube/kubernetes/external-resources-deployment.yml resource
[INFO] k8s: validating /home/rohaan/work/repos/eclipse-jkube-sample-with-resource-fragments/target/classes/META-INF/jkube/kubernetes/my-ingress-ingress.yml resource
[INFO]
[INFO] --- kubernetes-maven-plugin:1.1.0:apply (default-cli) @ external-resources ---
[INFO] k8s: Using Kubernetes at https://192.168.39.102:8443/ in namespace default with manifest /home/rohaan/work/repos/eclipse-jkube-sample-with-resource-fragments/target/classes/META-INF/jkube/kubernetes.yml
[INFO] k8s: Creating a ServiceAccount from kubernetes.yml namespace default name ribbon
[INFO] k8s: Created ServiceAccount: target/jkube/applyJson/default/serviceaccount-ribbon-1.json
[INFO] k8s: Creating a Service from kubernetes.yml namespace default name external-resources
[INFO] k8s: Created Service: target/jkube/applyJson/default/service-external-resources-1.json
[INFO] k8s: Creating a ConfigMap from kubernetes.yml namespace default name game-config-env-file
[INFO] k8s: Created ConfigMap: target/jkube/applyJson/default/configmap-game-config-env-file-1.json
[INFO] k8s: Creating a ConfigMap from kubernetes.yml namespace default name second
[INFO] k8s: Created ConfigMap: target/jkube/applyJson/default/configmap-second-1.json
[INFO] k8s: Creating a Deployment from kubernetes.yml namespace default name external-resources
[INFO] k8s: Created Deployment: target/jkube/applyJson/default/deployment-external-resources-1.json
[INFO] k8s: Applying Ingress my-ingress from kubernetes.yml
[INFO] k8s: HINT: Use the command `kubectl get pods -w` to watch your pods start up
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 6.821 s
[INFO] Finished at: 2021-02-09T12:37:38+05:30
[INFO] ------------------------------------------------------------------------
When I checked, I could see the ConfigMap created in the default namespace:
eclipse-jkube-sample-with-resource-fragments : $ kubectl get configmap second -o yaml
apiVersion: v1
data:
  application.properties: |
    spring.profiles.active=dev
kind: ConfigMap
metadata:
  creationTimestamp: "2021-02-09T07:07:37Z"
...
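If you also need that application.properties to reach your Spring Boot container, one option is to drop a partial Deployment fragment (for example src/main/jkube/deployment.yaml) next to the ConfigMap fragment; JKube merges such fragments into the Deployment it generates. The snippet below is only a sketch, not something from the question: the volume name and mount path are illustrative, so adjust them to wherever your application actually reads its config.
spec:
  template:
    spec:
      volumes:
        - name: app-config            # illustrative volume name
          configMap:
            name: second              # the ConfigMap created above
      containers:
        - volumeMounts:
            - name: app-config
              mountPath: /deployments/config   # assumed path; point it at your app's config location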