Unable to run the sample application for google cloud platform - maven

I am just starting off on google-cloud-platform. I have created an account and a project on cloud console. I was trying to run some of the sample apps provided.
I started with the sample app for cloud storage provided at:
https://github.com/GoogleCloudPlatform/java-docs-samples/tree/master/storage/cloud-client
I have installed apache maven 3.5.0 on my PC.
I followed the steps provided in the link, that is I gave the following commands:
mvn clean package -DskipTests
and then
mvn exec:java -Dexec.mainClass=com.example.storage.QuickstartSample -Dexec.args="my-bucket-name"
The first command succeeded. However, the second command failed.
I got the following error:
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:06 min
[INFO] Finished at: 2017-06-15T18:27:55+05:30
[INFO] Final Memory: 15M/172M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project storage-google-cloud-samples: An exception occured while executing the Java class. connect timed out -> [Help 1]
Now, the computer where I was running this command is behind a proxy. However, my proxy settings have been set in the file conf/settings.xml, and when I ran the first command it successfully downloaded some packages, so I'm not sure if this is a proxy issue. To check, I tried it on another machine, which is not behind a proxy.
I gave the same two commands. The first one succeeded and the second failed again, with the following (different) error:
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:02 min
[INFO] Finished at: 2017-06-15T18:22:31+05:30
[INFO] Final Memory: 13M/32M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project storage-google-cloud-samples: An exception occured while executing the Java class. 401 Unauthorized -> [Help 1]
So my queries are:
What could be the cause for the error in the first case? Is it something to do with proxy settings? If yes, then where/how should I specify the settings?
What could be the cause for the error in the second case where I am not behind any proxy?
Am I missing some step here?
Also, if you look at the source for this sample app, there is just a single file which basically creates a bucket. The bucket name is passed as an argument from the command line. Now, as per my understanding, there first needs to be a project in the cloud console before any resources can be created. So where will this bucket be created? Shouldn't we be specifying the project ID where this bucket is to be created?

TL;DR - You're missing the credentials that the example relies on to invoke the desired Google Cloud APIs. Using Application Default Credentials is the recommended approach when calling the Google Cloud APIs from any of the client libraries.
Longer version
The example relies on Application Default Credentials (as explained in the README.md of the github repo you're using).
How the Application Default Credentials work
You can get Application Default Credentials by making a single client library call. The credentials returned are determined by the environment the code is running in. Conditions are checked in the following order:
1. The environment variable GOOGLE_APPLICATION_CREDENTIALS is checked. If this variable is specified it should point to a file that defines the credentials. The simplest way to get a credential for this purpose is to create a Service account key in the Google API Console:
a. Go to the API Console Credentials page.
b. From the project drop-down, select your project.
c. On the Credentials page, select the Create credentials drop-down, then select Service account key.
d. From the Service account drop-down, select an existing service account or create a new one.
e. For Key type, select the JSON key option, then select Create. The file automatically downloads to your computer.
f. Put the *.json file you just downloaded in a directory of your choosing. This directory must be private (you can't let anyone get access to this), but accessible to your web server code.
g. Set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path of the JSON file downloaded.
2. If you have installed the Google Cloud SDK on your machine and have run the command gcloud auth application-default login, your identity can be used as a proxy to test code calling APIs from that machine.
3. If you are running in Google App Engine production, the built-in service account associated with the application will be used.
4. If you are running in Google Compute Engine production, the built-in service account associated with the virtual machine instance will be used.
If none of these conditions is true, an error will occur.
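For context, here is roughly what the QuickstartSample does (a minimal sketch rather than the exact file from the repo, assuming the google-cloud-storage client library is on the classpath):

import com.google.cloud.storage.Bucket;
import com.google.cloud.storage.BucketInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

public class QuickstartSketch {
  public static void main(String... args) {
    // getDefaultInstance() picks up Application Default Credentials.
    // The project ID is not passed on the command line; it is resolved from the
    // credentials/environment (e.g. the project_id inside the service account JSON).
    Storage storage = StorageOptions.getDefaultInstance().getService();
    String bucketName = args[0]; // e.g. "my-bucket-name"
    Bucket bucket = storage.create(BucketInfo.of(bucketName));
    System.out.printf("Bucket %s created.%n", bucket.getName());
  }
}

So once credentials are in place (for example, export GOOGLE_APPLICATION_CREDENTIALS=/path/to/your-key.json on the machine running the build), re-running the mvn exec:java command should create the bucket in the project associated with those credentials, which also answers your last question about where the bucket ends up.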

Related

Get NOT_BUILT build result in Jenkins when building maven project with selective dependencies

I have a multi-component maven project in Jenkins. This project has a Send Files over FTP Post-Build step. I have set my mvn goals to build just my desired component; not all of them:
clean install -pl component-x,component-y -P develop -X
All the dependencies in my project are built successfully;
[INFO] component-x ............................ SUCCESS [ 5.026 s]
[INFO] component-y ............................ SUCCESS [ 16.912 s]
but Jenkins says:
FTP: Current build result is [NOT_BUILT], not going to run.
EDIT 1:
Yes, I have read this issue. People's suggestions include:
Do it manually.
Use Execute Shell instead.
But there was no solution for how to do it manually.
BTW I have an FTP server which I want to put files on; it's not possible for me to use Execute Shell.
Looks like it is because of this bug: https://issues.jenkins-ci.org/browse/JENKINS-16240
You can either set the status to success manually or use Execute Shell for the FTP step instead of the plugin.
You can run a post-build Groovy script that calls manager.buildSuccess()
See here
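A minimal sketch of such a Groovy Postbuild step (assuming the Groovy Postbuild plugin, where manager is the object the plugin exposes to the script):

// Only touch the result when the selective build left it at NOT_BUILT,
// so that Publish Over FTP will agree to run afterwards.
// (Assumes this post-build step executes before the FTP publisher.)
if (manager.build.result.toString() == 'NOT_BUILT') {
    manager.buildSuccess()
}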
The problem is that the Publish over FTP plugin checks if the build was successful. Unstable is accepted, too, but NOT_BUILT isn't. If the build is not considered successful, the plugin refuses to run.
It is arguable whether that is expected behavior. The user might want to transfer files even if the build was not successful. Besides, the NOT_BUILT obviously refers only to the last build step, not to the overall result, which is still SUCCESS.
There is an issue filed under JENKINS-55816.
I've created a patch that does not check for the build result which can be downloaded from here (use at your own risk, with no warranty whatsoever).

Alfresco SDK run seems stuck at "Processing overlay"

I use the Alfresco SDK with the following command:
mvn install -Ddependency.surf.version=6.3 -Prun
All is fine, except when it gets stuck at this step of Building Alfresco Share WAR Aggregator:
[INFO] --- maven-war-plugin:2.6:war (default-war) @ share ---
[INFO] Packaging webapp
[INFO] Assembling webapp [share] in [/home/nico/aegif/projects/60_townpage/townpage-filing/townpage-filing/share/target/share-1.0-SNAPSHOT]
[info] Copying manifest...
[INFO] Processing war project
[INFO] Processing overlay [ id org.alfresco:share]
In such cases I just perform a clean and the problem is solved, but that takes time.
Is there anything I can do to avoid it getting stuck?
alfresco.version is 5.1.g
Ubuntu 2016.10 LTS
Given the parameters you are using, I assume you are on Alfresco SDK 2.2 and trying to use a more recent version of Alfresco (5.1.f or newer) in an All-In-One (AIO) project.
Using Alfresco SDK AIO projects always adds some overhead during restarts, because the SDK is actually building your modules, fetching the WARs, fetching the additional modules referenced, and applying the modules to the WARs (as in unzipping the WAR and unzipping the AMPs into the same folder before re-packaging the WAR), then starting up an embedded Tomcat with some special config from the runner project using the new WARs. A complicated approach, if you ask me, and it is definitely expected to cost a considerable amount of time and performance (especially on disk IO), especially when you clean before you rebuild...
Back to your question: the step you are hanging on is when the SDK is trying to unzip the OOTB Share WAR prior to applying AMPs to it, and there are a lot of reasons why things could go south there! Unless you provide some more detailed output (as in adding -X or -e to your mvn command) I doubt anyone would be able to pin down precisely what is going wrong.
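For example, simply extending the command from the question:
mvn install -Ddependency.surf.version=6.3 -Prun -X -e
-e prints full stack traces and -X enables Maven's debug output, which should give a much better idea of where the overlay processing hangs.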
Be careful with running your project without cleaning, as you might end up with some residual files that give you a different behaviour from the one to be expected from the final artifacts... I can imagine at least a couple of such scenarios!
Alternatively, may I suggest that you switch from the AIO approach to separate projects for Repo and Share? You can install multiple Tomcats on your machine: say, a Tomcat for the repo on port 8080 and a Tomcat for Share on port 8081. Then you can develop on one tier while a Tomcat service provides the other (stop the Share Tomcat service and start up a Share AMP from the SDK pointing to the local Alfresco Repo service on the other locally installed Tomcat). That way you can always rapidly clean and run, using this command for running Share:
mvn clean install -PampToWar -Dmaven.tomcat.port=8081 -Ddependency.surf.version=6.3

Jenkins "POM_VERSION" variable does not reset from build to build

Following the Jenkins documentation (jenkins docs), I can use the POM_VERSION environment variable to get the current version from the POM file.
But for some reason the POM_VERSION environment variable is not being refreshed from build to build. For example:
I changed the version number manually, but the version that was exported was the last one from the previous build.
From my python script:
print 'Current version is ' + os.environ['POM_VERSION']
which gives the following log: Current version is 0.1.5, which is clearly wrong because I changed it. You can further see it in my Maven versions goal output:
[INFO]
[INFO] --- versions-maven-plugin:2.1:set (default-cli) @ ep-reporter ---
[INFO] Searching for local aggregator root...
[INFO] Local aggregation root: /var/lib/jenkins/jobs/exchange-planner-reporter/workspace
[INFO] Processing com.exelate:ep-reporter
[INFO] Updating project com.company:ep-reporter
[INFO] from version 0.1.8 to 0.1.6
Props: {project.version=0.1.6, project.artifactId=ep-reporter, project.groupId=com.company}
Note that it went up from 0.1.5 because I have a script that advances it, but Maven clearly states that it's changing it from 0.1.8 to 0.1.6, which is not what POM_VERSION says.
Thanks.
That environment variable is set at the time Jenkins first reads your POM file. You are clearly changing it during the build.
If you are changing it during the build, you know what you are changing to, and you should use the same mechanism to display it later.
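If you need the version that is actually in the POM at a given point in the build (rather than the value Jenkins captured when it first read the POM), one option is to re-evaluate it at that point with the Maven Help plugin, for example:
mvn help:evaluate -Dexpression=project.version
The resolved version appears in the command's output; extracting the exact line takes a bit of scripting, and treat this as a sketch rather than the only way to do it.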
I had the same problem and I resolved it by setting the "Check-out Strategy" option to "Always checkout a fresh copy" in the Source Code Management section.

Maven - dynamic pom file?

I'm using the Maven plugin was6-maven-plugin to deploy to websphere. When installing an application, there's a configuration value named "updateExisting" that should be false if I am installing a new application, and true if I am updating an existing application. I don't like having to manually toggle this value if I am fresh-installing/updating the application.
The way I see it, I could add an uninstallApp goal to always uninstall the application before installing it, but this seems a rather silly way to do it.
I've noticed that this plugin also has a goal wsListApps that outputs all applications installed on the server. The output looks like this:
[INFO] [wsadmin] WASX7209I: Connected to process "server1" on node 1234Node02 using SOAP connector; The type of process is: UnManagedProcess
[INFO] [wsadmin] DefaultApplication
[INFO] [wsadmin] IBMUTC
[INFO] [wsadmin] MyApplicationEAR
[INFO] [wsadmin] ivtApp
[INFO] [wsadmin] query
Is it possible for Maven to scan this output for the string "MyApplicationEAR" and set "updateExisting" to "true" if it is found, and leave it "false" otherwise?
What you need is to be able to update a Maven property during the lifecycle, before the phase bound to your was6-maven-plugin (and to use this property as the value for <updateExisting>).
Unfortunately, Maven properties are static and cannot be changed at runtime. So at first sight it's impossible to do.
But there is a plugin, properties-maven-plugin, that you can use to define new properties at runtime. The value of the property can be defined by a Groovy script. Now the question is more about how you can write a Groovy script that tells whether your app is already there or not.
Honestly, I don't know if it's a good idea to use it. I think running the uninstall goal every time with failOnError set to false is probably the simplest way (and so probably the best, but maybe I am missing something?)

Jenkins multi configuration passing parameters when running Maven job

I'm trying to set up a Jenkins multi-configuration job for the selenium tests of my project that run against multiple browsers. I checked the different options and the multi-configuration job seems to be a good fit, but I cannot get the parameters passed correctly to Maven.
I have a few parameters I need to pass to maven, mainly browserName and appDomain, and also a Profile to run the tests. To configure the job I do the following:
Define the SVN repository from where the code will be checked out.
Set up browserName as a user-defined axis, with values FIREFOX, CHROME, IE.
Create a build step of type "Invoke top-level Maven targets", and here's where I get the problems. The configuration of this part is different from other job types: usually there's a field called Goals and Options where you put everything, but in this case it is divided into different fields. So I don't know exactly where to put the properties and the profile.
a) The logical thing: I put the goals in the Goals field and the parameters and options in the Properties field, like in the image:
In this case the job runs normally but without executing the tests, because the profile is not activated.
b) If I put just the profile in the Goals field, the maven call in the log is:
/opt/apache-maven-2.2.1/bin/mvn -DbrowserName=CHROME "-D-Dappdomain=0 -Dtestenv=test -Drc=true -DsuiteXmlFile=testOne.xml -U -Dapp.instance.key=jenkins -Denv=default" clean verify -Pwebtests
And the exception is:
[INFO] [enforcer:enforce {execution: enforce-property}]
[WARNING] Rule 0: org.apache.maven.plugins.enforcer.RequireProperty failed with message:
You must pass the appdomain as parameter! Example: -Dappdomain=20
[WARNING] Rule 1: org.apache.maven.plugins.enforcer.RequireProperty failed with message:
You must pass the test environment as parameter! Example: -Dtestenv=beta
So it is not getting the properties.
c) Finally, if I put everything in the goals field, I get the following exception:
[ERROR] BUILD ERROR
[INFO] ------------------------------------------------------------------------
[INFO] One or more required plugin parameters are invalid/missing for 'property:merge'
[0] Inside the definition for plugin 'property-maven-plugin' specify the following:
<configuration>
...
<environment>VALUE</environment>
</configuration>
-OR-
on the command line, specify: '-Denv=VALUE'
I tried with a normal and a parametrized job and it works perfectly...
Jenkins version is 1.454 and Maven is 2.2.1
I found out that the Jenkins machine wasn't properly configured. I tried on another instance and all was good with the following configuration:
-Goals: clean verify -Pwebtests
-Parameters: (properties file format)
appDomain=0
testenv=test
env=default
....
And as a side note, the other jobs were working because they were using the Jenkins Maven plugin, which seems to use Java to launch a Hudson class that calls Maven, instead of calling the mvn command directly, which is what happens with a build step of type "Invoke top-level Maven Targets".
