Spring: Why can't I check the Spring Boot version without an error? - spring-boot

I am following the guide for setting up the Spring Boot CLI at the link below, section 10.2.2:
http://docs.spring.io/spring-boot/docs/1.4.1.RELEASE/reference/htmlsingle/#getting-started-installing-the-cli
When I type
$ spring --version
I receive the error below:
/cygdrive/c/Users/Jesse/Documents/.sdkman/candidates/springboot/current/bin/spring: line 83: [: C:\Program: binary operator expected
Error: Could not find or load main class org.springframework.boot.loader.JarLauncher

You need to set the SPRING_HOME variable.
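The error on line 83 suggests the script is seeing an unquoted Windows-style path containing a space (C:\Program ...), so under Cygwin or Git Bash it likely helps to point the variable at a POSIX-style path; a minimal sketch, reusing the install location from the error message above (adjust to your own install):
# POSIX-style path avoids the space in "C:\Program Files" tripping the [ test
export SPRING_HOME="/cygdrive/c/Users/Jesse/Documents/.sdkman/candidates/springboot/current"
export PATH="$SPRING_HOME/bin:$PATH"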
Even after setting it, SPRING_HOME was not resolving correctly for me, despite being set in Windows as both a user and a system variable and being visible when running export via Git Bash. I ended up replacing the last line in my spring.sh file, essentially forcing the classpath for the java command:
"${JAVA_HOME}/bin/java" ${JAVA_OPTS} -cp "/drive_letter/dir/to/spring/spring-x.x.x.RELEASE/lib/spring-boot-cli-x.x.x.RELEASE.jar" org.springframework.boot.loader.JarLauncher "$@"

Related

Configuring Spring XD for Hortonworks

I am trying to configure Spring XD to use Hadoop (Hortonworks), but when I execute this line in a terminal, "./xd-singlenode --hadoopDistro hadoop11", I get the error: 'hadoop11' is not a valid value for option --hadoopDistro. Possible values are [cdh5, hdp22, phd21, hadoop27, phd30].
The error message is pretty clear: the value you are passing for --hadoopDistro is not supported. For a Hortonworks configuration you should use hdp22.
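For example, the invocation from the question becomes (same single-node launcher, only the distro value changes):
./xd-singlenode --hadoopDistro hdp22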
Regards

AdminTask.listTCPEndPoints('abc(abc)') throws exception: ADMF0007E: target object is required

I'm working on deploying an application to WebSphere 7 using a Python script, and the script throws an exception at this line:
AdminTask.listTCPEndPoints('abc(abc)')
If I run the above command by itself before running the Python script, it behaves as expected: it gives me the error ADMF0003E: Invalid parameter value. But the same command fails inside the Python script with this error:
wsadmin>AdminTask.listTCPEndPoints('abc(abc)')
WASX7015E: Exception running command: "AdminTask.listTCPEndPoints('abc(abc)')"; exception information: com.ibm.websphere.management.cmdframework.CommandValidationException: ADMF0007E: target object is required.
I can guess that there is something in the Python script causing this issue, but I don't understand why the AdminTask.listTCPEndPoints command is not able to see the parameter being passed. I'm new to WebSphere; I have used it in the past but never configured it. Any help/insight would be highly appreciated.
Thanks!
Added the output of the interactive mode option:
wsadmin>print AdminTask.listTCPEndPoints('-interactive')
List NamedEndPoints that can be used by a TCPInboundChannel
Lists all NamedEndPoints that can be associated with a TCPInboundChannel
*TCPInboundChannel: abc(abc)
excludeDistinguished (excludeDistinguished): 0
WASX7435W: Value 0 is converted to a boolean value of false.
unusedOnly (unusedOnly): 0
WASX7435W: Value 0 is converted to a boolean value of false.
List NamedEndPoints that can be used by a TCPInboundChannel
F (Finish)
C (Cancel)
Select [F, C]: [F] F
WASX7278I: Generated command line: AdminTask.listTCPEndPoints('[-excludeDistinguished false -unusedOnly false]')
WASX7015E: Exception running command: "AdminTask.listTCPEndPoints('-interactive')"; exception information:
com.ibm.websphere.management.cmdframework.CommandValidationException: ADMF0007E: target object is required.
It appears that you have not specified the target object; that is why the error occurs.
I suggest using the following command as a starting point:
print AdminTask.listTCPEndPoints('-interactive')
Note: instead of copying and pasting the command, type it on the command line; sometimes the command editor does not accept a command pasted directly.
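For reference, a minimal sketch of supplying a real target object, assuming hypothetical node and server names (myNode, server1) and that the server has at least one TCPInboundChannel configured:
# hypothetical node/server names; adjust to your cell topology
serverId = AdminConfig.getid('/Node:myNode/Server:server1/')
# take the first TCPInboundChannel configured on that server
channel = AdminConfig.list('TCPInboundChannel', serverId).splitlines()[0]
print AdminTask.listTCPEndPoints(channel)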
Okay, I was able to fix the error. I was getting it because, as part of the application deployment script, I was copying a few of my application JARs to WebSphere's java/jre/lib/ext directory so that they would be available on the classpath. In one of those JARs I had bundled an IBM class (Base64Coder.class) that a class in my JAR required, and it was corrupting the WebSphere AdminTask utility. When I removed Base64Coder.class from my JAR, the Python script worked fine. I believe the reason it corrupted WebSphere is that the same class was duplicated in the JVM: it ships with the IBM WebSphere installation and is present in AppServer/runtimes/com.ibm.ws.webservices.thinclient_7.0.0.jar.

Unrecognized option: --spring.profiles.active=prod on OpenShift

When I deployed my JHipster web application to OpenShift, the app page showed "503 Service Temporarily Unavailable", and when I looked at the log files I found the following problem:
==> app-root/logs/mapp.log <==
Unrecognized option: --spring.profiles.active=prod
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
Is there a solution you know of?
That is not an option recognized by the JVM. In order to set system properties, use -D. From the Java 7 reference:
-Dproperty=value
Sets a system property value.
If value is a string that contains spaces, then you must enclose the
string in double quotation marks:
java -Dmydir="some string" SomeClass
In this instance you would use:
-Dspring.profiles.active=prod
It can then be acquired in your application by using the System class:
System.getProperty("spring.profiles.active")
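Putting it together, the full launch command would look something like this (the WAR name here is hypothetical; note the -D flag must come before -jar so the JVM, not your application, parses it):
java -Dspring.profiles.active=prod -jar myapp.war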

Setting elasticsearch properties in spark-submit

I'm trying to launch Spark jobs that use Elasticsearch input from the command line using spark-submit, as described in http://www.elasticsearch.org/guide/en/elasticsearch/hadoop/current/spark.html
I'm setting the properties in a file, but when launching spark-submit it gives the following warnings:
~/spark-1.0.1-bin-hadoop1/bin/spark-submit --class Main --properties-file spark.conf SparkES.jar
Warning: Ignoring non-spark config property: es.resource=myresource
Warning: Ignoring non-spark config property: es.nodes=mynode
Warning: Ignoring non-spark config property: es.query=myquery
...
Exception in thread "main" org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException: Connection error (check network and/or proxy settings)- all nodes failed
My config file looks like this (with the real values in place):
es.nodes nodeip:port
es.resource index/type
es.query query
Setting the properties in the Configuration object in the code works, but I need to avoid this workaround.
Is there a way to set those properties via command line?
I don't know if you resolved your issue (if so, how?), but I found this solution:
import org.elasticsearch.spark.rdd.EsSpark
// write the RDD to the "spark/docs" index/type, overriding es.nodes per call
EsSpark.saveToEs(rdd, "spark/docs", Map("es.nodes" -> "10.0.5.151"))
Bye
When you pass a config file to spark-submit, it only loads configs that start with 'spark.'
So, in my config I simply use
spark.es.nodes <es-ip>
and in the code itself I have to do
import org.apache.spark.SparkConf

val conf = new SparkConf()
// copy the value into the es.nodes key the elasticsearch-hadoop connector reads
conf.set("es.nodes", conf.get("spark.es.nodes"))

Debug jboss app in Intellij idea

I am using IntelliJ IDEA and a JBoss server.
When I run it, it works fine, but when I debug it, I get:
ERROR: transport error 202: connect failed: Connection refused
ERROR: JDWP Transport dt_socket failed to initialize, TRANSPORT_INIT(510)
JDWP exit error AGENT_ERROR_TRANSPORT_INIT(197): No transports initialized [../../../src/share/back/debugInit.c:690]
FATAL ERROR in native method: JDWP No transports initialized, jvmtiError=AGENT_ERROR_TRANSPORT_INIT(197)
Disconnected from server
What is the problem?
Make sure the JBoss VM is running with the parameters IDEA shows in the Debug dialog; specifically, the port number seems incorrect to me.
If you're running on OS X Mountain Lion, you could try popping -d64 into the VM options; that seemed to work for me. Not passing the variables, I think, will just prevent you from debugging.
Unchecking the pass-variables option didn't work for me. What I ended up doing was running JBoss separately and using a 'Remote JBoss' configuration in IntelliJ.
To make debugging work, I ended up adding the following lines to standalone.sh:
DEBUG_JAVA_OPTS="-Xdebug -Xrunjdwp:transport=dt_socket,address=localhost:62307,suspend=n,server=y "
JAVA_OPTS="${DEBUG_JAVA_OPTS} $JAVA_OPTS"
The value of DEBUG_JAVA_OPTS has to be copied and pasted from the IntelliJ dialog box. These lines have to be inserted into the script right after the place where the script sets up JAVA_OPTS.
Update:
Looks like the IDE generates a new port number for each project, so you have to edit the standalone.sh file every time you switch projects. Hopefully somebody can suggest a fix for this.
Another option might be to comment out the 'if' statement in standalone.conf as below.
Note that the first $JAVA_OPTS is added (it is the one passed in from IntelliJ); this way there is no need to remember to change the port number for every project.
# Specify options to pass to the Java VM.
#
#if [ "x$JAVA_OPTS" = "x" ]; then
JAVA_OPTS="$JAVA_OPTS -Xms64m -Xmx512m -XX:MaxPermSize=256m -Djava.net.preferIPv4Stack=true -Dorg.jboss.resolver.warning=true -Dsun.rmi.dgc.client.gcInterval=3600000 -Dsun.rmi.dgc.server.gcInterval=3600000"
JAVA_OPTS="$JAVA_OPTS -Djboss.modules.system.pkgs=$JBOSS_MODULES_SYSTEM_PKGS -Djava.awt.headless=true"
JAVA_OPTS="$JAVA_OPTS -Djboss.server.default.config=standalone.xml"
#else
#echo "JAVA_OPTS already set in environment; overriding default settings with values: $JAVA_OPTS"
#fi
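Alternatively, you can avoid per-project edits by pinning the debug port yourself and creating a matching 'Remote' run configuration in IntelliJ; a minimal sketch for standalone.sh, assuming port 5005 (an arbitrary choice) is free on your machine:
# fixed JDWP debug port; point IntelliJ's Remote configuration at localhost:5005
DEBUG_JAVA_OPTS="-Xdebug -Xrunjdwp:transport=dt_socket,address=5005,suspend=n,server=y"
JAVA_OPTS="${DEBUG_JAVA_OPTS} $JAVA_OPTS"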
