(base) stephen@stephen-Aspire-5250:~$ java --version
java 13.0.1 2019-10-15
Java(TM) SE Runtime Environment (build 13.0.1+9)
Java HotSpot(TM) 64-Bit Server VM (build 13.0.1+9, mixed mode, sharing)
(base) stephen@stephen-Aspire-5250:~$ find . -name h2o.jar
./R/x86_64-pc-linux-gnu-library/3.4/h2o/java/h2o.jar
Then, from R:
> h2o.init()
H2O is not running yet, starting it now...
Error in .h2o.checkJava() :
Cannot find Java. Please install the latest JRE from
http://www.oracle.com/technetwork/java/javase/downloads/index.html
I have a feeling conda is messing up the landscape, but I don't know how to fix it.
I started h2o from the terminal with:
java -jar ~/R/x86_64-pc-linux-gnu-library/3.4/h2o/java/h2o.jar
and then h2o.init() from R works. Still, I don't know why h2o cannot find Java, which is on my PATH.
H2O gets its path to the Java runtime from the JAVA_HOME environment variable, so make sure it is set properly for (or from) R if you want to use h2o.init() rather than launching H2O from a system shell (e.g. with java -Xmx1g -jar ./h2o.jar).
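For example, a minimal sketch of exporting JAVA_HOME before starting R (the JDK path below is an assumption; point it at your actual installation):

```shell
# Assumed JDK location; adjust to where your JDK actually lives
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export PATH="$JAVA_HOME/bin:$PATH"
# Start R from this same shell so that h2o.init() inherits JAVA_HOME
```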
More info
After several years of experience with H2O on Ubuntu/CentOS/RHEL, I now start H2O only from bash (issuing richly parameterized commands at H2O Docker container startup) rather than with the R or Python API functions. Starting it from those APIs led to all sorts of problems, such as using all server CPU cores (which caused huge performance degradation for inexperienced users) or exposing a passwordless REST API with root file access over the standard H2O port to the entire corporate network.
As a side note, Java 13 is supported by the latest H2O versions, but I would still recommend an LTS version, currently Java 11, for security reasons. The same of course applies to Ubuntu itself.
Related
I've installed databricks-connect on Windows 10 with the instructions here: https://docs.databricks.com/dev-tools/databricks-connect.html
After running databricks-connect configure and entering all values, I'm running databricks-connect test. This is the output I'm getting, and it hangs:
* PySpark is installed at c:\users\user\.conda\envs\myenv\lib\site-packages\pyspark
* Checking SPARK_HOME
* Checking java version
java version "1.8.0_251"
Java(TM) SE Runtime Environment (build 1.8.0_251-b08)
Java HotSpot(TM) 64-Bit Server VM (build 25.251-b08, mixed mode)
* Skipping scala command test on Windows
* Testing python command
The system cannot find the path specified.
Digging a bit deeper, it seems that the underlying pyspark package fails to initialize. It fails on this line:
from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()
When I try to run this manually, it hangs. I guess the problem is in either the local Spark installation or the required Hadoop (and winutils.exe) setup, but databricks-connect requires a fresh pyspark installation (the docs say to uninstall pyspark prior to installation).
I would be happy for any references on:
Fixing the databricks-connect issue
Fixing the underlying pyspark installation issue
I downloaded Elasticsearch today. When I try to run it, it is immediately killed with the following message:
Johnathans-MacBook-Pro:Downloads jward$ ./elasticsearch-7.6.1/bin/elasticsearch
./elasticsearch-7.6.1/bin/elasticsearch-env: line 71: 12909 Killed: 9
"$JAVA" -cp "$ES_CLASSPATH" org.elasticsearch.tools.java_version_checker.JavaVersionChecker
My java version is:
java version "11.0.6" 2020-01-14 LTS
Java(TM) SE Runtime Environment 18.9 (build 11.0.6+8-LTS)
Java HotSpot(TM) 64-Bit Server VM 18.9 (build 11.0.6+8-LTS, mixed mode)
Why won't the Elasticsearch service start?
Elasticsearch isn't able to recognize your Java version, and that's why it is failing. The error in your logs is thrown from the class below; you can have a look at the source:
https://github.com/elastic/elasticsearch/blob/master/distribution/tools/java-version-checker/src/main/java/org/elasticsearch/tools/java_version_checker/JavaVersionChecker.java#L28
All it does is check that the runtime Java version is at least 1.8. In your case it's Java 11, so the Java version isn't the problem; the problem lies in Elasticsearch not recognizing it.
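The check itself is trivial. A shell sketch of the equivalent logic (an illustrative re-implementation, not the actual Java source; the version string is hard-coded as an example):

```shell
# Parse a Java version string and require major version >= 8.
# Releases before Java 9 report as 1.x, e.g. "1.8.0_251".
ver="1.8.0_251"        # example string; the real checker reads the runtime version
major="${ver%%.*}"
if [ "$major" = "1" ]; then
    rest="${ver#1.}"
    major="${rest%%.*}"
fi
if [ "$major" -ge 8 ]; then
    echo "Java version OK: $ver"
else
    echo "Java 1.8 or later required, found $ver"
fi
```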
You need to set a proper JAVA_HOME in your ~/.zshrc if you are using the latest macOS Catalina, since it switched the default shell to zsh (so the setting moved to ~/.zshrc). I see you just mention JAVA_HOME=$(/usr/libexec/java_home), but without export in front of it the variable is not passed on to child processes. So please add the line below (note that export must be lowercase):
export JAVA_HOME=$(/usr/libexec/java_home)
After that, run source ~/.zshrc, close the terminal, reopen it, and check the output of java -version. If it shows a Java 11 version, you are good to go; run Elasticsearch again.
Hope this helps and let me know if you have further questions.
If your Java version is different from the one that comes with the Elasticsearch bundle, it won't start. Refer to the documentation below:
https://www.elastic.co/guide/en/elasticsearch/reference/current/setup.html
According to the requirements document for sonarqube:
The only prerequisite for running SonarQube is to have Java (Oracle JRE 7 onwards or OpenJDK 7 onwards) installed on your machine.
I want now to use a NetBSD 7.0 machine to run a sonarqube server.
OpenJDK8 is installed:
openjdk version "1.8.0_77-internal"
OpenJDK Runtime Environment (build
1.8.0_77-internal-pkgsrc_1.8.77-b00)
OpenJDK 64-Bit Server VM (build 25.77-b00, mixed mode)
However, SonarQube uses the Java Service Wrapper, and that software does not support NetBSD (FreeBSD is supported, but that is not close enough to serve as a working substitute).
I already tried Linux emulation mode, but a native NetBSD Java started from a Linux-emulated wrapper does not give a usable runtime environment (libc version clashes, etc.). And installing a Linux-native OpenJDK 8 to get the complete setup running in emulation mode is also not recommended.
With SonarQube 4.x (long ago) I used the WAR distribution, and that worked OK. But in this new environment I hoped to be able to use newer versions of SonarQube.
Questions:
Is there a way to bypass wrapper and start sonarqube relying on java only?
Alternatively, is there a way to get a NetBSD version of wrapper?
Would I be better off dropping SonarQube altogether, given my target platform?
The version of the Java Service Wrapper used by SonarQube does not support NetBSD. That's not as critical as it would be on MS Windows: SonarQube can easily be started as a Unix daemon without it. You just have to execute java -jar lib/sonar-application-{version}.jar from the installation directory.
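A minimal daemon-style launcher sketch along these lines (the install path, log file name, and pid file are assumptions; the version placeholder is resolved by globbing):

```shell
#!/bin/sh
# Assumed install location; adjust to your setup
SONARQUBE_HOME="${SONARQUBE_HOME:-/opt/sonarqube}"
if [ -d "$SONARQUBE_HOME" ]; then
    cd "$SONARQUBE_HOME" || exit 1
    # The glob picks up lib/sonar-application-{version}.jar
    nohup java -jar lib/sonar-application-*.jar > logs/console.log 2>&1 &
    echo $! > sonarqube.pid
    echo "started (pid $(cat sonarqube.pid))"
else
    echo "SONARQUBE_HOME not found: $SONARQUBE_HOME"
fi
```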
I'm not sure if this is a Java problem or related to Ubuntu. I'm on an Ubuntu 14.04 machine and I have both Java 6 (jdk1.6.0_43) and Java 7 (jdk1.7.0_51). JAVA_HOME is set to Java 7, and that is what my PATH environment variable refers to, so java -version gives me the following:
java version "1.7.0_51"
Java(TM) SE Runtime Environment (build 1.7.0_51-b13)
Java HotSpot(TM) 64-Bit Server VM (build 24.51-b03, mixed mode)
Now I installed JProfiler (v8.0.7) for Linux by downloading the setup script and running it (https://www.ej-technologies.com/download/jprofiler/files). But when I run JProfiler from the command line, I get an unusable GUI with all the window sizes messed up. I cannot resize the windows, so I have no access to the functionality. The Quickstart window shows up, but clicking anywhere on it makes the whole window disappear.
Any idea what is going on?
This could happen if the GraphicsConfiguration returned the wrong screen bounds. JProfiler 8.1 will add a safeguard against that case.
Update 2015-05-11
The problem was not fixed in all cases in 8.1; a more comprehensive fix will be available in JProfiler 9.
While trying to install Informix's JDBC driver, I get this error:
java -cp /home/ics/sandbox/jdbc/setup.jar run -console
The wizard cannot continue because of the following error:
could not load wizard specified in /wizard.inf (104)
I have pointed to a newer Java from Sun using:
export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_05
export PATH=$JAVA_HOME/bin:$PATH
java -version
java version "1.8.0_05"
Java(TM) SE Runtime Environment (build 1.8.0_05-b13)
Java HotSpot(TM) Server VM (build 25.5-b02, mixed mode)
Pointing to a newer Java and, as IBM/Informix support told me, getting away from OpenJDK should allow the installer to run, but that did not work. I also saw Sun's or IBM's Java suggested elsewhere when I searched for posts on this specific error. On another CentOS system, OpenJDK is installed and I can install the JDBC driver successfully.
I also tried removing tty settings from my environment, which also did not work.
Here is the SO post where this error is mentioned.
If anyone has heard of a solution, I'd love to hear it, because without the JDBC driver there's no Clojure database work with Informix, and when it is working, it works well. I have thought of tarring and zipping the good install and moving it over, but that's kind of cheating.
This problem occurs on
cat /etc/redhat-release
CentOS release 6.4 (Final)
The installed Java version (not my newer workaround version) is:
java -version
java version "1.6.0_24"
OpenJDK Runtime Environment (IcedTea6 1.11.11.90) (rhel-1.62.1.11.11.90.el6_4-i386)
OpenJDK Server VM (build 20.0-b12, mixed mode)
Until an answer arrives that allows an install, I took @Michał Niklas's suggestion and manually installed the driver under /opt/ on the new system. This worked.
I am still going to pursue the cause of this problem. I edited the OP to reflect @ceinmart's suggestion to remove tty settings. This did not work, but I found it a useful suggestion.
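The manual install amounts to placing the driver jar on the classpath yourself. A sketch, assuming the jar extracted from the installer is named ifxjdbc.jar (the target directory here is an arbitrary choice, not a required location):

```shell
# Arbitrary target directory for the manually installed driver
INSTALL_DIR="${INSTALL_DIR:-$HOME/opt/informix-jdbc}"
mkdir -p "$INSTALL_DIR"
# Copy the driver jar pulled out of setup.jar (uncomment with the real source path)
# cp ifxjdbc.jar "$INSTALL_DIR/"
# Make it visible to JVM-based tools, e.g. a Clojure project using the JVM classpath
export CLASSPATH="$INSTALL_DIR/ifxjdbc.jar:$CLASSPATH"
echo "$CLASSPATH"
```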