I am trying to go through the basic quickstart for setting up Kafka (found here: http://kafka.apache.org/07/quickstart.html). I have run the sbt update and package commands. Unfortunately, when I then run:
bin/zookeeper-server-start.sh config/zookeeper.properties
I see the error, "Could not find the main class ...QuorumPeerMain"
Similarly, when I run:
bin/kafka-server-start.sh config/server.properties
I see the error, "Could not find the main class ... kafka.Kafka"
Has anyone seen similar problems? I have checked, and it looks like the zookeeper jar and kafka jar are in the classpath (see the check below). If it helps, I have version 0.7.1 and am running on Windows. Thanks a lot!
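For reference, this is roughly how I checked that the jars were there - a sketch assuming the 0.7.x sbt layout, so the paths may differ on your checkout:
# List the kafka and zookeeper jars the start scripts should pick up.
find . -name "*.jar" | grep -Ei "kafka|zookeeper"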
I have Windows 10 and followed this guide to install Spark and make it work on my OS, as well as use it from the Jupyter Notebook tool. I used this command to instantiate the master and import the packages I needed for my job:
pyspark --packages graphframes:graphframes:0.8.1-spark3.0-s_2.12 --master local[2]
However, I later figured out that no worker was being instantiated per the aforementioned guide, and my tasks were really slow. Therefore, taking inspiration from this, and since I could not find any other way to connect workers to the cluster manager (it was run by Docker), I tried to set everything up manually with the following commands:
bin\spark-class org.apache.spark.deploy.master.Master
The master was correctly instantiated, so I continued with the next command:
bin\spark-class org.apache.spark.deploy.worker.Worker spark://<master_ip>:<port> --host <IP_ADDR>
which returned the following error:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/04/01 14:14:21 INFO Master: Started daemon with process name: 8168@DESKTOP-A7EPMQG
21/04/01 14:14:21 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[main,5,main]
java.lang.ExceptionInInitializerError
at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit>(ByteArrayMethods.java:54)
at org.apache.spark.internal.config.package$.<init>(package.scala:1006)
at org.apache.spark.internal.config.package$.<clinit>(package.scala)
at org.apache.spark.deploy.master.MasterArguments.<init>(MasterArguments.scala:57)
at org.apache.spark.deploy.master.Master$.main(Master.scala:1123)
at org.apache.spark.deploy.master.Master.main(Master.scala)
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make private java.nio.DirectByteBuffer(long,int) accessible: module java.base does not "opens java.nio" to unnamed module @60015ef5
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:357)
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297)
at java.base/java.lang.reflect.Constructor.checkCanSetAccessible(Constructor.java:188)
at java.base/java.lang.reflect.Constructor.setAccessible(Constructor.java:181)
at org.apache.spark.unsafe.Platform.<clinit>(Platform.java:56)
... 6 more
From that moment on, none of the commands I used to run before worked anymore; they all returned the error shown above. I guess I messed up some Java stuff, but honestly I do not understand what or where.
My java version is:
java version "16" 2021-03-16
Java(TM) SE Runtime Environment (build 16+36-2231)
Java HotSpot(TM) 64-Bit Server VM (build 16+36-2231, mixed mode, sharing)
I got the same error just now; the issue seems to be with the Java version.
I installed Java, Python, Spark, etc., all the latest versions, following the steps in the link below:
https://phoenixnap.com/kb/install-spark-on-windows-10
I got the same error as you, so I downloaded the Java SE 8 version from the Oracle site:
https://www.oracle.com/java/technologies/javase/javase-jdk8-downloads.html
I downloaded jdk-8u281-windows-x64.exe, installed it, and reset JAVA_HOME.
Then I started spark-shell - it opened perfectly, without any issues.
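In case it helps, this is roughly what the JAVA_HOME reset looks like from cmd.exe - the jdk1.8.0_281 path is an assumption, so use whatever directory the installer actually created:
:: Point JAVA_HOME at the JDK 8 install; open a new terminal afterwards,
:: since setx only affects future sessions.
setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_281"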
FYI: I have neither Java nor Spark experience, so if anyone feels something is wrong, please correct me. It just worked for me, so I am sharing the same solution here. :)
Thanks,
Karun
I got a similar error on macOS. The problem was with Java (I was using JDK 17); I had to downgrade or use a different version.
I ended up using this:
https://adoptium.net/releases.html?variant=openjdk11
Download and install it. You might have to remove your JDK 17 version.
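A minimal sketch of switching to the newly installed JDK on macOS, assuming the Temurin 11 package registers itself with /usr/libexec/java_home (it normally does):
# Point JAVA_HOME at an installed JDK 11 and verify which java is active.
export JAVA_HOME=$(/usr/libexec/java_home -v 11)
java -version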
Easiest solution:
The latest version of Java (JDK) is not supported by Spark.
Please try installing JDK version 8. This will solve the error.
I downloaded Confluent Platform on my local Windows machine and tried to start ZooKeeper, but it is giving me the below error:
c:\confluent>.\bin\windows\zookeeper-server-start.bat .\etc\kafka\zookeeper.properties
Classpath is empty. Please build the project first e.g. by running 'gradlew jarAll'
Confluent does not test their products on Windows, last I heard.
The recommendation is to install WSL or use the Confluent Docker containers.
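For example, a minimal sketch of the Docker route - the image tag and settings here are assumptions, so check Confluent's documentation for current values:
# Run ZooKeeper from the Confluent image instead of the Windows .bat scripts.
docker run -d --name zookeeper -p 2181:2181 -e ZOOKEEPER_CLIENT_PORT=2181 confluentinc/cp-zookeeper:latest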
I have a job in Jenkins (Jenkins is running from a container) which uses a Groovy script with these lines:
import groovy.sql.Sql;
def driver = "oracle.jdbc.pool.OracleDataSource"
when running it, I'm getting the exception:
java.lang.ClassNotFoundException: oracle.jdbc.pool.OracleDataSource
I know what the root cause is, but I don't know how to fix it.
The root cause is that all our Jenkins containers are installed without Oracle, while ojdbc7.jar is normally found on the Java classpath under the Oracle install:
classpath /oravl01/oracle/12.1.0.1/jdbc/lib/ojdbc7.jar
On a regular Jenkins server, it runs without any issues.
Any idea how to fix it?
I did 2 things:
1. Copied ojdbc7.jar to the path mentioned in java.ext.dirs (in illinXXX:XXX/systemInfo).
2. Ran the docker run command with --env classpath=[path of ojdbc7-12.1.0.2.jar] (a sketch follows).
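A hedged sketch of step 2, reusing the host path from the question - the mount point, the variable casing, and the image name are assumptions:
# Mount the host's Oracle JDBC directory into the container and point the
# CLASSPATH environment variable at the driver jar.
docker run -d \
  -v /oravl01/oracle/12.1.0.1/jdbc/lib:/opt/oracle:ro \
  --env CLASSPATH=/opt/oracle/ojdbc7.jar \
  jenkins/jenkins:lts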
I'm trying to deploy a test Java program on a Mesos cluster using Marathon. I've created a tarball with all required jars and config files. The tarball also has a cmd.sh that launches the app. cmd.sh snippet:
chmod a+rx *.jar
java -ea -Dlog4j.configuration="file:./log4j.prod.properties" -cp my-app-0.1-SNAPSHOT.jar:lib/* package.name.main.class.name
This tarball is provided as a URI. I can see in the log file that it is downloaded and unpacked correctly, but execution fails with this error:
I0331 23:00:35.135365 30558 exec.cpp:134] Version: 0.27.1
I0331 23:00:35.137852 30588 exec.cpp:208] Executor registered on slave 11aaafce-f12f-4aa8-9e5c-200b2a657225-S1
./cmd.sh: line 5: java: command not found
Any idea why it's not finding java? I'm not using any custom container. The only parameters I've set are id, cpu, mem, instance, uri, and cmd=cmd.sh.
Is this the recommended way to run Java apps? My Java program is stateless (it uses ZooKeeper for state) and I intend to run several instances of this app. Please feel free to suggest alternate ways of launching or deploying such an app.
Please check what PATH is when you run it via sh -c vs. running it from the tar.
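Along those lines, a hedged sketch of pinning java down inside cmd.sh so it no longer depends on the executor's PATH (the JAVA_HOME location is an assumption about your slave hosts):
# Resolve java explicitly instead of relying on the executor's PATH,
# which can differ from an interactive shell's.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk
export PATH="$JAVA_HOME/bin:$PATH"
command -v java || { echo "java not found" >&2; exit 1; }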
I am trying to resolve a very basic issue with the Quick Start guide for Spring XD, but have already spent more than an hour on it.
I am following guide at http://projects.spring.io/spring-xd/#quick-start
But when I try to create a stream using the following:
stream create --definition "time | log" --name ticktock --deploy
it does not find the standard module "log":
Command failed org.springframework.xd.rest.client.impl.SpringXDException: Could not find module with name 'log' and type 'sink'
I tried changing the XD_HOME value to:
/Users/sudhir/MyPrograms/spring-xd-1.2.0.RELEASE
/Users/sudhir/MyPrograms/spring-xd-1.2.0.RELEASE/spring-xd/xd
/Users/sudhir/MyPrograms/spring-xd-1.2.0.RELEASE/spring-xd
I also tried running xd-singlenode and xd-shell from XD_HOME using the complete path.
Well, in any case you should just be able to cd into
$INSTALLDIRECTORY/spring-xd-1.2.0.RELEASE/xd/bin and then run ./xd-singlenode.
cd $INSTALLDIRECTORY/spring-xd-1.2.0.RELEASE/xd/bin
./xd-singlenode
Once your singlenode container is up and running, start xd-shell through $INSTALLDIRECTORY/spring-xd-1.2.0.RELEASE/shell/bin/xd-shell:
./xd-shell
And you should at least be able to get ticktock up and running. The environment variable stuff is probably just your own setup.
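From the shell prompt, the quickstart stream from the question should then deploy:
xd:> stream create --definition "time | log" --name ticktock --deploy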