How to stop Spark master on Windows?

To start the master on Windows 10, use:
spark-class org.apache.spark.deploy.master.Master
What is the command to stop it?
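A minimal sketch of one way to stop it, assuming the master was launched from a console and is listening on the default port 7077: either press Ctrl+C in that console, or find the backing Java process and kill it by PID.
REM Find the PID of the process listening on the master port (7077 by default)
netstat -ano | findstr :7077
REM Terminate it by PID (replace 1234 with the PID reported by netstat)
taskkill /PID 1234 /F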

Related

Failed to start master for spark in windows 10

I am new to Spark, and I am trying to start the master manually (using MINGW64 on Windows 10).
When I do this:
~/Downloads/spark-1.5.1-bin-hadoop2.4/spark-1.5.1-bin-hadoop2.4/sbin
$ ./start-master.sh
I got these logs:
ps: unknown option -- o
Try `ps --help' for more information.
starting org.apache.spark.deploy.master.Master, logging to /c/Users/Raunak/Downloads/spark-1.5.1-bin-hadoop2.4/spark-1.5.1-bin-hadoop2.4/sbin/../logs/spark--org.apache.spark.deploy.master.Master-1-RINKU-CISPL.out
ps: unknown option -- o
Try `ps --help' for more information.
**failed to launch org.apache.spark.deploy.master.Master:**
Spark Command: C:\Program Files\Java\jre1.8.0_77\bin\java -cp C:/Users/Raunak/Downloads/spark-1.5.1-bin-hadoop2.4/spark-1.5.1-bin-hadoop2.4/sbin/../conf\;C:/Users/Raunak/Downloads/spark-1.5.1-bin-hadoop2.4/spark-1.5.1-bin-hadoop2.4/lib/spark-assembly-1.5.1-hadoop2.4.0.jar;C:\Users\Raunak\Downloads\spark-1.5.1-bin-hadoop2.4\spark-1.5.1-bin-hadoop2.4\lib\datanucleus-api-jdo-3.2.6.jar;C:\Users\Raunak\Downloads\spark-1.5.1-bin-hadoop2.4\spark-1.5.1-bin-hadoop2.4\lib\datanucleus-core-3.2.10.jar;C:\Users\Raunak\Downloads\spark-1.5.1-bin-hadoop2.4\spark-1.5.1-bin-hadoop2.4\lib\datanucleus-rdbms-3.2.9.jar -Xms1g -Xmx1g org.apache.spark.deploy.master.Master --ip RINKU-CISPL --port 7077 --webui-port 8080
What am I doing wrong? Do I also have to configure a Hadoop package for Spark?
Just found the answer here: https://spark.apache.org/docs/1.2.0/spark-standalone.html
"Note: The launch scripts do not currently support Windows. To run a Spark cluster on Windows, start the master and workers by hand."
I think Windows is not a good choice for Spark, but anyway, good luck!
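For reference, a sketch of starting the master and a worker by hand in Windows consoles, as the docs suggest (the hostname below is taken from the log above; adjust it and the port for your machine):
REM Start the master; its web UI defaults to port 8080
spark-class org.apache.spark.deploy.master.Master
REM In a second console, start a worker and point it at the master URL
spark-class org.apache.spark.deploy.worker.Worker spark://RINKU-CISPL:7077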

How to Start and Stop Cloudera Cluster CDH5 Using Command Line or Shell Script

I have installed Cloudera Cluster on AWS EC2 instances.
I can easily start or stop it using Cloudera Manager.
But now I want to make a shell script that can start or stop it.
What is the command line to start and stop the cluster and all its services?
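No command was posted here, but a rough sketch of one approach using Cloudera Manager's own daemons and its REST API (the hostname, credentials, API version, and cluster name below are placeholders to adjust for your setup):
# Start the Cloudera Manager server and agent daemons on their hosts
sudo service cloudera-scm-server start
sudo service cloudera-scm-agent start
# Start or stop every service in a named cluster through the Cloudera Manager REST API
curl -u admin:admin -X POST "http://cm-host:7180/api/v10/clusters/Cluster%201/commands/start"
curl -u admin:admin -X POST "http://cm-host:7180/api/v10/clusters/Cluster%201/commands/stop"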

Start hbase in CDH5 VM in standalone mode

How can I start HBase in standalone mode in a CDH5 VM? In the CDH3 VM, I used to run
'sudo sh start-hbase.sh'
in the below path:
/usr/lib/hbase/bin
But I can only see 'start-hbase.cmd' in the above path in the CDH5 VM. Please let me know how I can start my HBase instance by invoking that '.cmd' file.
We can use the following command to start a service in the CDH5 VM:
sudo service <service-name> start
e.g.:
sudo service zookeeper-server start
Alternatively, we can go to
/etc/init.d
and execute the scripts there directly.
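For HBase specifically, a sketch using the same pattern, assuming the standard CDH5 package service names (ZooKeeper has to be up before the HBase daemons):
# ZooKeeper first
sudo service zookeeper-server start
# Then the HBase daemons shipped with CDH5
sudo service hbase-master start
sudo service hbase-regionserver start
# Check or stop them the same way
sudo service hbase-master status
sudo service hbase-master stop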

Unable to start Mesos slave on single node cluster

From what I know, I am able to set up the Mesos master, slave, ZooKeeper, and Marathon on a single node.
But once I execute the command to start mesos-master, it keeps running in the foreground, so I have no way to execute the other commands anywhere else. If I stop it to run the next command, mesos-master is no longer running.
Don't execute the commands directly from your shell; you want to start all of those components (zookeeper, mesos-master, mesos-slave, and marathon) as services.
/etc/init.d/zookeeper start
start mesos-master
start mesos-slave
start marathon
I forget whether ZooKeeper creates the init script for you as part of the install; you may have to find it in the Hadoop docs.
As for the other three, they all use Upstart, and you can find the configuration files in /etc/init/.
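To check on or stop those services later, a quick Upstart sketch (same service names as above; the log location is a typical default for the Mesosphere packages, not guaranteed):
# See which Upstart jobs are defined and their current state
initctl list | grep -E 'zookeeper|mesos|marathon'
# Check or stop an individual service
sudo status mesos-master
sudo stop mesos-slave
# If a service refuses to start, look at its logs
ls /var/log/mesos/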

Could not find and execute start-all.sh and Stop-all.sh on Cloudera VM for Hadoop

How do I start / stop services from the command line in CDH4? I am new to Hadoop. I installed the VM from Cloudera and could not find start-all.sh and stop-all.sh. How do I stop or start the TaskTracker or DataNode if I want to? It is a single-node cluster which I am using on CentOS. I haven't made any modifications.
Moreover, I see there are changes in the directory structures across flavours. I could not locate these .sh files on the VM for my installation.
[cloudera@localhost ~]$ stop-all.sh
bash: stop-all.sh: command not found
Highly appreciate your support.
Use sudo su hdfs to start, and to stop just type exit; it will stop all the services.
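For completeness, a sketch of the per-daemon approach the CDH4 packages support: each daemon gets its own init script under /etc/init.d, so you can start or stop them individually or loop over them (the exact script names depend on which roles are installed, e.g. MRv1 vs. YARN):
# List the Hadoop-related init scripts installed on this node
ls /etc/init.d/ | grep hadoop
# Start or stop an individual daemon, e.g. the DataNode or TaskTracker
sudo service hadoop-hdfs-datanode start
sudo service hadoop-0.20-mapreduce-tasktracker stop
# Stop every Hadoop daemon on the node in one go
for svc in /etc/init.d/hadoop-*; do sudo "$svc" stop; done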
