JPS command shows only JPS - hadoop

I installed Hadoop and tried to run it. The terminal shows that everything has started, but when I run the jps command it shows only Jps. I am new to Ubuntu and we need to use Hadoop for academics. Can anyone help me get it running?
I installed java using sudo apt-get install open-jdk
My /usr/lib/jvm directory looks like this:
The following are my hadoop configuration files:

It's probably due to the users you are using. It looks like start-all.sh was run as one user and jps as a different user. Run both commands as the same user.
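As a quick check (a sketch; jps only reports JVMs owned by the user who invokes it, so daemons started as another user will not appear):

```shell
# jps only lists Java processes owned by the invoking user, so daemons
# started as another user (e.g. via sudo) will not show up. A
# user-independent check via ps:
ps -ef | grep -i 'namenode\|datanode' | grep -v grep \
  || echo "no Hadoop daemons found for any user"
# Confirm which user you are running commands as:
whoami
```

If ps shows the daemons under a different user than whoami reports, rerun jps as that user.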

Related

error while installing kylo specific services for nifi

I am trying to install Kylo 0.8.4.
There is a step to install Kylo-specific components after installing NiFi, using the command
sudo ./install-kylo-components.sh /opt /opt/kylo kylo kylo
but I am getting the following error:
Creating symlinks for NiFi version 1.4.0.jar compatible nars
ERROR: spark-submit not on path. Has spark been installed?
I have Spark installed. Any help would be appreciated.
The script calls which spark-submit to check whether Spark is available. If it is, it uses spark-submit --version to determine the installed version of Spark.
The error indicates that spark-submit is not on the system path. Please execute which spark-submit on the command line and check the result.
If spark-submit is not on the system path, you can fix it by updating the PATH variable in your .bash_profile to include the bin directory of your Spark installation.
As a next step, you can verify the installed version of Spark by running spark-submit --version.
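As a sketch, assuming Spark is installed under /opt/spark (substitute your actual installation directory), the PATH update could look like this:

```shell
# Assumed location; replace /opt/spark with your real Spark install directory.
echo 'export SPARK_HOME=/opt/spark' >> ~/.bash_profile
echo 'export PATH="$SPARK_HOME/bin:$PATH"' >> ~/.bash_profile
. ~/.bash_profile            # reload, or open a new terminal

# Verify that the binary now resolves:
which spark-submit || echo "spark-submit still not on PATH"
```

Once which spark-submit prints a path, rerun the install-kylo-components.sh step.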

Run HDFS as sudo user

Due to a script I am using to install Hadoop, I have to run all commands as sudo. I am trying to start HDFS by typing "start-dfs.sh", but it says
sudo: start-dfs.sh: command not found
I also typed the full path, i.e. sudo /home/ubuntu/hadoop-2.2.0/sbin/start-dfs.sh, but that did not work either. Does anyone know how I can start the HDFS service as sudo? Thanks in advance!
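One likely cause (a hedged sketch, not verified against this exact setup): sudo replaces the caller's PATH with the secure_path setting from /etc/sudoers, so scripts under a home directory are not found, and even with a full path the script may lose environment variables such as JAVA_HOME. Preserving the caller's environment is one workaround:

```shell
# sudo normally resets PATH to secure_path (see /etc/sudoers), which is why
# start-dfs.sh is not found. Compare the caller's PATH:
echo "caller PATH: $PATH"
# Workarounds (paths taken from the question; shown here, not executed):
#   sudo env "PATH=$PATH" start-dfs.sh
#   sudo -E /home/ubuntu/hadoop-2.2.0/sbin/start-dfs.sh
```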

Namenode is not running without errors in cygwin

I have installed Hadoop 2.2, SSH, and Java in my Cygwin environment. When I start the NameNode with sbin/hadoop-daemon.sh start namenode, I get no errors, and when I run the jps command right afterwards I can see the NameNode running. But after about 10 seconds it is no longer running.

Not able to run Hadoop daemons

When I run the jps command:
I only see jps as the running java program in return.
When I run the start-all.sh command, I receive errors like: Connection to port 22 refused
Hadoop's start-all.sh script uses SSH to manage its services. It looks like your computer/cluster:
has no SSH daemon installed. You can install it on Ubuntu with sudo apt-get install ssh.
(less likely) sshd uses a non-default port. Check your sshd configuration.
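A quick way to test the first possibility (a sketch; uses bash's /dev/tcp feature, so run it under bash):

```shell
# Probe port 22 on localhost; start-all.sh needs to reach sshd there.
if (exec 3<>/dev/tcp/localhost/22) 2>/dev/null; then
  echo "sshd reachable on port 22"
else
  echo "no sshd listening on port 22 (install with: sudo apt-get install ssh)"
fi
```

If sshd is reachable but start-all.sh still fails, check the Port directive in /etc/ssh/sshd_config for a non-default port.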

JPS command not showing all the working daemons of hadoop

I have installed Hadoop via Cygwin on Windows 7. All Hadoop daemons have started, but jps fails to show the TaskTracker, SecondaryNameNode, and DataNode. I checked the logs, and all appear to be working fine. What could be the problem?
Thanks in advance.
