Cannot install Storm on HDP 2.2 Sandbox

When I access the Ambari dashboard on the HDP 2.2 Sandbox VM at http://127.0.0.1:8080/, Ambari shows all the services with a "?" symbol, including Storm. When I try to install, I can't find the install wizard link anywhere, and if I go to http://127.0.0.1:8080/#/installer/step0 it just redirects back to the dashboard.
In the tutorial they simply open the dashboard and everything is already set up, but I'm unable to start my Storm cluster.
Thanks in advance.

You shouldn't have to install anything.
On the Services (or Hosts) page, you should find an Actions button that lets you stop and then start all services. Sometimes a particular service doesn't start cleanly and you'll need to dig into why that's happening.
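If the UI itself is misbehaving, the same stop/start can also be driven through Ambari's REST API with curl. A minimal sketch, assuming the default admin/admin login and that the sandbox cluster is named Sandbox (list /api/v1/clusters first if unsure):

    # Confirm the cluster name (assumption: default admin/admin credentials)
    curl -u admin:admin http://127.0.0.1:8080/api/v1/clusters

    # Stop all services ("INSTALLED" = installed but stopped)
    curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
      -d '{"RequestInfo":{"context":"Stop All Services"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
      http://127.0.0.1:8080/api/v1/clusters/Sandbox/services

    # Start them all back up
    curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
      -d '{"RequestInfo":{"context":"Start All Services"},"Body":{"ServiceInfo":{"state":"STARTED"}}}' \
      http://127.0.0.1:8080/api/v1/clusters/Sandbox/services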

Related

Kibana service is running but can not access via browser to console

Good afternoon, community,
I have the following problem, which is surely something silly, but I can't figure it out. I have a Debian 9 machine with ELK installed (Elasticsearch, Logstash, and Kibana), configured exactly as described in the documentation. The installed versions are 6.x.
Every one of the services starts correctly; the problem comes when accessing the Kibana console through port 5601. When I open the URL in the browser, I get a message that says: "Kibana server is not ready yet".
I haven't configured much beyond what the official documentation says, so I don't understand why the console won't come up so I can start configuring it.
Thanks in advance.
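(For reference: "Kibana server is not ready yet" in 6.x typically means Kibana started but cannot reach Elasticsearch. A quick sketch of checks, assuming the default localhost ports and package paths:)

    # Is Elasticsearch answering locally? (default port 9200)
    curl -s http://localhost:9200/_cluster/health?pretty

    # What does Kibana itself report? (/api/status exists in 6.x)
    curl -s http://localhost:5601/api/status

    # Check which Elasticsearch URL Kibana was told to use
    grep -i elasticsearch /etc/kibana/kibana.yml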

Can't access ambari UI

I want to use Hadoop. Unfortunately, I cannot access the Ambari login. How do I fix this?
Looks like you have some other web server running on port 8080 for Apache + Postgres
Also, the link in the VM says 1080. Can you get there? Can you access the other UIs on the HDP quickstart dashboard? If so, then Ambari isn't the issue.
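A quick way to see what is actually bound to port 8080 on the VM (a sketch; whether ss or netstat is present varies by image):

    # Show the process listening on 8080
    sudo ss -ltnp | grep ':8080'     # or: sudo netstat -ltnp | grep ':8080'

    # And check whether Ambari Server itself thinks it is running
    sudo ambari-server status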

h2o cluster comes up successfully, but web ui won't work

I am trying to install H2O on the Hortonworks Sandbox 2.4 by following http://www.h2o.ai/download/h2o/hadoop. Everything runs well; I see the messages
"Blocking until the H2O cluster shuts down..." and "open h2o web flow through 10.0.2.15:54321".
But when I go to that page, it doesn't load and gives an ERR_CONNECTION_TIMED_OUT error.
What should I do to connect to the H2O web page?
Thanks.
This sounds like it could be an issue with using a private vs. public IP. See if you can ping 10.0.2.15. If that times out, the machine is not reachable. If you can indeed reach the machine, then see if you can reach the service: wget http://10.0.2.15:54321
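A sketch of those checks, plus a port-forward for the common case where the sandbox runs under VirtualBox NAT (10.0.2.15 is the usual VirtualBox NAT guest address, which is not reachable from the host without forwarding; the VM name below is an assumption, check VBoxManage list vms):

    # From the machine running the browser
    ping -c 3 10.0.2.15
    wget -qO- http://10.0.2.15:54321 | head

    # If the sandbox is a VirtualBox NAT guest, forward the port on the host
    # (VM name "Hortonworks Sandbox" is an assumption)
    VBoxManage controlvm "Hortonworks Sandbox" natpf1 "h2o,tcp,,54321,,54321"
    # ...then browse to http://127.0.0.1:54321 on the host instead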

Ambari Metrics Collector error

We have a 5-node Hortonworks cluster with Ambari monitors installed on all nodes and the Metrics Collector installed on the master node.
I am getting: Connection failed: [Errno 111] Connection refused to 0.0.0.0:6188
Please find the error attached:
https://drive.google.com/file/d/0B85rPUe3-QPXbXJSRzJmdUwwQU0/view?usp=sharing
I followed the document below and tried removing the service and adding it back:
https://cwiki.apache.org/confluence/display/AMBARI/Moving+Metrics+Collector+to+a+new+host
First of all, I am not able to find the origin of the error. Please share your experience if you have ever faced this problem.
This sometimes happens when the port is already in use by another process while moving the collector to a new host with the curl commands specified on the Apache wiki.
Instead of doing it that way, you can leverage the 'Move Master Wizard' feature that Ambari provides in its GUI to move components from one host to another.
Follow the steps described in the Move Master Wizard documentation and Ambari will take care of the rest for you.
I fixed this issue by killing the process running on that port and restarting the service. A manual reboot of the machine would also fix it.
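For anyone who needs the exact commands, a sketch of that fix, run on the collector host (6188 is the Metrics Collector's timeline web app port; the ambari-metrics-collector service script ships with AMS, and restarting Ambari Metrics from the Ambari UI works just as well):

    # Find the PID holding port 6188
    sudo lsof -i :6188        # or: sudo netstat -ltnp | grep 6188

    # Kill the stale process, then bring the collector back
    sudo kill <PID>
    sudo ambari-metrics-collector stop
    sudo ambari-metrics-collector start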

Administer NFS gateway (Hortonworks stack)

I believe that in the latest version of Ambari (2.1.1) you can administer NFS gateways via the front end. When I navigate to the HDFS service summary I see the NFSGateways hyperlink (NFSGateways 0/0 Started), but when I click on this link there is no option to start a new NFS3 gateway, and I have to resort to the command line (hadoop nfs3) to start it up.
I've had a trawl around but can find no documentation on how to start NFS via Ambari. Does anybody know how I can do this?
The "NFSGateways 0/0 Started" means you have no such components installed.
On hosts page, go to some host page, choose "Add component" and select NFS Gateway. After installing start it. You should be able to use it then
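If you'd rather script it, the same add/install/start sequence works against Ambari's REST API. A minimal sketch, where the Ambari host, cluster name, target host, and credentials are all placeholders to replace with your own:

    # 1. Add the NFS_GATEWAY component to a host
    curl -u admin:admin -H 'X-Requested-By: ambari' -X POST \
      http://ambari.example.com:8080/api/v1/clusters/MyCluster/hosts/node1.example.com/host_components/NFS_GATEWAY

    # 2. Install it
    curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
      -d '{"HostRoles":{"state":"INSTALLED"}}' \
      http://ambari.example.com:8080/api/v1/clusters/MyCluster/hosts/node1.example.com/host_components/NFS_GATEWAY

    # 3. Start it
    curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
      -d '{"HostRoles":{"state":"STARTED"}}' \
      http://ambari.example.com:8080/api/v1/clusters/MyCluster/hosts/node1.example.com/host_components/NFS_GATEWAY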
