Ambari HDP Files View setup Connection Refused error 500 - hadoop

I installed Ambari HDP 2.3.6.0 on my Ubuntu machine, and I'm able to log in to the Ambari web UI and start all HDP services. However, when I set up the Files View according to the Hortonworks instructions, I get the error '500 User: root is not allowed to impersonate admin'.
I have set hadoop.proxyuser.root.groups and hadoop.proxyuser.root.hosts according to the HDFS setup instructions and have also configured the Files View.
Please help as I'm new to this and just getting started.
Thanks
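
For reference, the impersonation error above usually traces back to the proxyuser entries in HDFS's core-site.xml (note the property name is hadoop.proxyuser.root.hosts, plural). A minimal, permissive sketch, assuming the Ambari server runs as root; HDFS must be restarted after the change:

```xml
<!-- core-site.xml: allow root to impersonate any user from any host.
     "*" is deliberately permissive; narrow hosts/groups in production. -->
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
```

If the Ambari server runs as a different user (e.g. ambari-server), the proxyuser properties must name that user instead of root.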

Related

Can't access ambari UI

I want to use Hadoop. Unfortunately, I cannot access the Ambari login. How do I fix this?
Looks like you have some other web server (Apache + Postgres) already running on port 8080.
Also, the link in the VM says 1080. Can you get there? Can you access the other UIs on the HDP quickstart dashboard? If so, then Ambari isn't the issue.
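
A quick way to confirm a port conflict like this is to check whether anything is already accepting connections on Ambari's default port. A minimal sketch (the host and port 8080 are the usual defaults; adjust as needed):

```python
import socket

def port_in_use(port, host="127.0.0.1"):
    """Return True if something already accepts TCP connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        return s.connect_ex((host, port)) == 0

# If this prints True while Ambari is stopped, another process owns port 8080.
print("8080 in use:", port_in_use(8080))
```

If the port is taken, tools like `lsof -i :8080` or `netstat -tlnp` will show which process holds it.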

How to reinstall Ambari Server on a crashed node and migrate the Cluster settings?

Two of my drives crashed on the Ambari Server node so I have to re-migrate my Ambari Cluster. No real data was lost (due to a different backup strategy) but the configuration files of the node, including Ambari Server configuration, are gone.
Because two drives crashed, I can not access any files from that node anymore (RAID 5).
I am now in the process of reinstalling the Ambari Server on the same node and would like to have my agents seamlessly reconnect to the "new" Ambari Server.
Is there a way to migrate the existing Cluster settings to the Ambari Server? I am thinking of Cluster settings that were distributed to the agents or similar.
If there is no such way to migrate the cluster, how would I go and install the Ambari Server? Do a fresh install and setup everything again? Will the Ambari agents be able to connect to the "new" Cluster without problems? Note that the Ambari Server will run on the same hostname/ip.

Cannot install Storm on HDP 2.2 Sandbox

When I access the Ambari dashboard on the HDP 2.2 Sandbox VM at http://127.0.0.1:8080/, it shows all the services with a "?" symbol, including Storm. When I try to install, I can't see the install wizard link anywhere, and if I go to http://127.0.0.1:8080/#/installer/step0 it redirects back to the dashboard.
In the tutorial, they just open the dashboard and everything is set up, but I'm unable to start my Storm cluster.
Thanks in advance.
You shouldn't have to install anything.
On the services (or hosts) page, you should find an Actions button that lets you stop and then start all services. Sometimes a particular service doesn't start well and you'll need to dig in to why that's happening.
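
The Actions button drives Ambari's REST API, which you can also call directly to stop (state INSTALLED) or start (state STARTED) all services. A hedged sketch that only builds the request; the host, the cluster name "Sandbox", and the admin/admin credentials below are assumptions based on sandbox defaults, so adjust them to your installation:

```python
import json

AMBARI = "http://127.0.0.1:8080"
CLUSTER = "Sandbox"  # assumed cluster name; check yours in the Ambari UI

def service_state_request(state):
    """Build the URL and JSON body for a bulk service state change.

    state is "INSTALLED" (stop all) or "STARTED" (start all).
    """
    url = f"{AMBARI}/api/v1/clusters/{CLUSTER}/services"
    body = json.dumps({
        "RequestInfo": {"context": f"Set all services to {state}"},
        "Body": {"ServiceInfo": {"state": state}},
    })
    return url, body

# Send it with e.g.:
#   curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT -d "$BODY" "$URL"
url, body = service_state_request("INSTALLED")
print(url)
```

Note the `X-Requested-By` header, which Ambari requires on write requests.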

What is the difference between Apache Ambari Server and Agent

What is the difference between Apache Ambari Server and Agent?
What are the roles/tasks of the server vs. the agent?
The Ambari server collects information from all Ambari agents and sends them operations (start/stop/restart a service, change a service's configuration, ...).
An Ambari agent sends information about its machine and the services installed on it.
You have one Ambari server for your cluster and one Ambari agent per machine in your cluster.
If you need more details, the Ambari architecture is explained here

How to start with Hadoop in Oracle VirtualBox

I have configured Hadoop with a Hortonworks Sandbox and imported it into Oracle VirtualBox. When I start the virtual machine, the Linux system for Hadoop boots up and there is an Alt+F5 option to start. But when I press Alt+F5, it asks me for a username and password.
I didn't specify any username/password during installation. When VirtualBox starts, the Hortonworks Sandbox runs locally on my machine, so I am confident that my Hadoop installed successfully.
How do I proceed?
You can log in / SSH to the sandbox with the credentials below (username/password):
hue/hadoop
root/hadoop
I've listed these and other handy URLs that can help you get started in my (full disclosure) blog here. Go through it if you are interested.
For the Hortonworks Sandbox, the default username and password, if I remember correctly, are root/hadoop. Hope this helps.