Recently I heard that Apache Ambari supports Hue (which is a Cloudera component), but I'm not sure whether I can use it on my HDP 2.5, or whether it will work well on my cluster without problems (I use Ambari 2.5).
Does HDP support Hue?
We believe they officially stopped, but since Hue is compatible with any Hadoop distribution, people still install it on their clusters: http://gethue.com/hadoop-hue-3-on-hdp-installation-tutorial/
Based on the HDP 2.5.0 Release Notes, it does. At the bottom of the page, you'll see:
Additional component versions:
Cascading 3.0.0
Hue 2.6.0
I would love to have Hadoop and a few other packages in newer versions than the current Ambari with HDP 2.6.3 allows.
Is there an option for this kind of single-component version upgrade?
This feature will not be ready until Ambari 3.0. See AMBARI-18678 & AMBARI-14714
Depending on what you want to upgrade, though, I wouldn't suggest it.
For example, HBase, Hive, and Spark do not yet support Hadoop 3.0. The streaming components of HDP, like Spark, Kafka, and NiFi, seem to release versions more frequently, and there are ways outside of Ambari to upgrade those.
You don't need HDP or Ambari to manage your Hadoop, but together they make a nice package and a central management component for the cluster. If you upgrade pieces on your own, you risk incompatibility.
HDP is tested as a whole unit. The Hortonworks repos that you set up in Ambari limit which component versions are available to you, but this does not stop you from using your own repositories plus Puppet/Chef to install additional software into your Hadoop environment. The only thing you lose at that point is management and configuration from Ambari.
You could try to define your own Ambari Mpacks to install additional software, but make sure you have the resources to maintain it.
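As a rough sketch, registering a management pack with the server looks like this (the pack path and file name below are hypothetical; you would build your own pack following the Ambari documentation):

```shell
# Register a custom management pack with Ambari (path/name are hypothetical).
sudo ambari-server install-mpack \
  --mpack=/tmp/my-custom-mpack-1.0.0.tar.gz \
  --verbose

# Restart the server so the new stack/service definitions are picked up.
sudo ambari-server restart
```

The same `install-mpack` mechanism is what Hortonworks uses to deliver the HDF stack, so maintaining your own pack means keeping it in step with each Ambari upgrade.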
Ambari upgrade steps are well documented in the Hortonworks documentation. You may follow the links below for upgrading Ambari and the Hadoop components:
https://docs.hortonworks.com/HDPDocuments/Ambari-2.4.0.0/bk_ambari-upgrade/content/upgrading_ambari.html
https://docs.hortonworks.com/HDPDocuments/Ambari-2.4.0.0/bk_ambari-upgrade/content/upgrading_hdp_stack.html
All HDP 2.6 package URLs are available here:
https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.0.0/bk_ambari-installation/content/hdp_26_repositories.html
You can do single-component upgrades (Hadoop, which includes both HDFS and YARN; Hive; Oozie; etc.) using yum, apt-get, or another package manager. However, single-component upgrades in a Hadoop cluster are not recommended due to dependency issues: services may sometimes fail. It is better to upgrade the complete HDP stack instead of individual components.
You also need to check Ambari version compatibility in the Hortonworks documents. If you plan to upgrade only the Hadoop core packages without Ambari, cluster-wide monitoring might fail.
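For completeness, a single-component upgrade on one node would look roughly like this (the package names are assumptions and vary by HDP build; as noted above, this path is not recommended):

```shell
# Inspect what is currently installed before touching anything.
yum list installed | grep -i hadoop

# Hypothetical single-component upgrade; real HDP package names are
# versioned (e.g. hadoop_2_6_*), so check your configured repo first.
sudo yum upgrade hadoop-hdfs hadoop-yarn
```

After any such upgrade, Ambari's recorded stack version no longer matches what is on disk, which is exactly where the dependency and monitoring problems start.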
I'm trying to install the Ambari server and agents.
I have a doubt regarding Ambari.
I tried to install Ambari.
It always gets linked with Hortonworks.
My doubt is this: I have a Hadoop cluster of my own on Ubuntu 16.04. Will Ambari only work with HDP, or is it possible to also make it work with custom-built clusters?
If possible, please share detailed, descriptive documentation.
It's not clear where you downloaded Ambari from, but it sounds like you used the Hortonworks version of it, not the one directly from https://ambari.apache.org
Ambari works with the concept of stacks. Each stack has a set of services and components. HDP is such a stack, but there are others, or you can even define your own, so yes, you can manage your own Hadoop installation components, but that really would be not much different from what Hortonworks already provides.
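For illustration, the stack definitions live on the Ambari server host, and a custom stack or service is just a directory with a `metainfo.xml` placed alongside them (the paths below assume a default HDP 2.5 install):

```shell
# Built-in stack definitions shipped with the Ambari server.
ls /var/lib/ambari-server/resources/stacks/HDP/2.5/services

# A custom service would be added as, e.g. (MYSERVICE is hypothetical):
#   /var/lib/ambari-server/resources/stacks/HDP/2.5/services/MYSERVICE/metainfo.xml
# followed by a server restart to pick it up:
sudo ambari-server restart
```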
Besides, the HDP services and components have been tested to work together more thoroughly than an off-the-shelf Hadoop installation.
If you don't want HDP components, there is also the Apache Bigtop project, which provides installation packages for many Hadoop-related services.
Ambari expects Java and Hadoop to be installed in a certain way. I'm not sure how easy it is to set up for an existing Hadoop install.
I have a 6-node cluster running Hortonworks HDP 2.5.3 and Ambari 2.4.2.0.
I want to install Apache NiFi on this cluster. When looking in the documentation, the following line catches my eye:
1.1. Interoperability Requirements
You cannot install HDF on a system where HDP is already installed.
I wonder how I can install NiFi on my cluster. I would like to manage it with Ambari too, if possible.
Should I just go ahead, install the standalone version of NiFi, and change the port to something other than 8080, which is in use by Ambari? The problem is that I'd have to install it on every node, and this process is not automated.
Currently you can only install one stack into a given Ambari instance, and there is an HDP stack which does not include NiFi, and an HDF stack which includes NiFi, Kafka, Storm, and Ranger. So you need a second Ambari instance where you can install the HDF stack. You also can't share nodes between two Ambari instances, because there can only be one Ambari agent running on a node.
There might be enhancements in future Ambari releases to improve this situation, but for now if you are limited to using your 6 HDP nodes then you would have to install/manage NiFi manually using the RPM or TAR.
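If you do go the manual route, the port change the question mentions is a one-line edit in `nifi.properties`. A sketch, assuming the standalone tarball layout and NiFi 1.x property names (the version in the path is an assumption):

```shell
# Adjust to wherever you unpacked the NiFi tarball on each node.
NIFI_HOME=./nifi-1.1.2

# Move the web UI off 8080 so it does not collide with Ambari.
sed -i 's/^nifi.web.http.port=.*/nifi.web.http.port=8088/' \
  "$NIFI_HOME/conf/nifi.properties"

# Then start NiFi on each node.
"$NIFI_HOME/bin/nifi.sh" start
```

You would still have to repeat this (or script it with Ansible/pdsh) on all six nodes, which is exactly the automation Ambari would otherwise provide.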
As of HDP 2.6.1 it is possible to install HDF components on an HDP cluster. See https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.0.1.1/bk_installing-hdf-and-hdp/content/ch_install-ambari.html
Since HDP 3.0, you can add HDF 3.2 and it works together with NiFi.
I have an Ambari cluster to manage my Hadoop/Spark jobs. I want to schedule my workflows using the Oozie editor. Hue is the most popular and easiest one to use. How do I install Hue on top of an existing Hadoop cluster managed by Ambari?
Thanks
Hue is a service created by Cloudera. You cannot install it using Ambari, but you can download a Hue package and install it based on the official documentation. You should check this article: Installing Hue 3.9 on HDP 2.3.
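In outline, a tarball install follows the usual Hue build steps (the version and file name below are assumptions; the linked article covers the HDP-specific configuration):

```shell
# Unpack a Hue release tarball obtained from gethue.com (version assumed).
tar -xzf hue-3.9.0.tgz
cd hue-3.9.0

# Build Hue and its bundled apps (requires Python and common build deps).
make apps

# Point Hue at your cluster in desktop/conf/hue.ini, then start it.
build/env/bin/supervisor
```

The key integration work is in `hue.ini`: pointing Hue at the NameNode/WebHDFS, ResourceManager, and Oozie endpoints that Ambari configured for your cluster.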
Based on what I have read, the mandatory tools for FIWARE COSMOS using HDP 2.5 are:
COSMOS (HAAS engine for shared Hadoop e.g. tested with HDP 2.2)
HDFS-based storage cluster (tested with HDP 2.2)
Cosmos GUI installation and configuration & GitHub
Questions:
Could the FIWARE team please share instructions on how to install and configure the Cosmos GUI on HDP 2.2 or 2.5 (installed using the Hortonworks Sandbox)?
Please provide the light-version Cosmos Big Data architecture using HDP 2.2.
I am not sure whether HDP (2.2 or 2.5) comes with an HttpFS server.
Reference: BigData Analysis - Installation and Administration Guide
(Posted on behalf of the OP).
I am gathering information here for HDP 2.5 users who would like to use the FIWARE framework and the Cosmos GUI. In the process of asking questions, I found other links that help address the issues I had in setting up HDP 2.5 for FIWARE.
Answer: HDP does not come with HttpFS; [instructions][3] to [install][4] it are provided.
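As a sketch, adding HttpFS to an HDP node might look like this (the package and service names are assumptions; check your HDP repo for the exact versioned names):

```shell
# Install and start the HttpFS gateway (package name is an assumption).
sudo yum install hadoop-httpfs
sudo service hadoop-httpfs start

# Smoke test: HttpFS serves the WebHDFS REST API on port 14000 by default.
curl "http://localhost:14000/webhdfs/v1/?op=GETHOMEDIRECTORY&user.name=hdfs"
```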
Suggestion: perhaps provide a Dockerfile for the COSMOS GUI to simplify installation on HDP? The user is expected to install HttpFS on HDP.